### Publication Type:

Conference Paper
### Source:

27th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, New York, USA (2007)
### Abstract:

Analytical descriptions of the statistics of wireless channel models are desirable tools for communication systems engineering. When multiple antennas are available at the transmit and/or the receive side (the Multiple-Input Multiple-Output, or MIMO, case), the statistics of the matrix H representing the gains between the antennas of a transmit and a receive antenna array, and in particular the correlation between its coefficients, are known to be of paramount importance for the design of such systems. However these characteristics depend on the operating environment, since the electromagnetic propagation paths are dictated by the surroundings of the antenna arrays, and little knowledge about these is available at the time of system design.

An approach using the Maximum Entropy principle to derive probability density functions for the channel matrix, based on various degrees of knowledge about the environment, is presented. The general idea is to apply the maximum entropy principle to obtain the distribution of each parameter of interest (e.g. the correlation), and then to marginalize these parameters out to obtain the full channel distribution. It was shown in previous works, using sophisticated integrals from statistical physics, that by using the full spatial correlation matrix E as the intermediate modeling parameter, this method can yield surprisingly concise channel descriptions. In this case, the joint probability density function is shown to be merely a function of the Frobenius norm of the channel matrix, ‖H‖_F.
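The two-step modeling idea described above can be illustrated numerically: draw an intermediate covariance from a prior, draw the channel conditionally on it, and marginalize by averaging over many such draws. The sketch below is a generic Monte Carlo illustration of this structure, not the paper's derivation; the Wishart-type covariance prior and the antenna dimensions are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
nt, nr = 2, 2        # transmit/receive antennas (illustrative sizes)
n = nt * nr          # dimension of vec(H)
trials = 200

def random_covariance(dim, dof, rng):
    # Step 1: draw an intermediate covariance Q for vec(H) from a prior
    # (a Wishart-type construction, standing in for the paper's
    # maximum-entropy prior on the covariance).
    A = (rng.standard_normal((dim, dof))
         + 1j * rng.standard_normal((dim, dof))) / np.sqrt(2)
    return A @ A.conj().T / dof

def sample_channel(Q, rng):
    # Step 2: draw vec(H) ~ CN(0, Q) conditionally on the covariance.
    L = np.linalg.cholesky(Q + 1e-12 * np.eye(Q.shape[0]))
    w = (rng.standard_normal(Q.shape[0])
         + 1j * rng.standard_normal(Q.shape[0])) / np.sqrt(2)
    return (L @ w).reshape(nr, nt)

# Marginalizing the covariance out (here by Monte Carlo) yields channel
# samples that no longer condition on any particular environment.
samples = [sample_channel(random_covariance(n, dof=8, rng=rng), rng)
           for _ in range(trials)]

fro = np.array([np.linalg.norm(H, 'fro') for H in samples])
print(f"mean Frobenius norm over {trials} draws: {fro.mean():.3f}")
```

The Frobenius norms collected at the end echo the paper's observation that, after marginalization, the channel statistics can collapse to a function of ‖H‖_F alone.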

In the present paper, we investigate the case where information about the average covariance matrix is available (e.g. through measurements). The maximum entropy distribution of the covariance is derived under this constraint. Furthermore, we also consider the doubly correlated case, where the intermediate modeling parameters are chosen as the transmit- and receive-side channel covariance matrices. We compare the maximum-entropy result obtained in this case with the well-known Kronecker model, and derive the channel probability density function in the case of a single-side correlation constraint.
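For reference, the Kronecker model mentioned above generates a doubly correlated channel as H = R_r^{1/2} G R_t^{1/2}, with G i.i.d. complex Gaussian, so that the full covariance of vec(H) separates as R_t^T ⊗ R_r. The sketch below verifies this separable structure empirically; the exponential correlation profiles and matrix sizes are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
nr, nt = 2, 2

def exp_corr(n, rho):
    # Exponential correlation profile, a common illustrative choice.
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :]).astype(float)

Rt = exp_corr(nt, 0.7)          # transmit-side correlation
Rr = exp_corr(nr, 0.4)          # receive-side correlation
Rt_sqrt = np.linalg.cholesky(Rt)
Rr_sqrt = np.linalg.cholesky(Rr)

# Kronecker model: H = Rr^{1/2} G Rt^{1/2,H} with G i.i.d. CN(0, 1),
# which implies Cov(vec(H)) = Rt^T ⊗ Rr (separable correlation).
N = 50000
G = (rng.standard_normal((N, nr, nt))
     + 1j * rng.standard_normal((N, nr, nt))) / np.sqrt(2)
H = Rr_sqrt @ G @ Rt_sqrt.conj().T        # broadcasts over the N samples

# vec() stacks columns, hence the transpose before reshaping.
vecs = H.transpose(0, 2, 1).reshape(N, nr * nt)
cov_est = vecs.T @ vecs.conj() / N        # sample E[vec(H) vec(H)^H]
target = np.kron(Rt.T, Rr)

print("max deviation from Kronecker structure:",
      np.max(np.abs(cov_est - target)))
```

The empirical covariance of vec(H) matches the Kronecker product R_t^T ⊗ R_r up to Monte Carlo error, which is exactly the separability that the paper's maximum-entropy result is compared against.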