Independent Processes in Signal Processing

English · Signal Processing · Probability
AnguseZhang
Published on 2023-07-29
Recommended image: Basic Image:bohrium-notebook:2023-04-07
Recommended machine type: c2_m4_cpu
Contents

  • Definition
  • Example - Comparison of bivariate PDF and product of marginal PDFs
  • Independence versus Uncorrelatedness
  • Independence versus Orthogonality
  • Example - Computation of cross-correlation by dot product

Independent Processes in Signal Processing

This Jupyter notebook is part of a collection of notebooks on various topics of Digital Signal Processing. Please direct questions and suggestions to Sascha.Spors@uni-rostock.de.

The independence of random signals is a desired property in many applications of statistical signal processing, as are uncorrelatedness and orthogonality. The concept of independence is introduced in the following, together with a discussion of its links to uncorrelatedness and orthogonality.


Definition

Two stochastic events are said to be independent if the probability of occurrence of one event is not affected by the occurrence of the other event, or, more specifically, if their joint probability equals the product of their individual probabilities. In terms of the bivariate probability density function (PDF) of two continuous-amplitude real-valued random processes $x[k]$ and $y[k]$ this reads

$$ p_{xy}(\theta_x, \theta_y, k_x, k_y) = p_x(\theta_x, k_x) \cdot p_y(\theta_y, k_y) $$

where $p_x(\theta_x, k_x)$ and $p_y(\theta_y, k_y)$ denote the univariate (marginal) PDFs of the random processes for the time instances $k_x$ and $k_y$, respectively. The bivariate PDF of two independent random processes is given by the multiplication of their univariate PDFs. It follows that the second-order ensemble average for a linear mapping is given as

$$ E\{ x[k_x] \cdot y[k_y] \} = E\{ x[k_x] \} \cdot E\{ y[k_y] \} $$

The linear second-order ensemble average of two independent random signals is equal to the multiplication of their linear first-order ensemble averages. For jointly wide-sense stationary (WSS) processes, the bivariate PDF depends only on the difference $\kappa = k_x - k_y$ of the time instants. Hence, two jointly WSS random signals are independent if

$$ p_{xy}(\theta_x, \theta_y, \kappa) = p_x(\theta_x) \cdot p_y(\theta_y) $$

The bivariate PDF above is rewritten using the definition of conditional probabilities in order to specialize the definition of independence to one WSS random signal $x[k]$

$$ p_{xx}(\theta_1, \theta_2, \kappa) = p_x(\theta_1 \mid \theta_2, \kappa) \cdot p_x(\theta_2) $$

where $p_x(\theta_1 \mid \theta_2, \kappa)$ denotes the conditional probability that $x[k]$ takes the amplitude value $\theta_1$ under the condition that $x[k - \kappa]$ takes the amplitude value $\theta_2$. Under the assumption that $y[k] = x[k]$ and substituting $\theta_x$ and $\theta_y$ by $\theta_1$ and $\theta_2$, independence for one random signal is defined as

$$ p_{xx}(\theta_1, \theta_2, \kappa) = p_x(\theta_1) \cdot p_x(\theta_2) \qquad \forall \kappa \neq 0 $$

since for independent samples the conditional probability reduces to the marginal, $p_x(\theta_1 \mid \theta_2, \kappa) = p_x(\theta_1)$ for $\kappa \neq 0$. The case $\kappa = 0$ is excluded since this represents a sure event. The bivariate PDF of an independent random signal is hence equal to the product of the univariate PDFs of the signal $x[k]$ and the time-shifted signal $x[k - \kappa]$ for $\kappa \neq 0$. A random signal for which this condition does not hold shows statistical dependencies between samples. These dependencies can be exploited, for instance, for coding or prediction.
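The factorization of the second-order ensemble average can be checked numerically. The following minimal sketch (not part of the original notebook) draws two independent sequences and compares a sample estimate of $E\{x \cdot y\}$ with the product $E\{x\} \cdot E\{y\}$:

```python
import numpy as np

# Minimal sketch (not part of the original notebook): for two independently
# drawn sequences, the estimate of E{x y} should factor into E{x} E{y}.
rng = np.random.default_rng(1)
x = rng.normal(loc=1.0, size=500000)             # Gaussian with mean 1
y = rng.uniform(low=0.0, high=2.0, size=500000)  # uniform with mean 1

lhs = np.mean(x * y)              # second-order ensemble average (estimate)
rhs = np.mean(x) * np.mean(y)     # product of first-order averages
print(abs(lhs - rhs))             # small, limited only by the sample size
```

The residual difference shrinks with the number of samples, reflecting the statistical uncertainty of the estimates rather than any dependence.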


Example - Comparison of bivariate PDF and product of marginal PDFs

The following example estimates the bivariate PDF $p_{xx}(\theta_1, \theta_2, \kappa)$ of a WSS random signal $x[k]$ by computing its two-dimensional histogram. The univariate PDFs $p_x(\theta_1)$ and $p_x(\theta_2, \kappa)$ of the signal and its shifted version are additionally estimated. Both the estimated bivariate PDF and the product of the two univariate PDFs are plotted for different shifts $\kappa$.

[1]
import numpy as np
import matplotlib.pyplot as plt

N = 10000000  # number of random samples
M = 50  # number of bins for bivariate/marginal histograms


def compute_plot_histograms(kappa):
    '''Estimate and plot bivariate/product PDFs for given shift.'''

    # shift signal
    x2 = np.concatenate((x1[kappa:], np.zeros(kappa)))

    # compute bivariate and marginal histograms
    pdf_xx, x1edges, x2edges = np.histogram2d(
        x1, x2, bins=(M, M), range=((-1.5, 1.5), (-1.5, 1.5)), density=True)
    pdf_x1, _ = np.histogram(x1, bins=M, range=(-1.5, 1.5), density=True)
    pdf_x2, _ = np.histogram(x2, bins=M, range=(-1.5, 1.5), density=True)

    # plot results
    fig = plt.figure(figsize=(10, 10))

    plt.subplot(121, aspect='equal')
    plt.pcolormesh(x1edges, x2edges, pdf_xx)
    plt.xlabel(r'$\theta_1$')
    plt.ylabel(r'$\theta_2$')
    plt.title(r'Bivariate PDF $p_{xx}(\theta_1, \theta_2, \kappa)$')
    plt.colorbar(fraction=0.046)

    plt.subplot(122, aspect='equal')
    plt.pcolormesh(x1edges, x2edges, np.outer(pdf_x1, pdf_x2))
    plt.xlabel(r'$\theta_1$')
    plt.ylabel(r'$\theta_2$')
    plt.title(r'Product of PDFs $p_x(\theta_1) \cdot p_x(\theta_2, \kappa)$')
    plt.colorbar(fraction=0.046)

    fig.suptitle(r'Shift $\kappa =$ {:<2.0f}'.format(kappa), y=0.72)
    fig.tight_layout()


# generate signal
x = np.random.normal(size=N)
x1 = np.convolve(x, [1, .5, .3, .7, .3], mode='same')

# compute and plot the PDFs for various shifts
compute_plot_histograms(0)
compute_plot_histograms(2)
compute_plot_histograms(20)

Exercise

  • With the given results, how can you evaluate the independence of the random signal?
  • Can the random signal be assumed to be independent?

Solution: According to the definition of independence, the bivariate PDF and the product of the univariate PDFs have to be equal for all $\kappa \neq 0$. This is obviously not the case for $\kappa = 2$. Hence, the random signal is not independent in a strict sense. However, for $\kappa = 20$ the condition for independence is sufficiently fulfilled, considering the statistical uncertainty due to the finite number of samples.
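As a complementary numerical check (not part of the notebook, and testing only the weaker necessary condition of uncorrelatedness), the normalized autocovariance of the filtered noise can be estimated for both shifts. The filter has length 5, so samples separated by $\kappa = 20$ should be practically uncorrelated, while samples separated by $\kappa = 2$ are not:

```python
import numpy as np

# Hedged check (not in the notebook): estimate the normalized autocovariance
# of the filtered noise at the two shifts used in the example above.
rng = np.random.default_rng(0)
x = rng.normal(size=1000000)
x1 = np.convolve(x, [1, .5, .3, .7, .3], mode='same')


def rho(kappa):
    '''Sample correlation coefficient between x1[k] and x1[k - kappa].'''
    return np.corrcoef(x1[:-kappa], x1[kappa:])[0, 1]


print(rho(2))   # clearly nonzero: samples two apart are dependent
print(rho(20))  # approximately zero: shift exceeds the filter length
```

Note that a near-zero correlation coefficient at $\kappa = 20$ alone would not prove independence; it merely fails to contradict it, which is why the example compares the full bivariate PDF against the product of the marginals.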


Independence versus Uncorrelatedness

Two continuous-amplitude real-valued jointly WSS random processes $x[k]$ and $y[k]$ are termed uncorrelated if their cross-correlation function (CCF) is equal to the product of their linear means, $\varphi_{xy}[\kappa] = \mu_x \cdot \mu_y$. If two random signals are independent, then they are also uncorrelated. This can be proven by introducing the above findings for the linear second-order ensemble average of independent random signals into the definition of the CCF

$$ \varphi_{xy}[\kappa] = E\{ x[k] \cdot y[k - \kappa] \} = E\{ x[k] \} \cdot E\{ y[k - \kappa] \} = \mu_x \cdot \mu_y $$

where the last equality is a consequence of the assumed wide-sense stationarity. The reverse, that two uncorrelated signals are also independent, does not follow from this result and does not hold in general.

The auto-correlation function (ACF) of an uncorrelated signal is given as $\varphi_{xx}[\kappa] = \sigma_x^2 \cdot \delta[\kappa] + \mu_x^2$. Introducing the definition of independence into the definition of the ACF yields

$$ \varphi_{xx}[\kappa] = E\{ x[k] \cdot x[k - \kappa] \} = \begin{cases} E\{ x^2[k] \} = \sigma_x^2 + \mu_x^2 & \text{for } \kappa = 0 \\ E\{ x[k] \} \cdot E\{ x[k - \kappa] \} = \mu_x^2 & \text{for } \kappa \neq 0 \end{cases} = \sigma_x^2 \cdot \delta[\kappa] + \mu_x^2 $$

where the result for $\kappa \neq 0$ follows from the bivariate PDF of an independent signal, as derived above. It can be concluded from this result that an independent random signal is also uncorrelated. The reverse, that an uncorrelated signal is independent, does not hold in general.
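A standard counterexample (not from the notebook) illustrates why the reverse does not hold: for zero-mean Gaussian $x[k]$, the signal $y[k] = x^2[k]$ is uncorrelated with $x[k]$ yet completely dependent on it.

```python
import numpy as np

# Standard counterexample (not from the notebook): y = x^2 is uncorrelated
# with zero-mean Gaussian x, yet y is a deterministic function of x.
rng = np.random.default_rng(0)
x = rng.normal(size=1000000)
y = x**2

# covariance E{x y} - E{x} E{y} vanishes since E{x^3} = 0
cov = np.mean(x * y) - np.mean(x) * np.mean(y)
print(abs(cov))  # close to zero: the signals are uncorrelated

# but a nonlinear second-order average does not factor, revealing dependence:
# E{x^2 y} = E{x^4} = 3 while E{x^2} E{y} = 1
print(np.mean(x**2 * y) - np.mean(x**2) * np.mean(y))
```

Uncorrelatedness only constrains the linear second-order average; independence requires every such average, including nonlinear mappings, to factor.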


Independence versus Orthogonality

In geometry, two vectors are said to be orthogonal if their dot product equals zero. This definition is frequently applied to finite-length random signals by interpreting them as vectors. The relation between independence, uncorrelatedness and orthogonality is derived in the following.

Let's assume two continuous-amplitude real-valued jointly wide-sense ergodic random signals $x_N[k]$ and $y_M[k]$ with finite lengths $N$ and $M$, respectively. The CCF between both can be reformulated as follows

$$ \varphi_{xy}[\kappa] = \frac{1}{N} \sum_{k=0}^{N-1} x_N[k] \cdot y_M[k - \kappa] = \frac{1}{N} \langle \mathbf{x}, \mathbf{y}_\kappa \rangle $$

where $\langle \cdot, \cdot \rangle$ denotes the dot product. The vector $\mathbf{x}$ is defined as

$$ \mathbf{x} = \begin{bmatrix} \mathbf{0}_{M-1} & x_N[0] & x_N[1] & \dots & x_N[N-1] & \mathbf{0}_{M-1} \end{bmatrix}^{\mathrm{T}} $$

where $\mathbf{0}_{M-1}$ denotes the zero vector of length $M-1$. The vector $\mathbf{y}_\kappa$ is defined as the zero-padded signal $\begin{bmatrix} y_M[0] & y_M[1] & \dots & y_M[M-1] & \mathbf{0}_{N+M-2} \end{bmatrix}^{\mathrm{T}}$ cyclically shifted by $\kappa$ samples, for $\kappa = 0, 1, \dots, N+M-2$.

It follows from the above definition of orthogonality that two finite-length random signals are orthogonal if their CCF is zero, $\varphi_{xy}[\kappa] = 0$. This implies that at least one of the two signals has to be mean-free. It can further be concluded that two independent random signals are also orthogonal and uncorrelated if at least one of them is mean-free. The reverse, that orthogonal signals are independent, does not hold in general.

The concept of orthogonality can also be extended to one random signal by setting $y[k] = x[k]$. Since a random signal cannot be orthogonal to itself for $\kappa = 0$, the definition of orthogonality has to be extended for this case. According to the ACF of a mean-free uncorrelated random signal $x[k]$, self-orthogonality may be defined as

$$ \varphi_{xx}[\kappa] = \sigma_x^2 \cdot \delta[\kappa] $$

An independent random signal is also orthogonal if it is zero-mean. The reverse, that an orthogonal signal is independent, does not hold in general.
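Self-orthogonality can be illustrated numerically. The following sketch (not from the notebook) estimates the ACF of zero-mean white Gaussian noise, which should approximate $\sigma_x^2 \cdot \delta[\kappa]$:

```python
import numpy as np

# Sketch (not from the notebook): for zero-mean white Gaussian noise, the
# biased ACF estimate should approximate sigma_x^2 * delta[kappa], i.e. be
# close to zero for every shift kappa != 0.
rng = np.random.default_rng(42)
N = 5000
sigma = 2.0
x = rng.normal(scale=sigma, size=N)

acf = 1/N * np.correlate(x, x, mode='full')  # biased ACF estimate
k0 = N - 1  # index corresponding to kappa = 0

print(acf[k0])                             # close to sigma^2 = 4
print(np.max(np.abs(np.delete(acf, k0))))  # much smaller than sigma^2
```

The peak at $\kappa = 0$ estimates the variance $\sigma_x^2$, while the remaining values stay small, decaying further with an increasing number of samples.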


Example - Computation of cross-correlation by dot product

This example illustrates the computation of the CCF $\varphi_{xy}[\kappa]$ by the dot product. First, a function is defined which computes the CCF by means of the dot product introduced above.

[2]
def ccf_by_dotprod(x, y):
    '''Computes the CCF by the dot product.'''
    N = len(x)
    M = len(y)
    xN = np.concatenate((np.zeros(M-1), x, np.zeros(M-1)))
    yM = np.concatenate((y, np.zeros(N+M-2)))

    return np.fromiter(
        [np.dot(xN, np.roll(yM, kappa)) for kappa in range(N+M-1)], float)

Now the CCF is computed using different methods: computation by the dot product and by the built-in correlation function. The CCF is plotted for the computation by the dot product, as well as the difference (magnitude) between both methods. The resulting difference is in the typical expected range due to numerical inaccuracies.

[3]
N = 32  # length of signals

# generate signals
np.random.seed(1)
x = np.random.normal(size=N)
y = np.convolve(x, [1, .5, .3, .7, .3], mode='same')

# compute CCF
ccf1 = 1/N * np.correlate(x, y, mode='full')
ccf2 = 1/N * ccf_by_dotprod(x, y)
kappa = np.arange(-N+1, N)

# plot results
plt.figure(figsize=(10, 4))

plt.subplot(121)
plt.stem(kappa, ccf2)
plt.xlabel(r'$\kappa$')
plt.ylabel(r'$\varphi_{xy}[\kappa]$')
plt.title('CCF by dot product')
plt.grid()

plt.subplot(122)
plt.stem(kappa, np.abs(ccf1-ccf2))
plt.xlabel(r'$\kappa$')
plt.title('Difference (magnitude)')
plt.tight_layout()

Copyright

This notebook is provided as Open Educational Resource. Feel free to use the notebook for your own purposes. The text is licensed under Creative Commons Attribution 4.0, the code of the IPython examples under the MIT license. Please attribute the work as follows: Sascha Spors, Digital Signal Processing - Lecture notes featuring computational examples.
