Probability and Random Processes with Applications to Signal Processing
(2014)
Abstract
For courses in Probability and Random Processes.
Probability, Statistics, and Random Processes for Engineers, 4e is a comprehensive treatment of probability and random processes that, more than any other available source, combines rigor with accessibility. Beginning with the fundamentals of probability theory and requiring only college-level calculus, the book develops all the tools needed to understand more advanced topics such as random sequences, continuous-time random processes, and statistical signal processing. The book progresses at a leisurely pace, never assuming more knowledge than contained in the material already covered. Rigor is established by developing all results from the basic axioms and carefully defining and discussing such advanced notions as stochastic convergence, stochastic integrals and resolution of stochastic processes.
Table of Contents
Section Title | Page |
---|---|
Cover | Cover |
Contents | 3 |
Preface | 11 |
1 Introduction to Probability | 13 |
1.1 Introduction: Why Study Probability? | 13 |
1.2 The Different Kinds of Probability | 14 |
Probability as Intuition | 14 |
Probability as the Ratio of Favorable to Total Outcomes (Classical Theory) | 15 |
Probability as a Measure of Frequency of Occurrence | 16 |
Probability Based on an Axiomatic Theory | 17 |
1.3 Misuses, Miscalculations, and Paradoxes in Probability | 19 |
1.4 Sets, Fields, and Events | 20 |
Examples of Sample Spaces | 20 |
1.5 Axiomatic Definition of Probability | 27 |
1.6 Joint, Conditional, and Total Probabilities; Independence | 32 |
Compound Experiments | 35 |
1.7 Bayes' Theorem and Applications | 47 |
1.8 Combinatorics | 50 |
Occupancy Problems | 54 |
Extensions and Applications | 58 |
1.9 Bernoulli Trials—Binomial and Multinomial Probability Laws | 60 |
Multinomial Probability Law | 66 |
1.10 Asymptotic Behavior of the Binomial Law: The Poisson Law | 69 |
1.11 Normal Approximation to the Binomial Law | 75 |
Summary | 77 |
Problems | 78 |
References | 89 |
2 Random Variables | 91 |
2.1 Introduction | 91 |
2.2 Definition of a Random Variable | 92 |
2.3 Cumulative Distribution Function | 95 |
Properties of F_X(x) | 96 |
Computation of F_X(x) | 97 |
2.4 Probability Density Function (pdf) | 100 |
Four Other Common Density Functions | 107 |
More Advanced Density Functions | 109 |
2.5 Continuous, Discrete, and Mixed Random Variables | 112 |
Some Common Discrete Random Variables | 114 |
2.6 Conditional and Joint Distributions and Densities | 119 |
Properties of Joint CDF F_XY(x,y) | 130 |
2.7 Failure Rates | 149 |
Summary | 153 |
Problems | 153 |
References | 161 |
Additional Reading | 161 |
3 Functions of Random Variables | 163 |
3.1 Introduction | 163 |
Functions of a Random Variable (FRV): Several Views | 166 |
3.2 Solving Problems of the Type Y = g(X) | 167 |
General Formula for Determining the pdf of Y = g(X) | 178 |
3.3 Solving Problems of the Type Z = g(X, Y) | 183 |
3.4 Solving Problems of the Type V = g(X, Y), W = h(X, Y) | 205 |
Fundamental Problem | 205 |
Obtaining f_VW Directly from f_XY | 208 |
3.5 Additional Examples | 212 |
Summary | 217 |
Problems | 218 |
References | 226 |
Additional Reading | 226 |
4 Expectation and Moments | 227 |
4.1 Expected Value of a Random Variable | 227 |
On the Validity of Equation 4.1-8 | 230 |
4.2 Conditional Expectations | 244 |
Conditional Expectation as a Random Variable | 251 |
4.3 Moments of Random Variables | 254 |
Joint Moments | 258 |
Properties of Uncorrelated Random Variables | 260 |
Jointly Gaussian Random Variables | 263 |
4.4 Chebyshev and Schwarz Inequalities | 267 |
Markov Inequality | 269 |
The Schwarz Inequality | 270 |
4.5 Moment-Generating Functions | 273 |
4.6 Chernoff Bound | 276 |
4.7 Characteristic Functions | 278 |
Joint Characteristic Functions | 285 |
The Central Limit Theorem | 288 |
4.8 Additional Examples | 293 |
Summary | 295 |
Problems | 296 |
References | 305 |
Additional Reading | 306 |
5 Random Vectors | 307 |
5.1 Joint Distribution and Densities | 307 |
5.2 Multiple Transformation of Random Variables | 311 |
5.3 Ordered Random Variables | 314 |
5.4 Expectation Vectors and Covariance Matrices | 323 |
5.5 Properties of Covariance Matrices | 326 |
Whitening Transformation | 330 |
5.6 The Multidimensional Gaussian (Normal) Law | 331 |
5.7 Characteristic Functions of Random Vectors | 340 |
Properties of CF of Random Vectors | 342 |
The Characteristic Function of the Gaussian (Normal) Law | 343 |
Summary | 344 |
Problems | 345 |
References | 351 |
Additional Reading | 351 |
6 Statistics: Part 1 Parameter Estimation | 352 |
6.1 Introduction | 352 |
Independent, Identically Distributed Observations | 353 |
Estimation of Probabilities | 355 |
6.2 Estimators | 358 |
6.3 Estimation of the Mean | 360 |
Properties of the Mean-Estimator Function (MEF) | 361 |
Procedure for Getting a δ-Confidence Interval on the Mean of a Normal Random Variable When σ_X Is Known | 364 |
Confidence Interval for the Mean of a Normal Distribution When σ_X Is Not Known | 364 |
Procedure for Getting a δ-Confidence Interval Based on n Observations on the Mean of a Normal Random Variable When σ_X Is Not Known | 367 |
Interpretation of the Confidence Interval | 367 |
6.4 Estimation of the Variance and Covariance | 367 |
Confidence Interval for the Variance of a Normal Random Variable | 369 |
Estimating the Standard Deviation Directly | 371 |
Estimating the Covariance | 372 |
6.5 Simultaneous Estimation of Mean and Variance | 373 |
6.6 Estimation of Non-Gaussian Parameters from Large Samples | 375 |
6.7 Maximum Likelihood Estimators | 377 |
6.8 Ordering, More on Percentiles, Parametric Versus Nonparametric Statistics | 381 |
The Median of a Population Versus Its Mean | 383 |
Parametric Versus Nonparametric Statistics | 384 |
Confidence Interval on the Percentile | 385 |
Confidence Interval for the Median When n Is Large | 387 |
6.9 Estimation of Vector Means and Covariance Matrices | 388 |
Estimation of μ | 389 |
Estimation of the Covariance K | 390 |
6.10 Linear Estimation of Vector Parameters | 392 |
Summary | 396 |
Problems | 396 |
References | 400 |
Additional Reading | 401 |
7 Statistics: Part 2 Hypothesis Testing | 402 |
7.1 Bayesian Decision Theory | 403 |
7.2 Likelihood Ratio Test | 408 |
7.3 Composite Hypotheses | 414 |
Generalized Likelihood Ratio Test (GLRT) | 415 |
How Do We Test for the Equality of Means of Two Populations? | 420 |
Testing for the Equality of Variances for Normal Populations: The F-test | 424 |
Testing Whether the Variance of a Normal Population Has a Predetermined Value | 428 |
7.4 Goodness of Fit | 429 |
7.5 Ordering, Percentiles, and Rank | 435 |
How Ordering Is Useful in Estimating Percentiles and the Median | 437 |
Confidence Interval for the Median When n Is Large | 440 |
Distribution-free Hypothesis Testing: Testing If Two Populations Are the Same Using Runs | 441 |
Ranking Test for Sameness of Two Populations | 444 |
Summary | 445 |
Problems | 445 |
References | 451 |
8 Random Sequences | 453 |
8.1 Basic Concepts | 454 |
Infinite-length Bernoulli Trials | 459 |
Continuity of Probability Measure | 464 |
Statistical Specification of a Random Sequence | 466 |
8.2 Basic Principles of Discrete-Time Linear Systems | 483 |
8.3 Random Sequences and Linear Systems | 489 |
8.4 WSS Random Sequences | 498 |
Power Spectral Density | 501 |
Interpretation of the psd | 502 |
Synthesis of Random Sequences and Discrete-Time Simulation | 505 |
Decimation | 508 |
Interpolation | 509 |
8.5 Markov Random Sequences | 512 |
ARMA Models | 515 |
Markov Chains | 516 |
8.6 Vector Random Sequences and State Equations | 523 |
8.7 Convergence of Random Sequences | 525 |
8.8 Laws of Large Numbers | 533 |
Summary | 538 |
Problems | 538 |
References | 553 |
9 Random Processes | 555 |
9.1 Basic Definitions | 556 |
9.2 Some Important Random Processes | 560 |
Asynchronous Binary Signaling | 560 |
Poisson Counting Process | 562 |
Alternative Derivation of Poisson Process | 567 |
Random Telegraph Signal | 569 |
Digital Modulation Using Phase-Shift Keying | 570 |
Wiener Process or Brownian Motion | 572 |
Markov Random Processes | 575 |
Birth–Death Markov Chains | 579 |
Chapman–Kolmogorov Equations | 583 |
Random Process Generated from Random Sequences | 584 |
9.3 Continuous-Time Linear Systems with Random Inputs | 584 |
White Noise | 589 |
9.4 Some Useful Classifications of Random Processes | 590 |
Stationarity | 591 |
9.5 Wide-Sense Stationary Processes and LSI Systems | 593 |
Wide-Sense Stationary Case | 594 |
Power Spectral Density | 596 |
An Interpretation of the psd | 598 |
More on White Noise | 602 |
Stationary Processes and Differential Equations | 608 |
9.6 Periodic and Cyclostationary Processes | 612 |
9.7 Vector Processes and State Equations | 618 |
State Equations | 620 |
Summary | 623 |
Problems | 623 |
References | 645 |
10 Advanced Topics in Random Processes | 647 |
10.1 Mean-Square (m.s.) Calculus | 647 |
Stochastic Continuity and Derivatives [10-1] | 647 |
Further Results on m.s. Convergence [10-1] | 657 |
10.2 Mean-Square Stochastic Integrals | 662 |
10.3 Mean-Square Stochastic Differential Equations | 665 |
10.4 Ergodicity [10-3] | 670 |
10.5 Karhunen–Loève Expansion [10-5] | 677 |
10.6 Representation of Bandlimited and Periodic Processes | 683 |
Bandlimited Processes | 683 |
Bandpass Random Processes | 686 |
WSS Periodic Processes | 689 |
Fourier Series for WSS Processes | 692 |
Summary | 694 |
Appendix: Integral Equations | 694 |
Existence Theorem | 695 |
Problems | 698 |
References | 711 |
11 Applications to Statistical Signal Processing | 712 |
11.1 Estimation of Random Variables and Vectors | 712 |
More on the Conditional Mean | 718 |
Orthogonality and Linear Estimation | 720 |
Some Properties of the Operator Ê | 728 |
11.2 Innovation Sequences and Kalman Filtering | 730 |
Predicting Gaussian Random Sequences | 734 |
Kalman Predictor and Filter | 736 |
Error-Covariance Equations | 741 |
11.3 Wiener Filters for Random Sequences | 745 |
Unrealizable Case (Smoothing) | 746 |
Causal Wiener Filter | 748 |
11.4 Expectation-Maximization Algorithm | 750 |
Log-likelihood for the Linear Transformation | 752 |
Summary of the E-M Algorithm | 754 |
E-M Algorithm for Exponential Probability Functions | 755 |
Application to Emission Tomography | 756 |
Log-likelihood Function of Complete Data | 758 |
E-step | 759 |
M-step | 760 |
11.5 Hidden Markov Models (HMM) | 761 |
Specification of an HMM | 763 |
Application to Speech Processing | 765 |
Efficient Computation of P[E\|M] with a Recursive Algorithm | 766 |
Viterbi Algorithm and the Most Likely State Sequence for the Observations | 768 |
11.6 Spectral Estimation | 771 |
The Periodogram | 772 |
Bartlett's Procedure–Averaging Periodograms | 774 |
Parametric Spectral Estimate | 779 |
Maximum Entropy Spectral Density | 781 |
11.7 Simulated Annealing | 784 |
Gibbs Sampler | 785 |
Noncausal Gauss–Markov Models | 786 |
Compound Markov Models | 790 |
Gibbs Line Sequence | 791 |
Summary | 795 |
Problems | 795 |
References | 800 |
Appendix A: Review of Relevant Mathematics | A-1 |
A.1 Basic Mathematics | A-1 |
Sequences | A-1 |
Convergence | A-2 |
Summations | A-3 |
Z-Transform | A-3 |
A.2 Continuous Mathematics | A-4 |
Definite and Indefinite Integrals | A-5 |
Differentiation of Integrals | A-6 |
Integration by Parts | A-7 |
Completing the Square | A-7 |
Double Integration | A-8 |
Functions | A-8 |
A.3 Residue Method for Inverse Fourier Transformation | A-10 |
Fact | A-11 |
Inverse Fourier Transform for psd of Random Sequence | A-13 |
A.4 Mathematical Induction | A-17 |
References | A-17 |
Appendix B: Gamma and Delta Functions | B-1 |
B.1 Gamma Function | B-1 |
B.2 Incomplete Gamma Function | B-2 |
B.3 Dirac Delta Function | B-2 |
References | B-5 |
Appendix C: Functional Transformations and Jacobians | C-1 |
C.1 Introduction | C-1 |
C.2 Jacobians for n = 2 | C-2 |
C.3 Jacobian for General n | C-4 |
Appendix D: Measure and Probability | D-1 |
D.1 Introduction and Basic Ideas | D-1 |
Measurable Mappings and Functions | D-3 |
D.2 Application of Measure Theory to Probability | D-3 |
Distribution Measure | D-4 |
Appendix E: Sampled Analog Waveforms and Discrete-time Signals | E-1 |
Appendix F: Independence of Sample Mean and Variance for Normal Random Variables | F-1 |
Appendix G: Tables of Cumulative Distribution Functions: the Normal, Student t, Chi-square, and F | G-1 |
Index | I-1 |