Probability, Statistics, and Random Processes For Electrical Engineering, 3rd edition

eText
$54.99 (digital book that you can read instantly)


Instant access: $54.99
Probability, Statistics, and Random Processes For Electrical Engineering (Subscription)
ISBN-13: 9780133002577
eText


Print
$207.99 (hardcover, paperback, or loose-leaf)


Free delivery. Was $259.99, now $207.99
Probability, Statistics, and Random Processes For Electrical Engineering
ISBN-13: 9780131471221
Paperback

Overview
This is the standard textbook for courses on probability and statistics, now substantially updated. While helping students to develop their problem-solving skills, the author motivates them with practical applications from various areas of ECE that demonstrate the relevance of probability theory to engineering practice. Included are chapter overviews, summaries, checklists of important terms, annotated references, and a wide selection of fully worked-out real-world examples. In this edition, the Computer Methods sections have been updated and substantially enhanced, and new problems have been added.
Table of contents
1. Probability Models in Electrical and Computer Engineering.
2. Basic Concepts of Probability Theory.
3. Random Variables.
4. Multiple Random Variables.
5. Sums of Random Variables and Long-Term Averages.
6. Random Processes.
7. Analysis and Processing of Random Signals.
8. Markov Chains.
9. Introduction to Queueing Theory.
Appendix A. Mathematical Tables.
Appendix B. Tables of Fourier Transforms.
Appendix C. Computer Programs for Generating Random Variables.
Answers to Selected Problems.
Index.
1. Probability Models in Electrical and Computer Engineering.
Mathematical models as tools in analysis and design. Deterministic models. Probability models.
Statistical regularity. Properties of relative frequency. The axiomatic approach to a theory of probability. Building a probability model.
A detailed example: a packet voice transmission system. Other examples.
Communication over unreliable channels. Processing of random signals. Resource sharing systems. Reliability of systems.
Overview of book. Summary. Problems.
2. Basic Concepts of Probability Theory.
Specifying random experiments.
The sample space. Events. Set operations.
The axioms of probability.
Discrete sample spaces. Continuous sample spaces.
Computing probabilities using counting methods.
Sampling with replacement and with ordering. Sampling without replacement and with ordering. Permutations of n distinct objects. Sampling without replacement and without ordering. Sampling with replacement and without ordering.
Conditional probability.
Bayes' Rule.
Independence of events. Sequential experiments.
Sequences of independent experiments. The binomial probability law. The multinomial probability law. The geometric probability law. Sequences of dependent experiments.
A computer method for synthesizing randomness: random number generators. Summary. Problems.
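The chapter closes with a computer method for synthesizing randomness via random number generators. As a minimal illustration of the idea (this is not the book's own code), a linear congruential generator in Python, using the well-known Numerical Recipes constants:

```python
class LCG:
    """A minimal linear congruential generator (LCG) sketch:
    x_{n+1} = (a * x_n + c) mod m, scaled to [0, 1).
    Constants are the well-known Numerical Recipes parameters;
    the book's generator may differ."""

    def __init__(self, seed=1):
        self.m = 2**32
        self.a = 1664525
        self.c = 1013904223
        self.state = seed

    def next(self):
        # Advance the state and map it to the unit interval.
        self.state = (self.a * self.state + self.c) % self.m
        return self.state / self.m

gen = LCG(seed=42)
samples = [gen.next() for _ in range(1000)]
```

Because the sequence is fully determined by the seed, the same seed reproduces the same "random" stream, which is what makes such generators useful for repeatable simulations.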
3. Random Variables.
The notion of a random variable. The cumulative distribution function.
The three types of random variables.
The probability density function.
Conditional cdf's and pdf's.
Some important random variables.
Discrete random variables. Continuous random variables.
Functions of a random variable. The expected value of random variables.
The expected value of X. The expected value of Y = g(X). Variance of X.
The Markov and Chebyshev inequalities. Testing the fit of a distribution to data. Transform methods.
The characteristic function. The probability generating function. The Laplace transform of the pdf.
Basic reliability calculations.
The failure rate function. Reliability of systems.
Computer methods for generating random variables.
The transformation method. The rejection method. Generation of functions of a random variable. Generating mixtures of random variables.
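The transformation method listed above can be sketched as follows; this is a minimal Python illustration of the inverse-CDF idea applied to an exponential random variable, not code from the book's Appendix C:

```python
import math
import random

def exponential_inverse_transform(rate, n, seed=0):
    """Transformation (inverse-CDF) method sketch: if U ~ Uniform(0,1),
    then X = -ln(1 - U) / rate has an exponential distribution with the
    given rate, since the exponential CDF F(x) = 1 - exp(-rate*x) is
    inverted as F^{-1}(u) = -ln(1 - u) / rate."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / rate for _ in range(n)]

samples = exponential_inverse_transform(rate=2.0, n=10000)
```

The same recipe works for any distribution whose CDF can be inverted in closed form; the rejection method mentioned above handles cases where it cannot.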
Entropy.
The entropy of a random variable. Entropy as a measure of information. The method of maximum entropy.
Summary. Problems.
4. Multiple Random Variables.
Vector random variables.
Events and probabilities. Independence.
Pairs of random variables.
Pairs of discrete random variables. The joint cdf of X and Y. The joint pdf of two jointly continuous random variables. Random variables that differ in type.
Independence of two random variables. Conditional probability and conditional expectation.
Conditional probability. Conditional expectation.
Multiple random variables.
Joint distributions. Independence.
Functions of several random variables.
One function of several random variables. Transformation of random vectors. pdf of linear transformations. pdf of general transformations.
Expected value of functions of random variables.
The correlation and covariance of two random variables. Joint characteristic function.
Jointly Gaussian random variables.
n jointly Gaussian random variables. Linear transformation of Gaussian random variables. Joint characteristic function of Gaussian random variables.
Mean square estimation.
Linear prediction.
Generating correlated vector random variables.
Generating vectors of random variables with specified covariances. Generating vectors of jointly Gaussian random variables.
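Generating vectors of jointly Gaussian random variables with a specified covariance is typically done by transforming independent standard Gaussians with a Cholesky factor of the covariance matrix. A minimal two-dimensional Python sketch (the function name and structure are illustrative, not the book's):

```python
import math
import random

def correlated_gaussian_pair(rho, n, seed=0):
    """Generate n pairs (X, Y) of zero-mean, unit-variance jointly
    Gaussian samples with correlation rho, using the 2x2 Cholesky
    factor of the covariance matrix [[1, rho], [rho, 1]]:
        X = Z1
        Y = rho*Z1 + sqrt(1 - rho^2)*Z2
    where Z1, Z2 are independent standard Gaussians."""
    rng = random.Random(seed)
    s = math.sqrt(1.0 - rho * rho)
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rng.gauss(0.0, 1.0)
        pairs.append((z1, rho * z1 + s * z2))
    return pairs

pairs = correlated_gaussian_pair(rho=0.8, n=20000)
```

For higher dimensions the same idea applies with a full Cholesky factorization of the covariance matrix.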
Summary. Problems.
5. Sums of Random Variables and Long-Term Averages.
Sums of random variables.
Mean and variance of sums of random variables. pdf of sums of independent random variables. Sum of a random number of random variables.
The sample mean and the laws of large numbers. The central limit theorem.
Gaussian approximation for binomial probabilities. Proof of the central limit theorem.
Confidence intervals.
Case 1: Xj's Gaussian; unknown mean and known variance. Case 2: Xj's Gaussian; mean and variance unknown. Case 3: Xj's non-Gaussian; mean and variance unknown.
Convergence of sequences of random variables. Long-term arrival rates and associated averages. Long-term time averages.
A computer method for evaluating the distribution of a random variable using the discrete Fourier transform.
Discrete random variables. Continuous random variables.
Summary. Problems. Appendix: subroutine FFT(A,M,N).
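The chapter's computer method evaluates the distribution of a sum of independent random variables via the discrete Fourier transform. The underlying operation is convolution of pmfs, shown here directly in a small Python sketch; the book's appendix FFT subroutine computes the same convolution, only faster:

```python
def convolve_pmfs(p, q):
    """pmf of X + Y for independent discrete X, Y supported on 0, 1, 2, ...,
    given as lists with p[k] = P(X = k) and q[k] = P(Y = k).
    Direct convolution: P(X + Y = n) = sum_k P(X = k) * P(Y = n - k)."""
    r = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

# pmf of the sum of two fair dice (faces 1..6, so support is offset by 2)
die = [1.0 / 6] * 6
total = convolve_pmfs(die, die)  # total[k] = P(sum = k + 2)
```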
6. Random Processes.
Definition of a random process. Specifying a random process.
Joint distributions of time samples. The mean, autocorrelation, and autocovariance functions. Gaussian random processes. Multiple random processes.
Examples of discrete-time random processes.
iid random processes. Sum processes; the binomial counting and random walk processes.
Examples of continuous-time random processes.
Poisson process. Random telegraph signal and other processes derived from the Poisson Process. Wiener process and Brownian motion.
Stationary random processes.
Wide-sense stationary random processes. Wide-sense stationary Gaussian random processes. Cyclostationary random processes.
Continuity, derivatives, and integrals of random processes.
Mean square continuity. Mean square derivatives. Mean square integrals. Response of a linear system to random input.
Time averages of random processes and ergodic theorems. Fourier series and Karhunen-Loève expansion.
Karhunen-Loève expansion.
Summary. Problems.
7. Analysis and Processing of Random Signals.
Power spectral density.
Continuous-time random processes. Discrete-time random processes. Power spectral density as a time average.
Response of linear systems to random signals.
Continuous-time systems. Discrete-time systems.
Amplitude modulation by random signals. Optimum linear systems.
The orthogonality condition. Prediction. Estimation using the entire realization of the observed process. Estimation using causal filters.
The Kalman filter. Estimating the power spectral density.
Variance of periodogram estimate. Smoothing of periodogram estimate.
Summary. Problems.
8. Markov Chains.
Markov processes. Discretetime Markov chains.
The n-step transition probabilities. The state probabilities. Steady state probabilities.
Continuoustime Markov chains.
State occupancy times. Transition rates and time-dependent state probabilities. Steady state probabilities and global balance equations.
Classes of states, recurrence properties, and limiting probabilities.
Classes of states. Recurrence properties. Limiting probabilities. Limiting probabilities for continuoustime Markov chains.
Time-reversed Markov chains.
Time-reversible Markov chains. Time-reversible continuous-time Markov chains.
Summary. Problems.
9. Introduction to Queueing Theory.
The elements of a queueing system. Little's formula. The M/M/1 queue.
Distribution of number in the system. Delay distribution in the M/M/1 system and arriving customer's distribution. The M/M/1 system with finite capacity.
Multiserver systems: M/M/c, M/M/c/c, and M/M/∞.
Distribution of number in the M/M/c system. Waiting time distribution for M/M/c. The M/M/c/c queueing system. The M/M/∞ queueing system.
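The blocking probability of the M/M/c/c system listed above is given by the Erlang B formula. A minimal Python sketch using the standard numerically stable recursion (illustrative, not code from the book):

```python
def erlang_b(c, a):
    """Blocking probability of an M/M/c/c system with offered load
    a = lambda/mu, via the standard stable recursion
        B(0, a) = 1,
        B(k, a) = a * B(k-1, a) / (k + a * B(k-1, a)),
    which avoids the large factorials of the closed-form
    (a^c / c!) / sum_{k=0}^{c} a^k / k!."""
    b = 1.0
    for k in range(1, c + 1):
        b = a * b / (k + a * b)
    return b

# Blocking probability with 5 servers and an offered load of 3 Erlangs.
blocking = erlang_b(c=5, a=3.0)
```

The recursion is the usual way this formula is evaluated in practice, since the factorials in the closed form overflow quickly for large c.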
Finite-source queueing systems.
Arriving customer's distribution.
M/G/1 queueing systems.
The residual service time. Mean delay in M/G/1 systems. Mean delay in M/G/1 systems with priority service discipline.
M/G/1 analysis using embedded Markov chains.
The embedded Markov chains. The number of customers in an M/G/1 system. Delay and waiting time distribution in an M/G/1 system.
Burke's theorem: departures from M/M/c systems. Proof of Burke's theorem using time reversibility. Networks of queues: Jackson's theorem.
Open networks of queues. Proof of Jackson's theorem. Closed networks of queues. Mean value analysis. Proof of the arrival theorem.
Summary. Problems.
Appendix A. Mathematical Tables.
Appendix B. Tables of Fourier Transforms.
Appendix C. Computer Programs for Generating Random Variables.
Answers to Selected Problems.
Index.
Published by Pearson (December 28, 2007). Copyright © 2008.