Note for Digital Signal Processing (DSP) by Dipankar Mahato

Text from page-2

Digital Signal Processing with Python Programming
Maurice Charbit

Text from page-3

First published 2017 in Great Britain and the United States by ISTE Ltd and John Wiley & Sons, Inc.

Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms and licenses issued by the CLA. Enquiries concerning reproduction outside these terms should be sent to the publishers at the undermentioned address:

ISTE Ltd
27-37 St George's Road
London SW19 4EU
UK

John Wiley & Sons, Inc.
111 River Street
Hoboken, NJ 07030
USA

www.iste.co.uk
www.wiley.com

© ISTE Ltd 2017

The rights of Maurice Charbit to be identified as the author of this work have been asserted by him in accordance with the Copyright, Designs and Patents Act 1988.

Library of Congress Control Number: 2016955620

British Library Cataloguing-in-Publication Data
A CIP record for this book is available from the British Library

ISBN 978-1-78630-126-0

Text from page-4

Contents

Preface . . . . ix
Notations and Abbreviations . . . . xi
A Few Functions of Python® . . . . xiii

Chapter 1. Useful Maths . . . . 1
1.1. Basic concepts on probability . . . . 1
1.2. Conditional expectation . . . . 10
1.3. Projection theorem . . . . 11
1.3.1. Conditional expectation . . . . 14
1.4. Gaussianity . . . . 14
1.4.1. Gaussian random variable . . . . 14
1.4.2. Gaussian random vectors . . . . 15
1.4.3. Gaussian conditional distribution . . . . 16
1.5. Random variable transformation . . . . 18
1.5.1. General expression . . . . 18
1.5.2. Law of the sum of two random variables . . . . 19
1.5.3. δ-method . . . . 20
1.6. Fundamental theorems of statistics . . . . 22
1.7. A few probability distributions . . . . 24

Chapter 2. Statistical Inferences . . . . 29
2.1. First step: visualizing data . . . . 29
2.1.1. Scatter plot . . . . 29
2.1.2. Histogram/boxplot . . . . 30
2.1.3. Q-Q plot . . . . 32
2.2. Reduction of dataset dimensionality . . . . 34

Text from page-5

2.2.1. PCA . . . . 34
2.2.2. LDA . . . . 36
2.3. Some vocabulary . . . . 40
2.3.1. Statistical inference . . . . 40
2.4. Statistical model . . . . 41
2.4.1. Notation . . . . 42
2.5. Hypothesis testing . . . . 43
2.5.1. Simple hypotheses . . . . 45
2.5.2. Generalized likelihood ratio test (GLRT) . . . . 50
2.5.3. χ2 goodness-of-fit test . . . . 57
2.6. Statistical estimation . . . . 58
2.6.1. General principles . . . . 58
2.6.2. Least squares method . . . . 62
2.6.3. Least squares method for the linear model . . . . 64
2.6.4. Method of moments . . . . 81
2.6.5. Maximum likelihood approach . . . . 84
2.6.6. Logistic regression . . . . 100
2.6.7. Non-parametric estimation of probability distribution . . . . 103
2.6.8. Bootstrap and others . . . . 107

Chapter 3. Inferences on HMM . . . . 113
3.1. Hidden Markov models (HMM) . . . . 113
3.2. Inferences on HMM . . . . 116
3.3. Filtering: general case . . . . 117
3.4. Gaussian linear case: Kalman algorithm . . . . 118
3.4.1. Kalman filter . . . . 118
3.4.2. RTS smoother . . . . 127
3.5. Discrete finite Markov case . . . . 129
3.5.1. Forward-backward formulas . . . . 130
3.5.2. Smoothing formula at one instant . . . . 133
3.5.3. Smoothing formula at two successive instants . . . . 134
3.5.4. HMM learning using the EM algorithm . . . . 135
3.5.5. The Viterbi algorithm . . . . 137

Chapter 4. Monte-Carlo Methods . . . . 141
4.1. Fundamental theorems . . . . 141
4.2. Stating the problem . . . . 141
4.3. Generating random variables . . . . 144
4.3.1. The cumulative function inversion method . . . . 144
4.3.2. The variable transformation method . . . . 147
4.3.3. Acceptance-rejection method . . . . 149
4.3.4. Sequential methods . . . . 151
4.4. Variance reduction . . . . 156
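
Among the listed topics, section 4.3.3 names the acceptance-rejection method for generating random variables. Since the book pairs its material with Python, a minimal illustrative sketch of that standard algorithm is given below. It is not taken from the book; the function names, the uniform proposal and the Beta(2, 2) target are assumptions chosen only for this example.

import numpy as np

def accept_reject(target_pdf, draw_proposal, proposal_pdf, M, n):
    # Acceptance-rejection: draw candidates x from the proposal and
    # accept each with probability target_pdf(x) / (M * proposal_pdf(x)).
    # Valid whenever target_pdf(x) <= M * proposal_pdf(x) for all x.
    samples = []
    while len(samples) < n:
        x = draw_proposal()
        u = np.random.uniform()
        if u * M * proposal_pdf(x) <= target_pdf(x):
            samples.append(x)
    return np.array(samples)

# Hypothetical example: Beta(2, 2) target, f(x) = 6x(1 - x) on [0, 1],
# with a uniform proposal g(x) = 1 and bound M = 1.5 (the maximum of f).
f = lambda x: 6.0 * x * (1.0 - x)
out = accept_reject(f, np.random.uniform, lambda x: 1.0, M=1.5, n=10000)
print(out.mean())  # should be close to 0.5, the Beta(2, 2) mean

With the uniform proposal the average acceptance probability is 1/M = 2/3, so roughly 1.5 candidate draws are needed per accepted sample; a tighter proposal would reduce that cost.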
