Adaptive High-Resolution Sensor Waveform Design for Tracking


Book Description

Recent innovations in modern radar for designing transmitted waveforms, coupled with new algorithms for adaptively selecting the waveform parameters at each time step, have resulted in improvements in tracking performance. Of particular interest are waveforms that can be mathematically designed to have reduced ambiguity function sidelobes, as their use can lead to an increase in the target state estimation accuracy. Moreover, adaptively positioning the sidelobes can reveal weak target returns by reducing interference from stronger targets. The manuscript provides an overview of recent advances in the design of multicarrier phase-coded waveforms based on Björck constant-amplitude zero-autocorrelation (CAZAC) sequences for use in an adaptive waveform selection scheme for multiple target tracking. The adaptive waveform design is formulated using sequential Monte Carlo techniques that need to be matched to the high resolution measurements. The work will be of interest to both practitioners and researchers in radar as well as to researchers in other applications where high resolution measurements can have significant benefits. Table of Contents: Introduction / Radar Waveform Design / Target Tracking with a Particle Filter / Single Target Tracking with LFM and CAZAC Sequences / Multiple Target Tracking / Conclusions
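For readers who want to experiment, the defining CAZAC properties are easy to check numerically. The sketch below uses the related Zadoff-Chu construction as an illustrative stand-in (it is not the Björck design studied in the book) to verify constant amplitude and zero periodic autocorrelation:

```python
import cmath

def zadoff_chu(N, u=1):
    """Zadoff-Chu sequence of odd length N with root u, gcd(u, N) = 1.

    Like the Bjorck sequences, this is a CAZAC family: constant amplitude,
    zero periodic autocorrelation at every nonzero lag."""
    return [cmath.exp(-1j * cmath.pi * u * n * (n + 1) / N) for n in range(N)]

def periodic_autocorrelation(x, lag):
    N = len(x)
    return sum(x[n] * x[(n + lag) % N].conjugate() for n in range(N)) / N

seq = zadoff_chu(13)
# CA: every sample has unit magnitude
assert all(abs(abs(s) - 1.0) < 1e-12 for s in seq)
# ZAC: the periodic autocorrelation vanishes at all nonzero lags
assert all(abs(periodic_autocorrelation(seq, k)) < 1e-9 for k in range(1, 13))
```

Sharp autocorrelation translates directly into a thumbtack-like ambiguity function, which is what makes such sequences attractive for the waveform designs discussed above.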




Algorithms and Software for Predictive and Perceptual Modeling of Speech


Book Description

From the early pulse code modulation-based coders to some of the recent multi-rate wideband speech coding standards, the area of speech coding has made several significant strides with the objective of attaining high quality of speech at the lowest possible bit rate. This book presents some of the recent advances in linear prediction (LP)-based speech analysis that employ perceptual models for narrow- and wide-band speech coding. The LP analysis-synthesis framework has been successful for speech coding because it fits the source-system paradigm for speech synthesis well. Limitations associated with the conventional LP have been studied extensively, and several extensions to LP-based analysis-synthesis have been proposed, e.g., the discrete all-pole modeling, the perceptual LP, the warped LP, the LP with modified filter structures, the IIR-based pure LP, all-pole modeling using the weighted-sum of LSP polynomials, the LP for low frequency emphasis, and the cascade-form LP. These extensions can be classified as algorithms that either attempt to improve the LP spectral envelope fitting performance or embed perceptual models in the LP. The first half of the book reviews some of the recent developments in predictive modeling of speech with the help of MATLAB™ simulation examples. Advantages of integrating perceptual models in low bit rate speech coding depend on the accuracy with which these models mimic human performance and, more importantly, on the achievable "coding gains" and "computational overhead" associated with these physiological models. Methods that exploit the masking properties of the human ear in speech coding standards, even today, are largely based on concepts introduced by Schroeder and Atal in 1979. For example, a simple approach employed in speech coding standards is to use a perceptual weighting filter to shape the quantization noise according to the masking properties of the human ear. 
The second half of the book reviews some of the recent developments in perceptual modeling of speech (e.g., masking threshold, psychoacoustic models, auditory excitation pattern, and loudness) with the help of MATLAB™ simulations. Supplementary material including MATLAB™ programs and simulation examples presented in this book can also be accessed here. Table of Contents: Introduction / Predictive Modeling of Speech / Perceptual Modeling of Speech
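To make the LP analysis and perceptual weighting ideas concrete, here is a minimal sketch (not drawn from the book's MATLAB material) of the Levinson-Durbin recursion and a CELP-style weighting filter W(z) = A(z/g1)/A(z/g2); the function names, bandwidth-expansion factors, and the AR(1) test values are illustrative assumptions:

```python
def levinson_durbin(r, order):
    """Solve the LP normal equations given autocorrelation values r[0..order]."""
    a = [1.0] + [0.0] * order   # prediction polynomial A(z), a[0] = 1
    err = r[0]                  # prediction error energy
    for i in range(1, order + 1):
        k = -sum(a[j] * r[i - j] for j in range(i)) / err  # reflection coefficient
        a_prev = a[:]
        for j in range(1, i + 1):
            a[j] = a_prev[j] + k * a_prev[i - j]
        err *= 1.0 - k * k
    return a, err

def perceptual_weighting(a, gamma_num=0.9, gamma_den=0.6):
    """W(z) = A(z/g1)/A(z/g2): bandwidth expansion scales a[k] by gamma**k."""
    num = [c * gamma_num ** k for k, c in enumerate(a)]
    den = [c * gamma_den ** k for k, c in enumerate(a)]
    return num, den

# Autocorrelation of an ideal AR(1) source with coefficient 0.9: r[k] = 0.9**k
a, err = levinson_durbin([1.0, 0.9, 0.81], 2)
assert abs(a[1] + 0.9) < 1e-12 and abs(a[2]) < 1e-12  # recovers the AR(1) model
num, den = perceptual_weighting(a)
```

The weighting filter is what shapes the quantization noise toward spectral regions where the ear masks it, the idea traced above to Schroeder and Atal.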




Analysis of the MPEG-1 Layer III (MP3) Algorithm using MATLAB


Book Description

The MPEG-1 Layer III (MP3) algorithm is one of the most successful audio formats for consumer audio storage and for transfer and playback of music on digital audio players. The MP3 compression standard, along with the AAC (Advanced Audio Coding) algorithm, is associated with the most successful music players of the last decade. This book describes the fundamentals and the MATLAB implementation details of the MP3 algorithm. Several of the more involved processes in MP3 are supported by demonstrations using MATLAB software. The book presents the theoretical concepts and algorithms used in the MP3 standard. The implementation details and simulations with MATLAB complement the theoretical principles. The extensive list of references enables the reader to perform a more detailed study on specific aspects of the algorithm and gain exposure to advancements in perceptual coding. Table of Contents: Introduction / Analysis Subband Filter Bank / Psychoacoustic Model II / MDCT / Bit Allocation, Quantization and Coding / Decoder
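A minimal MDCT/IMDCT sketch (illustrative only, not the standard's optimized filter bank) shows the time-domain aliasing cancellation that makes the 50%-overlap transform invertible; the sine window and block length below are assumptions for the demo:

```python
import math

def mdct(x):
    """MDCT of a 2N-sample block to N coefficients."""
    N = len(x) // 2
    return [sum(x[n] * math.cos(math.pi / N * (n + 0.5 + N / 2) * (k + 0.5))
                for n in range(2 * N)) for k in range(N)]

def imdct(X):
    """Inverse MDCT of N coefficients back to 2N (time-aliased) samples."""
    N = len(X)
    return [2.0 / N * sum(X[k] * math.cos(math.pi / N * (n + 0.5 + N / 2) * (k + 0.5))
                          for k in range(N)) for n in range(2 * N)]

N = 8
# Sine window satisfies the Princen-Bradley condition w[n]**2 + w[n+N]**2 == 1
w = [math.sin(math.pi / (2 * N) * (n + 0.5)) for n in range(2 * N)]
x = [math.sin(0.3 * n) for n in range(3 * N)]

# Two 50%-overlapping windowed blocks, analyzed and resynthesized
blocks = [x[0:2 * N], x[N:3 * N]]
synth = [imdct(mdct([w[n] * b[n] for n in range(2 * N)])) for b in blocks]
# Overlap-add: the aliasing in the second half of block 0 cancels the
# aliasing in the first half of block 1, reconstructing x[N:2N] exactly
recon = [w[n + N] * synth[0][n + N] + w[n] * synth[1][n] for n in range(N)]
assert all(abs(recon[n] - x[N + n]) < 1e-9 for n in range(N))
```

This cancellation property is why the MDCT is critically sampled despite its overlapping windows, a point the book develops in the MDCT chapter.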




A Survey of Blur Detection and Sharpness Assessment Methods


Book Description

Blurring is an almost omnipresent effect in natural images. The main causes of blurring in images include: (a) defocus blur, caused by the existence of objects at different depths within the scene; (b) motion blur, due to motion either of objects in the scene or of the imaging device; and (c) blurring due to atmospheric turbulence. Automatic estimation of spatially varying sharpness/blurriness has several applications, including depth estimation, image quality assessment, information retrieval, and image restoration. There are some cases in which blur is intentionally introduced or enhanced; for example, in artistic photography and cinematography, blur is intentionally introduced to emphasize a certain image region. Bokeh is a technique that introduces defocus blur for aesthetic purposes. Additionally, in trending applications such as augmented and virtual reality, blur is often introduced in order to provide or enhance depth perception. Digital images and videos are produced every day in astonishing amounts, and the demand for higher quality is constantly rising, which creates a need for advanced image quality assessment. Additionally, image quality assessment is important for the performance of image processing algorithms. It has been determined that image noise and artifacts can affect the performance of algorithms such as face detection and recognition, image saliency detection, and video target tracking. Therefore, image quality assessment (IQA) has been a topic of intense research in the fields of image processing and computer vision. Since humans are the end consumers of multimedia signals, subjective quality metrics provide the most reliable results; however, their cost in addition to time requirements makes them unfeasible for practical applications. Thus, objective quality metrics are usually preferred.
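One widely used objective sharpness proxy, shown here as an illustrative sketch rather than a method taken from this survey, is the variance of the image's Laplacian response, which drops as blur increases:

```python
def laplacian_variance(img):
    """Sharpness proxy: variance of the 4-neighbor Laplacian over the interior."""
    h, w = len(img), len(img[0])
    resp = [img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1] - 4 * img[y][x]
            for y in range(1, h - 1) for x in range(1, w - 1)]
    mean = sum(resp) / len(resp)
    return sum((r - mean) ** 2 for r in resp) / len(resp)

def box_blur(img):
    """3x3 box blur, a crude stand-in for defocus blur."""
    h, w = len(img), len(img[0])
    return [[sum(img[y+dy][x+dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
             for x in range(1, w - 1)] for y in range(1, h - 1)]

# A sharp 8x8 checkerboard scores far higher than its blurred version
sharp = [[255.0 * ((x + y) % 2) for x in range(8)] for y in range(8)]
assert laplacian_variance(sharp) > laplacian_variance(box_blur(sharp))
```

Applying such a score over local windows rather than the whole image yields the spatially varying sharpness maps that the applications above (depth estimation, IQA, restoration) rely on.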




Theory and Applications of Gaussian Quadrature Methods


Book Description

Gaussian quadrature is a powerful technique for numerical integration that falls under the broad category of spectral methods. The purpose of this work is to provide an introduction to the theory and practice of Gaussian quadrature. We study the approximation theory of trigonometric and orthogonal polynomials and related functions and examine the analytical framework of Gaussian quadrature. We discuss Gaussian quadrature for bandlimited functions, a topic inspired by some recent developments in the analysis of prolate spheroidal wave functions. Algorithms for the computation of the quadrature nodes and weights are described. Several applications of Gaussian quadrature are given, ranging from the evaluation of special functions to pseudospectral methods for solving differential equations. Software realization of select algorithms is provided. Table of Contents: Introduction / Approximating with Polynomials and Related Functions / Gaussian Quadrature / Applications / Links to Mathematical Software
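As a small companion to the text, the following sketch computes Gauss-Legendre nodes and weights by Newton iteration on the roots of the Legendre polynomial (one standard approach; the initial guesses and tolerances are assumptions) and checks the rule's exactness for polynomials of degree up to 2n-1:

```python
import math

def legendre(n, x):
    """Legendre polynomial P_n(x) and its derivative via the three-term recurrence."""
    p0, p1 = 1.0, x
    for k in range(2, n + 1):
        p0, p1 = p1, ((2 * k - 1) * x * p1 - (k - 1) * p0) / k
    dp = n * (x * p1 - p0) / (x * x - 1.0)   # valid for the interior nodes
    return p1, dp

def gauss_legendre(n):
    """Nodes and weights of the n-point Gauss-Legendre rule on [-1, 1]."""
    nodes, weights = [], []
    for i in range(1, n + 1):
        x = math.cos(math.pi * (i - 0.25) / (n + 0.5))  # classic initial guess
        for _ in range(100):                            # Newton iteration on P_n
            p, dp = legendre(n, x)
            dx = p / dp
            x -= dx
            if abs(dx) < 1e-14:
                break
        _, dp = legendre(n, x)
        nodes.append(x)
        weights.append(2.0 / ((1.0 - x * x) * dp * dp))
    return nodes, weights

nodes, weights = gauss_legendre(3)
# An n-point rule is exact for degree <= 2n - 1: integral of x^4 on [-1,1] is 2/5
approx = sum(w * x ** 4 for x, w in zip(nodes, weights))
assert abs(approx - 0.4) < 1e-12
assert abs(sum(weights) - 2.0) < 1e-12   # weights integrate the constant 1
```

For large n, production codes use the Golub-Welsch eigenvalue method or asymptotic expansions instead of Newton iteration, as the book's algorithms chapter discusses.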




Sensor Analysis for the Internet of Things


Book Description

While it may be attractive to view sensors as simple transducers that convert physical quantities into electrical signals, the truth of the matter is more complex. The engineer should have a proper understanding of the physics involved in the conversion process, including interactions with other measurable quantities. A deep understanding of these interactions can be leveraged to apply sensor fusion techniques to minimize noise and/or extract additional information from sensor signals. Advances in microcontroller and MEMS manufacturing, along with improved internet connectivity, have enabled cost-effective wearable and Internet of Things sensor applications. At the same time, machine learning techniques have gone mainstream, so that those same applications can now be more intelligent than ever before. This book explores these topics in the context of a small set of sensor types. We provide some basic understanding of sensor operation for accelerometers, magnetometers, gyroscopes, and pressure sensors. We show how information from these can be fused to provide estimates of orientation. Then we explore the topics of machine learning and sensor data analytics.
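A toy version of the orientation fusion described above, assuming idealized noise-free sensors and hypothetical parameter choices, is a complementary filter that blends integrated gyroscope rate with the accelerometer's absolute tilt estimate:

```python
import math

def accel_pitch(ax, ay, az):
    """Pitch angle (rad) inferred from a static accelerometer reading of gravity."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_filter(pitch, gyro_rate, accel, dt, alpha=0.98):
    """Blend gyro integration (smooth but drifting) with the accel tilt (noisy
    but absolute); alpha sets the crossover between the two sources."""
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch(*accel)

# Device held still at a 30-degree pitch: gravity splits between x and z (in g)
true_pitch = math.radians(30.0)
accel = (-math.sin(true_pitch), 0.0, math.cos(true_pitch))

pitch = 0.0
for _ in range(500):   # gyro reports zero rate; the filter converges to the tilt
    pitch = complementary_filter(pitch, 0.0, accel, dt=0.01)
assert abs(pitch - true_pitch) < 1e-3
```

In practice the accelerometer term is trusted only when the device is near-static, and magnetometer data is fused in the same way to resolve heading, which is where the interactions between sensor types become important.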




Engineer Your Software!


Book Description

Software development is hard, but creating good software is even harder, especially if your main job is something other than developing software. Engineer Your Software! opens the world of software engineering, weaving engineering techniques and measurement into software development activities. Focusing on architecture and design, Engineer Your Software! claims that no matter how you write software, design and engineering matter and can be applied at any point in the process. Engineer Your Software! provides advice, patterns, design criteria, measures, and techniques that will help you get it right the first time. Engineer Your Software! also provides solutions to many vexing issues that developers run into time and time again. Developed over 40 years of creating large software applications, these lessons are sprinkled with real-world examples from actual software projects. Along the way, the author describes common design principles and design patterns that can make life a lot easier for anyone tasked with writing anything from a simple script to the largest enterprise-scale systems.




Sparse Representations for Radar with MATLAB Examples


Book Description

Although the field of sparse representations is relatively new, research activities in academic and industrial research labs are already producing encouraging results. The sparse signal or parameter model has motivated several researchers and practitioners to explore high complexity/wide bandwidth applications such as Digital TV, MRI processing, and certain defense applications. The potential signal processing advancements in this area may influence radar technologies. This book presents the basic mathematical concepts along with a number of useful MATLAB® examples to emphasize the practical implementations both inside and outside the radar field. Table of Contents: Radar Systems: A Signal Processing Perspective / Introduction to Sparse Representations / Dimensionality Reduction / Radar Signal Processing Fundamentals / Sparse Representations in Radar
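To give a flavor of sparse representations outside the book's MATLAB examples, here is a minimal matching pursuit sketch on a toy dictionary; the dictionary and signal are illustrative assumptions, not material from the book:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, atoms, n_iter=2):
    """Greedy sparse decomposition: repeatedly pick the atom most
    correlated with the residual (atoms assumed unit-norm)."""
    residual = list(signal)
    coeffs = {}
    for _ in range(n_iter):
        scores = [dot(residual, a) for a in atoms]
        best = max(range(len(atoms)), key=lambda i: abs(scores[i]))
        coeffs[best] = coeffs.get(best, 0.0) + scores[best]
        residual = [r - scores[best] * a for r, a in zip(residual, atoms[best])]
    return coeffs, residual

# Toy unit-norm dictionary: two spike atoms and one flat atom
atoms = [[1.0, 0.0, 0.0, 0.0],
         [0.0, 1.0, 0.0, 0.0],
         [0.5, 0.5, 0.5, 0.5]]
signal = [3.0, 0.0, 0.0, 0.0]          # exactly 3 * atoms[0]: a 1-sparse signal
coeffs, residual = matching_pursuit(signal, atoms)
assert abs(coeffs[0] - 3.0) < 1e-12    # the single active atom is recovered
assert all(abs(r) < 1e-12 for r in residual)
```

Greedy pursuits like this (and their orthogonal variants) are among the standard recovery algorithms the sparse representation literature builds on, alongside convex relaxations such as basis pursuit.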




Latency and Distortion of Electromagnetic Trackers for Augmented Reality Systems


Book Description

Augmented reality (AR) systems are often used to superimpose virtual objects or information on a scene to improve situational awareness. Delays in the display system or inaccurate registration of objects destroy the sense of immersion a user experiences when using AR systems. AC electromagnetic trackers are ideal for these applications when combined with head orientation prediction to compensate for display system delays. Unfortunately, without expensive calibration techniques these trackers do not perform well in environments that contain conductive or ferrous materials, due to magnetic field distortion. In our work we focus on both the prediction and distortion compensation aspects of this application, developing a "small footprint" predictive filter for display lag compensation and a simplified calibration system for AC magnetic trackers. In the first phase of our study we presented a novel method of tracking angular head velocity from quaternion orientation using an Extended Kalman Filter in both single model (DQEKF) and multiple model (MMDQ) implementations. In the second phase of our work we have developed a new method of mapping the magnetic field generated by the tracker without high precision measurement equipment. This method uses simple fixtures with multiple sensors in a rigid geometry to collect magnetic field data in the tracking volume. We have developed a new algorithm to process the collected data and generate a map of the magnetic field distortion that can be used to compensate distorted measurement data. Table of Contents: List of Tables / Preface / Acknowledgments / Delta Quaternion Extended Kalman Filter / Multiple Model Delta Quaternion Filter / Interpolation Volume Calibration / Conclusion / References / Authors' Biographies
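The delta-quaternion idea, recovering angular velocity from consecutive orientation samples, can be illustrated without the Kalman filtering machinery. The sketch below is a simplified, assumption-laden demo of that single step, not the DQEKF itself:

```python
import math

def quat_mul(p, q):
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

def quat_conj(q):
    return (q[0], -q[1], -q[2], -q[3])

def angular_velocity(q_prev, q_curr, dt):
    """Rotation rate from the delta quaternion dq = q_prev^-1 * q_curr."""
    dw, dx, dy, dz = quat_mul(quat_conj(q_prev), q_curr)
    norm_v = math.sqrt(dx*dx + dy*dy + dz*dz)
    if norm_v < 1e-12:
        return (0.0, 0.0, 0.0)
    angle = 2.0 * math.atan2(norm_v, dw)            # rotation angle over dt
    return tuple(angle / dt * v / norm_v for v in (dx, dy, dz))

# Head turning about the z axis at 2 rad/s, sampled at 100 Hz
dt, rate = 0.01, 2.0
q0 = (1.0, 0.0, 0.0, 0.0)
half = rate * dt / 2.0
q1 = (math.cos(half), 0.0, 0.0, math.sin(half))
wx, wy, wz = angular_velocity(q0, q1, dt)
assert abs(wz - rate) < 1e-9 and abs(wx) < 1e-12 and abs(wy) < 1e-12
```

In the actual DQEKF, an estimate of this rate is filtered and propagated forward to predict head orientation a display-latency interval ahead.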




Bandwidth Extension of Speech Using Perceptual Criteria


Book Description

Bandwidth extension of speech is used in the International Telecommunication Union G.729.1 standard in which the narrowband bitstream is combined with quantized high-band parameters. Although this system produces high-quality wideband speech, the additional bits used to represent the high band can be further reduced. In addition to the algorithm used in the G.729.1 standard, bandwidth extension methods based on spectrum prediction have also been proposed. Although these algorithms do not require additional bits, they perform poorly when the correlation between the low and the high band is weak. In this book, two wideband speech coding algorithms that rely on bandwidth extension are developed. The algorithms operate as wrappers around existing narrowband compression schemes. More specifically, in these algorithms, the low band is encoded using an existing toll-quality narrowband system, whereas the high band is generated using the proposed extension techniques. The first method relies only on transmitted high-band information to generate the wideband speech. The second algorithm uses a constrained minimum mean square error estimator that combines transmitted high-band envelope information with a predictive scheme driven by narrowband features. Both algorithms make use of novel perceptual models based on loudness that determine optimum quantization strategies for wideband recovery and synthesis. Objective and subjective evaluations reveal that the proposed system performs at a lower average bit rate while improving speech quality when compared to other similar algorithms.
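Simple bandwidth-extension schemes often generate raw high-band excitation by spectral folding: zero-insertion upsampling mirrors the low-band spectrum about the old Nyquist frequency, after which the image is shaped by an envelope. The sketch below (an illustrative aside, not the book's constrained MMSE estimator) demonstrates the folding effect; the signal lengths and tone frequency are assumptions:

```python
import cmath
import math

def spectral_fold(x):
    """Upsample by zero insertion: the spectrum of x is mirrored about the
    old Nyquist frequency, creating high-band content to be shaped later."""
    y = []
    for s in x:
        y.extend((s, 0.0))
    return y

def dft_mag(x, k):
    """Magnitude of DFT bin k (direct evaluation, fine for a demo)."""
    N = len(x)
    return abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N)))

N = 32
tone = [math.cos(2 * math.pi * 4 * n / N) for n in range(N)]   # low-band tone, bin 4
y = spectral_fold(tone)                                         # length 64

# The original component survives at bin 4 and a mirror image appears at
# bin 28, i.e., folded about the old Nyquist (bin 16 of the 64-point DFT)
assert dft_mag(y, 4) > 10.0 and dft_mag(y, 28) > 10.0
assert dft_mag(y, 10) < 1e-9   # bins away from the tone and its image stay empty
```

The quality gap between this crude excitation and perceptually shaped recovery is precisely what motivates the loudness-based quantization strategies developed in this book.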