Nonuniform Sampling


Book Description

Our understanding of nature often comes through nonuniform observations in space or time. In space, one normally observes the important features of an object, such as edges, and interpolates the less important features. History is a collection of important events that are nonuniformly spaced in time: historians infer what happened between events (interpolation), while politicians and stock market analysts forecast the future from past and present events (extrapolation). The 20 chapters of Nonuniform Sampling: Theory and Practice contain contributions by leading researchers in nonuniform and Shannon sampling, zero-crossing, and interpolation theory. Its practical applications include NMR, seismology, speech and image coding, modulation and coding, optimal content, array processing, and digital filter design. It takes a tutorial approach suited to practising engineers and advanced students in science, engineering, and mathematics. It is also a useful reference for scientists and engineers working in the areas of medical imaging, geophysics, astronomy, biomedical engineering, computer graphics, digital filter design, speech and video processing, and phased array radar.
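As a toy illustration of the book's central theme (not an example drawn from the book), the short Python sketch below reconstructs a signal on a uniform grid from nonuniformly spaced samples using simple linear interpolation; the test signal, the random sample locations, and the numpy-only implementation are assumptions made purely for illustration.

    import numpy as np

    # Dense "ground truth" signal on a uniform grid.
    t = np.linspace(0.0, 1.0, 1000)
    x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)

    # Nonuniform observation times: random, then sorted (a crude stand-in
    # for samples clustered around "important" features).
    rng = np.random.default_rng(0)
    t_obs = np.sort(rng.uniform(0.0, 1.0, 60))
    x_obs = np.sin(2 * np.pi * 3 * t_obs) + 0.5 * np.sin(2 * np.pi * 7 * t_obs)

    # Reconstruct on the uniform grid by interpolating between the
    # nonuniform samples, and report the worst-case error.
    x_rec = np.interp(t, t_obs, x_obs)
    print("max reconstruction error:", np.max(np.abs(x - x_rec)))

More sophisticated reconstruction methods (iterative, frame-based, or spline-based), which are the subject of the book, would replace the linear interpolation step above.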




Advanced Topics in Shannon Sampling and Interpolation Theory


Book Description

Advanced Topics in Shannon Sampling and Interpolation Theory is the second volume of a textbook on signal analysis solely devoted to the topic of sampling and restoration of continuous-time signals and images. Sampling and reconstruction are fundamental problems in any field that deals with real-time signals or images, including communication engineering, image processing, seismology, speech recognition, and digital signal processing. This second volume includes contributions from leading researchers in the field on such topics as Gabor's signal expansion, sampling in optical image formation, linear prediction theory, polar and spiral sampling theory, interpolation from nonuniform samples, an extension of Papoulis's generalized sampling expansion to higher dimensions, and applications of sampling theory to optics and to time-frequency representations. The exhaustive bibliography on Shannon sampling theory will make this an invaluable research tool as well as an excellent text for students planning further research in the field.




Introduction to Petroleum Seismology, Second Edition


Book Description

Introduction to Petroleum Seismology, Second Edition (SEG Investigations in Geophysics Series No. 12) provides the theoretical and practical foundation for tackling present and future challenges of petroleum seismology, especially those related to seismic survey design, seismic data acquisition, seismic and EM modeling, seismic imaging, microseismicity, and reservoir characterization and monitoring. All of the chapters from the first edition have been improved and/or expanded, and twelve new chapters have been added. These new chapters expand on topics that were only alluded to in the first edition: sparsity representation, sparsity and nonlinear optimization, near-simultaneous multiple-shooting acquisition and processing, nonuniform wavefield sampling, automated modeling, elastic-electromagnetic mathematical equivalences, and microseismicity in the context of hydraulic fracturing. Another major change in this edition is that each chapter contains both analytical and computational problems. These problems include MATLAB codes, which may help readers improve their understanding of and intuition about the material. The comprehensiveness of this book makes it a suitable text for undergraduate and graduate courses targeting geophysicists and engineers, as well as a guide and reference for researchers and professionals in academia and in the petroleum industry.




Artificial Intelligence for Edge Computing


Book Description

It is undeniable that the recent revival of artificial intelligence (AI) has significantly changed the landscape of science in many application domains, ranging from health to defense and from conversational interfaces to autonomous cars. With terms such as “Google Home”, “Alexa”, and “ChatGPT” becoming household names, the pervasive societal impact of AI is clear. Advances in AI promise a revolution in our interaction with the physical world, a domain where computational intelligence has always been envisioned as a transformative force toward a better tomorrow. Depending on the application family, this domain is often referred to as Ubiquitous Computing, Cyber-Physical Computing, or the Internet of Things. The underlying vision is driven by the proliferation of cheap embedded computing hardware that can be integrated easily into myriad everyday devices, from consumer electronics, such as personal wearables and smart household appliances, to city infrastructure and industrial process control systems. One common trait across these applications is that the data the application operates on come directly (typically via sensors) from the physical world. Thus, from the perspective of communication network infrastructure, the data originate at the network edge, and from a performance standpoint there is an argument to be made that such data should be processed at the point of collection. Hence, a need arises for Edge AI -- a genre of AI where the inference, and sometimes even the training, is performed at the point of need, meaning at the edge where the data originate.

The book is broken down into three parts: core problems, distributed problems, and other cross-cutting issues. It explores the challenges arising in Edge AI contexts. Some of these challenges (such as neural network model reduction to fit resource-constrained hardware) are unique to the edge environment and need a novel category of solutions that do not parallel more typical concerns in mainstream AI. Others are adaptations of mainstream AI challenges to the edge space. An example is overcoming the cost of data labeling: the labeling problem is pervasive, but its solution in the IoT application context differs from other contexts.

This book is not a survey of the state of the art; with thousands of publications appearing in AI every year, such a survey is doomed to be incomplete on arrival. Nor is it a comprehensive coverage of all the problems in the space of Edge AI: different applications pose different challenges, and a more comprehensive coverage would have to be more application-specific. Instead, this book covers some of the more endemic challenges across the range of IoT/CPS applications. To offer coverage in some depth, we opt to cover mainly one or a few representative solutions for each of these endemic challenges in sufficient detail, rather than broadly touching on all relevant prior work. The underlying philosophy is one of illustrating by example: the solutions are curated to offer insight into a way of thinking that characterizes Edge AI research and distinguishes its solutions from their more mainstream counterparts.
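To make the model-reduction challenge mentioned above concrete, here is a minimal, hedged Python sketch (not taken from the book) of post-training weight quantization: a float32 weight matrix is mapped to 8-bit integers plus a single scale factor, cutting storage roughly fourfold at the cost of a small reconstruction error. The layer shape and the numpy-only implementation are assumptions for illustration only.

    import numpy as np

    def quantize_int8(w):
        # Symmetric per-tensor quantization of float32 weights to int8.
        scale = float(np.max(np.abs(w))) / 127.0   # one scale for the whole tensor
        q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
        return q, scale

    def dequantize(q, scale):
        # Recover approximate float32 weights from the int8 values.
        return q.astype(np.float32) * scale

    # Hypothetical fully connected layer: 256 inputs, 64 outputs.
    rng = np.random.default_rng(0)
    w = rng.normal(0.0, 0.05, size=(256, 64)).astype(np.float32)

    q, scale = quantize_int8(w)
    w_approx = dequantize(q, scale)

    print("storage: %d -> %d bytes" % (w.nbytes, q.nbytes))
    print("max abs weight error:", float(np.max(np.abs(w - w_approx))))

Quantization is only one of several reduction techniques (pruning, distillation, architecture search) discussed under this heading; it is shown here simply because it fits in a few lines.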




Advances in Multimedia Information Processing — PCM 2002


Book Description

This book constitutes the refereed proceedings of the Third IEEE Pacific Rim Conference on Multimedia, PCM 2002, held in Hsinchu, Taiwan, in December 2002. The 154 revised full papers presented were carefully reviewed and selected from 224 submissions. The papers are organized in topical sections on mobile multimedia, digital watermarking and data hiding, motion analysis, multimedia retrieval techniques, image processing, multimedia security, image coding, multimedia learning, audio signal processing, wireless multimedia streaming, multimedia systems in the Internet, distance education and multimedia, Internet security, computer graphics and virtual reality, object tracking, face analysis, and MPEG-4.




Handbook of Fourier Analysis & Its Applications


Book Description

This practical, applications-based professional handbook comprehensively covers the theory and applications of Fourier analysis, spanning topics from engineering mathematics, signal processing and related multidimensional transform theory, and quantum physics to elementary deterministic finance and even the foundations of Western music theory.




Sampling Theory, a Renaissance


Book Description

Reconstructing or approximating objects from seemingly incomplete information is a frequent challenge in mathematics, science, and engineering. A multitude of tools designed to recover hidden information are based on Shannon’s classical sampling theorem, a central pillar of Sampling Theory. The growing need to efficiently obtain precise and tailored digital representations of complex objects and phenomena requires the maturation of available tools in Sampling Theory as well as the development of complementary, novel mathematical theories. Today, research themes such as Compressed Sensing and Frame Theory re-energize the broad area of Sampling Theory. This volume illustrates the renaissance that the area of Sampling Theory is currently experiencing. It touches upon trendsetting areas such as Compressed Sensing, Finite Frames, Parametric Partial Differential Equations, Quantization, Finite Rate of Innovation, System Theory, as well as sampling in Geometry and Algebraic Topology.
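For orientation, the classical result the volume builds on can be stated in its standard textbook form (this statement is not quoted from the volume): if $f$ is bandlimited to $[-W, W]$, that is, its Fourier transform vanishes outside that band, then $f$ is completely determined by uniform samples taken at rate $2W$ and is recovered by the Whittaker-Shannon interpolation formula
\[
  f(t) \;=\; \sum_{n=-\infty}^{\infty} f\!\left(\frac{n}{2W}\right)\,
  \operatorname{sinc}\bigl(2W t - n\bigr),
  \qquad \operatorname{sinc}(u) = \frac{\sin(\pi u)}{\pi u}.
\]
The research directions surveyed in this volume, such as compressed sensing and finite rate of innovation, can be read as relaxations of the assumptions behind this formula: the signals are no longer strictly bandlimited, and the samples are no longer uniform or even linear measurements.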