Exponential Data Fitting and Its Applications


Book Description

Real and complex exponential data fitting is an important activity in many different areas of science and engineering, ranging from Nuclear Magnetic Resonance Spectroscopy and Lattice Quantum Chromodynamics to Electrical and Chemical Engineering, Vision a…




Fitting Models to Biological Data Using Linear and Nonlinear Regression


Book Description

Most biologists use nonlinear regression more than any other statistical technique, but there are very few places to learn about curve-fitting. This book, by the author of the very successful Intuitive Biostatistics, addresses this relatively focused need of an extraordinarily broad range of scientists.




Data Assimilation and Control: Theory and Applications in Life Sciences


Book Description

The understanding of complex systems is a key element in predicting and controlling a system's dynamics. To gain deeper insight into the workings of complex systems, more and more data of diverse types that mirror the system's dynamics are being analyzed, while system models remain hard to derive. Data assimilation merges both data and model into an optimal description of a complex system's dynamics. The present eBook brings together recent theoretical work in data assimilation and control and demonstrates applications in diverse research fields.
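The merging of model forecast and observation described above can be sketched with the simplest data-assimilation scheme, a one-dimensional Kalman filter. This is a minimal illustration, not a method from the eBook; the dynamics coefficient, noise variances, and observations below are invented for the example.

```python
# Minimal sketch of data assimilation in one dimension: a scalar Kalman
# filter that merges a model forecast with noisy observations.  The
# model dynamics (a), noise variances (q, r), and synthetic
# observations are illustrative assumptions only.

def kalman_step(x, p, z, a=0.9, q=0.01, r=0.1):
    """One forecast/analysis cycle.

    x, p : prior state estimate and its variance
    z    : new observation
    a    : linear model dynamics x_k = a * x_{k-1}
    q, r : model-error and observation-error variances
    """
    # Forecast (model) step
    x_f = a * x
    p_f = a * a * p + q
    # Analysis (data) step: weight forecast against observation
    k = p_f / (p_f + r)          # Kalman gain
    x_a = x_f + k * (z - x_f)    # merged ("assimilated") estimate
    p_a = (1.0 - k) * p_f
    return x_a, p_a

# Assimilate a few synthetic observations of a slowly decaying state
x, p = 1.0, 1.0
for z in [0.95, 0.80, 0.75, 0.66]:
    x, p = kalman_step(x, p, z)
# The posterior variance p shrinks as observations are assimilated.
```

Each cycle weighs model and data by their respective uncertainties, which is exactly the "optimal description" the blurb alludes to.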







NBS Special Publication






Quantitative Magnetic Resonance Imaging


Book Description

Quantitative Magnetic Resonance Imaging is a 'go-to' reference for methods and applications of quantitative magnetic resonance imaging, with specific sections on Relaxometry, Perfusion, and Diffusion. Each section starts with an explanation of the basic techniques for mapping the tissue property in question, including a description of the challenges that arise when using these basic approaches. For properties that can be measured in multiple ways, each of these basic methods is described in a separate chapter. Following the basics, a chapter in each section presents more advanced and recently proposed techniques for quantitative tissue property mapping, with a concluding chapter on clinical applications.

The reader will learn:
- The basic physics behind tissue property mapping
- How to implement basic pulse sequences for the quantitative measurement of tissue properties
- The strengths and limitations of the basic and more rapid methods for mapping the magnetic relaxation properties T1, T2, and T2*
- The pros and cons of different approaches to mapping perfusion
- The methods of diffusion-weighted imaging and how this approach can be used to generate diffusion tensor maps and more complex representations of diffusion
- How flow, magneto-electric tissue property, fat fraction, exchange, elastography, and temperature mapping are performed
- How fast imaging approaches, including parallel imaging, compressed sensing, and Magnetic Resonance Fingerprinting, can be used to accelerate or improve tissue property mapping schemes
- How tissue property mapping is used clinically in different organs

Features:
- Structured to cater for MRI researchers and graduate students with a wide variety of backgrounds
- Explains basic methods for quantitatively measuring tissue properties with MRI, including T1, T2, perfusion, diffusion, fat and iron fraction, elastography, flow, and susceptibility, enabling the implementation of pulse sequences to perform measurements
- Shows the limitations of the techniques and explains the challenges to the clinical adoption of these traditional methods, presenting the latest research in rapid quantitative imaging which has the possibility to tackle these challenges
- Each section contains a chapter explaining the basics of novel ideas for quantitative mapping, such as compressed sensing and Magnetic Resonance Fingerprinting-based approaches
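The relaxometry described above typically reduces to fitting an exponential decay. As a minimal sketch (not a method from the book), a T2 map can be estimated voxel-by-voxel from a mono-exponential signal model S(TE) = S0·exp(−TE/T2) via linear least squares on log(S); the echo times and "true" values below are invented for illustration.

```python
import math

# Illustrative sketch: estimating T2 from a mono-exponential decay
# S(TE) = S0 * exp(-TE / T2) by linear least squares on log(S).
# Echo times (TE, in ms) and the "true" S0/T2 are made up.

def fit_t2(tes, signals):
    """Return (S0, T2) from a log-linear least-squares fit."""
    n = len(tes)
    ys = [math.log(s) for s in signals]      # log S = log S0 - TE / T2
    mx = sum(tes) / n
    my = sum(ys) / n
    sxx = sum((t - mx) ** 2 for t in tes)
    sxy = sum((t - mx) * (y - my) for t, y in zip(tes, ys))
    slope = sxy / sxx                        # = -1 / T2
    intercept = my - slope * mx              # = log S0
    return math.exp(intercept), -1.0 / slope

# Noise-free synthetic decay with S0 = 100, T2 = 80 ms
tes = [10.0, 30.0, 50.0, 70.0, 90.0]
signals = [100.0 * math.exp(-te / 80.0) for te in tes]
s0, t2 = fit_t2(tes, signals)
```

In practice noise makes the log-linear fit biased at low signal, which is one reason the book devotes chapters to more robust and rapid mapping techniques.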







Language Modeling for Information Retrieval


Book Description

A statistical language model, or more simply a language model, is a probabilistic mechanism for generating text. Such a definition is general enough to include an endless variety of schemes. However, a distinction should be made between generative models, which can in principle be used to synthesize artificial text, and discriminative techniques used to classify text into predefined categories. The first statistical language modeler was Claude Shannon. In exploring the application of his newly founded theory of information to human language, Shannon considered language as a statistical source, and measured how well simple n-gram models predicted or, equivalently, compressed natural text. To do this, he estimated the entropy of English through experiments with human subjects, and also estimated the cross-entropy of the n-gram models on natural text. The ability of language models to be quantitatively evaluated in this way is one of their important virtues. Of course, estimating the true entropy of language is an elusive goal, aiming at many moving targets, since language is so varied and evolves so quickly. Yet fifty years after Shannon's study, language models remain, by all measures, far from the Shannon entropy limit in terms of their predictive power. However, this has not kept them from being useful for a variety of text processing tasks, and moreover can be viewed as encouragement that there is still great room for improvement in statistical language modeling.
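The cross-entropy evaluation mentioned above can be sketched in a few lines. This is a toy illustration, not Shannon's actual experiment: a character bigram model with add-one smoothing is trained on a small text, then scored in bits per character on held-out text; the corpus strings are invented for the example.

```python
import math
from collections import Counter

# Toy sketch of n-gram language modeling and cross-entropy evaluation.
# Training/held-out strings are invented; add-one smoothing keeps every
# probability nonzero so the cross-entropy stays finite.

def train_bigram(text, alphabet):
    counts = Counter(zip(text, text[1:]))    # bigram counts
    context = Counter(text[:-1])             # unigram (context) counts
    v = len(alphabet)
    # Add-one (Laplace) smoothed conditional probability p(b | a)
    return lambda a, b: (counts[(a, b)] + 1) / (context[a] + v)

def cross_entropy(prob, text):
    """Average -log2 p(next char | previous char), in bits per char."""
    bits = [-math.log2(prob(a, b)) for a, b in zip(text, text[1:])]
    return sum(bits) / len(bits)

alphabet = set("abcdefghijklmnopqrstuvwxyz ")
train = "the cat sat on the mat the cat ate the rat"
held_out = "the rat sat on the cat"
p = train_bigram(train, alphabet)
h = cross_entropy(p, held_out)   # lower = better prediction/compression
```

A lower cross-entropy means the model predicts, and hence could compress, the held-out text better, which is exactly the quantitative yardstick the passage credits to Shannon.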







Parameter Estimation and Uncertainty Quantification in Water Resources Modeling


Book Description

Numerical models of flow and transport processes are heavily employed in the fields of surface, soil, and groundwater hydrology. They are used to interpret field observations, analyze complex and coupled processes, and support decision making related to large societal issues such as the water-energy nexus or sustainable water management and food production. Parameter estimation and uncertainty quantification are two key features of modern science-based predictions. When applied to water resources, these tasks must cope with many degrees of freedom and large datasets. Both are challenging and require novel theoretical and computational approaches to handle complex models with large numbers of unknown parameters.
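At its smallest scale, the parameter-estimation-with-uncertainty task described above can be sketched as calibrating a single parameter of a linear Darcy-type flux model q = K·i by least squares, with a crude standard error from the residual variance. The model, data, and "true" K below are invented for illustration and are not from the eBook.

```python
# Hypothetical sketch: estimate a hydraulic conductivity K in the
# linear model q = K * i from noisy flux observations, and attach an
# approximate standard error as a simple uncertainty measure.
# Gradients, fluxes, and the "true" K = 2.0 are made up.

def estimate_k(gradients, fluxes):
    """Least-squares K for q = K * i, plus an approximate std. error."""
    sxx = sum(i * i for i in gradients)
    sxy = sum(i * q for i, q in zip(gradients, fluxes))
    k = sxy / sxx                                       # point estimate
    residuals = [q - k * i for i, q in zip(gradients, fluxes)]
    n = len(fluxes)
    sigma2 = sum(r * r for r in residuals) / (n - 1)    # residual variance
    stderr = (sigma2 / sxx) ** 0.5                      # uncertainty in K
    return k, stderr

# Synthetic observations generated with "true" K = 2.0 plus small errors
gradients = [0.1, 0.2, 0.3, 0.4, 0.5]
fluxes = [0.21, 0.39, 0.62, 0.79, 1.01]
k, se = estimate_k(gradients, fluxes)
```

Real water-resources models replace this one-parameter regression with thousands of spatially distributed parameters and expensive forward simulations, which is precisely why the novel approaches the blurb mentions are needed.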