Masters Theses in the Pure and Applied Sciences


Book Description

Masters Theses in the Pure and Applied Sciences was first conceived, published, and disseminated by the Center for Information and Numerical Data Analysis and Synthesis (CINDAS) at Purdue University in 1957, starting its coverage of theses with the academic year 1955. Beginning with Volume 13, the printing and dissemination phases of the activity were transferred to University Microfilms/Xerox of Ann Arbor, Michigan, with the thought that such an arrangement would be more beneficial to the academic and general scientific and technical community. After five years of this joint undertaking we had concluded that it was in the interest of all concerned if the printing and distribution of the volumes were handled by an international publishing house to assure improved service and broader dissemination. Hence, starting with Volume 18, Masters Theses in the Pure and Applied Sciences has been disseminated on a worldwide basis by Plenum Publishing Corporation of New York, and in the same year the coverage was broadened to include Canadian universities. All back issues can also be ordered from Plenum. We have reported in Volume 40 (thesis year 1995) a total of 10,746 thesis titles from 19 Canadian and 144 United States universities. We are sure that this broader base for the titles reported will greatly enhance the value of this important annual reference work. While Volume 40 reports theses submitted in 1995, certain universities do, on occasion, report theses submitted in previous years but not reported at the time.




Econometrics for Financial Applications


Book Description

This book addresses both theoretical developments in and practical applications of econometric techniques to finance-related problems. It includes selected edited outcomes of the International Econometric Conference of Vietnam (ECONVN2018), held at Banking University, Ho Chi Minh City, Vietnam, on January 15-16, 2018. Econometrics is a branch of economics that uses mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. An extremely important part of economics is finance: a financial crisis can bring the whole economy to a standstill and, vice versa, a smart financial policy can dramatically boost economic development. It is therefore crucial to be able to apply mathematical techniques of econometrics to financial problems. Such applications are a growing field, with many interesting results – and an even larger number of challenges and open problems.




Neural Networks: From Biology To High Energy Physics - Proceedings Of The 2nd Workshop


Book Description

Neural network models, in addition to being of intrinsic theoretical interest, have also proved to be a useful framework in which issues in theoretical biology can be put into perspective. These issues include, amongst others, modelling the activity of the cortex and the study of protein folding. More recently, neural network models have been extensively investigated as tools for data analysis in high energy physics experiments. These workshop proceedings reflect the strongly interdisciplinary character of the field and provide an updated overview of recent developments.




Neural Networks and Deep Learning


Book Description

This book covers both classical and modern models in deep learning. The primary focus is on the theory and algorithms of deep learning. The theory and algorithms of neural networks are particularly important, so that one can understand the key design concepts of neural architectures in different applications. Why do neural networks work? When do they work better than off-the-shelf machine-learning models? When is depth useful? Why is training neural networks so hard? What are the pitfalls? The book also discusses a wide range of applications in order to give the practitioner a flavor of how neural architectures are designed for different types of problems. Applications in areas such as recommender systems, machine translation, image captioning, image classification, reinforcement-learning-based gaming, and text analytics are covered. The chapters of this book span three categories:

The basics of neural networks: Many traditional machine learning models can be understood as special cases of neural networks. An emphasis is placed in the first two chapters on understanding the relationship between traditional machine learning and neural networks. Support vector machines, linear/logistic regression, singular value decomposition, matrix factorization, and recommender systems are shown to be special cases of neural networks. These methods are studied together with recent feature engineering methods like word2vec.

Fundamentals of neural networks: A detailed discussion of training and regularization is provided in Chapters 3 and 4. Chapters 5 and 6 present radial-basis function (RBF) networks and restricted Boltzmann machines.

Advanced topics in neural networks: Chapters 7 and 8 discuss recurrent neural networks and convolutional neural networks. Several advanced topics like deep reinforcement learning, neural Turing machines, Kohonen self-organizing maps, and generative adversarial networks are introduced in Chapters 9 and 10.

The book is written for graduate students, researchers, and practitioners. Numerous exercises are available along with a solution manual to aid in classroom teaching. Where possible, an application-centric view is highlighted in order to provide an understanding of the practical uses of each class of techniques.
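To illustrate the kind of correspondence described above – traditional models as special cases of neural networks – the sketch below implements logistic regression as a one-neuron network: a single linear layer followed by a sigmoid activation, trained by gradient descent on the cross-entropy loss. This is a minimal sketch of a standard construction, not code from the book; the data and names are illustrative.

```python
import numpy as np

def sigmoid(z):
    # Sigmoid activation: the "output unit" of the one-neuron network.
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic(X, y, lr=0.5, epochs=5000):
    """Gradient descent on binary cross-entropy for weights w and bias b.

    This is exactly logistic regression, viewed as training a neural
    network with one linear layer and a sigmoid output.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)            # forward pass
        w -= lr * (X.T @ (p - y)) / n     # gradient of cross-entropy w.r.t. w
        b -= lr * np.mean(p - y)          # gradient w.r.t. bias
    return w, b

# Tiny linearly separable problem (logical AND of the two inputs).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 0., 0., 1.])
w, b = train_logistic(X, y)
preds = (sigmoid(X @ w + b) > 0.5).astype(float)
```

Training a deeper network would simply stack more such layers and backpropagate the same loss gradient through them, which is the sense in which logistic regression is a special case.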




Motivation, Effort, and the Neural Network Model


Book Description

Our understanding of how the human brain operates and completes its essential tasks is fundamentally altered from what it was ten years ago. We have moved from an understanding based on the modularity of key structural components and their specialized functions to an almost diametrically opposed, highly integrated neural network model, based on a vertically organized brain dependent on small-world hub principles. This new understanding completely changes how we understand essential psychological constructs such as motivation. Network modeling posits that motivation is a construct describing a modified aspect of the operation of the human learning system that is specifically designed to cause a person to pursue a goal. Anthropologically and developmentally, these goals were initially basic, including things like food, shelter, and reproduction. Over the course of time and development they develop into a complex web of extrinsic and then intrinsic goals, objectives, and values. At the core of all of this development is the inborn fight-or-flight reaction, which has been modified over time by a combination of inborn human temperamental characteristics and life experiences. This process of modification is, in part, based on the operation of an error-prediction network working in concert with the reward network to produce a system of ever-evolving valuations of goals and objectives. These valuations are never truly fixed. They are constantly evolving, being modified and shaped by experience. The error-prediction network and learning-related networks work in concert with the limbic system to allow affect-laden experiences to inform the process of valuation. These networks, operating in concert, produce a cognitive process we call motivation. Like most networks, the motivation system of networks is recruited when the task demands of the situation require it.
Understanding motivation from this perspective has profound implications for many scientific disciplines in general and for psychology in particular. Psychologically, this new understanding will alter how we understand client behavior in therapy and during evaluation. It will provide direction for new therapeutic interventions for a variety of mental health disorders. It will also inform testing practices concerning the evaluation of effort and malingering. This book is not a project in reductionism. It is the polar opposite. A neural network understanding of the operation of the human brain allows for the integration of what has come before into a comprehensive and integrated model. It will likely provide the basis for future research for years to come.




Improving the Exploration Process by Learning from the Past


Book Description

Learning from experience is a part of daily life and of exploration alike. A systematic review of the past is essential to improve the exploration process by better managing risks and uncertainties. Learning across different disciplines has become a favoured technique. With new tools for interpretation and simulation, the integration of data, and the creation of cross-discipline teams, we can take a major step forward in understanding the exploration task and its different elements. Global views and lessons learned on the Norwegian Continental Shelf on risk management and retrospective prospect assessment are presented in this book. Detailed exploration case histories from the Norwegian Continental Shelf documenting both positive and negative experiences and highlighting the benefits of integrated thinking and methods are presented. The impact of the application of various state-of-the-art and developing technologies on portfolio management, opportunity evaluation, and volumetric and risk assessment of prospects and discoveries is reviewed, and the future technological challenges in exploring the remaining hydrocarbon potential of the Norwegian Continental Shelf are summarised.




ChatGPT and the Future of AI


Book Description

An insightful exploration of ChatGPT and other advanced AI systems—how we got here, where we’re headed, and what it all means for how we interact with the world. In ChatGPT and the Future of AI, the sequel to The Deep Learning Revolution, Terrence Sejnowski offers a nuanced exploration of large language models (LLMs) like ChatGPT and what their future holds. How should we go about understanding LLMs? Do these language models truly understand what they are saying? Or is it possible that what appears to be intelligence in LLMs may be a mirror that merely reflects the intelligence of the interviewer? In this book, Sejnowski, a pioneer in computational approaches to understanding brain function, answers all our urgent questions about this astonishing new technology. Sejnowski begins by describing the debates surrounding LLMs’ comprehension of language and exploring the notions of “thinking” and “intelligence.” He then takes a deep dive into the historical evolution of language models, focusing on the role of transformers, the correlation between computing power and model size, and the intricate mathematics shaping LLMs. Sejnowski also provides insight into the historical roots of LLMs and discusses the potential future of AI, focusing on next-generation LLMs inspired by nature and the importance of developing energy-efficient technologies. Grounded in Sejnowski’s dual expertise in AI and neuroscience, ChatGPT and the Future of AI is the definitive guide to understanding the intersection of AI and human intelligence.




Models of Neural Networks III


Book Description

One of the most challenging and fascinating problems of the theory of neural nets is that of asymptotic behavior: how a system behaves as time proceeds. This is of particular relevance to many practical applications. Here we focus on association, generalization, and representation. We turn to the last topic first. The introductory chapter, "Global Analysis of Recurrent Neural Networks," by Andreas Herz presents an in-depth analysis of how to construct a Lyapunov function for various types of dynamics and neural coding. It includes a review of the recent work with John Hopfield on integrate-and-fire neurons with local interactions. The chapter "Receptive Fields and Maps in the Visual Cortex: Models of Ocular Dominance and Orientation Columns," by Ken Miller, explains how the primary visual cortex may asymptotically gain its specific structure through a self-organization process based on Hebbian learning. His argument has since been shown to be readily generalizable.
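As a pointer to the style of construction analyzed in the opening chapter, the classic textbook example (not reproduced from this volume) is the energy function of a symmetric Hopfield network, which serves as a Lyapunov function for its asynchronous dynamics:

```latex
E(s) = -\frac{1}{2}\sum_{i \neq j} w_{ij}\, s_i s_j - \sum_i \theta_i s_i,
\qquad w_{ij} = w_{ji}, \quad s_i \in \{-1, +1\}.
```

Under asynchronous updates $s_i \leftarrow \operatorname{sgn}\!\bigl(\sum_j w_{ij} s_j + \theta_i\bigr)$, each update can only decrease or preserve $E$; since $E$ is bounded below on the finite state space, the dynamics must converge to a fixed point.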




ARTIFICIAL NEURAL NETWORKS


Book Description