Extreme Statistics in Nanoscale Memory Design


Book Description

Knowledge exists: you only have to find it. VLSI design has come to an important inflection point with the appearance of large manufacturing variations as semiconductor technology has moved to 45 nm feature sizes and below. If we ignore the random variations in the manufacturing process, simulation-based design essentially becomes useless, since its predictions will be far from the reality of manufactured ICs. On the other hand, using design margins based on some traditional notion of worst-case scenarios can force us to sacrifice too much in terms of power consumption or manufacturing cost, to the extent of making the design goals infeasible. We absolutely need to explicitly account for the statistics of this random variability, to have design margins that are accurate so that we can find the optimum balance between yield loss and design cost. This discontinuity in design processes has led many researchers to develop effective methods of statistical design, where the designer can simulate not just the behavior of the nominal design, but the expected statistics of the behavior in manufactured ICs. Memory circuits tend to be the hardest hit by these random variations because of their high replication count on any single chip, which demands very high statistical quality from the product. Requirements of 5–6σ (0.
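To give a sense of scale for such sigma-level requirements, the following minimal sketch (not from the book; it assumes a single normally distributed failure criterion, independent cell failures, and an illustrative array size) converts a per-cell sigma target into a failure probability and an array-level yield.

```python
# Minimal sketch: why replicated memory cells need extreme (5-6 sigma) statistics.
# Assumptions (not from the book): cell failures are independent and the failure
# criterion is a single normally distributed parameter crossing a threshold.
from scipy.stats import norm

def cell_fail_prob(sigma_target: float) -> float:
    """One-sided tail probability for a cell specified at `sigma_target` sigma."""
    return norm.sf(sigma_target)

def chip_yield(sigma_target: float, num_cells: int) -> float:
    """Probability that all `num_cells` cells meet spec (no redundancy or repair)."""
    p_fail = cell_fail_prob(sigma_target)
    return (1.0 - p_fail) ** num_cells

if __name__ == "__main__":
    cells = 32 * 1024 * 1024 * 8  # illustrative: bit cells of a 32 MB SRAM array
    for s in (3, 4, 5, 6):
        print(f"{s} sigma: per-cell fail ~ {cell_fail_prob(s):.2e}, "
              f"chip yield ~ {chip_yield(s, cells):.4f}")
```

At a 6σ target the per-cell failure probability drops below 1e-9, which is roughly what a few hundred million statistically identical cells require for acceptable yield without repair; at 3σ or 4σ the same array yields essentially nothing.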




Machine Learning in VLSI Computer-Aided Design


Book Description

This book provides readers with an up-to-date account of the use of machine learning frameworks, methodologies, algorithms and techniques in the context of computer-aided design (CAD) for very-large-scale integrated circuits (VLSI). Coverage includes the various machine learning methods used in lithography, physical design, yield prediction, post-silicon performance analysis, reliability and failure analysis, power and thermal analysis, analog design, logic synthesis, verification, and neuromorphic design.
• Provides up-to-date information on machine learning in VLSI CAD for device modeling, layout verification, yield prediction, post-silicon validation, and reliability;
• Discusses the use of machine learning techniques in the context of analog and digital synthesis;
• Demonstrates how to formulate VLSI CAD objectives as machine learning problems and provides a comprehensive treatment of their efficient solutions;
• Discusses the tradeoff between the cost of collecting data and prediction accuracy, and provides a methodology for using prior data to reduce the cost of data collection in the design, testing and validation of both analog and digital VLSI designs.
From the Foreword: As the semiconductor industry embraces the rising swell of cognitive systems and edge intelligence, this book could serve as a harbinger and example of the osmosis that will exist between our cognitive structures and methods, on the one hand, and the hardware architectures and technologies that will support them, on the other... As we transition from the computing era to the cognitive one, it behooves us to remember the success story of VLSI CAD and to earnestly seek the help of the invisible hand so that our future cognitive systems are used to design more powerful cognitive systems. This book is very much aligned with this ongoing transition from computing to cognition, and it is with deep pleasure that I recommend it to all those who are actively engaged in this exciting transformation.
Dr. Ruchir Puri, IBM Fellow, IBM Watson CTO & Chief Architect, IBM T. J. Watson Research Center
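One of the points above is the formulation of VLSI CAD objectives as machine learning problems. Purely as a hedged illustration (not an example taken from the book), the sketch below casts such an objective, predicting a circuit metric from process and design parameters, as a supervised regression problem in which a learned surrogate stands in for repeated simulation; all data and parameter names here are hypothetical.

```python
# Hypothetical sketch: casting a VLSI CAD objective (circuit delay prediction)
# as a supervised regression problem. The synthetic response below is a stand-in
# for SPICE/STA runs; in practice y would come from real simulations or silicon.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Features: sampled process/design parameters (e.g. Vth shift, channel length, Vdd).
X = rng.normal(size=(2000, 3))

# Placeholder response: a nonlinear "delay" a real flow would obtain by simulation.
y = (1.0 + 0.4 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 0] * X[:, 2]
     + rng.normal(scale=0.05, size=2000))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# The trained surrogate predicts the metric for new parameter corners far faster
# than rerunning the simulator.
print("R^2 on held-out samples:", model.score(X_test, y_test))
```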




Green Computing with Emerging Memory


Book Description

This book describes a computing innovation that uses non-volatile memory for a sustainable world. It appeals to both computing and device engineers by describing a new approach to low-power computing that does not sacrifice performance relative to conventional low-voltage operation. Readers will be introduced to design and implementation methods for non-volatile memory that allow computing equipment to be turned off when not in use and turned on instantly, operating at full performance when needed.
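As a conceptual, software-only analogy (not a method from the book), the sketch below mimics this normally-off idea: working state lives in a persistent store, so execution can stop at any time and resume instantly with full state. Real systems achieve this in hardware with non-volatile registers, caches and memory; the file name and state structure here are placeholders.

```python
# Conceptual sketch of "normally-off" computing, emulated in software.
# Working state is kept in a persistent store, so the "device" can power off
# at any time and resume instantly with full state. Purely an analogy.
import os
import pickle

CHECKPOINT = "state.nvm"   # stand-in for a non-volatile memory region

def load_state():
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            return pickle.load(f)       # instant resume: state survived power-off
    return {"counter": 0}               # first boot

def save_state(state):
    with open(CHECKPOINT, "wb") as f:
        pickle.dump(state, f)           # state committed to "NVM" before power-off

state = load_state()
state["counter"] += 1                   # do some work
save_state(state)
print("work sessions so far:", state["counter"])
```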




Evolvable Hardware


Book Description

This book covers the basic theory, practical details and advanced research of the implementation of evolutionary methods on physical substrates. Most of the examples are from electronic engineering applications, including transistor-level design and system-level implementation. The authors present an overview of the successes achieved, and the book will act as a point of reference for both academic and industrial researchers.
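To make "evolutionary methods" concrete, here is a minimal, hypothetical sketch of the basic loop such systems use. In real evolvable hardware the fitness function would measure a physically configured circuit; this toy stands in with a bitstring-matching score, and all constants are illustrative.

```python
# Minimal sketch of the evolutionary loop behind evolvable hardware: a population
# of candidate configuration bitstreams is scored by a fitness function, then
# selected, recombined and mutated over generations. Toy fitness, not real hardware.
import random

GENOME_LEN = 64          # bits of a (toy) device configuration bitstream
POP_SIZE = 50
GENERATIONS = 100
TARGET = [random.randint(0, 1) for _ in range(GENOME_LEN)]  # toy "ideal" circuit

def fitness(genome):
    # Stand-in for evaluating the configured hardware; counts matching bits.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.02):
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 5]                 # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print("best fitness:", fitness(max(population, key=fitness)), "of", GENOME_LEN)
```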







The Fourth Terminal


Book Description

This book discusses the advantages and challenges of body biasing for integrated circuits and systems, together with the deployment of the design infrastructure needed to generate the body-bias voltage. These design solutions enable state-of-the-art energy efficiency and system flexibility for the latest applications, such as the Internet of Things and 5G communications.
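For readers unfamiliar with the underlying effect, the sketch below illustrates how a body (back) bias shifts a transistor's threshold voltage using the classical bulk-MOSFET body-effect relation. FDSOI back-biasing, a major subject in this area, follows a different, roughly linear law, and the parameter values here are assumptions for illustration only.

```python
# Illustrative sketch (not from the book): how a body-bias voltage shifts the
# threshold voltage of a bulk MOSFET via the classical body-effect relation
#   Vth = Vth0 + gamma * (sqrt(2*phi_F + V_SB) - sqrt(2*phi_F))
# FDSOI back-biasing follows a roughly linear law instead; numbers are illustrative.
import math

VTH0 = 0.40      # zero-bias threshold voltage [V] (assumed)
GAMMA = 0.30     # body-effect coefficient [V^0.5] (assumed)
PHI_F2 = 0.70    # 2 * Fermi potential [V] (assumed)

def vth(v_sb: float) -> float:
    """Threshold voltage for a given source-to-body voltage V_SB."""
    return VTH0 + GAMMA * (math.sqrt(PHI_F2 + v_sb) - math.sqrt(PHI_F2))

# Reverse body bias (V_SB > 0) raises Vth -> lower leakage; forward body bias
# (V_SB < 0, kept small) lowers Vth -> faster but leakier logic.
for v_sb in (-0.3, 0.0, 0.3, 0.6):
    print(f"V_SB = {v_sb:+.1f} V  ->  Vth = {vth(v_sb):.3f} V")
```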




Design Exploration of Emerging Nano-scale Non-volatile Memory


Book Description

This book presents the latest techniques for the characterization, modeling and design of nano-scale non-volatile memory (NVM) devices. Coverage focuses on fundamental NVM device fabrication and characterization, identification of the internal state of memristive dynamics with physics-based modeling, NVM circuit design, and hybrid NVM memory system design-space optimization. The authors discuss design methodologies for nano-scale NVM devices from a circuits/systems perspective, including the general foundations of the fundamental memristive dynamics in NVM devices. Coverage includes physical modeling, as well as the development of a platform to explore novel hybrid CMOS and NVM circuit and system design.
• Offers readers a systematic and comprehensive treatment of emerging nano-scale non-volatile memory (NVM) devices;
• Focuses on the internal state of NVM memristive dynamics, novel NVM readout and memory-cell circuit design, and hybrid NVM memory system optimization;
• Provides both theoretical analysis and practical examples to illustrate design methodologies;
• Illustrates design and analysis for recent developments in spin-torque-transfer, domain-wall racetrack and memristor memories.
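As one hedged example of what physics modeling of memristive dynamics can look like (not necessarily the model used in the book), the sketch below integrates the well-known linear ion-drift memristor model, in which a normalized internal state variable sets the device resistance between its ON and OFF limits. All parameter values are assumed for illustration.

```python
# Hypothetical sketch: linear ion-drift memristor model integrated with forward
# Euler. State x in [0, 1] is the normalized doped-region width; resistance
# interpolates between R_ON and R_OFF. Parameter values are illustrative only.
import math

R_ON, R_OFF = 100.0, 16_000.0   # ohms (assumed)
MU_V = 1e-14                     # ion mobility [m^2 s^-1 V^-1] (assumed)
D = 10e-9                        # device thickness [m] (assumed)
DT = 1e-5                        # time step [s]

def simulate(v_amp=1.0, freq=1.0, cycles=2, x0=0.1):
    """Drive the device with a sine voltage and return (t, v, i, x) samples."""
    x, t, out = x0, 0.0, []
    steps = int(cycles / freq / DT)
    for _ in range(steps):
        v = v_amp * math.sin(2 * math.pi * freq * t)
        r = R_ON * x + R_OFF * (1.0 - x)          # memristance at current state
        i = v / r
        x += DT * MU_V * R_ON / D**2 * i          # linear ion-drift state update
        x = min(max(x, 0.0), 1.0)                 # hard window: clamp the state
        out.append((t, v, i, x))
        t += DT
    return out

samples = simulate()
print("final state x =", round(samples[-1][3], 4))
```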




Nanoscale Semiconductor Memories


Book Description

Nanoscale memories are used everywhere: from your iPhone to a supercomputer, every electronic device contains at least one kind. With coverage of current and prototypical technologies, Nanoscale Semiconductor Memories: Technology and Applications presents the latest research in the field of nanoscale memory technology in one place, together with the myriad of applications that this technology has enabled. The book begins with SRAM, addressing the design challenges as the technology scales, and provides design strategies to mitigate radiation-induced upsets. It discusses current state-of-the-art DRAM technology and the need to develop high-performance sense-amplifier circuitry. The text then covers the novel concept of capacitorless 1T DRAM, termed Advanced-RAM or A-RAM, and presents a discussion of quantum-dot (QD) based flash memory. Building on this foundation, the coverage turns to STT-RAM, emphasizing scalable embedded STT-RAM, and the physics and engineering of magnetic domain-wall "racetrack" memory. The book also discusses state-of-the-art modeling applied to phase-change memory devices and includes an extensive review of RRAM, highlighting the physics of operation and analyzing the different materials systems currently under investigation. The hunt is still on for a universal memory that fits all the requirements of an "ideal memory": high-density storage, low-power operation, unparalleled speed, high endurance, and low cost. Taking an interdisciplinary approach, this book bridges technological and application issues to provide the groundwork for developing custom-designed memory systems.




Embedded Memories for Nano-Scale VLSIs


Book Description

Kevin Zhang: Advancement of semiconductor technology has driven the rapid growth of very large scale integrated (VLSI) systems for increasingly broad applications, including high-end and mobile computing, consumer electronics such as 3D gaming, multi-function or smart phones, and various set-top players and ubiquitous sensor and medical devices. To meet the increasing demand for higher performance and lower power consumption in many different system applications, it is often required to have a large amount of on-die or embedded memory to support the need for data bandwidth in a system. The varieties of embedded memory in a given system have also become increasingly more complex, ranging from static to dynamic and volatile to nonvolatile. Among embedded memories, six-transistor (6T)-based static random access memory (SRAM) continues to play a pivotal role in nearly all VLSI systems due to its superior speed and full compatibility with logic process technology. But as technology scaling continues, SRAM design is facing severe challenges in maintaining sufficient cell stability margin under relentless area scaling. Meanwhile, rapid expansion in mobile applications, including new emerging applications in sensor and medical devices, requires far more aggressive voltage scaling to meet very stringent power constraints. Many innovative circuit topologies and techniques have been extensively explored in recent years to address these challenges.
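The stability-margin challenge mentioned above is driven largely by random threshold-voltage mismatch, whose standard deviation grows as device area shrinks (Pelgrom's rule: σ(ΔVth) = A_Vt / √(W·L)). The sketch below, with assumed illustrative numbers rather than anything taken from the book, shows how a fixed voltage margin covers fewer and fewer sigmas as the cell transistors scale down.

```python
# Illustrative sketch (assumed numbers, not from the book): random Vth mismatch
# per Pelgrom's rule, sigma(dVth) = A_Vt / sqrt(W * L), versus a fixed voltage
# margin. Shrinking W and L raises sigma, so the same margin covers fewer sigmas.
import math

A_VT = 2.5e-3          # Pelgrom coefficient [V*um] (assumed, typical order)
MARGIN = 0.12          # voltage margin the cell must tolerate [V] (assumed)

def sigma_dvth(w_um: float, l_um: float) -> float:
    """Std-dev of threshold-voltage mismatch for a device of area W*L (in um)."""
    return A_VT / math.sqrt(w_um * l_um)

# Hypothetical device sizes, shrinking with each technology generation.
for node, (w, l) in {"90nm-ish": (0.20, 0.10),
                     "45nm-ish": (0.10, 0.05),
                     "22nm-ish": (0.05, 0.03)}.items():
    s = sigma_dvth(w, l)
    print(f"{node}: sigma(dVth) = {1e3*s:.1f} mV, margin covers {MARGIN/s:.1f} sigma")
```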