The Army's Bandwidth Bottleneck


Book Description

Over the past decade, the U.S. Army's principal modernization initiative has been its digitization effort, designed to harness the power of the microchip to significantly improve the fighting capabilities of soldiers on the battlefield. But implementing that initiative presents significant challenges. Digitization requires the rapid transmission of large amounts of information over significant distances. Experiments conducted to date, as well as recent operations in Iraq in which troops employed some of the results of the service's digitization efforts, have shown that this requirement is difficult to fulfill not only under battlefield conditions but also in more benign circumstances.

Since 1999, the focus of the Army's modernization program has shifted to what it terms transformation: making its forces deployable more quickly while maintaining or improving their lethality and survivability. Although digitization is no longer the Army's primary modernization initiative, it remains a key element of transformation. In the past several years, questions about the size of the information flow associated with digitization and the communications bandwidth to support it (that is, the capacity to move information at sufficient speeds) have spurred the Army to adopt several large radio and network communications programs. Those programs, which will ultimately cost tens of billions of dollars, are currently planned to provide initial operational capability by about 2010.

This Congressional Budget Office (CBO) study, which was prepared for the Subcommittee on Tactical Air and Land Forces of the House Committee on Armed Services, examines how much bandwidth will be required to achieve the goal of digitization (the bandwidth demand) versus how much bandwidth will be provided by the Army's planned communications programs (the bandwidth supply). CBO finds that significant shortfalls currently exist in the Army's supply of bandwidth. Moreover, under the service's current plans, demand will continue to exceed supply beyond 2010, after the Army begins fielding its next generation of advanced radios and other communications equipment. The study examines several options that would improve the future match between bandwidth supply and demand. However, in keeping with CBO's mandate to provide impartial analysis, this report makes no recommendations.

Paul Rehmus of CBO's National Security Division wrote the study under the general supervision of J. Michael Gilmore. Army Cadet Kevin Linzey provided substantial assistance in the analysis of bandwidth for unmanned aerial vehicles. At CBO, Barbara Edwards, David Moore, and David Newman reviewed early drafts and provided helpful comments. Numerous U.S. Army officers also contributed useful comments on and data for drafts of the report and the underlying analysis. The author would like to acknowledge in particular the support of Lt. Col. Charles Gabrielson, Col. Edward Cardon, and former Chief of Staff of the Army Eric Shinseki.
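To make the supply-versus-demand framing concrete, here is a minimal back-of-envelope comparison in Python. Every figure below (per-node data rates, node counts, link capacity) is a hypothetical placeholder chosen for illustration, not a value from the CBO study; the point is only the shape of the calculation: aggregate offered load versus network capacity.

```python
# Hypothetical back-of-envelope comparison of bandwidth demand vs. supply.
# All numbers are illustrative placeholders, not figures from the CBO study.

PER_NODE_KBPS = {            # assumed average offered load per node type
    "command_vehicle": 512,
    "combat_vehicle": 64,
    "uav_video_feed": 3_000,
}

NODE_COUNTS = {              # assumed nodes in a notional brigade-sized unit
    "command_vehicle": 20,
    "combat_vehicle": 300,
    "uav_video_feed": 4,
}

LINK_SUPPLY_KBPS = 20_000    # assumed aggregate network capacity

# Demand is the sum over node types of (rate per node * number of nodes).
demand = sum(PER_NODE_KBPS[k] * NODE_COUNTS[k] for k in NODE_COUNTS)
shortfall = demand - LINK_SUPPLY_KBPS

print(f"Demand:    {demand:,} kbps")
print(f"Supply:    {LINK_SUPPLY_KBPS:,} kbps")
print(f"Shortfall: {shortfall:,} kbps" if shortfall > 0 else "No shortfall")
```

With these placeholder inputs the offered load exceeds capacity, which is the kind of mismatch the study describes; real analyses would refine both sides of the comparison with measured message sizes and link characteristics.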




Transforming Military Power since the Cold War


Book Description

An empirically rich account of how the West's main war-fighting armies have transformed since the end of the Cold War.




Department of Defense Appropriations for 2005


Book Description







The Army's Bandwidth Bottleneck


Book Description

Over the past decade, the U.S. Army's principal modernization initiative has been its digitization effort, designed to harness the power of the microchip to significantly improve the fighting capabilities of soldiers on the battlefield. But implementing that initiative presents significant challenges. Questions about the size of the information flow associated with digitization and the communications bandwidth to support it (that is, the capacity to move information at sufficient speeds) have spurred the Army to adopt several large radio and network communications programs. This study examines how much bandwidth will be required to achieve the goal of digitization versus how much bandwidth will be provided by the Army's planned communications programs. Charts and tables.




Collaborative Financial Infrastructure Protection


Book Description

The Critical Infrastructure Protection Survey recently released by Symantec found that 53% of interviewed IT security experts from international companies had experienced at least ten cyber attacks in the past five years, and financial institutions are often subject to some of the most sophisticated and large-scale cyber attacks and frauds. This book by Baldoni and Chockler analyzes the structure of software infrastructures found in the financial domain, their vulnerabilities to cyber attacks, and the existing protection mechanisms. It then shows the advantages of sharing information among financial players in order to detect and quickly react to cyber attacks. Various aspects of information sharing are investigated from organizational, cultural, and legislative perspectives.

The presentation is organized in two parts. Part I explores general issues associated with information sharing in the financial sector and sets the stage for the vertical IT middleware solution proposed in Part II. It is nonetheless self-contained and includes a survey of various types of critical infrastructure, along with their vulnerability analysis, that has not yet appeared in a textbook-style publication elsewhere. Part II then presents the CoMiFin middleware for collaborative protection of the financial infrastructure.

The material is presented in an accessible style and does not require specific prerequisites. It appeals both to researchers in security, distributed systems, and event processing who are working on new protection mechanisms, and to practitioners looking for state-of-the-art middleware technology to enhance the security of their critical infrastructures in, for example, banking, military, and other highly sensitive applications. The latter group will especially appreciate the concrete usage scenarios included.




Resource Proportional Software Design for Emerging Systems


Book Description

Efficiency is a crucial concern across computing systems, from the edge to the cloud. Paradoxically, even as the latencies of bottleneck components such as storage and networks have dropped by up to four orders of magnitude, software path lengths have progressively increased due to overhead from the very frameworks that have revolutionized the pace of information technology. Such overhead can be severe enough to overshadow the benefits of switching to new technologies like persistent memory and low-latency interconnects. Resource Proportional Software Design for Emerging Systems introduces resource proportional design (RPD) as a principled approach to software component and system development that counters the overhead of deeply layered code without removing flexibility or ease of development. RPD makes resource consumption proportional to situational utility by adapting to diverse emerging needs and the evolution of technology systems.

Highlights:
- Analysis of run-time bloat in deep software stacks, an under-explored source of power-performance wastage in IT systems
- Qualitative and quantitative treatment of key dimensions of resource proportionality
- Code features: unify and broaden supported but optional features without losing efficiency
- Technology and systems evolution: design software to adapt to changing trade-offs as technology evolves
- Data processing: design systems to predict which subsets of the data processed by an (analytics or ML) application are likely to be useful
- System-wide trade-offs: address interacting local and global considerations throughout software stacks and hardware, including cross-layer co-design involving code, data, and systems dimensions, and non-functional requirements such as security and fault tolerance
- Written from a systems perspective to explore RPD principles, best practices, models, and tools in the context of emerging technologies and applications

This book is primarily geared toward practitioners, with some advanced topics for researchers. The principles shared in the book are expected to be useful for programmers, engineers, and researchers interested in ensuring software and systems are optimized for existing and next-generation technologies. The authors are from both industry (Bhattacharya and Voigt) and academic (Gopinath) backgrounds.
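As a rough, self-contained illustration of the resource proportional idea (a sketch of the general pattern, not code from the book), the snippet below gates a heavyweight optional feature behind lazy construction, so callers that never use the feature pay essentially nothing for its presence. All names here are hypothetical.

```python
# Toy sketch of resource proportional design: a heavyweight optional
# component is constructed lazily, so its memory/CPU cost is incurred
# only by callers that actually use it. Names are hypothetical.

import functools

class HeavyAnalytics:
    """Stand-in for an expensive optional component."""
    def __init__(self):
        print("allocating large buffers...")   # costly one-time setup
        self.cache = [0] * 1_000_000

    def summarize(self, data):
        return sum(data) / len(data)

class Pipeline:
    @functools.cached_property
    def analytics(self):
        # Cost is proportional to use: built on first access only.
        return HeavyAnalytics()

    def process(self, data, with_summary=False):
        result = [x * 2 for x in data]         # core path stays lean
        if with_summary:                       # optional feature gated
            print("mean:", self.analytics.summarize(result))
        return result

p = Pipeline()
p.process([1, 2, 3])                      # never pays for HeavyAnalytics
p.process([1, 2, 3], with_summary=True)   # triggers lazy construction
```

The design choice being illustrated is that the unused feature's cost is moved off the common path entirely, rather than merely reduced, which is one simple way deep software stacks can avoid charging every caller for rarely used flexibility.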




Network World


Book Description

For more than 20 years, Network World has been the premier provider of information, intelligence and insight for network and IT executives responsible for the digital nervous systems of large organizations. Readers are responsible for designing, implementing and managing the voice, data and video systems their companies use to support everything from business critical applications to employee collaboration and electronic commerce.