Second Generation Mainframes


Book Description

Second Generation Mainframes: The IBM 7000 Series describes IBM’s second generation of mainframe computers, which introduced new technology, new peripherals, and advanced software. These systems continued the instruction sets of the IBM 700 series with significant enhancements, and their upward compatibility preserved customers’ investment in the earlier series. The use of magnetic cores, fast magnetic tapes and disks, and transistors yielded computation speeds that opened new domains for computation. Programming languages continued to be developed and enhanced, and new ones appeared for specific domains, such as SNOBOL, COBOL, and macro assemblers. Robust subroutine libraries for mathematical applications appeared. New operating systems gave programmers many new capabilities: data management and file systems, limited multiprocessing, timesharing, programming language support, and better error handling and control of peripherals. Early concepts in persistent file systems on magnetic disks were developed that changed the nature of job processing. The IBM 7000 series led the way in many innovative concepts that helped establish IBM as the foremost manufacturer of computer systems. However, the diversity of the models put significant strain on IBM’s financial resources and development teams, which ultimately led to IBM’s development of the System/360 family of machines.




Mainframe Computer Systems


Book Description

This volume describes General Electric Corporation’s venture into developing second and third generation mainframe computer systems. The General Electric Corporation (GE), which began its life as the Edison Electric Co., was long involved in electrical appliances and industrial machines. It was also a founder of the Radio Corporation of America, which eventually became one of its competitors, and developed many electrical systems to control different types of industrial machines. Its breakthrough into computing came with its winning bid to provide the computing systems for the Electronic Recording Method of Accounting (ERMA) system developed for the Bank of America by the Stanford Research Institute. The success of this project led GE to develop the GE-200 series, which was the foundation for commercial timesharing. The GE-235 was selected by Dartmouth for its Dartmouth Time Sharing System (DTSS), an innovative academic time-sharing system. BASIC was developed on the GE-235 computer system under DTSS. GE enhanced it to develop its Mark II/III Time Sharing System, apparently the first commercial time-sharing service in the world. GE developed the GE-300/-400 systems for industrial process control. The GE-600 series replaced the GE-200 series and demonstrated innovation in time-sharing systems. The GE-645 was selected to host Multics, which was developed by MIT. However, GE felt that it could not compete in computing against IBM, Univac, and other mainframe competitors, so it folded its tent and sold its Computer Division to Honeywell, Inc. Nevertheless, GE will be remembered for many innovations which continue to be used in modern computing systems.




First Generation Mainframes


Book Description

This volume describes several different models of IBM computer systems, characterized by different data representations and instruction sets that strongly influenced computer system architecture in the 1950s and early 1960s. They focused on a common system architecture that allowed peripherals to be used on different systems, albeit with specific adapters. These systems were modular, which made them easy to manufacture, configure, and service. Competing with UNIVAC, they first used Williams tubes for memory, and later introduced more reliable magnetic core memory. IBM developed its own magnetic tape drives and magnetic drums that were both faster and more reliable than UNIVAC’s peripherals. The first software systems that could reasonably be called “operating systems” enabled more efficient use of programmer time and system resources. The development of programming languages, notably FORTRAN, and assembly language processors, notably Autocoder, improved the productivity of programmers. In addition, IBM developed one of the finest product marketing, sales, and servicing organizations in the world. The legacy of the IBM 700 series is found in their popular successors, the IBM 7000 Series, which will be described in a forthcoming volume.




The Server


Book Description

A cutting-edge media history on a perennially fascinating topic, which attempts to answer the crucial question: Who is in charge, the servant or the master? Though classic servants like the butler or the governess have largely vanished, the Internet is filled with servers: web, ftp, mail, and others perform their daily drudgery, going about their business noiselessly and unnoticed. Why then are current-day digital drudges called servers? Markus Krajewski explores this question by going from the present back to the Baroque to study historical aspects of service through various perspectives, be it the servants’ relationship to architecture or their function in literary or scientific contexts. At the intersection of media studies, cultural history, and literature, this work recounts the gradual transition of agency from human to nonhuman actors to show how the concept of the digital server stems from the classic role of the servant.




What On Earth is a Mainframe?


Book Description

Confused about zSeries Mainframes? Need to understand the z/OS operating system - and in a hurry? Then you've just found the book you need. Avoiding technical jargon, this book gives you the basic facts in clear, light-hearted, entertaining English. You'll quickly learn what Mainframes are, what they do, what runs on them, and the terms and terminology you need to speak Mainframe-ese. But it's not all technical. There's also invaluable information on the people that work on Mainframes, Mainframe management issues, new Mainframe trends, and other facts that don't seem to be written down anywhere else. Programmers, managers, recruitment consultants, and industry commentators will all find this book their new best friend when trying to understand the Mainframe world.







Main Street to Mainframes


Book Description

Tells the story of Poughkeepsie’s transformation from small city to urban region.




The Digital Hand, Vol 3


Book Description

In the third volume of The Digital Hand, James W. Cortada completes his sweeping survey of the effect of computers on American industry, turning finally to the public sector, and examining how computers have fundamentally changed the nature of work in government and education. This book goes far beyond generalizations about the Information Age to the specifics of how industries have functioned, now function, and will function in the years to come. Cortada combines detailed analysis with narrative history to provide a broad overview of the role of computing and telecommunications in the entire public sector, including federal, state, and local governments, and in K-12 and higher education. Beginning in 1950, when commercial applications of digital technology began to appear, Cortada examines the unique ways different public sector industries adopted new technologies, showcasing the manner in which their innovative applications influenced other industries, as well as the U.S. economy as a whole. He builds on the surveys presented in the first volume of the series, which examined sixteen manufacturing, process, transportation, wholesale, and retail industries, and the second volume, which examined over a dozen financial, telecommunications, media, and entertainment industries. With this third volume, The Digital Hand trilogy is complete, and forms the most comprehensive and rigorously researched history of computing in business since 1950, providing a detailed picture of what the infrastructure of the Information Age really looks like and how we got there. Managers, historians, economists, and those working in the public sector will appreciate Cortada's analysis of digital technology's many roles and future possibilities.




Computer Jargon Explained


Book Description

Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why they are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly that it is very hard even for professionals to stay up to date. Computer people do not have time to keep abreast of developments that do not immediately affect what they are doing. Nonetheless, they are expected to be experts: to have instant, detailed, accurate answers to every question a non-specialist may pose to them. This book provides an alternative for computer professionals who need that wider perspective, a useful companion for making sense of complicated computer jargon and technical terms.