Information Systems Reengineering, Integration and Normalization


Book Description

Taking a very practical approach, the author describes in detail database conversion techniques, reverse engineering, forward engineering and re-engineering methodologies for information systems, offering a systematic software engineering approach for reusing existing database systems built with “old” technology. He demonstrates how existing systems can be transformed into new technologies while preserving semantic constraints and without loss of information. In this third edition, with a new chapter on Data Normalization, the author shows how, once the databases have been converted, to integrate them to consolidate information and how to normalize them so that they are efficient and user friendly. Many examples, illustrations and case studies, together with questions and answers, ensure that the methodology is easy to follow. Ideal as a textbook for students studying information systems theories, Information Systems Reengineering, Integration and Normalization will also be a valuable management reference book for information technology practitioners. Additional material is available on www.extramaterials/978-3-319-12294-6
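To make the normalization theme above concrete, here is a minimal, hypothetical Python sketch (not taken from the book). It removes duplicate customer data from a denormalized orders relation by splitting it into separate customer and order relations linked by a key, which is the basic move behind the normal forms; all table and attribute names are invented for illustration.

```python
# Hypothetical data: a denormalized orders relation that repeats customer
# details on every row, so the same customer facts are stored many times.
denormalized = [
    {"order_id": 1, "cust_id": "C01", "cust_name": "Ada",  "cust_city": "London", "item": "disk"},
    {"order_id": 2, "cust_id": "C01", "cust_name": "Ada",  "cust_city": "London", "item": "tape"},
    {"order_id": 3, "cust_id": "C02", "cust_name": "Alan", "cust_city": "Leeds",  "item": "disk"},
]

# Customer attributes depend only on cust_id, so split them into their own
# relation and keep orders referring to customers by that key (the essence
# of moving toward third normal form and eliminating duplicate data).
customers = {}
orders = []
for row in denormalized:
    customers[row["cust_id"]] = {"name": row["cust_name"], "city": row["cust_city"]}
    orders.append({"order_id": row["order_id"], "cust_id": row["cust_id"], "item": row["item"]})

print(customers)  # each customer stored exactly once
print(orders)     # orders carry only a key referencing the customer
```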




Information Systems Reengineering, Integration and Normalization


Book Description

Database technology is an important subject in Computer Science. Every large company and nation needs databases to store information. The technology has evolved from file systems in the 1960s, to hierarchical and network databases in the 1970s, relational databases in the 1980s, object-oriented databases in the 1990s, and XML documents and NoSQL databases today. As a result, there is a need to reengineer and update old databases into new ones. This book presents solutions for this task. In this fourth edition, Chapter 9, Heterogeneous Database Connectivity (HDBC), offers a database gateway platform for companies to communicate with each other not only through their data but also through their databases. The ability to share a database can support the applications of Big Data and surveys for decision support systems. The HDBC gateway solution collects input from the source database, transfers the data into its middleware storage, converts it into a common data format such as XML documents, and then distributes it to the users. HDBC transforms the common data into the target database to meet the users' requirements, acting like a voltage transformer hub: just as a transformer converts the supply voltage to the voltage required by the users, HDBC transforms source databases into the target database required by the users. This book covers reengineering for data conversion; integration for combining and merging databases and expert system rules; normalization for eliminating duplicate data from the database; and, above all, HDBC, which connects all legacy databases to one target database for the users. The authors also provide an Internet forum where readers can ask questions and receive answers from the authors and other readers.
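As a rough sketch of the gateway flow the description outlines (an illustrative assumption, not code from the book), the Python example below reads rows from a source database, converts them into a common XML format as the middleware payload, and then loads that payload into a target database. SQLite stands in for the legacy and target systems, and the table and column names are hypothetical.

```python
# Illustrative HDBC-style flow: source database -> common XML -> target database.
import sqlite3
import xml.etree.ElementTree as ET

def export_to_xml(source_conn, table):
    """Collect rows from the source database and convert them to a common XML format."""
    root = ET.Element(table)
    for row_id, name in source_conn.execute(f"SELECT id, name FROM {table}"):
        rec = ET.SubElement(root, "record")
        ET.SubElement(rec, "id").text = str(row_id)
        ET.SubElement(rec, "name").text = name
    return ET.tostring(root, encoding="unicode")

def import_from_xml(target_conn, xml_text, table):
    """Transform the common XML payload into the target database."""
    target_conn.execute(f"CREATE TABLE IF NOT EXISTS {table} (id INTEGER, name TEXT)")
    for rec in ET.fromstring(xml_text):
        target_conn.execute(
            f"INSERT INTO {table} VALUES (?, ?)",
            (int(rec.findtext("id")), rec.findtext("name")),
        )
    target_conn.commit()

# Usage: move a small table from a "legacy" database to a "target" database via XML.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src.execute("INSERT INTO customers VALUES (1, 'Ada'), (2, 'Alan')")
xml_payload = export_to_xml(src, "customers")

dst = sqlite3.connect(":memory:")
import_from_xml(dst, xml_payload, "customers")
print(dst.execute("SELECT * FROM customers").fetchall())
```

The XML payload here plays the role of the middleware's common data format: any number of source databases can be exported into it, and any target database can be populated from it.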










Information Systems Reengineering and Integration


Book Description

Reengineering involves the re-design of an existing information system whilst utilising as much of the existing system as possible. This book takes a practical approach to re-engineering existing systems, looks at data integration, and focuses on proven methods and tools. It also describes database conversion techniques in detail.




Knowledge Reuse and Agile Processes: Catalysts for Innovation


Book Description

Innovation, agility, and coordination are paramount in supporting value in the global knowledge economy. Therefore, the long-term success of a company is increasingly dependent on its underlying resilience and agility. Knowledge Reuse and Agile Processes: Catalysts for Innovation addresses the flexibility of both business and information systems through component technology at the nexus of three seemingly unrelated disciplines: service-oriented architecture, knowledge management, and business process management. Providing practitioners and academics with timely, compelling research on agile, adaptive processes and information systems, this Premier Reference Source will enhance the collection of every reference library.




Database and Expert Systems Applications


Book Description

This book constitutes the refereed proceedings of the 9th International Conference on Database and Expert Systems Applications, DEXA'98, held in Vienna, Austria, in August 1998. The 81 revised full papers presented were carefully selected from a total of more than 200 submissions. The papers are organized in sections on active databases, object-oriented systems, data engineering, information retrieval, workflow and cooperative systems, spatial and temporal aspects, document management, spatial databases, adaptation and view updates, genetic algorithms, cooperative and distributed environments, interaction and communication, transactions, advanced applications, temporal aspects, oriented systems, partitioning and fragmentation, database queries, data, data warehouses, knowledge discovery and data mining, knowledge extraction, and knowledge base reduction for comprehension and reuse.




Proceedings of Fifth International Congress on Information and Communication Technology


Book Description

This book gathers selected high-quality research papers presented at the Fifth International Congress on Information and Communication Technology, held at Brunel University, London, on February 20–21, 2020. It discusses emerging topics pertaining to information and communication technology (ICT) for managerial applications, e-governance, e-agriculture, e-education and computing technologies, the Internet of Things (IoT) and e-mining. Written by respected experts and researchers working in ICT, the book is a valuable resource for young researchers involved in advanced studies.




Handbook of Data Management


Book Description

Packed with dozens of no-nonsense chapters written by leading professionals, Handbook of Data Management, 1999 Edition shows your students how to design, build, and maintain high-performance, high-availability databases in multiple environments. It is the most comprehensive single-volume guide of its kind, providing the latest, most innovative solutions for planning, developing, and running a powerful data management function. Here students will find exhaustive coverage of the range of data repositories (from legacy indexed files to object databases and data warehouses) as well as details on everything from strategic planning to maximizing database performance. Completely revised and updated to reflect late-breaking technologies, the book includes extensive case studies and straightforward descriptions showing students how to implement Web-enabled data warehouses, build multimedia databases, master data mining, use enterprise database modeling, stay up-to-date with data conversion and migration, and maximize OLAP architectures and tools. It also provides ongoing coverage of the latest tools and techniques regarding organization for quality information systems, data definition, database design and management, object and hybrid databases, and more. Each contributor to Handbook of Data Management, 1999 Edition is an expert with first-hand experience in database and data management. These contributors provide a depth and breadth of coverage you and your students simply won't find anywhere else. Prepare your students for "real-world" business computing. Start them off with Handbook of Data Management, 1999 Edition.




High-Performance Web Databases


Book Description

As Web-based systems and e-commerce carry businesses into the 21st century, databases are becoming the workhorses that shoulder each and every online transaction. For organizations to run effective 24/7 Web operations, they need powerhouse databases that deliver at peak performance, all the time. High Performance Web Databases: Design, Development, and