The overwhelming majority of bugs and crashes in computer programming stem from problems with memory access, allocation, or deallocation, and such memory-related errors are notoriously difficult to debug. Yet the role that memory plays in C and C++ programming is a subject often overlooked in courses and books, because it requires specialised knowledge of operating systems, compilers, and computer architecture in addition to a familiarity with the languages themselves. Most professional programmers learn about memory entirely through experience of the trouble it causes. This 2004 book provides students and professional programmers with a concise yet comprehensive view of the role memory plays in all aspects of programming and program behaviour. Assuming only a basic familiarity with C or C++, the author describes the techniques, methods, and tools available for dealing with the problems related to memory and its effective use.
Recent achievements in hardware and software development, such as multi-core CPUs and DRAM capacities of multiple terabytes per server, have enabled the introduction of a revolutionary technology: in-memory data management. This technology supports the flexible and extremely fast analysis of massive amounts of enterprise data. Professor Hasso Plattner and his research group at the Hasso Plattner Institute in Potsdam, Germany, have been investigating and teaching the corresponding concepts and their adoption in the software industry for years.
This book is based on an online course that was first launched in autumn 2012 with more than 13,000 enrolled students and marked the successful starting point of the openHPI e-learning platform. The course is mainly designed for students of computer science, software engineering, and IT-related subjects, but addresses business experts, software developers, technology experts, and IT analysts alike. Plattner and his group focus on exploring the inner mechanics of a column-oriented, dictionary-encoded in-memory database. Covered topics include, amongst others, physical data storage and access, basic database operators, compression mechanisms, and parallel join algorithms. Beyond that, implications for future enterprise applications and their development are discussed. Step by step, readers will understand the radical differences and advantages of the new technology over traditional row-oriented, disk-based databases.
In this completely revised 2nd edition, we incorporate the feedback of thousands of course participants on openHPI and take into account the latest advancements in hardware and software. Improved figures, explanations, and examples further ease the understanding of the concepts presented. We introduce advanced data management techniques such as transparent aggregate caches and provide new showcases that demonstrate the potential of in-memory databases for two diverse industries: retail and life sciences.
An authoritative book for hardware and software designers. Caches are among the simplest and most effective mechanisms for improving computer performance. This innovative book exposes the characteristics of performance-optimal single- and multi-level cache hierarchies by approaching the cache design process through the novel perspective of minimizing execution times. It presents useful data on the relative performance of a wide spectrum of machines and offers empirical and analytical evaluations of the underlying phenomena. This book will help computer professionals appreciate the impact of caches and enable designers to maximize performance given particular implementation constraints.
Computer graphics, computer-aided design, and computer-aided manufacturing are tools that have become indispensable to a wide array of activities in contemporary society. Euclidean processing provides the basis for these computer-aided design systems, although it contains elements that inevitably lead to an inaccurate, non-robust, and complex system. The primary cause of the deficiencies of Euclidean processing is the division operation, which becomes necessary if an n-space problem is to be processed in n-space. The difficulties that accompany the division operation may be avoided if processing is conducted entirely in (n+1)-space. The paradigm attained through the logical extension of this approach, totally four-dimensional processing, is the subject of this book. This book offers a new system of geometric processing techniques that attains accurate, robust, and compact computations and allows the construction of a systematically structured CAD system.
The crime fiction novelist presents his first novella, A Lapse of Memory, based on his short story of the same name. When the mafia wants a mark eliminated, they turn to contract killer James Randall, a.k.a. Richard Johnson, and he usually delivers in spades. But Randall is aging and has developed the early stages of Alzheimer's. When a hit goes horribly wrong, he finds himself on the run, pursued by his former employers. Gritty and mesmerizing, A Lapse of Memory excites us with the chase, while also providing a backstory of a time long gone by. Himself in search of a lost past, the emotionally complex Randall is determined to evade his pursuers. But can he any longer differentiate friend from foe? Can he search out the secrets of his past as the present slips away? The novella is rife with inventive incidents, is evocative of the present as well as the physical atmosphere of the 1950s, and offers a number of interesting characters. As usual in Curry's work, the presence of the sea and the city is keenly felt and connected to the psychological action of the story.