Friday, June 05, 2015

Next-Gen Computing?

Radical Next-Gen Computing Guest Editor's Introduction | June 2015 - IEEECS
"In the past half century, driven by rapid, phenomenal advances in microelectronics closely following Moore’s law, computers of different kinds, forms, and shapes have evolved, redefined, and transformed almost everything we deal with. However, they still function on the same fundamental computational principles that Charles Babbage and Alan Turing envisaged and that John von Neumann and others subsequently refined. What’s in store for the next 50 years? Do the fundamental principles and assumptions that define modern computing — and that have guided us so far — require revolutionary rethinking?
  • Quantum Computing
  • Biologically Inspired Computing
  • Nanocomputing"
"The Machine will fuse memory and storage, flatten complex data hierarchies, bring processing closer to the data, embed security control points throughout the hardware and software stacks, and enable management and assurance of the system at scale."

HP has abruptly changed course on its 'Machine,' a new type of memory-driven computer it thinks will radically alter large-scale data processing. When the company first launched it last year, the plan was to use a new kind of memory chip called the "memristor," which is as fast as DRAM but can permanently store data.

HP compromises to get groundbreaking Machine to market - SlashGear
The first Machine will have 320 TB of memory, Fink confirmed, well in excess of the 12 TB its current servers top out at.
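
To make the "fuse memory and storage" idea more concrete: with large, byte-addressable, non-volatile memory, a program can keep its data structures in place across restarts instead of serializing them to a separate storage layer. Below is only a minimal sketch in Go, using a memory-mapped file as a stand-in for such memory; the file name, the 8-byte counter, and the Unix-only syscall.Mmap call are illustrative assumptions, not anything HP has specified.

// Sketch: what "fused memory and storage" could look like to software.
// A file-backed mmap stands in for byte-addressable persistent memory;
// a real design would also need explicit flush/ordering guarantees.
package main

import (
    "encoding/binary"
    "fmt"
    "os"
    "syscall"
)

func main() {
    // A small file acts as our stand-in for persistent memory.
    f, err := os.OpenFile("counter.pmem", os.O_RDWR|os.O_CREATE, 0644)
    if err != nil {
        panic(err)
    }
    defer f.Close()
    if err := f.Truncate(8); err != nil {
        panic(err)
    }

    // Map it into the address space: loads and stores now go straight
    // to "storage", with no explicit read/write or serialization step.
    mem, err := syscall.Mmap(int(f.Fd()), 0, 8,
        syscall.PROT_READ|syscall.PROT_WRITE, syscall.MAP_SHARED)
    if err != nil {
        panic(err)
    }
    defer syscall.Munmap(mem)

    // Update a persistent counter in place; it survives restarts.
    n := binary.LittleEndian.Uint64(mem)
    binary.LittleEndian.PutUint64(mem, n+1)
    fmt.Println("runs so far:", n+1)
}

The point of the sketch is that the load/store path and the persistence path become the same thing, which is roughly what "flattening complex data hierarchies" means from the programmer's side.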


Evolution is usually a better way to improve... Computer system architecture needs adjustment toward co-locating processing with storage and toward more effective parallel communication, along the lines of software actors or Erlang's lightweight processes, implemented in hardware. That may fall under the "biologically inspired" category, but it is essentially engineering optimization: something like a supercomputer made of many well-connected smartphones.
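
To illustrate the actor style at the software level: each worker owns its own shard of data (processing co-located with its storage) and is reached only through messages, never through shared memory. This is a minimal sketch in Go, with goroutines and channels standing in for Erlang's lightweight processes; the data, shard count, and query type are made up for illustration, and a hardware implementation would of course look very different.

// Sketch: actor-style parallelism with goroutines and channels.
// Each worker owns one shard and is reached only through messages.
package main

import "fmt"

// query asks a worker to sum its shard and reply on the given channel.
type query struct {
    reply chan int
}

// worker owns shard and serves queries until its inbox is closed.
func worker(shard []int, inbox <-chan query) {
    for q := range inbox {
        sum := 0
        for _, v := range shard {
            sum += v
        }
        q.reply <- sum
    }
}

func main() {
    // Partition a dataset across four workers.
    data := []int{3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8}
    const workers = 4
    per := len(data) / workers
    inboxes := make([]chan query, workers)
    for i := 0; i < workers; i++ {
        inboxes[i] = make(chan query)
        go worker(data[i*per:(i+1)*per], inboxes[i])
    }

    // Fan out one query per worker, then fan in the partial sums.
    reply := make(chan int)
    for _, inbox := range inboxes {
        inbox <- query{reply: reply}
    }
    total := 0
    for i := 0; i < workers; i++ {
        total += <-reply
    }
    fmt.Println("total:", total) // 52
}

Scaling this pattern out over fast interconnects, with each node holding its own memory and compute, is roughly the "supercomputer made of many well-connected smartphones" picture.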

HP did something similar before, in the late 1990s, when its proprietary e-speak protocol was the more optimized design, yet XML web services "won" by being open and standardized, even though they were less optimal.
