3/2/2024

Bill Reichert, Partner, Pegasus Tech Ventures

For roughly 75 years, the fundamental architecture of computers has not changed much. Certainly, the hardware has changed tremendously, and software has evolved accordingly. But the basic idea of storing instructions and data in binary code, and using on/off digital hardware to execute mathematical and logical operations, has remained roughly the same for decades.

The same advances in semiconductor fabrication technology that powered Moore's Law, the exponential increase in the power of computers over the last several decades, have enabled hardware engineers to develop new architectures that promise to transform the computing landscape over the coming decades. At the same time, software engineering is also progressing. Marc Andreessen has famously said, "Software is eating the world." What he did not make clear, though, is that virtually all the progress in computing over the past 30 years has been thanks to hardware, not software.

The second half of the 20th century was all about moving computing from centralized mainframe computers to distributed desktop and laptop computers. With the development of high-speed Internet, the thinking shifted, and an application could sit in the "cloud" and support thousands, even millions, of users. But as the Internet of Things took off and enabled data collection from literally billions of devices, moving all that data up to the cloud in order to crunch it has become a challenge.

With the emergence of deep neural network software, engineers realized that Graphics Processing Units, an architecture commercialized by Nvidia, were nicely designed for doing the massive matrix calculations required by neural network models. But GPUs are not exactly optimized for AI, and so there has been an explosion of startups seeking to develop chips that offer 10x or 100x the performance and power efficiency of GPUs. On the server side, companies like Cerebras Systems and Graphcore, and more recently SambaNova, are promising order-of-magnitude improvements. And on the edge, companies like Gyrfalcon Technology, Syntiant, and Blaize are promising even greater improvements in performance and power efficiency.

A new class of hardware is emerging that takes advantage of what is called "heterogeneous computing": multi-core chips that incorporate multiple different co-processors that are optimized for specialized tasks. New architectures, however, require that software engineers and hardware engineers work together. Writing software that takes full advantage of these new chips is extremely challenging, and so companies like SambaNova Systems are developing operating systems and software compilers that optimize the application code automatically and allocate resources to compute tasks dynamically in real time as computing demands change.
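To see why GPUs suit neural networks so well, it helps to remember that a dense neural-network layer boils down to one large matrix multiplication, plus a bias and a nonlinearity. The following is a minimal illustrative sketch in NumPy (the shapes and names are hypothetical, not from any specific chip or framework):

```python
import numpy as np

# A dense neural-network layer is, at its core, a matrix multiplication:
# a batch of input vectors times a weight matrix, shifted by a bias and
# passed through a nonlinearity. This is the workload GPUs parallelize well.

rng = np.random.default_rng(0)

batch, n_in, n_out = 32, 784, 128           # hypothetical sizes
x = rng.standard_normal((batch, n_in))      # input activations
W = rng.standard_normal((n_in, n_out))      # learned weights
b = np.zeros(n_out)                         # learned biases

y = np.maximum(x @ W + b, 0.0)              # matrix multiply + bias + ReLU

print(y.shape)                              # one layer's output: (32, 128)
```

Deep networks stack many such layers, so nearly all of the arithmetic is matrix multiplication, which is why hardware that accelerates that one operation accelerates the whole model.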
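The core idea of heterogeneous computing, routing each kind of task to the co-processor best suited for it, can be sketched in a few lines. The accelerator names and task categories below are purely hypothetical placeholders, not the design of any particular chip:

```python
# Hypothetical sketch of a heterogeneous-computing scheduler: each task
# kind is mapped to a specialized co-processor, with the general-purpose
# CPU as the fallback for everything else.

ACCELERATORS = {
    "matmul": "tensor-unit",    # dense linear algebra
    "decode": "video-codec",    # media processing
    "fft":    "dsp",            # signal processing
}

def dispatch(task_kind: str) -> str:
    """Pick a specialized unit if one exists, else fall back to the CPU."""
    return ACCELERATORS.get(task_kind, "general-purpose-cpu")

print(dispatch("matmul"))   # routed to the specialized unit
print(dispatch("sort"))     # no accelerator, so it runs on the CPU
```

In real systems this routing decision is what the compilers and runtime schedulers mentioned above automate, remapping work dynamically as demands change rather than using a fixed table.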