
Future Compute and Microelectronics

[The von Neumann Architecture - TechTarget]
 

- Overview

Microelectronic devices are critical to nearly every aspect of our lives -- from running small businesses to powering the global economy, from tracking our personal health to fighting pandemics, from powering our homes to protecting our nations' infrastructure. 

Microelectronics include computer chips, power electronics (devices that control and convert electrical power), and other small semiconductor devices. Since the mid-20th century, the rapid reduction in the size and cost of microelectronic devices, coupled with their increasing performance and energy efficiency, has changed the world in a remarkably short period of time. 

However, these transformative devices now face technical and economic challenges that demand new innovations. A second revolution in microelectronics will draw on new understandings in the physical and computational sciences. 

 

- The Future of Microelectronics

For such a tiny part, the transistor plays a huge role in our lives. Transistors - invented in 1947 by former ECE ILLINOIS Professor John Bardeen and two fellow physicists - helped usher in the information revolution. They are ubiquitous in technology, and their low cost, flexibility, and reliability have enabled remarkable advances in computers, machines, equipment, and products - anything that involves microelectronics. 

And those advancements have compounded over the years. Consider this: Intel's 22nm 3D transistors, introduced in 2011, run over 4,000 times as fast as those in Intel's first microprocessor, introduced in 1971. They use about 5,000 times less energy, and the price per transistor has dropped by a factor of about 50,000. The company manufactures more than 5 billion transistors every second. That adds up to incredible speed at very affordable cost, which translates into ever-improving technology. There is only one catch, but it is a big one: Moore's law, which projects that the number of transistors in a dense integrated circuit doubles every two years - a pace that cannot continue indefinitely.
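As a back-of-the-envelope check on those figures, the sketch below projects transistor counts under an idealized two-year doubling. It is illustrative only; the 1971 baseline is the roughly 2,300 transistors of the Intel 4004, Intel's first microprocessor.

    # Back-of-the-envelope Moore's-law arithmetic (illustrative sketch).
    # Assumes the canonical formulation: counts double every two years.

    BASE_YEAR = 1971          # Intel 4004, Intel's first microprocessor
    BASE_TRANSISTORS = 2_300  # approximate transistor count of the 4004
    DOUBLING_PERIOD = 2.0     # years per doubling (Moore's 1975 revision)

    def projected_transistors(year: float) -> float:
        """Project a transistor count under ideal Moore's-law scaling."""
        doublings = (year - BASE_YEAR) / DOUBLING_PERIOD
        return BASE_TRANSISTORS * 2 ** doublings

    # 1971 -> 2011 spans 40 years, i.e. 20 doublings.
    print(f"Projected 2011 count: {projected_transistors(2011):,.0f}")
    print(f"Growth factor: {projected_transistors(2011) / BASE_TRANSISTORS:,.0f}x")

Twenty doublings is roughly a millionfold growth in transistor count, which puts the speed, energy, and cost improvements quoted above in perspective.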

“The overarching problem is the semiconductor industry has been on a scaling path for almost 50 years,” says ECE ILLINOIS Professor Wen-Mei W. Hwu, who is an affiliate of Illinois Computer Science and the Coordinated Science Lab. But he believes the pace of advancement based on this scaling process is coming to an end soon, “because as the transistors get smaller and smaller, the process has become way too expensive,” Hwu adds. In other words, you can shrink transistors only so far. Then you must look for advancements in other ways.  

 

- Future Compute

Nearly 60 years on, Moore's Law still stands strong for many in the computing world. But the surge of artificial intelligence and machine learning has coincided with the breakdown of Moore's Law, and for many thought leaders what lies beyond is not entirely clear.  

The future of technology is uncertain as Moore's Law comes to an end. However, while most experts agree that silicon transistors will stop shrinking around 2021, this doesn't mean Moore's Law is dead in spirit - even though, technically, it might be. Chip makers have to find other ways to increase computing power. For example, there are germanium and III-V semiconductor technologies - and, at some point, carbon nanotubes - that provide new ways of increasing performance. There are also gate-all-around transistor designs, extreme-ultraviolet lithography, directed self-assembly techniques, and so on.  

Progress in technologies such as photonics, micro- and nanoelectronics, smart systems and robotics is changing the way we design, produce, commercialize and generate value from products and related services.

 

- The End of Moore's Law

Moore's Law is one of the best technology predictions of the past 50 years. In 1965, Gordon E. Moore predicted that the number of components on integrated circuits (ICs) would double every year; in 1975 he revised the forecast to a doubling every two years. His hypothesis became known as Moore's Law. The increase in chip density is primarily attributable to four main factors: die size, line size, technical brilliance, and technological innovation.
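Stated compactly (a standard textbook formulation, not Moore's own notation), the projected transistor count N(t) in year t is

    N(t) = N0 x 2^((t - t0) / T)

where N0 is the count in a reference year t0 and T is the doubling period: one year in Moore's 1965 paper, two years after his 1975 revision.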

Since the 1970s, the semiconductor industry has steadily shrunk transistors from micron to nanometer dimensions. The smartphone in your pocket is more powerful than the supercomputer of half a century ago - thanks in part to miniaturization. Every 18 months or so, chipmakers pack in twice as many transistors - the famous Moore's Law. But as transistors approach the size of atoms, the classical scaling behind Moore's Law runs up against the laws of quantum mechanics. 

Over the years, however, the law has weakened. This decline is due to the increasing complexity involved in creating cutting-edge technology.

 

- Characteristics of von Neumann Computers

The challenge goes beyond transistor size. Most computer processors are based on the 70-year-old von Neumann model, named for mathematician and computer scientist John von Neumann. In this model, the processing unit is separate from, but connected to, a memory unit, and instructions and data must be shuttled back and forth between memory and processor during computation. 

Von Neumann computers have the following characteristics:  

  • Separate processing and storage units. A von Neumann computer has a separate central processing unit for processing data and a memory unit for storing data.
  • Binary value. Von Neumann computers encode data using binary values.
  • Speed and energy issues. To perform computations, data must move back and forth between separate processing and memory locations. This constant shuttling is known as the von Neumann bottleneck; it limits the computer's speed and increases its energy consumption. Neural networks and machine-learning software that run on von Neumann hardware often must trade fast computation against low energy consumption, one at the expense of the other. (A minimal simulation of this architecture appears after this list.)
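
To make the model concrete, here is a minimal sketch of a von Neumann-style machine in Python. The instruction set, opcodes, and program are hypothetical, invented purely for illustration; the point is that instructions and data share one memory, and every step of the computation shuttles values between that memory and the processing unit.

    # Minimal sketch of a von Neumann-style machine (hypothetical ISA).
    # Instructions and data share one memory; the CPU repeatedly fetches,
    # decodes, and executes, moving values to and from memory each cycle.

    memory = [
        ("LOAD", 5),     # 0: accumulator <- memory[5]
        ("ADD", 6),      # 1: accumulator <- accumulator + memory[6]
        ("STORE", 7),    # 2: memory[7] <- accumulator
        ("HALT", None),  # 3: stop
        None,            # 4: (unused)
        40,              # 5: data operand
        2,               # 6: data operand
        0,               # 7: result goes here
    ]

    accumulator = 0
    pc = 0  # program counter

    while True:
        opcode, addr = memory[pc]       # fetch (one memory access)
        pc += 1
        if opcode == "LOAD":
            accumulator = memory[addr]  # another memory access for the data
        elif opcode == "ADD":
            accumulator += memory[addr]
        elif opcode == "STORE":
            memory[addr] = accumulator
        elif opcode == "HALT":
            break

    print(memory[7])  # -> 42

Even this tiny program makes seven memory accesses: four instruction fetches plus three data transfers. On real hardware, every one of those accesses costs time and energy, which is exactly the bottleneck described below.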

 


- The von Neumann Bottleneck

The von Neumann bottleneck is a limitation on system performance: the processor can handle data faster than it can be transferred to and from memory, so the CPU sits idle while memory is being accessed. The bottleneck is inherent in the von Neumann model of computer architecture, named after mathematician and computer scientist John von Neumann, and it constrains virtually every conventional computer design.

This extended data movement consumes energy and generates heat—the so-called von Neumann bottleneck. For supercomputing and data centers, this means building expensive power and cooling infrastructure. For scientists who want to analyze large amounts of data in real time during experiments, memory access and capacity, among other data bottlenecks, are barriers to scientific discovery.
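
One way to observe the bottleneck on an ordinary machine is to perform the same arithmetic while varying only how the data moves. A minimal sketch, assuming NumPy is available; the timings are illustrative and vary by machine:

    # Illustrative sketch: identical additions, dominated by data movement.
    # Sequential access streams through memory; random access defeats the
    # caches, so the processor stalls waiting on DRAM.
    import time
    import numpy as np

    n = 10_000_000
    data = np.random.rand(n)
    sequential = np.arange(n)            # visit elements in memory order
    shuffled = np.random.permutation(n)  # visit the same elements at random

    t0 = time.perf_counter()
    s1 = data[sequential].sum()          # cache-friendly: prefetcher keeps up
    t1 = time.perf_counter()
    s2 = data[shuffled].sum()            # cache-hostile: a miss on most loads
    t2 = time.perf_counter()

    assert abs(s1 - s2) < 1e-6 * n      # same values summed, same result
    print(f"sequential: {t1 - t0:.3f} s")
    print(f"random:     {t2 - t1:.3f} s  (slower despite identical math)")

The gap between the two timings is time the processor spends stalled on memory rather than computing - precisely the idle CPU described above.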

 

- Reinventing Microelectronics for the 21st Century

In response to challenges such as the end of Moore's Law and the von Neumann bottleneck, the U.S. Department of Energy is funding research to design new materials that exploit properties at the atomic and subatomic scales. It also funds research to develop new models of computing, including neuromorphic computing, which mimics the way the brain works, and quantum computing, which harnesses the physics of quantum mechanics to solve new and complex problems. Data-driven software in artificial intelligence and quantum information science will benefit from these new architectures designed for their unique purposes. There may be others as well -- we may see hybrid computing models in the future.

However, it's not just about computers. As the U.S. upgrades its 100-year-old power grid, microelectronics will be key to adding more renewable energy, preventing cyber-attacks, and introducing two-way power flow between consumers and the grid, which optimizes usage. Scientists are also planning how future microelectronics will improve research itself. Energy-efficient, data-flexible microelectronics will allow researchers to collect and analyze more data faster, using equipment closer to experimental setups.

Making all of this possible requires a collaborative "co-design" approach that brings together experts from across the microelectronics pipeline, from start to finish. Historically, each step in the R&D process was carried out independently. Through co-design, the U.S. Department of Energy will bring materials scientists, chemists, mathematicians, computer engineers, industry partners, and others together to inform every step of the process and innovate better and faster.

Knowing the enormous impact microelectronics has on our lives today, it is critical to prepare for this important technology of the future. The microelectronics program led by the U.S. national laboratories will help bridge the gap between advancing the science of microelectronics and bringing these new technologies into the lab and into the marketplace.

 

 

 

[More to come ...]


