
50 Years On, Moore’s Law and The Future of Technology - Will It Soon Cease To Exist?

Moore's Law led to the widespread adoption of computers and smartphones, but there are tell-tale signs that it is slowing and could cease to hold within the decade.


In a recent FORTUNE article, Supratik Guha, director of physical sciences at IBM, states that “there are tell-tale signs that Moore’s Law is slowing, and we are almost certain that the law will cease to hold within a decade.”

In 1950, at a time when there were fewer than 10 digital computers worldwide, Bill Pfann, a 33-year-old scientist at Bell Laboratories in New Jersey, discovered a method that could be used to purify elements, such as germanium and silicon. He could not possibly have imagined then that this discovery would enable the silicon microchip and the rise of the computer industry, the Internet, and the emergence of the information age.

Today, there are about 10 billion Internet-connected devices in the world, such as laptops and mobile phones, and at the heart of each of these devices there is at least one microchip that acts as its “engine”.


The reason behind this relentless progress is neatly contained in a prophetic observation announced 50 years ago this Sunday: Moore’s Law. The microchip is built from tiny electrical switches of purified silicon called transistors, and the law stated that the number of transistors on a chip would double every year. In 1975, Gordon Moore revised his forecast to state that the count would double every two years. The law has held true since.
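The doubling schedule is easy to sketch numerically. Below is a minimal illustration of the trend line; the starting count of roughly 64 components in 1965 is taken from Moore's original paper, and real products deviate from this idealized curve.

```python
# Illustrative sketch of Moore's Law: transistor count doubling every two years.
# The 1965 base count (~64 components) comes from Moore's original paper;
# this is a trend line, not a datasheet.

def transistors(year, base_year=1965, base_count=64, doubling_years=2):
    """Projected transistor count under a fixed doubling cadence."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1975, 1995, 2015):
    print(y, f"{transistors(y):,.0f}")
```

Run over 50 years, the two-year cadence alone takes the count from tens of components to billions, which is roughly where modern smartphone chips sit.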


“As Moore’s Law slows down, innovations in other areas, such as developments in software, will pick up the slack in the short term,” Supratik Guha, IBM director of physical sciences

Why is Moore’s Law relevant? Because this doubling of the number of transistors led to computer chips that could be packed with increasingly sophisticated circuitry that was both energy efficient and cheap. This led to the widespread adoption of computers, mobile phones, and the information technology revolution.

The price of computation is about 10 million times cheaper than it was 40 years ago, and the computing power held in a smartphone outstrips the workstations that computer scientists used in their offices in the 1990s. That Moore’s Law has held true so far is the reason that the electronic circulation of information has been commoditized, changing the way many of us learn, bank, travel, communicate and socialize.
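A quick back-of-envelope check shows what that figure implies. The numbers below are the article's own (10-million-fold over 40 years), and the calculation simply asks what doubling cadence in cost-efficiency would produce them:

```python
import math

# Back-of-envelope: "10 million times cheaper over 40 years" implies
# how frequent a halving of cost? (Figures are the article's estimates.)
cost_ratio = 10_000_000
years = 40
doublings = math.log2(cost_ratio)      # number of halvings of cost, ~23.3
implied_cadence = years / doublings    # years per halving
print(f"{doublings:.1f} halvings, one every {implied_cadence:.2f} years")
```

The implied cadence of roughly 1.7 years per halving is slightly faster than the two-year transistor doubling, which is plausible since cost per computation improved through design and manufacturing gains as well as density.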

Take the example of social networking using a mobile phone. It works because the cost of a transistor has dropped a millionfold and computing has become about 10,000 times more energy-efficient since 1980, when this writer first went to engineering school. Consequently, a $200 smartphone powered by a biscuit-sized battery contains a microchip with a few billion transistors in it and enough computing power to digitally process an image, then upload and share it wirelessly using powerful mathematics to encode the data. This is a consequence of Moore’s Law in action.

Yet, on its 50th anniversary, there are tell-tale signs that Moore’s Law is slowing, and we are almost certain that the law will cease to hold within a decade. With further miniaturization, silicon transistors will attain dimensions of the order of only a handful of atoms and the laws of physics dictate that the transistors and electronic circuits will cease to work efficiently at that point. As Moore’s Law slows down, innovations in other areas, such as developments in software, will pick up the slack in the short-term.

But in the longer term, there will be fundamental changes in the essential design of the classical computer that, remarkably, has remained unchanged since the 1950s. Designed for precise calculations, today’s computing machines do not make inferences or qualitative decisions, or recognize patterns from large amounts of data efficiently.

The next substantive leap forward will be in computers with human-like cognitive capabilities that are also energy efficient. IBM’s Watson, the computing system that won the television game show Jeopardy! in 2011, consumed about 4,000 times more energy than its human competitors. This experience reinforced the need for new energy-efficient computing machines that are designed differently from the sequential, calculative methodology of classical computers and are inspired, perhaps, by the way biological brains work.

A journalist recently asked me whether the continuation of Moore’s Law was indispensable. It is the beauty of the collective enterprise of human innovation that nothing is indispensable indefinitely for technology to progress. Decades later, one might look at the era of Moore’s Law as a golden period when computers came of age through a masterful display of an industry’s ability to miniaturize and create billions of flawless, identical copies of tiny circuits at factories throughout the world. But, much as migratory birds flying in V-formation rotate the lead position, there will, at that future time, be many other technologies carrying us forward in the information age.

Source: FORTUNE

Moore’s Law and The Future of Technology
In his original paper, Gordon Moore wanted to give people a sense of what was possible: to make electronics so cheap that we could put them into tiny computers that everybody could use. He wanted to show people that, very rapidly, this technology would get very, very cheap.

But then, in 1970, Carver Mead, who worked with Gordon Moore, took a deeper dive into the actual physics of transistors. He named the trend Moore’s Law around that time and showed that it could continue for probably another 20, 30, 40, even 50 years.

In the above video, AEI visiting scholar Bret Swanson dives into how this established technology will shape our future.

Over the last several decades as Moore’s Law has been advancing, we’ve seen that nearly all the productivity advances in the U.S. economy have come from information technology.


How can this be? Well, other sectors like healthcare and education have been stagnant. If we can use information technology to liberate these sectors and make them more productive, they can become real contributors to the economy in the decades ahead and fuel a new era of growth.

Even if Moore’s Law proper is slowing down because of this atomic limit, think about the ways we are going to apply computer technology in the future. Until today, medicine has been mostly a biological phenomenon, an arena of trial and error for doctors and researchers. But if we make medicine an information technology, looking into people’s DNA and their proteins and actually finding out how their cells work and how diseases work, we can target each person individually with customized, information-driven medicine.

There is also a question of whether the digital revolution has benefited just a few Silicon Valley entrepreneurs or whether it has benefited people across the middle class. My view is that the digital revolution has dramatically improved the lives of people from top to bottom, across all classes and across the world. Even with the same or a lower income today, people in the lower middle class have mobile phones and the internet, which let them access so much more information and so many more services, talk to people across the world, educate themselves, and engage in entrepreneurship.

Information technology has produced so much innovation in large part because it has been so free. People in Silicon Valley and across the world have been able not only to build their own microchips and invest in new silicon fabrication technologies, but then to use those technologies to build software apps and new devices.

There is a real worry today that, with more regulation of the internet coming out of Washington, D.C., we may slow down the rate of innovation on the internet and in the digital world. I can’t think of anything worse for the U.S. economy, for innovation, and for living standards going into the next decades than to strangle the internet with these old-school regulations.

The next 50 years are going to be just amazing. The last 50 years have been all about information technology. Now we are going to see how information technology changes virtually every industry that we know. It is perhaps going to impact our lives more in the next 50 years than it has in the past 50.

Source: American Enterprise Institute



