In 1949, the German engineer and physicist Werner Jacobi filed a patent for an integrated transistor amplifier.
In 1952, the British radio engineer Geoffrey Dummer proposed the novel idea that many standard electronic components could be combined within a single semiconductor crystal. Finally, in 1953, Harwick Johnson filed a patent for a prototype integrated circuit.
The integrated circuit, better known as the semiconductor chip, drove the development of the digital age; its impact on the course of humanity is comparable to that of the Industrial Revolution.
Today, trillions of chips work tirelessly to make our lives easier and more comfortable. Can you imagine a world with no Apple, Samsung, Google, or Microsoft, no laptops, no television, no space exploration? Such a life would be unthinkable, not to mention the millions of jobs that would be lost in the industries that build products around the semiconductor chip.
Come to think of it, there would be no internet, and you would not be learning about the history of the chip on your computer, as you are right now.
In 1958, an engineer named Jack Kilby, working at Texas Instruments, put together some electronic components (diodes and transistors) on a single piece of semiconductor material and created one of the first precursors of the semiconductor chip as we know it today.
To go back to the history of these components: the vacuum-tube triode was invented by Lee de Forest in 1906, when he found that placing an electrified mesh between two electrodes in a vacuum both amplified the current and could act as a switch. Vacuum tubes of this kind powered the early radios and telephones.
The first computer: from bumbling baby steps to fastest sprint
The limitations of the vacuum tube became painfully clear in 1946, when the first general-purpose electronic digital computer, named ENIAC, was unveiled. ENIAC was a behemoth: it weighed more than 30 tons, contained more than 100,000 parts, and measured roughly 100 feet in length, 10 feet in height, and 3 feet in depth.
ENIAC consumed around 150 kilowatts of power when switched on; according to a popular (though probably apocryphal) story, the lights of Philadelphia dimmed the first time it was powered up. Vacuum tubes of that era consumed a great deal of power and failed often, which meant that ENIAC needed servicing every few days.
These constant failures led scientists at AT&T's Bell Labs to form a team to find a replacement for the vacuum tube. The team was tasked with finding a solid-state device, one with no vacuum, moving parts, or filaments, that would behave as a vacuum tube did. The team naturally turned to semiconductors, whose properties were then the talk of scientific circles.
In December 1947, the researchers, John Bardeen and Walter Brattain, working in William Shockley's group, demonstrated the first working transistor. Transistors sharply reduced the power needed to run an electronic circuit. A practical circuit, however, still required several transistors, resistors, and capacitors joined by soldered wire connections, and a single bad connection could stop the whole circuit from working.
Jack Kilby, as mentioned above, thought of building all of these components on a single block of semiconductor material, eliminating the wires and faulty connections to produce a compact circuit, hence the name integrated circuit. On September 12, 1958, the first integrated circuit was demonstrated to the world, and it worked perfectly.
About half a year later, Robert Noyce, an engineer at Fairchild Semiconductor in California, came up with another way of making an integrated circuit, one that allowed identical "chips" to be manufactured in huge quantities. A decade later, in 1968, Noyce co-founded a young company that would grow into one of the largest semiconductor manufacturers in the world: Intel.
Thus the semiconductor revolution was launched.
In 1961, the United States Air Force received the first computers built with semiconductor chips. Then came the pocket calculator, built by Texas Instruments, and from there chip-based electronics progressed rapidly.
In 1965, Gordon Moore, later one of the founders of Intel, observed that the number of transistors that could be fitted on a chip doubled roughly every year (a pace he later revised to every two years), while the cost per transistor kept falling. This observation became known as Moore's law, and it has guided the electronics industry ever since, driving ever-smaller semiconductor chips and enormous wealth creation.
Moore once quipped that if the auto industry had progressed like the semiconductor industry, a Rolls-Royce would travel half a million miles on a gallon of fuel and would be cheaper to throw away than to park. Look at a semiconductor chip made today and you will find billions of components fitted onto a die the size of your fingernail.
What is the future of the semiconductor chip?
Many wonder whether the semiconductor industry can maintain the pace at which it has grown. Increasing the power of chips has become a serious challenge, because fundamental physical barriers are now being reached, beyond which conventional scaling cannot continue. Nanotechnology promises to push past some of these limits, but it too will eventually reach a point where it can go no further.
There are proposals for semiconductor chips powered by light instead of electricity, which would let computers operate at the speed of light, in a manner of speaking. The next time you switch on your television, or as you read this article on your computer, take a moment to think about the incredible journey of the semiconductor that made it all possible.
This is truly the semiconductor that changed the world.