HOW THE CHIP TRANSFORMED THE INDUSTRY
We tend to take the technology for granted, but digital hearing aids as we know them today wouldn't exist without the humble microchip.
Chances are you are using something with a chip in it right now.
The microchip, or integrated circuit, is the nervous system that controls just about every electronic device in the world. It is central to computers, mobile phones, satellites, home electronics, aircraft, microwave ovens, washing machines, iPods, cars, the internet and, of course, hearing aids.
Before the invention of the chip, electronic devices such as computers and radios used vacuum tubes, or valves, which were cumbersome, heavy and generated a large amount of heat while consuming a lot of power. For example, in the forties, typical computers used over 10,000 vacuum tubes and occupied around 100 square metres of space!
With the invention of the transistor in 1947, it was suddenly possible to build more complex and faster electronic circuits, resulting in smaller, more efficient devices. Initially, however, transistors were made as individual components and wired together with other parts to form a circuit, and as circuits grew this became a problem: there were simply too many components to connect. To make circuits even faster, the transistors had to be packed closer and closer together.
The microchip owes its existence to the diligent efforts of two electrical engineers, Jack Kilby and Robert Noyce. As chance would have it, both were working on solving the same problem at roughly the same time in the late fifties: how to make more out of less.
Their solution to the physical limitations of transistors was to build not just the transistors but all the other electrical components, such as resistors, capacitors and diodes, onto a single piece of semiconductor material (such as silicon), the chip. Because everything sat on the same piece of material, the components could be interconnected to form a complete circuit.