The Relentless March of Micro

If I had to name the single biggest event in the history of Computing, it would probably be World War II. The need to read encrypted German radio signals catalysed UK efforts to develop electro-mechanical and electronic machines capable of breaking the German codes, resulting in the first programmable electronic computer, Colossus, in 1943. The theories and knowledge developed at Bletchley Park by people like Alan Turing, Max Newman, Tommy Flowers and many others laid the foundations of the computers we have today.

If I had to name the single biggest influence on the development of Computing since Colossus, it would undoubtedly be Photography. Allow me to explain…

Colossus was, there is no beating around the bush on this, colossal. It was a huge machine. Computers have become almost immeasurably more powerful since Colossus; if today’s computers were built using the same size of technology, the computing power of your smartphone would require thousands of tons of electronics sprawling over the area occupied by a large city.

Fortunately, in 1947 three American physicists developed the transistor, which superseded the vacuum tubes used in Colossus, and in 1953 scientists at the University of Manchester developed the first transistor-based electronic computer. Transistors were much smaller (and cheaper and more reliable) than vacuum tubes. This is where the story of modern computing begins.

At the start of the 1980s I was an Electronics Engineer, designing and fixing discrete electronic circuits. My colleagues and I would draw out our electronic designs on huge A0 drawing boards, work out how to implement them using the electronic components of the day, and then lay out the designs for the printed circuit boards that would hold and connect the components. We often did the latter task in a well-known public house, The Zebra, in Cambridge. Sitting around a large table with pints of Greene King IPA, and using a technique invented long before, we would apply our design in thin strips of black adhesive plastic to a large sheet of transparent acetate, creating an image of the circuit board at 5 to 10 times its real size. The acetate was then photographed and the image projected onto the circuit board material using a technique called photo-etching. In this way we miniaturised our design, in much the same way that you can take a photo of the Eiffel Tower and hold the print in your hand.

Other electronics engineers, starting in 1958, used a similar photo-etching technique to draw designs of arrays of transistors and print them on a single slice of silicon crystal, creating the first integrated electronic circuits (“ICs”) - what we now call chips.

Photographic reduction has been the basis of electronic miniaturisation for decades and is still commonly used for miniaturising circuit boards. The makers of integrated circuits have refined the photographic reduction technique much further; the transistors in a modern chip may be as small as 10 nanometres. For reference, a human hair is about 60,000 nanometres in diameter. A modern computer chip may have over 10 million transistors per square millimetre. The march of micro-reduction is set to change shortly because we are reaching the theoretical limit of how small we can make transistors, but it will not halt; to date we have been creating the electronics in chips in a single layer (2D), and the next step is to move to 3D and stack layers on top of each other like the storeys in a building.
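
To put those figures in perspective, here is a quick back-of-the-envelope check in Python. The 300 nm per-transistor footprint is my own assumption to allow for wiring and spacing; it is not a number from the article.

```python
# Rough arithmetic behind the sizes quoted above.
transistor_nm = 10      # feature size of a modern transistor
hair_nm = 60_000        # approximate diameter of a human hair

# Transistors that would fit across the width of a single hair.
print(hair_nm // transistor_nm)         # 6000

# Density: assume each transistor occupies a 300 nm x 300 nm footprint
# once wiring and spacing are included (an illustrative guess).
NM_PER_MM = 1_000_000
per_side = NM_PER_MM // 300             # ~3333 transistors per mm
print(per_side ** 2)                    # ~11 million per square mm
```

Even with that generous footprint, the arithmetic lands comfortably above the 10-million-per-square-millimetre figure quoted above.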

This miniaturisation is why you can hold in your hand a smartphone that is several thousand times more powerful than the computers which landed the Apollo spacecraft on the moon. It’s also why we can create tiny microcomputers which fit into spectacles, and a single satellite powerful enough to provide Internet connections to one million households yet still small enough to launch into space (the Manx-registered Viasat-1).

So where next? Miniaturisation is not just about size; it’s also about speed, cost and energy consumption. Smaller electronics means we can fit more functions into a chip; circuits work faster because the electrons have less distance to travel, and for the same reason they need less electricity. The increase in performance enables us to make tiny computers which can address complex problems, needing fewer components reduces cost, and the reduction in power consumption means they can be battery-powered and portable. Wearable computing is one of the next major horizons. Another is the “Internet of Things”, because today’s computing chips can be added to almost any type of machine without compromising the design - a refrigerator with a 1990s IBM PC bolted to its side would look a bit odd, but a modern microcomputer coming in at matchbox size or smaller can be incorporated anywhere without spoiling form or function.

Pictured above, the Linino One is designed for prototyping Internet of Things applications and contains two computers - one is used for connecting with and controlling machines and sensors, and the other includes WiFi networking to provide a website and Internet communications. It could, for instance, be used to control a central heating system based on the weather forecast, to count and report the number of passengers on a bus, or to automatically re-order copier paper when the stationery cupboard is running low. It could be much smaller, but the large number of wiring connectors needs space even though the computers don’t.
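
As a flavour of how simple such an application can be, here is a minimal Python sketch of the central-heating example, of the sort the Linux side of a board like this could run. The forecast URL, the GPIO path and the temperature threshold are all illustrative assumptions, not details from the article or the Linino documentation.

```python
# A minimal sketch: poll a weather forecast and switch a boiler relay.
import json
import time
import urllib.request

FORECAST_URL = "http://example.com/forecast.json"  # hypothetical endpoint
RELAY_PIN = "/sys/class/gpio/gpio18/value"         # hypothetical GPIO path
THRESHOLD_C = 12.0                                 # heat when colder than this

def forecast_low_c() -> float:
    """Fetch tonight's forecast minimum temperature in Celsius."""
    with urllib.request.urlopen(FORECAST_URL) as resp:
        return float(json.load(resp)["tonight_min_c"])

def set_heating(on: bool) -> None:
    """Drive the relay by writing to the GPIO value file."""
    with open(RELAY_PIN, "w") as pin:
        pin.write("1" if on else "0")

while True:
    set_heating(forecast_low_c() < THRESHOLD_C)
    time.sleep(30 * 60)  # re-check every half hour
```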

Also pictured above is an SD memory card which includes a complete Linux computer with WiFi networking. Normally used to connect digital cameras to smartphones and tablets, it can also be added to almost any other device that takes an SD memory card, connecting that device to the outside world. When I consider that my first portable PC was the size of a suitcase and weighed 28lb, the idea of a computer the size of a postage stamp seems quite incredible.

Wearable computing is similarly making great strides. I am quite deaf and need hearing aids. Pictured above is one of them, supplied by Tim Latham of Island Hearing in Port Erin. It is actually a 64-bit computer with a pair of microphones and a loudspeaker; it divides sound into 20 frequency bands and adjusts each band to compensate for the failings in my hearing, and it needs complex programming to tailor it to the wearer’s needs. Tim analysed my hearing with a sophisticated computer program and loaded that analysis into special programming software, which then configured the aids for the peculiarities of my hearing, the types of environment where I commonly talk with people such as cafés and meetings, my preferences in music, and so on. Modern hearing aids are far more than simple noise amplifiers; they are full-blown audio-processing computers. This one senses the direction of sound to focus on the person speaking to me, shuts out unwanted sounds, and connects with my smartphone and computer to relay phone calls, digital music and the like directly to my ears. Somehow this tiny device contains all the electronics and a battery big enough to power it for a week or more.
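
To make the idea of per-band compensation concrete, here is a toy Python/NumPy illustration. The 20 equal-width bands, the sample rate and the gain curve are my own simplifications - a real hearing aid does this with dedicated filter banks, in real time, on far less power.

```python
# Toy multi-band compensation: split audio into frequency bands and
# apply a different gain to each, as a hearing aid does.
import numpy as np

RATE = 16_000                       # sample rate in Hz (illustrative)
N_BANDS = 20

def compensate(samples: np.ndarray, band_gains_db: np.ndarray) -> np.ndarray:
    """Apply a per-band gain (in dB) across N_BANDS equal-width bands."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / RATE)
    # Map each frequency bin to one of the N_BANDS bands.
    band = np.minimum((freqs / (RATE / 2) * N_BANDS).astype(int), N_BANDS - 1)
    spectrum *= 10.0 ** (band_gains_db[band] / 20.0)  # dB -> amplitude
    return np.fft.irfft(spectrum, n=len(samples))

# Example: boost the high bands, where age-related hearing loss is common.
gains = np.linspace(0.0, 25.0, N_BANDS)   # 0 dB at low bands, +25 dB at high
tone = np.sin(2 * np.pi * 6000 * np.arange(RATE) / RATE)  # 6 kHz test tone
louder = compensate(tone, gains)
```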

Computers have become much smaller and will continue to reduce in size for many years yet. There is no good reason why the PC on your work desk should be larger than a packet of cigarettes, other than that there’s no need to make it that small; but for applications where smaller computers are useful - on people, in clothing, attached to machines and so on - there is plenty of scope for further shrinkage. It will not be long before we can buy digital spectacles which allow us to zoom in on what we’re looking at, or lightbulbs which automatically adjust themselves depending on the darkness of the room - the only limit to the new uses of computing will be our imagination.
