The world of billions of hidden computers has been opened up to the public, courtesy of scientists at Switzerland's Federal Institutes of Technology and Microsoft.
On Thursday the partnership presented seven projects, making up the first venture of the Microsoft Innovation Cluster for Embedded Software (ICES).
Over the next two to three years, scientists at the universities in Zurich and Lausanne, in partnership with Microsoft's laboratory in Cambridge, will attempt to advance the science behind embedded systems.
This is the practice of integrating processors into everyday devices and is widely regarded as the direction computing will take for the foreseeable future.
Funding for the ICES projects will be split equally between the universities and Microsoft and will depend on the length of the project, the number of researchers involved and overhead expenses. The consortium has received around 20 project proposals since the launch of the programme in March.
"Computers are often hidden now," said Andrew Herbert, managing director of Microsoft Research in Britain.
"If you're sitting in a train coming to Zurich in the morning, you're not aware of how many computers there are in the train, controlling the engines and the signals and making you go down the right tracks. But you're very dependent on them and you want them to work."
Embedded computing is a part of virtually all of today's technology – from digital cameras to telephones to luxury espresso machines. Yet there is still a relatively large gap between these devices, characterised by their specialisation and high reliability, and personal computers, which perform a much broader range of functions. Researchers are hoping to close that gap.
"Embedded systems are at the forefront of many challenges in computing. They are often highly concurrent, asynchronous and required to work with high reliability.
"Therefore we require programming tools that enable the rapid production of verifiable trustworthy software," Herbert said.
Not always reliable
Herbert's employer, Redmond-based Microsoft, makes the operating system used on most personal computers. But despite being in such general use, the Windows family of operating systems still fails users – not often, but sometimes at the most inopportune moments.
For embedded devices built into highly sensitive environments – medical equipment or aircraft avionics, for example – even a single failure is unacceptable.
"It has to be a reliant system that doesn't fail. It has to keep on working," Herbert said.
To achieve reliability, developers build redundancy into their systems at the design stage. They also attempt to prove mathematically that nothing will go wrong by analysing everything a device could possibly do.
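Redundancy of this kind can be sketched as a simple majority vote over independent replicas of the same computation – a minimal illustration only, with hypothetical function names, not a description of any ICES project:

```python
from collections import Counter

def run_redundant(replicas, *args):
    """Run several independent implementations of the same
    computation and return the majority answer, masking a
    single faulty replica (triple modular redundancy)."""
    results = [replica(*args) for replica in replicas]
    answer, votes = Counter(results).most_common(1)[0]
    if votes <= len(results) // 2:
        raise RuntimeError("no majority: too many replicas disagree")
    return answer

# Hypothetical replicas: two correct, one faulty.
ok = lambda x: x * 2
bad = lambda x: x * 2 + 1

print(run_redundant([ok, ok, bad], 21))  # the faulty replica is outvoted
```

Safety-critical systems apply the same idea in hardware, running the replicas on physically separate processors so that one failing component cannot take the whole system down.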
Jürg Gutknecht, dean of the informatics department at the Federal Institute of Technology in Zurich, believes it will take massive advances in research before such devices become absolutely dependable.
"In the future, you will have more and more safety-critical applications where you cannot afford a single drop-out. For me, it is a radically different mindset if you have to develop systems that are 100 per cent safety critical," he told swissinfo.
Gutknecht is sceptical about the reliability and robustness of current commercial and research software, pointing out that methods for checking software models can generally only reveal errors – not prove their absence, except in a few cases.
Like Microsoft's Herbert, he says that programmers need to be conscious of this from the outset, and need to start with a design that has security and fault tolerance built in.
The current state of computing, according to George Candea at the federal institute in Lausanne, is that not only do computers make mistakes but they make the same mistakes over and over.
In a perfect world computers would never fail. In a better world, Candea told swissinfo, computers would fail but would learn from their mistakes much in the same way as the human body fights off diseases and builds immunity.
His project aims to develop a self-healing mechanism for embedded devices.
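One simple form such a mechanism could take – purely as an illustration, not a description of Candea's actual project – is a supervisor that restarts failed work and remembers which inputs caused the failure, so the same mistake is not repeated:

```python
def supervise(task, inputs, max_restarts=3):
    """Run task over inputs; on failure, restart and quarantine the
    offending input so the system does not repeat the same mistake
    (a crude software analogue of acquired immunity)."""
    quarantined = set()
    results = {}
    restarts = 0
    pending = list(inputs)
    while pending and restarts <= max_restarts:
        item = pending.pop(0)
        if item in quarantined:
            continue  # "immune" to this known-bad input
        try:
            results[item] = task(item)
        except Exception:
            quarantined.add(item)  # remember the failure
            restarts += 1
    return results, quarantined

# A task that crashes on zero: the supervisor survives the
# first crash and skips the second occurrence entirely.
results, bad_inputs = supervise(lambda x: 10 // x, [5, 0, 2, 0])
```

Real self-healing research goes much further – restarting components in isolation, rolling back state, and diagnosing root causes – but the supervisor loop above captures the basic fail-and-learn cycle.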
Gutknecht predicts that the worlds of embedded and personal, standalone computing will merge. He calls it the "age of the disappearing computer".
"In the future this separation will no longer be valued so much," he said.
A significant challenge for industry as it relates to consumer goods, he believes, is not more research but standardisation, so that our many devices can talk to each other.
His own project aims to miniaturise supercomputers to function reliably in applications such as critical-care health monitoring. This would involve integrating far more powerful algorithms into embedded systems.
"There's still an infinite stupidity in all of these devices," he said.
swissinfo, Justin Häne in Zurich
Moore's Law is often summarised as saying that computing power doubles every 18 months. Traditionally, hardware manufacturers delivered this by increasing the clock speed of processors, which are the heart of a computer. Higher speeds generate intense heat, however, and in the absence of new cooling techniques the approach has become untenable.
The alternative, which manufacturers have embraced, is to build more than one processing core on one chip. In essence, this is like joining two, four, eight or more computers. This provides engineers with the opportunity to develop systems that use idle time to perform other processes without sacrificing responsiveness.
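The idea of handing work to spare cores while the main thread stays responsive can be sketched with a thread pool – a minimal Python illustration, not tied to any particular embedded platform:

```python
from concurrent.futures import ThreadPoolExecutor
import time

def background_job(n):
    """Simulated long-running work handed off to a spare core."""
    time.sleep(0.1)
    return sum(range(n))

# Submit the job to the pool; the main thread is immediately
# free again, e.g. to keep responding to user input, and only
# blocks when it actually needs the answer.
with ThreadPoolExecutor(max_workers=4) as pool:
    future = pool.submit(background_job, 1000)
    # ... main thread remains responsive here ...
    print(future.result())  # collect the answer when needed
```

The same pattern scales to two, four or eight cores simply by submitting more jobs to the pool.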
One of the challenges is the difference in speed between computer chips and computer memory.
When Andrew Herbert was working on his PhD some 30 years ago, his university put in a special application to spend £2 million (SFr4 million) on three truckloads worth of computer memory. What they got – 512 kilobytes – is roughly 0.025 per cent of what is today found in a standard notebook computer.