Published/updated: November 2017
By Richard Edwards
Given quantum computing’s geeky, somewhat science-fiction image, it was a little surprising to find a session on the topic at Microsoft’s Future Decoded event, which recently had its fourth annual outing in London. As an event, Future Decoded leans more towards the non-IT professional, albeit with a side order of Azure and Microsoft 365 for this year’s 12,000 delegates.
Microsoft presenting its quantum computing credentials at a business IT conference underlines the importance of this technology to the company’s future, and to that of many others. Quantum computing promises to deliver more compute power per unit of energy than classical computing, which is clearly important from a sustainability perspective. It also opens up an avenue of mathematical problem solving that today’s computers cannot realistically address.
Quantum computing takes advantage of the strange ability of subatomic particles to exist in more than one position or state at any given moment in time. By manipulating quantum bits (qubits) rather than regular bits, a quantum computer can process and store massively more information using less energy than a classical computer, and this is ultimately where the technology’s supremacy lies.
If large-scale quantum computing can be made to work, and the signs are looking positive, then it could make today’s hyperscale technology look like pocket calculator stuff. Experts say that the combined power of just a few dozen qubits could solve currently impossible problems in areas such as climate change, food production and antibiotic resistance. However, they also warn it could render much of today’s encryption technology useless.
Beyond the cloud
Quantum computing is one of Microsoft’s “beyond the cloud” technology bets, sitting alongside Mixed Reality (MR) and Artificial Intelligence (AI). After explaining the essence of quantum computing to Future Decoded attendees, the Microsoft Research team described some of the challenging problems that scientists and engineers are tackling to put quantum theory into practice. Producing good-quality qubits seems to be their primary goal.
Understanding quantum computing is challenging to say the least, but Microsoft researchers managed to convey its exponential power and scale by comparing the amount of RAM a “classical computer” needs to simulate a given number of qubits. While “your mileage may vary”, simulating 30 qubits requires 16 gigabytes of RAM, and every additional qubit doubles that figure. Add 10 more and you’ll need 16 terabytes of RAM to simulate 40 qubits; 16 petabytes would be required for 50 qubits.
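The arithmetic behind these figures is easy to sketch. A back-of-the-envelope calculation, assuming the simulator stores one 16-byte complex amplitude (a complex128, i.e. two 64-bit floats) for each of the 2^n basis states of an n-qubit register:

```python
BYTES_PER_AMPLITUDE = 16  # one complex128 amplitude per basis state

def simulation_memory_bytes(n_qubits: int) -> int:
    """Bytes of RAM needed to hold the full state vector of n qubits."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (30, 40, 45, 50):
    gib = simulation_memory_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")
# 30 qubits -> 16 GiB, 40 -> 16,384 GiB (16 TiB), 50 -> 16 PiB
```

Running it for 45 qubits gives 512 terabytes, consistent with the Zurich simulation mentioned below.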
Earlier this year, a 45-qubit simulation at the Swiss Federal Institute of Technology in Zurich required 500 terabytes of memory. Why are people simulating quantum computers, I hear you ask? Well, until stable quantum hardware arrives, someone has to verify the results produced by quantum computers and their algorithms.
The quantum leap
Let there be no doubt about it: the quantum computing arms race has already begun. If what we hear from the teams at IBM, Intel, Google and Microsoft is to be believed, then ‘quantum supremacy’ might be just around the corner, meaning by the end of this decade. These companies are not alone in their desire to build quantum computers using hundreds, or even thousands, of qubits, so we really are witnessing the birth of a new computing age.
D-Wave, a company that describes itself as “the world’s first quantum computing company”, is focusing on putting quantum computing to work on a range of “industry-scale classification, machine learning, and optimization problems”, including cyber security. The latter is the area that governments and cyber security experts are worrying about, as a sufficiently powerful quantum computer running Shor’s algorithm could break the public-key cryptographic algorithms that secure our privacy and ecommerce transactions.
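To see why Shor’s algorithm worries cryptographers, it helps to look at its classical skeleton: factoring a number n reduces to finding the multiplicative order of some base a modulo n. The sketch below is plain Python with brute-force order finding; the quantum machine’s only job in Shor’s algorithm is to find that order exponentially faster, which is precisely the step that defeats RSA-style key sizes:

```python
from math import gcd

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (brute force here; the quantum
    speed-up in Shor's algorithm comes from finding r efficiently)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n: int, a: int) -> tuple:
    """Recover two factors of n from the order of a modulo n."""
    r = find_order(a, n)
    if r % 2:
        raise ValueError("odd order; pick another base a")
    guess = pow(a, r // 2, n)
    return gcd(guess - 1, n), gcd(guess + 1, n)

print(shor_classical(15, 7))  # -> (3, 5)
```

For 15 and base 7 the order is 4, yielding the factors 3 and 5; for the 2048-bit moduli used in real public-key cryptography, the classical order-finding loop above is hopeless, but a large, stable quantum computer would not be.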
Democratising quantum computing
When it comes to programming quantum computers, Microsoft clearly wants to get in on the ground floor. Later this year, the company will release a quantum development toolset that will extend Visual Studio to produce code that will run on a quantum simulator, scaling to more than 40 qubits when hosted on Azure.
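To give a flavour of what such a simulator does under the hood (this is an illustrative sketch in plain Python, not Microsoft’s toolset): a state-vector simulator tracks the complex amplitude of every basis state and applies quantum gates as linear transformations. Applying a Hadamard gate to a qubit in the |0⟩ state produces an equal superposition:

```python
from math import sqrt

# A one-qubit state is a pair of amplitudes: amp0 for |0>, amp1 for |1>.
amp0, amp1 = 1.0, 0.0  # start in |0>

# Apply a Hadamard gate, H = (1/sqrt(2)) * [[1, 1], [1, -1]].
h = 1 / sqrt(2)
amp0, amp1 = h * (amp0 + amp1), h * (amp0 - amp1)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = (amp0 ** 2, amp1 ** 2)
print(probs)  # approximately (0.5, 0.5)
```

Each additional qubit doubles the number of amplitudes the simulator must track, which is exactly why its memory demand grows exponentially and why hosting larger simulations on Azure makes sense.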
Microsoft has a strong track record of democratising technologies, bringing them to the masses in an affordable and accessible manner, but it’s been late catching the wave in recent years, especially where mobile and cloud were concerned. So, as the industry looks for the ‘next big thing’, Microsoft will want to make sure it doesn’t miss the quantum computing wave, as it looks like it’s going to be a massive one.
Richard Feynman, the Nobel prize-winning physicist, conceived the idea of a ‘quantum computer’ in the 1980s, so it’s startling to think that quantum computing might well become commercially available – Quantum Computing-as-a-Service – within just 50 years of its conception. Will Microsoft be first to market? Who knows, but there’s a strong chance it will be one of the first to make the power of quantum computing accessible to everyone.