A team of researchers from Google, led by John Martinis, has demonstrated quantum supremacy for the first time. This is the point at which a quantum computer is shown to be capable of performing a task that’s beyond the reach of even the most powerful conventional supercomputer.
Fortune obtained a copy of Google’s paper, which was posted to NASA.gov earlier this week before being taken down. The Financial Times first reported the news.
The experiment described in the paper sampled the outputs of randomly generated quantum circuits, a specialized task that relies on quantum phenomena. The researchers said they determined that their quantum computer beat conventional computers at this task of calculating the output of certain specialized circuits.
“While our processor takes about 200 seconds to sample one instance of the quantum circuit 1 million times, a state-of-the-art supercomputer would require approximately 10,000 years to perform the equivalent task,” the researchers said.
A source at Google familiar with the situation suggested, however, that NASA accidentally published the paper early, before its team’s claims could be thoroughly vetted through scientific peer review, a process that could take anywhere from weeks to months. If the paper holds up under the scrutiny of the scientific community, it will herald a watershed moment in quantum science. Its central claim counters doubt that some unforeseen law of nature may prevent quantum computers from operating as hoped. “Quantum speedup is achievable in a real-world system and is not precluded by any hidden physical laws,” the Google researchers write.
Azure IoT Central is a fully managed IoT Software as a Service (SaaS) offering that makes it easy to connect, monitor, and manage your IoT devices and products. Azure IoT Central integration with IoT Plug and Play takes this one step further by allowing solution developers to integrate devices without writing any embedded code. IoT solution developers can choose devices from a large set of IoT Plug and Play certified devices to quickly build and customize their IoT solutions end-to-end.
At the center of IoT Plug and Play is a schema that describes device capabilities. We refer to this as a device capability model, which is a JSON-LD document. It’s structured as a set of interfaces composed of properties (attributes like firmware version, or settings like fan speed), telemetry (sensor readings such as temperature, or events such as alerts), and commands the device can receive (such as reboot).
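As a rough illustration, an interface within a capability model might look like the fragment below. The shape follows the IoT Plug and Play preview conventions, but the identifiers, names, and values here are hypothetical:

```json
{
  "@id": "urn:example:thermostat:1",
  "@type": "Interface",
  "displayName": "Thermostat",
  "contents": [
    {
      "@type": "Property",
      "name": "fanSpeed",
      "schema": "integer",
      "writable": true
    },
    {
      "@type": "Telemetry",
      "name": "temperature",
      "schema": "double"
    },
    {
      "@type": "Command",
      "name": "reboot"
    }
  ],
  "@context": "http://azureiot.com/v1/contexts/IoTModel.json"
}
```

A solution like IoT Central can read a document of this shape to generate settings pages (from writable properties), charts (from telemetry), and operator actions (from commands) without any device-side code changes.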
Solution developers can start with a certified device from the device catalog and customize the experience for the device, such as editing display names or units. Solution developers can also add dashboards for solution operators to visualize the data. There is also the option to auto-generate dashboards and visualizations to get up and running quickly. Once the dashboard and visualizations are created, solution developers can run simulations based on real models from the device catalog. Developers can also integrate with the commands and properties exposed by IoT Plug and Play capability models to enable operators to effectively manage their device fleets. IoT Central will automatically load the capability model of any certified device, enabling a true Plug and Play experience!
Modern finance employs large amounts of computational resources for a variety of tasks. Computers are used, for example, for the analysis of historical data, high-frequency trading, the pricing of exotic financial derivatives, portfolio optimization, and risk management.
The sheer size of the financial market means that a vast amount of data is processed by electronic exchanges. For each asset, such as stocks, bonds, and options, prices move on ever-shortening time scales, as short as milliseconds or less. Managing a diversified portfolio, or performing a quantitative analysis of the economy, requires processing a large number of high-dimensional vectors, where each vector represents the time series of a particular asset. The complexity often arises from analyzing time series of asset prices over long periods across many assets, motivating the use of quantum computing for financial problems. With the advent of intermediate-scale quantum computers, employing their power in finance is becoming more and more viable.
H. Markowitz is recognized for introducing the modern version of portfolio theory. An optimal investment strategy achieves a certain desired return while minimizing risk; equivalently, the problem can be posed as accepting a certain desired level of risk while maximizing the return. This simple idea of risk-return optimization leads to the notion of portfolio diversification: the optimal portfolio is likely one that invests in many relatively uncorrelated assets. In contrast, when the chosen strategy is to optimize returns alone, the optimal portfolio concentrates in the single asset (or the few assets) with the highest expected return, irrespective of the risk involved.
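A small numerical sketch of this contrast, using synthetic returns and the standard textbook closed-form for the minimum-variance portfolio (this is classical illustration, not the quantum algorithm from the paper):

```python
import numpy as np

# Synthetic daily returns for three hypothetical assets
# (rows = trading days, columns = assets).
rng = np.random.default_rng(0)
returns = rng.normal(loc=[0.001, 0.0008, 0.0005],
                     scale=[0.02, 0.015, 0.01],
                     size=(250, 3))

mu = returns.mean(axis=0)              # expected return per asset
sigma = np.cov(returns, rowvar=False)  # covariance (risk) matrix

# Closed-form minimum-variance portfolio with weights summing to 1:
#   w = Sigma^-1 1 / (1^T Sigma^-1 1)
ones = np.ones(len(mu))
inv = np.linalg.inv(sigma)
w = inv @ ones / (ones @ inv @ ones)

print("diversified weights:", w)
print("portfolio variance:", w @ sigma @ w)

# Optimizing return alone simply picks the single highest-mean asset,
# regardless of its variance:
w_greedy = np.zeros_like(w)
w_greedy[np.argmax(mu)] = 1.0
print("greedy weights:", w_greedy)
```

The minimum-variance portfolio spreads weight across the less-correlated assets, and its variance is never worse than that of any single-asset portfolio, which is the diversification effect the text describes.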
The authors present quantum algorithms for portfolio management, specifically portfolio optimization and risk analysis. Given quantum access to the historical record of returns, the presented algorithm determines the optimal risk-return tradeoff curve and allows one to sample from the optimal portfolio.
Performing risk analysis simulations on financial assets usually requires an enormous amount of time and resources. Whereas the traditional method requires an enormous number of simulations (thousands or millions, depending on the case), the quantum algorithm requires only dozens. In terms of time, this cuts down several days of complex work to just a few minutes.
In the paper “Credit Risk Analysis using Quantum Computers,” published in July 2019, the authors estimate the economic capital requirement, i.e. the difference between the Value at Risk and the expected value of a given loss distribution. The economic capital requirement is an important risk metric because it summarizes the amount of capital required to remain solvent at a given confidence level.
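A minimal classical Monte Carlo sketch of this metric; the two-obligor portfolio, default probabilities, and loss amounts below are purely illustrative and not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

# Hypothetical two-obligor portfolio: probability of default
# and loss-given-default for each obligor.
pd_ = np.array([0.05, 0.02])
lgd = np.array([1.0, 3.0])

# Simulate independent defaults and the total loss per scenario.
defaults = rng.random((n_sim, 2)) < pd_
losses = defaults.astype(float) @ lgd

expected_loss = losses.mean()
var_999 = np.quantile(losses, 0.999)   # Value at Risk at 99.9% confidence

# Economic capital = VaR minus the expected loss.
economic_capital = var_999 - expected_loss
print(f"EL={expected_loss:.3f}  VaR={var_999:.3f}  EC={economic_capital:.3f}")
```

The quantum approach in the paper targets exactly this quantity but uses amplitude estimation to reduce the number of samples needed; the classical loop above is the baseline it is compared against.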
IBM published an open-source online textbook, called Learn Quantum Computation Using Qiskit, as a tool for self-learners and educators preparing the next generation of quantum developers. Written by experienced educators and leading researchers in the field, this textbook explores quantum computing through practical problems that are run on both simulators and real quantum hardware, with the aim of helping students connect theory to practice. And most importantly, because this textbook is open-source, the field’s top educators and contributors will continually update this text to ensure that students learn the latest and most-relevant quantum computing skills. The textbook also includes problem sets that can be included in coursework.
The powerful recipe for company success comes from technology giants Bell Labs and Xerox PARC, and it is incredibly simple:
- A loose management structure that focuses on hiring and keeping bright people, giving them only general direction without telling anyone what they should be working on.
- Visions instead of goals, milestones instead of deadlines.
- Don’t let sales and marketing people run the company.
More in blogs:
Unix came about because Bell Labs hired smart people and gave them the freedom to amuse themselves, trusting that their projects would be useful more often than not. Before Unix, researchers at Bell Labs had already invented the transistor and the laser, as well as any number of innovations in computer graphics, speech synthesis, and speech recognition.
The cancellation of Multics meant the end of the only project that the programmers in the Computing Science Research Department had to work on. Luckily for computer enthusiasts, constraint can at times lead to immense creativity. Keeping a handful of programmers squirreled away on the top floor of the Murray Hill complex was not going to bankrupt the company.
Sam Morgan, who managed the Computing Science Research Department (which consisted of McIlroy’s programmers and a group of mathematicians), was not going to lean on McIlroy’s team because they suddenly had nothing in particular to work on.
“The management principles here are that you hire bright people and you introduce them to the environment,” Morgan himself recalled for the Unix Oral History project. “You give them general directions as to what sort of thing is wanted, and you give them lots of freedom.”

So rather than provide specific direction, Morgan preferred to exercise what he called “selective enthusiasm” to encourage a particular research project, noting, “if you mistakenly discourage or fail to respond to something that later on turns out to be good, if it’s really a strong idea it will come back”. “He let people do their own thing and never tried to tell anyone what they should be working on,” Kernighan recalled. At the time, Bell Labs also stressed collaboration across disciplines. “Everyone kept their doors open all the time, so if you had a problem, there was an expert nearby and it was fine to walk in and ask for help”, is how Kernighan remembers it.