
This could lead to an infinite-dimensional ChatGPT.


Machine learning is a fascinating and exciting field within computer science. Recently, this excitement has been transferred to the quantum information realm. Currently, all proposals for the quantum version of machine learning utilize the finite-dimensional substrate of discrete variables. Here we generalize quantum machine learning to the more complex, but still remarkably practical, infinite-dimensional systems. We present the critical subroutines of quantum machine learning algorithms for an all-photonic continuous-variable quantum computer that can lead to exponential speedups in situations where classical algorithms scale polynomially. Finally, we also map out an experimental implementation which can be used as a blueprint for future photonic demonstrations.
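To make the continuous-variable setting concrete, here is a minimal numerical sketch (my own illustration, not from the paper): a single optical mode in a Gaussian state is fully described by the covariance matrix of its quadratures, and Gaussian gates such as squeezing act on it as symplectic matrices. The squeezing parameter below is an arbitrary illustrative value.

    import numpy as np

    # Single-mode vacuum state: covariance matrix of the (x, p) quadratures
    # (units chosen so the vacuum covariance is the identity).
    V = np.eye(2)

    # Squeezing gate S(r) acts as the symplectic matrix diag(e^-r, e^r).
    r = 0.5  # illustrative squeezing strength
    S = np.diag([np.exp(-r), np.exp(r)])

    # Gaussian states transform as V -> S V S^T under Gaussian gates.
    V_squeezed = S @ V @ S.T
    print(V_squeezed)  # diag(e^-2r, e^2r): x-noise below vacuum, p-noise above

Unlike qubit registers, the quadratures x and p take values on a continuum, which is what makes the substrate infinite-dimensional.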

“AI is a challenge for global governance,” says a regulations expert.

The Cyberspace Administration of China (CAC), China's internet regulator, on Tuesday proposed rules to govern artificial intelligence (AI) tools like OpenAI's ChatGPT.

“China supports the independent innovation, popularization and application and international cooperation of basic technologies such as AI algorithms and frameworks,” CAC said in the draft regulation published on its website.



Noisy intermediate-scale quantum (NISQ) algorithms, which run on noisy quantum computers, should be carefully designed to boost the output state fidelity. While several compilation approaches have been proposed to minimize circuit errors, they often omit the detailed circuit structure information that does not affect the circuit depth or the gate count. In the presence of spatial […]

In the math of particle physics, many calculations naively result in infinity. Physicists get around this by ignoring certain parts of the equations, an approach that provides approximate answers. But by using the techniques known as "resurgence," researchers hope to eliminate the infinities and arrive at perfectly precise predictions.
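A textbook illustration of the underlying idea (my example, not drawn from the article): perturbative expansions in a small coupling g are typically divergent asymptotic series, yet the Borel transform can assign them a finite value,

    f(g) \sim \sum_{n=0}^{\infty} (-1)^n \, n! \, g^n,
    \qquad \mathcal{B}f(t) = \sum_{n=0}^{\infty} (-t)^n = \frac{1}{1+t},
    \qquad f(g) = \int_0^{\infty} e^{-t} \, \frac{dt}{1+gt}.

Resurgence sharpens this procedure, relating the ambiguities that arise when the Borel integral hits singularities to non-perturbative contributions.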

Exciting.


The search for the chemical origins of life represents a long-standing and continuously debated enigma. Despite its exceptional complexity, in the last decades the field has experienced a revival, owing in part to the exponential growth of computing power, which allows one to efficiently simulate the behavior of matter—including its quantum nature—under the disparate conditions found, e.g., on the primordial Earth and on Earth-like planetary systems (i.e., exoplanets). In this minireview, we focus on some advanced computational methods capable of efficiently solving the Schrödinger equation at different levels of approximation (e.g., density functional theory)—such as ab initio molecular dynamics—and which can realistically simulate the behavior of matter under the action of energy sources available in prebiotic contexts.
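As a concrete anchor for what "ab initio molecular dynamics" means in practice, here is a minimal sketch of my own, with a placeholder harmonic force standing in for the quantum-mechanical force evaluation and all parameters illustrative:

    import numpy as np

    def force(x):
        # Placeholder: real AIMD evaluates forces from an electronic-structure
        # calculation (e.g., DFT) at every step.
        k = 1.0  # illustrative spring constant
        return -k * x

    # Velocity Verlet propagation of one nucleus (reduced units).
    m, dt = 1.0, 0.01
    x, v = 1.0, 0.0
    f = force(x)
    for step in range(1000):
        x += v * dt + 0.5 * (f / m) * dt**2
        f_new = force(x)
        v += 0.5 * (f + f_new) / m * dt
        f = f_new
    print(x, v)  # oscillatory motion; energy approximately conserved

The expensive part in genuine AIMD is that force(x) hides a full quantum-mechanical calculation, which is why the growth in computing power matters.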

Whether this “complements or contradicts existing religious value systems depends largely on the interpretation of those systems by the people who have adopted them,” said Frank. “However, my interviews with astronauts of faith suggest that their religious perspective was strengthened, rather than being weakened.”

Frank notes that his cosmology has parallels with Yuval Harari's "dataism," described by Harari as the "most interesting emerging religion." Dataism, as defined by Harari, "says that the universe consists of data flows, and the value of any phenomenon or entity is determined by its contribution to data processing." This may sound kind of cold and metallic, but if life is an algorithm and self-awareness is data processing, the parallels with Frank's ideas are evident.

At the MVA webinar there wasn't time to address all the points I wanted to discuss with Frank, so I had this new conversation with him. Please find below my comments and questions (all related to space philosophy, cosmic metaphysics, and religion), and listen to Frank's thoughtful replies and the other points that came up.

Every time a person dies, writes Russian novelist Vasily Grossman in Life and Fate, the entire world that has been built in that individual’s consciousness dies as well: “The stars have disappeared from the night sky; the Milky Way has vanished; the sun has gone out… flowers have lost their color and fragrance; bread has vanished; water has vanished.” Elsewhere in the book, he writes that one day we may engineer a machine that can have human-like experiences; but if we do, it will have to be enormous—so vast is this space of consciousness, even within the most “average, inconspicuous human being.”

And, he adds, “Fascism annihilated tens of millions of people.” Trying to think those two thoughts together is a near-impossible feat, even for the immense capacities of our consciousness. But will machine minds ever acquire anything like our ability to have such thoughts, in all their seriousness and depth? Or to reflect morally on events, or to equal our artistic and imaginative reach? Some think that this question distracts us from a more urgent one: we should be asking what our close relationship with our machines is doing to us.

Jaron Lanier, himself a pioneer of computer technology, warns in You Are Not a Gadget that we are allowing ourselves to become ever more algorithmic and quantifiable, because this makes us easier for computers to deal with. Education, for example, becomes less about the unfolding of humanity, which cannot be measured in units, and more about tick boxes.

The data centers that help train ChatGPT-like AI are very ‘thirsty,’ finds a new study.

A new study has uncovered how much water is consumed when training large AI models like OpenAI's ChatGPT and Google's Bard. The estimates of AI water consumption were presented by researchers from the University of California, Riverside and the University of Texas at Arlington in a pre-print article titled "Making AI Less 'Thirsty.'"

Of course, the water used to cool these data centers doesn't just disappear into the ether; it is usually withdrawn from watercourses such as rivers. The researchers distinguish between water "withdrawal" and "consumption" when estimating AI's water usage.
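A back-of-the-envelope sketch of how such an estimate can be assembled (my own simplification; all parameter values below are illustrative placeholders, not figures from the paper): on-site consumption scales with server energy through a water usage effectiveness (WUE) factor, and off-site consumption comes from the water embedded in electricity generation.

    def water_consumption_liters(energy_kwh,
                                 wue_onsite=1.8,     # L/kWh evaporated by cooling (illustrative)
                                 pue=1.2,            # power usage effectiveness (illustrative)
                                 ewif_offsite=3.1):  # L/kWh embedded in electricity (illustrative)
        """Rough on-site + off-site water consumption of a training run."""
        onsite = energy_kwh * wue_onsite
        offsite = energy_kwh * pue * ewif_offsite
        return onsite + offsite

    # Hypothetical 1 GWh training run.
    print(f"{water_consumption_liters(1_000_000):,.0f} L")

"Withdrawal" would instead count all water diverted from the source, including the fraction that is eventually returned.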


In 1918, the American chemist Irving Langmuir published a paper examining the behavior of gas molecules sticking to a solid surface. Guided by the results of careful experiments, as well as his theory that solids offer discrete sites for the gas molecules to fill, he worked out a series of equations that describe how much gas will stick, given the pressure.
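The best known of these is the Langmuir adsorption isotherm, which gives the fraction of occupied surface sites \theta at pressure P in terms of an equilibrium constant K:

    \theta = \frac{K P}{1 + K P}

At low pressure the coverage grows linearly with P; at high pressure it saturates toward 1 as the discrete sites fill up.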

Now, about a hundred years later, an "AI scientist" developed by researchers at IBM Research, Samsung AI, and the University of Maryland, Baltimore County (UMBC) has reproduced a key part of Langmuir's Nobel Prize-winning work. The system, an AI functioning as a scientist, also rediscovered Kepler's third law of planetary motion, which gives the time it takes one space object to orbit another given the distance separating them, and produced a good approximation of Einstein's relativistic time-dilation law, which shows that time slows down for fast-moving objects.
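For reference, the two rediscovered relations in their standard textbook forms (for a small body orbiting a mass M, and a clock moving at speed v):

    T^2 = \frac{4\pi^2}{G M} a^3
    \qquad\qquad
    \Delta t' = \frac{\Delta t}{\sqrt{1 - v^2/c^2}}

where T is the orbital period, a the semi-major axis of the orbit, and c the speed of light.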

A paper describing the results is published in Nature Communications on April 12.

As quantum advantage has been demonstrated on different quantum computing platforms using Gaussian boson sampling,1–3 quantum computing is moving to the next stage, namely demonstrating quantum advantage in solving practical problems. Two typical problems of this kind are computer-aided material design and drug discovery, in which quantum chemistry plays a critical role in answering questions such as "Which one is the best?". Many recent efforts have been devoted to the development of advanced quantum algorithms for solving quantum chemistry problems on noisy intermediate-scale quantum (NISQ) devices,2,4–14 while implementing these algorithms for complex problems is limited by the available qubit counts, coherence times and gate fidelities. Specifically, without error correction, quantum simulations of quantum chemistry are viable only if low-depth quantum algorithms are implemented to suppress the total error rate. Recent advances in error mitigation techniques enable us to model many-electron problems with a dozen qubits and circuit depths in the tens on NISQ devices,9 while such circuit sizes and depths are still a long way from practical applications.
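To make the low-depth variational idea concrete, here is a classical toy of a variational eigensolver (my own minimal sketch, not an algorithm from the cited references): parametrize a trial state, evaluate the energy expectation value, and minimize it.

    import numpy as np
    from scipy.optimize import minimize_scalar

    # Toy 2x2 "molecular" Hamiltonian (illustrative numbers only).
    H = np.array([[1.0, 0.5],
                  [0.5, -1.0]])

    def energy(theta):
        # One-parameter trial state |psi(theta)> = (cos theta, sin theta).
        psi = np.array([np.cos(theta), np.sin(theta)])
        return psi @ H @ psi

    res = minimize_scalar(energy, bounds=(0.0, np.pi), method="bounded")
    print(res.fun, np.linalg.eigvalsh(H)[0])  # variational vs exact ground energy

On a NISQ device the energy evaluation is performed by a shallow quantum circuit plus measurement, and only the outer minimization stays classical.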

The difference between the available and actually required quantum resources in practical quantum simulations has renewed the interest in divide and conquer (DC) based methods.15–19 Realistic material and (bio)chemistry systems often involve complex environments, such as surfaces and interfaces. To model these systems, the Schrödinger equations are much too complicated to be solvable. It therefore becomes desirable that approximate practical methods of applying quantum mechanics be developed.20 One popular scheme is to divide the complex problem under consideration into as many parts as possible until these become simple enough for an adequate solution, namely the philosophy of DC.21 The DC method is particularly suitable for NISQ devices since the sub-problem for each part can in principle be solved with fewer computational resources.15–18,22–25 One successful application of DC is to estimate the ground-state potential energy surface of a ring containing 10 hydrogen atoms using the density matrix embedding theory (DMET) on a trapped-ion quantum computer, in which a 20-qubit problem is decomposed into ten 2-qubit problems.18
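A cartoon of the divide-and-conquer bookkeeping (my own toy, far simpler than DMET): split the ring into two-site fragments, solve each small fragment Hamiltonian exactly, and sum the fragment energies.

    import numpy as np

    # Toy two-site fragment Hamiltonian (tight-binding-like, illustrative).
    def fragment_hamiltonian(t):
        return np.array([[0.0, -t],
                         [-t, 0.0]])

    # A "10-atom ring" decomposed into five 2-site fragments,
    # each with its own illustrative coupling strength.
    couplings = [1.0, 0.9, 1.1, 1.0, 0.95]
    total_energy = sum(np.linalg.eigvalsh(fragment_hamiltonian(t))[0]
                       for t in couplings)
    print(total_energy)  # sum of fragment ground-state energies

Real DMET additionally constructs an embedding bath for each fragment and iterates to self-consistency; this sketch only shows the decompose-solve-sum pattern.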

DC often treats all subsystems at the same computational level and estimates physical observables by summing the corresponding quantities over subsystems. In practical simulations of complex systems, however, the particle–particle interactions may exhibit completely different characteristics within and between subsystems. Long-range Coulomb interactions can be well approximated as quasiclassical electrostatic interactions, since empirical methods, such as empirical force field (EFF) approaches,26 can describe these interactions adequately. As the distance between particles decreases, the repulsive exchange interactions between electrons of the same spin become important, so that quantum mean-field approaches, such as Hartree–Fock (HF), are necessary to characterize these electronic interactions.
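Schematically, such a multilevel treatment partitions the total energy along these lines (my own schematic decomposition, assumed for illustration):

    E_{\text{total}} \approx \sum_{I} E_I^{\text{QM}}
    + \sum_{\langle I,J \rangle_{\text{near}}} E_{IJ}^{\text{HF}}
    + \sum_{(I,J)_{\text{far}}} \sum_{i \in I,\ j \in J} \frac{q_i q_j}{4\pi\varepsilon_0 r_{ij}},

with quantum treatment of the individual fragments, mean-field (HF) treatment of close fragment pairs where exchange matters, and quasiclassical point-charge electrostatics for distant pairs.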