Applied Materials said it has achieved a breakthrough in chip wiring that will let semiconductor manufacturers shrink chips to the point where the width between circuits is as little as three billionths of a meter. Current chip factories make 7nm and 5nm chips, so 3nm chips represent the next generation of the technology.
These 3nm production lines will be part of factories that cost more than $22 billion to build — and generate a lot more revenue than that. The breakthrough in chip wiring will enable logic chips to scale to three nanometers and beyond, the company said.
Chip manufacturing companies can use the wiring tools in their huge factories, and the transition from 5nm factories to 3nm factories could help ease a shortage of semiconductor chips that has plagued the entire electronics industry. But it will be a while before the chips go into production. In addition to interconnect scaling challenges, there are other issues related to the transistor (extending the use of FinFET transistors and transitioning to Gate All Around transistors), as well as patterning (extreme ultraviolet and multi-patterning).
In the very last moments of the movie, however, you would also see something unusual: the sprouting of clouds of satellites, and the wrapping of the land and seas with wires made of metal and glass. You would see the sudden appearance of an intricate artificial planetary crust capable of tremendous feats of communication and calculation, enabling planetary self-awareness — indeed, planetary sapience.
The emergence of planetary-scale computation thus appears as both a geological and geophilosophical fact. In addition to evolving countless animal, vegetal and microbial species, Earth has also very recently evolved a smart exoskeleton, a distributed sensory organ and cognitive layer capable of calculating things like: How old is the planet? Is the planet getting warmer? The knowledge of “climate change” is an epistemological accomplishment of planetary-scale computation.
Over the past few centuries, humans have chaotically and in many cases accidentally transformed Earth’s ecosystems. Now, in response, the emergent intelligence represented by planetary-scale computation makes it possible, and indeed necessary, to conceive an intentional, directed and worthwhile planetary-scale terraforming. The vision for this is not to be found in computing infrastructure itself, but in the purposes to which we put it.
On Jan. 15, a hacker tried to poison a water treatment plant that served parts of the San Francisco Bay Area. It didn’t seem hard.
The hacker had the username and password for a former employee’s TeamViewer account, a popular program that lets users remotely control their computers, according to a private report compiled by the Northern California Regional Intelligence Center in February and seen by NBC News.
After logging in, the hacker, whose name and motive are unknown and who hasn’t been identified by law enforcement, deleted programs that the water plant used to treat drinking water.
Big Blue has, for the first time, built a quantum computer that is not physically located in its US data centers. For the company, this is the start of global quantum expansion.
Check out my short video in which I explain some super exciting research in the area of nanotechnology: de novo protein lattices! I specifically discuss a journal article by Ben-Sasson et al. titled “Design of biologically active binary protein 2D materials”.
Though I am not involved in this particular research myself, I have worked in adjacent areas such as de novo engineering of aggregating antimicrobial peptides, synthetic biology, nanotechnology-based tools for neuroscience, and gene therapy. I am endlessly fascinated by this kind of computationally driven de novo protein design and would love to incorporate it in my own research at some point in the future.
I am a PhD candidate at Washington University in St. Louis and the CTO of the startup company Conduit Computing. I am also a published science fiction writer and a futurist. To learn more about me, check out my website: https://logancollinsblog.com/.
In the search for a unifying quantum gravity theory that would reconcile general relativity with quantum theory, it turns out that quantum theory is the more fundamental of the two after all. Quantum mechanical principles, some physicists argue, apply to all of reality, not only to the realm of the ultra-tiny, and numerous experiments support that assumption. After a century in which Einsteinian relativistic physics went largely unchallenged, a new kid on the block, Computational Physics, one of the frontrunners for quantum gravity, holds that spacetime is a flat-out illusion and that what we call physical reality is actually a construct of information within [quantum neural] networks of conscious agents. In light of the physics of information, computational physicists envision a new theory as an “It from Qubit” offspring, one that necessarily incorporates consciousness into the new theoretical models and treats spacetime, mass-energy and gravity as emergent from information processing.
In fact, I expand on the foundations of such a new physics of information, also referred to as [Quantum] Computational Physics, Quantum Informatics, Digital Physics, and Pancomputationalism, in my recent book The Syntellect Hypothesis: Five Paradigms of the Mind’s Evolution. The Cybernetic Theory of Mind I’m currently developing is based on reversible quantum computing and projective geometry at large. This ontological model, my own “theory of everything,” agrees with certain quantum gravity contenders, such as M-Theory on fractal dimensionality and Emergence Theory on code-theoretic ontology, but admittedly goes beyond all current models by treating space-time, mass-energy and gravity as emergent from information processing within a holographic, multidimensional matrix with the Omega Singularity as the source.
There are plenty of recent cosmological anomalies that make us question the traditional interpretation of relativity. First off, the rate of the expansion of our Universe, the Hubble constant, whose discovery prompted Albert Einstein (1879–1955) to call his cosmological constant “the biggest blunder” of his scientific career, is the subject of a very important discrepancy: its value changes depending on how scientists try to measure it. New results from the Hubble Space Telescope have now “raised the discrepancy beyond a plausible level of chance,” according to one of the latest papers published in the Astrophysical Journal. We are stumbling ever more often on all kinds of discrepancies in relativistic physics and the standard cosmological model. Not only is the Hubble constant “constantly” called into question; even the speed of light, on which Einsteinian theories are based, shows such discrepancies when measured by different methods and turns out not to be truly “constant.”
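To make the scale of that discrepancy concrete, here is a minimal sketch of my own, not drawn from the paper, that plugs in the widely reported approximate values from the early-universe (Planck CMB) inference and the local Hubble Space Telescope / SH0ES distance-ladder measurement and expresses the gap in combined standard deviations; the specific numbers are assumptions taken from published results rather than from this article.

```python
# Illustrative only: approximate published values, not figures from the article.
import math

h0_cmb, err_cmb = 67.4, 0.5      # km/s/Mpc, Planck CMB (early-universe) inference
h0_local, err_local = 73.2, 1.3  # km/s/Mpc, HST/SH0ES local distance-ladder value

# Express the gap in combined standard deviations (the "Hubble tension").
tension_sigma = (h0_local - h0_cmb) / math.sqrt(err_cmb**2 + err_local**2)
print(f"Gap: {h0_local - h0_cmb:.1f} km/s/Mpc, roughly {tension_sigma:.1f} sigma")

# Why it matters: 1/H0 sets the expansion timescale (the "Hubble time").
KM_PER_MPC = 3.0857e19           # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16       # seconds in one billion years
for label, h0 in [("early-universe", h0_cmb), ("local", h0_local)]:
    hubble_time_gyr = KM_PER_MPC / h0 / SECONDS_PER_GYR
    print(f"Hubble time ({label}): {hubble_time_gyr:.1f} Gyr")
```

With these inputs the gap works out to roughly four combined standard deviations, which is the sense in which the discrepancy has moved “beyond a plausible level of chance.”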
Toshiba’s Cambridge Research Laboratory has achieved quantum communications over optical fibres exceeding 600 km in length, three times further than the previous world record distance.
The breakthrough will enable long distance, quantum-secured information transfer between metropolitan areas and is a major advance towards building a future Quantum Internet.
The term “Quantum Internet” describes a global network of quantum computers connected by long distance quantum communication links. This technology will improve the current Internet by offering several major benefits, such as ultra-fast solving of complex optimisation problems in the cloud, a more accurate global timing system, and ultra-secure communications. Personal data, medical records, bank details, and other information will be physically impossible for hackers to intercept. Several large government initiatives to build a Quantum Internet have been announced in China, the EU and the USA.
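Why would interception be “physically impossible” rather than merely difficult? Quantum-secured links rely on quantum key distribution: measuring a photon in the wrong basis disturbs it, so an eavesdropper inevitably leaves a statistical fingerprint. Below is a toy, purely classical simulation of that BB84 intercept-and-resend logic, a sketch of my own for illustration (the function names and structure are hypothetical, not Toshiba’s protocol or code); it shows the eavesdropper pushing the error rate in the sifted key to roughly 25%.

```python
# Toy intercept-and-resend simulation of the BB84 idea (illustration only).
import random

def bb84_error_rate(n_rounds: int, eavesdrop: bool) -> float:
    """Fraction of mismatched bits in the sifted key."""
    errors = sifted = 0
    for _ in range(n_rounds):
        alice_bit = random.randint(0, 1)
        alice_basis = random.randint(0, 1)        # 0 = rectilinear, 1 = diagonal
        photon_bit, photon_basis = alice_bit, alice_basis

        if eavesdrop:
            eve_basis = random.randint(0, 1)
            if eve_basis != alice_basis:
                # A wrong-basis measurement gives a random outcome, and Eve
                # resends the photon prepared in her own basis.
                photon_bit = random.randint(0, 1)
            photon_basis = eve_basis

        bob_basis = random.randint(0, 1)
        bob_bit = photon_bit if bob_basis == photon_basis else random.randint(0, 1)

        if bob_basis == alice_basis:              # basis sifting
            sifted += 1
            errors += bob_bit != alice_bit
    return errors / sifted

print(f"No eavesdropper:   error rate ~ {bb84_error_rate(20000, False):.1%}")
print(f"With eavesdropper: error rate ~ {bb84_error_rate(20000, True):.1%}")
```

In a real deployment, Alice and Bob publicly compare a random sample of the sifted key; an error rate far above the channel’s baseline tells them the exchange was observed and the key must be discarded, which is why undetected interception is a matter of physics rather than computational effort.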