
NIST Acknowledges First Four Quantum-Resistant Encryption Tools

The US Department of Commerce’s National Institute of Standards and Technology (NIST) has selected the first-ever group of encryption tools designed to withstand attack by a quantum computer.

The four selected encryption algorithms will now reportedly become part of NIST’s post-quantum cryptographic (PQC) standard, which should be finalized in about two years.

More specifically, for general encryption (used for access to secure websites), NIST has selected the CRYSTALS-Kyber algorithm.
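For readers who want to experiment, below is a minimal sketch of the key-encapsulation flow Kyber provides, assuming the open-source liboqs-python (`oqs`) bindings, which the article does not mention; algorithm names and availability depend on the installed liboqs build.

```python
import oqs  # liboqs-python bindings (assumption: installed with Kyber support)

# One side plays the "client" (generates a keypair), the other the "server"
# (encapsulates a shared secret against the client's public key).
with oqs.KeyEncapsulation("Kyber512") as client, oqs.KeyEncapsulation("Kyber512") as server:
    public_key = client.generate_keypair()
    ciphertext, secret_server = server.encap_secret(public_key)
    secret_client = client.decap_secret(ciphertext)
    assert secret_client == secret_server  # both sides now share the same secret
```

In a TLS-style deployment, that shared secret would then key a conventional symmetric cipher for the actual traffic.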

Quantum Processor Completes 9,000 Years of Work in 36 Microseconds

The future is now!


Technology continues to move forward at incredible speed, and it seems like every week brings a new breakthrough that changes our sense of what is possible.

Researchers in Toronto used a photonic quantum computing chip to solve a sampling problem far beyond the reach of the fastest classical computers and algorithms.

In their published paper, the researchers report that the Borealis quantum chip took only 36 microseconds to solve a problem that would take supercomputers and classical algorithms roughly 9,000 years.
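As a rough illustration of the kind of task involved, here is a toy Gaussian boson sampling circuit written with the strawberryfields Python library (an assumption, not named in the article); Borealis itself uses hundreds of time-multiplexed squeezed-light modes, whereas this sketch samples from just three.

```python
import strawberryfields as sf
from strawberryfields import ops

# Toy 3-mode Gaussian boson sampling circuit (illustrative only).
prog = sf.Program(3)
with prog.context as q:
    for i in range(3):
        ops.Sgate(0.6) | q[i]            # squeeze each mode
    ops.BSgate(0.7, 0.0) | (q[0], q[1])  # interfere the modes
    ops.BSgate(0.7, 0.0) | (q[1], q[2])
    ops.MeasureFock() | q                # count photons in every mode

eng = sf.Engine("gaussian")
print(eng.run(prog).samples)             # one sample of photon counts, e.g. [[0 1 1]]
```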

Photonic synapses with low power consumption and high sensitivity

Neuromorphic photonics/electronics is the future of ultralow-energy intelligent computing and artificial intelligence (AI). In recent years, inspired by the human brain, artificial neuromorphic devices have attracted extensive attention, especially for simulating visual perception and memory storage. Because of their advantages of high bandwidth, high interference immunity, ultrafast signal transmission and low energy consumption, neuromorphic photonic devices are expected to respond to input data in real time. In addition, photonic synapses enable a non-contact writing strategy, which contributes to the development of wireless communication.

The use of low-dimensional materials provides an opportunity to develop complex brain-like systems and low-power memory logic computers. For example, large-scale, uniform and reproducible transition metal dichalcogenides (TMDs) show great potential for miniaturized, low-power biomimetic device applications due to their excellent charge-trapping properties and compatibility with traditional CMOS processes. The von Neumann architecture, with its discrete memory and processor, leads to high power consumption and low efficiency in traditional computing. Therefore, neuromorphic architectures that fuse sensing with memory, or integrate sensing, memory and processing, can meet the growing demands of big data and AI for low-power, high-performance devices. Artificial synaptic devices are the most important components of neuromorphic systems, and evaluating their performance will help apply them to more complex artificial neural networks (ANNs).

Chemical vapor deposition (CVD)-grown TMDs inevitably contain defects or impurities, which give rise to a persistent photoconductivity (PPC) effect. TMD photonic synapses that integrate synaptic properties with optical detection show great advantages in neuromorphic systems for low-power visual information perception and processing, as well as brain-like memory.
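To make that behaviour concrete, the toy simulation below (plain numpy, arbitrary constants, not taken from any real device) treats each light pulse as a small increase in synaptic weight that then decays slowly, mimicking persistent photoconductivity.

```python
import numpy as np

# Toy photonic-synapse model (illustrative only): each light pulse adds to the
# synaptic "weight" (photocurrent), which then decays slowly between pulses.
dt = 1e-3                                # time step (s)
tau = 0.5                                # decay time constant (s)
gain = 0.2                               # weight increase per optical pulse
pulse_steps = {500, 1000, 1500, 2000}    # time steps at which light pulses arrive

weight = 0.0
history = []
for t in range(5000):
    if t in pulse_steps:
        weight += gain                   # potentiation by an optical pulse
    weight *= np.exp(-dt / tau)          # slow decay -> persistent response
    history.append(weight)

print(f"weight just after last pulse: {history[2001]:.3f}")
print(f"weight at end of run:         {history[-1]:.3f}")
```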

Former SpaceX Rocket Scientist Now Makes High-Tech Pizza

Making pizza is not rocket science, but for this actual rocket scientist it is now. Benson Tsai is a former SpaceX employee who is now using his skills to launch a new venture: Stellar Pizza, a fully automated, mobile pizza delivery service. When a customer places an order on an app, an algorithm decides when to start making the pizza based on how long it will take to get to the delivery address. Inside Edition Digital’s Mara Montalbano has more.
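The segment does not describe Stellar Pizza’s actual scheduling logic, but a back-of-the-envelope version of “decide when to start making the pizza” might look like the hypothetical helper below.

```python
from datetime import datetime, timedelta

# Illustrative toy only: Stellar Pizza's real scheduling system is not public.
def start_time(target_delivery: datetime,
               travel_minutes: float,
               prep_and_bake_minutes: float = 10.0) -> datetime:
    """Latest moment to start making a pizza so it arrives on time."""
    return target_delivery - timedelta(minutes=travel_minutes + prep_and_bake_minutes)

promised = datetime(2022, 7, 8, 18, 30)            # hypothetical delivery time
print(start_time(promised, travel_minutes=12))     # -> 2022-07-08 18:08:00
```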

10 Best Machine Learning Software

10. Microsoft Cognitive Toolkit (CNTK)

Closing out our list of the 10 best machine learning software is Microsoft Cognitive Toolkit (CNTK), Microsoft’s AI toolkit for training machines with its deep learning algorithms. It can be used from Python, C++, and more.

CNTK is an open-source toolkit for commercial-grade distributed deep learning, and it allows users to easily combine popular model types such as feed-forward DNNs, convolutional neural networks (CNNs), and recurrent neural networks (RNNs/LSTMs).
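As a quick illustration, the sketch below defines and evaluates a small feed-forward DNN with CNTK’s Python layers API; the layer sizes and random input are arbitrary, and the exact API may vary with the CNTK version installed.

```python
import numpy as np
import cntk as C

# Minimal feed-forward DNN sketch; sizes are arbitrary, not a tuned model.
x = C.input_variable(784)
model = C.layers.Sequential([
    C.layers.Dense(64, activation=C.relu),
    C.layers.Dense(10)
])
z = model(x)

# Forward pass on random data just to show the model evaluates.
sample = np.random.rand(1, 784).astype(np.float32)
print(z.eval({x: sample}).shape)   # -> (1, 10)
```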

Researcher Tells AI to Write a Paper About Itself, Then Submits It to Academic Journal

😳!


It looks like algorithms can write academic papers about themselves now. We gotta wonder: how long until human academics are obsolete?

In an editorial published by Scientific American, Swedish researcher Almira Osmanovic Thunström describes what began as a simple experiment in how well OpenAI’s GPT-3 text-generating algorithm could write about itself, and ended with a paper that is currently undergoing peer review.

The initial command Thunström entered into the text generator was elementary enough: “Write an academic thesis in 500 words about GPT-3 and add scientific references and citations inside the text.”
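For context, a prompt like that could have been sent to GPT-3 through OpenAI’s legacy Completions endpoint (as it existed in 2022) roughly as sketched below; the specific model and parameters are assumptions, since the editorial does not state them.

```python
import openai

openai.api_key = "sk-..."  # your API key

# Hedged sketch using the legacy Completions API; model choice is an assumption.
response = openai.Completion.create(
    engine="text-davinci-002",
    prompt=("Write an academic thesis in 500 words about GPT-3 "
            "and add scientific references and citations inside the text."),
    max_tokens=900,
    temperature=0.7,
)
print(response["choices"][0]["text"])
```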
