One of the trade-offs of today’s technological progress is the substantial energy cost of processing digital information. Building AI models on silicon-based processors requires training them with huge amounts of data, and in general, the more data, the better the model. The current success of large language models such as ChatGPT illustrates this well: their impressive abilities stem from the enormous datasets used to train them.
The more data we use to teach digital AI, the better it becomes, but also the more computational power, and therefore energy, its training demands.
This is why, to develop AI further, we need to consider alternatives to the current status quo in silicon-based technologies. Indeed, there have recently been many publications on this topic, including statements by Sam Altman, the CEO of OpenAI.