A powerful neural network has been created in China: it is 10 times more powerful than its counterparts

Researchers in China have created a neural network 10 times more powerful than GPT-3, until now the most “advanced” model of its kind.

Experts from the Beijing Academy of Artificial Intelligence announced Wu Dao 2.0, a generative deep-learning neural network of record-breaking scale. Its capacity far exceeds that of its closest competitors, OpenAI's GPT-3 and Google's Switch Transformer.

The new network's main advantage is its scale, which is measured by the number of training parameters: the number of adjustable factors the network can draw on in the course of its work.

OpenAI's GPT-3 uses 175 billion parameters, enough for it to generate coherent text, music, and program code.

Wu Dao 2.0 has more than 1.75 trillion parameters, 10 times as many as GPT-3.
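To make the metric concrete: a “parameter” is simply a trainable weight in the network, and the totals quoted above are obtained by summing every weight and bias across all layers. The sketch below, which assumes PyTorch and a hypothetical two-layer toy model (the dimensions are illustrative, not those of GPT-3 or Wu Dao 2.0), shows how such a count is computed; the large models tally their hundreds of billions or trillions of parameters the same way, just over vastly larger layers.

```python
# Toy illustration of what a "parameter" is: every trainable weight
# and bias in the network counts toward the total. The layer sizes
# here are hypothetical and chosen only for demonstration.
import torch.nn as nn

model = nn.Sequential(          # hypothetical two-layer network
    nn.Linear(512, 1024),       # 512*1024 weights + 1024 biases
    nn.ReLU(),
    nn.Linear(1024, 10),        # 1024*10 weights + 10 biases
)

total = sum(p.numel() for p in model.parameters())
print(f"{total:,} parameters")  # 535,562 for this toy model
```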

The new model can perform a variety of functions; it:

  • writes essays,
  • composes poems and couplets in traditional Chinese,
  • recognizes images and generates them from verbal descriptions,
  • imitates speech,
  • creates culinary recipes,
  • predicts the three-dimensional structure of proteins.

To train the neural network, the developers used almost 5 TB of images and text, including 1.2 TB of text in Chinese and English.

Author: John Kessler
Graduated from the Massachusetts Institute of Technology. Previously worked in various little-known media outlets. He is currently an expert, editor, and developer at Free News.
Function: Director
