The AI-based text generator considered too dangerous to be released!


In a recent blog post, OpenAI released details about a large-scale language model it has created that can generate paragraphs of text indistinguishable from those written by an average person.

The model, named GPT-2, was trained on over 8 million web pages (about 40 GB of text) to predict the next word given the words that came before it. It can also handle other interesting comprehension tasks such as answering questions, resolving ambiguous pronouns, translation and even summarisation of news articles.
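For anyone curious about what "predicting the next word" looks like in practice, here is a minimal sketch using the smaller GPT-2 checkpoint that OpenAI did make public, loaded through the Hugging Face transformers library. The library, the "gpt2" checkpoint name and the prompt are assumptions for illustration, not part of the original post.

```python
# A minimal sketch of next-word prediction with a small public GPT-2 checkpoint.
# Assumes the Hugging Face "transformers" library is installed; the full model
# discussed in the post was withheld at the time.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "In a recent blog post, OpenAI released details about"
inputs = tokenizer(prompt, return_tensors="pt")

# Repeatedly predict the next token and append it to the running text.
output_ids = model.generate(
    **inputs,
    max_length=60,                         # length of prompt + continuation
    do_sample=True,                        # sample instead of always taking the top word
    top_k=50,                              # restrict sampling to the 50 most likely tokens
    pad_token_id=tokenizer.eos_token_id,   # silence the missing-pad-token warning
)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Each generated word is simply the model's guess for what comes next, fed back in as context for the following guess; that one loop is what produces whole paragraphs of plausible text.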

Although the model could lead to significant improvements in areas such as unsupervised translation and speech recognition, OpenAI has decided not to release the trained model. This is mainly due to concerns that it could be used to create misleading news articles, impersonate people on social media, and automate spam and phishing emails. It's another great breakthrough in what the future of AI could be, but also a reminder of its dangers!
