OpenAI Unveils GPT-4: The Advanced AI Model that Responds to Images and Powers Bing Search Engine, but Warns of Disinformation Risks
The new model can process up to 25,000 words and can respond to images, providing recipe suggestions from photos of ingredients, for example, as well as writing captions and descriptions.
OpenAI said it had spent six months on safety features for GPT-4 and had trained it on human feedback, but warned it may still be prone to sharing disinformation.
GPT-4 will initially be available to ChatGPT Plus subscribers, who pay $20 per month for premium access to the service.
GPT-4 has more advanced reasoning skills than its predecessor and can respond to images.
- It’s already powering Microsoft’s Bing search engine. The tech giant has invested $10 billion in OpenAI.
- OpenAI also announced new partnerships with the language learning app Duolingo and Be My Eyes, an application for visually impaired people, to create AI chatbots that assist their users using natural language.
- OpenAI has spent six months on safety features for GPT-4, but warns that it may still be prone to sharing disinformation and making reasoning errors.