Since OpenAI, an AI research and deployment company, introduced its groundbreaking GPT-3 natural language model platform last year, users have discovered countless things that these AI models can do with their powerful and comprehensive understanding of language.
For instance, a sports franchise that’s developing a new app to engage with fans during games could use the models’ ability to quickly and abstractly summarize information to convert transcripts of live television commentary into concise game highlights for inclusion in the app.
The marketing team could use GPT-3’s capability to generate original content and its understanding of what’s happening in the game to help the team brainstorm ideas for social media or blog posts and engage with fans more quickly.
At its Ignite conference today, Microsoft announced it will help its customers uncover these kinds of experiences with the new Azure OpenAI Service, which allows access to OpenAI’s API through the Azure platform and will initially be available by invite only. The new Azure Cognitive Service will give customers access to OpenAI’s powerful GPT-3 models, along with security, reliability, compliance, data privacy and other enterprise-grade capabilities that are built into Microsoft Azure.
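As a rough sketch of what calling a hosted GPT-3 model over REST might look like, the snippet below builds (but does not send) a completion request. The endpoint URL, header name, and parameter names are illustrative assumptions, not the actual Azure OpenAI Service API, which was invite-only at the time of the announcement.

```python
import json
import urllib.request

# Hypothetical endpoint and key -- illustrative only, not the real
# Azure OpenAI Service URL scheme or authentication header.
ENDPOINT = "https://example-resource.openai.azure.example/completions"
API_KEY = "<your-api-key>"  # placeholder

def build_completion_request(prompt, max_tokens=64):
    """Build an HTTP request for a text-completion call (sketch only)."""
    payload = {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.7,  # sampling randomness; assumed parameter name
    }
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "api-key": API_KEY,  # assumed header name
        },
        method="POST",
    )

# Build a request that asks the model to summarize live game
# commentary into a highlight, as in the fan-app example above.
req = build_completion_request(
    "Summarize this play-by-play into one highlight: ..."
)
```

In a real integration, the request would be sent with `urllib.request.urlopen` (or an HTTP client library) and the model's completion parsed from the JSON response.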
Microsoft will also offer Azure OpenAI Service customers new tools to help ensure outputs that the model returns are appropriate for their businesses, and it will monitor how people are employing the technology to help ensure it’s being used for its intended purposes.
“We are just in the beginning stages of figuring out what the power and potential of GPT-3 is, which is what makes it so interesting,” said Eric Boyd, Microsoft corporate vice president for Azure AI. “Now we are taking what OpenAI has released and making it available with all the enterprise promises that businesses need to move into production.”
Built by OpenAI, GPT-3 is part of a new class of models that can be customized to handle a wide variety of use cases.