While discussing investments in Amazon Web Services (AWS) during the Q1 2023 earnings call, Jassy highlighted that the company will continue to invest in AI across its cloud computing services.
“Our recent announcement on Large Language Models (LLMs) and generative AI and the chips and managed services associated with them is another recent example,” he said.
Earlier this month, AWS launched Bedrock, a generative AI service, in a limited preview. Through Bedrock, AWS will offer access to its own first-party language models, called Titan, one of which can generate text for blog posts, emails or other documents.
He also said that truly capable LLMs take billions of dollars to train.
Alexa: a starting point for investment in AI
Jassy said that LLMs accelerate “the possibility of building that [Alexa] world’s best personal assistant.”
“I think when people often ask us about Alexa, what we often share is that if we were just building a smart speaker, it would be a much smaller investment. But we have a vision, which we have conviction about that we want to build the world’s best personal assistant. And to do that, it’s difficult,” he said.
Jassy noted that the company already has a large language model underpinning Alexa’s technology, but it is building one that is much larger, more generalised and more capable.
“And I think we start from a pretty good spot with Alexa because we have a couple of hundred million endpoints being used across entertainment and shopping and smart home and information and a lot of involvement from third-party ecosystem partners,” the CEO added.
“And I think that’s going to really rapidly accelerate our vision of becoming the world’s best personal assistant. I think there’s a significant business model underneath it,” he added.