Free Board

Amazon sets new team to train ambitious AI model codenamed...

Page Information

Author: Rosalinda
Comments: 0 | Views: 17 | Date: 25-02-02 03:28

Body

By Krystal Hu Nov 6 (Reuters) - Amazon is investing millions in training an ambitious large language model (LLM), hoping it could rival top models from OpenAI and Alphabet, two people familiar with the matter told Reuters. The model, codenamed "Olympus", has 2 trillion parameters, the people said, which could make it one of the largest models being trained. OpenAI's GPT-4, one of the best models available, is reported to have one trillion parameters.

The people spoke on condition of anonymity because the details of the project were not yet public. Amazon declined to comment. The Information reported on the project name on Tuesday. The team is spearheaded by Rohit Prasad, former head of Alexa, who now reports directly to CEO Andy Jassy. As head scientist of artificial general intelligence (AGI) at Amazon, Prasad brought together researchers who had been working on Alexa AI and the Amazon science team to work on training models.

Amazon has already trained smaller models such as Titan. It has also partnered with AI model startups such as Anthropic and AI21 Labs, offering their models to Amazon Web Services (AWS) users. Amazon believes having homegrown models could make its offerings more attractive on AWS, where enterprise clients want access to top-performing models, sources said. LLMs are the underlying technology for AI tools that learn from huge datasets to generate human-like responses.

Training bigger AI models is more expensive given the amount of computing power required. In an earnings call in April, Amazon executives said the company would increase investment in LLMs and generative AI while cutting back on fulfillment and transportation in its retail business. (Reporting by Krystal Hu in San Francisco. Editing by Gerry Doyle)

Comments

No comments have been posted.