NVIDIA Digital Human Technologies Bring AI Characters to Life
- NVIDIA partners with leading AI developers who use its digital human technologies to create lifelike avatars and dynamic game characters.
- The collaboration spans across industries like healthcare, gaming, financial services, and retail, showcasing the versatility of NVIDIA's technologies.
- Leading developers like Hippocratic AI, Inworld AI, and UneeQ leverage NVIDIA ACE, NeMo, and RTX to enhance customer experiences and gaming interactions.
- The suite of digital human technologies enables developers to create AI-powered natural language interactions, making conversations more engaging and realistic.
- NVIDIA's ACE microservices, NeMo platform, and RTX rendering technologies provide a comprehensive solution for creating digital humans and virtual assistants.
- Game publishers worldwide are exploring the potential of NVIDIA ACE to enhance the gaming experience by bringing dynamic non-playable characters to life.
- Developers across various sectors such as healthcare, gaming, financial services, and media are embracing NVIDIA's ACE and generative AI technologies to revolutionize user interactions.
Leading AI Developers Use Suite of NVIDIA Technologies to Create Lifelike Avatars and Dynamic Characters for Everything From Games to Healthcare, Financial Services and Retail Applications
SAN JOSE, Calif., March 18, 2024 (GLOBE NEWSWIRE) -- NVIDIA announced today that leading AI application developers across a wide range of industries are using NVIDIA digital human technologies to create lifelike avatars for commercial applications and dynamic game characters. The results are on display at GTC, the global AI conference held this week in San Jose, Calif., and can be seen in technology demonstrations from Hippocratic AI, Inworld AI, UneeQ and more.
NVIDIA Avatar Cloud Engine (ACE) for speech and animation, NVIDIA NeMo™ for language, and NVIDIA RTX™ for ray-traced rendering are the building blocks that enable developers to create digital humans capable of AI-powered natural language interactions, making conversations more realistic and engaging.
“NVIDIA offers developers a world-class set of AI-powered technologies for digital human creation,” said John Spitzer, vice president of developer and performance technologies at NVIDIA. “These technologies power the complex animations and conversational speech required to make digital interactions feel real.”
World-Class Digital Human Technologies
The digital human technologies suite includes language, speech, animation and graphics powered by AI:
- NVIDIA ACE — technologies that help developers bring digital humans to life, with facial animation powered by NVIDIA Audio2Face™ and speech powered by NVIDIA Riva automatic speech recognition (ASR) and text-to-speech (TTS). ACE microservices are flexible, allowing models to run in the cloud or on the PC depending on local GPU capabilities, to help ensure users receive the best experience (a minimal speech sketch follows this list).
- NVIDIA NeMo — an end-to-end platform that enables developers to deliver enterprise-ready generative AI models with precise data curation, cutting-edge customization, retrieval-augmented generation and accelerated performance.
- NVIDIA RTX — a collection of rendering technologies, such as RTX Global Illumination (RTXGI) and DLSS 3.5, that enable real-time path tracing in games and applications.
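To make the speech building blocks above concrete, here is a minimal sketch of transcribing a question and synthesizing a spoken reply with the nvidia-riva-client Python package. It assumes a Riva server is already reachable at localhost:50051 and that the input is a short 16 kHz mono WAV file; the file names, voice name and reply text are placeholders rather than anything prescribed by NVIDIA.

```python
# Minimal sketch: speech in / speech out with NVIDIA Riva.
# Assumes a Riva server is reachable at localhost:50051 and the
# nvidia-riva-client package is installed; file names are placeholders.
import riva.client

auth = riva.client.Auth(uri="localhost:50051")  # assumed local endpoint

# Speech recognition (ASR): transcribe a short 16 kHz mono WAV file.
asr = riva.client.ASRService(auth)
asr_config = riva.client.RecognitionConfig(
    language_code="en-US",
    max_alternatives=1,
    enable_automatic_punctuation=True,
)
with open("patient_question.wav", "rb") as f:  # hypothetical input file
    response = asr.offline_recognize(f.read(), asr_config)
transcript = response.results[0].alternatives[0].transcript

# Text-to-speech (TTS): synthesize the avatar's spoken reply.
tts = riva.client.SpeechSynthesisService(auth)
reply = tts.synthesize(
    "Your procedure is scheduled for 8 a.m. tomorrow.",  # example reply text
    voice_name="English-US.Female-1",                    # assumed voice name
    language_code="en-US",
    sample_rate_hz=44100,
)
with open("reply.pcm", "wb") as f:
    f.write(reply.audio)  # raw PCM samples; wrap in a WAV header before playback
```

In a full ACE pipeline, the transcript would be handed to a language model and the synthesized audio passed on to Audio2Face to drive the avatar's facial animation.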
Building Blocks for Digital Humans and Virtual Assistants
To showcase the new capabilities of its digital human technologies, NVIDIA worked across industries with leading developers, such as Hippocratic AI, Inworld AI and UneeQ, on a series of new demonstrations.
Hippocratic AI has created a safety-focused, LLM-powered, task-specific Healthcare Agent. The agent calls patients on the phone, follows up on care coordination tasks, delivers preoperative instructions, performs post-discharge management and much more. For GTC, NVIDIA collaborated with Hippocratic AI to extend its solution with NVIDIA ACE microservices, NVIDIA Audio2Face, NVIDIA Animation Graph and the NVIDIA Omniverse™ Streamer Client to show the potential of a generative AI healthcare agent avatar.
“Our digital assistants provide helpful, timely and accurate information to patients worldwide,” said Munjal Shah, cofounder and CEO of Hippocratic AI. “NVIDIA ACE technologies bring them to life with cutting-edge visuals and realistic animations that help better connect to patients.”
UneeQ is an autonomous digital human platform specializing in AI-powered avatars for customer service and interactive applications. Its digital humans represent brands online, communicating with customers in real time to give them confidence in their purchases. UneeQ integrated the NVIDIA Audio2Face microservice into its platform and combined it with its Synanim ML technology to create highly realistic avatars for better customer experience and engagement.
“UneeQ combines NVIDIA animation AI with our own Synanim ML synthetic animation technology to deliver real-time digital human interactions that are emotionally responsive and deliver dynamic experiences powered by conversational AI,” said Danny Tomsett, founder and CEO of UneeQ.
Bringing Dynamic Non-Playable Characters to Games
NVIDIA ACE is a suite of technologies designed to bring game characters to life. Covert Protocol is a new technology demonstration, created by Inworld AI in partnership with NVIDIA, that pushes the boundary of what character interactions in games can be. Inworld’s AI engine has integrated NVIDIA Riva for accurate speech-to-text and NVIDIA Audio2Face to deliver lifelike facial performances.
Inworld’s AI engine takes a multimodal approach to the performance of non-playable characters (NPCs), bringing together cognition, perception and behavior systems for an immersive narrative with stunning RTX-rendered characters set in a beautifully crafted environment.
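The multimodal turn described above can be pictured as a simple loop: listen, think, speak, animate. The sketch below is purely illustrative; every function in it is a hypothetical stand-in (for Riva ASR, a dialogue engine such as Inworld’s, Riva TTS and Audio2Face, respectively), not a real API.

```python
# Hypothetical sketch of one conversational turn for a voice-driven NPC.
# The stub functions are placeholders for real services; none of these
# names correspond to actual NVIDIA or Inworld APIs.

def transcribe(audio: bytes) -> str:
    """Placeholder for speech-to-text (e.g. Riva ASR)."""
    return "Where is the station chief?"

def npc_reply(text: str, persona: str, memory: list[str]) -> str:
    """Placeholder for the dialogue/cognition layer that plans the NPC's answer."""
    memory.append(text)
    return "Follow me, and keep your voice down."

def synthesize_speech(text: str) -> bytes:
    """Placeholder for text-to-speech (e.g. Riva TTS)."""
    return text.encode()

def animate_face(speech_audio: bytes) -> list[float]:
    """Placeholder for audio-driven facial animation (e.g. Audio2Face blendshapes)."""
    return [0.0] * 52

def npc_turn(player_audio: bytes, persona: str, memory: list[str]) -> bytes:
    """One turn: hear the player, decide, speak, and animate the character."""
    heard = transcribe(player_audio)
    reply = npc_reply(heard, persona, memory)
    speech = synthesize_speech(reply)
    blendshapes = animate_face(speech)  # would drive the RTX-rendered face
    return speech

# Example turn
audio_out = npc_turn(b"...mic capture...", persona="The bartender", memory=[])
```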
“The combination of NVIDIA ACE microservices and the Inworld Engine enables developers to create digital characters that can drive dynamic narratives, opening new possibilities for how gamers can decipher, deduce and play,” said Kylan Gibbs, CEO of Inworld AI.
Game publishers worldwide are evaluating how NVIDIA ACE can improve the gaming experience.
Developers Across Healthcare, Gaming, Financial Services, Media & Entertainment and Retail Embrace ACE
Top game and digital human developers are pioneering ways ACE and generative AI technologies can be used to transform interactions between players and NPCs in games and applications.
Developers and platforms embracing ACE include Convai, Cyber Agent, Data Monsters, Deloitte, Hippocratic AI, IGOODI, Inworld AI, Media.Monks, miHoYo, NetEase Games, Perfect World, Openstream, OurPalm, Quantiphi, Rakuten Securities, Slalom, SoftServe, Tencent, Top Health Tech, Ubisoft, UneeQ and Unions Avatars.
More information on NVIDIA ACE is available at https://developer.nvidia.com/ace. Platform developers can incorporate the full suite of digital human technologies or individual microservices into their product offerings.
Developers can start their journey with NVIDIA ACE by applying for the early access program to get in-development AI models. To explore models that are already available, developers can evaluate NVIDIA NIM, a set of easy-to-use microservices designed to accelerate the deployment of generative AI; NIM microservices for Riva and Audio2Face can be accessed on ai.nvidia.com today.
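For the language side of a digital human, one way to experiment today is through the hosted, OpenAI-compatible endpoints in NVIDIA's API catalog. The sketch below assumes the integrate.api.nvidia.com base URL, an NVIDIA_API_KEY environment variable generated on ai.nvidia.com, and an example hosted model name; the Riva and Audio2Face NIM microservices expose their own request formats documented on the same site.

```python
# Illustrative only: querying a hosted language model through NVIDIA's
# OpenAI-compatible API catalog endpoint to drive an avatar's dialogue.
# The base URL, model name and NVIDIA_API_KEY variable are assumptions.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",  # assumed catalog endpoint
    api_key=os.environ["NVIDIA_API_KEY"],            # key generated on ai.nvidia.com
)

completion = client.chat.completions.create(
    model="mistralai/mixtral-8x7b-instruct-v0.1",    # example hosted model
    messages=[
        {"role": "system", "content": "You are a concise retail assistant avatar."},
        {"role": "user", "content": "Do you have this jacket in medium?"},
    ],
    max_tokens=128,
)
print(completion.choices[0].message.content)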
About NVIDIA
Since its founding in 1993, NVIDIA (NASDAQ: NVDA) has been a pioneer in accelerated computing. The company’s invention of the GPU in 1999 sparked the growth of the PC gaming market, redefined computer graphics, ignited the era of modern AI and is fueling industrial digitalization across markets. NVIDIA is now a full-stack computing infrastructure company with data-center-scale offerings that are reshaping industry. More information at https://nvidianews.nvidia.com/.
For further information, contact:
Benjamin Berraondo
NVIDIA Corporation
+44 7979 384482
bberraondo@nvidia.com
Certain statements in this press release including, but not limited to, statements as to: the benefits, impact, performance, features, and availability of NVIDIA’s products and technologies, including NVIDIA digital human technologies, NVIDIA Avatar Cloud Engine, NVIDIA NeMo, NVIDIA RTX such as RTX Global Illumination and DLSS 3.5, NVIDIA Audio2Face, NVIDIA Riva automatic speech recognition and text-to-speech, NVIDIA ACE microservices, NVIDIA Animation graph, and NVIDIA Omniverse Streamer Client; the benefits and impact of NVIDIA’s collaborations with third parties, and the features and availability of their services and offerings; third parties’ use or adoption of NVIDIA products, technologies and platforms, and the benefits and impacts thereof; AI-powered technologies for digital human creation offered by NVIDIA powering the complex animations and conversational speech required to make digital interactions feel real; and top game and digital human developers pioneering ways ACE and generative AI technologies can be used to transform interactions between players and NPCs in games and applications are forward-looking statements that are subject to risks and uncertainties that could cause results to be materially different than expectations. Important factors that could cause actual results to differ materially include: global economic conditions; our reliance on third parties to manufacture, assemble, package and test our products; the impact of technological development and competition; development of new products and technologies or enhancements to our existing product and technologies; market acceptance of our products or our partners' products; design, manufacturing or software defects; changes in consumer preferences or demands; changes in industry standards and interfaces; unexpected loss of performance of our products or technologies when integrated into systems; as well as other factors detailed from time to time in the most recent reports NVIDIA files with the Securities and Exchange Commission, or SEC, including, but not limited to, its annual report on Form 10-K and quarterly reports on Form 10-Q. Copies of reports filed with the SEC are posted on the company's website and are available from NVIDIA without charge. These forward-looking statements are not guarantees of future performance and speak only as of the date hereof, and, except as required by law, NVIDIA disclaims any obligation to update these forward-looking statements to reflect future events or circumstances.
© 2024 NVIDIA Corporation. All rights reserved. NVIDIA, the NVIDIA logo, Audio2Face, NVIDIA NeMo, NVIDIA Omniverse, and NVIDIA RTX are trademarks and/or registered trademarks of NVIDIA Corporation in the U.S. and other countries. Other company and product names may be trademarks of the respective companies with which they are associated. Features, pricing, availability and specifications are subject to change without notice.
A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/28c9fb52-b11d-4d0c-bf7b-48f60b951ac1