Digital humans can now be created by anyone: Nvidia to grant access to ACE microservices.

Nvidia Announces Wide Access to ACE Microservices at Computex 2024

Graphics processor giant Nvidia announced broad availability of its Avatar Cloud Engine (ACE) microservices at Computex 2024. Built on generative artificial intelligence, ACE is designed to speed up the creation of realistic avatars for gaming, virtual reality applications, and customer service in fields such as healthcare.

Nvidia CEO Foresees Transformation Across Industries

Nvidia CEO Jensen Huang predicts that the advent of digital humans could revolutionize various sectors. Huang believes that Nvidia technologies such as multimodal large language models and neural graphics promise a future of “intention-driven computing,” where interacting with a computer is as natural as conversing with another person.

ACE Services Adapted for Personal Computing Devices

ACE services were initially available only to developers running them in data centers. Nvidia has now adapted them for personal computers and laptops, introducing the Nemotron-3 4.5B language model, the Audio2Face model, and the Riva ASR model, all of which will soon be accessible. To ease ACE deployment on PCs, Nvidia has built an AI Inference Manager tool that configures the required AI components and models and coordinates their execution across the device and the cloud.
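The hybrid device/cloud setup can be pictured with a short sketch. The snippet below is purely illustrative: the class, function, and memory figures are hypothetical stand-ins, not part of Nvidia's AI Inference Manager, and it only shows the general idea of routing each model to the local GPU when it fits and to the cloud when it does not.

```python
# Hypothetical sketch of hybrid device/cloud inference routing.
# None of these names come from Nvidia's AI Inference Manager SDK;
# the memory numbers are made up for illustration.
from dataclasses import dataclass


@dataclass
class ModelSpec:
    name: str
    vram_required_gb: float  # rough local memory budget for the model


def choose_backend(model: ModelSpec, free_vram_gb: float) -> str:
    """Run on the local GPU when the model fits, otherwise fall back to the cloud."""
    return "local" if model.vram_required_gb <= free_vram_gb else "cloud"


if __name__ == "__main__":
    # Illustrative footprints only; real sizes depend on precision and runtime.
    models = [
        ModelSpec("nemotron-3-4.5b", vram_required_gb=5.0),
        ModelSpec("audio2face", vram_required_gb=2.0),
        ModelSpec("riva-asr", vram_required_gb=1.5),
    ]
    free_vram_gb = 8.0  # e.g. a consumer GeForce RTX card
    for m in models:
        print(f"{m.name}: {choose_backend(m, free_vram_gb)}")
```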

Covert Protocol Revealed Alongside ACE at Computex 2024

At Computex 2024, Nvidia and Inworld AI showcased a refined version of Covert Protocol, a technology demo the two companies developed jointly. It lets players talk to digital characters using their voice, with the Audio2Face and Riva ASR models running on PCs equipped with GeForce RTX graphics cards.
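The voice interaction flow that Covert Protocol demonstrates (speech recognition, dialogue generation, facial animation) can be sketched as a simple pipeline. The stubbed functions below are hypothetical placeholders, not the Riva, Nemotron, or Audio2Face APIs; they exist only so the end-to-end loop is visible and runnable.

```python
# Hypothetical sketch of a voice-driven NPC turn:
# speech recognition -> language model -> facial animation.
# Every function is a placeholder stub, not a real Nvidia API.

def transcribe(audio_chunk: bytes) -> str:
    """Stand-in for an ASR call (e.g. a Riva-style speech-to-text service)."""
    return "where can I find the dockmaster?"


def generate_reply(player_text: str, persona: str) -> str:
    """Stand-in for a small on-device language model producing NPC dialogue."""
    return f"[{persona}] You'll want the north pier, friend."


def animate_face(reply_text: str) -> list[str]:
    """Stand-in for audio-to-face animation: returns a dummy blendshape track."""
    return [f"viseme_{i}" for i in range(len(reply_text.split()))]


def npc_turn(audio_chunk: bytes, persona: str = "harbor guard") -> dict:
    """One conversational turn: hear the player, answer, and animate the answer."""
    text = transcribe(audio_chunk)
    reply = generate_reply(text, persona)
    frames = animate_face(reply)
    return {"player_said": text, "npc_reply": reply, "animation_frames": len(frames)}


if __name__ == "__main__":
    print(npc_turn(b"\x00" * 1024))
```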

ACE’s Existing Applications

According to Nvidia’s website, companies such as Aww, Dell, Gumption, Hippocratic AI, Inventec, OurPalm, Perfect World Games, Reallusion, ServiceNow, Soulbotix, SoulShell, and UneeQ are already using ACE to build virtual assistants and gaming characters.

Companies Embrace ACE for Enhanced Visual Interactivity

One notable example is the Japanese company Aww, which will use Audio2Face for real-time animation of its characters. Game developer Perfect World Games has integrated ACE into its Legends demo, creating an interactive AI character that can converse in both English and Mandarin.

The broader availability of Nvidia’s generative AI services lets more companies develop realistic virtual agents for customer service, marketing, education, medicine, and beyond, paving the way for more advanced digital avatar technologies and natural language processing.

