Hardware maker Nvidia is ramping up its push into the metaverse. On Tuesday, the company revealed a new set of developer tools for metaverse environments, including new AI capabilities, simulations, and other creative assets.
Creators using the Omniverse Kit, along with apps such as Nucleus, Audio2Face and Machinima, will have access to the new upgrades. Nvidia says one primary function of the tools will be to help developers build "accurate digital twins and realistic avatars."
The quality of metaverse interaction is a hot topic in the industry, as developers and users weigh the quality of experiences against their quantity. One example could be seen during the first-ever metaverse fashion week, held this spring.
Feedback from the event overwhelmingly cited a lack of quality in the digital environments, the garments and, in particular, the avatars with which people interacted.
The new Nvidia toolkit includes the Omniverse Avatar Cloud Engine (ACE). The developers claim that ACE will make it easier to build "virtual assistants and digital humans."
Digital identity is a key focus of the update to the Audio2Face application. According to Nvidia's official statement, users can now direct the emotion of digital avatars over time, including full-face animation.
It’s clear that engagement in the metaverse will continue to grow. In fact, the metaverse market is projected to surpass $50 billion within the next four years, signaling an increase in participation. Moreover, new events, workplaces, and even university classes are popping up in digital reality.
Therefore, more users will seek to create digital versions of themselves. The development of technology to support mass metaverse
Read more on cointelegraph.com