{"id":15758,"date":"2020-12-16T05:28:11","date_gmt":"2020-12-16T05:28:11","guid":{"rendered":"https:\/\/www.jurn.link\/dazposer\/?p=15758"},"modified":"2020-12-16T05:28:11","modified_gmt":"2020-12-16T05:28:11","slug":"release-nvidia-omniverse","status":"publish","type":"post","link":"https:\/\/jurn.link\/dazposer\/index.php\/2020\/12\/16\/release-nvidia-omniverse\/","title":{"rendered":"Release: NVIDIA Omniverse"},"content":{"rendered":"<p><a href=\"https:\/\/blogs.nvidia.com\/blog\/2020\/12\/15\/omniverse-open-beta-available\/\">NVIDIA Omniverse<\/a> has been released in open beta. In its current form it appears to be an extensible virtual production studio, giving teams the ability to&#8230; &#8220;simultaneously work together on projects with real-time photorealistic rendering&#8221; but also to &#8220;work concurrently between different software applications&#8221; via Omniverse Connectors, which bridge into &#8220;leading&#8221; content creation software. Most interestingly, a Connector bridge to the free Blender is promised for the near future. Naturally, your studio&#8217;s creatives all need to be brewing their wizardry on fast n&#8217; shiny NVIDIA graphics cards and Windows.<\/p>\n<p>The Omniverse platform is only in open beta at present, but already has several working modules within it, including &#8216;Omniverse View&#8217; for architects and &#8216;Omniverse Create&#8217; for designers and creators. It seems to use the Pixar USD format for universal &#8216;in-out porting&#8217; of 3D scenes, moving them between the various applications.<\/p>\n<p>&#8220;Early next year&#8221; this virtual studio platform will see the release of&#8230;<\/p>\n<blockquote><p>&#8220;&#8216;Omniverse Audio2Face&#8217;, AI-powered facial animation; and &#8216;Omniverse Machinima&#8217; for GeForce RTX gamers&#8221;. 
<\/p><\/blockquote>\n<p>Machinima is the term for real-time WYSIWYG animation made with a game engine, and from the sound of it &#8216;Omniverse Machinima&#8217; seems to be tilted toward Unreal Engine users and TV studios &mdash; rather than the hobbyist crowd currently using iClone.<\/p>\n<p>The &#8216;Audio2Face&#8217; module is more interesting, and will aim to have an AI&#8230; &#8220;generate expressive facial animation from just an audio source&#8221; without any need for expensive and fiddly camera-based mo-cap. That makes a lot of sense. Train an AI to match millions of audio vocalisations with visual expressions, then have it generate expressions purely from audio. In fact I&#8217;m a bit surprised such a thing doesn&#8217;t already exist in software &mdash; beyond the existing &#8216;vocal audio to mouth phonemes&#8217; lip-sync automation. But perhaps animating a full face <em>and<\/em> escaping from &#8216;the uncanny valley&#8217; in real time needs a Cloud connection and a zillion back-end NVIDIA GPUs to work? My guess is that you would need a second AI to weed out the &#8220;ugh, no&#8230; uncanny valley&#8221; results.<\/p>\n<p>Anyway, NVIDIA Omniverse looks good and may even be free(?), albeit after the entry-ticket price of a 30-series NVIDIA graphics card and (ugh) Windows 10. When it&#8217;s all polished up and hooked to a Blender bridge, that could make it very interesting for small indie animation studios. But what are the prospects for non-techie hobbyists? Well, DAZ is also an NVIDIA partner, so I guess if DAZ Studio implements a Pixar USD-format bridge then they could also enter the Omniverse?<\/p>\n","protected":false},"excerpt":{"rendered":"<p>NVIDIA Omniverse has been released in open beta. 
In its current form it appears to be an extensible virtual production studio, giving teams the ability to&#8230; &#8220;simultaneously work together on projects with real-time photorealistic rendering&#8221; but also to &#8220;work concurrently between different software applications&#8221; via Omniverse Connectors which bridge into &#8220;leading&#8221; content creation software. Most [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[13,5,8,4,7],"tags":[],"class_list":["post-15758","post","type-post","status-publish","format-standard","hentry","category-companion-software","category-daz-studio","category-real-time-animation","category-spotted-in-the-news","category-the-animation-industry"],"_links":{"self":[{"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/posts\/15758","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/comments?post=15758"}],"version-history":[{"count":0,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/posts\/15758\/revisions"}],"wp:attachment":[{"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/media?parent=15758"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/categories?post=15758"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/tags?post=15758"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}