{"id":16458,"date":"2021-04-20T21:42:29","date_gmt":"2021-04-20T21:42:29","guid":{"rendered":"https:\/\/www.jurn.link\/dazposer\/?p=16458"},"modified":"2021-04-20T21:42:29","modified_gmt":"2021-04-20T21:42:29","slug":"first-thoughts-on-metahuman","status":"publish","type":"post","link":"https:\/\/jurn.link\/dazposer\/index.php\/2021\/04\/20\/first-thoughts-on-metahuman\/","title":{"rendered":"First thoughts on MetaHuman"},"content":{"rendered":"<p>Hmmm&#8230; MetaHuman. First thoughts. <\/p>\n<p>It&#8217;s obviously destined for semi-pro and small-studio videogames makers who want to shave a few days or weeks off a too-tight production schedule. It appears to output very good starting points for your hundreds of NPCs, that will later be optimised in-game for 60 frames per second. However, flip through the latest <em>PC Gamer<\/em> magazine for a second. Do you see any &#8220;uncanny valley&#8221; hyper-real characters, that look like the standard AAA humans MetaHuman is pushing? Very few, these days. In fact, the new U.S. edition has a manga\/anime girl on the cover. She runs real-time in the latest hit game. <\/p>\n<p><a href=\"https:\/\/www.jurn.link\/dazposer\/oldimages\/gamer-1.jpg\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.jurn.link\/dazposer\/oldimages\/gamer-1.jpg\" alt=\"\" width=\"212\" height=\"300\" class=\"alignnone size-medium wp-image-16463\" \/><\/a><\/p>\n<p>That said, the head-and-shoulders MetaHuman demo shows obvious superiority to previous &#8220;quickie avatar\/NPC makers&#8221; for games, of the sort that now litter the bankruptcy registers.  As such there&#8217;s also obvious potential for real-time motion-capture movie-making use, if (and it&#8217;s a big if) the mo-cap and AI-aided tweaking in postwork can get the footage past the &#8220;uncanny valley&#8221;. The average screen entertainment viewer wants Marilyn Monroe, not the 2020s equivalent of a <em>Thunderbirds<\/em> puppet. But 98% of digital art hobbyists have no interest in making storytelling movies, nor in the full-body motion-capture rigs needed to make that happen.<\/p>\n<p>The MetaHuman tech-demo is set to evolve into a free ongoing cloud service, if the early reports are correct. As such I&#8217;d say they have two money making options which will dictate their add-ons&#8230; <\/p>\n<p>1) make the exporter modules a paid item, if you&#8217;re not sending your figure to the Unreal engine. Exporters to push your 12Gb&#8217;s of .FBX figure to Blender, Cinema 4D etc, maybe even to DAZ. But, most likely, never directly to their competitors such as Unity, NVIDIA Omniverse etc. Since their build-a-human service is in the cloud, the exporters cannot be pirated. That sounds steadily lucrative, and for not much ongoing effort.<\/p>\n<p>2) or build a vast sprawling content eco-system on this, complete with &#8216;anatomically-correct&#8217; figures, skimpy frillies, ankle-bracelets etc. Then promise not to look at the &#8216;megaboobs&#8217; and other naughty figures that people make and download. But that would damage their brand, and also be a big hassle to admin and do PR for. Why bother, when you have the money coming in via option one? For that reason, I can&#8217;t see that DAZ or Renderosity will have a great deal to worry about. The &#8216;silent majority&#8217; of their users will not want to use cloud services, and will be content with clothes-swopping, kit-bashing and morph-tweaking in privacy. Even if it means staying a step below the current state-of-the-art in hyper-realism. 
Much the same is true of those who want the wealth of creative science-fiction and fantasy content that DAZ and Poser now provide, royalty-free, not to mention toons, animals and monsters.</p>

<p>It also seems to me that the average dedicated DAZ user will fairly soon just say… "I got a new PC and a 30-series NVIDIA card, so I run DAZ Iray in real time now". True, there's still a damnable graphics-card drought, but that surely can't last forever. This means the "ooh… it works in real time" thing is a bit of a red herring. The only caveat there is the hair. Adding 3D hair has always caused a huge drop in scene pliability, and it's just possible that MetaHuman has done more than create awesome-looking 'helmet hair'. Real-time hair made of true 'stranded grooms', which can be easily re-styled… that would be quite something.</p>

<p>Finally, the 'elephant in the room' is AI. We've recently seen the Deep Nostalgia service very ably auto-animate a still 2D vintage photo with head-turns and eye-blinks. How much further will that go in the next few years? We may yet see Reallusion popping out a 'CrazyTalk AI', so don't count them out yet, either.</p>