{"id":20818,"date":"2023-01-08T18:58:08","date_gmt":"2023-01-08T18:58:08","guid":{"rendered":"https:\/\/jurn.link\/dazposer\/?p=20818"},"modified":"2023-01-10T04:12:49","modified_gmt":"2023-01-10T04:12:49","slug":"the-midas-touch","status":"publish","type":"post","link":"https:\/\/jurn.link\/dazposer\/index.php\/2023\/01\/08\/the-midas-touch\/","title":{"rendered":"The MiDaS touch"},"content":{"rendered":"<p><a href=\"https:\/\/huggingface.co\/spaces\/pytorch\/MiDaS\">MiDaS<\/a> uses a trained AI model to take a normal 2D image and output a 3D depth-map. In Poser-speak, it&#8217;s like Poser&#8217;s &#8216;auxiliary Z-depth&#8217; pass or render.<\/p>\n<p><a href=\"https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/01\/2023-01-08_185104.jpg\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/01\/2023-01-08_185104.jpg\" alt=\"\" width=\"640\" height=\"206\" class=\"aligncenter size-large wp-image-20819\" srcset=\"https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/01\/2023-01-08_185104.jpg 1559w, https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/01\/2023-01-08_185104-300x96.jpg 300w, https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/01\/2023-01-08_185104-1024x329.jpg 1024w, https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/01\/2023-01-08_185104-768x247.jpg 768w, https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/01\/2023-01-08_185104-1536x494.jpg 1536w\" sizes=\"auto, (max-width: 640px) 100vw, 640px\" \/><\/a><\/p>\n<p>Free and public, no sign-up needed. Just drag-and-drop your image. 
It can probably also be installed locally, though I haven&#8217;t looked at the requirements for that.<\/p>\n<p>Once you have it, you can use the usual Photoshop layer inversion\/blending-mode tricks to create &#8216;depth-fog&#8217; in the scene, where there was none before.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>MiDaS uses a trained AI model to take a normal 2D image and output a 3D depth-map. In Poser-speak, it&#8217;s like Poser&#8217;s &#8216;auxiliary Z-depth&#8217; pass or render. Free and public, no sign-up needed. Just drag-and-drop your image. It can probably also be installed locally, though I haven&#8217;t looked at the requirements for that. Once you have it [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[17,23,13,4],"tags":[],"class_list":["post-20818","post","type-post","status-publish","format-standard","hentry","category-3d-utilities","category-automation","category-companion-software","category-spotted-in-the-news"],"_links":{"self":[{"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/posts\/20818","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/comments?post=20818"}],"version-history":[{"count":4,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/posts\/20818\/revisions"}],"predecessor-version":[{"id":20823,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/posts\/20818\/revisions\/20823"}],"wp:attachment":[{"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/media?
parent=20818"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/categories?post=20818"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/tags?post=20818"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}