{"id":21905,"date":"2023-09-16T03:31:46","date_gmt":"2023-09-16T03:31:46","guid":{"rendered":"https:\/\/jurn.link\/dazposer\/?p=21905"},"modified":"2023-11-04T07:48:34","modified_gmt":"2023-11-04T07:48:34","slug":"release-openpose-for-poser-poser-to-sd","status":"publish","type":"post","link":"https:\/\/jurn.link\/dazposer\/index.php\/2023\/09\/16\/release-openpose-for-poser-poser-to-sd\/","title":{"rendered":"Release: OpenPose for Poser &#8211; Poser to SD"},"content":{"rendered":"<p>Ken K has released <a href=\"https:\/\/www.renderosity.com\/marketplace\/products\/161157?AID=4737\">OpenPose for Poser 12<\/a>, which provides the first &#8216;Poser to Stable Diffusion&#8217; pipeline. <\/p>\n<p><a href=\"https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/09\/poserai-videoframe.jpg\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/09\/poserai-videoframe.jpg\" alt=\"\" width=\"640\" height=\"386\" class=\"aligncenter size-large wp-image-21906\" srcset=\"https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/09\/poserai-videoframe.jpg 1189w, https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/09\/poserai-videoframe-300x181.jpg 300w, https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/09\/poserai-videoframe-1024x617.jpg 1024w, https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/09\/poserai-videoframe-768x462.jpg 768w\" sizes=\"auto, (max-width: 640px) 100vw, 640px\" \/><\/a><\/p>\n<p>The <a href=\"https:\/\/www.youtube.com\/watch?v=P38qPPzArLc\">Video Demo<\/a> suggests it&#8217;s useful for speeding up the feeding of SD with exact poses, and it looks especially useful for hands. While a Controlnet can provide native openpose estimation, the hands are often not well detected. 
Ken&#8217;s method lets you get excellent hands.<\/p>\n<p><a href=\"https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/09\/hands-poser-sd.jpg\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/09\/hands-poser-sd.jpg\" alt=\"\" width=\"768\" height=\"1024\" class=\"aligncenter size-full wp-image-21907\" srcset=\"https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/09\/hands-poser-sd.jpg 768w, https:\/\/jurn.link\/dazposer\/wp-content\/uploads\/2023\/09\/hands-poser-sd-225x300.jpg 225w\" sizes=\"auto, (max-width: 768px) 100vw, 768px\" \/><\/a><\/p>\n<p>The free alternative might be something like a .BVH to OpenPose converter. But no one seems to have made one, which is rather surprising. Everyone wants to estimate poses from analysed video pixels, rather than from the skeletons of 3D figures. So Ken&#8217;s new product seems unique.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Ken K has released OpenPose for Poser 12, which provides the first &#8216;Poser to Stable Diffusion&#8217; pipeline. The Video Demo suggests it speeds up the job of feeding SD exact poses, and it looks especially useful for hands. While a ControlNet can provide native OpenPose estimation, the hands are often not well detected. 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[23,3,26],"tags":[],"class_list":["post-21905","post","type-post","status-publish","format-standard","hentry","category-automation","category-poser","category-poser-12"],"_links":{"self":[{"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/posts\/21905","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/comments?post=21905"}],"version-history":[{"count":7,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/posts\/21905\/revisions"}],"predecessor-version":[{"id":22159,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/posts\/21905\/revisions\/22159"}],"wp:attachment":[{"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/media?parent=21905"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/categories?post=21905"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/jurn.link\/dazposer\/index.php\/wp-json\/wp\/v2\/tags?post=21905"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}