  • 3d 360 panoramas for SAMSUNG Gear VR

    Hi,

    It's been years since I posted to this forum, so I wanted to share something made using V-Ray in my VR work.
    I've been using the strip method to render left- and right-eye parallax-corrected 3D 360 panoramas for viewing on the Oculus Rift DK2 for some time. It's a great format, but hindered by the tethered HMD. Recently I found a way of converting them for use on the Gear VR.

    Attached are several V-Ray examples - they can be viewed using the ORBX viewer from OTOY. Just copy the files into the ORBX/samples directory and they will show up in the viewer.

    They are made by converting the spherical panoramas to cubemap faces using Pano2VR. For those interested in the conversion process, the faces need to be organised into a strip in the following order:

    0, 2, 4, 5, 3, 1 - face 4 needs to be rotated 90 degrees anticlockwise and face 5 needs to be rotated 90 degrees clockwise. IMPORTANT: you will need to mirror (horizontally) your 360 panoramas prior to conversion.

    The faces should be 1536x1536, with the resulting strip being 18432x1536.
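
    For anyone who wants to script the assembly step, here is a minimal sketch in Python using Pillow. It only covers one eye; the face filenames are hypothetical, and it assumes the six faces have already been exported from Pano2VR from a horizontally mirrored panorama. Presumably the left- and right-eye strips are then placed side by side to give the full 18432x1536 stereo strip.

    # Minimal sketch: assemble six 1536x1536 cube faces into a Gear VR strip (one eye).
    # face_0.png .. face_5.png are hypothetical filenames for the Pano2VR cube faces.
    from PIL import Image

    FACE = 1536
    ORDER = [0, 2, 4, 5, 3, 1]                 # left-to-right order in the strip

    strip = Image.new("RGB", (FACE * len(ORDER), FACE))
    for slot, face_id in enumerate(ORDER):
        img = Image.open(f"face_{face_id}.png").resize((FACE, FACE))
        if face_id == 4:
            img = img.rotate(90, expand=True)  # 90 degrees anticlockwise
        elif face_id == 5:
            img = img.rotate(-90, expand=True) # 90 degrees clockwise
        strip.paste(img, (slot * FACE, 0))

    strip.save("gearvr_strip_left.png")        # 9216x1536 for one eye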

    I'm wondering if Chaos Group has plans to release a lens shader to simplify the stereo panorama creation process? That would be awesome.

    Thanks for looking



    Last edited by deflix; 15-05-2015, 01:04 AM.
    Immersive media - design and production
    http://www.felixdodd.com/
    https://www.linkedin.com/in/felixdodd/

  • #2
    Originally posted by deflix
    I'm wondering if Chaos Group has plans to release a lens shader to simplify the stereo panorama creation process? That would be awesome.
    Yes. In fact, it's almost done for the nightly builds - not sure if there is time to include it in the service pack, but at least the spherical stereo panorama will be there.

    Best regards,
    Vlado
    I only act like I know everything, Rogers.



    • #3
      Cool stuff!

      That is some crazy patience, to be working with the strip rendering method.
      I've long since lost patience for it and have instead developed a pretty good workflow for baking V-Ray lighting into the textures of my scene geometry. Reflections are still the biggest issue. I'm waiting for proper Oculus integration in Unity 5 before I dig into the latest reflection system.

      But spherical stereo panoramas that work out of the box might still be fun to play around with.
      Brendan Coyle | www.brendancoyle.com



      • #4
        Hey deflix, that's cool.

        I saw that Samsung released a developer version of the Gear VR on Friday, but I couldn't find enough info on whether I would be able to view stereo or mono panoramas created in V-Ray. As I already have a Nexus 6, I thought about getting one and trying it with some existing projects.
        Dan Brew



        • #5
          Originally posted by DanielBrew
          I thought about getting one and trying it with some existing projects.
          We have one and an Oculus - I can confirm that the new stereo panorama rendering in V-Ray works perfectly with both.



          • #6
            Originally posted by Neilg
            We have one and an Oculus - I can confirm that the new stereo panorama rendering in V-Ray works perfectly with both.
            Cheers for that, Neil. I've ordered the Google Cardboard thing and sent an 8000x2000px stereo pano off to render.
            Dan Brew



            • #7
              This is a great topic!
              I actually just came onto the forum to talk about VR panoramas.

              Sorry to hijack this thread a little, but I was wondering if we could push things a bit further.

              We need to showcase some of our work through the Oculus and make it look great in VR.
              We are therefore currently using krpano 1.19 pre-release 2, which came out four days ago:
              http://krpano.com/forum/wbb/index.ph...9838#post59838

              It's great, and it loads a 360 pano into a VR world really easily.
              It supports tours and video as well - just awesome.

              I thought of something that would push the boundaries a bit more:
              What if we wanted to recreate the real parallax that happens when you rotate your head? The eye distance is constant, but it is not along the same axis as when we render out a single-shot pano:
              [Attached image: SNAG-0148.jpg]

              Now bear with my extreme foolishness, but what if:

              We could render out an animation of 200 or 2,000 frames (whatever we can afford) and load it into, let's say, Pdplayer.
              Pdplayer would then have a live link with the Oculus.
              Moving the head left or right would play the animation backwards or forwards according to the number of frames per angle of movement (roughly the mapping sketched below).
              Because it's all stored in RAM, it would be super fast and responsive.
              Obviously, the more rendered frames, the smoother the feeling would be.
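
              To make that mapping concrete, here is a purely hypothetical sketch: a yaw angle reported by the HMD is converted to a frame index in the pre-rendered sequence. Nothing here is a real Pdplayer or Oculus API; get_head_yaw() and show_frame() are placeholders for whatever the player would expose.

              # Hypothetical sketch: pick a pre-rendered frame from the head yaw angle.
              NUM_FRAMES = 2000                  # frames rendered along the head path
              YAW_RANGE = 180.0                  # degrees of head rotation covered

              def frame_for_yaw(yaw_degrees: float) -> int:
                  """Map the current yaw to the frame rendered nearest that angle."""
                  t = (yaw_degrees + YAW_RANGE / 2) / YAW_RANGE   # normalise to 0..1
                  t = min(max(t, 0.0), 1.0)                       # clamp outside the range
                  return int(t * (NUM_FRAMES - 1))

              # Playback loop (placeholders): frames sit in RAM, so lookup is instant.
              # while True:
              #     show_frame(frame_for_yaw(get_head_yaw()))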

              I guess that's something like what OTOY has been working on, but with real footage:
              http://home.otoy.com/otoy-demonstrat...apture-for-vr/

              It just makes sense that one position in space would not be enough to get the ultimate experience. OTOY is developing an internal tool to handle all that raw data, but could we not have something similar to play with? Pdplayer would be a great platform to display whatever comes out of V-Ray for that matter.
              My limited knowledge makes me think that a dual video pano would already do the trick, but maybe there is an even better solution out there?

              Vlado, any thoughts on this subject?

              Stan
              Last edited by 3LP; 13-05-2015, 08:28 AM.
              3LP Team



              • #8
                You're basically talking about their light field format. I think that's a whole new file format and camera. Trying to render individual frames and set up animation playback driven by position wouldn't work, because you're dealing with rotation and movement on all axes: you'd need to be able to switch sets of frames when you move forwards/backwards, up/down and left/right, as well as track rotation.
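
                As a purely illustrative sketch of why that explodes, suppose you pre-rendered panoramas on a small 3D grid of viewpoints and snapped the head position to the nearest one; the grid size, spacing and indexing below are made up, not any real format.

                # Hypothetical sketch: map a head position to a pre-rendered viewpoint on a 3D grid.
                GRID = (9, 9, 9)         # viewpoints along x, y, z
                SPACING = 0.05           # metres between neighbouring viewpoints

                def viewpoint_index(x: float, y: float, z: float) -> int:
                    """Snap a head offset (metres from the grid centre) to a panorama index."""
                    idx = []
                    for value, count in zip((x, y, z), GRID):
                        i = round(value / SPACING) + count // 2   # nearest grid point
                        idx.append(min(max(i, 0), count - 1))     # clamp to the grid
                    ix, iy, iz = idx
                    return (iz * GRID[1] + iy) * GRID[0] + ix

                # Even this coarse 9x9x9 grid needs 729 stereo panoramas before
                # rotation is handled - roughly the data problem light fields address.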

                I would absolutely love it if Vlado could make a light field V-Ray camera though... set the bounding box and have it automatically render as much data as it needs to a single file. OTOY's isn't even out yet though, and it's going to be cloud-only (rendering and viewing).



                • #9
                  Originally posted by Neilg
                  OTOY's isn't even out yet though, and it's going to be cloud-only (rendering and viewing).
                  Not excited about cloud-only software.
                  Brendan Coyle | www.brendancoyle.com



                  • #10
                    I'd actually like to see how that works in practice before spending effort on it. There are so many ways to show VR right now, based on all kinds of stuff (point clouds, deep images, light fields). Some of these will be unsuccessful and will die out, others will become standard. I've no idea which ones though...

                    Best regards,
                    Vlado
                    I only act like I know everything, Rogers.



                    • #11
                      Originally posted by vlado
                      I'd actually like to see how that works in practice before spending effort on it. There are so many ways to show VR right now, based on all kinds of stuff (point clouds, deep images, light fields). Some of these will be unsuccessful and will die out, others will become standard. I've no idea which ones though...

                      Best regards,
                      Vlado
                      We won't know until you implement them all for us to test.
                      Brendan Coyle | www.brendancoyle.com



                      • #12
                        Hehe yeah there is that....

                        Best regards,
                        Vlado
                        I only act like I know everything, Rogers.



                        • #13
                          Originally posted by vlado
                          I'd actually like to see how that works in practice before spending effort on it. There are so many ways to show VR right now, based on all kinds of stuff (point clouds, deep images, light fields). Some of these will be unsuccessful and will die out, others will become standard. I've no idea which ones though...

                          Best regards,
                          Vlado
                          Wise words indeed.
                          Interestingly, I have found that strip-rendered panoramas (converted for Gear VR or not) give a more robust stereo effect than OTOY's native output, which breaks down rapidly when rotating the HMD from side to side. I've yet to try the new V-Ray stereo output, but I still firmly stand by biased rendering in real-world production environments - eliminating noise and controlling render times are key ingredients of this. For my money, static 360 stereo panoramas, animated or still, are hard to beat, especially on mobile VR. Similarly for real-time: the old-school forward rendering path (in Unity) with baked lighting and other cheap techniques will rule for some time, as HMD resolutions and framerates climb way faster than GPUs can keep up. Time and time again I find that, despite moments of excitement such as light fields or Brigade-style path tracing, it is always an incremental process, without the 'magic' some may claim.
                          Last edited by deflix; 13-05-2015, 02:08 PM.
                          Immersive media - design and production
                          http://www.felixdodd.com/
                          https://www.linkedin.com/in/felixdodd/



                          • #14
                            I am curious to try different stereo shaders and will look at the V-Ray version tomorrow - I guess it's time to upgrade to 3.0. I'm also interested in this one - http://www.andrewhazelden.com/blog/2014/10/render-spherical-stereo-content-with-the-domemaster3d-v1-6-alpha/ - and have seen another made for Arnold. It is strange how we are revitalising this Victorian technology on the back of VR.
                            Immersive media - design and production
                            http://www.felixdodd.com/
                            https://www.linkedin.com/in/felixdodd/



                            • #15
                              That's just doing the same thing. The V-Ray one works with render elements though, so it means we can actually produce real, final images with it instead of raw renders.

                              Light fields do actually work really well though - it's a genuinely astonishing experience. Not sure how doing post work would work, unfortunately. At least with 360s we can do post in Fusion on a flat image and swap out all the channels for the 360 ones later.
                              Point clouds seem like an interesting approach - never thought of that. Same issue with post, though.

