Denoise Benchmarks


  • Denoise Benchmarks

    Hey there,

    Since the upcoming denoiser that we are working on will be hardware accelerated, we decided to do a few benchmarks using different hardware, and I thought you might find them interesting.
    Here are the results with the current implementation.
    [Attachment: denoiser_bench.png (denoiser benchmark results, 183.2 KB)]
    As a rough conclusion ...
    * Intel integrated GPUs denoise about 1.5 to 2.0 times faster than an average Intel i7 CPU. The integrated Intel GPU (which comes with most CPUs basically for free) gives roughly the performance of a $1500 Xeon CPU.
    * AMD GPUs are about 3 to 5 times faster than NVIDIA GPUs.
    * Having a high-end GPU in your system gives a 10 to 30 times speed-up. Having any GPU at all still makes it a lot faster.
    * Intel Xeon Phi works for denoising as well, if you happen to have one of those around.
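
    For reference, a run like this targets whatever OpenCL-capable hardware is present in the machine: CPUs, integrated and discrete GPUs, and accelerators such as the Xeon Phi. Below is a minimal, illustrative C++ sketch of how one could enumerate those devices; it is not the actual V-Ray denoiser code.
    Code:
    // List every OpenCL platform/device a denoise benchmark could target.
    // Illustrative only.
    #include <CL/cl.h>
    #include <cstdio>
    #include <vector>

    int main() {
        cl_uint numPlatforms = 0;
        clGetPlatformIDs(0, nullptr, &numPlatforms);
        std::vector<cl_platform_id> platforms(numPlatforms);
        clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

        for (cl_platform_id p : platforms) {
            char platName[256] = {};
            clGetPlatformInfo(p, CL_PLATFORM_NAME, sizeof(platName), platName, nullptr);

            cl_uint numDevices = 0;
            if (clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, 0, nullptr, &numDevices) != CL_SUCCESS)
                continue;
            std::vector<cl_device_id> devices(numDevices);
            clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, numDevices, devices.data(), nullptr);

            for (cl_device_id d : devices) {
                char devName[256] = {};
                cl_device_type type = 0;
                clGetDeviceInfo(d, CL_DEVICE_NAME, sizeof(devName), devName, nullptr);
                clGetDeviceInfo(d, CL_DEVICE_TYPE, sizeof(type), &type, nullptr);
                const char* kind = (type & CL_DEVICE_TYPE_GPU) ? "GPU"
                                 : (type & CL_DEVICE_TYPE_CPU) ? "CPU"
                                 : "Accelerator"; // e.g. Xeon Phi
                printf("%s | %s | %s\n", platName, kind, devName);
            }
        }
        return 0;
    }
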
    V-Ray fan.
    Looking busy around GPUs ...
    RTX ON

  • #2
    Interesting results. Any ideas why the AMD cards are so fast?

    • #3
      Does the hardware acceleration take advantage of multiple GPUs?
      "I have not failed. I've just found 10,000 ways that won't work."
      Thomas A. Edison

      • #4
        Thanks for that! Interesting on the AMD GPUs. Wish that was the case with V-Ray RT also.

        • #5
          Originally posted by eyepiz View Post
          Does the hardware acceleration take advantage of multiple GPUs?
          Not yet, but eventually we will get to that. There are also some more optimizations that we will have to try at some point.

          Originally posted by Donfarese View Post
          Thanks for that! Interesting on the AMD GPUs. Wish that was the case with V-Ray RT also.
          Originally posted by Companioncube View Post
          Interesting results. Any ideas why the AMD cards are so fast?
          I guess it's just that different hardware is better suited to different tasks.
          V-Ray fan.
          Looking busy around GPUs ...
          RTX ON

          • #6
            Too bad that when rendering over the network all the render slaves have to have a GPU... In our case most of them don't, and if they do, they're pretty basic.

            • #7
              Originally posted by Moriah View Post
              Too bad that when rendering over the network all the render slaves have to have a GPU... In our case most of them don't, and if they do, they're pretty basic.
              You don't need to have GPUs in your render slaves.
              The image is denoised only on the client, so if you render in DR, only the client hardware matters for the denoiser.
              If you render different frames on different nodes, it is okay as well. We automatically find the most appropriate hardware for the denoising. If some of the nodes don't have GPUs, they can still denoise just as well using their CPUs. This happens automatically. And you can still disable hardware acceleration entirely if you need that for some reason (there is a checkbox for that in the denoiser settings).
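
              As a rough sketch of that kind of selection (assuming an OpenCL backend; not necessarily how the actual V-Ray code does it), picking the "most appropriate hardware" could look like this: prefer a discrete GPU, then an integrated GPU or an accelerator, and return nothing when acceleration is disabled so the regular CPU path runs instead.
              Code:
              // Sketch: pick the most suitable OpenCL device for denoising,
              // or none (meaning: fall back to the plain CPU code path).
              // Illustrative only.
              #include <CL/cl.h>
              #include <vector>

              cl_device_id pickDenoiseDevice(bool useHardwareAcceleration) {
                  if (!useHardwareAcceleration)
                      return nullptr; // the checkbox in the denoiser settings

                  cl_device_id best = nullptr;
                  int bestScore = 0;

                  cl_uint numPlatforms = 0;
                  clGetPlatformIDs(0, nullptr, &numPlatforms);
                  std::vector<cl_platform_id> platforms(numPlatforms);
                  clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

                  for (cl_platform_id p : platforms) {
                      cl_uint numDevices = 0;
                      if (clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, 0, nullptr, &numDevices) != CL_SUCCESS)
                          continue;
                      std::vector<cl_device_id> devices(numDevices);
                      clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, numDevices, devices.data(), nullptr);

                      for (cl_device_id d : devices) {
                          cl_device_type type = 0;
                          cl_bool unified = CL_FALSE; // integrated GPUs share host memory
                          clGetDeviceInfo(d, CL_DEVICE_TYPE, sizeof(type), &type, nullptr);
                          clGetDeviceInfo(d, CL_DEVICE_HOST_UNIFIED_MEMORY, sizeof(unified), &unified, nullptr);

                          int score = 1;                                     // any OpenCL device beats nothing
                          if (type & CL_DEVICE_TYPE_ACCELERATOR) score = 2;  // e.g. Xeon Phi
                          if (type & CL_DEVICE_TYPE_GPU) score = unified ? 3 : 4;
                          if (score > bestScore) { bestScore = score; best = d; }
                      }
                  }
                  return best; // nullptr -> caller uses the regular CPU denoiser
              }
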

              Best,
              Blago.
              V-Ray fan.
              Looking busy around GPUs ...
              RTX ON

              • #8
                Originally posted by Moriah View Post
                Too bad that when rendering over the network all the render slaves have to have a GPU...
                How did you come to this (wrong) conclusion? The benchmark table specifically lists a couple of CPU results...

                Best regards,
                Vlado
                Last edited by vlado; 20-05-2016, 05:16 AM.
                I only act like I know everything, Rogers.

                • #9
                  Originally posted by savage309 View Post
                  Hey there,

                  Since the upcoming denoiser that we are working on will be hardware accelerated, we decided to do a few benchmarks using different hardware, and I thought you might find them interesting.
                  Here are the results with the current implementation.
                  [Attachment: denoiser_bench.png]
                  As a rough conclusion ...
                  * Intel integrated GPUs denoise about 1.5 to 2.0 times faster than an average Intel i7 CPU. The integrated Intel GPU (which comes with most CPUs basically for free) gives roughly the performance of a $1500 Xeon CPU.
                  * AMD GPUs are about 3 to 5 times faster than NVIDIA GPUs.
                  * Having a high-end GPU in your system gives a 10 to 30 times speed-up. Having any GPU at all still makes it a lot faster.
                  * Intel Xeon Phi works for denoising as well, if you happen to have one of those around.
                  Just to clarify, this means that even with a regular V-Ray Advanced CPU rendering (no RT), the GPU is engaged only for the denoising stage? If so, wow!
                  www.dpict3d.com - "That's a very nice rendering, Dave. I think you've improved a great deal." - HAL9000... At least I have one fan.

                  • #10
                    Originally posted by dlparisi View Post
                    Just to clarify, this means that even with a regular V-Ray Advanced CPU rendering (no RT), the GPU is engaged only for the denoising stage? If so, wow!
                    Yes, exactly... if there is a usable GPU; if not, we have regular CPU code that gets the job done, just somewhat slower.

                    Best regards,
                    Vlado
                    I only act like I know everything, Rogers.

                    • #11
                      Interesting that on the CPU, OpenCL is faster than C++.

                      • #12
                        Originally posted by savage309 View Post
                        You don't need to have GPUs in your render slaves.
                        The image is denoised only on the client, so if you render in DR, only the client hardware matters for the denoiser.
                        If you render different frames on different nodes, it is okay as well. We automatically find the most appropriate hardware for the denoising. If some of the nodes don't have GPUs, they can still denoise just as well using their CPUs. This happens automatically. And you can still disable hardware acceleration entirely if you need that for some reason (there is a checkbox for that in the denoiser settings).

                        Best,
                        Blago.
                        Nice to know it chooses the best available option!

                        Originally posted by vlado View Post
                        How did you come to this (wrong) conclusion? The benchmark table specifically lists a couple of CPU results...

                        Best regards,
                        Vlado
                        Yes, I just presumed it was going to work in a different way, my bad.

                        • #13
                          Originally posted by super gnu View Post
                          Interesting that on the CPU, OpenCL is faster than C++.
                          Yes, but only slightly, and keeping in mind that we didn't really spend a lot of time optimizing the C++ code (it's not that complicated anyway), it's actually not bad.
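
                          One plausible reason (an assumption, not something stated above): an OpenCL CPU driver spreads the same kernel across all cores and auto-vectorizes it, while a straightforward C++ loop has to be threaded and vectorized by hand. The sketch below shows the kind of multithreaded loop such a runtime effectively produces; the filter is a hypothetical stand-in for the real denoising math.
                          Code:
                          // Illustrative only: parallel row-banded filtering, roughly what an
                          // OpenCL CPU runtime does automatically for a per-pixel kernel.
                          #include <algorithm>
                          #include <thread>
                          #include <vector>

                          // Hypothetical 3x3 box filter standing in for the real denoiser math.
                          static float filterPixel(const float* img, int w, int x, int y) {
                              float sum = 0.0f;
                              for (int dy = -1; dy <= 1; ++dy)
                                  for (int dx = -1; dx <= 1; ++dx)
                                      sum += img[(y + dy) * w + (x + dx)];
                              return sum * (1.0f / 9.0f);
                          }

                          // Each thread takes every n-th row; the simple inner x-loop is easy
                          // for the compiler to auto-vectorize.
                          void denoiseParallel(const float* src, float* dst, int w, int h) {
                              unsigned n = std::max(1u, std::thread::hardware_concurrency());
                              std::vector<std::thread> workers;
                              for (unsigned t = 0; t < n; ++t) {
                                  workers.emplace_back([=] {
                                      for (int y = 1 + (int)t; y < h - 1; y += (int)n)
                                          for (int x = 1; x < w - 1; ++x)
                                              dst[y * w + x] = filterPixel(src, w, x, y);
                                  });
                              }
                              for (auto& th : workers) th.join();
                          }
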

                          Best regards,
                          Vlado
                          I only act like I know everything, Rogers.

                          • #14
                            Will the AMD GPUs always be faster for this kind of job? Seems odd, although I have no idea what's going on behind the scenes.
                            A.

                            ---------------------
                            www.digitaltwins.be

                            • #15
                              Originally posted by savage309 View Post
                              * Intel integrated GPUs denoise about 1.5 to 2.0 times faster than an average Intel i7 CPU. The integrated Intel GPU (which comes with most CPUs basically for free) gives roughly the performance of a $1500 Xeon CPU.
                              Err, lolzor?
                              Lele
                              Trouble Stirrer in RnD @ Chaos
                              ----------------------
                              emanuele.lecchi@chaos.com

                              Disclaimer:
                              The views and opinions expressed here are my own and do not represent those of Chaos Group, unless otherwise stated.
