It's still CPU-only but there is now some definite work in the pipeline to add GPU support. Unfortunately I don't know when this will be; DCP-o-matic is still a spare-time project and the amount of spare time is limited and quite variable.
Kind regards,
Carl
GPU based DCP encoding
-
- Posts: 2804
- Joined: Tue Apr 15, 2014 9:11 pm
- Location: Germany
Re: GPU based DCP encoding
@checcontr: Absolutely NO GPU encoding currently, and as Carl says, it may take a while until it arrives. Can you tell me the specs of your current setup - mainboard, CPU, amount of memory? And what do you encode from: compressed video files (e.g. MP4, ProRes, DNxHD) or still-image sequences (e.g. TIFF)?
- Carsten
-
- Posts: 13
- Joined: Sat Dec 30, 2017 7:12 pm
Re: GPU based DCP encoding
When GPU encoding arrives, what GPU will it be optimized for? I'm building a new machine at home and want to buy the correct hardware.
-
- Site Admin
- Posts: 2548
- Joined: Thu Nov 14, 2013 2:53 pm
Re: GPU based DCP encoding
At the moment, the most likely first "version" of GPU encoding will be for NVIDIA, on Windows/Linux only (not macOS), and not free.
-
- Posts: 13
- Joined: Sat Dec 30, 2017 7:12 pm
Re: GPU based DCP encoding
Thanks, Carl. I understand it won't be free; I'm more than happy to splurge for it.
-
- Posts: 20
- Joined: Mon Oct 14, 2019 3:48 am
- Location: Australia
Re: GPU based DCP encoding
I have delved into GPU encoding.
Talking to the main developer of Kakadu: the advantages of GPU encoding are minimal, because there are other bottlenecks involved. Getting the data in and out of GPU memory can cost more than just doing it all on the CPU (though with unified memory and PCIe 4.0 in the newer GPUs, that may be a different story now).
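The transfer-cost argument above can be sanity-checked with a rough back-of-envelope sketch. All of the frame-size and bandwidth figures below are my own ballpark assumptions, not measurements from any real encoder:

```python
# Rough estimate of PCIe transfer time per frame vs. the 24 fps frame budget.
# All figures are assumptions for illustration, not measurements.

FRAME_BYTES = 4096 * 2160 * 3 * 2   # 4K frame, 3 channels, 12-bit stored in 16-bit words
PCIE3_X16_BPS = 16e9                # ~16 GB/s practical PCIe 3.0 x16 bandwidth
PCIE4_X16_BPS = 32e9                # ~32 GB/s for PCIe 4.0 x16

def transfer_ms(nbytes: float, bandwidth_bps: float) -> float:
    """Time in milliseconds to move nbytes one way across the bus."""
    return nbytes / bandwidth_bps * 1000

# Round trip: raw frame in, compressed codestream (~1 MB at DCI bitrates) out.
rt3 = transfer_ms(FRAME_BYTES, PCIE3_X16_BPS) + transfer_ms(1e6, PCIE3_X16_BPS)
rt4 = transfer_ms(FRAME_BYTES, PCIE4_X16_BPS) + transfer_ms(1e6, PCIE4_X16_BPS)

print(f"PCIe 3.0 round trip: {rt3:.2f} ms per frame")
print(f"PCIe 4.0 round trip: {rt4:.2f} ms per frame")
# At 24 fps the frame budget is ~41.7 ms, so raw bus bandwidth alone is not
# the limit; the real costs are synchronisation and keeping the GPU fed.
```

Under these assumptions the bus itself is only a few milliseconds per frame, which is consistent with the point that the bottleneck is the overall data-movement pipeline rather than bandwidth as such.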
They are more focused on general CPU encoding, as it's more universal and can be applied to any platform. And in the world of cloud computing, you can just add typical VMs to a cloud compute solution to achieve whatever throughput you want - no custom servers or CPU/GPU instances.
Just optimizing J2K for current CPUs would be 5-10 times faster than OpenJPEG.
I would suggest looking at licensing Kakadu, as I know the guys and they are incredibly understanding about this market.
This thread is many years old and nothing has happened, so: would you pay USD $70 for an add-on that makes it render 5-10 times faster?
I would.
The licensing must be reasonable, as Kakadu comes with Resolve now, and that's cheap for what it does.
-
- Posts: 2804
- Joined: Tue Apr 15, 2014 9:11 pm
- Location: Germany
Re: GPU based DCP encoding
We already see bottlenecks with very fast CPU setups, e.g. in content examination, audio analysis and hashing. On a very fast multicore CPU the parallel J2K encode itself finishes quickly, but the post-processing/hashing is so far single-threaded and can add considerable time to the overall encode. I don't know if there are ways to speed this up other than segmenting into multiple reels. But we have to wait for Carl to finish his work on the GPU encoder to see how important that will become, and then I guess Carl needs to take a look at these bottlenecks. Using fast M.2 SSDs will probably be one solution, as some of these operations are I/O-bound.
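For anyone wondering why the hashing step can't simply be parallelised like the J2K encode: DCP packing lists carry a base64-encoded SHA-1 digest per asset, and a digest over one file is inherently sequential. A minimal generic sketch (this is an illustration of the principle, not DCP-o-matic's actual code):

```python
# Sketch of the per-asset hashing step: DCP packing lists store a
# base64-encoded SHA-1 digest for each asset file. Generic illustration only.
import base64
import hashlib

def asset_hash(data_chunks) -> str:
    """SHA-1 over an asset streamed in chunks, base64-encoded as in a PKL."""
    h = hashlib.sha1()
    for chunk in data_chunks:
        h.update(chunk)   # each update depends on the previous state,
                          # so one file's hash cannot be split across cores
    return base64.b64encode(h.digest()).decode("ascii")

print(asset_hash([b"example ", b"asset ", b"data"]))
```

A few-hundred-GB MXF read at even 500 MB/s still takes on the order of ten minutes to hash, which is why fast storage (or hashing while writing) matters here; across multiple assets, hashing could run one thread per file, but within a file it stays serial.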
-
- Posts: 6
- Joined: Wed Feb 17, 2021 3:59 am
Re: GPU based DCP encoding
I too am looking for ways to speed up this process. I run a small art house in Colorado, and tying up my PC for 13-33 hours PER movie is rough. They also take up massive amounts of space once converted, so it's tough to hold many of them even on large external storage systems. The 33-hour one was Dune (Ext) [1984], which clocks in at 3 hours. Here's what I'm figuring, based on what I read in this thread, to get maximum speed out of the process right now.
My biggest issue is the time. I L-O-V-E this software and have no thought of using any other. With it taking so long, I can't do auditorium rentals 'spur of the moment' unless they want to choose a movie we already have on DCP that I haven't deleted due to space constraints. On that note, I have a Synology DS920+ with (2) 4TB WD Red NAS drives in RAID 1 for my "big storage". I will be adding dual 12TB drives soon, for 20TB in total. These take too long to convert to risk a drive failing and losing the 17-22 DCPs you can fit in 4TB.
In the short term, I'm thinking multiple old PCs is the best option. We've been in business almost 27 years and have TONS of older PCs just lying around that we don't use any more. I'm positive I can build at least 2-3 by frankensteining them from all the rest. If we can't really do "faster" yet, maybe we should just look at "more simultaneously" instead.
- Fastest single-core CPU available.
- M.2 for Windows & DCP-o-matic.
- SSD to compile the DCP to.
- [later on] A high-end video card to speed up the calculations.
- OR: multiple older PCs that we have no use for, all converting movies to DCP, to double/triple/quadruple/etc. how many you can make at once.
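The "more simultaneously" idea above amounts to spreading the backlog of features round-robin across the available machines. A tiny sketch of that scheduling (machine and movie names are made up; note that DCP-o-matic can also farm J2K frames out to encode servers on the LAN, which is a different approach - many machines on one job rather than one job per machine):

```python
# Round-robin a backlog of features across several standalone encode PCs.
# Hypothetical names for illustration only.

def assign_jobs(movies, machines):
    """Map each machine to the list of movies it should encode."""
    plan = {m: [] for m in machines}
    for i, movie in enumerate(movies):
        plan[machines[i % len(machines)]].append(movie)
    return plan

plan = assign_jobs(
    ["Dune (Ext)", "Feature B", "Feature C", "Feature D", "Feature E"],
    ["old-pc-1", "old-pc-2", "old-pc-3"],
)
for machine, queue in plan.items():
    print(machine, "->", queue)
```

With three frankensteined PCs each taking ~24 hours per feature, throughput triples even though per-movie latency stays the same - which is exactly the trade-off the post describes.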
-
- Posts: 2804
- Joined: Tue Apr 15, 2014 9:11 pm
- Location: Germany
Re: GPU based DCP encoding
You need as many cores as possible to speed up the conversion. Even quite old CPUs with many cores can be faster than modern, higher-clocked CPUs with fewer cores.
It is possible to create a feature DCP at around real-time, that is, 3 hrs of conversion time for a 3-hour feature, at modest cost.
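To put "around real-time" into numbers, assuming the usual 24 fps feature rate (the per-frame CPU cost in the comment is a rough ballpark of my own, not a benchmark figure):

```python
# What "around real-time" means in throughput terms, assuming 24 fps.

FPS = 24
FEATURE_SECONDS = 3 * 3600      # the 3-hour feature mentioned above

frames = FPS * FEATURE_SECONDS  # total frames that must be J2K-encoded
print(frames)                   # 259200 frames

# Real-time therefore means sustaining ~24 encoded frames per second.
# If one core encodes very roughly one 2K frame per second, that is on
# the order of 24 busy cores - hence "as many cores as possible".
print(frames // (FPS * 3600), "hours at exactly real-time")
```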
https://dcpomatic.com/benchmarks/input.php?id=1
- Carsten