Well, it's a toss-up that you have to decide. From what I see, GPUs normally run hot: my air-cooled GPU sits near 90C under load, but my CPU on air cooling would not go above 60C. A well-cooled CPU will auto-overclock (boost) more, so you may want to get the most out of that, knowing that the GPU will run...
The bigger problem we face with PC cooling is that the colder the components the better, but there will always be a gap to ambient. The same radiator rejects far more heat with 90C coolant than with 50C coolant, because heat rejection scales with the coolant-to-ambient difference, and 90-25 is a much bigger delta than 50-25. I find that on my CPU...
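To put rough numbers on that, here's a quick Python sketch of that scaling (Q = UA × ΔT, Newton's-law-of-cooling style); the UA conductance value is purely made up for illustration, not from any real radiator:

```python
# Minimal sketch: radiator heat rejection scales with the coolant-to-ambient
# temperature difference, Q = UA * dT. The UA value is an assumed
# illustrative number, not a measured one.

UA = 20.0  # overall conductance in W/K -- assumed for illustration

def heat_rejected(coolant_c: float, ambient_c: float) -> float:
    """Steady-state heat rejection in watts for a given coolant/ambient pair."""
    return UA * (coolant_c - ambient_c)

print(heat_rejected(90, 25))  # 1300 W: 65 K of headroom to ambient
print(heat_rejected(50, 25))  # 500 W: only 25 K of headroom
```

Same hardware, same ambient, but the hotter loop dumps 2.6x the heat, which is exactly why chasing very low coolant temperatures costs so much radiator area.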
Try telling my employer that; they only make radiators. I can assure you that when you have an engine rejecting 100 kW of heat that must not go over a certain temperature, with radiators of 2 square metres and 200 mm deep, you don't just guess. The thermal physics is fairly well understood and not that...
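For what it's worth, here's the kind of back-of-envelope check that sizing starts from, via the air-side energy balance Q = m_dot × cp × ΔT. Only the 100 kW load and 2 m² frontal area come from the post; the air temperature rise and density are my own assumed figures:

```python
# Back-of-envelope air-side check for the engine radiator numbers above.
# Only Q and the frontal area come from the post; the rest are assumptions.

Q = 100_000.0     # heat to reject, W (from the post)
cp_air = 1005.0   # specific heat of air, J/(kg*K)
rho_air = 1.2     # air density, kg/m^3
dT_air = 35.0     # assumed air temperature rise across the core, K
area = 2.0        # radiator frontal area, m^2 (from the post)

m_dot = Q / (cp_air * dT_air)        # required air mass flow, kg/s
velocity = m_dot / (rho_air * area)  # face velocity through the core, m/s

print(f"air mass flow: {m_dot:.2f} kg/s")    # ~2.84 kg/s
print(f"face velocity: {velocity:.2f} m/s")  # ~1.18 m/s
```

A face velocity around 1 m/s is easily achievable, which is the point: the numbers are calculated, not guessed.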
You're confusing heat and temperature. Heat is energy, and energy out is broadly energy in. My RX 5500 XT draws 120W, my CPU 65W, so my GPU makes more heat. What temperature you can cool a part down to is a different matter. If you have two identical GPUs or CPUs and run them on an identical load and then...
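If it helps, here's a tiny sketch of the distinction using the standard steady-state relation T = T_ambient + P × R_th: power sets how much heat flows out, while the cooler's thermal resistance sets how hot the part sits above ambient. The R_th values below are assumed, not measured:

```python
# Minimal sketch of the heat-vs-temperature point: power (W) sets how much
# heat leaves the part, while the cooler's thermal resistance (K/W) sets
# how far above ambient it runs. R_th values assumed for illustration.

ambient_c = 25.0

def steady_temp(power_w: float, r_th_k_per_w: float) -> float:
    """Steady-state die temperature: T = T_ambient + P * R_th."""
    return ambient_c + power_w * r_th_k_per_w

# Both parts dump all their watts into the room either way, but the 120 W
# part runs hotter on the same assumed 0.5 K/W cooler.
print(steady_temp(120, 0.5))  # 85.0 C for the 120 W GPU
print(steady_temp(65, 0.5))   # 57.5 C for the 65 W CPU
```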
GPUs make more heat. My RX 5500 XT uses up to 120W when running Folding@home, my CPU is 65W, and CPUs rarely go over 100W. I think GPUs are allowed to get hotter, so you want to send coolant to the CPU first, as its speed is more temperature-dependent, and then to the GPU...
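One way to put numbers on how much loop order actually matters: the coolant temperature rise across a block is ΔT = P / (m_dot × cp). The flow rate in this sketch is an assumed typical value, not from any specific pump:

```python
# Quick sketch of how much the coolant warms up crossing one block in a
# loop: dT = P / (m_dot * cp). The flow rate is an assumed typical value.

cp_water = 4186.0         # specific heat of water, J/(kg*K)
flow_lpm = 1.0            # assumed pump flow, litres per minute
m_dot = flow_lpm / 60.0   # mass flow in kg/s (1 L of water ~ 1 kg)

def coolant_rise(power_w: float) -> float:
    """Temperature rise of the coolant after absorbing power_w watts."""
    return power_w / (m_dot * cp_water)

print(f"after a 120 W GPU: +{coolant_rise(120):.2f} K")  # ~1.72 K
print(f"after a 65 W CPU:  +{coolant_rise(65):.2f} K")   # ~0.93 K
```

At a typical flow rate the coolant only warms a degree or two across each block, so the ordering buys the first component a small advantage rather than a dramatic one.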