Thursday, April 12, 2012

GPU (Video Card) Overheating


Hi all,

I want to discuss the graphics card strain that this game causes.
First of all, if you monitor the game with a diagnostic tool (I use RivaTuner, with constant monitoring on an external display), you will notice that the frame rate produced by the game can exceed your screen's refresh rate (i.e. with a 60 Hz monitor there is no point in going above 60 fps, yet the game does, reaching higher values for nothing).

The consequence of this behavior is extra heat generated to compute frames that will never be displayed. The "solution" for this is enabling V-Sync, which caps the generated frames at the refresh rate of the screen (so with a 60 Hz monitor the game will not exceed 60 fps). V-Sync has its defects: it introduces input latency and stutter (you can notice it when panning the camera around the map: the game feels laggy even when the fps counter looks good).

In some cases V-Sync isn't even a good solution for overheating. For example, gaming laptops always have poor cooling compared to desktop computers, so GPU overheating is a common problem there. To fix this problem definitively, the game developers would have to add an option to cap the maximum frame rate at a fixed value: that way the GPU can idle once it reaches the set frame rate (a common practice is capping at 30 fps, a good compromise between fluidity and heat generated!).
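Just to show how small this feature is, here is a minimal sketch of a fixed-FPS render loop in C++ (the function names and the 30 fps figure are only illustrative; nothing here comes from the game's actual code):

#include <chrono>
#include <thread>

// Hypothetical stand-in for one simulation step plus one rendered frame.
static void updateAndRenderOneFrame() { /* game logic + draw calls would go here */ }

// Minimal frame cap: after each frame, sleep away whatever is left of the frame budget,
// so the GPU idles instead of racing past the monitor's refresh rate.
static void runCappedLoop(int targetFps)   // e.g. targetFps = 30
{
    const auto frameBudget = std::chrono::microseconds(1000000 / targetFps);
    auto nextDeadline = std::chrono::steady_clock::now();

    for (;;)
    {
        updateAndRenderOneFrame();
        nextDeadline += frameBudget;
        std::this_thread::sleep_until(nextDeadline);
    }
}

A sleep-based cap like this is deliberately coarse; the point is only that the GPU stops being fed frames it will never display.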

To achieve this, there is a tool called FPSLimiter, but it can't be used with games that start through a launcher (which recreates a new video context), like AOEO. So this tool doesn't help here.

So the only hope is that the staff introduces this feature (for example, under the 'Graphic Options' settings menu). It would save energy on desktops and make the game playable on notebooks without compromising the health and longevity of the hardware.

How can I propose this quick-to-implement feature to the staff?

Thanks in advance for reading.

|||

I don't know if they will take the time to fix this, because when a GPU maxes out it should be able to keep itself cool.

I would recommend (for advanced users) getting a GPU cooler or another case fan.

Also, I don't know a lot about FPSLimiter, but all the launcher does is check that the files are up to date and then launch Spartan.exe. Couldn't you point FPSLimiter at that to reduce the frame rate? Though as I said, I don't know much about the program.

|||

It's not only a matter of heat: it's also a matter of good general resource management. If you can measure the wattage of your desktop, you may be surprised by how much extra energy this FPS overshoot drains (take any game with a frame-cap option and try it!). There are plenty of articles about this on the net. Right now, AOEO is responsible for a huge amount of energy wasted across all its users.

FPSLimiter can launch Spartan.exe. The problem is that Spartan closes, saying it must be started from the launcher, even when I pass the same command line parameters (LauncherLang and LauncherLocale). Evidently the launcher sets up some external environment needed for the correct initialization of Spartan.exe.

Thanks btw for the interest :)

|||

I do agree with the majority of what you're saying from a consumer point of view. The thing is, I have access to almost every computer and electronics developer group, and I know some technologies are going beyond what we have now. For example, second-generation 3D: where you would normally have a frame rate of, say, 60 fps, it will run at 120 fps because it displays two side-by-side images.

Knowing this, frame limiters will not work, or at the very least will become a major headache. It will not be long before we don't even have desktops and laptops: our cell phones will connect wirelessly to any monitor, TV, keyboard and mouse, and our "computers" will be our phones. Tablets are the crossover right now, but within a couple of years they will be outdated by this new technology.

So knowing what is going to happen within the next few years, I'm sure you can see the developers are not going to invest time and money into a new technology that is already outdated before it even starts.

Great suggestion tho.

Bill

|||

Are you configured for Crossfire / SLI? Are you overclocking your machine? The issue is clearly a result of people with high-performance machines being unable to limit the framerate, which results in their machines overworking.


|||

My machine is a laptop from a line with notoriously poor air cooling: the Acer Aspire 8930G. It has no SLI/CF, and it's not overclocked (rather, it's underclocked by the stock BIOS).

By the way, the point here is not the problem I'm experiencing as a user of this game, but the way this game (and most others) uses graphics resources: it is greedy with pointless computation. If you have a 60 Hz monitor (with no 3D technology active) you can effectively see at most 60 fps. Any frame generated above 60 per second will never be displayed by the monitor, so what's the point of computing it? It's only a waste of energy. V-Sync partially addresses this, but it introduces side effects that worsen the gameplay experience because of the extra buffering involved. So the solution is capping the frame rate at the maximum refresh rate of the screen (and it's a really quick feature to implement!).

Now, let's think about everyone who gets overheating problems. These people (and I'm one of them) can usually try to fix the problem by disabling graphics effects and options. But in this game that is not a solution: lower graphics settings just produce higher frame rates (which are useless most of the time), so the graphics card still works at full load all the time!

The only solution I've found over the years (along with a lot of other people) is capping the frame rate at a fixed value that preserves fluidity without excessive work for the GPU (30 fps is generally enough). Some tools were developed by the community to fill this gap left by the developers, but even there, limitations remain (e.g. the game must not recreate the graphics context, but inherit the one created by the tool), which prevent a universal application of this solution.

In every environment (not only overheating notebooks) it would be a good feature, if only for the energy not wasted.

I've seen, as a developer, that in D3D and OGL the maximum frame rate can be controlled dynamically, so why not add a new advanced setting to the graphics menu (as some game developers have already done) that lets you set an FPS upper bound?
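For the OpenGL case, for instance, the swap interval can be changed on the fly through the WGL_EXT_swap_control extension; an interval of 2 on a 60 Hz screen already behaves like a 30 fps cap. A minimal sketch follows (Windows-specific, error handling omitted, and it assumes a GL context is already current; a cap that is not a divisor of the refresh rate would still need a timing loop like the one sketched earlier in the thread):

#include <windows.h>

// wglSwapIntervalEXT is exposed by the WGL_EXT_swap_control extension.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

// interval: 0 = uncapped, 1 = one frame per vblank (classic V-Sync),
//           2 = one frame every second vblank (~30 fps on a 60 Hz monitor).
static bool setSwapInterval(int interval)
{
    PFNWGLSWAPINTERVALEXTPROC pSwapInterval =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    return pSwapInterval && pSwapInterval(interval) == TRUE;
}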

Everyone would benefit, and it's a feature that could be shipped quickly.

Thanks in advance.

|||

@bill

First of all, sorry for the double post, but I can't figure out how to make a double reply on this board :/

What you say is true, and your analysis of the future is good, but I want to be sure you have not misunderstood me.
I'm not talking about some exotic configuration involving today's top technology (as stereoscopic 3D is for us), but about a common problem affecting every system that runs this game, from desktops to notebooks, in every configuration that meets the minimum requirements.

The feature I'm suggesting is simple but effective; you can test it yourself even with a watt meter on your power adapter. :) So... why not add it? :)

|||

Screen tearing is a problem on high-resolution monitors; V-Sync should take the tearing away, but in this game it doesn't. I only see it in the capital city, which is minor. There is nothing you can do if your machine can't cool itself. Laptops run hot just running Windows in general, and when you add intensive GFX programs the heat builds up.

|||

Hmm, if you run at a high resolution, that may be what causes your video card to overheat. That's why I run every game at 1024x768; it keeps my video card at an optimal temperature, around 60°C.

|||

Seems that every time this issue comes up, the game is being run on a laptop. All the points made in this thread make sense, but I think it really boils down (no pun intended) to cooling inefficiency. For reference, I'm on a reasonably high-end machine and my GPU is a 560 Ti. Even on maximum settings, this game only draws ~30% of the GPU on average.

|||

Wait, I'm not making a thread about laptop gaming problems or anything like that: if you scroll through this forum, there are a lot of people reporting heat-related problems even on desktops.

I've just proposed a universally applicable, low-cost solution: a quick-to-implement, easy-to-maintain, trivial feature that would solve these problems in most cases.

I also know that if a laptop runs hot, the fault lies with the hardware manufacturer, not the software producer. I'm only saying that this game is wasting energy, independently of the hardware platform it runs on.

|||

I've tried: at higher resolutions (720p to 1080p) it runs hot with a poor frame rate, and at lower resolutions (between 1024x768 and 1366x768) it runs hot too because of the higher frame rate (over 100, a waste on a 60 Hz monitor). Even disabling effects doesn't help, because of the extra FPS that results.

I've had this problem in a lot of other games, and I've always solved it with frame capping, but this game lacks that feature and even external tools can't force it.

|||

My 9700M GT is also at around 50% usage when it reaches 98 degrees. It's symptomatic of bad cooling efficiency (we all know this, and so does Acer, which has totally redesigned the cooling system on its newer lines).

But try measuring your wattage with and without a frame limit, in any game that allows one. Your GPU will be less stressed, less power hungry and cooler, without compromising game fluidity and responsiveness.

so... why not?

|||

It's funny, because someone else and I recently had this discussion in private, and within days multiple cases related to the issue have popped up...

I'm going to comment about desktops, not laptops. :)

We were talking about how all these new systems come seriously underpowered. The problem is, under normal circumstances the average user will never know their computer is underpowered. Then you get gamers, and this is where the issue kicks in. It's a simple theory: more resources require more power, and more power means more heat.

The problem is, as the power supply maxes out, the CPU and GPU have to work harder to reach optimum performance. This causes even more heat when a component has to work at 110% capacity to obtain 100%. I'm NOT ruling out that the game is a pig on resources. It just does not help the situation.

When manufacturers design computers now, they seem to be cutting back on essentials such as the power supply. Years ago, one of the main specs people saw (and asked about) was the power supply; they used to list the manufacturer and wattage. These days you have to dig to find out the specs. Some computers don't even mark the maker or wattage on the power supply.

When the power supply is sized to the minimum requirements of the base computer, adding new hard drives, DVD burners, or upgraded sound and/or video requires more power. The problem is, even when the manufacturer upgrades the components, they don't upgrade the power supply or offer that option at purchase.

It seems they're more worried about maintaining an Energy Star or Leet rating for energy conservation than about performance.

Someone was asking me by email what the easiest way is to tell if your computer is underpowered without an electronic engineering or mathematics degree. I would say run a benchmark test at optimal settings for at least 1 hour. Monitor the core temperature and compare it to what the manufacturer says the optimal values and ranges are. If you're over the range during a benchmark test, then your power supply is underpowered, causing the CPU and GPU to work harder and longer.
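For what it's worth, here is a minimal temperature-logging sketch along those lines, assuming an NVIDIA card and the NVML library (the one-hour duration and one-minute interval just mirror the suggestion above; other vendors would need a different API):

#include <nvml.h>
#include <windows.h>
#include <cstdio>

int main()
{
    if (nvmlInit() != NVML_SUCCESS)
        return 1;

    nvmlDevice_t gpu;
    nvmlDeviceGetHandleByIndex(0, &gpu);            // first GPU in the system

    for (int minute = 0; minute < 60; ++minute)     // log for one hour while the benchmark runs
    {
        unsigned int tempC = 0;
        nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU, &tempC);
        std::printf("minute %d: core temperature %u C\n", minute, tempC);
        Sleep(60 * 1000);
    }

    nvmlShutdown();
    return 0;
}

Compare the logged values against the range the manufacturer publishes for the card; a sustained reading above it points to the cooling or power problem described above.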

The strange part with a lot of these overheating issues is the cards. People with low-end cards are not getting overheating issues; those with high-end graphics cards are. You would think it would be the opposite, but it's not: a low-end card should be working harder to keep up, causing more heat. This is why I'm thinking the GPU is pulling too much of a drain from the power supply and working harder than it should.

It's a theory that should be investigated further.

Bill

|||

These days a good power supply is required for high-end systems. The days when you could rely on a cheap 800 W PSU are gone, because components now require high and, above all, stable amperage.

And when building a new computer, it's necessary to estimate the system's full-load power draw, adding a margin for capacitor aging (every PSU loses capacity as its capacitors age, providing less power over time).

Even hardware producers aren't doing a good job here, publishing references for their products with incorrect TDP specifications (for example, the Intel i3-2100T is listed with a TDP of 35 W, yet at stock clocks it draws around 55 W).

Current technology is amazing in terms of performance, and above all in cost (last weekend I was able to build a system for a friend for €900, with dual GPUs and an amazing processor, the i5-2500K), but the power draw of these monsters is always pretty close to a vacuum cleaner's...

You are right that the power supply isn't given much consideration, even in shops that build computers professionally; I can confirm this. And everyone around the net suggests upgrade options without considering the power-supply overhaul that has to go with them (e.g. if you check the current buying recommendations for graphics cards around €300, you will find a dual-GPU AMD 6850/6870 setup suggested without any mention that you also have to invest money in your power supply unit, so imho it belongs in another price range). They focus only on the performance.

I know it might sound strange that higher-priced cards come with more problems than the cheaper ones: overheating, power draw, etc. But you must keep in mind that all current technology is CMOS based, and CMOS draws power when its circuitry changes state.

High-end cards have more circuitry for pipelining, multiprocessing and faster clock handling. So more power is needed, and through the Joule effect it mostly turns into heat. Low-end cards, by contrast, are simpler: they share the core design of the high-end cards, but a redesign has removed the most advanced computing parts (fewer core units, narrower buses, slower chips and thus slower clocks, etc.), making the card simpler and more energy friendly.
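The textbook first-order model behind this is the dynamic switching power of CMOS logic (generic symbols, not figures from any particular card):

P_dynamic ≈ α · C · V² · f

where α is the fraction of gates that switch each cycle, C the switched capacitance, V the supply voltage and f the clock frequency. More circuitry means larger C, higher clocks mean larger f, and pushing more frames per second raises the switching activity α, so the extra watts (and, through the Joule effect, the extra heat) follow directly.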

The strangest fact is that you pay less for the hardware that has the most design work behind it. I couldn't understand this until a few years ago, but I found the answer thanks to my studies: what you pay for is the testing behind a hardware product. Advanced products need more sophisticated machines to test their capabilities (for example, a processor running at 3 GHz needs automated test equipment, ATE, running at least 4 times faster). And testing is the most expensive stage of the whole development chain, because of yield loss, false positives and product redesigns, and most of all because of the huge, sophisticated machines it requires. Design and production costs are nothing compared to the testing behind it, trust me.

I hope I've clarified some aspects of the matter :)
