Status
Not open for further replies.
Tried again with the new NVIDIA driver. At first it looked like it had fixed things: temps looked normal, and no more 665 FPS with VSync enabled while idling on the start screen.

Loaded my savegame and played for 10-15 minutes, and all seemed to work properly, when all of a sudden, for whatever reason, the game went apeshit on my GPU again: temps skyrocketed in a matter of seconds and were still rising.
Seems either VSync is still partially broken, or there is simply some buggy load loop running wild and causing an insane workload on the GPU.

Curious to see whether it was just a load spike or whether it would keep pushing on, I waited a few more minutes while monitoring the MSI Afterburner data, until I heard a capacitor/inductor start to whine and instantly quit the game...

My GPU never ever did this before, even after excessive marathon gaming sessions on demanding games.
This usually only happens with modern GPUs if you torture them to the max in a stress-test/burn-in environment, where components start to exhibit coil whine at very high frame rates (which hints that VSync is still not working properly).

I'm probably not going to bother any more and will just refund it. Just another lesson learned: never again buy a game running on the Unity engine. It's sad, because I was really hyped for this game as somebody who played the tabletop game in the '90s.

p.s.:
Just read a Reddit post from a guy who reported that even with the new driver, his Titan X sits at 100% load most of the time.
A game with mediocre graphics maxing out a thousand-dollar GPU, and there are still shills in here white-knighting this bull****, telling people it's their fault or their potato rig, trying everything to blame the user/customer instead of Paradox/HBS, who actually f***** up. LUL
 
Huh, seems like I'm not crazy, and it's lucky I invested in a watercooling system to prevent this sort of thing.
I have noticed that Battletech brings the GPU temp up to 60°C on my GTX 1080, which is a lot; until now, only very high-end games like AC: Origins were able to scratch 50°C.
So chances are that without watercooling my GPU would be toast right now as well.
I think I will play something else until Battletech is safe to play...
 
Vsync works fine with the latest NVIDIA drivers and also you can force Vsync through the NVIDIA control panel and I believe through AMD's control panel too.
I did not say that driver Vsync is broken. What's broken is the in-game Vsync toggle. Sometimes it works, sometimes it does nothing, sometimes it does the opposite. There is a dev posting about it somewhere. It's a known bug.

Furthermore, I have no interest in forcing VSync, because I run a G-SYNC monitor and G-SYNC only works with VSync disabled. What I need is either an FPS cap option (something most newer games have) or the devs programming the game so that the Unity engine cannot cause runaway load on the GPU (again, a known Unity engine problem). The only way to cap FPS currently is a third-party application.

Look at some of the other comments here and on Reddit from people reporting significantly lower GPU usage after updating to the latest NVIDIA driver.
The thread about the new drivers on the Battletech subreddit says nothing of that sort.

The bottom line, though, is that software cannot physically damage your graphics card; only bad hardware / bad cooling / bad PC airflow / overclocked voltage...
And you say that based on what, exactly? You have obviously never heard of FurMark. At any rate, stressing your GPU to its absolute limit for no apparent reason (I mean, look at the game! My GTX 1080 should be idling most of the time instead of getting the workout of its life!) may very well reduce its lifespan, in addition to wasting a lot of energy on a game whose graphics should run on a toaster.
 
Something strange is going on. I can run the game on the weak integrated graphics in my MacBook Pro with totally normal temps. I wonder if there's some sort of issue with newer NVIDIA cards.
 
Can confirm that the newest NVIDIA driver really messed up my computer (Palit Dual GTX 1060 6 GB). Unable to recover, but I will do a clean Windows install after some hardware upgrades later today.
 
I am using my MacBook Pro, and just the main menu gets my fan running at full speed in no time. Being a dev myself, I suspect they have not set Application.targetFrameRate in Unity. The default is -1, which means the engine renders as many frames as it possibly can, so the computer tries to kill itself for maximum FPS. I had the same issue in the (small retro) games I made, and as soon as I set a reasonable frame rate the problem disappeared. So this is one patch away, assuming the devs are monitoring the forums.

EDIT: to avoid setting the FPS too low for people who want to run uncapped, it should be exposed in the game settings so everyone can set it to taste.
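For what it's worth, the mechanism described above can be sketched in a few lines. This is a hypothetical Python illustration of what an engine-level frame cap (like Unity's Application.targetFrameRate) amounts to, not actual engine code: each frame, any time left in the frame budget is slept away so the hardware can idle.

```python
import time

def run_frames(n_frames, target_fps=None, render=lambda: None):
    """Render n_frames; if target_fps is set, sleep away the rest of each
    frame's time budget so the CPU/GPU can idle. target_fps=None mimics
    an uncapped loop (the behaviour of Unity's default of -1)."""
    budget = None if target_fps is None else 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        render()  # stand-in for the per-frame work
        if budget is not None:
            spent = time.perf_counter() - frame_start
            if spent < budget:
                time.sleep(budget - spent)  # hardware idles here
    return time.perf_counter() - start
```

Capped at 50 FPS, ten frames take at least ~0.2 s of wall time; uncapped, the same loop spins as fast as render() allows, which is the runaway-load symptom people are reporting.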
 
I gave it a try yesterday; I wanted to see how hot my GPU can get.
Well... somehow that "issue" is gone for me. My FPS is fine and stable, and so is my GPU usage.
On loading screens the usage drops to 0%; in missions it jumps between 60% and 96%, so no 100% usage at all.
Temps are between 40°C and 55°C, so everything is fine.

Card is a GTX 960.
Settings are on Medium, VSync enabled, Motion Blur disabled.
 
The game caused two BSODs, and now my GPU is badly damaged: it artifacts, makes the screen flash on and off, and then inevitably crashes the computer to a BSOD. There has to be a way to get reimbursed for a replacement card. This is ridiculous.

I heard a similar story on another thread (might have been you, actually).
This game nearly nuked my GPU, so this shit better get fixed soon or they are gonna have some lawsuits on their hands.

I can confirm that my GPU fan is running at high RPM even when I'm not in battle. It's very strange indeed.

None of the above changes the fact that the game is far too taxing on your hardware for what it is, pretty explosions and landscapes notwithstanding.


So I gave up going through the list of... well... uninformed posts, so I'll just use the OP and these three others as examples.

Long story short, I'm a PC enthusiast with many, many years of experience with high-end custom rigs.

Let me make it as clear as possible:

GAMES can NOT kill or damage hardware. PERIOD.

All graphics cards, CPUs, and other PC hardware are designed to run at 100% load without issue.
If a game runs the GPU at full load and your GPU dies, that shows a problem was already present, caused by something else (likely poor maintenance resulting in a fault, or poor airflow/cooling design by the PC builder, be it the user or the place of purchase).

I continue to be amazed at how little people who use PCs a lot for gaming know about their own systems. I see this all too often: "this game broke my computer blah blah blah". It's utter nonsense.
The only software that can affect hardware on that level is specific overclocking software that can reduce fan speeds and increase voltages, which can then lead to overheating.


Finally, as for why the GPU runs at high usage even when not in battle: it is because the game has no VSync option and runs in windowed fullscreen. Thus, if you have global VSync off for your desktop, you won't have VSync in game, meaning the GPU will try to run the game as fast as possible, above and beyond what your normal frame limit may be. The game is NOT too taxing for what it is.
It is somewhat unoptimised, IMO; I have seen instances where GPU usage drops and FPS tanks, likely due to a game-engine bottleneck, but that's about it.
 
Huh, seems like I'm not crazy, and it's lucky I invested in a watercooling system to prevent this sort of thing.
Regular ol' air cooling works just fine. Watercooling is useful for overclocking/overvolting, but it isn't a solution for poor airflow in a system; other components still need some sort of airflow to avoid overheating.
 
So I gave up going through the list of... well... uninformed posts.
(...)
GAMES can NOT kill or damage hardware. PERIOD.
(...)
Finally, as for why the GPU runs at high usage even when not in battle: the game has no VSync option and runs in windowed fullscreen, so with global VSync off the GPU will try to run the game as fast as possible. The game is NOT too taxing for what it is. (...)

So basically, it is the game's fault as far as the absence/bugginess of a certain option/mechanic is concerned. A fairly standard one for games, at that.
 
GAMES can NOT kill or damage hardware. PERIOD.
(...)
I continue to be amazed at how little people who use PCs a lot for gaming know about their own systems. I see this all too often: "this game broke my computer blah blah blah". It's utter nonsense.
Exactly. :) Plenty of people in this thread think they know something because they worked on their PCs at home and doing X caused Y, yet they have no background beyond their home computers and no professional grounding in IT.
 
GAMES can NOT kill or damage hardware. PERIOD.

All graphics cards, CPUs, and other PC hardware are designed to run at 100% load without issue.
(...)
Finally, as for why the GPU runs at high usage even when not in battle: the game has no VSync option and runs in windowed fullscreen. (...)

This is not entirely correct.
Hardware is supposed to be designed to handle 100% load; that doesn't mean a specific GPU was actually designed that way. To err is human, and all that. That said, this game driving modern GPUs to 100% load is definitely a bug, as there's nothing about the in-game graphics that should cause it. Your average 10-series NVIDIA card should be almost idle the whole time, even with VSync off. Especially outside battle: what on earth does it do that requires that much 3D rendering when you're merely going through the menus?

On a side note, how did you make it run in windowed fullscreen? I only see options for regular fullscreen and windowed (with border).
 
So I gave up going through the list of... well... uninformed posts.
(...)
GAMES can NOT kill or damage hardware. PERIOD.

All graphics cards, CPUs, and other PC hardware are designed to run at 100% load without issue.
If a game runs the GPU at full load and your GPU dies, a problem was already present, caused by something else (likely poor maintenance, or poor airflow/cooling design). (...)

Please stop. Games are software, and software in any form can damage hardware. It can be cumulative damage or immediate damage, but that's a reality. Look, with your many years of experience and all that, I'm sure you're patting yourself on the back. I can trot out 25 years in IT/sysadmin/design-and-layout and half a dozen other things (but that's a shit-flinging match, nothing more, nothing less) and find plenty of examples of malicious software causing damage; I've also seen server racks more expensive than your car burn out because of poor active-cooling designs. Now look at a past example that someone has already pointed out: StarCraft 2. An interactive scene between missions had no FPS limit; it would draw as hard as it could and soak up as many resources as it could. That *did* cause hardware damage, but limited liability makes it your problem, not Blizzard's.

There is a thermal design limit (TDL) on all hardware: a point beyond which the heat generated can't be dissipated fast enough by the cooling solution at hand. It is also specced against an average air temperature in a typically sized room, usually 21°C (~70°F) at 12x12' (3.6m x 3.6m). Consider benchmarking/burn-in testing: the idea is to stress the hardware enough to see whether there's a failure, whether from heat, from voltage (causing instability), or whatever else. Now look at the game, which is/was doing the same thing (again, like SC2), except instead of doing it for a short period, it does it for hours, to the point where the thermal limit is reached and one of a few things happens: the game crashes, or the thermal sensor kicks in and throttles the chip down, or the sensor kicks in but the limit is passed anyway. When the limit is passed, one of three things then happens: the system shuts down, it crashes (soft or hard), or it begins to run away thermally, aka time for the magic smoke to come out.
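The throttle-or-runaway behaviour described above can be illustrated with a toy first-order heat-balance model. All constants in this Python sketch are made up for illustration; they are not measurements of any real card:

```python
def simulate_temp(power_w, seconds, t_ambient=21.0, k_dissipate=2.0,
                  heat_capacity=20.0, t_throttle=95.0):
    """Toy first-order thermal model (illustrative constants, not measured):
    temperature rises while heat generated exceeds heat dissipated; once
    t_throttle is crossed, a crude throttle halves the power draw."""
    temp = t_ambient
    for _ in range(seconds):  # one Euler step per second
        power = power_w if temp < t_throttle else power_w * 0.5
        heat_out = k_dissipate * (temp - t_ambient)  # dissipation grows with delta-T
        temp += (power - heat_out) / heat_capacity
    return temp
```

With these assumed numbers, a moderate 100 W load settles around 71°C and never throttles, while a 300 W stress-test-style load climbs until the throttle holds it near the 95°C limit; remove the throttle branch and the same model runs away instead.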

Now let's not forget that the video card manufacturers can also cause these problems. Remember a few years ago when NVIDIA's drivers were baking video cards because the fan profiles were set too low to dissipate heat? Remember the massive TDR problem on the NVIDIA 500, 600, and 700 series cards, where NVIDIA claimed for over a year that it was "a user problem" until someone figured out that the drivers were undervolting the GPU in order to lower the thermal profile? That led NVIDIA to pay to have people's PCs shipped to California at their cost for in-depth testing, and they then had to find an option other than undervolting to lower the thermal profile, which led to a redesigned cooler.
 
i have seen instances where GPU usage drops and FPS tanks, likely due to a game engine bottleneck, but thats about it.
This is almost certainly your card throttling because it's running hot.

There is something seriously wrong with the game as it is. Battletech, as much as I like its visual stylings, should not cause my 980 Ti to hit 60+ degrees and its fan to spin up to 100% when The Witcher 3 on high settings does not.
 
This is not entirely correct.
Hardware is supposed to be designed to handle 100% load; that doesn't mean a specific GPU was actually designed that way. (...)

On a side note, how did you make it run in windowed fullscreen? I only see options for regular fullscreen and windowed (with border).

"Fullscreen" in the game settings is actually windowed fullscreen. This is why it's a PITA to get VSync to work: windowed fullscreen behaves like a window, so you need to activate global VSync and a global frame limiter, not a program-specific profile.

I found that using Nvidia Inspector and setting the global profile frame limit to 60 worked all the time, while forcing global VSync only worked in parts of the game.
 
So I gave up going through the list of... well... uninformed posts. (...)

Exactly. :) Plenty of people in this thread think they know something because they worked on their PCs at home and doing X caused Y, yet they have no background beyond their home computers and no professional grounding in IT.

[Mod Edit: Disrespect]

Just a question for the self-proclaimed experts:

Which GPU will have the longer lifespan?
The one I put through a FurMark burn-in 24/7, or the one of a typical desktop user who maybe plays a few hours a day?

Just to give you a hint:

There is a reason big server farms keep tons of backup HDDs: the drives' lifespan goes down significantly compared to normal desktop usage, and the things actually die pretty often under permanent heavy duty, or get replaced after a certain number of read/write cycles.

In the same way, you can't expect a lot of mileage out of your car engine if you redline it all the time.
Just because you can doesn't mean you should.

So yeah, putting your GPU needlessly under max load won't evaporate it instantly, but it will accelerate degradation; that's just physics. For some users it will probably shorten the lifespan enough that the card dies before the natural replacement cycle, i.e. before they would have upgraded their system anyway.
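For what the physics claim is worth, heat-driven ageing of electronics is commonly modelled with an Arrhenius-style acceleration factor. The Python sketch below is illustrative only; the activation energy is an assumed value, and real failure modes vary:

```python
import math

def relative_lifetime(temp_c, ref_temp_c=50.0, activation_ev=0.7):
    """Arrhenius-style acceleration factor: rough relative component
    lifetime at temp_c versus ref_temp_c. The activation energy (eV)
    is an assumed, illustrative value; real values vary by failure mode."""
    k_b = 8.617e-5  # Boltzmann constant in eV/K
    t = temp_c + 273.15
    t_ref = ref_temp_c + 273.15
    return math.exp((activation_ev / k_b) * (1.0 / t - 1.0 / t_ref))
```

Under these assumptions, running a component at 80°C instead of 50°C cuts its expected lifetime to roughly a tenth: not instant evaporation, but exactly the kind of accelerated ageing described above.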

There is simply no reason a game that looks as dated as BT should produce a workload that not even the latest triple-A graphics showcases manage, taxing a halfway decent modern GPU like a hardcore stress test.
I mean, there are people with 1080 Tis and Titans reporting the game pins their cards at 100% load, FFS.

FYI:
I've got four BeQuiet SilentWings fans in my Lian Li B10 case and a BeQuiet Dark Rock 3 on my CPU, which are some of the best air-cooling solutions on the market; the HDD rack was removed for optimal airflow, and the cable management is tight.
I never had any cooling issues with this setup no matter what I threw at it. The highest temps I ever saw on my GPU were mid-to-high 70s in the most demanding triple-A games after hours of gaming, until a game that looks like it's from 2010 called Battletech came along.
 
"Fullscreen" in the game settings is actually windowed fullscreen. This is why it's a PITA to get VSync to work. (...)
I found using Nvidia Inspector and setting the global profile frame limit to 60 worked all the time, while forcing global VSync only worked in parts of the game.

That's weird; mine doesn't behave like windowed fullscreen. When you alt-tab out of a windowed fullscreen game, you'd normally still see the game window in the background, and that's certainly not the case here.

Edit: nevermind, found a dev's post about how to do it: https://forum.paradoxplaza.com/foru...ing-screen-resolutions-aspect-ratios.1092119/
 
Games are 'software' and software in any form can damage hardware.
That's incorrect at face value. Modern processors typically have internal protection circuits that throttle or shut down when a predetermined thermal limit is hit, so software can't "damage" hardware in a regulated system. If you use software to bypass the hardware safeties, then yes, it can, but that isn't possible with modern GPUs, modern drivers, and games.

As for heavy usage, that's wear and tear, not "damage".
 