Marvel Mods

Off-Topic => Talk about anything => Topic started by: Pooltastic on April 05, 2010, 06:53PM

Title: Hellgate London + FX5500 + DVI = FAIL?!
Post by: Pooltastic on April 05, 2010, 06:53PM
Hiya.
Since no amount of googling seems to help, I thought I'd ask for advice from all of you.

I've got Hellgate London, and it was working sweet on my Nvidia GeForce FX 5500 until recently.

I recently got a new BenQ G2220HD 21.5" monitor and of course switched from VGA to DVI.

Oh the horror: now every time I go in-game, the screen gets grainy as hell and keeps turning off.
No error messages, no warnings. Nothing! Just on for a second, off for the next 10...

I love the game and am practically banging my head on the wall trying to find the solution.

Any suggestions welcome.
Title: Re: Hellgate London + FX5500 + DVI = FAIL?!
Post by: cjohnstryker on April 13, 2010, 09:13PM
I'm not sure if this will help, but I had a similar problem. I went from a 1950XT to an HD3850. When I ran the game over HDMI to my TV, the FPS was worse with the new card than with the old one, and the game kept crashing. Long story short: the game is glitchy. Download the patch, then turn off any card-based graphics utilities. Basically, let the game have total control of the hardware. I wish I could help more, but it's been a couple of years since I played/messed with it. Hope you get it working; it's a fun game.
Title: Re: Hellgate London + FX5500 + DVI = FAIL?!
Post by: Pooltastic on April 16, 2010, 04:41PM
Interesting.
But what patch do you mean, exactly?