#12
Adding my $0.02 + inflation...
I have a laptop with an Intel 4000 and an Nvidia 550M, and I've come across this same problem a number of times in other games. Using the Nvidia control panel to force the Nvidia GPU only seems to work for some people on some games, no matter which way you do it (per-program or defaulting everything to the Nvidia). And for the record, it seems to be common with ATI cards too.

From what I've worked out so far, it's an issue with enumeration of the GPUs. Many programs don't expect multiple GPUs, so when they enumerate they assume the only GPU is at index 0 (indices are 0-based), and therefore only the first GPU, the Intel, ever gets used. In most cases (not all, and I don't know why the difference) this happens regardless of the settings in the Nvidia panel. There's a rough sketch of what that enumeration looks like at the end of this post.

Unity games are common offenders, Minecraft (Java) does it, and many others do too. For some bizarre reason, though, this isn't universal. I've seen some Minecraft players successfully point the Nvidia panel at their Java executable and it works; I've followed every possible variation of those same steps and it doesn't work for me. Yet I've got games it's worked for that others can't get working.

Unfortunately I also tend to forget that it does in fact work for some games. I've owned Elite Dangerous for about a year, and my Intel only gave me a few frames per second, so I gave up on it. It only occurred to me a couple of months ago to try the Nvidia panel, and suddenly wooshka(*), it's playable!

So if you haven't tried the Nvidia panel, by all means do so. You might be one of the lucky ones. If not, I'm sorry to be the bearer of bad tidings, but no one seems to have come up with a workable solution as yet.
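For the curious, here's a minimal sketch of what GPU enumeration looks like through the Windows DXGI API. This is my own illustration of the pattern, not code from any of the games mentioned: a program that only ever asks for adapter 0 will only ever see the first GPU, while looping over every index until DXGI_ERROR_NOT_FOUND is what reveals the others (on many Optimus laptops, index 0 is typically the Intel and the Nvidia sits at a later index, driver permitting).

// Sketch only: enumerate every display adapter rather than assuming index 0.
// Build as a Windows C++ program and link against dxgi.lib.
#include <dxgi.h>
#include <cstdio>

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    // The buggy pattern is calling EnumAdapters(0, ...) once and stopping.
    // Looping until DXGI_ERROR_NOT_FOUND lists every GPU instead.
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // desc.Description names the GPU, e.g. the Intel at index 0
        // and (driver permitting) the Nvidia at index 1.
        wprintf(L"Adapter %u: %s\n", i, desc.Description);
        adapter->Release();
    }
    factory->Release();
    return 0;
}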
__________________
(*) = If you're not an Aussie, and/or aren't familiar with the NRL commentary team of several years ago or the 12th Man's spoof of them, replace this word with "voila". And you're missing out on some comedy gold.