
Sunday, 21 September 2014

Why Destiny Won't Be on PC



Bungie has explained why Destiny will not be on PC:

On 9 September, Destiny launched on PlayStation 4, PlayStation 3, Xbox One and Xbox 360 - but not PC.

It's an unfortunate situation for many who had hoped to play Bungie's next first-person shooter with a mouse and keyboard. And for many it seems like a strange omission, given the game was built on PC in the first place. Surely it wouldn't take much time or effort for Bungie to release a PC version, then?

Destiny is Bungie's first multiplatform game in over a decade, and its first on a PlayStation platform. The studio handles development of all the versions internally, rather than outsourcing development to other studios. It even handles quality assurance itself - one of the many reasons there are over 500 people currently working on the game inside Bungie's Bellevue, Washington base.
"The console SKUs are really important for us and that's what we're focusing on," Bakken said. "We're doing it all internally ourselves. That's a huge endeavour. That's not something we've ever done before.
"So when I'm playtesting and I'm trying to play PS4, Xbox One, Xbox 360 and PS3, that's a lot of work. Adding another thing on there is just crazy. It's crazy to think of right now."
Bungie COO Pete Parsons echoed Bakken's response when we quizzed him on the same subject.
"I think four platforms on day one is a lot, considering we've been a one platform team for a very long time," he said.
So what about the people who spent a thousand dollars on an extreme gaming PC to get the most out of it versus a console?
Well, unfortunately, they'll have to wait for GTA V, move on to the next Far Cry game, or maybe Crysis 4...
Don't forget to Like, subscribe and comment!

Saturday, 20 September 2014

Hello World, We Are BACK!!!



Computer Cluster is BACK!

We are glad to inform you that we are back to bring you the best of the web!
Computer Cluster was shut down over some copyright issues, but that hard time has passed and we are now making a fresh new start!

Please visit computercluster.blogspot.com daily, for awesome content and information!

Like, Subscribe and Comment!

It's awesome to be back with you!

Thursday, 30 January 2014

How to use PSP as a Monitor



This tutorial will show you how to set up your PSP as a second monitor for your PC. This can be helpful in many ways: for example, watching a tutorial on the PSP while following it on your main monitor, or simply moving your media player out of the way.

Step 1: What you'll need

For this little project, you'll need:
- a PSP running a custom firmware above 3.71 [I used 4.01 M33-2]
- PSPDisp by Jochen Schleu
- a USB cable

My PSP is a Phat PSP with 4.01 M33-2. I haven't tested this on a slim or any other firmware, so I'm not sure how well (or not) this will work on other setups.

From the PSPDisp Readme, you'll also need:
- a PC running Windows XP 32 bit (it will not work on 64 bit versions or Vista, it may run under Windows 2000 but this is untested)
- a suitably fast processor, for comparison:
-> 25 % CPU load on a single-core Pentium 4 3.06 GHz
-> 40 % CPU load on a single-core AMD Athlon 64 3200+ (2.2 GHz)
It should be fine on any modern PC.

Step 2: Setting up PSPDisp


Start by downloading and installing PSPDisp from the provided link.
Run the installer, then go to the install directory afterwards (e.g. C:\Program Files\PSPDisp\).

Go into the "bin" folder, and then to psp\PSP\GAME.
Copy the PSPDisp folder to the PSP\GAME folder on your PSP's memory stick.
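If you'd rather do that copy from a script than from Explorer, it amounts to a simple recursive copy. Here's a minimal Python sketch; the install path and memory-stick drive letter are just examples, so adjust them for your setup:

```python
import shutil
from pathlib import Path

def copy_pspdisp(install_dir: str, memstick_root: str) -> Path:
    """Copy PSPDisp's PSP-side app from the PC install directory
    into PSP/GAME on the memory stick (paths are illustrative)."""
    src = Path(install_dir) / "bin" / "psp" / "PSP" / "GAME" / "PSPDisp"
    dst = Path(memstick_root) / "PSP" / "GAME" / "PSPDisp"
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copytree(src, dst, dirs_exist_ok=True)
    return dst
```

For example, `copy_pspdisp(r"C:\Program Files\PSPDisp", "E:\\")` if your memory stick mounts as E:.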

Now, on your PSP, connect the USB cable (if you haven't already done so) and run the PSPDisp application.

Now you'll get a popup about new hardware, "PSP Type B". When it asks for the driver, tell it to search in "X:\%pspdisp install path%\bin\driver_usb", where "X:\%pspdisp install path%" is wherever you installed PSPDisp.

Step 3: Installing Display Driver



Okay, now you've got nearly all of PSPDisp's components up and running; all you need now is the display driver.

To start, bring up the add new hardware wizard. This wizard will either be in Printers and Other Devices (on Category View), or there'll simply be an icon for it (on Classic View).

Installing Driver Instructions:
1] On the first screen, choose "Yes, I have already connected the hardware".
2] Next, scroll to the bottom and select "Add a new hardware device".
3] Now select "Install the hardware that I manually select from a list".
4] In that list, select "Show All Devices".
5] Don't bother searching for it (it's not there). Just click "Have Disk".
6] Now, browse to X:\%PSPDisp Install Directory%\bin\driver_display.
That's it. Just follow the rest of the instructions the Wizard gives you! Ignore the warning about the driver not being digitally signed.

Step 4: Setting up the Display



Okay now that you have everything you need installed, go ahead and start PSPDisp itself. It's in the \bin\app folder.

It will show up in the system tray as a little picture of a PSP, and to control it just right click the icon. Start off by enabling the display driver. This adds another "Monitor" to the computer, with a resolution of 960x544px.

Now enable PSPDisp to output to your PSP. Hopefully you still have the app running on the PSP. If not start it now :P

Step 5: Finishing Up



Now you have to position the new desktop.
This can be done by right clicking your desktop and selecting properties -> settings, or you can launch it from PSPDisp's menu.

As you can see in the picture below, I have my new desktop centred under my monitor, but if you want to move your monitor on top or at the sides that'll work too (might be confusing at the sides though :P).

An excellent program for multi-display setups is "UltraMon". It also works a charm for this setup, so I recommend you try it out. It can extend the taskbar to the second desktop, set a different wallpaper for each desktop, and more. It's trialware, though, and costs $39.95 for a licence. There's probably (definitely) a free alternative; if you find one, please comment :)

Step 6: Outro

Well that's it. You now have a multi-desktop setup, with a PSP! :)
If you have anything you want to say, please comment, it would be much appreciated!

Gaming in Nvidia 4K


Nvidia GeForce GTX 780 Ti: Gaming in Glorious 4K
Earlier this year, Nvidia dropped a bomb on the world of graphics processing with the Titan, a truly ludicrous powerhouse that cost a whopping $1,000. Now the monstrous Titan is getting (another) "affordable" twin in the form of the GeForce GTX 780 Ti, which Nvidia is calling the best gaming GPU on the planet.

The result is that a pair of GeForce GTX 780 Tis can handle modern games like Batman: Arkham Origins or Assassin's Creed 4: Black Flag at 4K resolutions with all the bells and whistles. That's also a trick you could pull off with a pair of Titans (crazy expensive) or a pair of 780s (not quite as good).

The GeForce GTX 780 Ti comes with 3GB of the fastest GDDR5 available on any graphics card anywhere, providing 336GB/sec of peak memory bandwidth. It's also got 2,880 CUDA cores, 25 percent more than the GeForce GTX 780, the previous peak of non-Titan power. And it comes with GPU Boost 2.0, which helps maximise clock speeds in all scenarios, and support for G-Sync, Nvidia's recently announced tech that aims to end screen tearing forever. *fingers crossed*
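That 336GB/sec figure follows directly from the card's published memory specs (a 384-bit memory interface and 7Gbps effective GDDR5). A quick sanity check:

```python
# GTX 780 Ti published memory specs
bus_width_bits = 384   # memory interface width
data_rate_gbps = 7     # effective GDDR5 transfer rate per pin

# peak bandwidth = (bus width in bytes) x per-pin data rate
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
print(bandwidth_gb_s)  # 336.0 GB/s
```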
Of course, a setup you could appreciate all that on will cost you a small fortune. The GeForce GTX 780 Ti still carries a pricey, devoted-enthusiast cost of $700, or $1,400 for two. But if you can afford to pick one up (and a 4K display), you probably won't be disappointed.

What is Nvidia PhysX




Comparison of physics levels in Mafia II.
(PC) The top screenshot shows how debris is simulated in Mafia II when PhysX is turned to the highest level in the game settings. The bottom screenshot shows a similar scene with PhysX turned to the lowest level.

PhysX is a proprietary realtime physics engine middleware SDK. It was developed by Ageia following its 2004 purchase of ETH Zurich spin-off NovodeX. Ageia was itself acquired by Nvidia in February 2008.[1]
The term PhysX can also refer to the PPU expansion card designed by Ageia to accelerate PhysX-enabled video games.
Video games supporting hardware acceleration by PhysX can be accelerated by either a PhysX PPU or a CUDA-enabled GeForce GPU (if it has at least 256MB of dedicated VRAM), thus offloading physics calculations from the CPU, allowing it to perform other tasks instead. In theory this should result in a smoother gaming experience and allow additional visual effects.
Middleware physics engines allow game developers to avoid writing their own code to handle the complex physics interactions possible in modern games. PhysX is one of the handful of physics engines used in the large majority of today's games.[2]
The PhysX engine and SDK are available for Microsoft Windows, Mac OS X, Linux, PlayStation 3,[3][4] Xbox 360[5] and the Wii.[6] The PhysX SDK is provided to developers for free for both commercial and non-commercial use on Windows. For Linux, OS X and Android platforms the PhysX SDK is free for educational and non-commercial use.

What is AMD Mantle

Why AMD developed Mantle

From a game developer’s point of view, creating games for the PC has never been especially efficient. With so many combinations of hardware possible in a PC, it’s not practical to create specialized programming for every possible configuration. What they do instead is write simplified code that gets translated on-the-fly into something the computer can work with.
Just as when two people communicate through a translator, this works, but it isn’t efficient. And it’s the CPU that has to do all this extra work, translating and queuing data for the graphics card to process. PCs are meant to be the ultimate gaming platform — they have the power — but all this translation slows things down, and game developers approached AMD asking for something better.

What Mantle does

Mantle is the harmony of three essential ingredients:
  1. A driver within the AMD Catalyst™ software suite that lets applications speak directly to the Graphics Core Next architecture;
  2. A GPU or APU enabled with the Graphics Core Next architecture;
  3. An application or game written to take advantage of Mantle.
Mantle reduces the CPU’s workload by giving developers a way to talk to the GPU directly with much less translation. With less work for the CPU to do, programmers can squeeze much more performance from a system, delivering the greatest benefits in gaming systems where the CPU can be the bottleneck.

What it means for gamers

Now that Mantle has freed up some extra CPU capacity, we expect Mantle will lead to better games, and more of them, since Mantle makes game development easier.
That’s not all Mantle will do for gamers. By shifting work to the GPU, a mid-range or older CPU isn’t the same handicap it was before. With Mantle, the GPU becomes the critical part of the system, and GPU upgrades will have a bigger impact than before.

AMD: We're going after Nvidia graphics with Kaveri APU


The war is nothing new, but Kaveri is

To give a taste of its new Kaveri APU's kick, AMD ran a side-by-side Battlefield 4 demo during its opening APU13 keynote last week. The game ran on two machines; frames per second ticked away in the upper left-hand corners as the opening, "Total Eclipse of the Heart"-soundtracked sequence rolled.
Kaveri came away as the clear winner, holding nearly double the frame rates and running with hardly a hiccup. The second machine, equipped with an Intel Core i7-4770K CPU and an Nvidia GeForce GT 630 GPU, stuttered and lurched from scene to scene.
It was an effective, visceral demonstration, but the question quickly circulated why AMD would pit the top-end A10-7850K Kaveri APU against this CPU/GPU combo. Adam Kozak, senior product marketing manager at AMD, laid the company's logic out for TechRadar.
"We want to position an A10 versus Intel plus a graphics card," he told us during a post-keynote rendezvous. "Obviously you can go up so high before the graphics card gets faster, and that's why we picked the 630. There is a new 640 we're looking at, and we'll take a look at that as we get closer to launch."
It's not just CPU then that AMD is targeting with its first APU of 2014.
"We want to go after the idea where it makes sense, at least from our perspective, that you don't need to buy a certain graphics card," Kozak said. "In fact, Nvidia probably sells 70-80% of their entire stack at 630 and below. People kind of know that Intel is very weak [with] GPU, so now we're going after something people think is strong."
Adam Kozak
AMD's Adam Kozak talks the Kaveri talk

Kaveri in action, but what about Mantle?

We had caught up with Kozak to see another BF4 demo played on a different desktop - one presumably less meaty than the machine used for the keynote head-to-head. The settings were on medium except for a custom graphics quality setting. AMD Inclusion, which controls how shadows are placed on overlapping objects, was turned off.
"The difference between low and medium is huge, and then from medium to high, you see a little bit more detail in the softs and shadows," Kozak said of how visible Kaveri's footprint becomes on different settings. "From high to ultra, it's more of a post-processing, so your light rays and everything are kind of blended a little more.
"For me, the biggest jump is from low to medium, and then from there it just gradually looks nicer and nicer, depending on what you've got."
The frame rates hit 39 or so, as they did during the keynote, and the play never lapsed. Granted, this wasn't a particularly action-heavy demo we were being shown - Kozak was really just wandering through a ruined building.
The first Kaveri demo was more graphics intensive, but neither it nor the one Kozak played ran a Mantle-optimized version of BF4. AMD's Mantle API, developed with the help of EA's DICE, is designed to push frame rates higher and improve graphics fidelity. Pair it with Kaveri, and the hope is for near-perfect renderings.
That's the idea, of course, and Kozak for one is keeping his forecasts on a more even keel.
"Personally my expectations are low," he said of a Mantle-plus-Kaveri combo. "But there is an Oxide demo here and they are seeing substantial speed-ups, beyond what anyone internally has guessed at. I'm optimistic it's going to be more than the 5% I'm hoping for and more towards the double digits."
Battlefield 4
Busting through the frame rates
In fact, we're told Mantle is still fairly new for AMD internally, and it's partners like DICE who are seeing frame gaps vaporize.
"What I have heard from DICE is that what [Mantle] does with the discrete card is it equalizes the CPUs," Kozak explained. "It was only a couple of frames faster before because the CPU doesn't really play into things like that, but [Mantle] eliminates any gap. And essentially it does that by allowing the graphics card to do more, so it becomes the bottleneck."
When a Mantle optimization-bearing update arrives for Battlefield 4 in December, we'll have an accurate idea of just how much the API improves graphics performance in the real world.

Masters of productivity

Our meeting with Kozak held more than just a BF4 run through.
He also showed us a JPEG decode accelerator that overrides the usual routines found in Windows and speeds them up with Kaveri. In one thumbnail decode run, performance increased by 110.1%.
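To put that percentage in perspective, a 110.1% increase means throughput more than doubles. A small sketch; the decode rates below are made-up numbers, chosen only to reproduce the quoted figure:

```python
def percent_increase(old_rate: float, new_rate: float) -> float:
    """Express a throughput gain the way AMD quoted it."""
    return (new_rate / old_rate - 1.0) * 100.0

# Hypothetical: 1000 thumbnails/sec on the plain CPU path vs
# 2101/sec with Kaveri acceleration = the quoted 110.1% gain.
print(round(percent_increase(1000, 2101), 1))  # 110.1
```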
When the first Kaveri APU desktops become available January 14, they'll have the decoder built in. More than simply decoding family photos faster, the driver shines a light on an area AMD wants to target with Kaveri.
"Productivity is sort of a new one for us," Kozak told us.
The company plans to have additional productivity compute acceleration examples at launch, and Kozak said that "the idea there is to gain interest, [to] get Microsoft and others aware that you can make things a lot faster on tasks that people still care about in a professional level with these crazy spreadsheets that the normal consumer level may not be much of a big deal."
Render speeds
No more trips to the kitchen while JPEGs decode

Kaveri price and scalability

Not all is known about Kaveri - CES 2014 is the APU's "big coming out party," as we've been told by AMD.
With the first Kaveri desktops due early next year, the all-important price question will be answered in short order. Until then Kozak and AMD are keeping mum on cost, but we suspect Kaveri will be priced in the same range as Richland desktop was when that APU was released in the channel.
Kozak noted, as AMD has, that the company has reversed its normal APU release order and is taking Kaveri first to desktop.
"It's sort of a chicken and egg thing," he said of the decision. "We're really interested in getting Kaveri out there as fast as possible. If it is just the desktop, and not the bread and butter of mobile, it's because we need guys to start programming for it. We give them the fastest implementation and they can start optimizing their code, and obviously from there can start optimizing for lower TDPs."
While scalability with Kaveri is a big selling point, Kozak said not to expect it in something as small as a smartphone anytime soon.
"Right now we're going as low as 15W, which is not a phone, all the way up to the typical desktop," he said. Though Kozak didn't mention it, AMD has stated Kaveri will head to embedded systems and servers as well.
Snipping through Battlefield 4 and JPEGs is all well and good, but when it comes to real-world implementation, CES and the days following are going to be Kaveri's true gauntlet run.
Early showings have been impressive, and Kozak said more work is being done to fine-tune Kaveri.
"I expect this one to get even better for us," he said, referring to the Battlefield 4 desktop demo. "We still have our engineers working with DICE on Kaveri optimizations. That's going on as we sit here, and we still have DICE working with Mantle. There's a two-step prong that's going to make this even better."

Why AMD's Radeon R9 290 is both awesome and awful


Victory snatched from the jaws of defeat. Or is it the other way round? Either way, only AMD could pull it off with such perverted panache.
I speak of the new AMD Radeon R9 290. Yes, specifically the 290, not the 290X. By most metrics, it's far and away, without a shadow of a doubt, the best graphics card you can currently buy. And yet somehow, AMD managed to launch it in a state that left some leading review sites feeling they couldn't recommend it.
Let me re-emphasise that. AMD conspired to create the best graphics card on the market and yet make it sufficiently flawed that some experts advised PC gamers not to buy it.
Let's remind ourselves first of what makes the 290 great. It's not actually the fastest graphics chip in the world. That accolade falls to Nvidia's mighty GeForce GTX 780 Ti. It's not even AMD's fastest. The Radeon R9 290X takes that prize.

Huge performance, plausible price

But what it does do is deliver frame rates that I suspect are largely indistinguishable, in subjective gaming terms, from those of faster chips. And it does so at a fraction of the price.
As I type these words, an Nvidia GeForce GTX 780 Ti will sock you for about £500. AMD Radeon R9 290Xs start at about £420. But the 290 is yours for just over £300. Nice.
And yet none other than Anandtech had this to say about the 290:
"To get right to the point then, this is one of a handful of cards we've ever had to recommend against."
So, that's one of a handful of cards Anandtech has ever unambiguously recommended against in around a decade of graphics card reviews.
What on earth is wrong with the 290? As it happens, I think Anandtech blundered pretty badly to put the 290 into such undistinguished company. But the 290 is undoubtedly flawed.
There are two closely related problems. The GPU at the heart of the 290 runs very hot and it sports a very noisy fan. The latter problem, if it is a problem, depends on your point of view. Some won't be too worried about a bit of din when the GPU is under heavy load. To be clear, the card is only noisy when rendering detailed 3D graphics.

Piping hot pixel pumper

The temperature issue is potentially more serious and raises concerns about the long-term reliability of 290 boards. AMD says the running temps are fine, but history shows computer chips soaking up 90 degrees-plus with regularity tend to go pop eventually.
The full story is a little more complicated and involves some last-minute tweaking of the settings controlling the 290's fan. But the details really don't matter too much. The bottom line is that AMD managed to launch a card that simultaneously beats all comers while looking seriously flawed, at least to some.
The irony here is that all AMD needed to have done was launch with a quieter fan. The temps could have remained the same and while they would have been remarked upon, I doubt they would have become the main story.
But the hurricane-force fan did a stand-up job of grabbing all the headlines and spoiling the excellent work of the engineers who designed the GPU.
Fortunately, a solution to all this is on its way. Pretty soon, 290 boards with custom cooling will become available and the din will die down, both figuratively and literally. And then the 290 will take its rightful place as the best graphics card in the world.
In the meantime, I can only marvel at how AMD could make such a mess of such an inherently great product.

Wednesday, 29 January 2014

Nintendo's strategy: We're not losing focus on consoles



(Credit: Nintendo)
Things have been a bit rocky in Mario land. A day after Nintendo announced disappointing earnings, including a 30 percent decline in profits and lower Wii U sales than in 2012, its president, Satoru Iwata, on Wednesday held a strategy briefing in Tokyo to discuss a strategic turnaround.
Iwata kicked off the briefing by insisting he's not pessimistic about the outlook for gaming consoles. He also said that Nintendo doesn't plan to give up on its hardware business -- game consoles will continue to be the center of its strategy.

This reiterates the statement he made to Engadget yesterday, denying earlier reports that the company would offer free minigames on smartphones to act as demos of full-priced console and 3DS games. "Lots of people have said we should go onto smartphones over the last few years, telling us our business would increase," he told analysts. "But our approach is not to put our games on smartphones."
Iwata acknowledged that change is important, but he pointed to the massive changes the company has undergone throughout its history, including moving from Hanafuda cards to game consoles.
Perhaps the worst bit of news during Nintendo's earnings call was that the company expects to sell just 400,000 Wii U units worldwide during the first quarter of 2014. Iwata said the company plans to counteract this by focusing on software that takes advantage of the GamePad's abilities, particularly its NFC (near-field communications) technology.
On the plus side, Nintendo announced that it plans to release Mario Kart 8 in May, which should be welcome news to avid fans of the popular franchise.
Iwata also hinted at a new market the company is planning to enter: health. Noting that there are already a bevy of wearable devices on the market, he said Nintendo is going to try out "non-wearables" to monitor people's health - though what he meant by non-wearables wasn't entirely clear. The only hint he gave was that it wouldn't be something you would use in your living room. The company plans to discuss what it means by non-wearables in more detail later this year.

Source: Cnet

Tuesday, 28 January 2014

Nvidia 4K Gameplay: How Fast Can You Drive? How About 1.5 Billion Pixels a Second


Acres of lifelike beaches, delectable food and amazing cityscapes seem to stretch across the many booths featuring 4K displays at this week's International Consumer Electronics Show (CES).
Our booth has some amazing scenery, too: A twisty, photo-realistic track displayed with stunning clarity on a trio of 65-inch 4K Panasonic displays.
Here’s what makes ours different: You can grab a controller and blast through the landscape like a lunatic, as stereo speakers deafen those around you with the throaty roar of a neon green Pagani two-seater.
This is 4K you can pick up and play. At a cost of just under $30,000.
“Amazing,” said Carlos Cuello, a Colombian journalist who writes for Enter.Co, the nation’s largest tech publication, as he watched our live demo. “This is another dimension.”
4K, of course, is the industry term for the latest generation of ultra-high-resolution monitors. It refers to their ability to cram 4,000 or more pixels on a single horizontal line.
To the human eye, it brings an enormous step up from the high-definition displays that can now be purchased for a few hundred bucks at the nearest electronics store.
Creating video that takes advantage of what these displays can do isn’t easy. Videographers have to upgrade to a new generation of cameras and video-editing workstations.
Even harder: creating interactive experiences that can wring the most out of all those pixels. This is something that’s simply beyond what the latest generation of game consoles can do.
Brought to you by GeForce GTX Titan: a 12K display you can play.
But it makes 4K content – particularly on multiple screens – the perfect demo of what’s ahead.
“I’m looking at 4K as the panels get cheaper, and more content appears,” said Patrick Danford, a PC gamer who works at Walt Disney Studios, after taking a close look at the water-cooled innards of the PC powering our demo through a clear panel on the side of the machine’s case. “This is pretty cool.”
To build our demo, we started with one of Origin’s ferocious Genesis PCs. We then equipped it with four of our GeForce GTX Titan graphics cards yoked together with our SLI technology.
We then loaded it up with a pre-release version of Project CARS (Community Assisted Racing Simulator), a driving simulator being developed by Slightly Mad Studios. Then, we cranked up the settings to run the game at a ridiculously smooth 60 frames per second.
1.5 Billion Pixels Per Second
Hook up all this to a trio of 65-inch displays and you’re driving more than 1.5 billion pixels per second into an interactive 12K display.
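Assuming standard UHD panels (3840x2160 pixels each) running at the demo's 60 frames per second, the arithmetic behind that figure works out like this:

```python
panels = 3                  # three 65-inch 4K displays side by side
width, height = 3840, 2160  # pixels per UHD panel
fps = 60                    # frame rate the demo was tuned to hit

pixels_per_second = panels * width * height * fps
print(pixels_per_second)    # 1492992000, i.e. roughly 1.5 billion
```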
And visitors to our booth are having a blast with it.
“That was a lot of fun,” said Stanley, whose home PC is equipped with an NVIDIA GeForce GTX 650 Ti, as he nodded towards our enormous screens. “If only I could fit those into my apartment.”
Source: http://blogs.nvidia.com/blog/2014/01/07/4k-gameplay-how-fast-can-you-drive-how-about-1-5-billion-pixels-a-second/

Play Heavy Games on Low Spec PCs

Virtual Graphics Card-Play latest Games without Graphic Cards


This sounds almost too good to be true, doesn't it?
Many times we are stuck with a system that just doesn't provide enough juice to run the latest game. The result being:
  • you either spend a pot of money (at least Rs 3,500, or about $70) to get the latest graphics card,
  • or you just read reviews of games like Crysis and Far Cry on GameSpot and feel like a noob, fully knowing that your system just won't be able to support them.
Take heart! Here's a wicked piece of software with which you can meet a 128-256MB graphics card requirement with a very modest 1GB of DDR2 RAM.
The savior is 3D-Analyzer. Click to download.
What it does is use a part of your RAM as graphics card memory. For example, if you've got 1GB of DDR2 RAM, it'll use 128MB of it as a virtual graphics card, and the remaining 896MB will be used as regular RAM.
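That split is easy to express as a tiny helper. This is just an illustration of the arithmetic; the function name and shape are mine, not part of 3D-Analyzer:

```python
def vram_split(total_ram_mb: int, vram_mb: int) -> tuple:
    """Carve an emulated-VRAM slice out of system RAM and report
    what remains for the OS and the game itself."""
    if vram_mb >= total_ram_mb:
        raise ValueError("VRAM slice must be smaller than total RAM")
    return vram_mb, total_ram_mb - vram_mb

vram, leftover = vram_split(1024, 128)  # 1GB DDR2, 128MB carved off
print(vram, leftover)  # 128 896
```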
Cool, right? So you can virtually emulate an NVIDIA GeForce Ti 4600, NVIDIA GeForce FX 5900 Ultra, ATi Radeon 8500 or ATi Radeon 9800 PRO. Follow the given procedure:
After downloading the software,first install it.
You’ll see something like this:
3D Analyzer - Starting Screen
Next, click on the 'Select' option.
Select FarCry.exe.
Next, select the options as shown. Note that I have selected the VendorID as 4098 and the DeviceID as 20400, as I want to select the ATi Radeon 9800 PRO graphics card (use the index provided in the screen on the right). Click on Run. It doesn't matter if you don't save the batch file. You're ready to roll! Please note that the .exe file is to be selected from the game's main folder, and not the shortcut present on the desktop.
Final Screenshot
I got Oblivion and Neverwinter Nights 2 to work using this. For running Oblivion, however, you need to install a patch called Oldblivion. Click to download and follow the instructions on the Oldblivion site.
For Neverwinter Nights 2,you need to tweak the stuff a bit.
Go to Start -> Run -> regedit. Then press F3 and search for HardwareInformation.MemorySize. Click on the value and modify the binary data to 00 00 00 08. This fools the system into believing that you have 128MB of video RAM. Then select only the 'emulate hw tnl caps' option. Do not select any other. Select ATi Radeon 9800. Click on Run.
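Why those particular bytes? 128MB is 134,217,728 bytes, or 0x08000000, and binary registry data is displayed least-significant byte first, which gives exactly the 00 00 00 08 you type in. A quick check:

```python
import struct

mem_bytes = 128 * 1024 * 1024  # 134217728 == 0x08000000 bytes

# Registry binary data is little-endian: the least significant
# byte comes first, so 0x08000000 reads as 00 00 00 08 in regedit.
packed = struct.pack("<I", mem_bytes)
print(packed.hex(" "))  # 00 00 00 08
```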


Comments please!

The Elder Scrolls V: Skyrim has surpassed 20m copies sold

Skyrim hits 20m copies sold
The Elder Scrolls V: Skyrim has surpassed 20m copies sold across all platforms, Bethesda has announced.
The studio revealed the figure yesterday, which accounts for retail and digital sales of the game and its Legendary Edition, though the company did not say whether the retail portion refers to units shipped or units sold.
Not that the clarification really matters. The 2011 RPG established itself as one of the decade's hottest sellers at launch, moving 3.5m copies in its first two days on sale. Time also notes the game is now one of the top 20 bestselling games of all time.
Skyrim (along with the aforementioned Legendary Edition) is available for PC, PS3, and Xbox 360. Meanwhile, Bethesda is rumoured to be hard at work on Fallout 4.