
Saturday, 20 September 2014

How to download and install SteamOS beta



In this tutorial, I will show you how to download and install SteamOS.

Valve has not yet launched a stable release of SteamOS, so we will install the latest beta version.

First of all, download the SteamOS beta from the official download page on Steam's website:

CLICK HERE to download the latest beta release.

After accepting the terms and policy and clicking the DOWNLOAD STEAMOS BETA button, your download will start.

The download is roughly 2.25 GB.

Save the file to a directory where you can easily access it.

Installing and Customizing SteamOS

What are the SteamOS Hardware Requirements?


Processor: Intel or AMD 64-bit capable processor

Memory: 4GB or more RAM

Hard Drive: 500GB or larger disk

Video Card: NVIDIA graphics card, AMD graphics card (Radeon 8500 and later), or Intel graphics

Additional: UEFI boot support, USB port for installation

How do I install SteamOS?

There are two different installation methods for SteamOS. The recommended method is the Default Installation method, which is a pre-configured image-based install using CloneZilla. The other method uses Debian Installer, which allows for customization after an automated install step. Please choose one of those methods below.
WARNING: Both installation methods will erase all content on the target computer.

Default Installation

You will need to create a SteamOS System Restore USB stick to perform this install. The image provided here requires at least a 1TB disk.
  1. Download the default SteamOS beta installation
  2. Format a 4GB or larger USB stick with the FAT32 filesystem. Use "SYSRESTORE" as the partition name.
  3. Unzip the contents of SteamOSImage.zip to this USB stick to create the System Restore USB stick (a scripted version of this step is sketched after this list).
  4. Put the System Restore USB stick in your target machine. Boot your machine and tell the BIOS to boot off the stick. (usually something like F8, F11 or F12 will bring up the BIOS boot menu).
  5. Make sure you select the UEFI entry; it may look something like "UEFI: Patriot Memory PMAP". If there is no UEFI entry, you may need to enable UEFI support in your BIOS setup.
  6. Select "Restore Entire Disk" from the GRUB menu.
  7. When it is complete it will shutdown. Power on the machine to boot into your freshly re-imaged SteamOS.
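
If you would rather script step 3 than drag and drop files by hand, here is a minimal Python sketch that unpacks the downloaded SteamOSImage.zip onto the already-formatted stick. The C:\Downloads source path and the E: drive letter are assumptions for illustration; substitute the paths on your own machine.

```python
# Minimal sketch: extract the SteamOS restore image onto a FAT32 USB stick.
# Assumptions: the stick is already formatted as FAT32, labelled SYSRESTORE,
# and mounted as drive E: on Windows. Adjust both paths for your machine.
import zipfile
from pathlib import Path

image_zip = Path(r"C:\Downloads\SteamOSImage.zip")  # downloaded restore image (assumed path)
usb_root = Path("E:/")                              # root of the SYSRESTORE stick (assumed drive letter)

with zipfile.ZipFile(image_zip) as archive:
    archive.extractall(usb_root)                    # contents must end up at the root of the stick
    print(f"Extracted {len(archive.namelist())} files to {usb_root}")
```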

Custom Installation

The second method is based on the Debian Installer. It requires additional configuration steps, but works with a smaller disk:
  1. Download the custom SteamOS beta installation
  2. Unzip the SteamOS.zip file to a blank, FAT32-formatted USB stick.
  3. Put the USB stick in your target machine. Boot your machine and tell the BIOS to boot off the stick. (usually something like F8, F11, or F12 will bring up the BIOS boot menu).
  4. Make sure you select the UEFI entry; it may look something like "UEFI: Patriot Memory PMAP". If there is no UEFI entry, you may need to enable UEFI support in your BIOS setup.
  5. Select "Automated install" from the menu.
  6. The rest of the installation is unattended and will repartition the drive and install SteamOS.
  7. After installation is complete, the system will reboot and automatically log on and install Steam. At this point an internet connection is required. If you have an internet connection, Steam will automatically install itself. If you do not have an internet connection (for instance, if you need to connect to a WiFi access point) you will get a popup telling you this. Close the popup and you will get the network configuration UI where you can set up your network. Once you are connected to the internet, close this UI and Steam will install itself.
  8. After Steam finishes installing, your system will automatically reboot and create a backup of the system partition.
  9. When the backup completes, select "reboot" to boot into your freshly installed SteamOS.
AND YOU ARE DONE!

Don't forget to Like, Subscribe and Comment!

Steam Universe





Yeah, I know, you have heard many times that the Steam Universe is expanding, but the question is: how much is it expanding, and when?

Steam's technologies are extremely awesome, and some time ago Valve said it would be giving away a free OS, SteamOS, combining the awesomeness of the Steam gaming client with the power of a Linux backbone. But users are still waiting, anxious to see Steam's expanding universe.


Valve is developing an operating system that will power your living room, with features like gaming and multimedia entertainment.

They also said that they are developing a unique gaming controller. Unlike a PlayStation or Xbox controller, the Steam Controller will be unique, comfortable, and something new! Valve has shared a concept image of the Steam Controller.



The developers say SteamOS will be extremely secure, and it will be awesome!

The users are still waiting, but the company has just launched the beta version of SteamOS, and the following tutorial shows you how to download and install SteamOS (beta).


Have some patience, and you will be informed when the company releases updates on the Steam Universe.

Thanks for reading, and don't forget to like, share, and comment!

Hello World, We Are BACK!!!



Computer Cluster is BACK!

We are glad to inform you that we are back to help you get the most out of the web!
Computer Cluster was shut down over some copyright issues, but that hard time has passed and we are now making a beautiful new start!

Please visit computercluster.blogspot.com daily for awesome content and information!

Like, Subscribe and Comment!

It's awesome to be back with you!

Thursday, 30 January 2014

How to use PSP as a Monitor



This tutorial will show you how to set up your PSP to be used as a second monitor for your PC. This can be helpful in many ways, for example watching a tutorial on the PSP while doing it on your main monitor, or simply moving your media player out of the way.

Step 1: What you'll need

For this little project, you'll need:
- a PSP running a custom firmware above 3.71 [I used 4.01 M33-2]
- PSPDisp by Jochen Schleu
- a USB cable

My PSP is a Phat PSP with 4.01 M33-2. I haven't tested this on a slim or any other firmware, so I'm not sure how well (or not) this will work on other setups.

From the PSPDisp Readme, you'll also need:
- a PC running Windows XP 32-bit (it will not work on 64-bit versions or Vista; it may run under Windows 2000 but this is untested)
- a suitably fast processor, for comparison:
-> 25% CPU load on a single-core Pentium 4 3.06 GHz
-> 40% CPU load on a single-core AMD Athlon 64 3200+ (2.2 GHz)
It should be fine on any modern PC.

Step 2: Setting up PSPDisp


Start by downloading and installing PSPDisp from the provided link.
Run the installer and go to the install directory afterwards (e.g. C:\Program Files\PSPDisp\).

Go into the "bin" folder, and then to psp\PSP\GAME.
Copy the PSPDisp folder to the PSP\GAME folder on your PSP's memory stick.
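
If you prefer to script that copy rather than drag and drop it, here is a minimal Python sketch. The install path and the F: drive letter of the memory stick are assumptions for illustration; adjust them for your setup.

```python
# Minimal sketch: copy the PSPDisp PSP application onto the memory stick.
# Assumptions: PSPDisp was installed to C:\Program Files\PSPDisp and the PSP's
# memory stick shows up as drive F: in USB mode. Adjust both paths as needed.
import shutil
from pathlib import Path

source = Path(r"C:\Program Files\PSPDisp\bin\psp\PSP\GAME\PSPDisp")  # created by the installer
target = Path("F:/PSP/GAME/PSPDisp")                                 # on the PSP's memory stick

shutil.copytree(source, target, dirs_exist_ok=True)  # dirs_exist_ok needs Python 3.8+
print("Copied PSPDisp to", target)
```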

Now, on your PSP, connect the USB cable (if you haven't already done so) and run the PSPDisp PSP application.

Now, you'll get a popup about new hardware, "PSP Type B". When it asks for the driver, tell it to search in "X:\%pspdisp install path%\bin\driver_usb", where "X:\%pspdisp install path%" is wherever you installed PSPDisp.

Step 3: Installing Display Driver



Okay, now you've got nearly all of PSPDisp's components up and running; all you need now is the display driver.

To start, bring up the add new hardware wizard. This wizard will either be in Printers and Other Devices (on Category View), or there'll simply be an icon for it (on Classic View).

Installing Driver Instructions:
1] On the first screen, choose "Yes, I have already connected the hardware".
2] Next, scroll to the bottom and select "Add a new hardware device".
3] Now select "Install the hardware that I manually select from a list".
4] In that list, select "Show all devices".
5] Don't bother searching for it (it's not there). Just click "have disk".
6] Now, browse to X:\%PSPDisp Install Directory%\bin\driver_display.
That's it. Just follow the rest of the instructions the Wizard gives you! Ignore the warning about the driver not being digitally signed.

Step 4: Setting up the Display



Okay now that you have everything you need installed, go ahead and start PSPDisp itself. It's in the \bin\app folder.

It will show up in the system tray as a little picture of a PSP, and to control it just right click the icon. Start off by enabling the display driver. This adds another "Monitor" to the computer, with a resolution of 960x544px.

Now enable PSPDisp to output to your PSP. Hopefully you still have the app running on the PSP. If not, start it now :P

Step 5: Finishing Up



Now you have to position the new desktop.
This can be done by right-clicking your desktop and selecting Properties -> Settings, or you can launch it from PSPDisp's menu.

As you can see in the picture below, I have my new desktop centred under my monitor, but if you want to move your monitor on top or at the sides that'll work too (might be confusing at the sides though :P).

An excellent program for multi-display setups is "UltraMon". It also works like a charm for this setup, so I recommend you try it out. It can extend the taskbar to the second desktop, change the wallpaper of each desktop independently, and more. It's a trial version, though, and costs $39.95 for a licence. There's probably (definitely) a free alternative; if you find one, please comment :)

Step 6: Outro

Well that's it. You now have a multi-desktop setup, with a PSP! :)
If you have anything you want to say, please comment, it would be much appreciated!

Gaming in Nvidia 4K


Nvidia GeForce GTX 780Ti: Gaming in Glorious 4K
Earlier this year, Nvidia dropped a bomb on the world of graphics processing with the Titan, a truly ludicrous powerhouse that cost a whopping $1,000. Now the monstrous Titan is getting (another) "affordable" twin in the form of the GeForce GTX 780 Ti, which Nvidia is calling the best gaming GPU on the planet.

The result is that a pair of GeForce GTX 780 Tis can handle modern games like Batman: Arkham Origins or Assassin's Creed 4: Black Flag at 4K resolution with all the bells and whistles. That's also a trick you could pull off with a pair of Titans (crazy expensive) or a pair of 780s (not quite as good).
The GeForce GTX 780 Ti comes with 3GB of the fastest GDDR5 available on any graphics card anywhere, providing 336GB/sec of peak memory bandwidth. It's also got 2,880 CUDA cores, 25 percent more than the GeForce GTX 780, the previous peak of non-Titan power. And it also comes with GPU Boost 2.0, which helps maximize clock speeds in all scenarios, and support for G-Sync, Nvidia's recently announced tech that aims to end screen tearing forever. *fingers crossed*
Of course, getting a setup you could appreciate that on will cost you a smallish fortune. The GeForce GTX 780 Ti still runs to the pricey, devoted-enthusiast cost of $700, or $1,400 for two. But if you can afford to pick one up (and a 4K display), you probably won't be disappointed.

What is Nvidia PhysX




Comparison of physics levels in Mafia II.
(PC) The top screenshot shows how debris is simulated in Mafia II when PhysX is turned to the highest level in the game settings. The bottom screenshot shows a similar scene with PhysX turned to the lowest level.

PhysX is a proprietary real-time physics engine middleware SDK. It was developed by Ageia following its purchase of ETH Zurich spin-off NovodeX in 2004. Ageia was itself acquired by Nvidia in February 2008.[1]
The term PhysX can also refer to the PPU expansion card designed by Ageia to accelerate PhysX-enabled video games.
Video games supporting hardware acceleration by PhysX can be accelerated by either a PhysX PPU or a CUDA-enabled GeForce GPU (if it has at least 256MB of dedicated VRAM), thus offloading physics calculations from the CPU, allowing it to perform other tasks instead. In theory this should result in a smoother gaming experience and allow additional visual effects.
Middleware physics engines allow game developers to avoid writing their own code to handle the complex physics interactions possible in modern games. PhysX is one of the handful of physics engines used in the large majority of today's games.[2]
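
To give a feel for the kind of per-frame work a physics middleware engine takes off a developer's hands, here is a tiny, self-contained Python sketch of a fixed-timestep rigid-body update with gravity and a ground plane. It is purely illustrative and does not use the actual PhysX SDK or its API; the numbers and class names are assumptions for the example.

```python
# Illustrative only: a toy fixed-timestep physics step of the kind a middleware
# engine such as PhysX performs every frame (this is NOT the PhysX API).
from dataclasses import dataclass

GRAVITY = -9.81   # m/s^2
DT = 1.0 / 60.0   # simulate at 60 steps per second

@dataclass
class Body:
    y: float                  # height above the ground plane (m)
    vy: float = 0.0           # vertical velocity (m/s)
    restitution: float = 0.5  # how much energy survives a bounce

def step(bodies, dt=DT):
    """Integrate velocities and resolve collisions against the ground (y = 0)."""
    for b in bodies:
        b.vy += GRAVITY * dt          # apply gravity
        b.y += b.vy * dt              # integrate position
        if b.y < 0.0:                 # collided with the ground plane
            b.y = 0.0
            b.vy = -b.vy * b.restitution

# Drop a piece of debris from 2 m and simulate one second of motion.
debris = [Body(y=2.0)]
for _ in range(60):
    step(debris)
print(f"height after 1 s: {debris[0].y:.2f} m, velocity: {debris[0].vy:.2f} m/s")
```

A real engine does far more (broad-phase collision, constraints, friction), which is exactly why offloading that work to a PPU or GPU frees up the CPU.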
The PhysX engine and SDK are available for Microsoft Windows, Mac OS X, Linux, PlayStation 3,[3][4] Xbox 360[5] and the Wii.[6] The PhysX SDK is provided to developers for free for both commercial and non-commercial use on Windows. For the Linux, OS X and Android platforms, the PhysX SDK is free for educational and non-commercial use.

What is AMD Mantle

Why AMD developed Mantle

From a game developer’s point of view, creating games for the PC has never been especially efficient. With so many combinations of hardware possible in a PC, it’s not practical to create specialized programming for every possible configuration. What they do instead is write simplified code that gets translated on-the-fly into something the computer can work with.
Just as when two people communicate through a translator, this works, but it isn’t efficient. And it’s the CPU that has to do all this extra work, translating and queuing data for the graphics card to process. PCs are meant to be the ultimate gaming platform — they have the power — but all this translation slows things down, and game developers approached AMD asking for something better.

What Mantle does

Mantle is the harmony of three essential ingredients:
  1. A driver within the AMD Catalyst™ software suite that lets applications speak directly to the Graphics Core Next architecture;
  2. A GPU or APU enabled with the Graphics Core Next architecture;
  3. An application or game written to take advantage of Mantle.
Mantle reduces the CPU’s workload by giving developers a way to talk to the GPU directly with much less translation. With less work for the CPU to do, programmers can squeeze much more performance from a system, delivering the greatest benefits in gaming systems where the CPU can be the bottleneck.
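As a rough, purely illustrative sketch of why "less translation" matters, the toy Python below compares issuing draw commands one at a time through a heavyweight translation function with recording them into a single command list that is validated once. The workloads and costs are made up to show the shape of the effect; this is not the Mantle API or real driver behaviour.

```python
# Toy model of API overhead (illustrative only, not the Mantle API).
# A "thick" driver validates and translates every draw call on the CPU;
# a "thin" path records lightweight commands and validates the batch once.
import time

NUM_DRAW_CALLS = 20_000

def thick_driver_submit(call):
    # Pretend each call needs state validation and translation work on the CPU.
    checksum = 0
    for i in range(200):                 # stand-in for per-call validation cost
        checksum += (call * i) % 7
    return checksum

def thin_driver_record(calls):
    # Record calls almost verbatim, then validate the whole batch in one pass.
    command_list = list(calls)           # cheap per-call recording
    return sum(c % 7 for c in command_list)

start = time.perf_counter()
for c in range(NUM_DRAW_CALLS):
    thick_driver_submit(c)
thick = time.perf_counter() - start

start = time.perf_counter()
thin_driver_record(range(NUM_DRAW_CALLS))
thin = time.perf_counter() - start

print(f"per-call translation: {thick * 1000:.1f} ms, batched recording: {thin * 1000:.1f} ms")
```

The gap between the two timings is the kind of CPU headroom Mantle aims to hand back to the game.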

What it means for gamers

Now that Mantle has freed up some extra CPU capacity, we expect Mantle will lead to better games, and more of them, since Mantle makes game development easier.
That’s not all Mantle will do for gamers. By shifting work to the GPU, a mid-range or older CPU isn’t the same handicap it was before. With Mantle, the GPU becomes the critical part of the system, and GPU upgrades will have a bigger impact than before.

AMD: We're going after Nvidia graphics with Kaveri APU


The war is nothing new, but Kaveri is

To give a taste of its new Kaveri APU's kick, AMD ran a side-by-side Battlefield 4 demo during its opening APU13 keynote last week. The game ran on two machines; frames per second ticked away in the upper left-hand corners as the opening sequence, soundtracked by "Total Eclipse of the Heart", rolled.
Kaveri came away the clear winner, holding nearly double the frame rate and running with hardly a hiccup. The second machine, equipped with an Intel Core i7-4770K CPU and an Nvidia GeForce GT 630 GPU, stuttered and lurched from scene to scene.
It was an effective, visceral demonstration, but the question quickly circulated: why would AMD pit the top-end A10-7850K Kaveri APU against this particular CPU/GPU combo? Adam Kozak, senior product marketing manager at AMD, laid the company's logic out for TechRadar.
"We want to position an A10 versus Intel plus a graphics card," he told us during a post-keynote rendezvous. "Obviously you can go up so high before the graphics card gets faster, and that's why we picked the 630. There is a new 640 we're looking at, and we'll take a look at that as we get closer to launch."
It's not just CPU then that AMD is targeting with its first APU of 2014.
"We want to go after the idea where it makes sense, at least from our perspective, that you don't need to buy a certain graphics card," Kozak said. "In fact, Nvidia probably sells 70-80% of their entire stack at 630 and below. People kind of know that Intel is very weak [with] GPU, so now we're going after something people think is strong."
Adam Kozak
AMD's Adam Kozak talks the Kaveri talk

Kaveri in action, but what about Mantle?

We had caught up with Kozak to see another BF4 demo played on a different desktop - one presumably less meaty than the machine used for the keynote head-to-head. The settings were on medium except for a custom graphics quality setting: ambient occlusion, which controls how shadows are placed on overlapping objects, was turned off.
"The difference between low and medium is huge, and then from medium to high, you see a little bit more details in the soft and shadows," Kozak said of how visible Kaveri's footprint becomes on different settings. "From high to ultra, it's more of a post-processing so you're light rays and everything are kind of blended a little more.
"For me, the biggest jump is from low to medium, and then from there it just gradually looks nicer and nicer, depending on what you've got."
The frame rates hit 39 or so, as they did during the keynote, and the play never lapsed. Granted, this wasn't a particularly action-heavy demo we were being shown - Kozak was really just wandering through a ruined building.
The first Kaveri demo was more graphics intensive, but neither it nor the one Kozak played ran a Mantle-optimized version of BF4. AMD's Mantle API, developed with the help of EA's DICE, is designed to push frame rates higher and improve graphics fidelity. Pair it with Kaveri, and the hope is for near-perfect renderings.
That's the idea, of course, and Kozak for one is keeping his forecasts on a more even keel.
"Personally my expectations are low," he said of a Mantle-plus-Kaveri combo. "But there is an Oxide demo here and they are seeing substantial speed-ups, beyond what anyone internally has guessed at. I'm optimistic it's going to be more than the 5% I'm hoping for and more towards the double digits."
Battlefield 4
Busting through the frame rates
In fact, we're told Mantle is still fairly new for AMD internally, and it's partners like DICE who are seeing frame gaps vaporize.
"What I have heard from DICE is that what [Mantle] does with the discrete card is it equalizes the CPUs," Kozak explained. "It was only a couple of frames faster before because the CPU doesn't really play into things like that, but [Mantle] eliminates any gap. And essentially it does that by allowing the graphics card to do more, so it becomes the bottle neck."
When a Mantle optimization-bearing update arrives for Battlefield 4 in December, we'll have an accurate idea of just how much the API improves graphics performance in the real world.

Masters of productivity

Our meeting with Kozak held more than just a BF4 run through.
He also showed us a JPEG decode accelerator that overrides the usual routines found in Windows and speeds them up with Kaveri. In one thumbnail decode run, performance increased by 110.1%.
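For context, a 110.1% increase in decode performance means throughput is a little more than doubled, so the same batch of thumbnails finishes in under half the time. A quick back-of-the-envelope check, assuming the quoted figure refers to throughput:

```python
# Back-of-the-envelope: what a 110.1% performance increase means for run time.
speedup = 1 + 110.1 / 100          # throughput is 2.101x the baseline
relative_time = 1 / speedup        # new run time as a fraction of the old
print(f"{speedup:.3f}x throughput -> {relative_time:.1%} of the original decode time")
# 2.101x throughput -> 47.6% of the original decode time
```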
When the first Kaveri APU desktops become available January 14, they'll have the decoder built in. More than simply decoding family photos faster, the driver shines a light on an area AMD wants to target with Kaveri.
"Productivity is sort of a new one for us," Kozak told us.
The company plans to have additional productivity compute acceleration examples at launch, and Kozak said that "the idea there is to gain interest, [to] get Microsoft and others aware that you can make things a lot faster on tasks that people still care about in a professional level with these crazy spreadsheets that the normal consumer level may not be much of a big deal."
Render speeds
No more trips to the kitchen while JPEGs decode

Kaveri price and scalability

Not all is known about Kaveri - CES 2014 is the APU's "big coming out party," as we've been told by AMD.
With the first Kaveri desktops due early next year, the all-important price question will be answered in short order. Until then Kozak and AMD are keeping mum on cost, but we suspect Kaveri will be priced in the same range as Richland desktop was when that APU was released in the channel.
Kozak noted, as AMD has, that the company has reversed its normal APU release order and is taking Kaveri first to desktop.
"It's sort of a chicken and egg thing," he said of the decision. "We're really interested in getting Kaveri out there as fast as possible. If it is just the desktop, and not the bread and butter of mobile, it's because we need guys to start programming for it. We give them the fastest implementation and they can start optimizing their code, and obviously from there can start optimizing for lower TDPs."
While scalability with Kaveri is a big selling point, Kozak said not to expect it in something as small as a smartphone anytime soon.
"Right now we're going as low as 15W, which is not a phone, all the way up to the typical desktop," he said. Though Kozak didn't mention it, AMD has stated Kaveri will head to embedded systems and servers as well.
Snipping through Battlefield 4 and JPEGs is all well and good, but when it comes to real-world implementation, CES and the days following are going to be Kaveri's true gauntlet run.
Early showings have been impressive, and Kozak said more work is being done to fine-tune Kaveri.
"I expect this one to get even better for us," he said, referring to the Battlefield 4 desktop demo. "We still have our engineers working with DICE on Kaveri optimizations. That's going on as we sit here, and we still have DICE working with Mantle. There's a two-step prong that's going to make this even better."

Why AMD's Radeon R9 290 is both awesome and awful


Victory snatched from the jaws of defeat. Or is it the other way round? Either way, only AMD could pull it off with such perverted panache.
I speak of the new AMD Radeon R9 290. Yes, specifically the 290, not the 290X. By most metrics, it's far and away, without a shadow of a doubt, the best graphics card you can currently buy. And yet somehow, AMD managed to launch it in a state that left some leading review sites feeling they couldn't recommend it.
Let me re-emphasise that. AMD conspired to create the best graphics card on the market and yet make it sufficiently flawed that some experts advised PC gamers not to buy it.
Let's remind ourselves first of what makes the 290 great. It's not actually the fastest graphics chip in the world. That accolade falls to Nvidia's mighty GeForce GTX 780 Ti. It's not even AMD's fastest. The Radeon R9 290X takes that prize.

Huge performance, plausible price

But what it does do is deliver frame rates that I suspect are largely indistinguishable from those of faster chips in subjective gaming terms. And it does so at a fraction of the price.
As I type these words, an Nvidia GeForce GTX 780 Ti will sock you for about £500. AMD Radeon R9 290Xs start at about £420. But the 290 is yours for just over £300. Nice.
And yet none other than Anandtech had this to say about the 290:
"To get right to the point then, this is one of a handful of cards we've ever had to recommend against."
So, that's one of a handful of cards Anandtech has ever unambiguously recommended against in around a decade of graphics card reviews.
What on earth is wrong with the 290? As it happens, I think Anandtech blundered pretty badly to put the 290 into such undistinguished company. But the 290 is undoubtedly flawed.
There are two closely related problems. The GPU at the heart of the 290 runs very hot, and it sports a very noisy fan. The latter problem, if it is a problem, depends on your point of view. Some won't be too worried about a bit of din when the GPU is under heavy load. To be clear, the card is only noisy when rendering detailed 3D graphics.

Piping hot pixel pumper

The temperature issue is potentially more serious and raises concerns about the long-term reliability of 290 boards. AMD says the running temps are fine, but history shows that computer chips regularly soaking up 90-plus degrees tend to go pop eventually.
The full story is a little more complicated yet and involves some last-minute tweaking of the settings controlling the 290's fan. But the details really don't matter too much. The bottom line is that AMD had managed to launch a card that simultaneously beats all comers while looking seriously flawed, at least to some.
The irony here is that all AMD needed to have done was launch with a quieter fan. The temps could have remained the same and while they would have been remarked upon, I doubt they would have become the main story.
But the hurricane-force fan did a stand-up job of grabbing all the headlines and spoiling the excellent work of the engineers who designed the GPU.
Fortunately, a solution to all this is on its way. Pretty soon, 290 boards with custom cooling will become available and the din will die down, both figuratively and literally. And then the 290 will take its rightful place as the best graphics card in the world.
In the meantime, I can only marvel at how AMD could make such a mess of such an inherently great product.

Tuesday, 28 January 2014

A Short Summary of Intel's Announcements at CES

Computer chip giant Intel unveiled a major new push Monday into wearables and connecting everyday devices as it seeks to leapfrog the competition in mobile computing.
Chief executive Brian Krzanich said Intel would produce, on its own or with partners, a range of products from a health monitor integrated into baby clothes to a heart monitor in earbuds.
Speaking at the opening keynote of the massive Consumer Electronics Show in Las Vegas, Krzanich showed the company's new "personal assistant" dubbed Jarvis, which is Intel's answer to the voice-activated Google Now and Apple's Siri.
Intel will be producing a smartwatch with "geofencing" which allows families to get alerts if children or elderly parents leave a specific geographic area.
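Geofencing boils down to checking whether a reported GPS position has moved outside a chosen radius around a home point. Here is a minimal Python sketch of that check using the haversine formula; the coordinates, radius, and alerting behaviour are illustrative assumptions, not Intel's implementation.

```python
# Illustrative geofence check (not Intel's implementation): alert when a
# reported position falls outside a fixed radius around a home location.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

HOME = (36.1147, -115.1728)     # assumed home point (Las Vegas)
FENCE_RADIUS_KM = 1.0           # assumed allowed radius

def check_position(lat, lon):
    distance = haversine_km(HOME[0], HOME[1], lat, lon)
    if distance > FENCE_RADIUS_KM:
        print(f"ALERT: wearer is {distance:.2f} km from home")
    else:
        print(f"OK: wearer is {distance:.2f} km from home")

check_position(36.1160, -115.1740)   # inside the fence
check_position(36.1699, -115.1398)   # outside the fence (several km away)
```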
The new devices shown to the large CES crowd will all be available this year, Krzanich said, without offering details on pricing or specific partners for the products.
Krzanich said Intel is taking a new approach to wearable computing, seeking to address specific problems with the simplest technology.
He showed a turtle-shaped sensor on baby clothing which can send information to a smart coffee cup about an infant's breathing, temperature and position.
He said the earbuds would enable runners and athletes who already listen to music while exercising to get detailed health information in real time.
"We want to make everything smart. That's what Intel does," he said.
The chief executive, who took the reins at Intel last year, said the new technology all revolves around its new chip called Edison, which he said integrates a full-fledged computer into the size of a memory card.
He said Intel will be partnering with the luxury retailer Barneys New York, the Council of Fashion Designers of America and design house Opening Ceremony to explore and market smart wearable technology.
And Intel will offer $1.3 million in prizes for developers who come up with new ideas for wearable computing, including a first prize of $500,000.
"This will allow creation and innovation to come to life" in wearables, he said.
To address questions about security, Intel will offer its McAfee mobile software free of charge.
"We believe this will allow this ecosystem to flourish."
Intel remains the world's biggest producer of chips for personal computers but has been lagging in the surging mobile marketplace of tablets and smartphones. The new initiative could allow the California firm to get a bigger slice of the mobile market's newest iterations.
Intel also said its new chips would allow for a "dual boot" that enables computer makers to include Microsoft Windows and Google Android on a single device, with users able to change with the switch of a button.
"There are times you want Windows, there are times you want Android," he said. "You don't have to make a choice, you can have both."
Intel also unveiled a new 3D camera called RealSense which can be integrated into tablets and enable users to produce and manipulate three-dimensional images.
This can for example allow a user to design a toy or other object and then send it to a 3D printer. Intel produced chocolate bars using the technology which were handed out to the attendees at CES.
Mooly Eden, senior vice president for perceptual computing, said Intel is moving to a more intuitive kind of computing.
"We'll make human-computer interaction natural, intuitive, immersive. We'll make it more human," Eden said.
"We finally removed the fiction from science fiction and made it real."
Intel will implement a new policy in 2014 ending the use of "conflict minerals," from the Democratic Republic of Congo, as part of an effort to reduce the money flowing from the technology sector to those committing atrocities, Krzanich said.
"We are inviting the entire industry to join us in this effort," he said. Stay in touch with the latest from CES 2014, via our CES page.

Source: NDTV

Nvidia 4K Gameplay: How Fast Can You Drive? How About 1.5 Billion Pixels a Second


Acres of lifelike beaches, delectable food and amazing cityscapes seem to stretch across the many booths featuring 4K displays at this week's International Consumer Electronics Show (CES).
Our booth has some amazing scenery, too: A twisty, photo-realistic track displayed with stunning clarity on a trio of 65-inch 4K Panasonic displays.
Here’s what makes ours different: You can grab a controller and blast through the landscape like a lunatic, as stereo speakers deafen those around you with the throaty roar of a neon green Pagani two-seater.
This is 4K you can pick up and play. At a cost of just under $30,000.
“Amazing,” said Carlos Cuello, a Colombian journalist who writes for Enter.Co, the nation’s largest tech publication, as he watched our live demo. “This is another dimension.”
4K, of course, is the industry term for the latest generation of ultra-high-resolution monitors. It refers to their ability to cram 4,000 or more pixels on a single horizontal line.
To the human eye, it brings an enormous step up from the high-definition displays that can now be purchased for a few hundred bucks at the nearest electronics store.
Creating video that takes advantage of what these displays can do isn’t easy. Videographers have to upgrade to a new generation of cameras and video-editing workstations.
Even harder: creating interactive experiences that can wring the most out of all those pixels. This is something that’s simply beyond what the latest generation of game consoles can do.
Brought to you by GeForce GTX Titan: a 12K display you can play.
But it makes 4K content – particularly on multiple screens – the perfect demo of what’s ahead.
“I’m looking at 4K as the panels get cheaper, and more content appears,” said Patrick Danford, a PC gamer who works at Walt Disney Studios, after taking a close look at the water-cooled innards of the PC powering our demo through a clear panel on the side of the machine’s case. “This is pretty cool.”
To build our demo, we started with one of Origin’s ferocious Genesis PCs. We then equipped it with four of our GeForce GTX Titan graphics cards yoked together with our SLI technology.
We then loaded it up with a pre-release version of Project CARS (Community Assisted Racing Simulator), a driving simulator being developed by Slightly Mad Studios. Then, we cranked up the settings to run the game at a ridiculously smooth 60 frames per second.
1.5 Billion Pixels Per Second
Hook up all this to a trio of 65-inch displays and you’re driving more than 1.5 billion pixels per second into an interactive 12K display.
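The 1.5 billion figure checks out if you assume each of the three panels is a 3840x2160 (4K UHD) display refreshed at the demo's 60 frames per second; the per-panel resolution is an assumption, but it is the standard one for 65-inch 4K TVs.

```python
# Quick sanity check of the "1.5 billion pixels per second" claim.
# Assumption: each 65-inch panel is 3840 x 2160 (4K UHD) at 60 frames per second.
panels = 3
pixels_per_frame = 3840 * 2160 * panels      # ~24.9 million pixels across the "12K" surface
pixels_per_second = pixels_per_frame * 60    # at 60 fps
print(f"{pixels_per_second / 1e9:.2f} billion pixels per second")  # ~1.49 billion
```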
And visitors to our booth are having a blast with it.
“That was a lot of fun,” said Stanley, whose home PC is equipped with an NVIDIA GeForce GTX 650 Ti, as he nodded towards our enormous screens. “If only I could fit those into my apartment.”
Source: http://blogs.nvidia.com/blog/2014/01/07/4k-gameplay-how-fast-can-you-drive-how-about-1-5-billion-pixels-a-second/

Three Huge Applications Accelerated by the World's Most Powerful Graphics Chip





Weather forecasting. Tracking billions of tweets. Mitigating big-time financial risks. These burly applications gobble up computing power like an NFL linebacker plowing through a box of Ding Dongs.
So when we tore the lid off the supercharged Tesla K40 GPU accelerator – the highest-performance GPU ever built – our customers quickly put it to work tackling some of their toughest problems. And their initial results show this GPU is even better than advertised.

Learn about the Tesla K40 and more at a free webinar we’re hosting this Thursday at 9am PST: https://www2.gotomeeting.com/register/841521626

Improving Weather Forecasting
Weather prediction is one of the toughest computing jobs around.
The U.S. National Weather Service’s supercomputers run an application called WRF to forecast weather conditions throughout the country.
Dr. Bormin Huang, from the University of Wisconsin-Madison, uses K40 GPUs to accelerate WRF to deliver more accurate, higher resolution forecasts.
His latest findings show that, at baseline, the Tesla K40 runs WRF nearly 30 percent faster than the Tesla K20.
But when he turns on the new GPU Boost feature, which converts power headroom into a user-controlled performance boost, it’s 46 percent faster.
Accelerating WRF allows him to run more detailed, higher resolution weather models for extended periods of time, leading to more precise and accurate forecasting.
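To put those two percentages side by side, here is the simple arithmetic, taking the Tesla K20 run time as the baseline and reading "X percent faster" as an X percent throughput increase (that reading is an assumption).

```python
# How the two WRF speedups relate, taking the Tesla K20 run time as 1.0.
# Assumption: "X percent faster" means an X percent throughput increase.
k20_time = 1.0
k40_base_time = k20_time / 1.30      # ~0.77: K40 at stock clocks
k40_boost_time = k20_time / 1.46     # ~0.68: K40 with GPU Boost enabled
extra_gain = 1 - k40_boost_time / k40_base_time
print(f"GPU Boost shaves a further {extra_gain:.0%} off the K40's run time")  # ~11%
```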
Tracking Twitter Trends – Analyzing a Billion Tweets in Real Time
A staggering 2.5 exabytes of data is created every day on the internet (source: http://en.wikipedia.org/wiki/Big_data). Top social media sites contribute to this massive volume of information, and can be a treasure trove of info on the latest trends.
But in the real-time world of Twitter, it’s a major challenge to extract useful data, and then analyze it before today’s trends become yesterday’s news. 
Enter Map-D, a start-up out of the Massachusetts Institute of Technology.
Without a budget to build a large CPU server farm, Map-D founders turned to GPUs to create a cutting-edge visualization system that can track more than a billion tweets worldwide and deliver real-time analysis of virtually any Twitter query one can think of – an impressive database application hosted on a single server equipped with eight Tesla K40 GPUs.
With an aggregate GPU memory of 96GB, it’s large enough to store 1 billion tweets in memory for ultra-fast analysis. And with GPUs, it’s over 70 times faster than with CPUs alone, or fast enough to let you know what’s happening in the Twitterverse at any given moment in time.
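The in-memory claim is easy to sanity-check: eight K40s at 12GB apiece give the quoted 96GB aggregate, which works out to roughly 100 bytes of GPU memory per tweet, enough for a compact, columnar encoding of each one. The per-tweet budget below is a back-of-the-envelope assumption, not Map-D's actual schema.

```python
# Sanity check on holding a billion tweets in aggregate GPU memory.
gpus = 8
memory_per_gpu_gb = 12                      # each Tesla K40 carries 12GB of GDDR5
tweets = 1_000_000_000

total_bytes = gpus * memory_per_gpu_gb * 1024**3
print(f"aggregate memory: {gpus * memory_per_gpu_gb} GB")        # 96 GB
print(f"budget per tweet: {total_bytes / tweets:.0f} bytes")     # ~103 bytes
```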
Want to know what people thought of the Oscar nominees? Curious who are the fan favorites to win at the 2014 Sochi Olympics?
Yep, they can track all that. In real time, even as it changes from moment to moment.
Reducing Financial Risk
Monte Carlo algorithms – used to analyze the valuation of complex financial instruments – are among the most important in the financial services industry.
Software vendor Xcelerit uses GPU accelerators to deliver unprecedented performance for applications built with Monte Carlo algorithms.
Once again, the Tesla K40 dramatically accelerates the application – compared to the Tesla K20X, it’s consistently 20-30 percent faster.
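For readers unfamiliar with the technique, the sketch below is a minimal, CPU-only Monte Carlo valuation of a European call option in Python with NumPy. It is meant only to show the embarrassingly parallel structure that maps so well onto GPUs; it is not Xcelerit's code or a CUDA implementation, and the parameters are arbitrary.

```python
# Minimal Monte Carlo pricing of a European call option (illustrative only).
# Each simulated path is independent, which is why this workload maps so
# naturally onto thousands of GPU threads.
import numpy as np

def monte_carlo_call_price(spot, strike, rate, vol, maturity, n_paths=1_000_000, seed=42):
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)                     # one normal draw per path
    # Terminal price under geometric Brownian motion.
    terminal = spot * np.exp((rate - 0.5 * vol**2) * maturity + vol * np.sqrt(maturity) * z)
    payoff = np.maximum(terminal - strike, 0.0)          # call payoff at maturity
    return np.exp(-rate * maturity) * payoff.mean()      # discounted expectation

price = monte_carlo_call_price(spot=100.0, strike=105.0, rate=0.01, vol=0.2, maturity=1.0)
print(f"Estimated call price: {price:.3f}")
```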
Test Drive a Tesla K40 Yourself
We’ve made it easy – and free – for you to test drive a K40. Simply register for the test drive program at www.nvidia.com/GPUTestDrive.
The site is pre-loaded with a number of different applications – from AMBER to GROMACS – allowing you to see how the Tesla K40 can supercharge your work.
Source: http://blogs.nvidia.com/blog/2014/01/21/three-huge-applications-accelerated-by-the-worlds-most-powerful-graphics-chip/