Quote:
Originally Posted by qasdfdsaq
Maybe not in the bottom end like bog standard 1080p with a midrange card, but the whole point of PCs is that they're extensible.
Play a game on triple 4K displays and then come back saying there's not a huge amount in it! Granted, most people can't afford that, but people who could afford a five times faster console than a PS4 can't buy one, because no such option exists.
|
The problem with such a high-end rig is that few (if any) games are likely to make full use of it, purely because it's cheaper to develop for the minimum-spec platform and then share whatever assets you can, rather than have several teams working on different sets of assets at different resolutions.
Quote:
Originally Posted by Qtx
Consoles came out many years before that. The Atari 2600 was released in 1977 and that was a second-generation gaming console. The Nintendo NES console was in the shops in 1986 and the Sega Megadrive in 1990. Then we had things like the Sega Saturn, original PlayStation and Nintendo 64 around 1995.
Being a similar age, I had a Commodore VIC-20, a Plus/4 and a C64 before having an Amiga, but I'm so glad it came along, even if the 1MB memory upgrade was the size of a brick. The Atari ST and Amiga fanboy wars were the first I remember.
At one point in time PCs outperformed consoles quite a bit, but these days there's not a huge amount in it, as you say. Not a significant difference anyway when it comes down to how the game looks and plays. Sometimes games are good looking but the actual gameplay sucks. Look back at how crappy some of the early console or computer games looked, yet they kept us hooked and playing for ages.
Now that TVs are bigger and some people have decent home cinema systems, it's preferable to play on the big screen in the comfort of the sofa rather than sitting at a PC with a smaller screen. The Steam box makes sense, but if the hardware can't be updated then it essentially becomes a console anyway. Just one that runs PC games.
|
If you look at the computing market as a whole, it's going in circles a little. Back in the 60s, 70s and early 80s, most companies that had any kind of computer had a mainframe and a load of dumb terminals. These got replaced first with personal computers (and I mean personal computers in the general sense, although obviously the IBM PC compatibles came to dominate), then PCs with networking. These slowly got more and more powerful until the average PC was far more powerful than the average user needed. Some PCs (particularly in commercial use) have since been replaced with virtual PCs running on a cloud server, essentially turning the user's own hardware into a dumb terminal and offloading the processing to a remote server, which is exactly what a mainframe did in the 60s, 70s and early 80s. For things like web surfing, email and instant messaging, people are increasingly turning to tablets and smartphones.
I think that in a few years people will become disillusioned with cloud services (which can fail in so many more spectacular ways than PCs, purely because there is more that can go wrong) and with viewing the internet on their phone/watch/tablet or TV, and will start to come back to some form of desktop or tower computer.
It's the same with consoles. First we had the Atari 2600 and Intellivision. Then that market crashed and we got the first 8-bit home computers. These were gradually replaced by 16-bit computers (Amiga, ST etc.), which ultimately lost out to the fifth generation of consoles (particularly the PlayStation and, to some extent, the Saturn). Then the PC gaming market came of age and has endured for a long time, despite stiff competition from the Xbox 360, PS3 and Wii. This isn't a complete history of consoles or gaming by any stretch of the imagination, but it's intended to illustrate that I believe the gaming market goes in cycles, alternating between consoles and computers.