
The future of video games



Beefy
October 29th, 2014, 7:31 AM
I've a vague recollection that we did a similar thread a few years ago but can't find it now. Either way I'd be interested to hear opinions from people now as to what games we're likely to be seeing in the next 10-20 years.

I remember when the Amiga came out I thought it wouldn't get any better than that. Then playing Crazy Taxi or something similar on the Dreamcast and being blown away. Surely we're getting towards the point now where graphically the step up between future generations won't be so profound, but it'll be other non-graphical features which will make the most of new technology (I think we're seeing that with the PS4 and XBOne at the moment).

How far can they take that?

As games get bigger and look much more beautiful, the time to make them gets longer and longer and the costs go up and up. Have we peaked with the size of our game worlds with GTA V?

This generation we seem to be seeing the rise of indie games, backed by the likes of Kickstarter and filling the space that a number of larger developers and studios have disappeared from. At the same time we're seeing more and more AAA franchises printing money by churning out regular updates of established games (FIFA, COD).

Is this the last generation to use physical media? Will this be the last console generation? Will the likes of Oculus Rift finally see virtual reality take hold in the way that Tomorrow's World promised it would 25 years ago?

I don't know the answers.

Simon
October 29th, 2014, 8:00 AM
Graphically, I think it's pretty simple - the next noteworthy checkpoint is entirely lifelike graphics. There are still steps beyond that as we move towards a graphical hyper-reality, where you have completely realistic graphics coupled with an ability to create scenarios that don't exist in real life, limited only by the designers' imaginations. But in the same way that the first 3D games or arguably HD graphics were the last real game-changer, I think the next one is reaching a point where graphics and animations are indistinguishable from real life.

On a similar topic, I think we are getting to a point where the idea of the Uncanny Valley becomes a legitimate and relevant issue. Not exactly the same thing, but I was buying a new TV the other day and looked at an Ultra HD set, and experienced a weird feeling that I associate with the Uncanny Valley theory - being able to see a computerised image in such astonishing clarity felt uncomfortable in a way I can't really explain, as if the lines between reality and television were no longer there.

With regard to gameplay, there's no real limit to what can be achieved other than imagination. Things like virtual reality and the recent attempts at multi-screen activity show that people are unwilling to sacrifice gameplay and immersion for comfort and ease of use, but the myriad ways in which you can enhance gameplay are pretty much limitless, except for simulation games, which are limited by their intent to match real life. For example, FIFA can only ever hope to match real football, albeit a boiled-down version emphasising the most exciting moments.

If you take FIFA (or sports games in general) as a case study, the emphasis will continue to be on allowing full immersion and simulation. In the early 90s, every team was the same bar the kit. The next generation, teams would vary in quality but the individual players in each team were the same level. The following generation you had individual player qualities, and now we've reached a point where team members have not only individual abilities but tendencies and behaviours. I expect this to continue, with AI evolving to the point where each team member can react in real time to the stimuli around them (possibly along the same lines as Football Manager, with players having not only ability stats but mental stats that affect their behaviour depending on the way the game is going). Essentially I think in simulation games, gameplay evolution will emphasise AI on an individual basis.
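
To make that last idea a bit more concrete, here's a minimal sketch of what "ability stats plus mental stats" might look like in code. It's purely illustrative - the attributes, numbers and formula are invented for the example, not taken from FIFA or Football Manager.

```python
# Purely illustrative sketch of per-player AI state reacting to the match
# situation. All names and numbers here are invented for the example.
from dataclasses import dataclass


@dataclass
class Player:
    name: str
    passing: float    # fixed ability stat, 0-1
    composure: float  # mental stat, 0-1: how well they cope under pressure


def pass_success_chance(player: Player, minutes_left: int, goal_deficit: int) -> float:
    """Blend a fixed ability stat with a mental stat that reacts to the game state."""
    # Pressure rises when the team is behind and time is running out.
    pressure = max(0, goal_deficit) * (1.0 - minutes_left / 90.0)
    # Low composure makes the player worse as the pressure builds.
    penalty = pressure * (1.0 - player.composure) * 0.3
    return max(0.05, player.passing - penalty)


striker = Player("Hypothetical Striker", passing=0.82, composure=0.55)
print(pass_success_chance(striker, minutes_left=10, goal_deficit=1))  # noticeably below 0.82
```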

On a more general basis, I wouldn't be surprised if we end up with a situation where your avatar exists in a built-in game world (possibly mapped onto your part of the real-life globe, just an idea) and you load games within that game world. For example, imagine if you turn your machine on, are automatically loaded into a sandbox map, and can travel to various parts of the map to load up whatever game you want to play. That kind of interactivity between games might be a possibility - I'm just thinking aloud here really. It might be difficult to implement, but you already have GTA creating a less detailed version of that by letting you play mini-games within the GTA world; in the future it might be possible to seamlessly travel to the football stadium to play FIFA, the jungle/desert to play Call of Duty, etc. There are a million holes you can find in that idea I'm sure, but as a starting point for a more interactive gaming world I don't think it's a bad one.

Beefy
October 29th, 2014, 8:25 AM
Re the last point, isn't that what Sony hoped to move towards with PlayStation Home at the start of the PS3's life? You'd have your own space in a pre-built virtual world and could move your avatar around to communicate with other people, visit the PS Store, watch films, etc. I'm sure they were planning to use it as a way to get into games and switch between them. It didn't work though and was fairly quickly dropped.

I think the way that PS Move and Kinect have not taken off has shown that ultimately some of the cornerstones of how we play are not going to change very much. The controller is a fundamental way that we interact with these machines, and whilst voice commands and recognition of hand movements are nice gimmicks, ultimately I don't see anything replacing it as the primary means of control in most games.

Re photo-realistic graphics, is that actually likely to ever happen? Surely there's a limit to how realistic graphics can get, certainly in any game which is remotely non-linear? Again, though, I did think we were approaching that limit 20 years ago.

Simon
October 29th, 2014, 8:31 AM
I don't see why it's impossible... I suppose the advent of HD has made it more difficult, in that imperfections can be more easily spotted, but the gap between current-day graphics and graphics so lifelike as to be indistinguishable from real life to the human eye is far smaller than the gap between current-day graphics and graphics from even ten years ago, let alone twenty or thirty.

Like you, I remember being a young kid and thinking games were impossibly realistic - I specifically remember being about eight or nine and telling my stepdad he had to come and see my game of FIFA on the SNES as it was just like watching real football.

http://www.consoleclassix.com/info_img/FIFA_International_Soccer_SNES_ScreenShot2.jpg

The Rogerer
October 29th, 2014, 8:39 AM
The problem with realistic graphics is that you hit an artistic wall. Also, CGI in films can vary between seamless and completely unconvincing, and film studios can use massive server farms to spend hours rendering a few frames; a console has to be an affordable home appliance knocking out 30 or 60 frames a second. The biggest things that always make games look better are lighting and shadows. They don't make the objects in the world any more detailed, but when you see convincing lighting, it tricks your brain into accepting the scene as real much more easily.
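
To illustrate what I mean by lighting doing the heavy lifting, here's a toy version of the kind of per-pixel sum a lighting model performs - a bare-bones Lambert diffuse term with a shadow factor. It's not from any real engine; the point is just that the geometry never changes, only the shading does.

```python
# Toy Lambert diffuse shading with a shadow factor. The surface itself stays
# identical; only the lighting calculation changes how "real" it reads.
import math


def normalise(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)


def shade(surface_normal, light_dir, base_colour, shadow=1.0):
    """Brightness depends on the angle between surface and light; shadow just scales it down."""
    n = normalise(surface_normal)
    l = normalise(light_dir)
    intensity = max(0.0, sum(a * b for a, b in zip(n, l))) * shadow
    return tuple(round(c * intensity, 3) for c in base_colour)


# Same surface point, lit directly vs mostly in shadow.
print(shade((0, 1, 0), (1, 1, 0), (0.8, 0.2, 0.2)))
print(shade((0, 1, 0), (1, 1, 0), (0.8, 0.2, 0.2), shadow=0.2))
```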

Talking about FIFA, the business model of games is what's shaping the biggest change. It's good to see that the companies are being far more receptive to independent development, as this is what is going to keep games going. The middle fell out of the industry a few years ago, so now we have smaller studios able to take risks and make surprising new things, while the bigger companies are locked into a massive arms race.

The FIFA Ultimate Team thing made big changes. I don't even know what it is - I assume it's like a CCG concept where you buy packs and get a random selection of footballers that you can use in a team? They hit on the same thing that mobile games and others went on to do: a way to tap into the mind's weakness for gambling and anticipating rewards, and a licence to print money. This sort of thing hasn't developed as far as I thought it might, but it certainly did influence design decisions, and a number of the early Xbox One release games were built around microtransactions, which just seemed to rub people up the wrong way. Mobile/Facebook games are the most evil end of this spectrum, as they are literally engineered with precision to stick their hooks into people. Reading about these things is fascinating, because it's an exercise in psychology about why a dog for Pet Store Simulator should cost 7 gold coins instead of 8.

It's not so bad when that genre of game was always meant to go that way, but when a bigger, more traditional game starts to compromise its design to put hooks into you, it's a problem. Not that I'm opposed to people getting more money out of games - the value you get out of a Call of Duty or sports game is often insane compared to a 6-hour story campaign.

Fro
October 29th, 2014, 8:49 AM
http://m.quickmeme.com/img/0c/0cf3a81625675211b4f04b9259adbfe1c1e575478df972f0a7 e3043b16948beb.jpg

Jez
October 29th, 2014, 10:42 AM
I think that, as impressive and fun as virtual reality technology is, it will be too niche for at least another decade, and even beyond that not everyone will buy into it due to fears of damage to their eyesight or motion sickness. Like motion gaming before it, I also think it will be a bit of a fad that becomes incredibly popular at some point but dies out within about 5 years, after the limitations are realised and shovelware ruins the novelty. As much as I'd love to see virtual reality be a total game changer, I really don't think it will be.

I definitely think there will be at least one more generation of home consoles, because as promising as cloud gaming is, there are huge numbers of people who don't have access to broadband, and even if they did, the technology on the server end won't be perfect for another few years. After cloud gaming does become the norm (and I do think it will), I can see a Netflix-like service becoming very popular, one which guarantees studios a subscription fee and provides access to a big catalogue of big and small games which aren't brand new but are well reviewed. Perhaps Sony and Microsoft will be the ones running these services, similar to how Netflix and Amazon are the big VOD services now.

*Edit* Forgot to mention: I think when cloud gaming does become the norm, consoles will look a bit like the Apple TV or Fire TV do now - a small box that sits discreetly under the TV and requires very little power. It will provide access to all the VOD services just like current consoles.

As much as it pains me to say it, I think Nintendo will cease to be a big player in the industry within the next decade. The 3DS and especially the Wii U have shown they really don't have a clue what the market wants, and as much as ignoring trends worked for the Wii and DS, it hasn't worked this time. I don't know if they'll go the way of Sega and start developing games for other consoles, or whether they'll license their characters to other studios, but I can see the Big 3 becoming the Big 2 before too long.

My final prediction relates to mobile gaming, and this one is probably more of a safe bet since it has started already: sales of most games will flatline, which will discourage studios from developing anything but the most basic of games. Games like Angry Birds will always make a profit since they cost so little to develop, but studios will take fewer risks with big-budget games and ports, and indie studios will start releasing on PS4, XB1 and PC first before considering an iOS or Android version. This will ultimately result in a steadier and more stable platform for mobile games, since games will get more attention when they are released. What this means for the ever-increasing graphics capabilities of phones I do not know - perhaps that will be less of a focus for phone manufacturers in the future, which would allow them to concentrate on features that users actually want, like longer battery life.

El Capitano Gatisto
October 29th, 2014, 10:49 AM
My problem with big open-world games and ever more realistic graphics is that they just make game issues more incongruous - looped behaviour and dialogue, backgrounds that cannot be interacted with because of invisible walls, flat backdrops, unrealistic stagger mechanics, etc. These things aren't as much of a problem in contained environments. Some of them are just gamey features that will never go away, but they are exacerbated by more open or ambitious game worlds.

The Rogerer
October 29th, 2014, 11:01 AM
That's the problem I talked about with Uncharted 2 and The Last of Us. Bioshock Infinite coming out last year as well made it feel like overblown cinematic-style stories had reached a peak and couldn't go any further without just looking silly.

I see a trend coming to counter this, where games are getting more overtly 'gamey'. The indie movement is massively into this, and Destiny still has the Bungie po-faced pomp, but it openly admits at every step that it's a video game and embraces the stupid things about the medium. Demon's/Dark Souls becoming popular was influential in this: the experience of those games is largely about dying a lot, so they wrote a story that acknowledges this rather than ignoring it.

RuneEdge
October 29th, 2014, 12:50 PM
This is a topic I've wanted to discuss for a while but kept forgetting to make the thread. Props to Beefs for bringing it up. :yes:

When it comes to visuals, it seems to me that photo-realistic graphics aren't going to happen. I mean, CGI movies can't achieve truly photo-realistic visuals, and they're rendered on supercomputers that don't have to dedicate processing power to gameplay mechanics. The real advancement seems to be in two areas: resolution and frame rate.
Last-gen games were mostly rendered in 720p. In this generation, we're getting more 1080p games. If you take a look at the screen comparison below, you can see that the visuals look pretty much the same in terms of "how realistic it looks". The difference is merely the resolution and the extra detail that the PS4 and XB1's hardware can afford. Last gen doesn't necessarily look any less realistic.
http://assets1.ignimgs.com/vid/thumbnails/user/2014/02/19/MetalGearComparisonThumb.jpg
It's almost like the difference in clarity you might get in everyday life when you view things with and then without glasses.

So right now we have 1080p. The next logical step will probably be 4K resolution, then 8K, and so on. Apple and Dell have already released 5K displays, and 8K displays were shown off at tech conventions in Japan a couple of years back. Once those are out, the gaming hardware will follow to keep up. It's a similar story with mobile phone displays and the increases in resolution they're getting. I suppose TVs will eventually reach the point where they almost have a "retina" display, and the pixel density will increase to the point where your TV set looks like a window into another world.
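
To put rough numbers on how big those jumps are, here's a quick back-of-envelope pixel-count comparison (standard resolution figures, nothing assumed beyond that):

```python
# Pixel counts for the resolutions mentioned above, relative to 1080p.
resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} megapixels ({pixels / base:.1f}x 1080p)")
# 8K works out at sixteen times the pixels of 1080p - which is why the
# hardware tends to lag behind the displays.
```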

Kinda like this 8k TV in the video below.

http://www.youtube.com/watch?v=9U7e_quvkPQ
Notice how the extremely high pixel density almost makes the bezels seem like a window frame.

Now for frame rates: console games have always lagged behind the PC master race here. Most games on consoles output 30fps, mainly due to a lack of horsepower in the CPU. A good example of this can be seen in FIFA, for those who have that game on next-gen consoles. Notice the frame rate when you're playing a match (which should look normal to you), and then notice how much smoother things look in the post-match replays. The replays are running at a higher frame rate (possibly 60fps).
PC gamers get to experience this in pretty much all of their games. Take a look at this video of Crysis 3 on PC (you'll need to follow the instructions at the start of the video to get 60fps).
The smoother gameplay seen there is what console gaming will, or should, be aiming for going forward.

https://www.youtube.com/watch?v=LIji4bhoo-s
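
For anyone wondering what the 30fps vs 60fps difference actually means for the hardware, it comes down to the time budget per frame - everything (game logic, AI, physics, rendering) has to fit inside that window. Trivial arithmetic, but it's the constraint everything else hangs off:

```python
# Per-frame time budget at common target frame rates.
for fps in (30, 60, 120):
    budget_ms = 1000.0 / fps
    print(f"{fps} fps -> {budget_ms:.1f} ms to do everything for one frame")
```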

Beefy
October 29th, 2014, 1:04 PM
Resolution is a hot potato for me right now... I've loved my 4th-gen iPad these last 18 months and always thought that the Retina display was unbeatable. I got a Sony Xperia Z3 on Monday though, and everything seems much clearer. I'm struggling to look at things on the iPad now :(

That has nothing to do with this topic.

Bill Casey
October 29th, 2014, 2:36 PM
Anita Sarkeesian just wrote a New York Times piece on the future of video games...

http://www.nytimes.com/2014/10/29/opinion/anita-sarkeesian-on-video-games-great-future.html?_r=0


It’s Game Over for ‘Gamers’
Anita Sarkeesian on Video Games’ Great Future

By ANITA SARKEESIAN
OCT. 28, 2014

SAN FRANCISCO — I remember, when I was a kid, desperately trying to persuade my mom and dad to buy me a Game Boy. They were very reluctant. The conventional wisdom of the early ’90s said that video games would rot kids’ brains, and as immigrants who came to North America from Iraq to provide a better life for me and my sister, my parents bought into that myth.

But there was another, more pernicious reason my mother questioned my interest: She thought it was a toy for boys. And could I really blame her? It was right there in the name: Game Boy.

I persisted, however, and after some months of campaigning finally convinced my parents that Nintendo’s hand-held gaming device was, in fact, appropriate for their little girl.

This was a story I was planning to share a couple of weeks ago at Utah State University. Unfortunately, I was not able to give my scheduled lecture there. The school received emailed threats to carry out “the deadliest school shooting in American history” if I were allowed to speak on campus. When the Utah campus police said they could not search attendees for firearms, citing the state’s concealed carry laws, I felt forced to cancel the event.

This wasn’t the first time my life had been threatened over video games. To parts of the gaming community, I have become something of a folk demon. My nonprofit organization, Feminist Frequency, creates educational videos, available on YouTube, that deconstruct representations of women in popular culture. Recently, I’ve focused on the negative, often sexist, ways in which women are portrayed in games. For this, I have been harassed and threatened for more than two years.

My own contentious relationship with gaming continued through high school and college: I still enjoyed playing games from time to time, but I always found myself pushed away by the sexism that permeated gaming culture. There were constant reminders that I didn’t really belong.

As a kid, I didn’t understand that this feeling of alienation wasn’t unique to me, but was part of a systemic problem. Traditionally, advertisements for mainstream games were almost exclusively aimed at men and boys. When women and girls appeared, typically it was either as eye candy or as annoying girlfriends.

The games often reinforce a similar message, overwhelmingly casting men as heroes and relegating women to the roles of damsels, victims or hyper-sexualized playthings. The notion that gaming was not for women rippled out into society, until we heard it not just from the games industry, but from our families, teachers and friends. As a consequence, I, like many women, had a complicated, love-hate relationship with gaming culture.

In 2006, I was drawn back into video games when Nintendo introduced a new system with intuitive motion controls and a quirky name, Wii. Nintendo projected the message that this new console was for everyone. Commercials featuring the tagline “Wii would like to play” showed families and friends of all ages. Nintendo’s console may not have been as technologically splashy as that of its Sony and Microsoft competitors, but it was deliberately designed and marketed to appeal to a wider audience — especially women and girls.

The Wii reignited my interest in gaming, offering play experiences I found engaging and rewarding, like Mario Kart, de Blob and The Beatles: Rockband. From there, I immersed myself in zany PC games like Plants vs. Zombies, World of Goo and Spore, and eventually became a fan of mainstream first-person titles like Mirror’s Edge, Portal and Half-Life 2.

Even though I was playing lots of games, I still didn’t call myself a “gamer” because I had associated that term with the games I wasn’t playing — instead of all the ones I was playing. This was largely because I’d bought into the myth that to be a “real gamer,” you had to be playing testosterone-infused blockbuster franchises like Grand Theft Auto, God of War or Call of Duty.

The Wii helped pave the way for the current explosion of popular indie, mobile and experimental titles — everything from serious, text-based games about mental illness to addictive mobile games about multiples of three; dance games like Dance Central, physics-based games like Angry Birds, artistic games like Monument Valley and immersive story-exploration games like Gone Home. Many offer an accessible learning curve or simple controls, and can be played right on your phone, making gaming available to new and diverse audiences.

Instead of celebrating the expansion of the industry, though, some who self-identify as “hard-core gamers” attack these types of interactive experiences as too casual, too easy, too feminine and therefore “not real games.” Players from marginalized groups are also targeted because they’re seen as outsiders, invading a sacred boys’ club.

The time for invisible boundaries that guard the “purity” of gaming as a niche subculture is over. The violent macho power fantasy will no longer define what gaming is all about.

Those who police the borders of our hobby, the ones who try to shame and threaten women like me into silence, have already lost. The new reality is that video games are maturing, evolving and becoming more diverse.

Those of us who critique the industry are simply saying that games matter. We know games can tell different, broader stories, be quirky and emotional, and give us more ways to win and have fun.

As others have recently suggested, the term “gamer” is no longer useful as an identity because games are for everyone. These days, even my mom spends an inordinate amount of time gaming on her iPad. So I’ll take a cue from my younger self and say I don’t care about being a “gamer,” but I sure do love video games.

Anita Sarkeesian is a media critic and the executive director of Feminist Frequency.

The Rogerer
October 29th, 2014, 2:52 PM
I came to the same feeling quite a few years ago. That's not an 'I thought of it first' brag, just that pigeonholing yourself is stupid, and I've always been annoyed by awkward nerds who stopped worrying about their problems and started claiming them as points of pride.

Romford Pele
October 30th, 2014, 7:11 AM
I want to know when Virtual Reality will be available. I have been waiting for it since watching Lawnmower Man in 1992.

Bad Collin
October 30th, 2014, 9:21 AM
My thoughts on the future:

- I don't think we are near the limit of graphical capability yet. I think we'll see facial scanning similar to LA Noire's across the board. We'll see more scenes filmed and then overlaid with the character model.

- The biggest difference will be the number of things that can happen at once. Already we have seen more cars, people, etc. populating our games, and this can be used to make the game worlds feel more alive.

- The upper limit of this will be developer time rather than technology. Devs can only sell a certain number of games, and without massive price hikes they will only be able to employ a certain number of devs.

- I don't think VR will be a gimmick as long as they can make it affordable. If there is the option to have a VR headset for £150 I know a lot of people who will go for it. VR could also have many applications around the home.

The Rogerer
October 30th, 2014, 9:25 AM
Virtual reality has a lot of inherent problems. We only now have home headsets with decent latency because John Carmack became obsessed with solving the problem. It has its industrial uses, and there are plenty of indie games that make use of the Oculus Rift. One big obstacle is simply the reality of having two screens that close to your face and your eyes not being able to naturally focus on things at a distance.
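
The latency problem Carmack was chasing is usually described as "motion-to-photon": every stage between your head moving and the new image reaching your eye adds delay. A rough sketch of the idea - the individual figures below are invented for illustration; the only widely quoted number is the comfort target of roughly 20ms or less:

```python
# Illustrative motion-to-photon latency budget for a VR headset.
# The per-stage figures are invented for the example; only the idea matters.
stages_ms = {
    "head tracker sampling": 2.0,
    "game/simulation update": 5.0,
    "GPU rendering": 8.0,
    "display scan-out": 4.0,
}

total = sum(stages_ms.values())
print(f"Total motion-to-photon latency: {total:.0f} ms")
print("Commonly quoted comfort target: around 20 ms or less")
```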

Beefy
October 30th, 2014, 9:31 AM
- I don't think we are near the limit of graphical capability yet. I think we'll see facial scanning similar to LA Noire's across the board. We'll see more scenes filmed and then overlaid with the character model.

- The biggest difference will be the number of things that can happen at once. Already we have seen more cars, people, etc. populating our games, and this can be used to make the game worlds feel more alive.

- The upper limit of this will be developer time rather than technology. Devs can only sell a certain number of games, and without massive price hikes they will only be able to employ a certain number of devs.


This is the key though, isn't it? Games won't get much more expensive, because every time anyone tries to push the price above £45-£50 it never works. Somehow games are cheaper today in real terms than they were 15 years ago. Whilst the technology will exist, and the really big blockbusters that know they will shift 20m+ units will use it, most developers won't have the time to put that sort of effort in until facial scanning becomes much quicker and the software to incorporate it into games becomes easier.
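
To put a rough figure on "cheaper in real terms", a back-of-envelope sketch - assuming an average inflation rate of about 2.75% a year, which is an assumption for illustration rather than an official figure:

```python
# Back-of-envelope: what a £45 game from ~1999 would cost in 2014 money,
# assuming ~2.75% average annual inflation (an illustrative assumption).
price_1999 = 45.0
annual_inflation = 0.0275
years = 15

price_2014_equivalent = price_1999 * (1 + annual_inflation) ** years
print(f"£{price_1999:.0f} in 1999 is roughly £{price_2014_equivalent:.0f} in 2014 money")
```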

Romford Pele
October 30th, 2014, 12:07 PM
Who remembers the Neo Geo?

Games were £100+, and the console was £400+, at a time when the Megadrive and SNES were being sold for £130/£140. And this is early-nineties money. Mental.

The Rogerer
October 30th, 2014, 3:04 PM
Yeah. The deal was you were getting actual arcade games in your house. The thing was, the cost of the hardware was probably fair at the time.

I remember £65 for SNES Street Fighter II. Can't remember if I ever went higher than that.