2018
07.08

From Playboy

Cinematic adaptations of video games are a thing we take for granted these days. In this very year so far, we’ve had movies based on Tomb Raider and Rampage, while the last couple of years have seen Resident Evil, Assassin’s Creed and World of Warcraft receive the same treatment. Looking ahead, there are plans for Uncharted and Five Nights at Freddy’s movies, and even a Sonic the Hedgehog one, too. While it’s heartening, in a way, for gamers to see their medium appreciated by a wider audience, a regrettable pattern has also emerged—painfully low review scores. According to review aggregator Rotten Tomatoes, of the 34 live-action video game movies theatrically released throughout the world to date, just one has crept above 50 percent (the aforementioned Rampage)—and only by a few percentage points at that. The vast majority fall well below the halfway mark, right down to 0 percent, a dishonor held by the panned Tekken film of 2009.

This seems too common an occurrence to be a series of coincidences, and it has led many to theorize as to why these films invariably suck. A common argument is that the immersive, player-controlled appeal of games is lost when the controller is taken out of the audience’s hands, creating a fundamental disconnect between the creation and the consumer. It then becomes harder to suspend disbelief when faced with the inherent surrealism of gaming. For instance, Mario—a series in which a rotund plumber kicks tortoises, harnesses the power of mushrooms and prunes carnivorous plants in order to save a princess from a reptilian overlord—sounds like the diary of a madman, but when you as the player are given the power to control such chaos, it becomes madcap, addictive fun. Just what would happen if that personal connection were taken away?