The Legend of Zelda: Breath of the Wild impressions - the first fifteen hours
March 5th, 2017

(Minor spoilers below)
I approach a bridge in the western region of the Great Plateau, the area you begin the game in. I’m trying to reach the Keh Namut shrine, one of the four shrines I must visit before I gain the ability to leave the Great Plateau and wander around the massive world unrestrained. According to my map, the bridge allows me to cross the River of the Dead, an ominous name for a body of water I don’t want to enter. When I finally get to the bridge, I discover it’s been shattered, and when I enter the water to try to swim to the remaining pieces, I drown in the frozen depths. I’ll have to find another way around.
Exploration has been one of the foundations of gaming since the original Legend of Zelda came out in 1986, but over the course of this current console generation, it’s invaded nearly every genre. The biggest titles of the year now almost always include open worlds, ripe for adventure. The Witcher 3, Watch Dogs 2, Assassin’s Creed, Grand Theft Auto 5. All featuring different styles of gameplay, but all taking place in open-world environments. Some series even added open-world settings to later sequels, like Metal Gear Solid 5: The Phantom Pain.
However, the question remains: if every game is doing open-world environments, with missions appearing as dots on your map like a virtual to-do list, is it even worth making more games with similar settings?
The Legend of Zelda: Breath of the Wild is the first open-world 3D Zelda game, and it’s the answer to that question. It’s the best open-world game I’ve ever experienced, fixing problems that have plagued earlier open-world games. It takes place in a massive version of Hyrule, where even the opening location feels large until you venture outside its cliffs into the great unknown. It’s a game where you can look in the distance, see a mountain, and set off to climb it. However, unlike similar titles (ahem, Skyrim), whatever you find atop that mountain is going to be something interesting and new.
I look to my left and I see a nearby gang of bokoblins, the game’s earliest enemy, surrounding a fire with a pot above it, cooking a piece of meat. I crouch and sneak-dash behind a rock, using my bow and arrow to quickly take out the three enemies before they notice me. Looking at my map, it seems I can scale the mountain around the River of the Dead to reach the shrine. It’s a longer route, but it seems to be the only way.
I move forward past the camp but things quickly turn cold as it begins to snow. My health starts to drop; the cold’s going to kill me. I turn back and return to the bokoblin campfire. I take some hot peppers I’d found earlier out of my bag, throw them into the pot, and they begin to cook. A moment later, I have some cooked spicy peppers and I’m ready to go.
I spent this past January playing through The Legend of Zelda: Twilight Princess HD, a game I hadn’t beaten when I owned it for the Wii. It was a good game, and I (mostly) enjoyed my time playing it. It wasn’t perfect, however. There were plenty of moments in my 50 hours of playtime where I’d run into something that felt like busywork, or one of the controls (typically revolving around combat or horse-riding) would betray that it was designed over a decade ago.
I haven’t had those moments in Breath of the Wild. Even simple tasks like traversing the land are enjoyable and interesting; you never know what you’ll find, even in areas you’ve already wandered through. You’ll pass through a village on your way to another area, and a new side-quest you didn’t know was there will pop up. You’ll try to help a hopeless romantic win his dream date, even though you know it’s futile. You’ll help a little girl cook dinner for her sister, and then find her sister, who wants to play tag. You’ll follow a man whose dream is to see a great fairy fountain, only he can’t make it up the hill without exhausting himself. There’s so much to do, and the game feels so alive, that you can’t help but wander around and return to areas you thought you already knew.

Eating the peppers provided me with twelve minutes of cold protection, so I start running around the river to get to the other side. On the way, I discover a few new bokoblin hideouts, quickly taking them out or sneaking past, trying to make sure I make it to the shrine before my peppers wear off. I make it to the southern bridge on the river, cross, and start climbing up the mountain chain, keeping a close eye on my stamina meter and only occasionally pushing myself to exhaustion.
Finally, with a few minutes remaining on my clock, I make it to the shrine, the third I’ve visited today. I activate the fast-travel system using my Sheikah Slate and enter the shrine, solving a puzzle to gain a new skill and a valuable Spirit Orb, which can be used to raise my health or stamina.
One of my favorite things Zelda does better than its open-world peers is create a sense of discovery. When I arrive in a new area, the world is unmarked, my map completely blank. To gain the necessary map data, I have to make my way through the area to one of the many Sheikah Towers covering Hyrule, climb my way up, and activate my tablet. Towers are an open-world trope at this point, heavily featured in games like Assassin’s Creed. In Zelda, however, the towers only provide you with topographical features: the names of rivers, lakes, and mountains. To find the things you’re actually looking for, like shrines, treasures, or quests, you have to go searching. Instead of a map bogged down with glowing dots waiting for your attention, it’s a world ready for you to uncover its secrets. You have to do the work, but the reward is so much greater than in any game before.

I’m fifteen hours into Breath of the Wild, and I’ve barely scratched the surface of what it has to offer. I’ve run into dead ends, locations I’ve arrived at only to discover I’m not powerful enough to reap the rewards they offer. They’re marked on my map and in my quest book, reminders to return at a later date. I’ve talked to townsfolk and learned secrets of the land, mysteries yet unsolved. I’ve even found unmarked islands off the coast of Hyrule, accessible only by cliff-jumping. The exploration makes every moment feel like an adventure, and every discovery a story. And while the threat of Ganon lingers over my journey, the story of Link’s awakening (heh) has only just begun.
My cold protection begins to fade as I teleport out of the shrine. My map displays the final marker I placed, on a shrine on the far eastern side of the Great Plateau, the opposite end of the area. I warp back to the center of the Plateau where I began my journey and quickly start running east, towards the fourth and final shrine. It’ll end my adventures on the Plateau, but it’ll only be the beginning of my experiences in Hyrule.
image credit: Nintendo, VG247, IGN
The return of friend codes proves Nintendo hasn’t learned from the mistakes of the Wii U
March 2nd, 2017

The first teaser for the Nintendo Switch, released back in October, is an example of marketing done right. The three minute and thirty-six second video tells you everything you need to know about Nintendo’s new console without using confusing gaming terminology. Just visuals and sound effects. It’s a home console. Gotta go somewhere? It’s a portable console too. Wanna play with some friends? Hand them one of your controllers. Wanna add more friends? Throw in an extra Switch.
If you compare it to the launch of the Wii U, where even journalists initially had a hard time figuring out whether it was a new console at all, the Switch’s reveal was pretty close to perfect. It’s a really well-made video, with a catchy song behind it and the now-famous, satisfying Switch click. It fooled people into thinking Nintendo had learned from its mistakes.
Yes, I said it. Fooled.
Nintendo’s January conference was encouraging; it fell somewhere short of the hype of the initial reveal, but they showed a lot of promising games (with the unfortunate reality that not many of them would be ready for launch), and laid out a lot of the details of what the Switch would be like when it launched in less than two months.
But there were a lot of unanswered questions, and as the Switch’s launch date of March 3rd, 2017 grew nearer, those questions rang louder. What was the paid online system going to look like? What did the eShop look like? Would it launch with the Virtual Console? Would you be able to transfer old purchases from the 3DS or Wii U? Would it support apps like Netflix?
Most of those questions have, in some way, been answered in the past week and a half. Setting aside how long Nintendo took to address basic questions from the press about their new console, most of the news about the Switch has been largely negative. It wasn’t launching with the Virtual Console. It wouldn’t have apps like Netflix, or even a browser, rendering it unable to compete with the tablet market outside of gaming. There were major problems with the left Joy-Con losing connection, a problem Nintendo hasn’t even admitted to yet. Even some launch titles came into question over whether they’d actually be ready on day one.
And today, the day-one patch was pushed to reviewers, and we learned that not only had Nintendo kept quiet about their online service, they’d outright lied.
In January, CNET published an interview with Reggie Fils-Aime, Nintendo of America’s President and notable meme. Here’s an excerpt:
Connecting with friends will be on a case-by-case basis, but Nintendo is hoping to create a standardized experience. "There are no friend codes within what we're doing," Fils-Aime said, referencing the company's past cumbersome system for adding contacts.
Today, news broke that, following Nintendo’s day-one patch adding online capabilities to the Switch, including the eShop and multiplayer options, the way to add friends was not through the Nintendo Network ID system, but through friend codes.
The friend codes they’d done away with for the Wii U. The friend codes they said they weren’t using. They brought them back, without explanation.
If you don’t know what friend codes are, a quick explainer: friend codes are long numeric codes, introduced in the DS and Wii era, that you needed in order to add someone to your friends list. They were used on the Wii, and again on the 3DS, before being dropped for the Wii U. No one ever liked friend codes, and as they aged, it became clear that Nintendo didn’t quite understand how online multiplayer worked.
Arguably, they still don’t. Today, Nintendo is competing against the robust online services offered by both Sony and Microsoft. Nintendo is going to start making users pay to play online in the fall, albeit for a cheaper price than either Sony or Microsoft.
And they’re still using friend codes. They also have a 300-friend cap, in comparison with the 2,000- and 1,000-person caps offered by Sony and Microsoft, respectively.
To me, an early Wii U owner, this spells disaster. It says they haven’t learned from the many mistakes of their previous console. It tells me they aren’t even competing in the console market. Gamers expect to be able to easily add their friends online. Why should someone play a hypothetical third-party online FPS on the Switch when it’s so easy to get online with friends on other consoles? If this year’s Call of Duty launches on the PS4, Xbox One, and Switch, tell me why anyone would want to play it online on the Switch rather than on the other two consoles. Nintendo fans will be quick to point out that Nintendo hasn’t directly competed against the other console manufacturers since, arguably, the GameCube; the Switch isn’t about power or online gaming, it’s about portability and the option to play at home or on the go. Sure, that’s an argument to be had, but it’s fundamentally flawed. Whether Nintendo wants to admit it or not, they’re in the gaming industry, and they have to convince gamers to buy their console over the recently released PS4 Pro and the upcoming Project Scorpio. If online play on the Switch sucks, they could very well be in trouble.
Troublesome online play could create a chain of events all too familiar: if people don’t want to play Switch games online, developers and publishers aren’t going to bring games that focus on online multiplayer to the console. If developers and publishers start electing to bring only some of their games to the Switch, gamers are going to go to the platforms that have the games they want: the PS4, Xbox One, or PC. Finally, if gamers aren’t buying the Switch, developers are going to make even fewer games for the platform, until it loses the majority of its third-party support.
If this sounds familiar, it’s because it’s almost exactly what happened with the Wii U. In fairness, the Wii U wasn’t sunk by its online problems; it was sunk by how difficult it was to develop for and how poorly the average consumer understood what the damn thing even was. The Switch, by most accounts, is far easier to port to and develop for, and that’s in Nintendo’s favor. But if something causes gamers not to show up, developers are going to pull out. They did it with the Wii U, and they did it pretty early. If the Switch starts to flounder, I doubt publishers will be afraid to abandon the console.
Now, I don’t think the complication of friend codes alone is enough to sink the Switch, but it’s one more note on a growing list of concerns. When will the Virtual Console launch? Will VC titles you bought on the Wii U or 3DS be transferable? What will the actual experience of playing online look like? Will you have to use your smartphone to communicate online? Are the problems with the left Joy-Con firmware-based, or should we expect a recall? When will streaming apps like Netflix arrive? Without streaming apps, and basic apps like a browser, will this device be able to compete with tablets? If it isn’t competing with consoles, and it isn’t competing with tablets, what goddamn market is this thing even in?
Among many things, marketing sunk the Wii U, and there Nintendo has absolutely course-corrected. Nintendo’s marketing team has been killing it with the Switch. Their first teaser was great. Their Super Bowl commercial - their first ever - was a good introduction to the console. And they have Zelda, which has been getting 9s and 10s almost universally.
But there’s enough we don’t know one day before launch to make most consumers hesitate to drop $300 for a new, unproven, unfinished platform. And that should be enough to make even a hardcore Nintendo fan nervous.
image credit: Forbes
Twilight Princess: A Game Stuck in Time
(AN: The following is an unproduced script for a proposed video series called BKLG (pronounced Backlog), which I unfortunately had to put on hold for the time being. It has been slightly modified to read as an article, but the writing below is perhaps a bit more conversational than it otherwise would be.)
Allow me a bold statement upfront: The Legend of Zelda: Twilight Princess would not exist as it does today without a demo reel shown at Nintendo’s defunct trade show Spaceworld. At Spaceworld 2000, a demo reel for the upcoming GameCube was shown to attendees to demonstrate the graphical power of Nintendo’s new console. Twelve seconds of an unannounced Zelda game were shown, and the fanbase lost its collective mind. IGN wrote a five-paragraph essay about the clip, writing, “There's far too much detail to believe that Nintendo would scrap the models and make new ones. So, we think it's safe to say the new Link will look a lot like this. Overall, we're very happy with his new immaculate hero look.” Right.
IGN might’ve lost their minds, but behind the scenes, director Eiji Aonuma wasn’t pleased; in fact, he hated the design. A decade later, he told IGN it wasn’t the game he wanted to make at all. To him, it wasn’t Zelda.
So a year later, at Spaceworld 2001, Nintendo announced The Legend of Zelda: The Wind Waker. The internet revolted. This wasn’t their Zelda, they said. This was Cel-da. This was kid stuff. Where was their mature, grounded take on the series? I do wonder if that sounds like any other fanbase out there today.
Wind Waker was released in North America in 2003 to critical praise. Wikipedia has it listed on twenty-three separate Best Of lists. The HD re-release on the Wii U only gained the game further acclaim. The visuals have stood the test of time, aging far better than similar games released in the same era.
But none of that mattered. To fans, the cartoonish visuals meant the game was meant for children. As a follow-up to Ocarina, its sales were disappointing, selling less than half of what the first 3D Zelda had sold. Nintendo directly attributed this slump to the reaction of fans in North America after the graphics were first shown in 2001. So, despite accidentally announcing in 2004 that an upcoming GameCube Zelda game had the working title The Wind Waker 2, Aonuma became concerned that the game wouldn’t sell well in North America. After the game was announced at E3 2004, Shigeru Miyamoto told IGN that the art style of the new Zelda adventure was created to fulfill the customer demand born six years before Twilight Princess was even released.
So why spend a massive amount of time detailing the history of a decade-old game? Because in a lot of ways, The Legend of Zelda: Twilight Princess and its HD remake feel beholden to the demands of the fanbase in a way few Zelda games are. Despite the two preceding games in the series, Wind Waker and Majora’s Mask, featuring spectacularly different play styles, Twilight Princess feels like a reimagining of 1998’s Ocarina of Time, and while this doesn’t make Twilight Princess a bad game by any means, it certainly makes it feel more derivative than any adventure game starring the Hero of Time deserves to feel.
So, in honor of Zelda, let’s divide this into two needlessly convoluted timelines. There’s also one where I die while writing this, and it never comes out, so if you’re reading this now, please assume you aren’t in that timeline.
One.
Twilight Princess is a good game doing weird things.
Yeah, really, it is. All the fun of Zelda is right there, baked in. It’s got some dark, goofy undertones, and the game is weird as hell. The wolf segments are mostly fun, especially once you gain the freedom to turn into a wolf whenever you please. The characters are all really memorable in a way that I think is underplayed when people talk about Zelda. The yeti couple at Snowpeak who are secretly possessed. Zant, a weird Twilight villain who is being played by Ganondorf. Colin’s storyline of overcoming the bullying and taunting of the rest of his friends makes him my favorite of the four children by far. And Midna is the best - the best - Zelda assistant ever. That’s a really low bar to clear; sorry, Navi, Fi, and Tatl. Y’all can buzz off, because Midna has you beat for days. She is excellent, and never really a bother, even when she tells you something you already know.
The swordfighting in this game, particularly when fighting the Darknuts throughout the last chunk of the game, feels spectacular. I’m assuming this is less true with waggle controls on the Wii, but in the HD remake it’s a joy. Some of the dungeon design is the best in the series - Snowpeak, for all its flaws and played-out ice block puzzles, is perfectly built, and the Temple of Time’s reversal after the miniboss felt really refreshing. It also, and I cannot overstate this enough, has my favorite minigame in all of Zelda: snowboarding.
That’s not to say Twilight Princess is a perfect game. There’s plenty to nitpick - the puzzles don’t feel like puzzles! Why are half of the puzzles just shooting objects on a wall with an item! Why aren’t there more snowboarding levels! Why do half the items have almost no use outside the dungeon! Why aren’t there more snowboarding levels! Why can’t I ride the Spinner everywhere! Why aren’t there more snowboarding levels!
So instead of nitpicking small things like, say, why aren’t there more snowboarding levels, let me go ahead and lay out the biggest flaw in this game, the one everyone probably saw coming before they even clicked on this article: the opening.
Here’s how the opening tutorial for 1992’s The Legend of Zelda: A Link to the Past plays out: There’s a short cutscene before you gain control of Link. You leave bed and you grab the Lamp from the nearby chest. The guards don’t let you into the castle, so you head around to the right and you move a bush to let yourself into the dungeon of the castle. Your uncle, who has been defeated, gives you a sword and shield. Then you begin your journey through the first dungeon of the game, Hyrule Castle.
Cool. Easy. Done. Now here’s the opening tutorial for Twilight Princess: You talk to Rusl, you watch a cutscene, you run to the Ordon Spring, you talk to Ilia, you get Epona, you run through Ordon, you get to the ranch, you herd some goats in what is one of my least favorite mini-games in all of Zelda, you run back to Ordon, you talk to the kids, you talk to Uli, who can’t give you the fishing rod because she lost her cradle, then you talk to Jaggle, you summon a hawk, you shoot the hawk at a monkey, you bring the cradle back to Uli, and you get the fishing rod. From there, you go fishing, you catch a fish for the cat, you watch the cat run around town and back into the shop, where you can now get a free bottle. If you haven’t already, you run around collecting rupees until you reach the magical number of 30, at which point you buy the slingshot and show the children that you’ve bought it. Now you can re-enter your house and, would you look at that, the sword is there. The kids teach you how to use a sword. Then the kids chase a monkey into the woods. So, you summon Epona, you get the lantern, and you enter the North Faron woods on your quest to find Talo. You make your way through the woods in what is something like a dungeon, you free Talo and the monkey using your sword, and Rusl thanks you for saving Talo. Then you herd more goats - 20 this time, thanks Fado. Ilia claims that you hurt Epona or something, and she steals your horse. She’s also locked you out of the spring where she’s hidden Epona, so you sneak into the spring through a crawlspace, which triggers a cutscene, and boom, you’re a wolf stuck in prison. Technically, the wolf section is also a bit of a tutorial, but I think the point’s been made.
The opening of this game is terrible. It slows progress to a crawl right when the game should be pulling you in. It takes hours to complete, and even longer if you haven’t played the game before and don’t know what you’re doing. And, in some ways, it’s indicative of a larger problem in the modern era of Zelda games - not trusting the player to figure the game out on their own.
A quick note on the other divisive aspect of this game: tear collecting. I won’t comment much on it because it’s been talked to death and, to me, the tutorial is far more problematic in terms of game structure, but the tear fetch quest isn’t a whole lot of fun. At best, it’s inoffensive; at worst, it’s boring and yet another way to get players to put the controller down before the game reaches its second half. The HD remaster fixes the quest somewhat, lowering the required tear count from 16 to 12. It’s still cumbersome, but ending 25 percent sooner helps alleviate the negative feeling each section leaves on the player.
Two.
Twilight Princess is a good game unable to move beyond its past and its fanbase.
Majora’s Mask was released to critical acclaim, but it sold about half of what Ocarina had sold two years earlier. Perhaps, Nintendo thought at the time, this had to do not with the quality of the game or what the fanbase wanted, but with the required Expansion Pak and the impending launch of the GameCube.
As mentioned earlier, it was Wind Waker’s sales that scared the creative team into redirecting their efforts from a sequel to Wind Waker to an entirely new game with a new, more realistic design.
But Wind Waker’s struggles didn’t just change the art design of the new game. They ensured that the next Zelda game would be more like Ocarina of Time than either Wind Waker or Majora’s Mask (itself a direct sequel to Ocarina) ever was.
And they did it. Twilight Princess, more than any other game in the series, plays like a reimagining of an earlier game, in this case Ocarina of Time. Especially in the first half, both games play out in incredibly similar ways, from your humble beginnings in a small village to your travels to Hyrule Castle, to the similarly themed opening dungeons, to your new companion following you around, offering advice. Majora’s Mask was a game that took chances, shook the Zelda formula up in ways no one had seen since Zelda II. Wind Waker stayed truer to the classic Zelda road, while still thinking up new ideas, from its presentation to its high-seas setting. Twilight Princess is a good, safe game, seemingly designed to make sure that everyone who owned a copy of Ocarina of Time and had seen the Spaceworld 2000 demo would no longer feel disappointed about the cartoon stylings of Wind Waker.
And it worked. That feeling of nostalgia for Ocarina, combined with the success of the Wii, ensured the game would become the best-selling title in Zelda’s history, assuming you don’t count the 3DS remake of Ocarina toward Ocarina’s N64 sales.
Of course, unlike Ocarina, nostalgia for Twilight Princess hasn’t fared quite as well. The game received an HD remaster in 2016, both as a 30th anniversary celebration of the series and as a pseudo-apology from Nintendo for delaying Breath of the Wild to 2017 in order to release it simultaneously on the Wii U and the Switch. The HD remaster of Twilight Princess sold a little more than a million copies globally, a similar number to 2015’s forgotten spin-off, Tri Force Heroes.
It took nearly another decade for Nintendo to take more chances with the Zelda formula. Ignoring the portable titles for a moment: 2011’s Twilight Princess follow-up, Skyward Sword, was critically acclaimed at launch, but has been largely forgotten in the five-plus years since its release. Skyward Sword often appears near the bottom of best-Zelda lists, and often doesn’t appear at all when the list is limited to ten games. That game shares Twilight Princess’s flaws, with a drawn-out opening section and frustrating collect-a-thons like the music note section late in the game.
All of this is to say: I think we’re about to enter a new era of Zelda, or at least a return to classic, pre-Ocarina of Time adventuring. Next week’s Breath of the Wild promises an open world with plenty to explore. The opening of the game seems to draw from the original and from A Link to the Past far more than from Twilight Princess or Skyward Sword. What we’re looking at isn’t the end of Zelda, but the start of a new chapter.
A Debate on Debates: Do They Still Matter?
Author’s note: this essay was originally published on December 12, 2016. It has been republished here as a sample of my work.
On September 26th, 1960, Richard Nixon walked onto the stage and sat in the chair prepared for him. Opposite him was his political opponent, John F. Kennedy. Between him and Kennedy sat Howard Smith, in front of a small table. In front of them, for the first time, were television cameras, ready to film the two men’s responses. Nixon could feel himself sweating; he hadn’t yet recovered from his recent hospital stay. He looked over at the young Kennedy, who seemed calm and confident and, it appeared, was wearing makeup. Nixon heard the director begin the countdown to broadcast. Slowly, he dabbed the sweat off his forehead.
That day in 1960 kicked off what is now a welcome tradition in the American election cycle: presidential general election debates. The debates have a short but eventful history, one that shows how American politics is presented and how debates can change and influence elections. However, despite the historical precedent debates carry, the evolution of the twenty-four-hour news cycle and deepening partisan divides have reduced the need for debates in our country.
Though the first general debate between presidential candidates wasn’t held until 1960, the idea of a debate has historical predecessors dating back to the 1800s. One of the earliest occurred between Abraham Lincoln and Senator Stephen Douglas, competing for a US Senate seat from Illinois. The Lincoln-Douglas debates, or the Great Debates of 1858, were a series of seven debates between the two men, held in seven cities around Illinois between August and October. Interestingly, several of the debates drew large crowds of out-of-state citizens, for one simple reason: the debate over slavery. Although Illinois was a free state, the prospect of Illinois electing a pro- or anti-slavery senator would affect the balance of Congress and therefore the state of slavery in the country. Douglas ranted against Lincoln for opposing the Dred Scott decision, while Lincoln stated he feared the next Dred Scott decision would create a market for slavery in free states. It’s worth noting that, although Lincoln himself did not argue for equality between white and black citizens, he did argue that black citizens should have the same rights to liberty as anyone else.
Interestingly, the media played a large role in the election of 1858, and, in turn, the presidential election of 1860. For one, pro-Democrat newspapers published transcripts of the debates with Douglas’ speeches edited for clarity, while leaving the errors and pauses in Lincoln’s speeches untouched. For those who could not attend the debates in person, it made it seem as though Lincoln was performing far worse than his opponent. Douglas won his reelection (despite Lincoln winning the popular vote by over 3,000 votes), though Lincoln’s notoriety had been boosted by the national coverage of the seven debates. Lincoln went on to publish a collection of the debates without the corrections news reporters had provided to Douglas, and, of course, he won the presidential election in 1860. One of his opponents in that election was none other than Douglas himself.
After the 1960 Nixon/Kennedy debates, another nationally televised presidential debate was not held until 1976, between President Gerald Ford and Jimmy Carter. 1976 set the precedent for how debates are held to this day: three debates between the candidates, with an additional debate between the vice presidential candidates. Though the first debate was marred by technical issues, including a portion of the debate losing audio, it was the second debate between Ford and Carter that swung the race in Carter’s favor, when Ford denied Soviet domination of Eastern Europe. The election was close, but Ford’s blunder proved his downfall. Four years later, heading into the 1980 debates, President Carter held a lead over his Republican opponent, Ronald Reagan. However, Carter refused to appear at the first debate so long as Independent candidate John Anderson would be on stage, so the first debate was held between Reagan and Anderson, with both the second debate and the vice presidential debate canceled. The third debate was held between Carter and Reagan on October 28th, 1980, only days before the election itself. This debate would garner press attention years later when it emerged that, unbeknownst to Carter’s team, Reagan’s campaign had acquired Carter’s briefing papers prior to the debate. Carter was seen to have lost the third debate, and Reagan won the White House in turn. It is still unclear to this day whether Carter would’ve won the race had he been seen to win the third debate; his popularity with the American people was crumbling prior to Reagan’s election.
In 1992, the first presidential debate to feature both major-party candidates as well as a third-party candidate aired on national television. Ross Perot appeared on stage next to incumbent President George H.W. Bush and Governor Bill Clinton. Interestingly, Bush initially did not want to appear on stage with his two challengers, and was widely criticized by the press for repeatedly looking at his watch throughout the debate. His campaign described this action as merely tracking the other candidates’ time as they spoke; later, this was confirmed to be a lie. Perot’s appearance in the debate helped him rise in popularity, and although he came in third place, he managed to score nineteen percent of the vote, the most a third-party candidate had managed since Theodore Roosevelt in 1912. Much of his percentage came from Bush voters, and Clinton won an easy election.
Since the turn of the millennium, debates have shifted in importance. While still watched every four years by tens of millions of onlookers (on every local channel, plus cable news channels and online), the debates have increasingly been seen as out-of-date mechanisms for choosing who should be the next president. Part of this, certainly, is due to the access to news through internet and cable channels. The existence of a 24-hour news cycle has made it impossible for those who care about such matters as debates not to know what’s happening in the election. Likewise, the internet has diminished the need for a set stage where politicians explain where they feel the country should be headed. Looking at 2016 for an example, it’s impossible to say voters couldn’t see the difference between the America that Clinton wanted and the America Trump was promising. But perhaps more prominent than the overabundance of media channels diminishing debates is the increased, and still increasing, partisanship of the past two decades. Many American citizens are sticking to their own political beliefs more strongly than ever. Whereas in the 1970s the electorate saw a chance to switch from a failing Ford to a promising Carter, voters today see a chance to switch the government back to “their team.” The rhetoric being preached to both Democrats and Republicans by top leaders in both parties creates a disconnect in the American public. For example, in a 2016 Pew Research Center poll, 49 percent of Republicans said the Democratic party made them feel afraid, 46 percent said angry, and 57 percent said frustrated. For Democrats, those numbers were 55, 47, and 58 percent, respectively.
So then, with all the emotion being spilled over whether the other party’s candidate is even fit to govern, it’s no surprise that the debates of 2016 seemed weaker than ever. The debates between Clinton and Trump were not only messy, seemingly designed to stoke further hatred between both the two candidates and the two parties, but they carried no weight: in national polls, Clinton was said to have won all three debates. Though Clinton is no master debater - look no further than her primary debates against Obama in 2008 for proof - in all three debates she held more than her own against Trump, allowing him to crumble on his own. When election day arrived, it became apparent that none of it mattered. Trump rose to victory, based largely on a few swing states in the midwest. The debates - watched by tens of millions of people, reported on for days afterward, called back to repeatedly by pundits and comedians and everyone in between - were forgotten by the American public.
So, do we need debates anymore? No, we don’t. Do they matter? Perhaps they used to, although even that could be left up to further discussion. But with 2020 coverage only two years away from really heating up, it’s a discussion worth having now: does it truly matter to watch two candidates debate each other when the answers are often binary in nature (pro-life versus pro-choice, etc.)? In a world with as much media coverage as the one we exist in now, don’t expect the debates to disappear anytime soon. But the shrinking relevance of debates invites a new name to grace the platform. A quick glance at my thesaurus brings up a couple of relevant terms: controversy, dispute, or perhaps best of all, tiff.
citations:
CPD: Debate history. (2015). Retrieved December 12, 2016, from http://www.debates.org/index.php?page=debate-history
Erb, K. P. (2016, September 26). 13 quick facts about the history of presidential debates in America. Forbes. Retrieved from http://www.forbes.com/sites/kellyphillipserb/2016/09/26/13-quick-facts-about-the-history-of-presidential-debates-in-america/#7b7cc1b83331
Sides, J. (2016, July 5). Do presidential debates really matter? Washington Monthly. Retrieved from http://washingtonmonthly.com/magazine/septoct-2012/do-presidential-debates-really-matter/
The nation: The blooper heard round the world. (1976, October 18). Time. Retrieved December 12, 2016, from http://content.time.com/time/magazine/article/0,9171,946700,00.html
GamerGate Supporters Have Shown a New Weapon Against Women Online: Anonymity
Author’s note: this article was originally published on December 4th, 2015, as a corresponding opinion piece to my research on GamerGate. It has been republished here as a sample of my work.
When GamerGate began in August 2014, it was the culmination of a decade of trends, combined with the viewpoints and opinions of dozens of select people. The arguments of GamerGaters didn’t bring much new to the table; at its base, the movement was formed around misogyny and lies about a woman sleeping with gaming journalists to get publicity for a game she’d published eighteen months earlier. There was nothing to this argument; the claim had been disproven days after the controversy began. And yet, GamerGate continues to this day, still influencing decisions made by game developers, journalists, and gamers themselves. If there was one thing GamerGate brought to the table, it was a new weapon against women in the internet age: anonymity.
Zoe Quinn, Anita Sarkeesian, and Brianna Wu, among others, have faced more than a year of online threats, verbal abuse, and fear. Each of them was forced to leave her home at some point in 2014, after being doxxed and fearing angry GamerGaters would come storming to her house. They’ve been threatened with rape, murder, the deaths of their families and friends, and the destruction of their reputations. What arrived in their lives in the summer of 2014 was an anonymous army, called into action by Quinn’s ex-boyfriend. Sarkeesian has dealt with this struggle since 2012, when she launched the Kickstarter for her video series on YouTube. Each of her videos has comments and ratings disabled. Searching for Anita Sarkeesian on YouTube brings up dozens of hate-filled counter-arguments against her. When Sarkeesian appears on stage, she worries about death and bomb threats - specific ones, too.
Supporters of GamerGate will claim that the movement has nothing to do with controlling women in the industry; instead, they say, it’s about ethics in gaming journalism. They claim that the GamerGate movement has been unfairly judged by its most extreme members, and point to the coverage of other internet-based movements, like #BlackLivesMatter, as proof of journalistic bias against their operation. They post on forums anonymously, writing screeds about how reporters denounce GamerGate purely because it is a movement against their own journalistic intentions. Yet they boldly choose to ignore the codes of ethics gaming websites often share on their homepages, which lay out in detail what is accepted by the publication and what is disallowed. If GamerGate were truly about ethical journalism, would these codes of ethics not be enough? Wouldn’t there be evidence that these codes have been broken?
Truly, this fight comes down to fear of a loss of identity. These gamers, almost certainly all men, are terrified that they will lose what it means to be a gamer if they allow women and casual players to be invited into the industry. For years, threats of rape during online play, shouted demands of “show me your tits,” and even the use of “casual” as a derogatory insult towards those who play games like Candy Crush Saga have slowed the growth of gaming. But gaming is too big for these players. Gaming is a multibillion-dollar industry made up of millions and millions of players in the United States, and quite surely billions worldwide. The term “gamer” is evolving; it no longer means what it meant in the 1990s. Anyone, and everyone, is a gamer.
Likewise, gaming itself is changing. Games are no longer simply about driving several pixels on the screen from point A to point B; rather, they can be about anything the creator wants them to be about. Purists will argue that a game has to be action-packed to be considered a video game, but in an era where a single person - a child, even - can build a video game from the ground up, this is no longer the case. Games now exist on a higher level, shared with television and film. Games tell stories; they make statements about society or ourselves. The elitist viewpoint claiming that games cannot and are not made for stories is simply incorrect. The last half decade alone has seen game stories praised, from AAA titles like The Last of Us, to smaller titles such as The Walking Dead and Life is Strange, to independent titles, including Polygon’s controversial 2013 Game of the Year, Gone Home.
Perhaps GamerGate supporters are not ready to accept their change in status, from outsiders twenty years ago to the accepted mainstream today. But it’s true: gaming is everywhere. It has evolved. And all the attacks and threats and false superiority directed at women, all the anonymity wielded online to try to get your way - none of it will succeed in changing the path gaming finds itself on. This is the future, this is the mainstream.
Welcome to the new gaming.
A History of GamerGate
Author’s note: this essay was originally published on December 2nd, 2015. It has been republished here as a sample of my work.
It is a fair assumption that anonymity will bring out the worst in humanity, and the internet was built on anonymity. Usernames, fake email addresses, IP address spoofs. Hiding who you are on the internet is an easy game to play, assuming you want to stay in the shadows.
The summer of 2014 bred a perfect storm of misogyny, anonymity, and controversy that led to personal attacks on three internet personalities: game developers Zoe Quinn and Brianna Wu, and feminist blogger Anita Sarkeesian. These three women became entangled in a shadowy campaign of attacks from individuals in the gaming community taking things far too personally. They survived threats of rape and murder, and were forced out of their own homes. All because a minority of male gamers could no longer stand to see women in their industry.
In August 2014, Michael Brown was shot in Ferguson, Missouri. Robin Williams died by suicide. An American journalist was beheaded by ISIS members. Amid all this, Zoe Quinn found herself going through a breakup. Her boyfriend, Eron Gjoni, found himself hurt and angry. Utilizing the power of the internet, he did what any angry man in a post-Mark Zuckerberg world would do: he blogged about it.
Zoe Quinn was the developer of a game called Depression Quest, a text-based browser game designed to help people understand what depression feels like. It was well received by a number of websites for its accurate portrayal of depression and of trying to receive help. The game was released a year and a half prior to the controversy, in February 2013. One of the websites that did not review Depression Quest was gaming site Kotaku, a Gawker publication. Despite this, Gjoni, in his blog “The Zoe Post,” claimed that Quinn had cheated on him with numerous people, including Kotaku journalist Nathan Grayson, in order to gain publicity for her game. Whether or not Quinn cheated on her ex didn’t matter to a legion of gamers who saw their industry as being “infected” by women. To them, gaming was meant to be by men, for men. A video entitled the “Quinnspiracy” was created and circulated, gaining popularity and interest on 4chan and Reddit.
A loud minority of gamers picked up their metaphorical pitchforks and demanded Quinn pay for her “crimes” with her life. She was quickly doxxed - a term that means uncovering private and personal information about an individual and sharing it publicly online. With her phone number and her address available online, she was forced to abandon her house and begin sleeping on the couches of friends. It was then that Gjoni updated his blog post to indicate that he had no proof that Quinn had slept with Grayson, or had even had personal contact with him beyond online. He also asked those reading the blog to leave Quinn alone and to cease attacking her.
The “Quinnspiracy” was the final straw for the “traditionalist” gamers who saw their boys-only club being left to the girls. For years before Quinn, these men had held another woman in their sights: Anita Sarkeesian.
Sarkeesian is a blogger and prominent feminist who runs the site Feminist Frequency. In 2012, she launched a Kickstarter for a YouTube series entitled “Tropes vs. Women in Video Games,” an extension of her already-existing “Tropes vs. Women” series made in partnership with Bitch magazine. The launch of this fundraiser set the internet ablaze, despite being funded in less than 24 hours. Once again, gamers saw it as an attack on their personal hobby: a woman coming into the industry as an outsider and lecturing them about equality. To them, Sarkeesian was nothing more than an SJW - social justice warrior, a term with all sorts of negative connotations.
In the days following her Kickstarter, Sarkeesian received rape and death threats, hacks of her website and social media accounts, vandalism of her Wikipedia article, and brutal, negative comments on anonymous websites. One developer made an online game where you could punch a photo of Sarkeesian in the face. The controversy only further pushed supporters of Sarkeesian to donate to the Kickstarter; she raised twenty-six times the requested amount.
Meanwhile, another game developer, Brianna Wu, was thrown into the fire in October 2014. Her game, “Revolution 60”, had found success at the gaming convention PAX Prime, and had launched in July of 2014 to positive reviews. The game focused on a team of four women working in an anime-themed spec ops unit. In October, she tweeted multiple posts in support of Quinn and against the raging commenters online, ridiculing them for “fighting an apocalyptic future where women are [eight] percent of programmers and not [three.]”
She received a similar reaction to Sarkeesian’s: doxxing, rape threats, death threats, and finally being forced from her own home. At one point, Wu even received photographs of mutilated dogs following the death of her own dog, Crash.
These supporters gathered under a title now well known to the industry and the country as a whole: GamerGate. Actor Adam Baldwin (known for the science fiction show Firefly; not related to Alec or the other Baldwin brothers) tweeted a simple hashtag in support of the attacks on Quinn: #GamerGate.
To fully understand GamerGate, you also have to dig deep into another subculture of the internet, one that shares many members with GamerGate: men’s rights activists. Branching off of the Men’s Liberation movement of the 1970s, men’s rights activists, or MRAs, focus on what they see as oppression, discrimination, and disadvantage towards men. To many, men’s rights is seen as a countermovement to feminism, with many MRAs believing that feminism has actually harmed men. Their main issues include violence against men by women, child custody, false rape accusations, paternity fraud, and divorce. Though it has existed since the 1970s, the movement largely remained underground until the rise of the internet, specifically the early 2010s, as its voices grew louder. Men’s rights activism reached mainstream notoriety in 2014 when Elliot Rodger murdered two female students outside a UCSB sorority house, as well as three men in his apartment. Before the shooting, Rodger posted a video statement online, claiming that women deserved to die for not having sex with him, and sexually active men deserved to die for living what he saw as a more enjoyable life than his own.
There is one specific website that many hold responsible for the rise of men’s rights activism, as well as for strengthening and growing the misogyny of the movement’s members. Return of Kings is a blog that bills itself as a “website for heterosexual, masculine men.” The articles there are frequently offensive, containing references to homophobia, misogyny, slut shaming, and other ideals considered “traditionally masculine.” The website has received plenty of coverage from websites like Jezebel, which deemed the site a “vile troll website” composed of internet trolls “discussing how much they hate ugly women.” Published articles include “5 Reasons to Date a Girl with an Eating Disorder,” “24 Signs She’s a Slut,” “20 Things Women Do that Should be Shamed, not Celebrated,” and “Fat Shaming Week.” In this sense, the website is easily compared to other websites known for controversial and offensive ideas, like 4chan and certain subreddits. Many of these men’s rights supporters were quick to join in on GamerGate, and as such, any credible arguments found in GamerGate quickly drowned in the surrounding misogyny.
By the end of August, GamerGate had become the hottest topic in gaming, for better or for worse, and was on its way to becoming a topic in mainstream culture. Even so, GamerGate faced the same problem any anonymous movement will face: its goals were unclear. The core group of supporters, those who declared that GamerGate had nothing to do with women in gaming, were quick to say “it’s about ethics in gaming journalism.” These individuals argue that everything they’re doing is in defense of a games industry being taken over and bullied by overzealous journalists, who they argue are over-obsessed with social issues and political correctness in gaming. The phrase quickly turned into a meme against supporters of GamerGate, with detractors declaring it a simple cover-up for a broader story about misogyny and wanting to keep gaming a boys-only club.
Arguably, much of the hatred that stemmed from GamerGate came from an identity crisis erupting at the core of what a ‘gamer’ is. This idea of lost identity stems from the changing definition of ‘gamer’ over the last ten years. In the 80s and 90s, gaming was often for teenage boys or younger, something the “nerdier” kids did instead of playing sports or choosing other activities. In the mid-2000s, however, the landscape of gaming changed and evolved with the rise of casual gaming, beginning with the Nintendo DS in 2004. The game Nintendogs was specifically marketed towards young girls, and pushed forward an entire lineup of games created with a female market in mind. While some of those gamers certainly moved on to more mature and mainstream titles, including shooters, RPGs, and platformers, a vast majority of the casual gamers ensnared by the DS’s success moved on to smartphone gaming following the launch of the iPhone in 2007, and specifically the launch of the App Store in 2008 with iPhone OS 2. The rise of mobile gaming has threatened both Nintendo’s and Sony’s own handheld platforms; the 3DS (the DS’s successor) was seen as a disappointment at launch and still trails the DS by millions of units, while the PS Vita was a commercial flop.
With the rise of casual gaming, core gamers suddenly saw their hobby, one that many of them had once been mocked for having, turn mainstream. Suddenly, everyone was a gamer. Your aunt, your mother, the captain of the football team. They all played games in some sense of the word. Meanwhile, as more and more women became core gamers themselves, some male gamers felt threatened in an industry they saw as their own.
In a story where journalism is so heavily involved, it’s imperative to explore how journalists covered GamerGate. In the initial months following the attacks and threats towards Quinn, Sarkeesian, and Wu, gaming websites were urged to publish updates from their editors. Stephen Totilo, editor-in-chief of Kotaku, published two separate articles referring to GamerGate. The first was simply an update regarding the code of ethics on reporting at Kotaku. The comments and responses to that update pushed Totilo to publish a second article. “About GamerGate,” posted in early September, functioned as a response to what many had put on Kotaku’s shoulders. Totilo’s post is a definitive GamerGate read, summarizing what, to him and many other journalists, had become a no-win situation. In the article, he writes what he assumes about gamers, game reporters, and game developers: “Good people, most of them.” Finally, he states, “I'm a gamer. I don't mind the term. If you do, that doesn't bother me. I'm confident in who I am. If you're a gamer who harasses? Who sends rape threats or stalks Twitter feeds or terrorizes people from their home or gloats at others' struggles? Find a new hobby.”
Polygon, a relatively new gaming site founded by industry veterans from Kotaku, Joystiq, IGN, and other prominent gaming organizations, had also found controversy revolving around GamerGate. The hatred began in 2013, when Polygon gave a negative review to the RPG Dragon’s Crown, a game featuring female heroes with wildly exaggerated anatomy. While reviewer Danielle Riendeau commented that these characters were allowed to be positive, strong heroes, she noted that the female NPCs (non-playable characters) were “barely clothed, with heaving chests, backs twisted into suggestive positions, some with their legs spread almost as wide as the screen. They're presented as helpless objects, usually in need of rescue. It's obvious, one-sided and gross.” Over the two weeks that article’s comments were open, nearly 1,300 comments were published, with many saying the negative review (with a score of 6.5) was based purely on the aesthetic choices of the game’s design, not its technical aspects. Personal attacks were made toward Riendeau in the comments section, leading to heavy moderation and the early closing of comments. In early 2014, Polygon chose as its 2013 Game of the Year Gone Home, an independent game lasting about two hours, with minimalist gameplay, focused on the story of a young woman coming out to her family. Again, the choice was controversial, especially after it trumped the second-place game, The Legend of Zelda: A Link Between Worlds, a game many gamers considered nearly perfect. Later reviews, such as their Bayonetta 2 review, were met with similar hostility after criticizing the over-sexualization of the game’s main character.
So after the outbreak of GamerGate in August, Polygon was seen as a forerunner in the invasion of political correctness in gaming. Polygon’s response to GamerGate was published by editor-in-chief Christopher Grant in October, after GamerGate had hit the front page of the New York Times for its supporters’ threats toward feminist critics of video games. In his letter to readers, Grant writes, “when inclusion in said mob is exactly 10 keystrokes away from anyone with a Twitter account — # g a m e r g a t e — it's not only hard but actually impossible to distill that mob's wishes down to any one thing. So we didn't. We were, and I specifically was, paralyzed by indecision. How do you condemn a mob without drawing attention to that same mob?” Later in that same letter, Grant concludes, “But attacks from inside that same culture have led to worldwide media condemnation, a toxic dialogue and violent threats. People don't feel safe in their own homes. No need to jump at shadows of conspiracy or collusion, GamerGaters; you've already unearthed the most damaging force in video games today.”
Continuing throughout 2015, GamerGate has maintained its harassment of women in gaming, all for the purpose of “ethics in gaming journalism.” In spite of this, the anti-GamerGate movement has been able to withstand the attacks planned by users on sites such as KotakuInAction. In late January, a man in a skull mask threatened Wu with death in a horrifying YouTube rant delivered in front of his wrecked car. In an editorial on the Huffington Post, Wu described the attack as “just another Tuesday.” Though the video was later proven to be a hoax, it was taken seriously by both Wu and law enforcement. Law and Order: SVU aired an episode based on GamerGate; it wasn’t well received. Well-regarded developer Obsidian included a transgender joke in their game Pillars of Eternity, a joke widely criticized by many journalists and anti-GamerGate advocates, including Wu herself; the joke was removed despite outcry from GamerGate supporters. The organizers of the Calgary Comics Expo stopped a plan by members of GamerGate to infiltrate the expo and disrupt panels, after Calgary declared the expo a no-harassment zone. Twitter added new harassment tools to its service, in part as a response to several controversies, including GamerGate. Anne Wheaton, blogger and wife of Wil Wheaton, donated $1 to Sarkeesian for every hateful pro-GG comment she received; she donated $1,000 total, an amount doubled by several other Twitter users. Pro-feminism panels on gaming were held at expos. John Oliver, host of HBO’s Last Week Tonight, dedicated a segment of his television show to discussing online harassment, specifically geared around GamerGate and Sarkeesian as an example.
In October 2015, SXSW cancelled two panels after receiving threats of on-site violence following their announcement: "SavePoint: A Discussion on the Gaming Community" and "Level Up: Overcoming Harassment in Games." According to Level Up host Randi Lee Harper, the panels were not strictly about GamerGate, and indeed, Brianna Wu herself still held two unaffected panels. A week later, Hugh Forrest, director of SXSW Interactive, apologized for the cancellations and announced a day-long conference on internet harassment. Harper’s panel was not part of the reinstated lineup, and she announced she did not support the inclusion of SavePoint, a pro-GamerGate panel, comparing it to “allowing a perpetrator of domestic violence to stand on a stage next to the woman he abused” and stating that it was incredibly unsafe to feature both pro- and anti-GamerGate panels on the same day. Two weeks later, SavePoint was removed from the harassment conference, but not cancelled; it was simply moved off that specific day, with plans to hold the panel sometime during SXSW Interactive. According to Perry Jones, GamerGate supporter and lead panelist, SavePoint will cover “the current social-political [sic] climate of the gaming community, the importance of journalistic ethics in video game journalism, and the future of the gaming community and the industry.”
GamerGate has fallen out of the mainstream press, but it remains a problem for any pro-feminism gamer on the internet. Each day a new threat is made, a new development announced. Whether it stems from the growing presence of women in gaming, an identity crisis over the term “gamer,” or ethics in gaming journalism, one thing is certain: GamerGate will not disappear anytime soon.
Text
The End, Built into the Beginning: The Psychosis and Neurosis of ‘Synecdoche, New York’
Author’s note: this essay was originally published on May 11th, 2016. It has been republished here as a sample of my work.
The Blu-Ray release for screenwriter Charlie Kaufman’s directorial debut film, Synecdoche, New York, does not feature a commentary track. During the press junket for the film’s release, Mr. Kaufman was asked why this is; why would a film as convoluted, as layered and complex as Synecdoche not have a commentary track? “The whole point of writing,” he responded, “is to get people to have an experience with it, and if I sit here and say, well this means this and this means this, not only is it pointless because it either means that to those people or it doesn’t, but it also gets in the way of those people having their own individual experience.” Synecdoche, New York is a film that begins with a seemingly-normal story idea - a man approaching middle age becomes obsessed with his health, while him and his wife face relationship problems - and quickly becomes something entirely different. It’s a fever dream of a film; the madness starts slow, but once you descend to Caden Cotard’s level of madness, it doesn’t relent until the credits roll - that is, until you’re dead. Synecdoche, New York is a film obsessed with the human condition, obsessed with death and life and time and the lack thereof.
The Blu-ray release of screenwriter Charlie Kaufman’s directorial debut, Synecdoche, New York, does not feature a commentary track. During the press junket for the film’s release, Kaufman was asked why; why would a film as convoluted, as layered and complex as Synecdoche not have a commentary track? “The whole point of writing,” he responded, “is to get people to have an experience with it, and if I sit here and say, well this means this and this means this, not only is it pointless because it either means that to those people or it doesn’t, but it also gets in the way of those people having their own individual experience.” Synecdoche, New York is a film that begins with a seemingly normal story idea - a man approaching middle age becomes obsessed with his health while he and his wife face relationship problems - and quickly becomes something entirely different. It’s a fever dream of a film; the madness starts slow, but once you descend to Caden Cotard’s level of it, it doesn’t relent until the credits roll - that is, until you’re dead. Synecdoche, New York is a film obsessed with the human condition, obsessed with death and life and time and the lack thereof.
Time features heavily in Synecdoche, New York. Caden Cotard, the film’s protagonist, a theater director, is a man out of time. He no longer notices weeks passing by him, too wrapped up in his own death to focus on anything around him. This is especially apparent in the first ten minutes of the film. Somehow, these opening minutes feel unrelentingly normal for a Kaufman film, a setup designed to lure the audience in and make it comfortable before everything is twisted upside down. But watch those ten minutes closely and it’s obvious Kaufman wants you to notice that something is wrong with time. Time is the first thing we see in the film (after a fade in from gray, but that’s a topic further down the line): a clock changes from 7:44 to 7:45, and the radio alarm activates. Caden gets out of bed, and we hear from the radio hosts that it’s September 22nd, the first day of fall. It’s important to note that Kaufman begins his film on a day that is, essentially, rooted in death; autumn is about the end of things, and as the poet guesting on that radio broadcast makes clear, death is beautiful in a sweet, melancholic way. From there, Caden goes through his morning as normal. He gets the paper and reads about death. His wife Adele mostly ignores him, while their daughter Olive runs around the house. He cuts his head open when a pipe bursts, is taken to the emergency room, and returns home that night. A seemingly normal day in the middle of a million normal days. That is what Charlie Kaufman wants you to think, but a close read of the film shows otherwise. As I mentioned, the scene opens on September 22nd, 2005. But by the time Caden approaches the kitchen sink, it’s already October 8th (as mentioned in the radio broadcast about an earthquake in Kashmir). After collecting the mail, it’s October 15th; however, the newspaper reads October 14th, signifying a day has gone by since Caden left to get the mail. In the same paper, the obituary section is dated October 17th. The milk Caden says is expired went bad on October 20th. He sits back down; it’s Halloween. Then, suddenly, it’s November 1st on the radio and November 2nd in the newspaper. When he reaches the hospital for stitches, Christmas decorations are hung. On the ride home from the hospital, Auld Lang Syne plays on the radio, a hint toward the new year.
This is all intentional; these are not continuity errors or mishaps. Kaufman wants to set the stage for the film as a dreamlike experience, and at the same time establish that Caden Cotard is a man out of time, something we see throughout the entire film. Caden experiences moments where he doesn’t know how long things have been going on. After Adele and Olive leave, he tells Hazel, his eventual love interest and the secretary at his local theater, that it’s been a week since they left. In reality, it’s been a year; Hazel plays this off by saying someone needs to buy him a calendar, but there’s more to it than that. Caden’s mind has stopped recognizing days going by, stopped recognizing that he’s aging and that the people around him are changing and growing and getting married and divorced and having children and dying. Caden recognizes none of this, and as Millicent/Ellen, his replacement director, later says, “Time is concentrated and chronology confused for [Caden].” It’s true; Caden has a hard time registering time going by. His daughter lives a full life, if one cut somewhat short by infection, and he never lets go of her being four years old. Hazel starts a family without Caden paying much attention. Caden and Claire get married and have a daughter in the span of seven seconds of screen time. One of the most pointed moments comes when one of Caden’s actors asks when an audience will be able to see the play. “It’s been seventeen years,” he says. Caden promptly ignores him, either on purpose or because he doesn’t understand the point. Time is a false concept to Caden; he doesn’t have enough of it, and it literally slips right by him. He lives in a dream, surrounded by madness and symbols he can’t make out.
Time relates closely to death in this film, as lost time points surely toward inevitable death. Sammy, Caden’s stalker and shadow and eventual lead actor, tells him as much right before his suicide. “Say goodbye to Hazel for me,” he says, standing on top of the set of a building where Caden too once tried to jump. “And say it to yourself, too. None of us has much time.” Caden is a man obsessed with death and dying and sickness and disease. This much is clear from the opening of the film; he misreads a headline about Harold Pinter winning the Nobel Prize as Harold Pinter dying. (In a spot-on example of art imitating life, this is also a reference to Sky News accidentally reporting Pinter had died, before immediately correcting the report to say he had in fact won the Nobel Prize.) He reads the obituaries, commenting on how many people are dying. He examines his stool closely for any trace of blood. He sees several doctors, one after another, constantly fumbling the words for what’s wrong with him. All of this is a lot to take in, but Kaufman made clear what’s happening right from the character’s name. Caden Cotard is named for the Cotard delusion. Also known as walking corpse syndrome, the Cotard delusion is a mental illness in which someone believes they are no longer alive or are dying. While I don’t think Caden has this in a literal sense, it’s certainly important that a man obsessed with his own failing health shares a last name with a rare delusion. Death haunts Caden, so much so that, ironically, everyone around him dies before him. Adele dies of lung cancer, foreshadowed by her coughing the first time she’s on screen. Olive dies of an infection in her flower tattoos; Olive’s life wilts away just like the flowers on her arm. Hazel dies the way she agreed to decades earlier: her burning house takes her life the night after she and Caden finally begin a relationship. Sammy kills himself. Caden’s father dies of cancer, his body evidently riddled with it. His mother falls victim to a gruesome murder following a home invasion. Everyone important in Caden’s life leaves or dies, or both. And so it’s important we see Caden as both someone dead and someone alive; he outlives everyone. When the play falls apart at the climax of the movie, Caden is directed by Ellen to drive around the city without a set destination in mind. He drives past dead body after dead body, never commenting on the horrors that took place on set after decades of rehearsal. He sees an actress standing in the middle of an alley, the same actress we the audience saw moments ago in a dream state. Caden asks her where everyone has gone. “Dead, mostly. Some left.” This is true both of Caden’s professional life and of his personal one. The people he loved are mostly dead. The others all left.
These themes resonate throughout the film in a way few films can make work. That’s largely due to the talent both onscreen and behind the camera; Kaufman is an excellent writer operating, if not at the top of his craft, certainly at the height of his ambition, and Philip Seymour Hoffman carries the load of this film with ease. But the film has another trick up its sleeve as well: postmodernism. Postmodern films reject the norm, replacing it with something new and outside the established rules of filmmaking, and this film certainly does that. It’s the reason so many of the clues and hints about what is going on can be read on so many levels, depending on when and where you watch the film. For example, the film’s depiction of Caden Cotard can be read on an entirely new level, one in which Caden has committed suicide either prior to or during the events of the film. The idea is mind-twistingly complicated, but it adds a layer of emotional complexity and dreaminess to an already complex, dreamlike film. No longer working simply on a textual and subtextual level, Synecdoche, New York falls into metatextual and meta-metatextual levels of thought.
The idea that Caden is dead throughout the film follows naturally from the death coursing through its veins. Watching and rewatching the film, it becomes apparent there is a major turning point not long in, where things go from slightly off-kilter to hallucinatory. Following this reading, that is the moment Caden dies and the film enters a sort of purgatory state. It’s a difficult idea to follow, and it will sound hokey and pretentious to some, but watching the film with it in mind is, at worst, an interesting exercise, and at best, a brilliant conclusion and explanation, the period at the end of a long, classic novel. The pivotal scene occurs following Adele’s decision to leave for Berlin with Olive, without Caden. Up until this point, it is clear that Caden’s life has been falling apart and that he isn’t happy. But it’s after his family leaves that he truly hits rock bottom. He sits in his basement and watches as the cartoon on his television tells the story of his death. He watches as animated Caden falls from the sky into the ocean, only to be swallowed up and eaten. A song plays as he falls:
“There’s no real way of coping
When your parachute won’t open
You’re going down, you’re going down
You fell, then you died
Maybe someone cried
But not your one-time bride.”
There’s so much symbolism in this scene that it’s difficult to reduce it to one single important moment. Everything here matters. The scene follows Caden losing his family, losing everything he once thought valuable. The animation on the television shows him falling into the water and being courted by three mermaids, here representing Claire, Hazel, and Tammy, the faux-Hazel he sleeps with at his mother’s funeral. It is then that the voice tells him his one-time bride, Adele, did not cry for him, and it is after this scene that things begin to lose sense. He goes to see his therapist, who suddenly has a book to give him, written by herself, which seems to make no sense and follow no logic. At another, later meeting, his therapist tells him about a gruesome, dark novel called “Little Winky,” written by a four-year-old. She tells him the author committed suicide at five, leading Caden to ask why the boy killed himself. Madeleine, his therapist, responds, “I don’t know. Why did you?”
The idea of “Little Winky” is linked inherently with Caden, as seen in a poster for its film adaptation glimpsed on the street in one scene, with Caden standing in front of it. The book is described as the four-year-old author’s idea of what would become of his life in the future. The parallel invites us to read Caden’s life, and the film itself, the same way. Caden dreams of winning a MacArthur grant, of making a play about humanity and the human struggle. And yet, even in this idealized version of a life, Caden cannot fulfill his dreams. He is a failure from the start, his biggest accomplishment being the grant that began all of this in the first place.
Finally, the film returns to the time 7:45 again and again. As I mentioned earlier, the film opens with the clock switching from 7:44 to 7:45. The time appears again at the end of the film, painted on a graffiti-covered wall. Millicent, Caden’s director, finally pulls it all together, making clear what’s happening. She recites the following to Caden as he wanders the vacated, post-apocalyptic warehouse:
“...as you learn there was no one watching you, and there never was, you only think about driving - not coming from any place; not arriving at any place. Just driving, counting off time. Now you are here, at 7:43. Now you are here, at 7:44. Now you are...
-gone.”
As the recitation reaches the word “gone,” Caden has his conversation with the actress from the dream. He rests his head on her shoulder, much like a child would with a parent, a nod to his transformation into Ellen. He sits quietly as the screen slowly fades to gray, mirroring the opening frames of the film, and says to himself one final time, “I know what to do with this play now. I have an idea. I think-” before being cut off by Millicent with one final word: “Die.” The gray drowns the film, and Caden’s hallucination ends. He is dead now, both in the real world and in purgatory. There is no more Caden Cotard.
Roger Ebert cited Synecdoche, New York as an example of a film you have to see twice to get everything; there is simply too much to take in on a single viewing. I would go a step further: Synecdoche, New York is a film you owe it to yourself to see more than once. It is a life-affirming film; not a happy or joyous film, but one focused squarely on the human condition. It is, as critic Adam Johnston said in his analysis, “art-imitating-life-imitating-art-imitating-life and so on.” It is the work of one man, his magnum opus, his masterpiece, his biggest accomplishment. Much can be said about Synecdoche, New York; indeed, I haven’t even scratched the surface of how much is in this film, and I’d need another 7,500 words to do so. Synecdoche is complex, emotional, confusing, draining, funny, heartbreaking, and above all, one of the most important films one will ever have the chance to watch.
Text
Silently Drowning: Nintendo’s Quest to Save the Wii U
Author’s note: this article was originally published on May 9th, 2014. It has been republished here as a sample of my work.
Introduction: The Golden Days
In November 2006, Nintendo released the successor to the Gamecube: a small white console named Wii. Previously codenamed “Revolution”, the Wii was an oddity in the video game market: both Microsoft and Sony had focused on the hardcore market with their seventh-generation consoles, the Xbox 360 (released in 2005) and the Playstation 3, respectively. Those consoles promised high-definition graphics, strong online components, and all the first-person shooters you could want, but Nintendo went a different route. The Wii’s graphics were only slightly improved over the Gamecube’s, still running at a low 480p, complete with muddy textures and jagged polygons. The bundled game, Wii Sports, promised an experience not for the average gamer but for the average family, and the console relied heavily on motion controls, a feature some saw as nothing more than a gimmick. Regardless, the Wii sold like gangbusters, finding success among the casual audiences that Microsoft and Sony had ignored for years and becoming a must-have item for several years before demand finally quieted down. In all, that little white box sold over 100 million units, more than either Microsoft’s or Sony’s console. So in 2012, when Nintendo released the Wii’s successor, the Wii U, many expected similar results; a year and a half after its release, however, the Wii U is struggling to reach ten million units sold, putting Nintendo’s future in questionable territory, with many gamers and journalists asking where the company’s fate lies.
Part One: Cloudy Skies: The Announcement and Release of the Wii U
We open on a warm June morning in Los Angeles. Nearly every gaming journalist has gathered here for the Electronic Entertainment Expo 2011, commonly known as E3, where the major gaming companies take the stage, celebrity endorsements and game trailers in hand, ready to show the world what they are working on for the next year in gaming. This year is particularly exciting: just two months prior, Nintendo confirmed it would announce the successor to the Wii at E3, to go on sale in 2012. The industry had gone years without a new console announcement; the current generation, the seventh, had begun in 2005 with the Xbox 360 (or 2004 with the Nintendo DS, if you count handhelds) and was still going in 2011. The crowd gathered in the Nokia Theater and watched as Reggie Fils-Aime, president of Nintendo of America, announced the Wii U for a 2012 release. The console displayed high-definition graphics, an upgrade from the first Wii but nothing markedly better than the aging Xbox 360 and Playstation 3. It was fully backward compatible with the Wii’s game library and continued to support Wiimotes as controllers, alongside the new Gamepad, the Wii U’s major selling point. The Gamepad essentially acted as a controller combined with a tablet, letting you use the built-in touchscreen for items and menus, or switch off the television and stream the game straight to the tablet’s screen. Game journalists were cautiously optimistic: the Wii had aged poorly, with only a handful of solid games released during the last year or two of its lifecycle and awful third-party support, as most companies chose to focus exclusively on the Xbox 360 and PS3 for their major titles. Nintendo, however, promised strong third-party support for the Wii U, showing footage of recently announced games from companies such as Ubisoft and Gearbox that would come to the console.
The following year, at E3 2012, Nintendo cemented the Wii U’s release date and price: November 2012, at $300-350 depending on the model. The launch lineup was also announced, including ZombiU from Ubisoft and Nintendo’s own Nintendo Land and New Super Mario Bros. U. The price was a bit higher than what Microsoft and Sony were asking for their older consoles, and the launch library was small and unimpressive, but the launch itself went rather smoothly: about $300 million in sales during the launch window, $30 million more than the Wii’s 2006 launch (though the Wii was cheaper, meaning more units were sold). Worldwide sales through January were estimated at approximately 2.5 million units, a decent-to-good number; Nintendo later confirmed 3 million consoles sold during its Q3 2012. That pace collapsed over the course of 2013, with only 2.8 million units sold during the entire year. During 2013, Nintendo discontinued the cheaper $300 model and lowered the $350 model to $300, packaged with an included game and a bigger hard drive. Despite a holiday-season sales bump in late 2013, Nintendo’s Q4 2013 (covering January-April 2014) saw Wii U sales of about 310,000 units. The console has since been mostly outsold by the new consoles from Microsoft and Sony, both launched in late 2013, despite the Wii U’s year-long head start.
Largely due to the Wii U, Nintendo has taken financial losses three years in a row, a streak unheard of in the company’s gaming history, leading one games journalist to report that Nintendo is “silently drowning.”
Part Two: The Heart of the Storm: Why The Wii U is Failing in Sales
If the Wii U was what more hardcore audiences had been asking of the Wii for years, why was it not a greater success? Truth be told, there is no single problem that plagues the Wii U; instead, a multitude of complications have scared consumers away from the device. First and foremost, the Wii U suffers from poor advertising: frankly, most of the casual players who bought and used the Wii simply do not understand what a “Wii U” is. Between the name (Wii U suggests an accessory rather than a successor to the Wii) and advertisements that failed to properly show the console itself, many of which displayed only the Gamepad, the average consumer has been left to believe that the Wii U is nothing more than an expensive tablet controller for their Wii. In fact, the casual players who made the Wii such a success are another reason for the Wii U’s failure. The two consoles were released into significantly different climates; it is difficult to believe, but the first Wii went on sale just two months before Apple announced the first iPhone. In that pre-smartphone world, the Wii addressed a large demographic that no gaming company had touched since the late 80s: casual players interested in simple party games that could be picked up in a matter of minutes. Unfortunately for Nintendo and for the Wii U, those players now own smartphones that reproduce the same gaming experience right in their pocket, with plenty of free-to-play games waiting for them as soon as the device is in their hands. The casual market abandoned consoles for the iPhone, and without it, the Wii U was doomed from the start. To Nintendo’s credit, they did attempt to secure better third-party support for the Wii U, but poor sales and the difficulty of developing for such an odd controller quickly scared developers away. Electronic Arts, possibly the biggest third-party publisher in gaming today, released only three games for the system: Madden 13, Mass Effect 3, and Need for Speed: Most Wanted U. Following the release of these games, two of which were ports of older releases, EA announced they would no longer support the console, canceling planned ports of both Battlefield 3 and Crysis 3. Activision, another major third party, provided some support for the console, particularly with its more casual-oriented games, and released both Call of Duty: Black Ops II and Call of Duty: Ghosts, two entries in what is, sales-wise, one of the most important series in gaming; however, the recently announced Call of Duty: Advanced Warfare, the 2014 installment in the popular multi-million dollar series, has made no mention of Wii U support. Ubisoft, an early and strong supporter of the console, has also pulled back. Rayman Legends was originally a launch-window Wii U exclusive; the game was delayed and ported to both the Xbox 360 and PS3. The Wii U version of Watch Dogs, another Ubisoft game and one of the most hyped titles of 2014, has been delayed until September 2014, while the other versions release in May. These developers can hardly be faulted for withdrawing their support: the Wii U versions of Splinter Cell: Blacklist and Call of Duty: Ghosts both sold poorly, with the former accounting for two percent of the game’s overall sales and the latter for under one percent.
Several Wii U versions of games were also cancelled, including the Gearbox title Aliens: Colonial Marines, Battlefield 3 and 4, and Metro: Last Light. This leads to the final problem for the Wii U: there simply are not enough games on the console. Without third-party support, the Wii U has to lean exclusively on Nintendo’s first-party titles. Nintendo does have some of the strongest game series of any company; most Mario and Zelda titles are praised with each iteration. But Nintendo has not released enough games to keep the Wii U kicking; often, months go by without a new Wii U title. The last major release for the console this year was February’s Donkey Kong Country: Tropical Freeze; the next comes at the end of May with Mario Kart 8. No console can post strong numbers without games, and Nintendo’s is going three months without a major release while other consoles see titles such as Dark Souls II, Titanfall, and Watch Dogs. This is not to say the games Nintendo does release are not good; Super Mario 3D World was widely praised as one of the best Mario games in years following its late-2013 release. But with all of the problems above, the Wii U seems destined to sink further.
Part Three: Relief Fund: Where does Nintendo Go From Here?
Many journalists and internet commenters alike have suggested solutions to the problems Nintendo is facing with the Wii U. First, the Wii U has drawn comparisons to an old rival of Nintendo’s: Sega. During the 1990s, before the introduction of the Playstation, the two gaming giants were Nintendo and Sega. Sega’s final console, the poorly selling Dreamcast, has been invoked repeatedly in editorials discussing Nintendo’s future, and there is some validity to the comparison: as Kotaku reported in April 2014, the Dreamcast outsold the Wii U by nearly 2-to-1 over the same number of months on the market. These comparisons have led many to suggest that Nintendo follow in Sega’s footsteps, removing itself from the hardware market completely and focusing on developing games for Microsoft and Sony consoles. However, thanks to its success over the past three decades, Nintendo is nowhere near the position Sega was in at the turn of the century; it has cash reserves large enough not only to survive the Wii U but to build more consoles in the future. This suggestion also ignores the major success of Nintendo’s handheld, the 3DS, which, despite a similarly rough start, has seen rising sales on the strength of one of the best game lineups of any console currently available. It is also worth noting that this is not the first time gamers have demanded Nintendo go third-party: the calls date back to the Gamecube days, with gamers and journalists alike declaring that staying in the hardware business would be the end of Nintendo. This leads to the second common suggestion: develop new titles, or release older ones, for Apple’s iOS and Google’s Android platforms. The smartphone revolution moved casual gaming’s success from the Wii to mobile phones, and many have argued that Nintendo games released for iOS or Android would be huge monetary successes. However, many have also argued against free-to-play games, creating extensive parodies of how these games would work on mobile platforms. While some declare mobile the only salvation for Nintendo, others warn that releasing mobile games would drive consumers further away from buying both the Wii U and the 3DS. That said, Nintendo is releasing companion apps for its devices, most recently a Mario Kart 8 app that keeps track of your standings.
With those ideas out of the way, it is worth discussing the positive steps Nintendo should embrace, both to save the Wii U and for consoles moving forward. First, Nintendo should acknowledge that it has several IPs with large fan bases sitting unused. StarFox, Metroid, and F-Zero have not seen new releases in years, with Nintendo focusing much of its energy on the continuation of Mario and, to a lesser degree, Zelda. That is all well and good, but many fans of those dormant series have abandoned the company because their favorite games are not receiving new installments. Building on this, Nintendo needs to create new IP and market it effectively. The newest major series developed (not merely published) by Nintendo was Pikmin, created during the Gamecube era a decade ago, a fact the gaming community cites when arguing that Nintendo has ceased innovating and is coasting on previous successes. Nintendo could easily combat this perception by creating a new series. The company also needs to improve its marketing efforts.
The Wii had a successful advertising campaign built around families playing together; with the Wii U meant to be more of a single-player experience, Nintendo needs advertising that reflects that. The company also needs to market to audiences beyond children, as it did with the Wii; both Microsoft and Sony advertise at scale to mainstream audiences, and Nintendo should do the same. One key area where Nintendo still lags is online play, which has become one of the most important features of modern multiplayer games. Microsoft’s Xbox Live and Sony’s Playstation Network both give players strong online features, but Nintendo cannot seem to master online play. Finally, though Nintendo has no reason to stop making hardware, it absolutely needs to improve its consoles’ specifications, if only for third-party developers. This is crucial: developers have complained about the difficulty of porting games from the Playstation 4 and Xbox One due to the large differences in internals. Without better hardware in future consoles, Nintendo is sure to keep losing developers to more popular platforms. Truly, all of these ideas should be taken into consideration at Nintendo.
Conclusion: A Gaming Monolith
Nintendo has been creating both games and consoles since the 1980s, making it one of the oldest companies in the business. It is no overstatement to say that nearly every gamer today has played at least one Nintendo game. Nintendo has a dedicated fan base that looks forward every year to the new Mario, Zelda, or Pokemon game, and despite its recent troubles with the Wii U, Nintendo is here to stay. The company has taken a misstep, but the Wii U will not be its end. Yes, Nintendo needs to reevaluate several of its strategies when looking to the future, including its console hardware and its marketing plans, but all signs indicate that Nintendo is doing just that: the company is aware of the Wii U’s faults and is prepared to make changes. 2014 is going to be an important year for the Wii U, and as the industry approaches the third anniversary of the console’s announcement, all eyes are on the company’s next move. The games announced at next month’s E3 will either reaffirm Nintendo’s dedication to the console or show that Nintendo is prepared to move on over the next few years. For the gamers who have bought and supported the Wii U, one can only hope that some truly great games are on the horizon. Nintendo turned 3DS sales around by developing great must-have games; the true test comes this year, and hopefully the company can pull it off once more.
citations:
Ashcraft, Brian. "Nintendo Confirms Wii Successor. It's On Sale In 2012." Kotaku. Gawker Media, 25 Apr. 2011. Web. 12 May 2014. <http://kotaku.com/5795241/nintendo-confirms-wii-successor>.
Byford, Sam. "Nintendo reports third consecutive annual loss as Wii U sales fizzle out." The Verge. Vox Media, 7 May 2014. Web. 11 May 2014. <http://www.theverge.com/2014/5/7/5689878/nintendo-earnings-fy-2013>.
Crecente, Brian. "Live at Nintendo's Next Wii Unveiling." Kotaku. Gawker Media, 7 June 2011. Web. 11 May 2014. <http://kotaku.com/5809435/live-at-nintendos-next-wii-unveiling>.
Davidson, Joey. "Wii U: All That Matters Now is Nintendo's Support." TechnoBuffalo. TechnoBuffalo, 9 May 2014. Web. 11 May 2014. <http://www.technobuffalo.com/2014/05/09/all-that-matters-for-the-wii-u-now-is-nintendos-support/>.
Grant, Christopher. "Nintendo E3 2011 keynote, live from the Nokia Theater." Joystiq. AOL, 7 June 2011. Web. 11 May 2014. <http://www.joystiq.com/2011/06/07/nintendo-e3-keynote/>.
Hansen, Steven. "Everyone is right. The Wii U still needs games." Destructoid. Destructoid, 14 Jan. 2014. Web. 11 May 2014. <http://www.destructoid.com/wii-u-development-drama-nintendo-s-box-is-a-nintendo-box-268933.phtml>.
Kinsley, John. "Have third party publishers finally given up on Wii U?" Wii U Daily. Wii U Daily, 2 Mar. 2014. Web. 11 May 2014. <http://wiiudaily.com/2014/03/wii-u-third-party-publishers/>.
Kuchera, Ben. "Nintendo may be drowning, but it's invested in doing so silently." Polygon. Vox Media, 29 Apr. 2014. Web. 11 May 2014. <http://www.polygon.com/2014/4/29/5664588/nintendo-wii-u-struggling-death>.
McIlroy, Shaun. "Dreamcast Outsells Wii U, By at Least One Metric." Kotaku. Gawker Media, 20 Apr. 2014. Web. 11 May 2014. <http://tay.kotaku.com/dreamcast-outsells-wii-u-by-at-least-one-metric-1565298837>.
Neltz, Andreas. "'Nintendo's Doomed, They Should Go Third-Party!' - Said Everyone, Ever." Kotaku. Gawker Media, 26 Feb. 2013. Web. 11 May 2014. <http://kotaku.com/5986942/nintendos-doomed-they-should-go-third-partysaid-everyone-ever>.
Orland, Kyle. "How we'd save Nintendo." Ars Technica. Conde Nast, 11 May 2014. Web. 11 May 2014. <http://arstechnica.com/gaming/2014/05/how-wed-save-nintendo/>.
Ring, Bennett. "What went wrong with the Nintendo Wii U?" Sydney Morning Herald, 2 Feb. 2014. Web. 11 May 2014. <http://www.smh.com.au/digital-life/games/what-went-wrong-with-the-nintendo-wii-u-20140201-31no5.html>.
Serrels, Mark. "How To Fix 'How To Fix Nintendo' Articles." Kotaku. Gawker Media, 30 Jan. 2014. Web. 11 May 2014. <http://kotaku.com/how-to-fix-how-to-fix-nintendo-articles-1512730743>.
Smith, Peter. "The Wii U continues to drag Nintendo down; here's what they need to do to turn things around." ITWorld, 8 May 2014. Web. 11 May 2014. <http://www.itworld.com/personal-tech/417896/wii-u-continues-drag-nintendo-down-heres-what-they-need-do-turn-things-around>.
Tassi, Paul. "An Inside Explanation Of Why Third Parties Have Left The Wii U." Forbes. Forbes Magazine, 11 Jan. 2014. Web. 12 May 2014. <http://www.forbes.com/sites/insertcoin/2014/01/11/an-inside-explanation-of-why-third-parties-have-left-the-wii-u/>.
Totilo, Stephen. "The Horror: If Super Mario Bros. 3 Were Made For Smartphones." Kotaku. Gawker Media, 23 Aug. 2013. Web. 12 May 2014. <http://kotaku.com/the-horror-if-super-mario-bros-3-was-made-in-2013-for-1168392829>.