Red Dead Redemption: First Impressions




Okay, Mr. Marston, I know it's a little late in the game to really call these my 'first' impressions. Granted, while this is the first time I've written about the game here, I've already sunk 10+ hours into it over the past week. But here goes.

I was somewhat skeptical about the game given my (still) mixed feelings about Grand Theft Auto IV. As a pure game, sure, GTA IV is well done up to the point I've played (about 10 hours as well), but it also failed as an open-world game. The NPCs didn't allow you to be your own character. Content was NOT at your own pace--at least not without NPCs grumbling about you 'ignoring' them, failing to help against gangster attacks, and whatnot.

 I'm never going to call you on your cell phone.

Red Dead Redemption fixes that glaring design problem. Missions are strictly at your own pace. Heck, if you wanted to explore "Oblivion" style for twenty hours, you could. I'd get a little bored just doing that, because the NPCs outside of the storyline (main and side quests) are really just scenery--albeit scenery you can shoot. Similar to GTA IV, missions are themed around certain main characters and are triggered by coming to designated locations. But, unless you specifically trigger a mission, the plot will not advance. Period.

After all, what's the point of having an open world game without giving the player the freedom to explore and consume content at their own pace? I've really enjoyed having that freedom, especially since the game allows you to be distracted by just about anything unless a mission is in progress or you've taken on a bounty. Shoot animals, ride all over the wilderness, trigger 'mysterious stranger' side quests, etc. Defend yourself against seemingly endless supplies of bandits every time you take down a bounty target.

There are a lot of things I could write about, but mission control was my number one concern going into the game--and Rockstar has finally restored the 'open' to their brand of open-world gaming.

A Manifest Destiny for the 21st Century?


Disclaimer: This is, in a sense, what I personally believe--and yet in another sense, what I will be examining here is something I have very real hesitations about. Think of it, perhaps, as a (hopefully) interesting thought experiment in how to change American foreign policy.

Defining Our Terms

A quick visit to Wikipedia reveals that the term 'manifest destiny' can be used in a variety of ways, not all of them mutually compatible. In order to have a productive discussion, it's in our best interest to clearly delineate exactly what I'm referring to when I use it.

When I say 'manifest destiny', I mean the idea that America--in her particular brand of republican democracy--has something to offer the rest of the world. Many things could be improved, including a return to greater economic freedom, less progressivism in the tax code, etc. I have no interest in whitewashing or idealizing the United States as it actually exists.

Which, of course, is part of my interest in defining the term. 'Manifest Destiny' was the guiding principle behind much of early American expansionism, but there's little virtue to be found in historical episodes like the Indian Wars. I recognize that any talk of expansionism--particularly via military means--must be very carefully considered. The reality of unintended consequences demands that the proven benefits to be gained vastly outweigh the foreseen costs in lives, money, and temporary degrading of conditions in the acquired territory.

But decades down the road, who can deny that the American Southwest is far better off being a part of the United States than Mexico? It is possible, of course, that if Mexico retained the Southwest it could be a better place, but I'm not holding my breath for a successful historical revision to be written.

Change isn't always for the better. And war is a last, worst, alternative. But the success of the United States has to amount to more than a historical and geographic accident. The expansion of borders is something that should and must occur.

Tensions in Foreign Policy


The United States, especially since the turn of the 20th century, has struggled between the poles of isolationism and interventionism.

The isolationist tendency can be summed up as follows: given that republican democracy is essentially about self-determination, who are we to interfere in other sovereign nations' affairs? We certainly don't want them meddling in ours--and, accordingly, such meddling is treated as the only true justification for action. Live and let live, so to speak.

The interventionist tendency views self-determination through some form of democratic representation as the ideal government for every geo-political instance. Countries that do not possess such regimes, particularly those that are oppressive rather than pseudo-benevolent, are targets for intervention. These targets can be prioritized in a variety of ways, but all must eventually be dealt with--and by military force, if necessary.

Strictly speaking, interventionism requires action in the face of any human rights issue. The problem is that many such issues remain internal to the given country (and thus action technically requires violating the sovereignty of another nation, something which the isolationist tendency resists) or do not directly impact the national interest of the United States. The horrible genocides in Cambodia and Rwanda are two such examples.

Any reasonable person regards these tragedies as horrible examples of human depravity. Yet few people argued for unilateral American intervention, content instead to let the toothless UN preside over both countries' further nosedive. This is because even the strongest interventionist must be pragmatic. We are one of the few free countries--and have finite resources. As much as American force could have helped, I totally agree that--in current terms--we can't be the world's policeman.

Policing the world--in a world populated by numerous sovereign entities--is a non-starter. The best argument for this is the current status of Europe. These countries were free to bankrupt themselves with incredible social expenditures--and comparatively little on defense--because they have been protected by America's defense umbrella, both strategic and conventional.

Moreover, even wars fought out of a desire for national security (however debatable), like Iraq and Afghanistan, become increasingly difficult to justify because there is little tangible return on our investment. We have spent incredible amounts of money and lives--and on what? The world is nominally safer, I would argue, but these countries have neither geographic nor natural economic ties with our own. We're promoting freedom, and that's never a bad thing, but the maintenance required taxes our resources--and weakens us for the sake of strengthening another sovereign entity.

This self-sacrifice is made out of a (current) desire to not appear 'imperialistic'. Frankly, though, the more I think about it, why is that such a decisive argument? Now, I'm not going to be advocating actual raw conquest, but why shouldn't we get something tangible out of our toils? Isolationism, given a global economy and the increasingly long reach of 'rogue' states, is untenable. Interventionism becomes increasingly difficult because it represents a constant net drain on resources, both material and personnel. There has to be a third way.

A Model for Justified Expansionism

This article has gotten long enough, so I'll try to conclude with reasonable brevity. The model I have in mind envisions a sort of commonwealth. The interchange of citizens, economy, and culture would entail the inevitable absorption into the original country--and the making of many sovereigns into one. Here interventionism can actually pay off, in new blood, new natural resources, etc. Not that it would be easy--after all, look at the drain the former East Germany was (still is?) on the re-unified German state.

Several simple reforms would likely make this easier. For one, I strongly advocate a simplified tax code analogous to Paul Ryan's proposal in his Roadmap for America's Future. A possible structure would be as follows:
  • Abolish the corporate income tax. Corporations are legal fictions and cannot, ultimately, pay anything. All corporate taxation is taxation on a combination of employees' wages and compensation, shareholders, and customers. It's a double tax that only hurts the economy--especially one that is trying to aggressively expand.
  • The remaining personal tax should either be completely equal (a flat tax) or as un-progressive as is reasonably possible. An example would be, say, 10% up to $100,000; 25% on anything beyond that.
  • New members of the commonwealth would only be taxed at the 10% rate--no cap. This would encourage the entrepreneurial members of society to move from the states to the new territories most in need of development.
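To make the arithmetic of that bracket structure concrete, here's a minimal sketch in Python. The bracket boundary and rates are just the example numbers above; the function names are my own.

```python
def tax_owed(income, bracket=100_000, low_rate=0.10, high_rate=0.25):
    """Two-bracket personal tax: low_rate up to the bracket, high_rate above it."""
    if income <= bracket:
        return income * low_rate
    return bracket * low_rate + (income - bracket) * high_rate

def commonwealth_tax_owed(income, rate=0.10):
    """New commonwealth members pay only the low rate--no cap."""
    return income * rate
```

Under these numbers, someone earning $200,000 in the states would owe $10,000 plus $25,000, or $35,000 total, while the same earner in a new territory would owe a flat $20,000--exactly the kind of incentive to relocate that the proposal is counting on.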
A commonwealth agreement would be the next logical step beyond free-trade agreements in the North American region, especially with countries like Colombia, where we also have some limited sharing of military resources. The benefits would run both ways. America gains new territory, citizens, and resources. The member entities (no longer sovereigns, but not yet states) would gain a unified economy and legal code, the benefit of a sponsor aggressively improving infrastructure and military assets, and many other things.

Eventually these members could be absorbed as full states, though that part of the process would be a long way down the road.

What do you think about this? I know it's a little (a lot?) wild, but do you see the point about isolationism and interventionism?

Please comment!

Telling Stories Without Words


This is just a brief observation I wanted to get down while I was thinking about it. . . .

I've been playing a lot of Fallout 3 lately, but just tonight I noticed a couple of interesting details that really bring the world home to me. One (for those of you who've played, it's at the raider outpost in Cliffside Caverns) came after fighting my way through a raider-infested series of caverns, only to discover that these caverns abut a series of caves that are home to Yao Guai. On both sides of the door leading to the new set of caves, frag mines had been placed.

This really humanized the rather faceless (and voiceless) antagonists of the Capital Wasteland. Yeah, raiders are a real pain-in-the-ass, but they're people too. And they sure as hell don't want to be eaten by the Yao Guai.

 
 You want to be my lunch?
It's a small detail but one that kind of stuck with me.

Another similar moment came at the Warrington Trainyard. Here you encounter an area fairly infested with feral ghouls. But as you clear the area, there's a broken catwalk overlooking the yard with decayed skeletons piled around a bunch of ammo boxes and a missile launcher. People died fighting here--but no one ever tells you their story or puts it in a cut-scene. It's just there, for you to find and make the inference.

 
Not the greatest angle, but it gives some idea of the layout.

Further up a hill nearby is a pseudo-bunker with more remnants of the combatants--and a sniper rifle placed on the window-sill. It's a design decision, of course, but unlike a more typical FPS setup, the sniper rifle isn't placed there for you to use against the ghouls. You've already killed them if you came from the anticipated direction. The rifle is there because someone died trying to provide 'overwatch' for those poor souls on the catwalk.

Amazing how well-crafted worlds sometimes don't even need words to tell their stories.

Economic Models for RPGs


The RPG genre in video-gaming is very often a love-hate thing. The purpose of this article is not to examine the genre itself or laud any of its components. Rather, I want to look at an often over-looked aspect of RPG design: the in-game economy.

An essential feature of an RPG is its leveling system, and part of that system is the continued upgrading of the player's equipment and, depending on the game, the equipment of the party members. Given the length and depth of most RPGs, the economy in the game is a crucial design element--tying in to its accessibility to new players, its fun-to-play factor, and quite possibly its replay value. I'm going to try and make a brief survey of some basic types, their pros and cons, and then leave it open for you the reader to fill in what I know are many blanks in my knowledge.

Speaking of blanks, I'm going to make the first category one that I admit I have little direct experience with:

Loot Drop

When I think of loot drop in games, I instantly think of the Diablo series. I enjoy the games, but I have wrists that are irritated by excessive mouse-clicking, so I haven't exactly been able to really get into them. But the sheer depth and deliciously semi-random nature of the weapons and other items that drop from dead enemies fuels the addiction of the dungeon-crawling game.

From a totally different angle, there's the recent game Too Human. I played the demo, and, well, there were a lot of things I could criticize about the game. But here the 'loot drop' model is very poorly constructed. Erik Brudvig in his review for IGN begins by saying that "if you love collecting random loot drops, you'll find a lot to like here." He then adds later: "the game puts a big emphasis on leveling your character and outfitting him with better equipment, but then scales enemies and loot drops to you. The thrill of finding a sweet weapon is nearly lost -- you're stronger but soon enough the enemies are too." I seem to remember another reviewer (but I can't find the article) also commenting that, since many weapons are class-specific, you can have a 'supreme sword of ass-kicking' drop, only to find out that you can't equip it because you're not the right class.


I enjoy loot dropping and foraging in RPGs, but the design mistakes in implementing this particular in-game economy turned me off.


Finite Leveling

To a certain extent, everything is derivative of (or somehow similar to) the loot drop model. One way, however, this model changes is when leveling is capped--not via a hard cap, but by a 'soft' or 'hidden' cap. This hidden cap comes into play in games that literally have only so much experience to be earned, usually by having a finite number of enemies--enemies that, once cleared, never respawn. First in my mind when it comes to this model are two games from the same company: BioWare's Knights of the Old Republic and Mass Effect.


Let's take the classic first. There are many things to laud KOTOR for, but its economy is more of a mixed bag. There are definitely items worth having--especially armor--at high-dollar amounts that are unique to certain vendors. At the same time, the tuning is way off.


**SPOILER**
During the fight through the final planet before the Star Forge, my inventory was filled--FILLED--to the brim with countless blasters, low-level lightsabers/crystals, and other miscellaneous items. But there was no opportunity to cash in for credits, because, well, the endgame had already begun. So I finished the game incredibly rich, but to no use. Yes, it's more of an annoyance, but every time I've played the game I always think it's kind of odd. Sure, there are plenty of useful items like health packs, but. . . .


One thing I have to praise, however, is a side story many players probably never uncovered. One of the locations you can visit in the game is a space station orbiting Yavin. It's relatively unremarkable, though a decent place to stock up on supplies. But if you visit enough (three or four times, if I remember correctly) you get drawn into defending the proprietor from an especially nasty cadre of thugs.


This is a HARD fight--easily the hardest fight in the game. What's your reward? I forget if there's anything you get outright, because that's not the biggie. The owner gives you the opportunity to browse his 'special' stock: the two best lightsaber crystals in the game. They're not cheap (20,000 credits a pop) but well worth it. I love the optional nature of the fight, and the fact that the reward is more than worthwhile.


In many ways, the later Mass Effect went the wrong direction. Salvage and loot drops in KOTOR are occasionally rewarding in themselves, but mostly serve to build up credits through re-sale. The balance in Mass Effect is almost entirely reversed. I've played the game front to back twice, with several partial starts as well. Even with combat kicked up to the hardest difficulty level available, I found items worth purchasing on (at best) an occasional basis. There are tons of vendors--even a quartermaster aboard your own ship!--and almost nothing worth buying. The equipment I gained by looting was always better, and I ended each playthrough with an absolutely ridiculous amount of credits. And, honestly, I stopped picking up items towards the end unless they were actually better than my current loadout.


Foraging Realism
Several games could fit the bill here, but my main experience is with the Elder Scrolls III: Morrowind, The Elder Scrolls IV: Oblivion, and Fallout 3. How different could these games get from those above?


All three, but especially Oblivion and Fallout 3, have hard level caps with essentially infinite enemies. It's not so much that areas re-populate with enemies (though many do) or that there are random battles (a la JRPGs like Final Fantasy games), but that your character is maxed out long before you've seen and done everything. These are 'completionist' games--games that take serious addiction/love to actually exhaust the content.


That's one difference. But how does that affect the economy? The sheer expansiveness of the world allows for some interesting design decisions. For one, all three games are filled with essentially useless items. Some are genuinely useless, while others weigh far too much (compared to their value) to make them worth carrying back to a store for re-sale.


Then there's the weight of items in an inventory. I didn't really mention this concerning the games above, but this is the only model that makes sense. Certain items weigh more--in a quite realistic progression--and your character can only carry so much. This restricts the foraging and makes it more purposeful. *I realize Diablo could fit in here, but it's not quite the same. Feel free to prove me wrong, though!*
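For concreteness, a weight-capped inventory like the one these games use might look something like the following sketch. The item names, capacity number, and tuple layout are my own illustrative choices, not taken from any of the games:

```python
class Inventory:
    """Weight-capped inventory in the Morrowind/Oblivion/Fallout 3 style."""

    def __init__(self, capacity=200.0):
        self.capacity = capacity
        self.items = []  # list of (name, weight, value) tuples

    def carried_weight(self):
        return sum(weight for _, weight, _ in self.items)

    def add(self, name, weight, value):
        """Refuse pickups that would over-encumber the character."""
        if self.carried_weight() + weight > self.capacity:
            return False
        self.items.append((name, weight, value))
        return True

    def best_loot_first(self):
        """Sort by value-per-pound: the forager's rule of thumb for
        deciding what's worth hauling back to town for re-sale."""
        return sorted(self.items, key=lambda item: item[2] / item[1], reverse=True)
```

The `best_loot_first` ratio is exactly the mental math these games teach you: a heavy, cheap item fails the test and gets left on the ground, which is what makes the foraging purposeful rather than compulsive.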



In contrast, games in the previous category either have no cap on inventory or a purely arbitrary one (Mass Effect is capped at 200 items, regardless of item type). That just doesn't make sense, especially the arbitrary caps. A weight-based cap is more realistic, and from a design perspective, it creates a nice rhythm of 'explore, gather loot, visit town, resell, repeat'.


And, if that's not enough realism for you, another element comes in: your weapons and armor degrade and require repairs. The Elder Scrolls games and Fallout 3 differ slightly in terms of mechanics, but the central feature is attrition. Especially in Fallout 3, it's sometimes difficult to survive--simply because you need more money to buy x, but to buy x you need to explore for loot. And looting wears out your equipment, and usually means getting injured fighting enemies, and that requires expensive medical treatment.
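That feedback loop can be made concrete with a toy simulation of one loot-and-repair cycle. Every number here--loot value, wear rate, repair and healing costs--is invented for illustration, not pulled from Fallout 3:

```python
def raid(caps, condition, loot_value=120, wear=0.15,
         repair_rate=200, heal_cost=40):
    """One cycle of the attrition loop: selling loot earns caps, but the
    trip degrades your gear and gets you hurt, so repairs and medical
    bills eat into the take. condition runs from 0.0 (broken) to 1.0."""
    caps += loot_value                         # sell what you foraged
    condition = max(0.0, condition - wear)     # weapons/armor wear down in use
    caps -= heal_cost                          # stimpaks / doctor bills
    caps -= (1.0 - condition) * repair_rate    # restore gear to serviceable
    condition = 1.0                            # fully repaired for next trip
    return caps, condition
```

With these toy numbers, each cycle only nets about 50 caps out of 120 in loot--the rest is consumed by upkeep, which is the attrition pressure that keeps the economy tight.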


I recently restarted Fallout 3 with combat kicked to the hardest level, and it's a struggle to have enough ammo on hand and stimpaks to heal myself for certain combat situations. This makes the game that much more addictive.


So What's the Point?

As with anything, there's no absolute right answer, but the foraging realism model seems superior to me. Maybe it's just because I'm a fan of the particular games in question, but it allows the game to fall into a natural rhythm, similar to the FPS genre's '30 seconds of fun' model executed so well by the folks at Bungie.