As I sidled up to the sofa for yet another hour of Sony's MLB 14: The Show for the PlayStation 4 yesterday, it dawned on me that, despite the obvious sports trappings, it really is the ultimate action role-playing game (RPG), setting a standard that the more typical fantasy-themed games in the genre would do well to emulate. Now, don't get me wrong: for the most part, MLB 14 is a standard sports videogame, one obviously themed to the well-worn game of professional baseball. However, among its cavalcade of modes it does have Road to the Show, which is as much of an RPG as any RPG that ever RPG'd (or something like that).
Road to the Show lets you create a baseball player from scratch. You have a pool of stat points to distribute over a wide range of abilities (hitting, throwing, running, fielding, etc.), and you determine physical characteristics, design the player's features, choose a preferred position, decide on the player's age, and so on. In short, you can mould exactly the type of character you want to play, albeit only a male one (you can thank Major League Baseball for that particular restriction), right down to the name, which can even be spoken by the announcer who calls the games if you choose something common enough. My first name, "Bill," was there, but not my last, so I chose the nickname "Train" - as in, "freight train - look out!" - for my last name (don't judge me!).
After spending quite a bit of time recently on various discussion forums on AtariAge and Facebook, it has really struck me more than usual how incredibly demanding our retrogaming community (and gaming community at large) is, and how entitled, as the title of this blog post states, some people come off as. This is of course nothing new, going back to the days in the late 1990s when MAME developers would get criticized or even threatened when someone's favorite game wasn't properly emulated, as if the monumental task of emulating what is now thousands of arcade machines, for free, wasn't stressful enough. For some, that one unemulated game was the deal breaker, no matter the countless other games that worked or the incredible accomplishment the project represented in and of itself.
Of course, this kind of criticism has continued since. In my reviews over the past few years of the Atari Flashbacks 3 and 4, Sega Classic Console, and other similar devices, the negativity from viewers around those releases was frequent and loud. Whether it was not getting the sound quite right in the Sega stuff, or missing a personal favorite game in the Atari stuff, the vitriol flew fast and furious. This included statements like, "No game x? It's a fail," or "The sound isn't quite right so I couldn't possibly use it." That's fine - individually we can dislike things for any reason we so choose - but then going on to state that people are idiots for buying it, or why would anyone want it, etc., and then going on what seems like a personal crusade to criticize said device at every possible opportunity (and, as we know, the Internet provides lots of opportunities) shows a remarkable lack of perspective. Take the examples in this paragraph. You're talking devices with, say, 80 built-in games and original-style controllers that typically retail for just $40. Can't we consider that maybe it might be OK to accept a few trade-offs for something so low cost that offers relatively so much? Not for some, because apparently that one missing game is a personal affront, or that tinny sound makes it completely worthless.
Think gesture input on tablets is new? The video below proves otherwise. It is amusing to see how the storage media and display technology of the day struggle to keep up with the innovation here, but it is still extremely impressive.
It's a demo of a system from the 1970s used to document PCB and IC drawings. Goodness knows how much this beast cost in its day, but it's stated that it cut certain jobs down from days to a couple of hours, so, given the expense of hiring engineers, I'd guess it paid for itself in a reasonable amount of time.
OK, the display itself here isn't touch sensitive, and modern displays that detect multiple simultaneous touch points are a significant development, but I honestly can't see modern tech being much more effective for this particular application.
I hate that the latest "kids react to old computers" video (this time centered around the Apple II) is making the rounds everywhere. Besides the fact that this same click-bait gimmick has been done multiple times before with other computers, it proves nothing. You can put just about anyone of any age in front of just about any old computer and they likely won't know what to do with it beyond possibly knowing how to insert removable media and then stumbling around for the rest. Every computer back then had its own set of commands and its own way of working beyond the basics. Even someone who is highly skilled with one brand of vintage computer won't necessarily have a clue how to work with a completely different brand. I've certainly experienced this phenomenon myself, especially since I work with dozens of different vintage computers each year (Pro Tip: Keeping command "cheat sheets" handy is a big help!).
And no, today's computers and mobile devices haven't made anyone "stupid" or "lazy." Today's computers and mobile devices - as you would hope from almost 40 years of evolution in the home - are merely more user friendly. Personal computers back then always strove for that as well, but there were obvious limits given the technology.
After seeing yet another topic on AtariAge about why the Commodore 64 (C-64), released in 1982, succeeded in both sales and software support where the Atari 8-bit series, released in 1979, didn't, I thought I'd offer up my usual thoughts on the matter in a more formal manner. To my mind, it's pretty simple. While the Atari 8-bits had a roughly three-year head start, Atari wasn't able to make much headway in the market in those three years despite having the best audio-visual potential of the time, bar none. The missteps with the lovely, but initially flawed, Atari 1200XL didn't do them any favors either. By the time the C-64 started picking up significant momentum in 1983 - its retail price dropping to the point where no one could compete effectively with its value proposition and still turn a profit - Atari was already done, particularly since they lacked Commodore's supply chain advantages.
Certainly price was a factor in the C-64's success in the US, but in the rest of the world, particularly Europe, price was often the primary driver (e.g., long after the US standardized on reliable, but expensive, disks and drives, Europeans were still using unreliable, but cheap, cassettes and tape decks), making Atari's inability to produce a low-cost 8-bit in a timely manner particularly devastating. The influx of talented European programmers into the C-64's software pool shouldn't be underestimated, as the Atari 8-bit line struggled to make it into homes there. It also didn't do Atari any favors that they had multiple models out in the wild with 16K - 64K of memory at that time, making it difficult for developers to target the higher spec. Nor should we underestimate the value of every Commodore 64 having 64K from its first day on the market to its last; ports were simply less likely to platforms without a significant user base of guaranteed 64K-spec machines.
Inspired by a discussion on the Mid-Atlantic Retro Computing Hobbyists Yahoo! Group related to the recent VCF East 9.1 event and whether certain computing platforms should or should not be present at the museum location, I decided to offer up my thoughts on the often argued issue of what exactly constitutes "vintage" when it comes to computing hardware. Of course, me being me, I'll touch on videogame and mobile hardware as well.
It has been said that there's no one right answer for what constitutes "vintage," as it's naturally a constantly expanding target due to the simple passage of time. While this is true in the absolute sense, it doesn't mean that we as a community can't create an effective dividing line, no matter how much time passes, particularly once we introduce the concept of "intrinsic value" being tied to "vintage." For instance, I think we can all pretty much agree that generic PC DOS and Windows systems past a certain vintage - say mid-1980s - are generally out, which covers nearly all of the countless PC clones that continue to get produced to this day. It's not that some of these don't meet the basic criteria necessarily, it's that there's nothing notable about these boxes that anyone and everyone, be it a company or individual, could, did, and still do put together. It's even arguable that some of the parts - particularly certain expansion cards, like for video or sound - are worth more than the sum of the box, which is pretty telling for how we should generally value them in our determination of what is "vintage" and worth preserving and appreciating.
Sony just introduced their streaming games service. Why buy a PS4 if you can stream those games to a PS3?
What about the pricing model?
Games aren't movies! Netflix works because of the subscription model and the low price. In the short term, Sony may be able to squeeze some money out of old games if they use a sensible subscription pricing model, but it won't work for new games, which need to earn money and cover development costs. In the long term they will run into issues and will have to raise the price.
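To make that argument concrete, here's a rough break-even sketch. Every number in it is a made-up illustration (none of these figures come from Sony or Netflix): an assumed retail price, an assumed monthly fee, and an assumed fraction of each fee that reaches any one title in the catalog.

```python
# Back-of-the-envelope subscription economics with purely hypothetical
# numbers; the fee, price, and revenue share are illustrative
# assumptions, not actual Sony figures.
RETAIL_PRICE = 60.00    # assumed one-time purchase price of a new game
MONTHLY_SUB = 10.00     # assumed monthly subscription fee
CATALOG_SHARE = 0.05    # assumed fraction of each fee reaching one title

def months_to_match_retail(retail: float = RETAIL_PRICE,
                           monthly: float = MONTHLY_SUB,
                           share: float = CATALOG_SHARE) -> float:
    """Subscriber-months one title needs to earn what a single
    retail sale would have brought in."""
    return retail / (monthly * share)
```

With these made-up numbers, a single title needs 120 subscriber-months to equal one retail sale - tolerable for a back catalog that costs nothing new to produce, but untenable for a new release that has to recoup its development budget before the subscription trickle adds up.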
Considering the amazing number of devices most of us have access to these days, including smartphones, tablets, consoles, set-top boxes, and computers, I'd be curious to know how everyone goes about playing. Do you stick to a handful of devices (and if so, which ones) or do you like to sample from everything that you own? What if you're like me and also have a collection of vintage platforms to choose from as well? There's a point, of course, where you hit "option paralysis," with so many gaming options to choose from that you tend not to play much of anything. Have you reached that point?
As for me, I find my habits fluctuate greatly. One week I might be on a vintage platform kick, while another I might exclusively game on my tablet or PC, while another still I might pick a recent console. Other times I want to play multiple things on multiple systems and end up either not being able to choose or being limited by real-world demands on my time (or energy), despite my enthusiasm otherwise. I suspect this will get worse as the two latest consoles get released this November, interest in the previous generation of systems wanes, and we have to start making decisions about what to do with these now "legacy" consoles. Of course, that's to say nothing of things like low-cost Android devices and even the upcoming "Steam Box," which will add further options (and confusion) to the mix. All these choices are truly both exciting and overwhelming.
So, what's YOUR plan of action?
As an unapologetic technophile, I naturally crave the latest and greatest technology. However, somewhat stifling those cravings is the reality of the high cost of new technology, limited available space, and the needs of my present workflow. In other words, even though I spend a disproportionate amount of my money on technology, my purchases must still be carefully considered for a variety of reasons.
While I have a demanding day job as a Technical Writer, I'm also a professional author and journalist, work that requires a certain amount of portability if I don't wish to be chained to a desk for 12 - 16 hours a day. This portability is particularly important to me, as I always try to make a point of balancing my working life with my personal (especially family) life.
On Tuesday, May 21, we'll have the next Xbox announcement. Nintendo has obviously already played their hand with the Wii U, an intriguing, but possibly failed, gamble on a mix of current-gen technology with tablet paradigms, and Sony has shown much of what they'll be offering with the PS4, a "social" next-gen console that emphasizes its access speed for everything from updates to getting to play games/demos without much, if any, delay. Interestingly, Microsoft was first out of the gate this current generation, but will be last to make their announcement, no doubt able to afford the wait thanks to positive momentum in the past few years (everywhere except Japan, of course).
In any case, the rumor mill has been quite active, obviously, with the usual mix of thoughtful and not-so-thoughtful claims. You can read all about those elsewhere, but here are my thoughts on what is and isn't likely: