Atari 2600 Game Development
gjb6676 writes "An article over at ExtremeTech is covering recent game development projects on the Atari 2600. The amount of cartridge space they have to work with is a sobering thought:
'A two-word file in Word 2002, for example, requires 20 Kbytes. That's 20 Kbytes, five times the amount of (ROM) space developers had to work with in the 2600.'"
Wow (Score:4, Insightful)
Re:Wow (Score:2, Insightful)
1. the programmer didn't write an efficient "Hello, world" program?
2. the point of a "Hello, world" program isn't to achieve efficiency?
3. compiler/linker/OS combinations have become much more complex?
Game Design, then and now (Score:5, Insightful)
1. the number of programmers has ballooned exponentially since the early '80s, leading to a larger number of less godlike programmers, AND programmers have become more reliant on fat libraries and limitless resources, so coding something this small would bend my brain for sure.
2. game content has changed dramatically. q bert was weird. space invaders was weird. pac man was weird. (yes, sports games did exist, but they weren't mainstream then). games today are less weird. it's either a first person shootemup, sports, or a linear fiction w/some combat.
Focusing on #2, I'd like to see if there really is some creative game writing locked away in some programmer's brain out there, or if we've become a nation of Unreal, GTA, Final Fantasy, and Madden NFL clones.
I don't mean to put down these fine games; I enjoy many console games. What I'm trying to get at is the utter weirdness of what people come up with when severely limited by resources. Facsimile and simulation are out the window, so you really have to dig deep for a good game.
We'll see, I'm very interested in the outcome. Maybe the winners of the IOCCC should check this out.
Bigger isn't necessarily better (Score:5, Insightful)
Sure, Tank and Space Invaders on the Atari 2600 weren't deep, multi-layered games, but they did provide hours of fun. Similarly, Paradroid, Wizball and even Elite, the cream of the crop on the Commodore 64, would seem dull and shallow to most of the new generation of gamers used to the depth of Grand Theft Auto 3, StarCraft or EverQuest.
But, to those of us who were gaming back then, these titles were as immersive and addictive as anything available today. Hell, I still fire up VICE (the best C64 emulator available) to play some of those titles today, and not just for nostalgic reasons: back then, without the flashy graphics and sound, games had to be immediately playable and fun or else they just didn't capture the imagination.
Who remembers breaking joysticks waggling them back and forth playing Track and Field? Who remembers the pride they felt when they finally reached Elite status? Or when they completed Impossible Mission? The sheer, unadulterated fun of playing Pong and Breakout for hours on end, not giving a damn that the last five minutes weren't at all visually distinguishable from the first five?
It's funny, but even though I'm an avid gamer I've bought fewer games in the last two years than I have in any one year before that, going back as far as 1983. Partially this is because today's games have more depth to them, but mainly it's because there are fewer and fewer titles that really enthuse me any more.
The lack of originality in the games industry today is part of it - I haven't seen a truly original game since Populous - but, ironically, I don't think that today's games capture the imagination half as much as the games of yesteryear.
The Zen of Optimization (Score:5, Insightful)
Last week over lunch a developer posed a programming problem he'd been given on a job assignment. We all suggested a similar algorithm... then I went home and coded it. Then I coded a more optimized one. And said I wanted to optimize it more. They asked me why it mattered that one version had two multiplication operations per iteration and the second had one. Why? Because it's faster, of course. That's the sort of thing that's meaningless to an enterprise middleware programmer (for the most part), but everything to a game designer. Maybe you're doing this operation 10 million times a second, and every nanosecond you shave counts.
Hacking means working with the resources you have within the constraints you've been given. It's a shame that so many developers now would look at a challenge like that and just dismiss it rather than seeing it as an opportunity to wake up parts of your brain you don't normally get to use. Why must "solve it" mean "solve it once" instead of "give me the best solution"? It's a pretty safe bet that if you stop at one solution you haven't found the best one. Why be pleased with that?
Duane
"256 bytes? It's impossible to write a game in 256 bytes! I need over 100 bytes just to pull the A20 line high and enable extended memory!!"
- badly remembered quote from a rec.games.programmer poster who just didn't get it
Re:Wow (Score:3, Insightful)
Writing apps that aren't bloated does *not* necessarily entail lots of debugging and excessive writing time.
Too many programmers have decided that doing a crummy job is good enough (since, thanks to hardware advances, people usually won't notice unless a crash turns up). As a result, the state of software engineering, and the quality of the products it turns out, is downright awful compared to any other field of engineering.
Ah yes, the 6809... (Score:4, Insightful)
I once hacked together a multi-line BBS expansion board for an Apple ][ that was 6809-based, in 17 chips: a 6809, a 6883 DRAM controller (two banks of 32K), two 64Kx4 DRAM chips, a 32K EPROM, three PALs (mostly address decoding), four 2681 DUARTs (one on the Apple side of the bus and one on the 6809 side for a serial link between them, leaving a spare serial port on the Apple side and five for modems on the 6809 side), and buffering chips for a fully independent backplane (separate from the Apple bus).
Coded the whole damn thing in assembler too.
Man, those days were fun! I think I still have that board (wire-wrapped, of course) for posterity. I remember the 6502 had this weird read-after-write behaviour that didn't jibe well with the 2681, so I had to disable odd-address reads in the memory space of the card from the Apple side.
Re:RTFA. (Score:2, Insightful)
The "only" 128 colours that the article mentions were really a hell of a lot, not just in the late '70s but throughout most of the '80s. I believe this is one reason the console survived so long. Compare the pretty gradated sky displays of some Activision titles to the weak colour palettes of the Intellivision, ColecoVision, or NES (not to mention icky PC CGA and EGA) and you can see where some of the reward comes from. The other consoles had notable technical superiority: framebuffers, higher effective resolution (not that the 2600 has a fixed "resolution" exactly), much faster processors. But the games didn't look as good on a TV because the colour sucked. You can pretty much fix the point at which the 2600 stopped production, and the last cartridges were produced, at the moment colour palettes on hardware like the SNES finally outstripped it.
The sound was actually no slouch either for the time, and you can make real in-tune music using just the internal hardware (Pitfall II doesn't count, as its cartridge carried a much later custom sound chip, the DPC). Games like Pressure Cooker and Sentinel have catchy tunes that play throughout. Sure they're repetitive, but has that really changed much in video games? Nowadays you get several hundred bars instead of just several, but most game music still makes you want to pull your teeth out just for the distraction.
The other reason the 2600 continues to attract developers is that it has no fixed limitations. Newer systems have a fixed pixel resolution (hell, they have pixels, although it's notable that the Vectrex also has a fairly thriving development community), a certain number of sound channels at a certain sample resolution, and so on. These things are known at the outset and are seen as limits on the design. With the 2600 that isn't the case: there's the feeling that you can always get it to do what you want, if you could only work out how. And you can.
more fluff and crap, less filling (Score:1, Insightful)
The people who caused the problems before XML was "the Thang" will simply hack up XML, in both the literal and the spiritual sense, making it into what some complain is just extra bloat. Well, duh! Of course it is, but it doesn't have to be that way. It's just a system/tool, not a buzz fest and not the magical elixir of productivity. The magical elixir of productivity is vigilant design and actual ENGINEERING (no, not people with degrees doing the development, but actually using a structured and effective method of design, analysis, implementation, test, and deployment; what we have today is too many hack-shops).

A cop who has spent any real time in the inner city will tell you to NEVER use a gang banger's weapon (as in, if it is, ahem, borrowed from a turd and is now needed). Why? Because the average gang banger doesn't know shit about proper weapon handling, care, or just plain anything proper about using a weapon like that. The result could very well be the gun blowing up in your hand or face. Here we have software much like the gang banger's gun: you are taking a pretty big risk if you are basing your adoption decisions on showy, superficial aspects like the sheer size/revenue of the company, the cuteness of the commercials (even if you don't admit it), or the slickness of the salesmen.

Good buyers know how to look for relevant facts. The size of the company is too often treated as _THE_ factor in these decisions, at the expense of recognizing (and thus acknowledging) patterns in past performance. The downside of said big companies is the entrenchment of bureaucracy and the institutionalization of the entire process, from concept through production to the post-deliverable support system. You end up with less product/service for more money. It is a curve: up to a certain point it is possible to increase efficiency, but then you reach a sort of neutron cancellation of efficiency, product, and support.
The industry will soon go through a reorganization stage (I am guessing) that will see the rise of many small companies and the shrinking of the behemoths... which will later witness the conglomeration all over again. It is a continual cycle that feeds off of the trend/fad adoption nature of people as well as their innate ability to replace vigilance with complacence.