
Donkey Kong and Me 123

Posted by kdawson
from the when-men-were-men-and-code-was-16k dept.
MBCook sends us to the blog of one Landon Dyer, who posted an entry the other day entitled Donkey Kong and Me. It describes how he was offered a job at Atari after writing a Centipede clone and ended up programming Donkey Kong for the Atari 800. It's full of detail that will be fascinating to anyone who ever programmed assembly language that had to fit into 16K, as well as portents of what was to come at Atari. "My first officemate didn't know how to set up his computer. He didn't know anything, it appeared. He'd been hired to work on Dig Dug, and he was completely at sea. I had to teach him a lot, including how to program in assembly, how the Atari hardware worked, how to download stuff, how to debug. It was pretty bad."
This discussion has been archived. No new comments can be posted.

Donkey Kong and Me

Comments Filter:
  • Nep0 (Score:3, Funny)

    by Tablizer (95088) on Saturday March 08, 2008 @03:35PM (#22688134) Homepage Journal
    My first officemate didn't know how to set up his computer. He didn't know anything, it appeared. He'd been hired to work on Dig Dug, and he was completely at sea. I had to teach him a lot, including how to program in assembly

    Without RTFA, I bet it's a relative of an Atari bigwig.
           
    • Re: (Score:3, Informative)

      by calebt3 (1098475)
      He doesn't say. But he does say that his coworker was fairly typical for new hires.
    • Re:Nep0 (Score:4, Interesting)

      by iamhassi (659463) on Saturday March 08, 2008 @06:50PM (#22689174) Journal
      " My first officemate didn't know how to set up his computer. He didn't know anything, it appeared. He'd been hired to work on Dig Dug, and he was completely at sea. I had to teach him a lot, including how to program in assembly "

      Makes me miss the good ol' days when you didn't need a staff of hundreds and a multi-million dollar budget to make a good game. Back then one guy who didn't know anything could sit down and within a few months crank out a fun game for a popular console. I took a semester of assembly for CS and it's not that bad; I wrote a tic-tac-toe game as a final project where the computer randomly placed its pieces (could have had it scan the board, but that'd be too hard for players; as-is the PC wins most of the time), so I know a tiny fraction of what the author's talking about.
      • Re: (Score:3, Interesting)

        by jlarocco (851450)

        Weird coincidence. I wrote a Tic Tac Toe program in assembly the other day with the goal of making it fit in the 512 bytes of a floppy disk bootsector.

        Right now two players take turns placing either 'X' or 'O', but I have about 40 bytes left to make the computer play.

        Fun stuff.
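        A bitboard layout is one common way to make a tic-tac-toe game that small. The parent's game is x86 bootsector assembly, so the following Python sketch only illustrates the board-state idea (names and structure are mine, not the original code):

```python
# Each player's marks fit in a 9-bit mask (bit i = cell i, row-major).
# A win is any of the 8 line masks fully contained in a player's mask.
WIN_LINES = [
    0b000000111, 0b000111000, 0b111000000,  # rows
    0b001001001, 0b010010010, 0b100100100,  # columns
    0b100010001, 0b001010100,               # diagonals
]

def place(mask: int, cell: int) -> int:
    """Claim cell 0..8 by setting its bit."""
    return mask | (1 << cell)

def has_won(mask: int) -> bool:
    return any(mask & line == line for line in WIN_LINES)

x = 0
for cell in (0, 1, 2):  # X takes the top row
    x = place(x, cell)
print(has_won(x))  # True
```

        In assembly the same idea needs only a couple of bytes of state per player plus a small table of line masks, which helps explain how the whole game can fit in a boot sector.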

      • Re: (Score:3, Interesting)

        by lena_10326 (1100441)

        Makes me miss the good ol' days when you didn't need a staff of hundreds and a multi-million dollar budget to make a good game. Back then one guy who didn't know anything could sit down and within a few months crank out a fun game for a popular console.

        You know, except for the gaming part, exactly that happened with general application programming and the VB/JS/PHP programmers of the '90s and '00s, and a lot of people complained that untrained, know-nothing dolts were diluting the talent base and bringing

      • Re: (Score:3, Insightful)

        Makes me miss the good ol' days when you didn't need a staff of hundreds and a multi-million dollar budget to make a good game. Back then one guy who didn't know anything could sit down and within a few months crank out a fun game for a popular console.

        Well, nowadays, that's what Xbox Live Arcade is for.

      • Have you seen the game "Cave Story", by any chance?
      • by kvee119 (958937)
        It's not the good ol' days you're talking about. It's just that your expectations today are way different. There was little competition in the past, and the gaming industry was like uncultivated land in which resources were still plentiful. How about putting it this way: anything creative coming from you was easier to get accepted than it would be today?
  • Great guy... (Score:1, Interesting)

    by Anonymous Coward
    I've conversed with Landon online a couple of times over the years. He seems to be a super nice guy, and his blog is at times hilarious. My favorite two stories are the guy with no degree (who purports to have one) and repeatedly applies for a job, and another entry with an intimidating inscrutable bulging forehead genius who interviewed Landon for a job.
    • by creimer (824291)
      I had a job interview with the QA lead at 3Dfx before they went under. He had a mohawk, neck-to-toe tattoos, and more piercings than I wanted to count. His boss reassured me that he had piercings in other areas. That was too much info and I didn't take the job.
  • by Ieshan (409693) <ieshan@nOSpAm.gmail.com> on Saturday March 08, 2008 @03:43PM (#22688184) Homepage Journal
    Not that it has direct relevance here, but if you haven't seen it, "The King of Kong" is a fantastic documentary about "Competitive Donkey Kong". It's the tale of a guy who has the gall to challenge the world record holder in Donkey Kong and the corruption in the competitive gaming industry. It's also fantastically funny and a great time to watch.

    Highly recommend it if you're at all into gaming, but it's also a great social commentary to watch even with your non-gamer girlfriend/boyfriend.
    • Re: (Score:2, Informative)

      by NMajik (935461)
      I haven't seen it yet, but it looks interesting. Trailer is here [dejavurl.com]
    • Re: (Score:3, Informative)

      by gardyloo (512791)
      Search for "King of Kong" on YouTube. Clips are posted by PicturehouseDF. The second clip is simply named "King Kong", but it's really "King of Kong". Fun documentary.
    • Re: (Score:2, Insightful)

      by Anonymous Coward
      I've seen the King of Kong documentary, and after reading this article I was wondering how he managed to reverse-engineer the game without beating it. TFA mentions that Atari reverse-engineered their arcade ports without any help from the original developers. Given the fact that Donkey Kong is unbeatable and that only 2 people in the country have even seen the last level, it makes me wonder if he designed his own levels for the port or had to extract them from the ROM or something.
    • by BeeBeard (999187) on Saturday March 08, 2008 @05:20PM (#22688684)
      I saw "King of Kong" and rank it among my most favorite films.

      However, I would object to the notion that competitive arcade gaming is an "industry" at all. Some of the movie's best moments were when it laid bare what competitive gaming really is--a self-regulated collection of sycophants, plagued by the childishness of its most famous poster boy.

      Your assessment of the film's accessibility to nerds and non-nerds alike is completely accurate. If any Slashdot readers have a friend or girlfriend (although that might be stretching things in the latter case ;) who may not share your interest in gaming, this is the perfect film for reaching across the non-nerd aisle. I cannot recommend it enough.
    • by gertam (1019200)
      I absolutely loved this movie!! It was filled with drama and twists. And Billy is a Tool!
    • by luke923 (778953)
      There's no excuse for you not to see it if you have a Netflix account, considering Netflix offers this movie as part of their streaming service. YEAH! YEAH! YEAH!
    • Re: (Score:2, Informative)

      by Dionysus (12737)
      Funny this article should come up, since I just watched "The King of Kong" today. Very good documentary, but wikipedia [wikipedia.org] has more info (or at least another perspective) that wasn't covered in the movie.
    • It was a fascinating documentary, but it seems Billy Mitchell and the fan crew are real bitches. IMO it seems like Steve Wiebe beat the record, but the others are fanboys of Mitchell and refuse to give in. His publicly shown top score should have shown them that he had the capability to beat him.

      The part with the kid in his sent-in video was hilarious though :) Go Stevie!
    • Re: (Score:3, Informative)

      by J44xm (971669)
      I, too, found it fascinating. However, despite how well and convincingly the movie presents everything, I would encourage people to take the factuality of the events portrayed with gracious helpings of salt, as a number of the events in King of Kong have been disputed [wikipedia.org] by Twin Galaxies [twingalaxies.com] itself. Personally, I believe that it's safest to view King of Kong as a piece of fiction based on actual people and events rather than a truly factual documentary.
    • by Keebler71 (520908)
      fyi "King of Kong" is available for instant viewing on Netflix.
  • FWIW (Score:5, Interesting)

    by Anonymous Coward on Saturday March 08, 2008 @03:47PM (#22688208)
    After RTFA, you can watch some video [youtube.com] of the game (here's the Atari 2600 version [youtube.com]).
  • Open Development (Score:5, Insightful)

    by Doc Ruby (173196) on Saturday March 08, 2008 @03:54PM (#22688238) Homepage Journal
    How I wish Atari had got that guy to teach everyone how to program the Atari 800 and 400. I had to teach myself from the most cryptic, sparse and often contradictory documentation. There was an "Internet" in the early 1980s, but practically no one had access (I did), so we depended on the few published books and occasional insights in magazines like COMPUTE!, Creative Computing and Byte.

    It wasn't enough. Programming wasn't just hard because it required assembly code skills (or Forth, hah!), but because it was completely hidden territory. There was no real way to get source code from the programs that some people managed to write and distribute, and certainly no obligation for anyone to release it (except the occasional superficial magazine article).

    The competing Apple ][+, IBM-PC and TRS-80, all had BBSes full of downloadable code (often including source). Their corporate vendors each published detailed programming guides. The TRS-80 was doomed because of the direction of its corporate parent (which should have stayed in the PC business, porting its OS on Intel HW when they all upgraded from 8 to 16 bits). But IBM and Apple survived, even thrived (as we all know), because it was easy to get in the programming game.

    By the time Atari finally published its "De Re Atari", which was a good start (the source code to the OS), the small developer "community" had already chosen either Apple or PC. If Atari had taught us all how to program from the beginning, its superior hardware and attractive game platform would probably have left it a strong competitor to the PC, much as the Mac has been. But we were all on our own, and our platform disappeared.

    The same dynamic is still true on new platforms. Make it easy to develop for it, and it will survive, even thrive.
    • Re:Open Development (Score:5, Informative)

      by ehrichweiss (706417) on Saturday March 08, 2008 @04:16PM (#22688334)
      Agreed. I had been doing assembly on the IIe for a couple of years (enough to start hacking the kernel and working on my own DOS) before I got my hands on an Atari 800, and then I discovered that all of the info on writing anything for the 800 was basically useless and, as you stated, contradictory. I had at least 2 books on assembly for the Atari and neither of them got me to first base. Eventually I just dropped it and bought an Amiga, which was a lot easier to get into.
      • Re:Open Development (Score:5, Informative)

        by Dogtanian (588974) on Saturday March 08, 2008 @04:51PM (#22688522) Homepage

        I had been doing assembly on the IIe for a couple of years (enough to start hacking the kernel and working on my own DOS) before I got my hands on an Atari 800, and then I discovered that all of the info on writing anything for the 800 was basically useless and, as you stated, contradictory. I had at least 2 books on assembly for the Atari and neither of them got me to first base. Eventually I just dropped it and bought an Amiga, which was a lot easier to get into.
        WTF? The Amiga didn't come out until 1985, by which time the Atari 8-bit line had been around for years and was reasonably well-documented. (*1) Sure, in its early days, Atari (intentionally IIRC) did not release information about the 400/800 line, and caused problems for developers. However, AFAIK people mostly had them figured out by the mid-80s.

        And I don't understand how the Amiga could be easier to get into than the 8-bit Ataris; being a 16/32-bit machine, it was far more complex and had fewer obvious routes to get "into" it.

        The Amiga was neither the contemporary of, nor (at the time of its release) comparable in price with the Atari 800/XL/XE. Even if you did get your Atari then (and you meant "Amiga" rather than getting it confused with another machine), it wasn't the same mystery as it had been in earlier times.

        (*1) The same year that the Amiga came out (1985), the third iteration of the same basic Atari 8-bit hardware (now sold as the XE line) hit the streets. The 400/800 had come out in 1979, the XL line in 1983... that was *years* earlier.
        • Re: (Score:3, Informative)

          by Shinobi (19308)
          "And I don't understand how the Amiga could be easier to get into than the 8-bit Ataris; being a 16/32-bit machine, it was far more complex and had fewer obvious routes to get "into" it."

          Wtf?

          You never noticed the full hardware specs for the Amiga, that were available from the start, as well as the specs etc for the various API's?

          If those aren't obvious, you must be fucking blind.
          • by Dogtanian (588974)

            You never noticed the full hardware specs for the Amiga, that were available from the start

            By the mid-80s the Atari 8-bit hardware was well-known, regardless of how unhelpful Atari had been in the early days. In addition, not only was it a far simpler machine full stop, but having been around for six years, its ins and outs and the techniques used to get the best out of it would be fairly well established.

            The Amiga may have been well-documented, but it was a far more complex machine and being fairly new would have meant that people were still finding their way around it. Having a dictionary do

            • Re: (Score:3, Interesting)

              by Vellmont (569020)

              The guy was talking about assembly programming, so that wouldn't be applicable here.

              Actually, he was talking about assembly on the 8 bit machines. Along with Basic, that was about your only option to program in.

              The 16 bit machines had C compilers available for them, so programming was quite a bit easier.

              That may explain the "easier to get into" comment.
            • Re:Open Development (Score:4, Interesting)

              by schon (31600) on Sunday March 09, 2008 @12:21AM (#22690542)

              The Amiga may have been well-documented, but it was a far more complex machine and being fairly new would have meant that people were still finding their way around it.
              I started with a VIC-20, then to the C64, then to the Amiga. The Amiga had more complex hardware, but the architecture was *much* easier to code for - and the documentation made it a breeze to move from one to the other.

              Having a dictionary doesn't make you an expert in the English language.
              This is completely irrelevant. You don't need to be an expert in the English language to be able to write a story.

              the specs etc for the various API's?
              The guy was talking about assembly programming, so that wouldn't be applicable here.
              If you believe this, then you have absolutely *NO* idea what the Amiga documentation consisted of. I programmed the Amiga in assembly (right off the bat), and the "Includes and Autodocs" were indispensable. To say that they "wouldn't be applicable" just shows that not only have you never read them, but that you don't really know what you're spouting on about.
              • by Dogtanian (588974)

                This is completely irrelevant. You don't need to be an expert in the English language to be able write a story.

                No, my point was that having a big pile of words doesn't in itself give you the skill to string them together to write a half-decent story.

                If you believe this, then you have absolutely *NO* idea what the Amiga documentation consisted of. I programmed the Amiga in assembly (right off the bat), and the "Includes and Autodocs" were indispensable. To say that they "wouldn't be applicable" just shows that not only have you never read them, but that you don't really know what you're spouting on about.

                First off, I never claimed that I'd read the Amiga documentation. Show me where I said I had.

                Do the "includes and Autodocs" constitute or describe an API (for assembly) in the conventional sense? *If* so, fair enough, I was wrong, but this suggests a blurring of the boundaries between assembly and higher level languages. (Not that this is necessarily a bad thing if y

            • by vux984 (928602)
              as well as the specs etc for the various API's?

              The guy was talking about assembly programming, so that wouldn't be applicable here.

              Hmmm... I didn't start into assembly until DOS 3.0, but at that point the standard BIOS and DOS interrupt references were indispensable. As were the references for some of the common 'drivers', like for creating sound on a SoundBlaster, or using a mouse for input.

              I'd call all of those genuine APIs: software library functions your program could invoke.

              Not to mention the guides cover
        • Re: (Score:3, Informative)

          by archeopterix (594938)
          "And I don't understand how the Amiga could be easier to get into than the 8-bit Ataris; being a 16/32-bit machine, it was far more complex and had fewer obvious routes to get "into" it."

          The complexity has little to do with the number of bits, and the Motorola 68xxx with its flat memory model and universal registers was really easy to get into. Switching from that to the 8086 with its segments and offsets made me want to slit my wrists.
          • by Dogtanian (588974)
            Yeah, that's true. I was really using it as a (technically misleading) shorthand for the generations, since the 16/32-bit lines tended to be (essentially) next generation computers relative to the 8-bits, with an accompanying increase in overall system complexity.

            As for the x86 instruction set, I'm no expert on that, but I've seen enough that I can entirely understand why you hate it so much :/
          • by Nazlfrag (1035012)
            Agreed. Learning asm first on the 6502 then moving to the 68000 was great, but then moving to x86 was horrific. The entire Intel engineering staff should have been drawn and quartered for the literally thousands of headache inducing design decisions made.
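            The 8086 pain described above comes from real-mode addressing: a 20-bit physical address is formed as segment * 16 + offset, so many different segment:offset pairs alias the same byte. A small Python illustration of the arithmetic (not tied to any code in the thread):

```python
def phys(segment: int, offset: int) -> int:
    """Real-mode 8086: physical address = segment * 16 + offset,
    wrapped to 20 bits (the 1 MB address space)."""
    return ((segment << 4) + offset) & 0xFFFFF

print(hex(phys(0x1234, 0x0005)))  # 0x12345
# Aliasing: a different pair reaches the very same byte.
print(phys(0x1000, 0x2345) == phys(0x1234, 0x0005))  # True
```

            The 68000's flat model has no such split: an address is just one number, which is much of why posters above found it easier to code for.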
        • Just because "other people" had figured out the hardware does not mean I had access to that information back then. I was mostly poor back in the 80s (and also still a teenager), and living in a mostly rural area, so owning a computer with a cassette adapter was a luxury; modems were far out of my reach for many years, and BBSs were very hard to find in the area. Additionally, your issues adapting to new platforms/chips/etc. are not a reflection of my own.
      • by Digital Vomit (891734) on Saturday March 08, 2008 @09:49PM (#22689862) Homepage Journal

        I had at least 2 books on assembly for the Atari and neither of them got me to first base.

        You have much to learn about women, grasshopper. Much to learn.

    • Re: (Score:2, Insightful)

      by Tablizer (95088)
      But IBM and Apple survived, even thrived (as we all know), because it was easy to get in the programming game.

      They survived largely because they were targeting multi-purpose usages whereas Atari was targeting mostly games. The game crash of '82 didn't stop general computer growth. IBM thrived because of the clone market (eventually hurting IBM) and Apple survived because of the desktop publishing market it helped spark. Amiga could've had a chunk of that market, but didn't bother catering to it well.
      • Re: (Score:3, Interesting)

        by Doc Ruby (173196)
        The Atari computers weren't targeting mostly games. Atari released all kinds of general purpose SW. VisiCalc ran on Atari. It had serial ports, modems and printers.

        The problem was that the Atari corp was mostly not targeting. RTFA to see what a circus it was. In the early 1990s I knew the guy who wrote the Atari computer game version of "Millipede" (funny how the creeper programmers like to talk). He told me stories of how the halls of Atari were filled with people wired on coke so much, there literally was
    • by MrCopilot (871878)
      There was an "Internet" in the early 1980s, but practically no one had access (I did), so we depended on the few published books, occasional insights in magazines like COMPUTE!, Creative Computing and Byte.

      You needed a subscription to Antic, my friend. http://atarimagazines.planetmirror.com/antic/ [planetmirror.com]

      Many a night was wasted writing and saving to cassette the code of that mag. Always praying no one hit the light switch or tripped over the plug (Happened a maddening # of times.)

      • by Doc Ruby (173196)
        I had a sub to Antic. I've got a stack of _The Atari Connection_ magazines going back to V1#1/Spring1981. I've even got _Atari 400/800 NEWS BITS_ going back to #1/1979:

        FOR A 36 CHARACTER DISPLAY
        TYPE
        POKE 82,2: POKE 83,37
        PUSH [RETURN]

        FOR A 40 CHARACTER DISPLAY
        TYPE
        POKE 82,0: POKE 83,39
        PUSH [RETURN]

        And so began the next 4-5 years of people like me poking and peeking around memory spelunking for hooks into the science-fictional hardware we'd hooked to our TVs.

        I learned character set redefinition and display list
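        Those POKEs work because, on the Atari 8-bit OS, location 82 holds the left screen margin and location 83 the right margin, and the screen editor shows columns left through right inclusive. A quick Python sketch of the arithmetic behind the two listings (the address meanings are as documented for the Atari OS; the function itself is mine):

```python
# Atari 8-bit text margins: POKE 82,left and POKE 83,right select
# which columns (left..right inclusive) the screen editor uses.
def visible_columns(left: int, right: int) -> int:
    return right - left + 1

print(visible_columns(2, 37))  # 36-character display
print(visible_columns(0, 39))  # 40-character display
```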

    • I don't know what you guys are talking about. Guys like Chris Crawford were working miracles with that system, cranking out elegant and slim titles like "Eastern Front" on the 800 -- still one of my favorite strategy games, it really felt like a board game by Steve Jackson Games. His code was so tight and beautiful. The Atari 6502 systems were a joy to program on -- the graphics were unparalleled (color registers in a home system?!). I only wrote music software, because of how delightful the Pokey chip
      • by Doc Ruby (173196)
        I think you're just not reading the post right. We're not saying the computer itself was at all hard to program, any more than any other contemporary 8-bit assembly platform with multiprocessing (and multiple machine codes), polyphonic synthesizer, uniform device interface... Or that the HW was limited.

        To the contrary, it was an excellent HW platform - the best of its time. The problem was that the techniques and the OS code weren't available outside Atari. Guys like Chris Crawford worked for Atari. That's w
    • I remember when one magazine revealed the forbidden secrets of how to program Player Missile graphics (Atari's term for sprites). It was actually pretty easy once you knew how it worked. I had grand dreams, but I ended up just making a few sprites and making them move about with my joystick. (After all, playing with my Atari wasn't going to get my homework done, as my Mom might say...)
      • by Doc Ruby (173196)
        Your mom was right. And I dropped out of college, and retired in 2000 an Internet zillionaire. (true story)
  • by nexusone (470558) <nexusone@bellsouth.net> on Saturday March 08, 2008 @03:58PM (#22688258) Homepage
    Just to think, when I started programming that was a lot of memory. Today code seems so bloated....

    • by v1 (525388)
      Some of my most interesting projects back in the Apple II days were in assembly and were well under 16K. 8K seemed to be about the point where things leveled off once finished. The boot loader I wrote took two pages, but the part of it that was able to load the 2nd page fit entirely in the first page, so with that it could cold boot an entire disk. Amazing what you can squeeze into 256 bytes of 6502. Considering that disk IO on the // was at the state machine controller level and not just calling a ROM/Fi
    • by Kjella (173770) on Saturday March 08, 2008 @07:29PM (#22689320) Homepage
      Take off the rosy glasses and you'll remember all the bad things as well. I had a Commodore 64, and there were so many things that you couldn't possibly fit into that amount of space. For example, I doubt you can fit a simple TrueType font rendering library with hinting in 16K. I'm glad I'm past the point where I need to think about whether the 200-byte structure I'm working on is passed by copy or by reference, or whether this is a short or a long. I'm glad I can pass around a larger structure and not try to chase byte-size improvements by calling everything on a need-to-have basis. I'm glad I can use standard library functions, even when they're overkill to invoke.

      My primary metric is clear: do as much as possible with as little code as possible. By that I don't mean extreme LOC-compression or extreme cross-referencing; I'm talking about using standard functions to minimize maintenance, complexity and sources of bugs. Bloated? Well, you can say that I don't care how much memory the libraries eat, but I certainly don't want the *code* to be bloated.
      • by Megane (129182)

        The first microcomputer I used that had a hard drive was back in the early '80s. It was used to process the data for a dental lab, ran CP/M on a TRS-80 Model II and had a 5MB SASI hard drive. That hard drive was so BIG for the day that it was formatted as four partitions!

        My 8 megapixel camera makes 4 megabyte JPGs for each picture it takes.

  • by cheebie (459397) on Saturday March 08, 2008 @04:12PM (#22688314)

    My first officemate didn't know how to set up his computer. He didn't know anything, it appeared. He'd been hired to work on Dig Dug, and he was completely at sea. I had to teach him a lot, including how to program in assembly, how the Atari hardware worked, how to download stuff, how to debug. It was pretty bad.


    So, what was it like to work with Bill Gates?

    [rim shot]
  • by cgenman (325138) on Saturday March 08, 2008 @04:42PM (#22688462) Homepage
    Donkey Kong and Me

    In the fall of 1981 I was going to college and became addicted to the Atari arcade games Centipede and Tempest. I knew a little bit about the hardware of the Atari 400/800 home computer systems, and decided to make a scary purchase on my student budget and buy an Atari 400 and a black and white TV (which was all I could afford). I messed around in Basic for a while, then bought an Assembler/Editor cartridge and started hacking away on a Centipede clone. I didn't have much to go on in terms of seeing prior designs for games and had to figure everything out myself. Like most of the school problems, you really just have to work things out with a few hints from the textbooks and lectures.

    Anyone who's worked with that Asm/Editor cartridge probably bears the same deep emotional scars that I do. It was unbelievably slow, the debugger barely worked, and I had to remove comments and write in overlays of a couple K in order to squeeze in enough code. My game, which I called Myriapede, took about three months to write. I still have the original artwork and designs in my files; graph paper marked up with multi-colored pens, with the hexadecimal for the color assignments painstakingly translated on the side.

    [I had to guess at colors. All I had was that cheap black and white TV, and I had to visit a friend and his color TV for a couple of hours in order to fine-tune things.]

    The Atari Program Exchange (a captive publishing house) was holding a contest. The grand prize for the winning game was $25,000. I'd spent a semester of college blowing off most of my courses and doing almost nothing except work on Myriapede. I finished it with a week or two to spare and submitted it to the contest.

    A few weeks after I mailed Myriapede off to the contest, I got a letter from Atari that said (1) they were very impressed with the work, but (2) it looked to them like a substantial copy of Centipede (well, it was) and that they'd rejected it for that reason. The subtext was they would probably sue me if I tried to sell it anywhere else, too. I was crushed. I wound up going to a local user group and giving a couple copies of it away; I assume that it spread from there. I hear that people liked it (best download of 1982 or something like that).

    A few weeks later I got a call from Atari; they wanted to know if I was interested in interviewing for a job. I was practically vibrating with excitement. I flew out and did a loop, and made sure to show Myriapede to each interviewer; it was a conversation stopper every time. Until they saw it they kind of humored me (yeah, okay, you wrote a game), then when the game started up they started playing it, got distracted and (ahem!) had to be reminded that they were doing an interview! One of the guys I talked to was the author of Atari's official Centipede cartridge. He said on the spot that my version was better than his.

    A couple weeks later they gave me an offer. Atari moved my single roomful of stuff out to California. I flew out and spent two weeks in a hotel waiting for my things to arrive; Atari wanted me out there real bad.

    Now, there were two popular arcade games that I simply could not stand; the first was Zaxxon, a stupid and repetitive scrolling shooter. The second was Donkey Kong: it was loud, pointless and annoying. Of course, the reason they wanted me in California was so I could work on a Donkey Kong cartridge. After a few moments of despair (and faking enthusiasm in front of my bosses) I gritted my teeth, got a roll of quarters and spent a lot of time in the little arcade that my hotel had, playing the DK machine there and getting to know it really, really well.

    I should explain how Atari's arcade conversions group worked. Basically, Atari's marketing folks would negotiate a license to ship GameCorp's Foobar Blaster on a cartridge for the Atari Home Computer System. That was it. That was the entirety of the deal. We got ZERO help from the original developers of the games. No listings, no talking to the engineers, no design documents
  • by sootman (158191) on Saturday March 08, 2008 @04:58PM (#22688562) Homepage Journal
    And he's now hosting his blog on that very same Atari he used oh-so-many years ago.
  • Slashdotted (Score:2, Redundant)

    by DavidD_CA (750156)
    Apparently this blog is running on 16k, too.
  • by obstalesgone (1231810) on Saturday March 08, 2008 @06:42PM (#22689138) Homepage
    The best computer ever in the whole universe, except for virtually every other computer that has been produced since, was my Atari 600XL. Simple enough for a 5-year-old to program in machine code by copying long lists of POKE statements out of the blue pages of Antic magazine, this computer changed the way I saw the world. In fact, after only a few short years of sitting in front of a 27-inch TV typing in listings, the way I saw the world had become rather myopic.

    Until I got my first Amiga, of course. 68000 assembly language reads like a great literary work. Yes, the Amiga 500, with its unix-like (but not *too* unix-like) operating system and its non-surface-mounted giant chips named after *hot chicks*, and later, pregnant chicks, brought a 12-year-old and his potentially permanently scarring soldering iron closer together than they had ever been before. Yes, I got my first virus on an Amiga. It was so cool... and so scary. Never before had I seen a virus! Don't share floppies, kids!

    Back then, there were also machines called "macs" which were identifiable by the fact that they used completely different hardware than a PC (stuff made by Motorola.. pfft.. a cellphone manufacturer. leave it up to them and we'll soon be computing on our cellphones!!) and completely different input devices. People said we would never learn to like mice... and they were right.

    Well.. it's all gone, kids. The Mac doesn't exist anymore. Just PCs with unix-like operating systems, and PCs with Microsoft operating systems... and we still rate them on the same system... we fire up MAME, and see how well they can duplicate the Donkey Kong experience.

    I nearly beat level 2 today.
    • by oldhack (1037484)

      ...Simple enough for a 5 year old to program in machine code, by copying long lists of poke statements out of the blue pages of antic magazine...
      Ahem, that'd be data entry, not programming.
      • How dare you suggest that there could possibly be an inaccuracy or hidden agenda in my post! My post was 100% pure grade "A" bullshit.. no "B" class bullshit in it at all.
    • by iluvcapra (782887)

      Until I got my first Amiga of course. 68000 assembly language reads like a great literary work.

      ...

      there were also machines called "macs" which were identifiable by the fact that they used completely different hardware than a PC (stuff made by Motorola.. pfft..

      I know the slant of the passage is humorous, but Motorola made the 680xx line of CPUs [wikipedia.org].

  • Barrels and other creatures are XOR'd onto the screen (I had some mask-and-repaint code at one point, but it was way too slow). Mario is a few player objects (three, I think). The "prize" objects (umbrellas, etc.) are the remaining players. The XOR graphics are pretty annoying to me, but most other people didn't seem to mind.

    Wasn't there a big lawsuit in the late 80's over using XOR for mouse cursors? If so, you had provable prior art right there. Then again, it was your competitor, Commodore that was unde
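    The XOR trick the parent quotes can be sketched in Python (the framebuffer layout and names here are illustrative, not from TFA). The point of XOR drawing is that blitting the same sprite to the same spot a second time restores whatever was underneath, so no save/erase buffer is needed:

    ```python
    # Sketch of XOR sprite drawing, assuming a flat one-byte-per-pixel
    # framebuffer. All names and sprite data are made up for illustration.
    def xor_blit(fb, fb_width, sprite, sprite_w, sprite_h, x, y):
        """XOR a sprite_w x sprite_h sprite into fb at (x, y).

        Because a ^ b ^ b == a, calling this twice with the same
        arguments erases the sprite and restores the background --
        fast on 8-bit hardware, at the cost of ugly color artifacts
        where sprites overlap the playfield."""
        for row in range(sprite_h):
            for col in range(sprite_w):
                fb[(y + row) * fb_width + (x + col)] ^= sprite[row * sprite_w + col]

    fb = [0x00] * (8 * 8)               # blank 8x8 "screen"
    barrel = [0x3C, 0x7E, 0x7E, 0x3C]   # made-up 2x2 sprite data
    xor_blit(fb, 8, barrel, 2, 2, 3, 3)  # draw the sprite
    xor_blit(fb, 8, barrel, 2, 2, 3, 3)  # draw again: background restored
    ```

    The mask-and-repaint approach the author abandoned would instead save the pixels under the sprite and copy them back, which looks better but costs extra reads and writes per frame.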
  • FORTH (Score:2, Flamebait)

    by Tablizer (95088)

    The 'cartoon' sequences were given to another engineer, whose code I had to entirely replace (he originally wanted to do the job in FORTH, and didn't understand that the game couldn't afford to devote half the cartridge space to a FORTH interpreter just to make his life easier).

    That's what happens when you over-educate people: they learn all these great abstractions and ideas, and then have to do it the primitive way in the field. My first job out of college was programming in Fortran-66 (1966 standard), w

    • Re: (Score:3, Interesting)

      by tompaulco (629533)
      My first job out of college was programming in Fortran-66 (1966 standard), which had no IF blocks or WHILE loops, only GOTOs. The company didn't want to pay for a newer compiler.
      Don't hold back, tell us what year that was.
      My first programming job was while I was still IN college, in 1989, converting Fortran 66 code into the state of the art Fortran 77, if you can consider 11 years old to be state of the art. It was kind of comparable to running Windows 95 today.
      • Re: (Score:3, Funny)

        by Megane (129182)

        I got a job offer in my inbox the other day from some moron recruiter. (Apparently he is incapable of understanding that there is a "macro assembler" other than IBM 370, or that "I will not relocate" in all caps is just pretty formatting.)

        Some company in Ohio wants to convert their "Macro Assembler" code (hopefully 370 and not 360!) to... COBOL! Yes, in 2008. Way to be 20 years behind the times, guys! Maybe in another 10-15 years you'll discover SQL and the internet.

        • by Sporkinum (655143)
          That must be Diebold looking for code maintainers for their voting machines.
        • by Tablizer (95088)
          Some company in Ohio wants to convert their "Macro Assembler" code (hopefully 370 and not 360!) to... COBOL! Yes, in 2008.

          Modern COBOL is not as bad as earlier versions. It's almost tolerable for what it's usually used for. New projects can avoid GOTOs, for example.
                         
      • Yeah, and in 1994, some poor dude had to redo it in FORTRAN-90!

    • I had an interview question one time, regarding how to branch without using an 'if'. Needless to say I didn't get the job :).

      Needless to say, now I can do it :)
  • "MBCook sends us to the blog of one Landon Dyer, who posted an entry the other day entitled Donkey Kong and Me." Entitle: to furnish with proper grounds for seeking or claiming something titled: to designate or call by a title
    • Gah. That last post came out formatted differently than I had intended. Here's a correction: "MBCook sends us to the blog of one Landon Dyer, who posted an entry the other day entitled Donkey Kong and Me." Entitle: to furnish with proper grounds for seeking or claiming something. Titled: to designate or call by a title.
      • by darien (180561)
        First hit on Google [thefreedictionary.com] gives me:

        1. To give a name or title to.
        2. To furnish with a right or claim to something: The coupon entitles the bearer to a 25 percent savings. Every citizen is entitled to equal protection under the law.

        Second hit on Google [wordreference.com] gives me:

        1. give a title to
        2. give a title to someone; make someone a member of the nobility
        3. give the right to

        Third hit on Google [wiktionary.org] gives me:

        1. To give a title to; to dignify by an honorary designation.
        2. To bestow the right to do (to own, to demand, or to receive)

    • by Porchroof (726270)
      Isn't it ironic that the better communications technology gets, the more corrupt the language becomes?
  • From the article: "After DK shipped, a cow-orker of mine got a copy of the source listing" So how do you ork a cow?
    • I don't know, but you could always ask Scott Adams [wikipedia.org] or Dilbert [wikipedia.org]?

      Adams has coined or popularized words and phrases over the years, such as...cow-orker

      The strip has also popularized the usage of the terms "cow-orker"

      Maybe they'd have the answer?!

      FWIW, I'm fairly sure cow-orker predates Dilbert. I'd be prepared to wager a smallish sum on this being a Kibo [google.com]-ism.

  • Oh, the shame (Score:2, Interesting)

    by Atari400 (1174925)
    My first computer was an Atari 400 (hence the moniker), and I had a lot of fun with it, first learning Atari BASIC, then moving on to 6502 assembler (Assembler/Editor cartridge, then the bliss of MAC/65). I upgraded the RAM from 16K to a massive 48K, then got an Atari 1050 disk drive, then a Happy/US Doubler chip for the drive so I could make "backups" of my legitimately purchased software.

    The only code I wrote that got into the wild was some disk-based copy protection. I wrote the loader using an interpr
