Building an NES Emulator

An anonymous reader writes: Programmer Michael Fogleman recently built his own emulator for the original Nintendo Entertainment System. He's now put up a post sharing many technical insights he learned along the way. For example: "The NES used the MOS 6502 (at 1.79 MHz) as its CPU. The 6502 is an 8-bit microprocessor that was designed in 1975. ... The 6502 had no multiply or divide instructions. And, of course, no floating point. There was a BCD (Binary Coded Decimal) mode but this was disabled in the NES version of the chip—possibly due to patent concerns. The 6502 had a 256-byte stack with no overflow detection. The 6502 had 151 opcodes (of a possible 256). The remaining 105 values are illegal / undocumented opcodes. Many of them crash the processor. But some of them perform possibly useful results by coincidence. As such, many of these have been given names based on what they do." It's an interesting look at how software and hardware interacted back then, and what it takes to emulate that in modern times. Fogleman released the source code on GitHub.
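
Fogleman's emulator is written in Go, and a common way to structure a CPU core with 256 opcode slots (151 documented, the rest illegal) is table-driven dispatch. A minimal, hypothetical Go sketch of that shape follows; it is not his actual code, and the names and the single LDA handler are illustrative only:

    package main

    import "fmt"

    // cpu models just enough 6502 state for this sketch.
    type cpu struct {
        a   byte   // accumulator
        pc  uint16 // program counter
        mem [65536]byte
    }

    // Each of the 256 opcode slots gets a handler; slots never
    // assigned fall through to an "illegal opcode" stub.
    var handlers [256]func(c *cpu)

    func init() {
        for i := range handlers {
            handlers[i] = func(c *cpu) {
                fmt.Printf("illegal/unhandled opcode %02X\n", c.mem[c.pc-1])
            }
        }
        // 0xA9 is LDA immediate: load the next byte into A.
        handlers[0xA9] = func(c *cpu) {
            c.a = c.mem[c.pc]
            c.pc++
        }
    }

    func (c *cpu) step() {
        op := c.mem[c.pc]
        c.pc++
        handlers[op](c)
    }

    func main() {
        c := &cpu{}
        c.mem[0], c.mem[1] = 0xA9, 0x42 // LDA #$42
        c.step()
        fmt.Printf("A = %02X\n", c.a) // A = 42
    }
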
  • There are better ones. [arstechnica.com] Why do we care about this project, as opposed to the dozens of other (better) NES emulators?

    • by aardvarkjoe ( 156801 ) on Friday April 03, 2015 @11:43AM (#49398511)

      The story isn't that somebody made an NES emulator. Those have been around forever, and this is going to be uninteresting if all you want to do is play Mario. The story is that somebody wrote an article about it for anyone who is curious about some of the details.

      The article does focus mostly on the NES hardware, though, and I was expecting some insight on interesting or difficult points of writing the emulator itself.

      • by Anonymous Coward

        Ditto. I recently created an NES emulator as well, and from my experience the hardest part wasn't emulating this component or that instruction; rather, it was getting the timing just right. The CPU and PPU clocks are not identical / don't line up nicely, and some games are sensitive to specific pixels being generated at specific points in time within an instruction being emulated... you can get some weird results if you are off by even just a few clock ticks. I would be curious to know if this was also a notable challenge ...
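
        For anyone curious what that timing looks like in code: on NTSC hardware the PPU runs at exactly three times the CPU clock, and a common structure is to run one CPU instruction, then let the PPU catch up. A minimal Go sketch; the stepCPU stub and all names are illustrative, not from any particular emulator:

        package main

        import "fmt"

        type nes struct {
            cpuCycles uint64
            ppuDots   uint64
        }

        // stepCPU pretends to execute one instruction and returns the
        // cycles it took; a real core would decode an opcode here.
        func (n *nes) stepCPU() int { return 3 }

        func (n *nes) stepPPU() { n.ppuDots++ }

        // tick runs one instruction, then lets the PPU catch up at the
        // NTSC ratio of 3 PPU dots per CPU cycle.
        func (n *nes) tick() {
            cycles := n.stepCPU()
            n.cpuCycles += uint64(cycles)
            for i := 0; i < cycles*3; i++ {
                n.stepPPU()
            }
        }

        func main() {
            n := &nes{}
            n.tick()
            fmt.Println(n.cpuCycles, n.ppuDots) // 3 9
        }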

  • Little-known fact (Score:1, Flamebait)

    by kheldan ( 1460303 )
    In a previous lifetime I repaired coin-op arcade games, and the Nintendo VS system was certainly one of them. At some point after Nintendo utterly abandoned the coin-op industry (leaving countless operators flapping in the wind with no new games for their VS system cabinets) I discovered, in the schematics of the VS system motherboard, proof that Nintendo was using the coin-op industry as a test-bed for a home system: the control panel inputs, which in an arcade game are all discrete inputs per switch (4 for ...
    • Fewer wires saves cost in a coin-op machine as well as a console.

      • Re: (Score:3, Interesting)

        by kheldan ( 1460303 )
        You misunderstand my post; everything I'm talking about is on the PCB in the coin-operated game. Literally two shift registers with copper traces connecting them. If you're sufficiently electronics-literate, google 'nintendo vs system schematic', find the VS system operator's manual, which contains the PCB schematic, and look for the player inputs. They go to 74LS165s, which are shift registers, which then lead to other shift registers converting the serial data back to parallel, which is then ...
        • by itzly ( 3699663 )

          I see two LS165 chips with a serial output that goes into an LS240 octal bus driver. Where's the serial-to-parallel converter you're talking about?

          • by Megane ( 129182 )
            Apparently his eyes have gone bad over the years. The serial-to-parallel conversion is done in the game code.
        • Yes. You're right. They were testing their circuits.

          I've done similar things for an async clock crossing (only need to sync one signal pair), but that's on chip.
           

        • Re: (Score:1, Informative)

          by Anonymous Coward

          No, sorry, I just looked at the schematics and I'm pretty sure you're mistaken about the electronic design. On all four '165s, pin 9, the serial output, is connected to a tristate buffer (the 74LS240s). Those are interface ICs so the outputs on a shared data bus can all be connected together, so that's a very normal thing to place between the CPU and the peripheral electronics. The way I read it, the CPU would output a specific signal to trigger the button state capture, then read from the same address 8 consecutive times ...
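
          In emulator terms, the strobe-then-read protocol described above is a tiny bit of state. A hypothetical Go sketch; the button order follows the common NES convention (A, B, Select, Start, Up, Down, Left, Right), and real hardware also returns 1s once the register is exhausted, which this omits:

          package main

          import "fmt"

          type controller struct {
              buttons byte // live button state, one bit per button
              shift   byte // latched copy, shifted out LSB-first
              strobe  bool
          }

          // write with bit 0 set latches the buttons into the shift
          // register, like asserting the parallel-load pin on a '165.
          func (c *controller) write(v byte) {
              c.strobe = v&1 == 1
              if c.strobe {
                  c.shift = c.buttons
              }
          }

          // read returns one bit per call once the strobe is dropped.
          func (c *controller) read() byte {
              if c.strobe {
                  return c.buttons & 1 // while strobed, always button A
              }
              bit := c.shift & 1
              c.shift >>= 1
              return bit
          }

          func main() {
              c := &controller{buttons: 0b00001001} // A and Start held
              c.write(1) // strobe on: latch
              c.write(0) // strobe off: ready to shift
              for i := 0; i < 8; i++ {
                  fmt.Print(c.read()) // prints 10010000
              }
              fmt.Println()
          }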

          • by itzly ( 3699663 )

            So while it's a little weird to me that the buttons go through serial shift registers and the DIP switches don't, what you're describing is not what I see. For all that it feels slightly untidy, I'm pretty sure this design just results from minimizing the number of ICs on the board.

            They used a single '165 to support 8 buttons. Even if it looks a bit clumsy, you can't really improve on a single-IC solution. Also, replacing it with another '240 would introduce more bus loading, add more PCB routing, and require additional address decoding for the enable signal.

    • by Anonymous Coward on Friday April 03, 2015 @11:48AM (#49398551)

      I don't understand how this makes them bastards.

      • The design on the VS system motherboard never changed throughout its entire lifespan. Nintendo are bastards because the evidence on the schematic of the PCB clearly shows that they intended to develop a home system all along, and once that was launched and well-established, they abandoned the coin-op industry entirely, leaving everyone with aging games and no new ones being developed. Back in the day it caused quite a bit of controversy. Some operators had hundreds of Nintendo VS System games, which no one wanted to play when they could play the SAME games at home for FREE. Nintendo fucked everyone.
        • by bsolar ( 1176767 ) on Friday April 03, 2015 @12:02PM (#49398659)
          Everyone except millions of gamers finally able to play at home.
        • by SeaFox ( 739806 )

          Nintendo are bastards because ... they abandoned the coin-op industry entirely, leaving everyone with aging games and no new ones being developed. Back in the day it caused quite a bit of controversy. Some operators had hundreds of Nintendo VS System games, which no one wanted to play when they could play the SAME games at home for FREE. Nintendo fucked everyone.

          Did these operators have some sort of contract with Nintendo where they were guaranteed new titles for a set period of years? Were they told the titles would be exclusives for the VS system, which Nintendo later released for the home systems? I doubt it on both counts. Sounds like a bunch of whiny arcade owners upset that a company exercised its freedom to stop doing business in a given segment and pursue new opportunities they weren't a party to. They were a middle-man to game players, and Nintendo started selling to them ...

        • Re:Little-known fact (Score:5, Informative)

          by MightyYar ( 622222 ) on Friday April 03, 2015 @01:48PM (#49399367)

          The VS system didn't come out until after the NES (well, the Famicom version from Japan). You have the timeline backwards.

          Many of the cabinets on VS machines are older because they were converted Donkey Kong or Mario Bros cabinets.

    • Re:Little-known fact (Score:5, Informative)

      by tuffy ( 10202 ) on Friday April 03, 2015 @11:55AM (#49398597) Homepage Journal

      Nintendo VS system was an adaptation of their home hardware for arcade use, not the other way around. The Famicom predates it by years, remember.

    • by itzly ( 3699663 )

      So don't go getting so misty-eyed about Nintendo, they're utter bastards who care only about profits, and not who they screw over in the process.

      In what way is the customer screwed over by a manufacturer who uses a cheaper and more reliable method of connecting switches to a CPU?

    • I, for one, am very thankful for the money I saved by having an NES and not having to pump endless quarters into my local arcade, and I am most especially thankful that I never had to wait in line for my favorite game. It is not Nintendo's responsibility (or mine as a consumer) to protect someone else's outdated business model. Arcades lost, and consumers won.

      • It depends on the games, but honestly I miss arcades. It was more than just playing games. It was a social experience. Rarely in life do you get to be in a room full of people who are intensely passionate about the exact same thing as you.

        You're thankful for not waiting in line, but some of my fondest memories are not of playing a game, but of waiting in line for one. Cheering on an amazing Street Fighter match with the 5+ other people in the queue watching with you, and the chaotic buzz of the ...

        • by swb ( 14022 )

          And many of those places ended up being good places for underage kids to buy weed or alcohol.

    • Maybe they were just smart and saw the writing on the wall for coin operated games.

      • by The Rizz ( 1319 )

        Maybe they were just smart and saw the writing on the wall for coin operated games.

        I think Nintendo were the ones who wrote it on the wall - the NES was the first real competition for coin-op arcades. Sure, other home systems already existed, but they were far, far behind arcade hardware's level of quality. The NES wasn't there yet, either, but it brought quality levels a lot closer than anything in the past ever had, and expanded into areas the arcade machines couldn't, with the concept of epic games with save/load features (Zelda, Metroid, etc.).

    • by PRMan ( 959735 )
      So they reused their home stuff in an arcade console. I thought that was obvious by looking at it.
    • Adding a couple of chips in order to make one of your platforms more software compatible to others reduces development costs. That doesn't make them bastards in any way.

    • Are you suggesting that the Famicom was based on a system that wasn't released until two years after its own release (1983 for the Famicom, 1985 for the VS. System)?

      Or are you suggesting that Nintendo abandoned the home market after making the VS. System, despite having released not one but three arcade systems after it, (the Playchoice-10, the Super System, and the Triforce board)?

      • There might be an explanation. In some parts of the US, there were VS systems in arcades/bowling alleys BEFORE the NES was available there. I saw a VS system with SMB1 before I ever saw an NES in stores. In areas where the NES was launched later, those VS. systems served as good publicity for the NES.

    • by st3v ( 805783 )
      Wow, you are so bitter for nothing! Calm down. It is common for hardware designers to design extra capabilities for future revisions. Makes it easier to improve. Car manufacturers do this too. The Mustang 5.0 engine right now has some cutouts in the design which hint that direct injection is coming. Welcome to reality.
    • The Famicom/NES predates the Vs. System. The VS. System was based on the Famicom/NES, not the other way around. You might have lived in one of those places that saw VS. systems before the NES was available there.

      At some point after Nintendo utterly abandoned the coin-op industry (leaving countless operators flapping in the wind with no new games for their VS system cabinets)

      They don't owe you profit. Besides, Nintendo did sell a similar system based on the SNES.

    • by Mr Z ( 6791 )
      And how, exactly, do you know this wasn't the other way around? I.e., that the VS system was built to enable porting home-system games to the arcade, as suggested here? [wikipedia.org]
  • BCD mode (Score:4, Interesting)

    by flargleblarg ( 685368 ) on Friday April 03, 2015 @11:45AM (#49398525)
    BCD (Binary-Coded Decimal) mode was cool because it changed the way adding and subtracting worked. If you added 0x01 to 0x29, you'd get 0x30 instead of 0x2A. This was possible because there were actually two carry flags on the 6502 — one (named C) which was set upon overflow of values greater than 255, and the other (named D) which was set upon overflow of the low nybble (e.g., the low 4 bits).

    6502.org Tutorials: Decimal Mode [6502.org]
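
    The decimal-mode add itself fits in a few lines: any nibble that exceeds 9 gets 6 added and carries into the next digit. A rough Go sketch of the idea; real 6502 flag behavior in decimal mode has quirks this ignores:

    package main

    import "fmt"

    // adcBCD adds two packed-BCD bytes: any nibble over 9 gets 6
    // added and carries into the next digit.
    func adcBCD(a, b byte, carry bool) (sum byte, carryOut bool) {
        c := byte(0)
        if carry {
            c = 1
        }
        lo := a&0x0F + b&0x0F + c
        hi := a>>4 + b>>4
        if lo > 9 {
            lo += 6
            hi++
        }
        if hi > 9 {
            hi += 6
            carryOut = true
        }
        return hi<<4 | lo&0x0F, carryOut
    }

    func main() {
        s, _ := adcBCD(0x29, 0x01, false)
        fmt.Printf("%02X\n", s) // 30, i.e. decimal 29 + 1 = 30
    }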
    • Re:BCD mode (Score:4, Informative)

      by Anonymous Coward on Friday April 03, 2015 @12:24PM (#49398807)

      To those of you wondering why BCD instructions would be useful, it has a lot to do with doing math with character-based input.

      Today, we're accustomed to parsing strings of digits to turn them into numeric types. However, this is actually quite costly, and it turns out that there's a much cheaper way to accomplish the same goals if you're only doing simple arithmetic with these values.

      Let's say you want to compute "2" + "3". "2" is 0x32 in ASCII or 0xF2 in EBCDIC, while "3" is 0x33 in ASCII or 0xF3 in EBCDIC. It is no coincidence that the low 4 bits of a digit's character encoding (in either of the two most popular encodings) correspond to that digit itself; it is specifically to support BCD math.

      I'll focus on ASCII for the remainder of this post, though this all applies equally to EBCDIC. All you need to do is mask things correctly (0x32 & 0x0F is 0x02, and 0x33 & 0x0F is 0x03) to "parse" character-encoded digits. Then you can just add them together (0x02 + 0x03 = 0x05) and OR the appropriate high 4 bits (0x30 for ASCII) to get back to character encoding land: 0x05 | 0x30 is 0x35, or ASCII "5". Much cheaper than contemporary number parsing and string building algorithms.

      Of course, if we had values that summed to 10 or more, we'd need to use the BCD-overflow flag to handle results properly (as 0x05 + 0x05 would yield 0x0A, which when transformed back to ASCII is ":", not "10"). This gets us into packed-BCD territory, which is beyond the scope of this post (but which can be summed up as packing two BCD values into a single byte ((a << 4) | b), which means we'd want 0x05 + 0x05 to give us 0x10, the packed-BCD value that represents the decimal number 10), as that's how BCD values with more than a single digit tend to be stored.
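
      A quick Go illustration of the masking trick and the packed-BCD layout described above (illustrative only):

      package main

      import "fmt"

      func main() {
          // "Parse" ASCII digits by masking the low nibble, add,
          // then OR the high nibble back on.
          a, b := byte('2'), byte('3') // 0x32, 0x33
          sum := a&0x0F + b&0x0F
          fmt.Printf("%c\n", sum|0x30) // prints 5

          // Packed BCD: two digits per byte, high digit in the
          // top nibble, so decimal 10 is stored as 0x10.
          packed := byte(1)<<4 | 0
          fmt.Printf("%02X\n", packed) // prints 10
      }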

      BCD was a clever hack that enabled crappy computers to do a lot of work despite their slow speeds. Today, we just parse numbers in software, because it's no longer cost-effective to squeeze this level of efficiency out of our hardware. They don't make 'em like they used to.

      • by Anonymous Coward

        Today, we just parse numbers in software, because it's no longer cost-effective to squeeze this level of efficiency out of our hardware. They don't make 'em like they used to.

        The latest 64-bit chips (e.g., x86-64) still have BCD instructions. In fact, the x86-64 chips can boot in 8086 mode and can even install DR-DOS or MS-DOS right on the metal, no emulator required, using the provided BIOS emulation. This is STILL the way everyone boots machines that don't use (U)EFI: search the bootable devices looking for a first sector whose bytes 510 and 511 (the last two bytes of a 512-byte sector) have the values 0x55 0xAA. Place that 512-byte sector into memory at 07C0:0000 (16-bit segment ...
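
        The signature check itself is trivial. A hypothetical Go sketch of what a legacy BIOS effectively does with that first sector (07C0:0000 corresponds to physical address 0x7C00):

        package main

        import "fmt"

        // bootable reports whether a 512-byte sector carries the legacy
        // boot signature: 0x55 0xAA in bytes 510 and 511. A real BIOS
        // would then copy the sector to physical 0x7C00 and jump to it.
        func bootable(sector []byte) bool {
            return len(sector) == 512 && sector[510] == 0x55 && sector[511] == 0xAA
        }

        func main() {
            var sector [512]byte
            sector[510], sector[511] = 0x55, 0xAA
            fmt.Println(bootable(sector[:])) // true
        }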

        • When your chip is compiling/executing nearly every opcode into microcode on the fly, it's time we retire the inefficient instruction set.

          That doesn't make it inefficient in the performance sense. Your argument is born from knowing just enough to be terribly wrong, probably because the RISC fanboys convinced you to stop considering relevant details.

          If two designs can fetch N bytes of instructions per cycle, then the design that packs more relevant operations into those N bytes wins on performance. The relevant detail you missed is this exact bottleneck. Intel's design ultimately fetches more micro-ops per cycle than is possible with a more RISC-like ...

        • by itzly ( 3699663 )

          they do this so much so that the x86 based instruction set is cluttered with crap we don't need anymore

          The unnecessary crap (like the BCD support) is so tiny that there's absolutely no benefit in removing it. The rest of the x86 instruction set may look odd, but it actually works well.

    • by Binestar ( 28861 )

      BCD (Binary-Coded Decimal) mode was cool because it changed the way adding and subtracting worked. If you added 0x01 to 0x29, you'd get 0x30 instead of 0x2A. This was possible because there were actually two carry flags on the 6502 — one (named C) which was set upon overflow of values greater than 255, and the other (named D) which was set upon overflow of the low nybble (e.g., the low 4 bits).

      I think I understand what it does, but what is the programming advantage of using this mode? Would it be used for things like scores? Or is there some other programming benefit that I'm not seeing?

      • BCD mode is useful when you are working with numbers that you want to display to humans often. That is, you can do all the arithmetic in base 10 instead of binary. BCD is slower to work with than binary, but much faster to convert in and out of, since there's basically no conversion other than adding 0x30 (ASCII '0') to the nybble you want to display.

        Working in binary, on the other hand, requires costly conversion in and out of human-readable decimal. For example, converting decimal to binary requires a ...
        • by itzly ( 3699663 )

          BCD mode is useful when you are working with numbers that you want to display to humans often.

          It's not really all that useful. Binary to BCD conversion isn't something that needs to be done often or quickly, because it's limited by how fast people can read.

          • by Binestar ( 28861 )

            It's not really all that useful. Binary to BCD conversion isn't something that needs to be done often or quickly, because it's limited by how fast people can read.

            That's true now. Back then, though, the speed gains from using BCD were worth it. You overestimate the hardware in an NES.

            • by itzly ( 3699663 )

              You overestimate the hardware in an NES

              The NES used a Ricoh 2A03 CPU, which contains a 6502 core, with the BCD functionality disabled. See http://en.wikipedia.org/wiki/R... [wikipedia.org]

            • There are about 30,000 cycles in each NES frame. Converting an 8-bit number from binary ($00-$FF) to three decimal digits (000-255) finishes within 84 cycles on a 6502. Converting a 16-bit number to five decimal digits takes a little longer (about 700 cycles using my routine [nesdev.com]). But you don't have to convert all the time; you can use a "dirty" system to convert when the score changes.
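
              The shape of such a routine, sketched in Go rather than 6502 assembly (the linked nesdev routine is smarter; this just shows the division-free, repeated-subtraction idea):

              package main

              import "fmt"

              // toDigits peels off decimal digits by repeated subtraction
              // of 100 and 10, which maps directly onto a 6502
              // compare/subtract loop.
              func toDigits(v byte) (d [3]byte) {
                  for v >= 100 {
                      v -= 100
                      d[0]++
                  }
                  for v >= 10 {
                      v -= 10
                      d[1]++
                  }
                  d[2] = v
                  return
              }

              func main() {
                  fmt.Println(toDigits(255)) // [2 5 5]
              }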

          • Actually, it was very useful when the 6502 was introduced. Remember, computers were slow back then. Converting a binary number to decimal was especially slow, since it involves division with remainder in a loop, once for each digit produced, and the 6502 had no hardware multiplication or division instructions.

            Also — and this is even true today — if you do all your calculations in base 10 instead of binary, you get a different (sometimes more desirable) rounding behavior. For example, financial ...
            • by itzly ( 3699663 )

              Actually, it was very useful when the 6502 was introduced. Remember, computers were slow back then

              I know. My first computer had a 1 MHz 6502, and I did a lot of assembly programming for it. I even studied the binary-to-decimal conversion that came in the Basic ROM. It didn't use division; it used tables of powers of 10 and repeated subtraction. There are other ways too. One of the problems with BCD mode is that you can only add/subtract, and only when both arguments are BCD. You can't multiply, or mix BCD and binary. So, yes, conversion for display is easy in BCD, but adding a 5% bonus is ...

          • by sjames ( 1099 )

            At one time, we used to actually count the cost of each and every machine instruction executed and of each and every byte used. That's how a computer less powerful than your calculator was able to land the Apollo Lunar Module.

            Binary to BCD conversion and back wasn't the way you did it. The values were stored in BCD form and the computations were performed in BCD. Processors had a specific mode that did the BCD math in hardware.

            Note that many of those same processors had NO hardware divider. The divisions ...

            • by tlhIngan ( 30335 )

              At one time, we used to actually count the cost of each and every machine instruction executed and of each and every byte used. That's how a computer less powerful than your calculator was able to land the Apollo Lunar Module.

              That was when everyone had to account for every clock cycle - when computers were expensive, and humans cheap. Nowadays, humans are expensive and computers are cheap.

              Anyhow, it's also the main reason why emulation is hard: the slower the CPU, the harder it becomes. In fact, if you're ...

              • by sjames ( 1099 )

                Same reason the old "turbo" PCs had a button to drop it back to 4.77 MHz. Now that things are so much faster, code depends on timers and counters rather than loops so it's much easier.

      • BCD mode is used extensively in COBOL and in the banking industry. Also, conversions between binary and decimal numbers are really slow on processors that lack a multiply, divide or mod instruction. If the only divides and multiplies are by 10, then BCD math is quite competitive on an 8-bit processor. With the right workload, it is also competitive on some 16-bit processors.

        Many older and/or embedded processors lack fast or single-cycle multiply and divide instructions. For instance, the 8080A, 8085, ...

    • by itzly ( 3699663 )

      The D flag was just for the programmer to switch between decimal and binary. The extra nibble carry was an invisible internal signal.

  • by Anonymous Coward

    Ah - the 6502! That takes me back. The Commodore PET, VIC-20, and C-64 all used it, and on the C64 in particular, the "illegal" opcodes were documented in "Compute!" magazine, a well-known and invaluable resource at the time (I still have them over two decades later!). In fact, after a bit of hacking on at least one of the C64 cartridge games, it appears they were used in those too. It would be fun to see how faithful an emulator would be to them - though I suspect they would likely all fail.

    • Those illegal opcodes caused problems on the Apple II. Later machines starting with the Apple IIc used the 65c02 which defined new functions for some of the previously unused opcodes.
    • by 3vi1 ( 544505 )

      The C=64 used a 6510 (https://en.wikipedia.org/wiki/MOS_Technology_6510). The 1541 disk drives used 6502s, though.

      • by Mr Z ( 6791 )

        True, but that was a 6502 core with an integrated I/O port. I'm pretty sure it largely reused the NMOS 6502 datapath, given that all the 6510 variants are compatible, including undocumented opcodes. At least, so claims Wikipedia. [wikipedia.org]

  • by William Baric ( 256345 ) on Friday April 03, 2015 @12:27PM (#49398827)

    I read the summary and thought: why the fuck is it telling us these common facts about the 6502? I mean, who doesn't know this? And then there was this awkward moment when I thought to myself: am I that old?

    • by dannycim ( 442761 ) on Friday April 03, 2015 @12:50PM (#49398979)

      Funny coincidence, four days ago I woke up in an ambulance (long boring story) and the number on the inside door was 6502. I smiled stupidly and said "Hah! 6502!" and looked at the two EMTs sitting next to me. They looked quizzically at me.

      "Oh right, I'm old." I said.

    • by aliquis ( 678370 )

      am I that old?

      No. We never get old.

      The NES and the 6502 are, though.

    • I think we all are. I still have the book I learned 6502 assembly from. I impressed the hell out of my high school computer programming teacher with some of what I did, and made things interesting for him as well.
  • by spaceyhackerlady ( 462530 ) on Friday April 03, 2015 @12:47PM (#49398967)

    I wouldn't have minded seeing an example of one of those illegal opcodes and how what it did was useful.

    Brooks [wikipedia.org] called such things "curios". Side-effects of invalid operations that people had started to use, and that had to be considered part of the specification.

    My policy (seconded by my boss) is that I do not document such things. If a hack is documented people start to use it, then we have to support and maintain it.

    ...laura

    • by Anonymous Coward

      Go wild. [ataripreservation.org]

      • by Sowelu ( 713889 )

        Thanks! That's exactly what I was hoping to find in the article. I haven't played with that hardware before, and I got my hopes up when the preface said it was "very technical".

      • Thanks. A lot of these sound like "don't cares" in the instruction decoding logic. Reminds me of an undergrad course in logic design with lots of Karnaugh maps and stuff.

        ...laura

  • by m.dillon ( 147925 ) on Friday April 03, 2015 @12:48PM (#49398975) Homepage

    I started serious programming (at around age 14) on the PET. First in BASIC, but once I found out you could break into a machine language monitor by wiring up an NMI button (we called it the two-button salute), there began my machine coding. In hex, directly. Didn't even have a relocator at the beginning. It was a year before I could buy the expansion ROM that added disassembly and relocation features to the machine language monitor.

    Ultimately I wrote an assembler too. I don't think I have any of that code anymore; it's been lost to time. Kinda makes me sad.

    The PET's 8-bit IEEE-488 bus was pretty awesome. The PET had a 1 MHz 6502. The external floppy drive also had a 1 MHz 6502 in it, and you could reprogram it. So one of my many projects was to speed up the data transfer between the two by synchronizing the processors with a series of handshakes and then pushing or pulling the sector data without any further handshakes (using processor timing).

    My friend did the same thing for the C64's serial interface (which didn't even have a UART) and sold a product called '1541 Flash!' that sped up the serial interface. Basically a little messing around at the beginning of each transfer to get the two sides synchronized within 1 clock cycle of each other, then pushing/pulling bits as fast as the little CPUs would go without any further handshaking. The clocks would stay synchronized long enough to copy a whole sector.

    Other projects on the PET... it had a character generator ROM, which I replaced with a static RAM. So when I powered it up I had to load a copy of the original ROM into the RAM blindly (the display was total garbage because the RAM was uninitialized).

    The PET had a built-in CRT screen, but the key was that the data input for the screen was actually a TTL input! So I could pop the wire off the connector and use the screen like a digital oscilloscope to probe various TTL-based projects (as well as the PET's motherboard itself).

    Another project... the character generator ROM had something called quarter-block graphics: basically 16 characters covering all 16 combinations of four quarter-blocks (2x2), so you could do (I think) 320x200 graphics on it. I spent many hours optimizing the machine code to generate a pixel pusher.
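
    The lookup involved is tiny: the four pixel bits of each 2x2 cell select one of 16 characters. A hypothetical Go sketch; the table values are placeholders rather than the real PETSCII codes, and a 40x25 text screen gives 80x50 quarter-block pixels:

    package main

    import "fmt"

    // quadChar maps a 4-bit quadrant pattern to a character code.
    // These values are ILLUSTRATIVE ONLY, not the real PETSCII codes.
    var quadChar = [16]byte{
        0x20, 0x01, 0x02, 0x03,
        0x04, 0x05, 0x06, 0x07,
        0x08, 0x09, 0x0A, 0x0B,
        0x0C, 0x0D, 0x0E, 0x0F,
    }

    // setPixel turns on one quarter-block pixel: find the character
    // cell, set the quadrant bit, and write the matching character.
    func setPixel(screen, bits []byte, x, y int) {
        cell := (y/2)*40 + x/2
        bit := uint((y%2)*2 + x%2) // quadrant index 0..3
        bits[cell] |= 1 << bit
        screen[cell] = quadChar[bits[cell]]
    }

    func main() {
        screen := make([]byte, 40*25)
        bits := make([]byte, 40*25)
        setPixel(screen, bits, 3, 1)    // bottom-right quadrant of cell 1
        fmt.Printf("%02X\n", screen[1]) // 08 with this placeholder table
    }
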

    I got so good at writing editors from scratch that once, when I went to computer camp and forgot to bring the tape, I rewrote the text editor in machine code in less than an hour.

    Met Richard Garriott at that camp too; we were both on staff. He was working on (I think) Ultima II at the time (on an Apple II, I think) and had an awesome RAM disk for storing code temporarily while he was working on it. Once his computer stopped responding, and after unsuccessfully trying to resurrect it he finally gave up and power cycled it, losing his work in the RAM disk. It turned out he had accidentally disconnected the keyboard and the computer was actually fine. Oh well! Richard taught a class at that camp on human-interface parsing... basically he had people write a dungeon game where you typed in what you wanted to do in English. Primitive, of course, but the kids had a blast.

    I wrote a centipede game in machine code, incredibly fast and awesome (on the last level the centipede was invisible and only blinked into existence for a second or two every few seconds), and submitted it to Cursor magazine. They rejected it because they thought I had stolen it from someone :-(. The only thing I ever tried to get published, rejected because the code was *too* good!

    The 6502 had two awesome indirect EA modes, (TABLE,X) and (ADDR),Y, along with the standard modes.
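
    The two modes differ in where the indexing happens, which is easiest to see in how an emulator computes the effective addresses. A hypothetical Go sketch:

    package main

    import "fmt"

    type cpu struct {
        x, y byte
        mem  [65536]byte
    }

    // read16ZP reads a little-endian pointer from zero page; addr is a
    // byte, so addr+1 wraps within the page like the real chip does.
    func (c *cpu) read16ZP(addr byte) uint16 {
        return uint16(c.mem[addr+1])<<8 | uint16(c.mem[addr])
    }

    // (TABLE,X): add X to the operand first, then fetch the pointer.
    func (c *cpu) eaIndexedIndirect(operand byte) uint16 {
        return c.read16ZP(operand + c.x)
    }

    // (ADDR),Y: fetch the pointer first, then add Y to it.
    func (c *cpu) eaIndirectIndexed(operand byte) uint16 {
        return c.read16ZP(operand) + uint16(c.y)
    }

    func main() {
        c := &cpu{x: 4, y: 0x10}
        c.mem[0x24], c.mem[0x25] = 0x00, 0x80 // pointer $8000 stored at $24
        fmt.Printf("%04X %04X\n",
            c.eaIndexedIndirect(0x20), // ($20,X) with X=4  -> $8000
            c.eaIndirectIndexed(0x24)) // ($24),Y with Y=$10 -> $8010
    }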

    Decimal mode wasn't as interesting, I wound up not using it for display conversion at all.

    The 6522 I/O chip was an incredibly powerful chip for its time, with multiple timers and timer-driven events. It had a few bugs, too.

    I remember all the unsupported machine codes the 6502 had. It was a hardwired CPU, so all instruction codes did *something* (even if it was mostly to just crash the CPU). LDAX was my favorite. Mostly, though, the hidden codes were not very useful.

    The list goes on. Twas an awesome time, a time before PCs took over the world.

    -Matt

    • by Anonymous Coward

      Have to respond as another old-school coder. Taught myself BASIC on my C64 and still pull it out and play it sometimes. Tried typing assembly directly into the BASIC interpreter and couldn't figure out why it wouldn't work. Lol. So many lessons learned back then. What I learned got me a job in the games industry, where I ended up coding on the original Game Boy - another 1 MHz, 8K of RAM (4K paged) beast with a bastardized Z80. The things you can do when you have so little to work with are absolutely out ...

    • Another project...the character generator rom had something called quarter-block graphics.

      You mean PETSCII:
      http://en.wikipedia.org/wiki/P... [wikipedia.org]

      And it's not just quarter blocks, but lines and symbols.

    • by Richy_T ( 111409 )

      The PET was my first experience with networking. All the PETs in the room were connected to the same printer with Centronics cables (each plugged into the back of another). The printer was written to by outputting a byte to a port. I decided to see if it was possible to read the port and see the bytes being written by one of the other computers, and it was.

      Of course, "real" networking was already around by then but I thought it was cool.

    • For fun, I helped a friend of mine reverse engineer a schematic for the 6502 based on the op-codes. Knew we had it right when we could predict all of the undocumented op-codes.

      Fun times. The world of computing was so full of wonder back then. (It still can be, but one has to look a lot harder.)

      (Hi Matt, remember me? Old friend of Bryce's from Starpoint days, and I contributed some enhancements to dme. Dang that was a long time ago)

  • What the hell? (Score:3, Informative)

    by Chris Katko ( 2923353 ) on Friday April 03, 2015 @02:00PM (#49399445)
    That's not even factually correct. It doesn't have a 6502. The NES had a Ricoh 2A03, which was a modified 6502 that REMOVED the BCD so they could put I/O hardware registers in its place for controllers, sound, and DMA. The SNES did the exact same thing with the Ricoh 5A22, which is derived from the WDC 65C816--the same CPU as the Apple IIGS, which is why the development kit for the SNES was an Apple IIGS.

    How are we supposed to learn something from this submission if they can't even be bothered to check Wikipedia first? Fun-fact: If your information isn't better than what Wikipedia already has, it's useless.
    • by itzly ( 3699663 )

      The NES had a Ricoh 2A03, which was a modified 6502 which REMOVED the BCD so they could put IO hardware registers in its place

      No. They left the BCD in place, but just disabled it. Also, the BCD logic is fairly small, and in the middle of the CPU, where there would be no room to add extra features. The extra IO registers were put around the original 6502 core. Here's an explanation: http://forums.nesdev.com/viewt... [nesdev.com]

      And here you can see the 2A03 chip, with the 6502 core sitting in the corner:
      http://www.vgmpf.com/Wiki/inde... [vgmpf.com]
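
      The practical upshot for an emulator: SED/CLD still toggle the D flag on a 2A03, but ADC/SBC never consult it, so an NES core can do pure binary arithmetic. A hypothetical Go sketch of that behavior:

      package main

      import "fmt"

      type cpu struct {
          a       byte
          carry   bool
          decimal bool // SED/CLD still set and clear this on a 2A03...
      }

      // adc adds with carry; unlike a stock 6502, it never branches
      // on c.decimal, because the 2A03 ignores that flag.
      func (c *cpu) adc(v byte) {
          sum := uint16(c.a) + uint16(v)
          if c.carry {
              sum++
          }
          c.a = byte(sum)
          c.carry = sum > 0xFF
      }

      func main() {
          c := &cpu{a: 0x29, decimal: true}
          c.adc(0x01)
          fmt.Printf("%02X\n", c.a) // 2A: binary result even with D set
      }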

  • I never owned an NES, but even I know just about everything in that article, just from downloading an emulator once (anyone remember Nesticle?).

    Where's the "technical" information? The fact that memory mappers exist for the platform, or that it used sprite/palette-based graphics, is hardly some massive insight to anyone starting down the route of writing an emulator for something of that era.

  • I know he says a "NES version of the chip" later, but saying 6502 first and then a variant later is confusing. (Just like the 2600 doesn't use a 6502; it uses, IIRC, a 6509.)

    The NES uses:
    http://en.wikipedia.org/wiki/R... [wikipedia.org]

    • by Mr Z ( 6791 )

      The Atari 2600 uses a 6507, which is a 6502 die in a smaller package—fewer address lines pinned out.

      The Ricoh chips in the NES use an exact copy of the 6502 die layout, plopped in a larger chip. Go have a look. [visual6502.org] You can see the 6502 portion in the lower right hand corner. (Here's the original MOS 6502 [visual6502.org] for reference.)

      When I say exact, I mean darn near exact: They differ by 5 transistors [visual6502.org], apparently, representing a surgical excision of decimal mode.

      So yes, the part number is technically not 6502. But ...
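
      In emulator terms, "fewer address lines pinned out" just means masking: the 6507 exposes 13 of the 6502's 16 address lines, so every bus access lands in an 8 KB window. A hypothetical Go sketch:

      package main

      import "fmt"

      // busAddr keeps A0..A12 and drops A13..A15, which simply are
      // not wired out of a 6507's 28-pin package.
      func busAddr(cpuAddr uint16) uint16 {
          return cpuAddr & 0x1FFF
      }

      func main() {
          fmt.Printf("%04X\n", busAddr(0xF000)) // 1000: mirrors into the 8 KB space
      }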

  • by Karmashock ( 2415832 ) on Friday April 03, 2015 @05:34PM (#49401041)

    ... here is what I want: Xbox and PS emulators.

    The PS1 has some pretty good emulators but the PS2 doesn't really... and I think the PS3 doesn't have them at all.

    As to the xbox, I've yet to see even one xbox emulator that was worth a damn.

    Not even the first xbox. Which makes me sad because I want to be able to play Halo 3 and 4 but I refuse to buy an xbox just for those games. I'm a PC gamer. MS released Halo 1 and 2 for the PC. Cool. And I don't mind waiting a couple years. That's fine too. But release them EVENTUALLY for the PC. Don't tell me the porting process isn't easy. We all know it is. So just do it and make me happy.

    That or I'm going to be one of those people skulking around looking for xbox emulators and cruising torrent sites for the game files.

    I will BUY your game for the PC. If you don't let me do that... then I'm sawing off my leg, wearing an eye patch, adopting a parrot, and developing an annoying love of the sound "ARRRRR".

    you have been warned.

    • MS released Halo 1 and 2 for the PC. Cool.

      it would be cooler if you could run a halo 2 dedicated server without paying a monthly fee. fuck that sideways.

  • Top-of-the-line emulation technology from the 90s. What's next?
