The Nintendo® Game Boy™, Part 1: The Intel 8080 and the Zilog Z80.


The Game Boy’s CPU is a hybrid between the Intel 8080 and the Zilog Z80. In this post we’ll investigate the foundations of this custom microprocessor.

The Nintendo® Game Boy™ is a hand-held video game device from 1989. It was a tremendous success, selling over one million units within weeks of its release. It was redesigned and enhanced several times, including the technically identical colored-case Game Boys advertised as “Play it loud!”, and the lighter Game Boy Pocket and Game Boy Light (strictly speaking, Game Boys came in different models, for example the DMG-CPU-03, DMG-CPU-05 and DMG-CPU-06, but because their differences are very subtle and not visible to the programmer, we’ll just treat them as the same, although surely they are not identical)[ref]. The original Game Boy is dubbed DMG (for Dot Matrix Game, in reference to its LCD display); the Game Boy Pocket and Game Boy Light are known as MGB, which probably stands for Mini Game Boy. For all our practical concerns they are all the same, and we’ll just talk about the Game Boy, or DMG.


The place to start is the CPU.

The Game Boy is powered by a fairly simple microprocessor, which makes it an ideal target for learning about computer architecture through emulation. The CPU is actually a hybrid between the Intel 8080 and the Zilog Z80. The Z80 was designed to be binary compatible with the already existing Intel 8080: the instruction set found in the 8080 was also implemented by the Z80 (in essence, the 8080 can be seen as a subset of the Z80). The official name of the Game Boy’s custom hybrid chip is the Sharp LR35902.

Undoubtedly the foundation for the Sharp LR35902 chip is the Intel 8080, so let’s talk a bit about it.

INTEL 8080

“The first truly usable microprocessor”, this ‘8-bit legend’ was released in April 1974. Its history is full of glory, powering legendary machines like the MITS Altair 8800 and the arcade Space Invaders. It forever changed the way computers were built; it was the first general-purpose all-in-one computer (well, not quite, but as 8080’s lead designer, Federico Faggin, stated: “The 8080 really created the microprocessor market. The 4004 and 8008 suggested it, but the 8080 made it real”).

From a technical perspective, it presented a very straightforward architecture; its 16-bit address bus and 8-bit data bus gave it access to 64K of byte-addressable memory. Its register set comprised an 8-bit accumulator, an 8-bit status register, six 8-bit general-purpose registers, a 16-bit stack pointer and a 16-bit program counter. The convention is to name the accumulator ‘A’, the status register ‘F’, the general-purpose registers ‘B’, ‘C’, ‘D’, ‘E’, ‘H’ and ‘L’, the stack pointer ‘SP’ and the program counter ‘PC’. The general-purpose registers could be used in pairs for some 16-bit instructions (a capability particularly useful for memory addressing). For example, ‘H’ and ‘L’ could be taken as a single 16-bit register ‘HL’ and used as a pointer to memory. This 16-bit capability could also be used for arithmetic: you could add any register pair (‘BC’, ‘DE’, ‘HL’, ‘SP’) to ‘HL’, load a pair with a 16-bit immediate value, and increment or decrement it by 1. The ‘F’ register recorded results from the last executed instruction. Each of the 8 bits of this register carried a special meaning; for example, bit 6 (numbering the bits 0 through 7) was the ‘zero flag’, set when the last executed instruction produced a value of zero. Other such bits or ‘flags’ included the carry, sign, half-carry and parity flags.

There are many features of the 8080 that didn’t make it into the Game Boy’s Sharp LR35902. For example, the 8080 had a separate address space for input/output, which it accessed using the IN and OUT instructions. The Z80, being binary compatible with the 8080, naturally presented this feature too. But we have to bear in mind that both the 8080 and the Z80 were general-purpose microprocessors. The Game Boy, on the other hand, is a very specific product with limited support for peripheral devices, so its CPU doesn’t really need this feature. Instead, input/output is memory-mapped. For example, writing to memory address 0xFF40 accesses the LCD Control register of the LCD display device; in fact, the addresses 0xFF00 to 0xFF7F are reserved for this kind of device mapping. The advantage of this approach over the separate address space of the 8080 and Z80 is that the IN and OUT instructions are no longer needed; they can be left out, simplifying the design, since all read/write operations go to memory. Of course, special circuitry is needed to redirect data read from or written to devices. In any case, on the Game Boy, ordinary load/store instructions are used to do I/O. As a drawback, less memory is available to programs, but 128 bytes (0xFF00 to 0xFF7F) seems like a fair trade.

The 8080 wasn’t without competition. Besides the Motorola 6800 (precursor of the famous 68k family) and the equally legendary 6502 from MOS Technology, there was the Z80 from Zilog, the other half of the Game Boy’s Sharp LR35902 hybrid. Let’s take some time to review this other beast.

Zilog Z-80 early advertisement, Electronics, May 27, 1976.


Ironically, this giant was designed by the very engineers behind Intel’s 8080, who had left Intel to found Zilog. Its most important feature was binary compatibility with the Intel 8080: the same interface the 8080 provided to the programmer, the Z80 provided too. This allowed any software written for the 8080 to be executed by the Z80, a strategy that forever shaped the microprocessor industry, as seen, for example, in today’s various implementations of the x86 architecture by different manufacturers (Intel, AMD, VIA). But implementing the 8080’s functionality was only the first part of the Z80’s goal; it also introduced a number of unique features. Of course, as with the 8080, the Game Boy’s Sharp LR35902 was much simpler, and almost none of these new features made it in.

The Z80 added some rather nice features, the most important of which, adopted by the Sharp LR35902, was a special prefix OPCODE that allowed for an extra set of 256 instructions. Let’s explain a little how this works. The instructions for the Intel 8080 (and, thanks to binary compatibility, for the Zilog Z80) had an extremely simple format. By a format, we mean the particular way the bytes read by the processor are interpreted while executing a program. Bytes are read from memory according to the value held in the Program Counter register (‘PC’), which is used as a pointer to a location in memory. The value fetched through ‘PC’ drives the CPU to execute one instruction or another, and the subsequent values in memory are interpreted according to the semantics of that particular instruction. Upon executing an instruction, ‘PC’ is advanced by the size of the instruction in bytes, and the processor is ready to fetch and execute a new instruction.

In the case of the Intel 8080 and the Zilog Z80, the first byte fetched always indicated what is known as the Operation Code (OPCODE). This OPCODE determined the instruction to be executed and was always 1 byte long; that is, the OPCODE selected one of 256 possible instructions. Then, according to the semantics of the currently-executing instruction, the bytes following the OPCODE may be interpreted as operands. One of the special instructions present in the Z80 but missing from the 8080 was referenced by OPCODE 0xCB, which acted as an instruction extender: when the CPU fetched 0xCB, the byte following it was mapped to a whole different set of 256 instructions. This way, the Z80 was able to include a number of handy instructions that the Sharp LR35902 adopted. (This enhancement was possible because the Intel 8080 didn’t use OPCODE 0xCB, so compatibility wasn’t broken by the added functionality.) Among the extended instructions, some provided bit manipulation on registers and memory, others did block moves and block I/O, while others did byte searches. Of these extensions, only the bit manipulation instructions made it into the Sharp LR35902.

The Z80 also provided a more flexible interrupt system, added two index registers, named IX and IY, used for the common base+offset memory addressing, and included a second, alternate set of registers for fast context switching; none of these features were included in the Sharp LR35902.

That is about enough on the Intel 8080 and the Zilog Z80, two main characters of a sweet era.

Continue with: The Nintendo Game Boy, Part 2: The Game Boy’s CPU | RealBoy.


Did you like this post? Do you have any suggestions? Please rate this post or leave us a comment so we can improve the quality of our work!


14 thoughts on “The Nintendo® Game Boy™, Part 1: The Intel 8080 and the Zilog Z80.”

  1. “Each of the 8 bits of this register carried a special meaning; for example, bit 8 was used as the ‘zero flag’. It was set when the last executed instruction resulted in a value of zero. Other such bits or ‘flags’ included a carry, sign, half-carry and parity flags.”

    Does “bit 8” here refer to the least significant or most significant bit? I’m assuming most significant.

  2. Hi, I hope you still read this 😀

    How come all the docs for the DMG CPU tell me a machine cycle is 4 clock cycles long, when we only have fetch-decode-execute? What is the extra cycle?

    Good job on your blog, I really enjoyed reading it.

      • I have the feeling that there is no need for a writeback cycle, because the output of the ALU uses latches, so at the end of the execute cycle you can just write the result back to the destination. I am not 100% sure of that. Also, I believe the fourth cycle has something to do with a wait cycle, something related to the memory. But I have never fully understood it.

      • Oh, also, I realized that the ALU is 4 bits wide, instead of 8. That means that to execute an instruction we need 2 cycles to get the result from the ALU. It might come from there… But I’m not 100% sure.

    • Hi.
      Indeed, it is difficult to find documentation of the internals of the Game Boy CPU. The following is from the Z80 CPU User Manual from Zilog, but it can give us a hint on what is happening inside the CPU of the Game Boy:

      As you know, 1 machine cycle is ‘equivalent’ to 4 clock cycles. Now, the ‘fetch-decode-execute’ is all done in 1 machine cycle (4 clock cycles); the ‘fetch’ part is done during the first 2 clock cycles, while the ‘decode-execute’ part is done during the last 2 clock cycles. For the simplest instructions, this is it. For more complicated instructions, from 1 to 6 more machine cycles (4 to 24 more clock cycles) follow. But the important thing to note is that ‘fetch’, ‘decode’ and ‘execute’ are not necessarily 1 clock cycle each; the whole sequence is done in 4 clock cycles, a single machine cycle.

      • Thank you very much for your reply. I’ve been reading about it for a very long time now, so I’ve learned a couple of things. Yes, indeed, the fetch takes 2 clock cycles; the second cycle is also used to read a wait signal from the memory, in case the memory needs more time to make the data available. The second thing I learned is that the ALU is indeed 4 bits wide. On the execute clock cycle, it performs the desired operation on the least significant nibble, and the second execute cycle is pipelined, performed at the same time as the fetch of the next instruction. It is way more complicated than what I first thought, but it is much clearer in my head now. Please keep up your work. It is very impressive. Thanks again.

  3. For the 4-bit ALU, you can check Ken Shirriff’s blog post on the Z80 ALU. For the fetch-decode-execute cycle, I read the official Z80 documentation, and for the Z80 pipeline I read an interview with the Z80 developers. Everything that I am telling you concerns the Z80, so bear in mind that it might all be different on the Sharp LR35902; I am just supposing that they work the same way. The only way to be 100% sure is to reverse engineer the chip itself. But I don’t have the time nor the intelligence to do it :(.

