Video Game bits - What are they?

I've recently been getting into retro gaming, and I've noticed a lot of things, like bits. I know more bits means better graphics, but what are bits a measure of? I've also noticed how they go up: by powers of 2, like this: 1, 2, 4, 8, 16, 32, 64, 128, etc. Lastly, how many bits was the Atari 2600?


RaymondU 2 years ago

kelseymh, thank you for your well-delivered answer. I do have a question regarding the 8-bit / 256-color ratio. Can you explain why 8-bit is granted 256 colors while 24-bit can have millions? Can you dumb it down for me? I want to know it well enough that I can teach it to someone else. The answer is probably right in front of me, but I'm stuck trying to figure this out and it's not adding up. Please and thank you!

kelseymh 7 years ago
A "bit" is a single binary digit, a one or a zero.  There are two different places in computing where bits get counted:  in color displays, and in processing (the CPU or GPU).

Each pixel ("picture element") on your screen is assigned a number, which is translated into a color for display.  The number of bits tells you how many colors can simultaneously be displayed on the screen.  For example, "8-bit color" means that you can only have a maximum of 256 colors on your display.  Modern systems use "24-bit color", 8 bits each for red, blue, and green intensities, for a total of 16.78 million colors.
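
If it helps to see the arithmetic, here is a small Python sketch (my own illustration, not tied to any particular console) that prints how many colors each bit depth allows, and how 24-bit color packs the three 8-bit channels into one number:

# Illustration only: n bits give 2**n possible values per pixel.
for bits in (4, 8, 16, 24):
    print(f"{bits}-bit color -> {2**bits:,} possible colors")

# 24-bit color packs three 8-bit channel intensities (0-255 each)
# into a single number: red in the top byte, then green, then blue.
def pack_rgb(r, g, b):
    return (r << 16) | (g << 8) | b

print(hex(pack_rgb(255, 128, 0)))  # prints 0xff8000

Running it shows 4-bit gives 16 colors, 8-bit gives 256, and 24-bit gives 16,777,216 (the "16.78 million" above).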

In computer processing, numbers are handled not in individual bits, but in bundles of eight called bytes.  The most common processor out there stores numerical values in blocks of four bytes each (32 bits).  For integers, that means values from 0 to about 4 billion, or the range -2 billion to +2 billion.  More recent and more powerful processors store and manipulate numbers in blocks of eight bytes each, or 64 bits.
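
To make those ranges concrete, here is another small Python sketch (again just illustrative arithmetic, not taken from any processor manual) showing the unsigned and signed ranges for common word sizes:

# n-bit unsigned: 0 .. 2**n - 1; n-bit signed: -2**(n-1) .. 2**(n-1) - 1.
for n in (8, 16, 32, 64):
    print(f"{n}-bit unsigned: 0 to {2**n - 1:,}")
    print(f"{n}-bit signed:   {-(2**(n-1)):,} to {2**(n-1) - 1:,}")

For 32 bits that works out to 0 through 4,294,967,295 unsigned, or roughly -2 billion to +2 billion signed.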
word.
double word.
longword.
infiniteword.
dsman195276 7 years ago
In short, the bits in your graphics mode tell you how many bits (binary digits) make up one color on the screen. Nowadays it's almost always 32-bit, where 32 bits of data define the color of each pixel on the screen.

In old games, this was usually not the case. Instead of defining the amount of each primary color (red, green, and blue) that makes up each pixel, they defined a 'palette' of colors, and each binary sequence (like 1010) selected one of the preset colors. This was because 32-bit graphics took up too much memory back then.

The commonly used old graphics modes were 16-color (4-bit) and 256-color (8-bit). The number stands for how many bits represent a color. The more bits, the more different sequences you can have, and ergo more colors, which translates to better graphics.

Example:

We set the binary sequence 1010 to a value of 64 red, 64 green, and 64 blue (that makes a dark gray).

Now every time 1010 appears in the graphics memory, we color the corresponding pixel that color (the dark gray, in this case).

As you can see, the more colors you have, the more variation you can get from pixel to pixel, allowing for better-quality pictures because you don't have to have such big jumps between colors.
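
Here is a rough Python sketch of that palette idea (the indices and colors are invented for illustration): graphics memory stores only a small index per pixel, and the display hardware looks the index up in a palette table to get the actual RGB color.

# Toy sketch of indexed ("palette") color. The palette entries here are
# made up for illustration; real consoles had their own palettes.
palette = {
    0b0000: (0, 0, 0),        # black
    0b1010: (64, 64, 64),     # the dark gray from the example above
    0b1111: (255, 255, 255),  # white
}

# Graphics memory stores only a small index for each pixel...
framebuffer = [0b0000, 0b1010, 0b1111, 0b1010]

# ...and the hardware looks each index up to get the actual RGB color.
pixels = [palette[i] for i in framebuffer]
print(pixels)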
caitlinsdad 7 years ago
You need to look into the history of computing: bits and bytes, binary, etc. It was the computing power of the processor, based on how many on-off switches (binary) it could think about at once. Those are grouped into the instruction codes that you know as programming. I believe you might be referring to the Altair 2600s, which were the early computers before PCs, or was that an Atari game console?
kelseymh  caitlinsdad 7 years ago
Nope, NYPA was right about the Atari 2600. We had one of these when I was a little kid.
NYPA (author)  kelseymh 7 years ago
Really? I've only seen one in my whole life. It was selling for $30 and was all black. Across the room was a 7800.