9287 Views, 17 Replies


Video Game Bits - What Are They? Answered

I've recently been getting into retro gaming, and I've noticed a lot of things, like bits. I know more bits means better graphics, but what are bits a measure of? I've also noticed how they go: by powers of 2, like this: 1, 2, 4, 8, 16, 32, 64, 128, etc. Lastly, how many bits did the Atari 2600 have?

17 Replies

RaymondU (author) 2014-12-17

Kelseymh, thank you for your well-delivered answer. I do have a question about the 8-bit / 256-color relationship. Can you explain why 8-bit gives you 256 colors while 24-bit can have millions? Can you dumb it down for me? I want to know it well enough that I can teach it to someone else. The answer is probably right in front of me, but I'm stuck trying to figure this out and it's not adding up. Please and thank you!


kelseymh (author) 2010-05-20

A "bit" is a single binary digit, a one or a zero.  There are two different places in computing where bits get counted:  in color displays, and in processing (the CPU or GPU).

Each pixel ("picture element") on your screen is assigned a number, which is translated into a color for display.  The number of bits tells you how many colors can simultaneously be displayed on the screen.  For example, "8-bit color" means that you can only have a maximum of 256 colors on your display.  Modern systems use "24-bit color", 8 bits each for red, blue, and green intensities, for a total of 16.78 million colors.

In computer processing, numbers are handled not as individual bits, but in bundles of eight called bytes.  The most common processors out there store numerical values in blocks of four bytes each (32 bits).  For integers, that means values from 0 to about 4 billion, or the range -2 billion to +2 billion.  More recent and more powerful processors store and manipulate numbers in blocks of eight bytes each, or 64 bits.
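
To make those numbers concrete, here's a quick Python sketch (my own illustration, not part of the original reply) that computes the color counts and integer ranges mentioned above:

    # n bits can represent 2**n distinct values.
    for bits in (8, 24):
        print(f"{bits}-bit color: {2**bits:,} colors")
    # 8-bit color: 256 colors
    # 24-bit color: 16,777,216 colors

    # 32-bit integers: unsigned covers 0 to 2**32 - 1 (about 4 billion);
    # signed covers -2**31 to 2**31 - 1 (about -2 billion to +2 billion).
    print(f"unsigned 32-bit max: {2**32 - 1:,}")
    print(f"signed 32-bit range: {-(2**31):,} to {2**31 - 1:,}")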


dsman195276 (author) 2010-05-21

In short, the bit count of a graphics mode tells you how many bits (binary digits) make up one color on the screen. Nowadays it's almost always 32-bit, where 32 bits of data define the color of each pixel on the screen.

In old games, this was usually not the case. Instead of defining the amount of each primary color that makes up a pixel (red, blue, and green), they defined a 'palette' of colors, and each binary sequence (like 1010) selected one of the preset colors. This was because 32-bit graphics took up too much memory back then.

The commonly used old graphics modes were 16 colors (4-bit) and 256 colors (8-bit), where the bit count is how many bits represent a color. The more bits, the more different sequences you can have, and ergo more colors, which translates to better graphics.

Example:

We set the binary sequence 1010 to a value of 255 red, 255 blue, and 255 green (that makes the color white).

Now every time that 1010 appears in the graphics memory, we color the corresponding pixel that color(which is white in this case).

As you can see, the more colors, the more variation you can have from pixel to pixel, allowing for better-quality pictures because you don't have to have such big jumps between colors.
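
A minimal Python sketch of that palette idea (the palette entries below are made up for illustration):

    # 4-bit color: each 4-bit index (0-15) selects one preset RGB color.
    palette = {
        0b0000: (0, 0, 0),        # black
        0b1010: (255, 255, 255),  # white, as in the example above
        0b0101: (255, 0, 0),      # red
        # ... up to 16 entries total
    }

    # Graphics memory then holds indices, not full colors:
    frame = [[0b0000, 0b1010],
             [0b1010, 0b0101]]

    # Drawing means looking each index up in the palette:
    for row in frame:
        print([palette[idx] for idx in row])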


caitlinsdad (author) 2010-05-20

You need to look into the history of computing: bits and bytes, binary, etc.  It was the computing power of the processor, based on how many on-off switches (binary) it could think about at once.  Those are grouped into instruction codes that you know as programming.  I believe you might be referring to the Altair 2600, one of the early computers before PCs... or was that an Atari game console?


kelseymh (author, replying to caitlinsdad) 2010-05-20

Nope, NYPA was right about the Atari 2600.  We had one of these when I was a little kid.


NYPA (author, replying to kelseymh) 2010-05-21

Really? I've only seen one in my whole life. It was selling for $30 and was all black. Across the room was a 7800.


caitlinsdad (author, replying to kelseymh) 2010-05-20

I was thinking of the Altair 8800, my mistake. I wish I had an ARP 2600. But I do have a Digi-Comp somewhere in the basement.


lemonie (author) 2010-05-21

An easy entry point is RGB. TVs, monitors, displays, etc. use a red, green, blue mix. If you assign one byte to each colour, you've got 256 shades of each, which combines to more than 16 million colours. Add another byte and you can put a "brightness" on top, getting 256 × 256 shades of R, G & B and 256 times more colours.
If the bit count refers to the CPU rather than the GPU, it's the data bus that's wider, so more data is processed per clock cycle.
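
A rough sketch of that one-byte-per-channel idea in Python (the channel values are arbitrary examples):

    # One byte (0-255) per channel gives 256 shades each of R, G, and B.
    r, g, b = 200, 50, 100

    # Pack the three bytes into a single 24-bit colour value.
    colour = (r << 16) | (g << 8) | b
    print(f"packed: 0x{colour:06X}")  # packed: 0xC83264

    # Total distinct colours with 8 bits per channel:
    print(f"{256 ** 3:,} colours")    # 16,777,216 (> 16 million)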

There were a lot of good 8-bit games...

L


NachoMahma (author) 2010-05-20

.  8-bit color is 2^8 (256) discrete colors. Modern systems use 16-bit color, which is 2^16 (65,536), or more (sometimes much more) colors.
Atari 2600.


DJ Radio (author, replying to NachoMahma) 2010-05-21

I thought modern systems used either 32- or 64-bit.


NachoMahma (author, replying to DJ Radio) 2010-05-21

.  That was rather awkward wording, wasn't it? :( "or more (sometimes much more)". Yes, 32- and 64-bit graphics are common nowadays.


Lithium Rain (author, replying to DJ Radio) 2010-05-21

Processors and operating systems, not graphics.
