02-27-2011, 09:50 AM
In terms of graphics, #-bits refers to the color depth the hardware can display on-screen. Color depth is, in simple terms, the number of colors the hardware has to work with: on an 8-bit display the hardware has a pool of 256 colors to choose from, while on a 16-bit display the pool grows to 65,536 colors.
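To put some numbers on it, the palette size is just 2 raised to the bit count. A quick, purely illustrative Python sketch:

# color depth -> number of possible colors is just 2 ** bits
for bits in (8, 16, 24):
    print(f"{bits}-bit color depth -> {2 ** bits:,} possible colors")
# prints: 8-bit -> 256, 16-bit -> 65,536, 24-bit -> 16,777,216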
However, color depth says nothing about how the hardware displays sprites and backgrounds on-screen. Two consoles can both work within the 16-bit color depth range, yet the same image can (and probably will) look different on each, depending on each system's hardware specifications and on each developer's skill at working within those specifications.
Basically, the amount of bits doesn't determine how a sprite (or artwork in general) should look; only the spriter/pixel artist determines the final result of working under those limitations. I bring this up because people often use "8-bit" as a terrible excuse for simple/poor/bad shading on their sprites.
Also, in terms of specifications, #-bits refers to the hardware's processing power. The more powerful the processor, the faster and more accurately it can display elements on screen without problems. A game that tries to push far beyond the console's processing power can affect how graphics and elements are displayed on-screen. For example, whenever a game on the NES tried to display more sprites than the console could handle (the NES could only draw 8 sprites per scanline), the hardware had to resort to flickering, which is nothing more than alternately showing one sprite while hiding another from frame to frame. It was extremely annoying when it happened.
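Here's a rough toy sketch in Python of that flicker idea (not real NES code, just the rotate-the-priority trick games used so every sprite on a crowded scanline still gets drawn some of the time):

# toy illustration: if more sprites share a scanline than the hardware limit
# (8 on the NES), rotate which ones get drawn each frame -- the rest drop out
# for that frame, which is the flicker you see on screen.
SPRITES_PER_SCANLINE = 8

def sprites_drawn_this_frame(sprites_on_line, frame):
    if len(sprites_on_line) <= SPRITES_PER_SCANLINE:
        return sprites_on_line                    # everything fits, no flicker
    start = frame % len(sprites_on_line)          # shift priority every frame
    rotated = sprites_on_line[start:] + sprites_on_line[:start]
    return rotated[:SPRITES_PER_SCANLINE]         # only the first 8 get drawn

line = [f"sprite{i}" for i in range(10)]          # 10 sprites on one scanline
for frame in range(3):
    print(frame, sprites_drawn_this_frame(line, frame))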