By Ramy Logan
In the early days of the modern-day computer (the late 1950s and early 1960s), the "monitor" was nothing more than a punch card that humans had to decipher. As humanity's need for computers grew, the punch card was no longer enough to convey information to the user. This growing need brought about the invention of the first real monitor.
The early monitors with a screen, closer to what we know as computer displays today, used Cathode Ray Tubes (CRTs) to display information. These screens were vacuum tubes coated on one end with phosphor; when electrons strike the phosphor, it emits light. As technology advanced and screens gained the ability to display more than just code, computers went from being a tool for experts to a common household item.
Drawing electricity from a power supply, most monitors today use a technology known as liquid crystal display, or LCD. These displays are backlit, and the liquid crystals react to electrodes running in a grid behind them. This causes the display to act like a set of windows, allowing varying amounts of light to pass through to RGB filters that give our monitors the wide variety of color we have today. There are two ways a liquid crystal display can be backlit. The first is what we commonly hear about today: light-emitting diodes, or LEDs. They are the newer form of backlighting but, unlike most technology, are not necessarily superior to their older counterpart. The older method is the cold cathode fluorescent lamp, or CCFL. While it takes longer to warm up and reacts a few milliseconds slower than LEDs, some computer users prefer it because it has its benefits, including the ability (in some monitors) to deliver both greater contrast and brightness than its younger counterpart.
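The "windows" idea above can be sketched as a toy model in Python. This is purely illustrative, not any real display driver API: it assumes a white backlight of a given brightness and treats each RGB subpixel's liquid crystal as an aperture that passes some fraction of that light through its color filter.

```python
# Toy model of a single LCD pixel (illustrative only, not a real driver API).
# The backlight (LED or CCFL) emits white light; each RGB subpixel's liquid
# crystal "window" opens by a fraction between 0.0 (blocked) and 1.0 (open).

def lcd_pixel_color(backlight_nits, r_open, g_open, b_open):
    """Return the (R, G, B) luminance leaving one pixel.

    backlight_nits: brightness of the backlight in nits (assumed value).
    r_open / g_open / b_open: how far each subpixel lets light through.
    """
    for frac in (r_open, g_open, b_open):
        if not 0.0 <= frac <= 1.0:
            raise ValueError("subpixel aperture must be between 0 and 1")
    # Each color filter passes only its own band of the white backlight.
    return (backlight_nits * r_open,
            backlight_nits * g_open,
            backlight_nits * b_open)

# Fully open subpixels appear white; fully closed ones appear black
# (in a real panel some backlight always leaks through, which is one
# reason contrast differs between backlight technologies).
print(lcd_pixel_color(250, 1.0, 0.5, 0.0))
```

In this simplified model, the perceived color is just the backlight brightness scaled per subpixel; a real panel adds gamma curves, filter spectra, and light leakage on top of this.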