
How computer monitors are classified

Did you know that prehistory ends with the appearance of writing? That's right: history begins around 3800 BC, and writing has only evolved since then, from hieroglyphs to WhatsApp messages.

Before getting to that point, writing passed through several media: the printing press, the typewriter, and the computer. And what distinguishes the latter two more than the monitor? This piece of hardware has changed the way we write and the way we connect with the digital world.

It is impossible to imagine the Internet, or computing in general, without this hardware. To learn more about it, understand how it works, and make a smarter purchase, we recommend this post.

What is a computer monitor and what is this hardware for?

A monitor is a device designed to display, visually and in real time, information about the processes running on the computer; that is, the data the system converts into images through the interface. Also known as a screen, it should not be confused with a TV, even though the differences today are minor. The two devices are designed for different purposes and therefore have different characteristics. Until recently, the distance between monitors and TVs was quite explicit.

The resolution, design, and types of connections were different. These days, however, both use high-definition connections and are even quite similar in shape. The real difference is that monitors are built to reproduce colors as faithfully as possible, while televisions enhance them to look more attractive. Another difference is response time, that is, how quickly the pixels change state; monitors are far faster in this respect.

Likewise, the refresh rate, along with variable refresh rate (VRR) support, is much higher in monitors, and it is one of the factors that define the quality and price of a product. That said, manufacturers are reportedly bringing these technologies to Smart TVs as well, partly because of the number of people who use the TV exclusively for video game consoles.
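To put those rates in perspective, here is a minimal sketch, an illustration added to this article rather than part of it, of the time budget each frame gets at common refresh rates:

```python
# Illustrative arithmetic (not from the original article): how long
# each frame stays on screen at common refresh rates. A higher rate
# means pixels are updated more often, which is why gamers favor
# high-refresh monitors.
for hz in (60, 120, 144, 240):
    frame_ms = 1000 / hz  # milliseconds per frame
    print(f"{hz} Hz -> {frame_ms:.1f} ms per frame")
```

At 240 Hz a frame lasts barely 4 ms, which is also why the fast pixel response times mentioned above matter so much on monitors.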

History and evolution: how have computer monitors changed over the years?

Like history and prehistory, the timeline of monitors is divided in two: cathode ray tube (CRT) technology and flat screens. The first MDA monitors, which appeared in 1981, were monochrome, meaning they displayed only one color, and were designed for the text interfaces of the time. Soon after, CGA monitors offered a color scale thanks to a dedicated graphics card.

In 1984 IBM developed the EGA monitor, which supported up to 16 colors and a better resolution than its predecessors. Then, in 1987, IBM released monitors with the VGA (Video Graphics Array) connection, able to display 256 colors and a maximum resolution of 640 x 480 pixels. Many screens and devices still carry this type of connector today, although updated versions admit a resolution of 1024 x 768.
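As a side note on those figures: 256 colors is exactly what one byte per pixel can encode (2^8 = 256). A minimal sketch, added here for illustration and taking the resolutions above at face value, of how much framebuffer memory such modes need:

```python
# Illustrative arithmetic (not from the original article): framebuffer
# size for the display modes mentioned above, assuming 8 bits (one
# byte) per pixel, which is enough for 256 colors since 2**8 == 256.
def framebuffer_kib(width: int, height: int, bits_per_pixel: int) -> float:
    """Framebuffer size in KiB for a given display mode."""
    return width * height * bits_per_pixel / 8 / 1024

print(framebuffer_kib(640, 480, 8))    # 300.0 KiB for 640x480, 256 colors
print(framebuffer_kib(1024, 768, 8))   # 768.0 KiB for 1024x768, 256 colors
```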

The next generation of computer monitors was tubeless and operated entirely on electronic components. The first exponent of these new devices was the plasma screen, in which tiny fluorescent cells made up each pixel. They offered high brightness and definition, although they were expensive and consumed a lot of electrical power. LCD (liquid crystal display) panels are the most widely used today, in monitors, televisions, and mobile devices alike.

They consume very little power and can display Full HD image quality. LED monitors are built from light-emitting diodes and are subdivided into OLED, AMOLED, and Super AMOLED; the latter integrates touch detection into the screen itself, and these panels are also brighter and reflect less ambient light. Among the latest developments in this hardware are not only curved screens but also flexible ones. Several devices already offer this capability, and it should not take long to reach PC monitors.

Updated on 2021-03-29. Published on 2021-03-29. Category: Computer class. Author: Oscar olg. Source: internetpasoapaso