What is monitor response time? Tips for choosing an LCD monitor


When discussing the various parameters of LCD monitors - and this topic comes up regularly not only in our articles, but on almost any hardware site that covers monitors - we can distinguish three levels at which the problem is discussed.

Level one, basic: is the manufacturer deceiving us? In general, the answer at the moment is quite mundane: serious monitor manufacturers do not stoop to outright deception.

Level two, more interesting: what do the stated parameters actually mean? In essence, this boils down to the question of the conditions under which manufacturers measure these parameters, and what practical limitations those conditions impose on the applicability of the results. A good example is the measurement of response time according to the ISO 13406-2 standard, where it was defined as the sum of the times the matrix takes to switch from black to white and back. Research shows that for all types of matrices this particular transition takes the minimum time, while on transitions between shades of gray the response time can be many times higher, which means that in practice the matrix will not look as fast as it does on paper. However, this example cannot be attributed to the first level of discussion, since it cannot be said that the manufacturer is deceiving us: if we set the monitor to maximum contrast and measure the black-white-black switching time, it will indeed match the declared value.
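The gap between the "paper" figure and everyday behavior is easy to illustrate numerically. The transition times below are invented purely for illustration (real values depend on the specific panel), but they show why the black-white-black sum can look much better than the average gray-to-gray time:

```python
# Hypothetical response times (ms) for a handful of transitions between
# brightness levels 0..255. All numbers are made up for illustration.
# response[(a, b)] = time to switch from level a to level b
response = {
    (0, 255): 5.0, (255, 0): 4.0,      # black<->white: the fastest pair
    (64, 128): 18.0, (128, 64): 16.0,  # gray-to-gray transitions are slower
    (128, 192): 14.0, (192, 128): 15.0,
    (0, 128): 12.0, (128, 0): 9.0,
}

# The ISO 13406-2 figure: black-to-white plus white-to-black
iso_time = response[(0, 255)] + response[(255, 0)]

# An average over all measured transitions, closer to what the eye sees
gtg_avg = sum(response.values()) / len(response)

print(f"ISO 13406-2 (black-white-black): {iso_time:.1f} ms")
print(f"Average gray-to-gray:           {gtg_avg:.1f} ms")
```

With these illustrative numbers the nameplate figure is 9 ms while the gray-to-gray average is 11.6 ms; on real panels of that era the disparity was often several-fold.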

However, there is an even more interesting level, the third: the question of how certain parameters are perceived by our eyes. Leaving monitors aside for now (we will get to them below), let me give an example from acoustics: from a purely technical point of view, tube amplifiers have rather mediocre parameters (a high level of harmonics, poor impulse response, and so on), so there is simply no point in talking about fidelity of sound reproduction in connection with them. Nevertheless, many listeners like the sound of tube equipment - not because it is objectively better than transistor equipment (as I said, it is not), but because the distortions it introduces are pleasant to the ear.

Of course, the conversation about the subtleties of perception comes when the parameters of the devices under discussion are good enough for such subtleties to have a noticeable impact. You can buy computer audio speakers for ten dollars - no matter what amplifier you connect them to, they won’t sound any better, because their own distortions obviously exceed any flaws in the amplifier. It’s the same with monitors - while the response time of the matrices was tens of milliseconds, there was simply no point in discussing the features of image perception by the retina; now, when the response time has been reduced to a few milliseconds, it suddenly turns out that the performance of the monitor - not the rated performance, but its subjective perception by a person - is determined not only by milliseconds...

In this article I would like to discuss both some of the nameplate parameters of monitors - how manufacturers measure them, how well they match reality, and so on - and some points related specifically to the characteristics of human vision. First of all, this concerns the response time of monitors.

Monitor response time and eye response time

For a long time in many monitor reviews - I plead guilty myself - one could come across the claim that as soon as the response time of LCD panels (the real response time, not the nameplate value, which, as we all know, when measured according to ISO 13406-2, reflects reality rather loosely, to put it mildly) drops to 2...4 ms, we can simply forget about this parameter: reducing it further will give us nothing new, since we will stop noticing the blurring anyway.

And now such monitors have appeared - the latest gaming models on TN matrices with response time compensation do deliver an average gray-to-gray (GtG) time on the order of a few milliseconds. Let's not discuss things like RTC artifacts or the inherent shortcomings of TN technology for now - all that matters to us is that the above numbers are actually achieved. However, if you put one of these monitors next to an ordinary CRT, many people will notice that the CRT is still faster.

Oddly enough, it does not follow from this that we need to wait for LCD monitors with a response of 1 ms, 0.5 ms... That is, you can wait for them, but such panels themselves will not solve the problem - moreover, subjectively they won't even be much different from modern 2...4 ms panels. Because the problem here is no longer in the panel, but in the peculiarities of human vision.

Everyone knows about retinal persistence. Look at a bright object for a second or two, then close your eyes - and for a few more seconds you will see a slowly fading "imprint" of that object. Of course, the imprint will be quite vague, really just an outline, but we are talking about a period as long as whole seconds. For about 10...20 ms after the actual picture disappears, the retina retains its entire image, and only then does it quickly fade, leaving just the outlines of the brightest objects.

In the case of CRT monitors, the persistence of the retina plays a positive role: thanks to it, we do not notice the flickering of the screen. The afterglow of the phosphor in modern tubes lasts about 1 ms, while the beam takes 10 ms to sweep the screen (at a 100 Hz refresh rate); that is, if our vision had no persistence, we would see a light stripe only 1/10 of the screen height wide running from top to bottom. This is easy to demonstrate by photographing a CRT monitor at different shutter speeds:


At a shutter speed of 1/50 sec (20 ms), we see a normal image that occupies the entire screen.


When the shutter speed is reduced to 1/200 sec (5 ms), a wide dark stripe appears in the image - in this time, at a 100 Hz refresh rate, the beam manages to sweep only half of the screen, while on the other half the phosphor has already gone dark.


And finally, at a shutter speed of 1/800 sec (1.25 ms), we see a narrow light stripe running across the screen, followed by a short and quickly darkening trail, while the rest of the screen is simply black. The width of the light stripe is determined precisely by the afterglow time of the phosphor.
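The arithmetic behind these photographs can be sketched very simply. This is a deliberate simplification: it just adds the fraction of the screen the beam sweeps during the exposure to the fraction still glowing from the afterglow, ignoring the gradual darkening of the trail:

```python
# How much of a CRT screen appears lit in a photo at a given shutter speed
refresh_hz = 100
frame_ms = 1000 / refresh_hz   # 10 ms for the beam to sweep the whole screen
afterglow_ms = 1.0             # approximate phosphor afterglow time

def lit_fraction(shutter_ms):
    # Fraction swept during the exposure, plus the strip still glowing
    # from the afterglow; capped at the full screen.
    return min(1.0, (shutter_ms + afterglow_ms) / frame_ms)

for shutter in (20, 5, 1.25):
    print(f"{shutter} ms exposure -> ~{lit_fraction(shutter):.0%} of screen lit")
```

At 20 ms the whole screen is lit, at 5 ms roughly half, and at 1.25 ms only a narrow stripe, matching the three photographs above in character if not in exact numbers.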

On the one hand, this behavior of the phosphor forces us to use high frame rates on CRT monitors, for modern tubes - at least 85 Hz. On the other hand, it is precisely the relatively short afterglow time of the phosphor that leads to the fact that any, even the fastest, modern LCD monitor is still slightly inferior in speed to the good old CRT.

Let's imagine a simple case - a white square moving across a black screen, say, as in one of the tests of the popular TFTTest program. Consider two adjacent frames, between which the square has moved one position from left to right:


In the picture I tried to depict four consecutive “snapshots”, the first and last of which occur when the monitor displays two adjacent frames, and the middle two demonstrate how the monitor and our eye behave in the interval between frames.

In the case of a CRT monitor, the square is duly displayed when the first frame arrives, but after 1 ms (the phosphor afterglow time) it begins to fade quickly and disappears from the screen long before the second frame arrives. However, due to retinal persistence, we continue to see this square for about another 10 ms - by the start of the second frame it is only beginning to fade noticeably. At the moment the monitor draws the second frame, our brain receives two images: a white square in the new place, plus its quickly fading imprint on the retina in the old place.


Active matrix LCD monitors, unlike CRTs, do not flicker - the picture on them is preserved throughout the entire interval between frames. On the one hand, this means you need not worry about the refresh rate (there is no screen flicker at any frequency); on the other... look at the picture above. During the interval between frames, the image on the CRT quickly went dark, but on the LCD it remained unchanged. When the second frame arrives, our white square is displayed in its new position, and the old one fades in 1...2 ms (in fact, the pixel decay time of modern fast TN matrices is about the same as the phosphor afterglow time of a CRT). However, the retina retains a residual image that will fade only 10 ms after the real image disappears, and until then it is added to the new picture. As a result, for about ten milliseconds after the arrival of the second frame, our brain receives two images at once: the real picture of the second frame from the screen, plus the imprint of the first frame superimposed on it. Doesn't that look just like ordinary blurring?.. Only now the old picture is retained not by the slow matrix of the monitor, but by the slow retina of our own eye.

In short, when the native response time of an LCD monitor drops below 10 ms, further reductions have less effect than might be expected - due to the fact that retinal inertia begins to play a noticeable role. Moreover, even if we reduce the monitor's response time to completely negligible amounts, it will still subjectively appear slower than a CRT. The difference lies in the moment from which the storage time of the residual image on the retina is counted: in a CRT this is the arrival time of the first frame plus 1 ms, and in an LCD this is the arrival time of the second frame - which gives us a difference of about ten milliseconds.
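The ten-millisecond difference described above can be laid out as a toy timeline, using the round numbers from the text (about 10 ms of retinal storage, about 1 ms of phosphor afterglow or pixel decay, a 60 Hz refresh rate):

```python
# When does the retinal afterimage of frame N disappear, relative to frame N+1?
frame_period = 16.7  # ms between frames at 60 Hz
retina_ms = 10.0     # approximate retinal storage time
decay_ms = 1.0       # phosphor afterglow / LCD pixel decay time

# CRT: frame N starts fading ~1 ms after it is drawn, so its retinal
# afterimage is gone at t = decay + retina, well before frame N+1 arrives.
crt_fade_end = decay_ms + retina_ms

# LCD: the pixel holds frame N until frame N+1 arrives and only then decays,
# so the afterimage persists about 10 ms INTO the next frame.
lcd_fade_end = frame_period + decay_ms + retina_ms

print(f"CRT: afterimage gone {frame_period - crt_fade_end:.1f} ms before frame N+1")
print(f"LCD: afterimage lingers {lcd_fade_end - frame_period:.1f} ms into frame N+1")
```

The counting point is the whole difference: the CRT afterimage starts its 10 ms countdown almost immediately after frame N, while the LCD afterimage starts it only when frame N+1 replaces the picture.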

The solution to this problem is quite obvious: a CRT appears fast because most of the time between two successive frames its screen is black, which allows the afterimage on the retina to start fading just in time for the arrival of a new frame; to achieve the same effect in an LCD monitor, additional black frames must be artificially inserted between the image frames.

This is exactly what BenQ decided to do when they introduced Black Frame Insertion (BFI) technology some time ago. It was assumed that a monitor equipped with it would insert additional black frames into the output image, thereby emulating the operation of a conventional CRT:


Interestingly, it was initially assumed that the frames would be inserted by changing the image on the matrix, not by blanking the backlight. This is quite acceptable for fast TN matrices, but on MVA and PVA matrices the switching time to black and back is too long: if for modern TNs it is a few milliseconds, then even for the best monitors on *VA matrices it hovers around 10 ms - so for them the time required to insert a black frame simply exceeds the repetition period of the main image frames, and BFI becomes unworkable. Moreover, the limit on the maximum duration of a black frame is imposed not so much by the frame repetition period (16.7 ms at the standard LCD refresh rate of 60 Hz) as by our eyes: if the black insertions last too long, the flickering of the screen will be no less noticeable than on a CRT running at the same 60 Hz. It's unlikely that anyone would like that.
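The feasibility constraint is just a timing budget. A sketch (the switch and dwell times are illustrative round numbers, not measurements of any particular panel):

```python
# Can a matrix-driven black frame fit inside one frame period?
frame_period = 16.7  # ms at 60 Hz

def bfi_feasible(switch_ms, black_ms):
    # The matrix must switch to black, hold it, and switch back,
    # all within a single frame period.
    return 2 * switch_ms + black_ms <= frame_period

print("TN  (2 ms switch, 5 ms black insert):", bfi_feasible(2, 5))
print("*VA (10 ms switch, 5 ms black insert):", bfi_feasible(10, 5))
```

With a ~2 ms switch a TN panel has budget to spare; with a ~10 ms switch the round trip alone exceeds the 16.7 ms frame period, which is exactly why matrix-driven BFI fails on *VA panels.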

Let me note in passing that it is incorrect to speak of doubling the frame rate when using BFI, as some reviewers do: the matrix's own refresh rate must indeed rise to accommodate the added black frames, but the image frame rate remains the same, and from the video card's point of view nothing changes at all.

As a result, when BenQ presented its FP241WZ monitor on a 24" PVA matrix, it actually contained not the promised black frame insertion, but a technology similar in purpose yet completely different in implementation: the black frame is produced not by the matrix but by controlling the backlight lamps, which at the right moment simply switch off for a short time.

Of course, with this implementation of BFI, the response time of the matrix plays no role at all; it can be used equally well on TN matrices and on any others. In the FP241WZ, the panel behind the matrix houses 16 independently controlled horizontal backlight lamps. Unlike a CRT, where (as we saw in the short-exposure photographs) a light scanning stripe runs across the screen, with BFI the stripe is dark: at any given moment, 15 of the 16 lamps are on and one is off. Thus, when BFI is running, a narrow dark stripe sweeps the FP241WZ screen once per frame:


The reasons for choosing this scheme (extinguishing one of the lamps, rather than lighting just one lamp at a time, which would seem to emulate a CRT exactly, or extinguishing and lighting all the lamps at once) are quite obvious: modern LCD monitors run at a 60 Hz refresh rate, so an attempt to emulate a CRT precisely would lead to severe flickering. A narrow dark stripe whose movement is synchronized with the monitor's vertical scan (that is, just before each lamp goes out, the section of the matrix above it was showing the previous frame, and by the time the lamp comes back on, a new frame has already been written to it) partly compensates for the retinal persistence effect described above, while not producing noticeable flickering.

Of course, with such backlight modulation the maximum brightness of the monitor drops somewhat - but in general this is not a problem; modern LCD monitors have a very good brightness reserve (in some models up to 400 cd/m²).

Unfortunately, the FP241WZ has not yet made it to our laboratory, so on the question of the practical application of the new technology I can only refer to the article by the respected site BeHardware, "BenQ FP241WZ: 1rst LCD with screening" (in English). As Vincent Alzieu notes there, the new technology really does improve the subjective impression of the monitor's speed; however, despite the fact that only one of the sixteen backlight lamps is off at any given moment, in some cases screen flickering is still noticeable - primarily on large single-color fields.

Most likely, this is due to the still insufficient frame rate - as I wrote above, the switching of the backlight lamps is synchronized with it, so a full cycle takes 16.7 ms (60 Hz). The sensitivity of the human eye to flicker depends on many conditions (recall, for example, that the 100 Hz flicker of an ordinary fluorescent lamp with electromagnetic ballast is hard to notice when looking directly at it, but easy to notice in peripheral vision), so it seems quite reasonable to assume that the monitor still lacks vertical refresh rate, although using as many as 16 backlight lamps does help: as we know well from CRT monitors, if the entire screen flickered at 60 Hz simultaneously, no close inspection would be needed to notice it, and working with such a monitor would be quite problematic.

The most reasonable way out of this situation seems to be a transition in LCD monitors to a 75 Hz or even 85 Hz refresh rate. Some readers may object that many monitors already support 75 Hz - but, alas, I have to disappoint them: in the vast majority of cases this support exists only on paper. The monitor receives 75 frames per second from the computer, then simply discards every fifth frame and continues to display the same 60 frames per second on its matrix. You can document this behavior by photographing an object moving quickly across the screen with a sufficiently long shutter speed (about 1/5 of a second, so that the camera captures a dozen or so of the monitor's frames): on many monitors, at 60 Hz the photograph will show uniform movement of the object across the screen, while at 75 Hz gaps will appear in it. Subjectively, this is felt as a loss of smoothness of movement.
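The frame-dropping behavior described above is trivial to model: out of every five incoming frames at 75 Hz, one never reaches the matrix, leaving the same 60 frames per second on screen.

```python
# A monitor that accepts 75 Hz input but drops every 5th frame
# to keep its matrix running at 60 Hz.
input_frames = list(range(1, 16))   # 15 incoming frames at 75 Hz
displayed = [f for i, f in enumerate(input_frames) if (i + 1) % 5 != 0]

print(displayed)  # frames 5, 10 and 15 are silently discarded
# -> [1, 2, 3, 4, 6, 7, 8, 9, 11, 12, 13, 14]
```

Twelve frames shown out of fifteen received is exactly the 60/75 ratio, and the periodic missing frame is the "gap" visible in a long-exposure photograph of a moving object.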

In addition to this obstacle - which, I am sure, can easily be overcome if monitor manufacturers want to - there is another: as the frame rate grows, so does the required bandwidth of the interface connecting the monitor. In other words, to switch to 75 Hz, monitors with working resolutions of 1600x1200 and 1680x1050 will need dual-channel Dual Link DVI, since the 165 MHz pixel clock of single-channel Single Link DVI will no longer suffice. This problem is not fundamental, but it does impose some restrictions on compatibility with video cards, especially older ones.
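A rough back-of-the-envelope check of the DVI limit. The 30% blanking overhead below is an assumption for illustration (real overhead depends on the exact video timing standard used), so treat the borderline cases as estimates rather than verdicts:

```python
# Rough pixel-clock estimate: does a mode fit within Single Link DVI (165 MHz)?
DVI_SINGLE_LINK_MHZ = 165

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.30):
    # blanking=1.30 assumes ~30% of the signal is blanking intervals;
    # an illustrative figure, not taken from any timing standard.
    return width * height * refresh_hz * blanking / 1e6

for w, h in [(1600, 1200), (1680, 1050)]:
    for hz in (60, 75):
        clk = pixel_clock_mhz(w, h, hz)
        verdict = "Single Link OK" if clk <= DVI_SINGLE_LINK_MHZ else "Dual Link needed"
        print(f"{w}x{h} @ {hz} Hz: ~{clk:.0f} MHz -> {verdict}")
```

Under this estimate both resolutions fit into Single Link at 60 Hz but exceed 165 MHz at 75 Hz, which is the compatibility problem mentioned above.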

Interestingly, increasing the frame rate by itself reduces image blurring even at the same specified panel response time - and again the effect is tied to retinal persistence. Say the picture moves one centimeter across the screen during one frame period at 60 Hz (16.7 ms); then after the frame changes, the retina captures the new picture plus the shadow of the old one, shifted by a centimeter, superimposed on it. If we double the frame rate, the eye will register frames at intervals of about 8.3 ms instead of 16.7 ms; accordingly, the shift between the two pictures, old and new, becomes half as large, that is, from the eye's point of view the trail behind the moving image is halved. Obviously, in the ideal case of a very high frame rate we would get exactly the same picture as we see in real life, without any additional artificial smearing.
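The relationship above reduces to one line of arithmetic: the perceived trail length is the object's speed times the frame period, so it is inversely proportional to the refresh rate.

```python
# Perceived trail length of a moving object due to the retinal afterimage
# of the previous frame: speed x frame period = speed / refresh rate.
def trail_cm(speed_cm_per_s, refresh_hz):
    return speed_cm_per_s / refresh_hz

speed = 60.0  # cm/s across the screen -> exactly 1 cm per frame at 60 Hz
for hz in (60, 120):
    print(f"{hz} Hz: trail ~{trail_cm(speed, hz):.2f} cm")
```

Doubling the refresh rate from 60 to 120 Hz halves the trail from 1 cm to 0.5 cm, matching the reasoning in the text - provided, as noted below, that every frame is unique.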

Here, however, you need to understand that it is not enough to increase only the frame rate of the monitor, as was done in CRTs to combat screen flickering - it is necessary that all image frames be unique, otherwise there will be absolutely no point in increasing the frequency.

In games this leads to an interesting effect: since in most new titles 60 FPS is considered quite a good result even for modern video cards, raising the refresh rate of the LCD monitor itself will not affect blurring until you either install a video card powerful enough to run the game at a speed matching the monitor's refresh rate, or lower the game's graphics quality far enough. In other words, on LCD monitors with a real frame rate of 85 or 100 Hz, image blurring in games will, if only to a small extent, depend on the speed of the video card - whereas we are used to thinking of blurring as depending solely on the monitor.

The situation is more complicated with films: no matter what video card you install, the frame rate of a film is still 25, at most 30 frames/sec, so raising the monitor's own frame rate will do nothing to reduce blurring in films. In principle, there is a way out: when playing a movie, additional frames, each an interpolated average of two real frames, can be computed in software and inserted into the video stream. Incidentally, this approach would reduce blurring in movies even on existing monitors, since their 60 Hz frame rate is at least twice the frame rate of films, leaving some headroom.

This scheme has already been implemented in the 100 Hz Samsung LE4073BD TV - it has a DSP that automatically tries to compute intermediate frames and inserts them into the video stream between the main ones. On the one hand, the LE4073BD does demonstrate noticeably less blur than TVs without this function; on the other hand, the new technology also produces an unexpected effect: the image begins to resemble cheap soap operas with their unnaturally smooth motion. Some may like this, but, as experience shows, most people prefer the slight blurring of an ordinary monitor to the new "soap opera effect" - especially since in films the blurring of modern LCD monitors is already on the very edge of perception.

Of course, in addition to these problems, purely technical obstacles will arise - raising the frame rate above 60 Hz will mean the need to use Dual Link DVI on monitors with a resolution of 1680x1050.

To summarize briefly, three main points can be noted:

a) When the real response time of an LCD monitor is below 10 ms, reducing it further gives a weaker effect than expected, because retinal persistence begins to play a role. In CRT monitors, the black gap between frames gives the afterimage on the retina time to fade, while in classic LCD monitors there is no such gap - the frames follow one another continuously. Therefore, further efforts by manufacturers to increase monitor speed will be aimed not so much at reducing the nominal response time as at countering retinal persistence. Moreover, this problem affects not only LCD monitors, but any other active matrix technology in which the pixel glows continuously.

b) The most promising technology at the moment seems to be short-term blanking of the backlight lamps, as in the BenQ FP241WZ: it is relatively easy to implement (the only downside is the need for a fairly large number of backlight lamps in a particular configuration, but for large-diagonal monitors this is an entirely solvable problem), suits all types of matrices, and has no intractable shortcomings. It may only be necessary to raise the refresh rate of new monitors to 75...85 Hz - though perhaps manufacturers will find other ways to solve the flicker problem noticeable on the FP241WZ, so for a final verdict it is worth waiting for other monitor models with backlight blanking to reach the market.

c) Generally speaking, from the point of view of most users, modern monitors (on any type of matrix) are quite fast even without such technologies, so it is worth seriously awaiting the various models with backlight blanking only if the current ones definitely do not satisfy you.

Display Delay (Input Lag)

The topic of frame display delay in some monitor models, much discussed recently on various forums, is only superficially similar to the topic of response time - in fact it is a completely different effect. With ordinary blurring, a frame received by the monitor starts to be displayed immediately, but takes some time to render fully; with input lag, some time passes between the frame's arrival from the video card and the start of its display, a multiple of the monitor's frame period. In other words, the monitor contains a frame buffer - ordinary RAM - holding one or more frames; when a new frame arrives from the video card, it is first written to the buffer and only then shown on the screen.

Objectively measuring this delay is quite simple - you need to connect two monitors (CRT and LCD or two different LCDs) to the two outputs of one video card in cloning mode, then run a timer on them that shows milliseconds, and take a series of photographs of the screens of these monitors. Then, if one of them has a delay, the timers in the photographs will differ by the amount of this delay - while one monitor shows the current timer value, the second will show the value that was several frames earlier. To obtain a reliable result, it is advisable to take at least a couple of dozen photographs, and then discard those that were clearly taken at the time of the frame change. The diagram below shows the results of such measurements for the Samsung SyncMaster 215TW monitor (compared to an LCD monitor that does not have any delay), the horizontal axis shows the difference in the timer readings on the screens of the two monitors, and the vertical axis shows the number of frames with such a difference:


A total of 20 photographs were taken; 4 of them clearly caught the moment of a frame change (two timer values were superimposed on each other, one from the old frame and one from the new), two frames gave a difference of 63 ms, three frames 33 ms, and 11 frames 47 ms. Obviously, the correct result for the 215TW is a latency of 47 ms, which is about three frames.
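Picking the "real" latency out of such a series is a job for the mode, not the mean: outliers come from shots that straddle a frame change, so averaging them in would distort the result. A sketch using the distribution from the measurement above:

```python
from collections import Counter

# The 16 usable timer photographs of the 215TW (ms), matching the
# distribution described in the text: 2 x 63, 3 x 33, 11 x 47.
readings = [63, 63] + [33, 33, 33] + [47] * 11

# Take the most common value rather than the average
latency = Counter(readings).most_common(1)[0][0]
frames = latency / 16.7  # frame period at 60 Hz

print(f"Measured input lag: {latency} ms (~{frames:.1f} frames at 60 Hz)")
```

The mode gives 47 ms, about three frames, while the arithmetic mean of the same series would land at a meaningless in-between value.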

As a small digression, I note that you should be somewhat skeptical of forum posts whose authors claim abnormally low or abnormally high latency on their particular monitors. As a rule, they do not collect sufficient statistics, but rely on a single photograph - and as you saw above, an individual shot can accidentally "catch" a value either higher or lower than the real one, and the higher the shutter speed set on the camera, the greater the likelihood of such an error. To get real numbers, you need to take a dozen or two shots and select the most common delay value.

However, all this is beside the point for us, the buyers: you are hardly going to photograph timers on a monitor in a store before buying it. From a practical point of view, a much more interesting question is whether this delay is worth paying attention to at all. As an example, we will consider the aforementioned SyncMaster 215TW with its 47 ms latency - I am not aware of monitors with higher values, so the choice is quite reasonable.

If we consider 47 ms from the point of view of human reaction speed, it is a fairly small interval, comparable to the time it takes a signal to travel from the brain to the muscles along the nerve fibers. Medicine uses the term "simple sensorimotor reaction time": the interval between the appearance of a stimulus simple enough for the brain to process (for example, a light coming on) and the muscle response (for example, pressing a button). On average, simple sensorimotor reaction time is about 200...250 ms; this includes the time for the eye to register the event and transmit it to the brain, for the brain to recognize it, and for the brain to send a command to the muscles. Compared to this figure, a 47 ms delay does not look too big.

During normal office work, such a delay is simply impossible to notice. You can try as long as you like to spot the difference between the movement of the mouse and the movement of the cursor on the screen - but the very time the brain takes to process these events and link them together (note that tracking a cursor is a far more complex task than noticing a light coming on in a simple-reaction test, so the reaction time here will be even longer) is so large that 47 ms turns out to be an entirely negligible value.

However, on forums many users say that on a new monitor cursor movements feel sluggish and "cottony", that they have trouble hitting small buttons and icons on the first try, and so on - and the blame for all of this is laid on the delay that was absent on the old monitor but present on the new one.

Meanwhile, most people are upgrading to the new, larger monitors either from 19" models with a resolution of 1280x1024, or from CRT monitors altogether. Take, for example, the transition from a 19" LCD to the aforementioned 215TW: the horizontal resolution increases by about a third (from 1280 to 1680 pixels), which means that to move the mouse cursor from the left edge of the screen to the right, the mouse itself has to travel a greater distance - provided its resolution and settings remain the same. This is where the feeling of sluggish, "cottony" movement comes from: try reducing the cursor speed by a third in the mouse driver settings on your current monitor, and you will get exactly the same sensation.

It is exactly the same with missed buttons after changing the monitor: our nervous system, sad as it is to admit, is too slow to register with the eyes the moment "the cursor has reached the button" and deliver the nerve impulse to the finger on the left mouse button before the cursor leaves the button. In fact, accuracy in hitting buttons is nothing more than precision of movement: the brain knows in advance which hand movement corresponds to which cursor movement, and with what delay after starting the movement it must send the command to the finger so that at the moment of the click the cursor is right over the button. Of course, when both the resolution and the physical size of the screen change, all this learned precision becomes useless - the brain has to adapt to the new conditions, and at first, while it acts out of old habit, you really will sometimes miss buttons. But the delay introduced by the monitor has absolutely nothing to do with it. As in the previous experiment, the same effect can be achieved simply by changing the mouse sensitivity: increase it and at first you will overshoot the buttons; decrease it and, on the contrary, you will stop the cursor before reaching them. Of course, after a while the brain adapts to the new conditions, and you start hitting the buttons again.

Therefore, if you change your monitor to one with a significantly different resolution or screen size, do not be lazy: go into the mouse settings and experiment a little with the sensitivity. If you have an old mouse with low optical resolution, it is worth thinking about buying a new, more sensitive one - it will move more smoothly at high speed settings. Honestly, compared with the cost of a new monitor, spending an extra 20 dollars on a good mouse is not so ruinous.

So, we have dealt with work; next come films. In theory, a problem could arise here from desynchronization of the sound (which arrives without delay) and the image (which the monitor delays by 47 ms). However, a little experimenting in any video editor easily establishes that a person notices desynchronization in films only at differences on the order of 200...300 ms, many times more than the monitor in question introduces. Meanwhile, 47 ms is only slightly longer than the period of one film frame (at 25 frames per second the period is 40 ms); it is impossible to notice such a small difference between sound and image.

And finally, the most interesting part: games, the only area where, at least in some cases, the delay introduced by the monitor can make a difference. It should be noted, though, that many of those discussing the problem on forums tend to greatly exaggerate it - for most people, and in most games, the notorious 47 ms plays no role at all. The exception, perhaps, is the situation in a multiplayer shooter where you and your opponent spot each other simultaneously: there, reaction speed really does matter, and an extra 47 ms can become significant. But if you notice the enemy half a second later than he notices you anyway, a few dozen milliseconds will not save the situation.

Note that the monitor's delay affects neither aiming accuracy in FPS games nor cornering accuracy in racing games... In all these cases the same precision of movement is at work: our nervous system cannot react fast enough to press the "fire" button exactly at the moment the sight lands on the enemy, but it adapts superbly to a wide variety of conditions - in particular, to the need to give the finger the command "press!" while the sight has not yet reached the enemy. Any small additional delay therefore simply forces the brain to adapt slightly to new conditions; moreover, if a person accustomed to a monitor with a delay is moved to a model without one, he will have to get used to it in exactly the same way, and for the first quarter of an hour the new monitor will feel suspiciously uncomfortable.

And finally, I have already seen stories on forums several times about how it is supposedly impossible to play games at all on a new monitor because of the notorious latency - stories that ultimately boil down to the fact that a person, having switched from the 1280x1024 resolution of the old monitor to the 1680x1050 of the new one, simply did not consider that his old video card would be too slow at the higher resolution. So, when reading forums, be careful: as a rule, you know nothing about the technical literacy of those who write there, and you cannot tell in advance whether things that are obvious to you are also obvious to them.

The situation with the discussion of monitor delays is aggravated by two further traits inherent, to one degree or another, in most people. First, many people are prone to overly complex explanations of simple phenomena - they prefer to believe that a bright point in the sky is a UFO rather than an ordinary weather balloon, or that strange shadows in NASA lunar photographs indicate not the unevenness of the lunar landscape but that people never went to the Moon, and so on. Indeed, anyone interested in the activities of ufologists and similar organizations will tell you that most of their so-called discoveries stem not so much from a lack of simple "earthly" explanations as from a reluctance to look for simple explanations at all, jumping a priori to overly complex theories. Strange as the analogy between ufologists and monitor buyers may be, the latter, once on a forum, often behave the same way: for the most part they do not even try to consider that with a significant change in a monitor's resolution and diagonal, the feel of working with it will change completely regardless of any delays - instead they immediately move on to discussing how the generally insignificant 47 ms delay affects the movement of the mouse cursor.

Secondly, people are prone to self-suggestion. Try taking two beer bottles of different brands, one obviously cheap and one obviously expensive, and pour the same beer into both - the vast majority of people, having tasted it, will say the beer from the bottle with the expensive label tastes better. Cover the labels with opaque tape, and opinions will split evenly. The problem is that our brain cannot completely abstract away external factors: seeing expensive packaging, we subconsciously begin to expect higher quality from its contents, and vice versa. To combat this, all serious subjective comparisons are carried out using the blind test method: every sample under study is given an arbitrary number, and none of the experts taking part in the testing knows, until the very end, how those numbers map to real brands.

Much the same thing happens with the discussed topic of display delay. A person who has just bought or is just about to buy a new monitor goes to a forum on monitors, where he immediately discovers multi-page threads about the delay, in which he is told about “wobbly mouse movements”, and about the fact that it is impossible to play on such a monitor, and many other horrors. And, of course, there are a number of people there who claim that they can see this delay with their eyes. Having read all this, a person goes to the store and begins to look at the monitor he is interested in with the thought “there must be a delay here, people can see it!” Of course, after a while he himself begins to see it - or rather, he believes that he sees it - after which he returns home from the store and writes to the forum “Yes, I looked at this monitor, there really is a delay!” There are also more amusing cases - when people directly write something like “I’ve already been sitting at the monitor in question for two weeks, but only now, after reading the forum, I clearly saw a delay on it.”

Some time ago, videos gained popularity on YouTube in which a window is dragged up and down with the mouse across two adjacent monitors (working in desktop extension mode) - and you can supposedly see clearly how much the window lags on the monitor with the delay. The videos are certainly pretty, but... consider: a monitor with a 60 Hz refresh rate is filmed by a camera whose own sensor scans at 50 Hz, the footage is then saved to a video file at 25 frames per second and uploaded to YouTube, which may well re-encode it yet again without telling us... Do you think much of the original survives all these transformations? In my opinion, not much. Viewing one of these videos frame by frame (after saving it from YouTube and opening it in a video editor) demonstrated this especially clearly: at some moments the difference between the two captured monitors is noticeably greater than the aforementioned 47 ms, at others the windows move synchronously, as if there were no delay at all... In short, complete chaos, senseless and merciless.

So, let's make a short conclusion:

a) In some monitors, the display delay is objectively present, the maximum reliably recorded value is 47 ms.

b) A delay of this magnitude cannot be noticed either during normal work or in films. In games it can matter at certain moments for well-trained players, but in most cases and for most people it is imperceptible there as well.

c) As a rule, discomfort when changing a monitor to a model with a larger diagonal and resolution occurs due to insufficient speed or sensitivity of the mouse, insufficient speed of the video card, as well as the change in screen size itself. However, many people, having read too much on forums, a priori attribute any discomfort on a new monitor to problems with display lag.

In a nutshell: the problem exists in theory, but its practical significance is greatly exaggerated. The vast majority of people will never notice a 47 ms delay anywhere, let alone smaller values.

Contrast: nameplate, real and dynamic

The statement "the contrast of a good CRT monitor is higher than the contrast of an LCD monitor" has perhaps long been perceived by many as an a priori truth requiring no additional proof - after all, we can see for ourselves how noticeably a black background glows in the dark on an LCD screen. No, I am not going to refute this statement outright; it is hard to refute what you can see perfectly well with your own eyes, even sitting in front of the latest S-PVA matrix with a rated contrast of 1000:1.

The contrast stated in the specifications is, as a rule, measured not by the monitor manufacturers but by the makers of the LCD panels themselves, on a special test bench with a fixed input signal and a fixed backlight brightness. It is equal to the ratio of the white level to the black level.

In finished monitors, the picture is complicated primarily by the fact that the black level is determined not only by the characteristics of the matrix but also - sometimes - by the settings of the monitor itself, above all in models where brightness is controlled by the matrix rather than by the backlight lamps. In that case, if the monitor is not tuned carefully enough, its contrast may turn out to be much lower than the rated contrast of the matrix. This effect is clearly visible on Sony monitors, which have two brightness adjustments at once - via the matrix and via the lamps: in them, as the matrix brightness rises above 50%, black quickly turns into gray.

Here I would like to note once again that the opinion that rated contrast can be increased by raising backlight brightness - supposedly the reason many monitor manufacturers install such powerful lamps - is completely wrong. As backlight brightness increases, the white and black levels grow at the same rate, which means their ratio, the contrast, does not change. It is impossible, by means of the backlight alone, to raise the white level without raising the black level.
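The arithmetic behind that statement fits in a few lines (the 300 and 0.3 cd/m² figures below are hypothetical, chosen simply to model a 1000:1 matrix):

```python
# Scaling the backlight multiplies the white and black levels by the
# same factor, so the contrast ratio is unchanged.
def contrast(white_cd_m2, black_cd_m2):
    return white_cd_m2 / black_cd_m2

white, black = 300.0, 0.3     # hypothetical panel: 1000:1
boost = 2.0                   # double the backlight brightness

print(contrast(white, black))                  # 1000.0
print(contrast(white * boost, black * boost))  # still 1000.0
```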

However, all this has already been said many times before, so let's move on to other issues.

Undoubtedly, the rated contrast of modern LCD monitors is still not high enough to compete successfully with good CRT monitors in this parameter - in the dark their screens still glow noticeably even when the picture is completely black. But most of the time we use monitors not in the dark but in daylight, sometimes quite bright daylight. Obviously, in this case the real contrast we observe will differ from the rated figure measured in laboratory semi-darkness: reflected ambient light adds to the screen's own glow.


Above is a photo of two monitors standing side by side: a Samsung SyncMaster 950p+ CRT monitor and a SyncMaster 215TW LCD monitor. Both are turned off; the external lighting is ordinary daylight on a cloudy day. It is clearly visible that in external lighting the screen of the CRT monitor is not just lighter but much lighter than that of the LCD monitor - exactly the opposite of what we observe in the dark with the monitors turned on.

This is explained very simply: the phosphor used in cathode ray tubes is itself light gray. To darken the screen, a tinted film is applied to its glass. Since the phosphor's own glow passes through this film once, while external light passes through it twice (first on the way in to the phosphor, then, reflected from the phosphor, on the way out to our eye), the film attenuates the latter considerably more than the former.

However, a completely black screen cannot be achieved on a CRT: as the transparency of the film decreases, the brightness of the phosphor glow has to be increased, because the film attenuates it too. And that brightness in a CRT is limited to a fairly modest level, since at too high an electron beam current the focusing degrades badly and the image becomes fuzzy and blurred. For this reason, the maximum reasonable brightness of CRT monitors does not exceed 150 cd/m².

In an LCD matrix, there is practically nothing for external light to be reflected from; there is no phosphor in it, only layers of glass, polarizers and liquid crystals. Of course, some small part of the light is reflected from the outer surface of the screen, but most of it freely passes inside and is lost there forever. Therefore, in daylight, the screen of an LCD monitor that is turned off looks almost black.

So, in daylight with the monitors off, a CRT screen is significantly lighter than an LCD screen. If we turn both monitors on, the LCD, owing to its lower rated contrast, will gain a larger increase in black level than the CRT - but even so it will still remain darker than the CRT. If we now draw the curtains, "switching off" the daylight, the situation reverses, and the CRT will have the deeper black.

Thus, the real contrast of monitors depends on the external illumination: the higher it is, the more advantageous the position is for LCD monitors; even in bright light, the picture on them remains contrasty, while on a CRT it noticeably fades. In the dark, on the contrary, the advantage is on the side of the CRT.

By the way, this partly explains the good appearance - at least on the store shelf - of monitors with a glossy screen surface. An ordinary matte coating scatters incident light in all directions, while a glossy one reflects it directionally, like an ordinary mirror - so as long as the light source is not directly behind you, a matrix with a glossy coating will look more contrasty than a matte one. Alas, if the light source does end up behind you, the picture changes radically: a matte screen still scatters light more or less evenly, but a glossy one reflects it straight into your eyes.

It should be noted that all these considerations apply not only to LCD and CRT monitors but to other display technologies as well. For example, the SED panels promised to us in the near future by Toshiba and Canon, with their fantastic rated contrast of 100,000:1 (in other words, black on them in the dark is genuinely black), will in real daylight fade in exactly the same way as CRTs. They use the same phosphor, glowing under electron bombardment, with the same black tinted film in front of it; but while in a CRT further reducing the film's transparency (and thus raising the contrast) is prevented by beam defocusing, in an SED it is hindered by the lifetime of the emitter cathodes, which drops noticeably as the beam current increases.

Recently, however, LCD monitor models have appeared on the market with unusually high declared contrast figures - up to 3000:1 - while using the same matrices as monitors with more familiar numbers in their specifications. The explanation is that these values, huge by LCD standards, refer not to "ordinary" contrast but to so-called dynamic contrast.

The idea is, on the whole, simple: any film has both light scenes and dark ones. In both cases our eye perceives the brightness of the picture as a whole, so if most of the screen is light, the black level in a few dark areas does not matter much, and vice versa. It therefore looks quite reasonable to adjust the backlight brightness automatically to match the image on screen: in dark scenes the backlight can be dimmed, making them even darker, while in light scenes it can be driven to maximum brightness. This automatic adjustment is what is called "dynamic contrast".

The official figures for dynamic contrast are obtained very simply: the white level is measured at maximum backlight brightness, the black level at minimum. As a result, if the matrix has a rated contrast of 1000:1 and the monitor's electronics can automatically vary the backlight brightness threefold, the resulting dynamic contrast will be 3000:1.
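That figure can be reproduced with a single multiplication (the numbers are the ones from the text):

```python
# Dynamic contrast = static matrix contrast x the factor by which
# the electronics can vary the backlight brightness.
static_contrast = 1000    # matrix rated at 1000:1
backlight_range = 3       # max/min backlight brightness

dynamic_contrast = static_contrast * backlight_range
print(dynamic_contrast)   # 3000, i.e. the advertised "3000:1"
```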

At the same time, you need to understand that dynamic contrast mode is suitable only for films, and perhaps games - though in the latter, players would sooner raise the brightness in dark scenes to see what is happening more easily than lower it. For ordinary work, automatic brightness adjustment driven by the on-screen image is not merely useless but downright annoying.

Of course, at any given moment the screen contrast - the ratio of the white level to the black level - does not exceed the monitor's rated static contrast. However, as noted above, in light scenes the black level matters little to the eye, and in dark scenes the same is true of the white level, so automatic brightness adjustment in films is genuinely useful and really does give the impression of a monitor with a noticeably increased dynamic range.

The only downside of the technology is that the brightness is controlled as a whole for the entire screen, so in scenes that combine light and dark objects in equal proportions, the monitor will simply set a certain average brightness. Dynamic contrast will not give anything in dark scenes with individual small very bright objects (for example, a night street with lanterns) - since the overall background will be dark, the monitor will reduce the brightness to a minimum, accordingly dimming bright objects. However, as mentioned above, due to the peculiarities of our perception, these shortcomings are hardly noticeable and in any case less significant than the insufficient contrast of conventional monitors. So overall, the new technology should appeal to many users.

Color rendering: color gamut and LED backlight

A little over two years ago, in the article “Parameters of modern LCD monitors,” I wrote that such a parameter as color gamut is generally unimportant for monitors - simply because it is the same for all monitors. Fortunately, since then the situation has changed for the better - monitor models with increased color gamut have begun to appear on sale.

So what is color gamut?

As is known, humans see light in the wavelength range from approximately 380 to 700 nm, from violet to red. Four types of detectors act as light-sensitive elements in our eye - one type of rods and three types of cones. Rods have excellent sensitivity, but do not distinguish between different wavelengths at all; they perceive the entire range as a whole, which gives us black-and-white vision. Cones, on the contrary, have significantly less sensitivity (and therefore stop working at dusk), but with sufficient illumination they give us color vision - each of the three types of cones is sensitive to its own wavelength range. If a beam of monochromatic light with a wavelength of, say, 400 nm hits our eye, then only one type of cone, responsible for blue color, will react to it. Thus, different types of cones perform approximately the same function as the RGB filters in front of the digital camera sensor.

Although this makes it seem at first glance that our color vision can easily be described by three numbers, each corresponding to a level of red, green or blue, this is not the case. As experiments conducted at the beginning of the last century showed, the processing of information by our eye and brain is less straightforward: if we try to describe color perception with three coordinates (red, green, blue), it turns out that the eye can readily perceive colors for which the red value in such a system would have to be... negative. In other words, human vision cannot be fully described in the RGB system - in reality, the spectral sensitivity curves of the different cone types are somewhat more complicated.


As a result of those experiments, a system was created that describes the entire range of colors perceived by our eyes. Its graphical representation, called the CIE diagram, is shown in the figure above. Within the shaded area lie all the colors our eyes can perceive; the outline of the area corresponds to pure, monochromatic colors, and the interior to non-monochromatic ones, all the way to white (marked by a white dot; in fact, "white" is, from the eye's point of view, a relative concept - depending on conditions, we can perceive quite different colors as white; on the CIE diagram the so-called flat-spectrum point, with coordinates x = y = 1/3, is usually taken as the white point, and under normal conditions the corresponding color looks very cold, bluish).

With the CIE diagram, any color perceived by the human eye can be specified by two numbers: its coordinates x and y along the diagram's horizontal and vertical axes. What is remarkable, however, is not this but the fact that we can recreate any color from a set of several monochromatic colors mixed in suitable proportions - our eye is entirely indifferent to the actual spectrum of the light entering it; all that matters is how strongly each type of receptor, the rods and cones, is excited.

If human vision were adequately described by the RGB model, then to emulate any color the eye can see it would be enough to take three sources - red, green and blue - and mix them in the right proportions. However, as mentioned above, we actually see more colors than RGB can describe, so in practice the question is the opposite: given three sources of different colors, what other colors can we obtain by mixing them?


The answer is very simple and obvious: if you put points with the coordinates of these colors on the CIE diagram, then everything that can be obtained by mixing them will lie inside a triangle with vertices at these points. It is this triangle that is called the “color gamut”.
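As a sketch of that statement: a color with CIE coordinates (x, y) is reproducible by three primaries exactly when its point lies inside their triangle, which is easy to test via the signs of cross products. The sRGB primary coordinates below are the commonly cited ones, my assumption rather than figures from the article:

```python
# Test whether a CIE (x, y) point lies inside the triangle spanned by
# three primaries: the point is inside iff the cross products of
# (vertex - point) pairs, taken around the triangle, share one sign.
def in_gamut(p, a, b, c):
    def cross(o, u, v):
        return (u[0]-o[0])*(v[1]-o[1]) - (u[1]-o[1])*(v[0]-o[0])
    d1, d2, d3 = cross(p, a, b), cross(p, b, c), cross(p, c, a)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

# Commonly cited sRGB primaries (an assumption, not from the article):
R, G, B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)

print(in_gamut((0.313, 0.329), R, G, B))  # white point: True
print(in_gamut((0.08, 0.83), R, G, B))    # spectral green: False
```

The second call illustrates exactly the article's complaint: pure spectral green lies far outside the sRGB triangle.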

The maximum possible color gamut for a system with three basic colors is provided by the so-called laser display (see above in the figure), the basic colors in which are formed by three lasers, red, green and blue. The laser has a very narrow emission spectrum, it has excellent monochromaticity, therefore the coordinates of the corresponding basic colors will lie exactly on the border of the diagram. It is impossible to move them outside, beyond the border - this is a non-physical region, the coordinates of the points in it do not correspond to any light, and any shift of the points inside the diagram will lead to a decrease in the area of ​​the corresponding triangle and, accordingly, a decrease in the color gamut.

As can be clearly seen from the figure, even a laser display is not capable of reproducing all the colors that the human eye sees, although it is quite close to this. You can increase the color gamut only by using a larger number of basic colors (four, five, and so on), or by creating some kind of hypothetical system that can change the coordinates of its basic colors “on the fly” - however, if the first is simply technically difficult at the moment, then the second is generally unrealizable.

In any case, though, it is too early for us to grieve over the shortcomings of laser displays: we do not even have them yet, and what we do have demonstrates a color gamut far inferior to theirs. In other words, in real monitors, both CRT and LCD (with the exception of certain models discussed below), the spectrum of each base color is quite far from monochromatic - in terms of the CIE diagram, this means the triangle's vertices shift from the diagram's boundary toward its center, and the triangle's area noticeably shrinks.

In the picture above, two triangles are drawn: one for a laser display and one for so-called sRGB. In short, the latter corresponds exactly to the typical color gamut of modern LCD and CRT monitors. A sad picture, isn't it? I'm afraid we won't be seeing pure green for a while yet...

The reason for this - in the case of LCD monitors - is the extremely poor spectrum of backlight lamps for LCD panels. Cold cathode fluorescent lamps (CCFL) are used as such - the discharge burning in them produces radiation in the ultraviolet spectrum, which is converted into ordinary white light by a phosphor applied to the walls of the lamp bulb.

In nature, our light sources are usually various hot bodies, above all the Sun. The radiation spectrum of such a body is described by Planck's law, but the key point is that it is continuous: all wavelengths are present in it, and the intensities at nearby wavelengths differ only slightly.

A fluorescent lamp, like other gas-discharge light sources, produces a line spectrum, in which some wavelengths are entirely absent, while the intensities of parts of the spectrum only a few nanometers apart can differ by tens or hundreds of times. Since our eye does not perceive the specific shape of the spectrum, from its point of view the Sun and a fluorescent lamp can give exactly the same light. In a monitor, however, things turn out somewhat more complicated...

So, several fluorescent lamps behind the LCD matrix shine through it. On the front of the matrix is a grid of colored filters - red, green and blue - forming triads of subpixels. Each filter cuts from the lamp light the piece of the spectrum corresponding to its passband - and as we remember, for the maximum color gamut this piece should be as narrow as possible. Now let's imagine that the backlight lamp's spectrum has an intensity peak at 620 nm of, say, 100 arbitrary units. Then for the red subpixel we install a filter with maximum transmission at that same 620 nm and, it would seem, obtain the first vertex of the gamut triangle lying neatly on the boundary of the diagram. It would seem.

The phosphor of even modern fluorescent lamps is a rather capricious thing: we cannot shape its spectrum at will, we can only select, from the set of phosphors known to chemistry, the one that more or less meets our needs. And the best one we can choose has in its spectrum another peak of the same 100 arbitrary units at a wavelength of 575 nm (which is yellow). Our red filter, with maximum transmission at 620 nm, has at that wavelength a transmittance of, let's say, 1/10 of its maximum.

What does this mean? That at the filter's output we get not one wavelength but two: 620 nm at 100 arbitrary units and 575 nm at 100 × 1/10 = 10 arbitrary units (the intensity of the lamp's spectral line multiplied by the filter's transmittance at that wavelength). Not so little, on the whole.
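The same worked example in code form - the two spectral lines and the transmittance values are the ones from the text, here written as percentages to keep the arithmetic exact:

```python
# Output intensity at each wavelength = lamp line intensity times the
# filter's transmittance at that wavelength.
lamp_lines = {620: 100, 575: 100}      # nm -> intensity, arb. units
red_filter_pct = {620: 100, 575: 10}   # transmittance, percent

output = {wl: i * red_filter_pct[wl] // 100 for wl, i in lamp_lines.items()}
print(output)   # {620: 100, 575: 10} -- red with a 10% yellow admixture
```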

Thus, due to the “extra” peak in the spectrum of the lamp, which partially breaks through the filter, we got a polychromatic color instead of monochromatic red - red with an admixture of yellow. On the CIE diagram, this means that the corresponding vertex of the color gamut triangle has moved upward from the bottom edge of the diagram, closer to yellow shades, reducing the area of ​​the color gamut triangle.

However, as they say, it is better to see once than to hear five times. To see what was described above, I turned for help to the Plasma Physics Department of the Skobeltsyn Institute of Nuclear Physics, and soon had an automated spectrographic system at my disposal. It was designed for studying and controlling, via plasma emission spectra, the growth of artificial diamond films in microwave plasma - so it could surely cope with some humble LCD monitor.


We turn on the system (the large and angular black box is the Solar TII MS3504i monochromator, its input port is visible on the left, opposite which there is a light guide with an optical system, the orange cylinder of the photosensor attached to the output port of the monochromator is visible on the right; the system’s power supply is on top)...


We install the input optical system at the desired height and connect the second end of the light guide to it...


And finally, we place it in front of the monitor. The entire system is controlled by a computer, so the process of taking the spectrum in the entire range of interest to us (from 380 to 700 nm) is completed in just a couple of minutes:


The horizontal axis of the graph shows wavelength in angstroms (10 Å = 1 nm), the vertical axis intensity in arbitrary units. For clarity, the graph is colored according to the wavelengths as our eyes perceive them.

The test subject in this case was a Samsung SyncMaster 913N, a fairly old budget model on a TN matrix - but that hardly matters: the same lamps with the same spectrum are used in the vast majority of other modern LCD monitors.

So what do we see in the spectrum? Exactly what was described in words above: besides the three distinct tall peaks corresponding to the blue, green and red subpixels, there is clearly superfluous garbage in the regions of 570...600 nm and 480...500 nm. It is these extra peaks that pull the vertices of the gamut triangle deep into the CIE diagram.

Of course, the most radical way to combat this is to abandon CCFL altogether - and some manufacturers have done so, for example Samsung with its SyncMaster XL20 monitor. In it, the backlight is not fluorescent lamps but a unit of LEDs of three colors - red, green and blue (precisely three, because white LEDs would make no sense: the filters would cut red, green and blue out of the backlight spectrum anyway). Each LED has a neat, clean spectrum that precisely matches the passband of its filter and has no unnecessary side bands:


A pleasure to look at, isn't it?

Of course, the band of each of the LEDs is quite wide, their radiation cannot be called strictly monochromatic, so it will not be possible to compete with a laser display, but when compared with the CCFL spectrum, it is a very pleasant picture, in which it is especially worth noting the neat smooth minima in those two areas where CCFL had completely unnecessary peaks. It is also interesting that the position of the maxima of all three peaks has shifted slightly - with red now noticeably closer to the edge of the visible spectrum, which will also have a positive effect on the color gamut.


And here, in fact, is the color gamut itself. We see that the coverage triangle of the SyncMaster 913N differs little from the modest sRGB, and that compared with the eye's full gamut it is green that suffers most. But the XL20's gamut is hard to confuse with sRGB - it takes in a significantly larger share of green and blue-green shades, as well as deep reds. Not a laser display, of course, but impressive.

However, we won't see LED-backlit home monitors for a long time yet. Even the SyncMaster XL20, due on sale this spring, will cost about $2,000 for a 20" screen, and the 21" NEC SpectraView Reference 21 LED costs almost three times as much - only prepress professionals (for whom both models are primarily intended) are used to such monitor prices, certainly not home users.

However, do not despair - there is hope for the rest of us too. It lies in the appearance on the market of monitors backlit by the same fluorescent lamps but with a new phosphor in which the unnecessary spectral peaks are partly suppressed. These lamps are not as good as LEDs, but they are still noticeably better than the old ones - the color gamut they provide lies roughly halfway between that of models with old lamps and that of models with LED backlighting.

For numerical comparison of color gamut, it is customary to state what percentage of a standard gamut a given monitor covers; sRGB is rather small, so NTSC is commonly used as the reference. Ordinary sRGB monitors cover about 72% of NTSC, monitors with the improved lamps about 97%, and LED-backlit monitors about 114%.
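These percentages are simply ratios of triangle areas on the CIE diagram. With the commonly cited xy coordinates of the sRGB and NTSC primaries (my assumption - the article gives no coordinates), the shoelace formula comes out close to the 72% figure quoted above:

```python
# Gamut coverage as a ratio of triangle areas on the CIE xy diagram,
# computed with the shoelace (cross product) formula.
def tri_area(a, b, c):
    return abs((b[0]-a[0])*(c[1]-a[1]) - (c[0]-a[0])*(b[1]-a[1])) / 2

# Commonly cited primary coordinates (assumed, not from the article):
srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
ntsc = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]

coverage = tri_area(*srgb) / tri_area(*ntsc)
print(round(coverage * 100))    # 71 -- close to the quoted 72% NTSC
```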

What does increased color gamut give us? Manufacturers of LED-backlit monitors usually illustrate their press releases with photographs of the new monitors next to the old ones, simply boosting the color saturation in the former - which is not entirely honest, because in reality the new monitors improve the saturation only of those colors that lie beyond the gamut of the old ones. And of course, viewing such a press release on your old monitor, you will never see the difference, since your monitor cannot reproduce those colors in the first place. It is like judging a color TV broadcast by watching it on a black-and-white set. Then again, one can understand the manufacturers - they have to convey the advantages of the new models in press releases somehow...

In practice, though, there is a difference - I would not call it fundamental, but it definitely favors models with increased color gamut. It shows in very pure, deep reds and greens: if, after long work at an LED-backlit monitor, you move back to the good old CCFL, at first you simply want to turn up its color saturation - until you realize this will not help, that its red and green will remain somehow dull and dirty compared with the LED monitor.

Unfortunately, so far the spread of models with improved backlighting is not going quite as we would like - at Samsung, for example, it began with the SyncMaster 931C, a TN-matrix model. Of course, budget TN monitors also benefit from a wider gamut, but hardly anyone picks such models for color work, given their frankly poor viewing angles. Still, all the major LCD panel manufacturers - LG.Philips LCD, AU Optronics and Samsung - already have S-IPS, MVA and S-PVA panels with 26-27" diagonals and the new backlight lamps.

In the future, lamps with new phosphors will undoubtedly replace the old ones completely - and we will finally go beyond the modest coverage of sRGB, for the first time in the entire history of color computer monitors.

Color rendering: color temperature

In the previous section I briefly mentioned that the concept of "white color" is subjective and depends on external conditions; now I would like to expand on this topic in a little more detail.

So, there really is no standard white color. One could take a flat spectrum as a standard (that is, one in which the intensities at all wavelengths of the optical range are the same), but there is one problem: in most cases it will not look white to the human eye, but very cold, with a bluish tint.

The fact is that, just as you can adjust the white balance in a camera, our brain adjusts this balance for itself depending on the ambient lighting. The light of an incandescent bulb at home in the evening seems to us only slightly yellowish, although the same lamp lit outdoors on a fine sunny day already looks downright yellow - in both cases our brain tunes its white balance to the prevailing lighting, and in these two cases it is different.

The desired white color is usually specified through the concept of "color temperature" - the temperature to which an absolutely black body must be heated for the light it emits to look the desired way. Say, the surface of the Sun has a temperature of about 6000 K - and indeed, the color temperature of sunlight on a clear day is defined as 6000 K. The filament of an incandescent lamp has a temperature of about 2700 K - and the color temperature of its light is likewise 2700 K. Curiously, the higher the body's temperature, the colder its light seems to us, because blue tones begin to predominate in it.
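The "hotter means bluer" effect follows from Wien's displacement law: the emission peak of a black body shifts toward shorter wavelengths as its temperature rises. A quick sketch:

```python
# Wien's displacement law: peak_wavelength [nm] = 2.898e6 [nm*K] / T [K].
# A hotter black body peaks at shorter (bluer) wavelengths, which is why
# its light looks "colder" to us.

def peak_wavelength_nm(temperature_k):
    return 2.898e6 / temperature_k

for t in (2700, 6000):
    print(f"{t} K -> emission peak near {peak_wavelength_nm(t):.0f} nm")
# 2700 K (incandescent filament) peaks in the near infrared, ~1073 nm;
# 6000 K (daylight) peaks near 483 nm, in the visible blue-green.
```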

For sources with a line spectrum - for example, the CCFL lamps mentioned above - the concept of color temperature becomes somewhat more conventional, because their radiation obviously cannot be directly compared with the continuous spectrum of an absolutely black body. So in their case we have to rely on how our eye perceives the spectrum, and instruments for measuring the color temperature of light sources have to emulate the same intricate color-perception characteristic as that of the eye.

In the case of monitors, we can adjust the color temperature from the menu: as a rule, there are three or four preset values (for some models, significantly more) plus the ability to adjust the levels of the basic RGB colors individually. The latter is inconvenient compared to CRT monitors, where it was the temperature itself that was adjusted rather than the RGB levels, but, unfortunately, for LCD monitors, apart from some expensive models, this is the de facto standard. The purpose of adjusting the color temperature is obvious: since the ambient light serves as the reference for our white balance, the monitor must be tuned to it so that white looks white on it, not bluish or reddish.

What is even more regrettable is that in many monitors the color temperature varies greatly between different gray levels. Obviously, gray differs from white only conditionally, merely in brightness, so nothing prevents us from talking not about white balance but about gray balance - which is even more correct. Yet many monitors have a different balance at different gray levels.


Above is a photograph of the ASUS PG191 monitor screen displaying four gray squares of different brightness - more precisely, three versions of this photograph combined. In the first, the gray balance is set by the rightmost (fourth) square, in the second by the third, and in the last by the second. We cannot say of any of them that it is correct and the others are wrong - in fact, they are all incorrect, because the color temperature of the monitor should not depend on which gray level we calculate it from, and here that is clearly not the case. This situation can only be corrected by a hardware calibrator, not by the monitor's own settings.

For this reason, in each article I provide, for each monitor, a table with color temperature measurements at four different gray levels - and if they differ greatly from each other, the monitor image will be tinted in different tones, as in the picture above.
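Such a drift can also be checked numerically. A minimal sketch with made-up measurement values (not data from any real monitor): normalize each measured gray patch by its green channel and compare the proportions against the white point.

```python
# Illustrative sketch (hypothetical numbers): a monitor whose balance drifts
# across gray levels shows different R:G:B proportions at different
# brightnesses of gray. Normalize by green and compare to the white point.

measurements = {  # gray level -> measured (R, G, B) luminance, made-up data
    255: (100.0, 100.0, 100.0),
    192: (74.0, 75.0, 78.0),
    128: (48.0, 50.0, 55.0),
    64:  (23.0, 25.0, 30.0),
}

white_r, white_g, white_b = measurements[255]
for level, (r, g, b) in sorted(measurements.items()):
    # Ratios relative to green; 1.0 everywhere would mean a stable balance.
    r_bal = (r / g) / (white_r / white_g)
    b_bal = (b / g) / (white_b / white_g)
    drift = "OK" if abs(r_bal - 1) < 0.05 and abs(b_bal - 1) < 0.05 else "tinted"
    print(f"gray {level:3d}: R balance {r_bal:.2f}, B balance {b_bal:.2f} -> {drift}")
```

In this synthetic example the darker grays come out visibly blue-tinted, exactly the kind of behavior the photograph above demonstrates.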

Workspace ergonomics and monitor settings

Although this topic is not directly related to monitor parameters, I would like to touch on it at the end of the article because, as practice shows, the initial setup of an LCD monitor causes difficulties for many people, especially those accustomed to CRT monitors.

Firstly, placement. The monitor should stand at about arm's length from the person working at it, perhaps slightly farther if it has a large screen. You should not place the monitor too close - so if you are going to buy a model with a small pixel size (17" monitors with a resolution of 1280x1024, 20" monitors with 1600x1200 and 1680x1050, 23" with 1920x1200...), consider whether the image will turn out too small and illegible for you. If you have such concerns, it is better to look at monitors with the same resolution but a larger diagonal, since the only remaining countermeasure is scaling the fonts and interface elements of Windows (or whatever OS you use), which is not available in all application programs and does not always give a pretty result.

The height of the monitor should ideally be adjusted so that the top edge of the screen is at eye level - then, when working, your gaze is directed slightly downward and the eyes are half-covered by the eyelids, which protects them from drying out (as is well known, we blink too rarely when working at a computer). Many budget monitors, even 20" and 22" models, use stands without height adjustment - given the choice, it is better to avoid such models, and in monitors with height adjustment, pay attention to the range of that adjustment. Fortunately, almost all modern monitors allow you to remove the stock stand and install a standard VESA bracket - and sometimes this opportunity is worth taking, because a good bracket gives not only freedom to move the screen but also the ability to set it at exactly the height you need, right down to the level of the desktop.

An important point is the lighting of the workplace. Working at a monitor in complete darkness is strictly contraindicated - the sharp transition between the bright screen and the dark background greatly tires the eyes. For movies and games, modest background lighting is enough, for example a single desk or wall lamp; for work, it is better to arrange full lighting of the workplace. You can use incandescent lamps or fluorescent lamps with electronic ballast (both compact ones with E14 or E27 bases and ordinary "tubes"), but fluorescent lamps with electromagnetic ballast should be avoided: such lamps flicker strongly at twice the mains frequency, i.e. 100 Hz, and this flicker can beat against the monitor's own backlight flicker, sometimes creating extremely unpleasant effects. Large offices use blocks of fluorescent lamps that flicker in different phases (either by connecting different lamps to different phases of the supply or by installing phase-shifting circuits), which significantly reduces the visibility of the flicker. At home, where there is usually only one lamp, there is also only one way to combat flicker - modern lamps with electronic ballast.

Having installed the monitor in real space, you can connect it to the computer and continue the setup in virtual space.

An LCD monitor, unlike a CRT, has exactly one resolution at which it performs well; at all other resolutions the image noticeably degrades, so it is better to immediately set the native resolution in the video card settings. Here, once again, we must note the need to think before purchase whether the native resolution of the chosen model will seem too large or too small to you - and, if necessary, adjust your plans toward a model with a different diagonal or resolution.

The frame rate of modern monitors is, by and large, the same for all of them - 60 Hz. Despite the 75 Hz or even 85 Hz formally declared for many models, when such modes are set, the matrix usually continues to operate at the same 60 Hz, and the monitor electronics simply discards the "extra" frames. So there is no point in chasing high refresh rates: unlike CRTs, LCD monitors do not flicker.

If your monitor has two inputs, digital DVI-D and analog D-Sub, it is better to use the former - it not only gives a higher-quality picture at high resolutions, but also simplifies setup. If you only have an analog input, then after connecting and setting the native resolution you should open some sharp, contrasty image - a page of text, for example - and check for unpleasant artifacts such as flickering, waves, interference, fringes around characters and the like. If anything of the sort is observed, press the monitor's auto-adjustment button; in many models auto-adjustment starts automatically when the resolution changes, but the smooth, low-contrast picture of the Windows desktop is not always enough for it to succeed, so you may have to run it again manually. With a DVI-D digital connection such problems do not arise at all, so when buying a monitor it is worth paying attention to its set of inputs and giving preference to models with DVI-D.

Almost all modern monitors ship with default settings that give very high brightness - about 200 cd/m². This brightness is suitable for working on a sunny day or for watching movies, but not for everyday work: for comparison, the typical brightness of a CRT monitor is about 80...100 cd/m². Therefore, the first thing to do after turning on a new monitor is to set a comfortable brightness. The main thing is to do it without haste, without trying to get the perfect result in one motion, and especially without trying to make it "like on the old monitor". The problem is that the old monitor looking good to you does not mean it was finely tuned for high image quality - only that your eyes are accustomed to it. A person who has switched to a new monitor from an old CRT with a worn-out tube and a dim image may at first complain about excessive brightness and sharpness - but if a month later the old CRT is placed in front of him again, it turns out that he can no longer sit in front of it, because the picture is too dull and dark.

For this reason, if your eyes feel discomfort when working with the monitor, try changing its settings gradually and in combination: reduce the brightness and contrast a little, work some more; if the discomfort remains, turn them down a little further... And after each such change, give your eyes time to get used to the picture.

In principle, there is a good trick that allows you to quickly adjust the brightness of an LCD monitor to an acceptable level: you need to place a sheet of white paper next to the screen and adjust the brightness and contrast of the monitor so that the brightness of the white color on it is close to the brightness of the sheet of paper. Of course, this technique assumes that your workplace is well lit.

It is also worth experimenting a little with color temperature - ideally, it should be such that the white color on the monitor screen is perceived by the eye as white, and not bluish or reddish. However, this perception depends on the type of external lighting, while monitors are initially adjusted to some average conditions, and many models are also configured very sloppily. Try changing the color temperature to a warmer or cooler one, moving the RGB level adjustment sliders in the monitor menu - this can also have a positive effect, especially if the default color temperature of the monitor is too high: the eyes react worse to cool shades than to warm shades.

Unfortunately, many users do not follow these generally simple recommendations - and as a result, multi-page forum topics are born in the spirit of "Help me choose a monitor that doesn't tire my eyes", which sometimes go as far as compiling lists of supposedly eye-friendly monitors. Gentlemen, I have worked with dozens of monitors, and my eyes never got tired of any of them, with the exception of a couple of ultra-budget models that simply had problems with image sharpness or completely botched color rendition settings. Because your eyes get tired not of the monitor, but of its incorrect settings.

In such forum topics it sometimes gets downright ridiculous: people discuss the influence on vision of backlight lamp flicker (its frequency in modern monitors is usually 200...250 Hz, which the eye simply cannot perceive), the influence of polarized light, the influence of the allegedly too low or too high (to taste) contrast of modern LCD monitors; there was once even a topic discussing the effect of the backlight lamps' line spectrum on vision. However, that seems to be a subject for another article - an April Fools' one...

Speaking in dry scientific language, the response time of a liquid crystal monitor is the shortest time a pixel needs to change its brightness, and it is measured in milliseconds (ms).

It would seem that everything is simple and clear, but if we consider the issue in detail, it turns out that these numbers hide several secrets.

A bit of science and history

The era of warm tube CRT monitors with their honest refresh-rate hertz and RGB color has already passed. Back then everything was clear: 100 Hz is good, and 120 Hz is even better. Every user knew what these numbers meant - how many times per second the picture on the screen is refreshed, or blinks. For comfortable viewing of dynamically changing scenes (for example, films), a rate of 25 frames per second was recommended for TV and 30 for digital video, based on the observation that human vision perceives an image as continuous if it is refreshed at least about twenty-five times per second.

But technology evolved, and liquid crystal panels, also known as LCD or TFT, took over the baton from the CRT (cathode ray tube). The production technologies differ, but we will not dwell on the details in this article - the differences between TFT and LCD are a topic for another time.

What affects response time?

So, the principle of LCD operation is that the matrix cells change their brightness under the influence of a control signal - in other words, they switch. And it is this switching speed, the response time, that determines the maximum rate at which the picture on the display can change.

It converts into the usual hertz via the formula f = 1/t. That is, to obtain the required 25 Hz, the pixels must switch within 40 ms, and within 33 ms for 30 Hz.
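The conversion is easy to sketch in code; the numbers below are the ones from the text (with t given in milliseconds, so f = 1000/t):

```python
# f = 1/t: a response time t (here in milliseconds) limits the matrix to
# roughly f full image changes per second.

def max_refresh_hz(response_time_ms):
    return 1000.0 / response_time_ms

for t in (40, 33, 16, 8, 2):
    print(f"{t:2d} ms -> {max_refresh_hz(t):.0f} Hz")
# 40 ms -> 25 Hz, 8 ms -> 125 Hz, and so on.
```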

Is it a lot or a little, and which monitor response time is better?

  1. If the time is long, artifacts appear during sudden scene changes: where the picture should already be black, the matrix still shows white, or an object is displayed that has already left the camera's field of view.
  2. When the human eye is shown blurry pictures, visual fatigue grows and headaches may appear. This is due to the workings of the visual tract: the brain constantly interpolates the information coming from the retina, while the eye itself is busy constantly refocusing.

It turns out that less is better. Especially if you have to spend most of your time at the computer. The older generation remembers how hard it was to sit through an eight-hour workday in front of a CRT - and yet they provided 60 Hz or more.

How can I find out and check the response time?

Although a millisecond is a millisecond anywhere, many have probably noticed that different monitors with identical declared figures produce images of different quality. The reason is that there are different methods of measuring the matrix response, and it is hardly possible to find out which method the manufacturer used in any specific case.

There are three main methods for measuring monitor response:

  1. BWB, also known as BtB - an abbreviation of "Black to Black" or "Black-White-Black". It shows the time it takes a pixel to switch from black to white and back to black. The most honest indicator.
  2. BtW - stands for "Black to White": switching from the inactive state to one hundred percent luminosity.
  3. GtG - short for "Gray to Gray": how long a pixel needs to change its brightness between gray levels, for example from ninety percent to ten. It is usually about 1-2 ms.

And it turns out that measuring the monitor's response time by the third method will show a much better, more consumer-friendly result than measuring by the second. If nobody checks, they will write 2 ms and that is what will stand - while in fact artifacts appear on the monitor and moving objects leave trails. Only the first method, BWB, shows the true state of affairs: it is the one that indicates the time a pixel needs to complete a full working cycle through all possible states.

Unfortunately, the documentation available to consumers does not clarify the picture, and what is hidden behind, say, 8 ms is difficult to understand. Will the monitor suit you, will it be comfortable to work with?

Laboratory research requires a rather complex hardware-and-software setup, which not every workshop has. But what if you want to verify the manufacturer's claims yourself?

You can check the monitor's response time at home with the TFT Monitor Test program. Select the test icon in the program's menu and specify the screen's native resolution, and a picture with a rectangle scurrying back and forth appears on the display. Meanwhile, the program proudly shows the measured time!

We used version 1.52, tested several displays and concluded that the program does show something, and even in milliseconds. Moreover, a lower-quality monitor did demonstrate worse results. But since the extinguishing and lighting of pixels can only be properly recorded by a photosensor, of which there is none here, this purely software method can be recommended only for a subjective comparative assessment: what exactly the program measures is clear only to its developers.

A much more telling empirical test is the "White Square" mode in TFT Monitor Test: a white square moves across the screen, and the tester's task is to observe the trail behind this geometric figure. The longer the trail, the more time the matrix spends switching and the worse its properties.

That is about all that can be done at home to answer the question "how to check the response time of a monitor". We will not describe methods involving cameras and calibration tables here, but will consider them another time - they deserve a separate discussion. A full check can only be performed by a specialized organization with the appropriate technical base.

Gaming monitor response time

If the main purpose of the computer is gaming, you should choose a monitor with the shortest response time. In fast-paced shooters, even a tenth of a second can decide the outcome of a battle. Therefore, the recommended monitor response time for games is no more than 8 ms: this value corresponds to a refresh rate of 125 Hz and is quite sufficient for any game.

At the next common value, 16 ms, motion blur will already be noticeable in intense matches. These statements hold if the declared time was measured by BWB, but cunning companies may write 2 ms or even 1 ms. Our recommendation remains the same - the less the better. Following this approach, we would say that the response time of a gaming monitor should be no more than 2 ms GtG, since 2 ms GtG corresponds roughly to 16 ms BWB.

How to change the response time in the monitor?

Unfortunately, without replacing the screen there is almost no way. Response time is a property of the very layer that forms the image, fixed by the manufacturer's design decisions. There is, however, a small loophole, and engineers have used it to address the question "how to change the response time".

Monitor manufacturers call this feature OverDrive (OD) or RTC - response time compensation. A higher-voltage pulse is briefly applied to the pixel, making it switch faster. If a monitor flaunts a "Gaming Mode" label or something similar, know that some adjustment for the better is possible. Let us repeat once more to make it completely clear: no programs or video card replacements will help, and nothing else can be tweaked - this is a physical property of the matrix and its controller.

Conclusions

Buying a video card for a thousand or fifteen hundred conventional units in order to run your favorite games at a hundred FPS or more, and then feeding the video signal to a monitor that can barely handle forty frames per second, is rather irrational. It is better to add a hundred toward the display and enjoy the full dynamics of games and movies without disappointment: you will get no pleasure from a 40 ms matrix, and the joy of owning a powerful video adapter will hardly outweigh the poor image quality.

This article is devoted to a problem topical today - choosing an LCD monitor. From information about the main characteristics of modern monitors, we move on to specific recommendations, pointing out the most interesting models in various price categories.

Disclaimer: The article does not aim to describe the operating principles of modern LCD monitors and is the subjective point of view of its author about the criteria for choosing an LCD monitor.

A lyrical digression. Five years ago I could not even imagine that by today LCD monitors would almost completely displace the then-traditional cathode-ray-tube monitors from the computer market. But times have changed, and now a decent new CRT monitor with good geometry and a large diagonal is simply not to be found on sale. Meanwhile, manufacturers offer 19″ liquid crystal monitors for 250 American dollars. But why does one 19″ monitor cost $250 while another costs $500 or more? And which one should you prefer?

First, let's talk about the characteristics of the monitor that you should pay attention to when choosing.

Response time

Response time is a characteristic that shows (without going into details) how quickly each pixel forming the image on the monitor can change its color to a given one. The age-old problem of LCD monitors is that the image on them changes much more slowly than on CRT-based monitors. As a result, on LCD monitors with a long response time, dynamically changing pictures show "smearing": the boundaries of a moving object blur and lose their sharpness. To the credit of LCD monitor manufacturers, the response time situation has improved significantly in recent years, and modern LCD monitors are practically free of this problem, with rare exceptions (which will be discussed a little later).

As a general rule, the shorter the response time, the better. However, manufacturers' methods of measuring response time differ, and the figure usually quoted by a manufacturer says little about how a particular monitor will behave in real applications. Measuring the response time without special equipment is not possible, so consumers are left with two options: either read reviews with objective measurements in specialized publications, or look at the monitor "live" in various applications and draw their own "satisfied/dissatisfied" conclusion from what they see. In my opinion, a response of about 8 ms or less is more than enough for comfortable movie watching and dynamic games. "Hardcore" gamers, though, may want the 2 ms response of top LCD monitors built on TN+film matrices.

Response time compensation (RTC, overdrive)

Since response time is one of the problematic characteristics of a monitor, and practically the main one that manufacturers' marketers focus on, engineers developed a technology to reduce it - response time compensation. However, this technology brought with it not only benefits but also the artifacts of "overclocked" matrices. In the latest monitor models with this technology, the number of overclocking artifacts has been significantly reduced, but it is too early to speak of their absence. As with response time, I advise reading specialized reviews, or better yet looking at such monitors in person, because the terse numbers in reviews, though objective, give the untrained reader little idea of the real situation with overdrive artifacts.

Contrast, brightness and backlight uniformity

The contrast of an LCD monitor is the ratio of the white level (whose maximum, measured at the center of the screen, is called the monitor's brightness) to the black level. Roughly speaking, contrast determines how black, rather than gray, black will look on your monitor's screen. Manufacturers specify contrast ratios from 500:1 to 3000:1 for their LCD monitors. But most often this is the rated contrast of the matrix used in the monitor, measured by the matrix manufacturer on special stands under special conditions, without accounting for the influence of the electronics of the specific monitor model. Some manufacturers quote so-called "dynamic" contrast as the monitor's contrast value. Monitors with this technology evaluate the image currently being displayed and, depending on whether light or dark tones predominate, adjust the backlight brightness accordingly. The black level is then measured at the minimum backlight setting and the white level at the maximum, which is not entirely fair, since such a combination is unattainable at any single moment in time. Note also that at different monitor brightness settings the contrast will differ considerably, and the brightness comfortable for working with text, for example, is significantly lower than the brightness needed for video and games.
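The arithmetic behind static versus "dynamic" contrast is simple enough to show in a few lines; the luminance values below are made up for illustration:

```python
# Static contrast divides white-level by black-level luminance at one and
# the same backlight setting. "Dynamic" contrast instead pairs the white of
# the brightest backlight mode with the black of the dimmest, which is why
# those figures look so much larger. All values here are invented.

def contrast(white_cd_m2, black_cd_m2):
    return white_cd_m2 / black_cd_m2

static = contrast(250.0, 0.25)    # both measured at full backlight
dynamic = contrast(250.0, 0.05)   # black taken with the backlight dimmed
print(f"static  contrast: {static:.0f}:1")   # 1000:1
print(f"dynamic contrast: {dynamic:.0f}:1")  # 5000:1
```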

Viewing Angles

Another of the most important characteristics of LCD monitors is the viewing angles. While the image on a CRT monitor remains practically unchanged even when viewed from the side, with LCD monitors everything is different: the image changes significantly, and when viewed from above or below, a drop in contrast and color distortion are clearly visible. At the same time, manufacturers quote viewing angles of 160° even for the cheapest panels, and so far no one has sued them for false advertising. Why, you ask? Because they measure these angles down to the point where the contrast at the center of the screen drops to 10:1, or for some even 5:1 - values at which working with the monitor is out of the question. To briefly summarize this section: the only real advice is to look at the monitor "live", ask for a uniform fill of some color to be displayed on it, view it from different sides and draw your own conclusion as to whether it suits you.

Color rendition

The color rendition of an LCD monitor is a characteristic showing how fully and accurately the monitor reproduces the color spectrum visible to the human eye. As an indicator of color rendition, manufacturers quote the number of colors the monitor can reproduce. For modern LCD monitors this number is traditionally specified as 16 million, which in principle says absolutely nothing about the quality of color rendition. This parameter matters primarily to those who are going to use the monitor for professional color work or digital image editing, and because describing it properly is complicated, we will stick to comparative terms - "better" and "worse".

Matrix

Now let's talk about the matrix type, because in the vast majority of cases all other characteristics of an LCD monitor, including its price, depend on it. Modern monitors use three main types of matrices: S-IPS, PVA (MVA, differing from PVA only slightly, can be considered its simplified analogue with somewhat worse characteristics) and the most common in monitors, TN+film.

So, as far as we can see from the table, TN+film monitors are inferior to the others in characteristics, but are nevertheless the most widespread of all thanks to one weighty factor - price. Comparing monitors on S-IPS and PVA matrices, we see that neither has a clear advantage, and the choice should be made based on personal preferences and requirements. MVA loses to PVA in all characteristics, but monitors based on it also cost noticeably less than models based on PVA and S-IPS.

Diagonal size and aspect ratio of the monitor, connection method

In the final part of our article we will try to give practical advice on choosing an LCD monitor. But first, let us briefly survey the existing LCD monitor market.

Currently, manufacturers offer us models of 15″, 17″, 19″, 20″, 21″, 22″, 23″, 24″, 26″, 27″ and 30″. And while the 15″ and 17″ models have long since become low-end and are produced only on TN+film matrices, in the 19″ sector the choice is much wider, including models on S-IPS, MVA and PVA matrices. But first let us dwell on one important detail that directly affects the choice of an LCD monitor - the resolution. Due to the nature of LCD technology, these monitors are designed to display the image at only one, so-called "native" resolution, which coincides with the physical number of pixels horizontally and vertically. Setting a resolution lower than the physical one leads to visible distortions and artifacts. Moreover, given the wide variety of diagonal sizes on offer, the pixel sizes of LCD monitors also differ, which further complicates the choice between them.

Diagonal size      Matrix resolution   Pixel size, mm
15″                1024x768            0.297
17″                1280x1024           0.264
19″                1280x1024           0.294
19″ wide (16:10)   1440x900            0.284
20″                1600x1200           0.255
20″ wide (16:10)   1680x1050           0.258
21″                1600x1200           0.270
21″ wide (16:10)   1680x1050           0.270
22″ wide (16:10)   1680x1050           0.282
23″ wide (16:10)   1920x1200           0.258
24″ wide (16:10)   1920x1200           0.269
26″ wide (16:10)   1920x1200           0.287
27″ wide (16:10)   1920x1200           0.303
30″ wide (16:10)   2560x1600           0.251
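The pixel pitch values in a table like this follow directly from the diagonal and the native resolution: the pitch is the diagonal divided by the number of pixels along it. A quick sketch:

```python
import math

def pixel_pitch_mm(diagonal_inches, width_px, height_px):
    """Pixel pitch = physical diagonal / pixel count along the diagonal."""
    diagonal_mm = diagonal_inches * 25.4
    pixels_along_diagonal = math.hypot(width_px, height_px)
    return diagonal_mm / pixels_along_diagonal

print(f'19" 1280x1024: {pixel_pitch_mm(19, 1280, 1024):.3f} mm')  # ~0.294
print(f'30" 2560x1600: {pixel_pitch_mm(30, 2560, 1600):.3f} mm')  # ~0.252
```

Small deviations from published spec sheets (a thousandth of a millimeter or so) come from the fact that the marketed diagonal is itself a rounded figure.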

As we can see, the pixel sizes of modern LCD monitors differ in some cases by 17%, which is more than noticeable to the human eye. With pixels that are too large we get a "grainy" image that visibly breaks up into dots, while with pixels that are too small we needlessly strain our eyesight, risking damaging it. Unfortunately, the image scaling facilities of operating systems, and especially of application software, are at the moment far from perfect, so they will not help much if the pixel is too small.

And a little more about the aspect ratios of monitor screens. There are currently three:

  1. The traditional 4:3 - oddly enough, it is not that common, found only on models with diagonals of 15″, 20″ and 21″.
  2. The non-standard 5:4 - closer to a square, which has certain advantages when working with text and certain inconveniences when watching films, the vast majority of which are released in widescreen.
  3. The rapidly growing in popularity 16:10 - the so-called widescreen monitors; owing to the peculiarities of our physiology, the human eye is better adapted to perceiving a widescreen image than a nearly square one. However, older programs and games were designed for the 4:3 aspect ratio, without support for widescreen monitors.

At the same time, the video card driver settings let you choose how the monitor behaves at a "non-native" resolution:

    it can display the image at its actual size, leaving black bars along the edges, top and bottom;
    it can scale the picture while preserving the proportions of the original image, in which case we get two bars, either on the sides or at the top and bottom, depending on the aspect ratio;
    it can stretch the picture to fill the entire screen without preserving the proportions, in which case the image proportions are distorted.

As usual, I suggest choosing a pixel pitch that is comfortable for you personally by comparing monitors side by side. As for the aspect ratio, the author's personal opinion is that widescreen monitors are the future, especially at diagonals of 20″ and above.

Modern LCD monitors connect to the video card in two ways: via a traditional analog connection through a D-Sub connector, or via a digital DVI connection. The latter ensures the minimum number of signal conversions on the way from the video card to the monitor and eliminates any dependence of image quality on the quality of your video card's analog output.

Based on materials from gigamark.com.

When purchasing additional equipment for your computer, such as an LCD monitor, there are many factors to consider. Today we will talk about one parameter in particular: response time. Knowing how response time affects the image the monitor reproduces, you can easily make the right choice.

LCD monitors

The LCD monitor is the heir to the outdated CRT monitor and significantly improved on the weight and size of such devices: CRT monitors were very large and heavy, while modern LCD monitors are light and compact. LCD monitors are also available in a wider range of screen diagonals, from 14 to 28 inches. LCD operation is characterized by a wide range of parameters, such as the maximum supported resolution, black depth, color purity, color gamut reproduction quality and others, among which response time occupies a special place.

Response time

Response time is one of the key characteristics you should pay attention to when choosing an LCD monitor. It can be described as the time the monitor takes to change the color of a pixel. A high response time leads to an unpleasant image defect known as ghosting, or trailing: fast-moving objects, such as an athlete, a vehicle or a bird, may leave a trail on the screen. This can noticeably degrade dynamic scenes in movies and computer games. Response time is measured in milliseconds, and the lower the number, the better the picture you will get.
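As a rough illustration (a simplified model of ours, not a formula from the article): the visible trail is on the order of the distance an object moves on screen while the pixels behind it are still switching.

```python
def smear_length_px(speed_px_per_s, response_ms):
    """Rough trail length: distance an object travels during one pixel transition."""
    return speed_px_per_s * response_ms / 1000.0

# An object crossing a 1280-pixel-wide screen in one second:
print(smear_length_px(1280, 16))  # slow panel: ~20 px of trailing
print(smear_length_px(1280, 2))   # fast panel: ~2.6 px
```

This is why slow panels are tolerable for static work but noticeable in games and action movies.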

2ms or 5ms

Any response time below 15 milliseconds is acceptable for LCD monitors and provides sufficient image quality, free of trailing and other motion artifacts. In general, an LCD monitor with a 2 ms response time is considered better than one with 5 ms. However, you should also consider the other parameters that affect display quality: a monitor with a 2 ms response time may be weak elsewhere, for example in color reproduction, and a 5 ms monitor may then turn out to be preferable for your tasks. If you are preparing to buy a monitor, we recommend comparing 2 ms and 5 ms models in person.

What response time to choose

In general, if you use your computer mainly for watching videos and playing computer games, be sure to choose a monitor with a response time under 12 ms. For many people, the difference between 2 ms and 5 ms is indistinguishable; they are more likely to notice that the 5 ms monitor is cheaper than the 2 ms one. In the end, the choice is yours: pick a monitor in a price range that suits you and with the characteristics you need.

And don't let yourself be fooled.

Almost any large electronics chain offers a couple of hundred TV models, and it is easy to get overwhelmed, to be honest. To avoid falling for marketers' tricks and the persuasion of sales consultants, you need to learn to spot the weaknesses of a particular model from a mile away.

Experts from TP Vision helped us understand the theory and test it in practice. Thanks for the detailed and useful information, guys!

We tried to understand the main problems and formulate general recommendations regarding the process of choosing a TV.

Vulnerabilities

Cheap display panels

The display panels of modern LCD TVs differ not only in diagonal and backlighting: the liquid-crystal technologies themselves differ, and these differences are fundamental.


Have you ever wondered why the cost of two TVs with the same diagonal can differ several times over? The use of outdated display panels plays a significant role. TN matrices are becoming less common, giving way to VA and IPS technologies, but each has its own advantages and disadvantages.

Response time

A little theory.

Response time is the speed at which the LCD cell is able to change the degree of transparency to form an image.

* That is, how quickly the color of a single pixel changes.

It is measured in milliseconds: the shorter it is, the better dynamic scenes will be displayed. Hollywood invests millions in special effects, so why watch those scenes distorted?

At the same time, each manufacturer considers it its duty to measure response time in its own way: for example, GtG (gray to gray), BtW (black to white), or BtB/BWB (black to white and back). There is no single standard, so this parameter can only be compared between TVs of the same brand. The easiest approach is to ask the seller to play the same action scene on several models and take a close look. You could also grill the seller about which method the manufacturer uses to measure response time, although sellers usually simply do not have that information.

Sellers' tricks

Sellers are supposed to give full and exhaustive information about the product. Nonsense: their job is to sell it to you. People who manage to combine both skills are very rare.

What is the easiest way to convince a buyer that one TV is better than another? Simple: raise the contrast and saturation on the product you want to sell, if the manufacturer has not already done so. Feel free to ask for the standard display mode to be set on the models being compared.

Dumb Smart TV

A favorite function of sales consultants. The ability to watch movies online without leaving the couch tempts most Russian-speaking users. And while the applications preinstalled on the TV work more or less tolerably, the built-in browser is, as a rule, simply disgusting.

Found the right page on the Internet? OK, first get through the redirects and pop-up banners. Just a couple of clicks? Yes, but it may take a couple of minutes, because few TV browsers can boast high speed. If the TV in the store is connected to the network, it is worth trying out its Smart TV functions.

Terrible interface

The menu logic differs from one TV brand to another, and not always for the better. Duplicated sections, windows within windows, inconvenient navigation: you simply cannot find anything.

The keyboard implementation also raises many questions. Typing text with a couple of buttons on the remote control is a sophisticated punishment, nothing less.

No required connectors

It seems simple: take all the devices you use with the TV and see which connectors are needed.

Be that as it may, a TV is a long-term purchase, so think in advance about what will be connected to it in the future. It is also a good idea to find out the current rating of the USB ports, so you know whether higher-capacity hard drives will work.

How to

  • Matrix

How do you avoid a mistake when choosing a matrix? You need to decide what you are buying the TV for.

Types of matrices. Old TN matrices are quite sufficient if you use the TV as a monitor: just right for work and games. They show dynamic scenes perfectly, and these TVs are among the cheapest on the market. The downsides are narrow viewing angles and dull colors, which will not suit designers and lovers of beautiful cinema.

VA matrices render black well. The result is a beautiful, contrasty picture, but viewing angles suffer, although they are still wider than on TN matrices. These TVs suit those who like to sit on the couch and play an Xbox or PS.

IPS matrices have excellent color reproduction and a huge viewing angle; most importantly, the whole family can watch a TV series sitting wherever they like. The main disadvantage is weak blacks, which make the picture look "flat".

Resolution. It is not worth joining the resolution race just yet: 1920x1080 pixels is quite enough. 4K TVs can certainly deliver a breathtaking picture, but for now there is practically no such content, apart from YouTube. You could buy one for the future, but technological progress does not stand still, and it is not a fact that today's 4K TV will still be relevant in a couple of years.

Scanning. You will often see the designations 1080p and 1080i (or 720p and 720i); be careful, they are not the same thing. The resolution is the same in both cases, but the scanning type is different.

  • With 1080i (interlaced scanning), the image is drawn in alternating even and odd lines. As a result, jagged edges appear on object boundaries and frames shake; manufacturers try to smooth all this out in software. The frame rate is also limited.
  • With 1080p (progressive scanning), the whole image is displayed at once, and the frame rate is higher.

Feel free to choose the second option.

  • Backlight type

If the LCD panel is not backlit, it will not show anything. Modern models predominantly use LED backlighting; the old CCFL type (with fluorescent lamps) survives only in the cheapest and thickest TVs.

LED backlighting can be edge-lit (Edge LED) or direct, "carpet" (Direct LED). In the first case, the diodes sit along the edges of the panel and their light is spread by a diffuser. This makes slick, thin TVs possible, but rules out local control of the backlight, and the lighting comes out uneven.

With direct backlighting, the diodes are distributed evenly across the entire area of the LCD panel. Groups of LEDs can be controlled locally, providing better color rendition, and there are no gaps in the backlight, but the TV is a little thicker.

The difference in thickness is not that big, so it is more logical to prefer a TV with Direct LED.

  • Response

Whatever the color rendering and screen resolution, a slow response can nullify all the pleasure of watching. By this criterion, TVs with TN matrices are in the lead, but, as mentioned above, the picture suffers. VA matrices strike a compromise between response time and image quality. IPS lags behind, unless we are talking about modern subtypes such as e-IPS and S-IPS.

For example, the response time of a 32-inch Philips TV is 2 ms, an impressive result: you can play a console and watch an action movie. It costs about 20 thousand rubles at any electronics store.

  • White balance

The TV should introduce as little distortion as possible into the original content. The trouble is that modern manufacturers are interested not in making their displays meet color standards, but in making them sell. Hence the "richer blues" and "more vivid reds" than the competition: the brightness and saturation of certain colors are boosted in software and the color temperature is shifted. Ideally, if manufacturers calibrated their products correctly, the TVs on display would all show a similar picture.

It is commonly believed that Japanese and Korean companies often oversaturate colors and push up their brightness, with an image temperature that usually deviates from the reference 6500 K, while European manufacturers (Philips, for example) strive for more natural colors and correct white balance. An example is a 50-inch Philips with a VA matrix: adequate white balance coupled with low response time and natural colors, everything you need for watching TV in the living room. The price is almost 45 thousand rubles.

  • Smart Smart TV

The main point is the availability of a fast browser and a wide range of applications for consuming online content. For comfortable web surfing you also need Flash and HTML5 support. The interface should be convenient and intuitive. A Wi-Fi module makes life much easier for those bothered by extra wires, though it is not critical.

Where can you find all this? As an option, try Android TV. It has a convenient store of adapted applications, control from a smartphone, and a faster browser. This Android is built into the 55-inch Philips 6500 series; the OS in this TV is an adapted 5.1 (Lollipop). True, the 75 thousand rubles they ask is not just for the Smart TV: it is a huge, stylish TV with a great image, Ambilight backlighting and everything you need.

  • Optimal screen size

There are no hard criteria for choosing the size of a TV. It is no secret that the farther the viewer sits from the screen, the larger the diagonal needed. Beyond that, it comes down to personal preference.
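As a starting point, a common rule of thumb (our assumption, not a figure from the article) is a viewing distance of roughly 1.5 to 2.5 screen diagonals. A small sketch:

```python
def diagonal_range_inches(distance_m):
    """Suggested diagonal range (inches) for a given viewing distance,
    assuming the common 1.5x-2.5x-diagonal rule of thumb."""
    distance_in = distance_m * 100 / 2.54
    return round(distance_in / 2.5), round(distance_in / 1.5)

# Sitting about 2.5 m from the screen:
print(diagonal_range_inches(2.5))  # -> (39, 66)
```

Treat the output as a rough bracket to narrow the search, then judge by eye in the store.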

The viewing angle is also important. This is why TN TVs are not suitable for the living room. If you look from the side, the picture will change color.

  • Suitable 3D technology

If your choice has fallen on a 3D TV, you need to decide on the stereoscopic technology. There are two main ones, active and passive, and both require glasses.

With active 3D, the image is fed to each eye alternately at a very high frequency synchronized with the TV. This gives many people headaches and eye strain, but the picture is displayed at full resolution, albeit slightly darkened. The glasses have a built-in shutter mechanism that alternately blocks the right and left lenses; this needs a power source, so the glasses have to be charged from time to time. The TV usually comes with one or two pairs; extra pairs must be purchased, and they cost a lot.

With passive 3D, the TV simply sends the picture at different angles for the left and right eyes, and the image is perceived as a whole. The glasses are simpler and work without batteries: their lenses are special filters that pass only the image meant for each eye. The main thing is not to end up with linearly polarized glasses, or you will have to hold your head strictly vertical while viewing; it is better to take a kit with circular polarization. It would seem to be all advantages, but image quality suffers: the resolution is lower, dynamic scenes are distorted, and the "depth" of the 3D effect is smaller. On the other hand, a whole bunch of these glasses comes in the box, enough for the whole family, and they are cheap, so buying more is not a problem.






