LCD Motion Blur
PSYCH 221 Project
Lieven Verslegers
Project Advisor: Joyce Farrell

 

Abstract

Motion blur arises from the slow response time and hold-type rendering of liquid crystal displays, together with the motion pursuing function of the human visual system.  This project focuses on a motion blur model, and methods to reduce the blurring effect.  Furthermore, methods to experimentally assess motion blur are discussed.

 

Contents

1. Introduction: What is motion blur?
            1.1 Hold-type motion blur
            1.2 Slow-response motion blur
2. Modeling and analysis
            2.1 Display-perception chain
            2.2 Blur width
3. Motion blur reduction
            3.1 Black insertion
            3.2 Backlight flashing
4. Measuring motion blur
            4.1 Spatio-temporal integration method
            4.2 Temporal integration method
            4.3 Subjective experiment

1. Introduction: What is motion blur?

Liquid crystal displays (LCDs) have many advantages over cathode-ray tube (CRT) displays, including brightness, resolution, size, weight, and power consumption.  However, when rendering rapid motion content, LCDs suffer from motion blur.  This motion blur can be attributed both to hold-type rendering and to the slow temporal response of the LCD.

 

1.1 Hold-type motion blur

Figure 1 illustrates the difference in rendering between LCDs and CRTs.  On the CRT, the pixel intensity over time consists of a series of pulses, which are much shorter than the frame duration.  On an LCD, by contrast, the pixel intensity is sustained for the entire frame cycle.  This hold-type rendering, in combination with the motion-pursuing function of the human visual system, leads to motion blur.

Figure 1.  CRT and LCD rendering.

To understand this, consider an observer who tracks a moving edge, on a CRT and on an LCD (Fig. 2).  For the CRT, the path of the eye (indicated by the arrows), when integrated over time (the low-pass temporal filtering of the visual system), does not mix black and white in the image, and the moving edge is perceived sharply on the retina.  For the LCD, due to the hold-type rendering, the path of the eye crosses both white and black regions.  The temporal integration of the eye then produces a blurred edge on the retina.

Figure 2.  Tracking a moving edge on a CRT and an LCD.

 

1.2 Slow-response motion blur

The second type of motion blur results from the slow response of the LCD.  Ideally, a pixel that is switched should reach its target value instantly.  In reality, however, the pixel takes a certain amount of time to switch (up to several frame cycles on older LCDs).  This response is illustrated in Fig. 3.

Figure 3.  Slow response in an LCD.

Slow-response motion blur is commonly reduced through overdrive, demonstrated in Fig. 4.  The idea is to apply a larger driving value so that the target value is reached by the end of the frame.

Figure 4.  Overdrive.
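The principle behind overdrive can be sketched with a simple first-order (exponential) pixel model.  This model is an assumption for illustration only: real liquid-crystal transitions are more complex, and real panels use per-transition lookup tables rather than a closed-form expression.

```python
import math

def pixel_step(current, drive, T, tau):
    """One frame of a pixel modeled as an exponential approach
    toward the drive value (assumed first-order response)."""
    return drive + (current - drive) * math.exp(-T / tau)

def overdrive(current, target, T, tau):
    """Choose the drive value so the modeled pixel lands on the
    target by the end of the frame, clipped to the panel's range."""
    d = current + (target - current) / (1.0 - math.exp(-T / tau))
    return min(max(d, 0.0), 1.0)

T, tau = 1.0, 0.8          # frame period and assumed pixel time constant
current, target = 0.2, 0.6

naive = pixel_step(current, target, T, tau)    # drive with the target itself
boosted = pixel_step(current, overdrive(current, target, T, tau), T, tau)
```

With these numbers the naive drive undershoots (about 0.49 after one frame), while the overdriven pixel reaches 0.6 exactly, as long as the required drive value stays within the panel's range.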

 

2. Modeling and analysis

2.1 Display-perception chain

To model motion blur, both the LCD and the human visual system need to be accounted for.  Figure 5 shows the display-perception chain, which consists of three parts.  The first is associated with the display: sample-and-hold rendering.  The second and third are related to the human visual system: motion pursuit and spatio-temporal low-pass filtering.

Figure 5.  Display-perception chain.

Based on this display-perception chain, the LCD motion blur can be modeled as [1]:

E_perc(x) = ∫ E_ideal(x − v_x·t) h_t(t) dt.

(1)

Equation (1) gives the perceived image, compensated for motion with speed v_x through eye tracking.  It corresponds to the perceived image on an ideal impulse display, convolved with the LCD temporal reconstruction function h_t(t).  This function includes the LCD temporal response, as well as the hold-type rendering.
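Equation (1) can be evaluated numerically by sampling h_t(t), shifting the ideal image by v_x·t for each time sample, and summing.  The sketch below is illustrative only (the step edge, the zero-order-hold h_t, and all names are assumptions, not taken from [1]):

```python
def perceived(ideal, h_t, v_x, dt):
    """Discretized Eq. (1): integrate the ideal image, shifted by
    v_x * t and weighted by h_t(t), along the eye-tracking path."""
    norm = sum(h_t) * dt
    out = []
    for x in range(len(ideal)):
        acc = 0.0
        for k, h in enumerate(h_t):
            # sample the ideal image at x - v_x * t, clamped at the borders
            xs = min(max(int(round(x - v_x * k * dt)), 0), len(ideal) - 1)
            acc += ideal[xs] * h * dt
        out.append(acc / norm)
    return out

edge = [0.0] * 30 + [1.0] * 30      # ideal step edge (impulse display)
dt = 0.01                           # time step, in frame periods
h_t = [1.0] * 100                   # zero-order hold over one frame
blurred = perceived(edge, h_t, v_x=10.0, dt=dt)   # 10 pixels per frame
```

The sharp step comes out as a ramp roughly v_x·T = 10 pixels wide, which is exactly the hold-type blur described in Sec. 1.1.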

 

2.2 Blur width

The first-order hold temporal reconstruction function is given in Fig. 6.  The temporal transitions are linear, and the function spans two frames, so the current intensity is influenced by the previous intensity.

Figure 6.  First-order hold.

The perceived edge, for a motion speed of 19 pixels per frame, is shown in Fig. 7.  The blur width (BW), as indicated in the figure, is the spatial distance between the two points where the blurred edge reaches 10% and 90% of the transition between the low and high values.

Figure 7.  Perceived edge for first-order hold.

In this case, the blur width can be calculated as BW ≈ 1.1vT, where v is the motion speed and T the frame period.  The blur width is thus proportional to both the motion speed and the frame duration.  For the edge moving at 19 pixels per frame, the perceived blur width is about 21 pixels.

We can repeat the calculation for the zero-order hold function (Fig. 8).  This function corresponds to an instantaneous pixel response: the pixel reaches its target value immediately and holds it for one frame.

Figure 8.  Zero-order hold.

The blur width (Fig. 9) in this case is 0.8vT, or about 15 pixels, and is entirely due to the hold-type rendering.

Figure 9.  Perceived edge for zero-order hold.
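Both widths can be checked numerically (a sketch, not taken from the report): convolve a step edge with the spatial blur kernel (a triangle of base 2vT for the first-order hold, a box of width vT for the zero-order hold) and measure the 10%-90% width.

```python
def edge_profile(kernel):
    """Perceived edge: step edge convolved with the normalized
    blur kernel, i.e. the kernel's cumulative sum."""
    total = sum(kernel)
    prof, acc = [], 0.0
    for k in kernel:
        acc += k / total
        prof.append(acc)
    return prof

def blur_width(prof, dx):
    """Spatial distance between the 10% and 90% points of the edge."""
    lo = next(i for i, p in enumerate(prof) if p >= 0.1)
    hi = next(i for i, p in enumerate(prof) if p >= 0.9)
    return (hi - lo) * dx

v, T = 19.0, 1.0                  # pixels per frame, frame period
n = 1000                          # samples per frame interval vT
dx = v * T / n                    # spatial step per sample

box = [1.0] * n                                   # zero-order hold kernel
tri = [min(i, 2 * n - i) for i in range(2 * n)]   # first-order hold kernel

bw_tri = blur_width(edge_profile(tri), dx)        # ~1.1 vT
bw_box = blur_width(edge_profile(box), dx)        # ~0.8 vT
```

At 19 pixels per frame this gives about 21 pixels for the first-order hold and about 15 pixels for the zero-order hold, matching the values above.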

 

3. Motion blur reduction

3.1 Black insertion

Motion blur can be reduced through black insertion, as illustrated in Fig. 10.  This temporal reconstruction function corresponds to doubling the frame rate, with the second field black.

Figure 10.  Black insertion.

Figure 11 shows that, for the same moving edge, the blur width is halved compared to the first-order hold.

Figure 11.  Perceived edge with black insertion.

 

3.2 Backlight flashing

Another approach is backlight flashing.  The temporal response is kept the same, but the backlight is turned on for only a fraction of the frame period (one eighth in Fig. 12).  This leads to a significant reduction in the blur width (Fig. 13).

Figure 12. Backlight flashing.

Figure 13.  Perceived edge with backlight flashing.
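Both techniques shorten the fraction of the frame during which light reaches the tracking eye, so the spatial blur kernel shrinks proportionally.  A rough numeric check, assuming an idealized box aperture (real temporal responses are not ideal boxes) and the 10%-90% width of Sec. 2.2:

```python
def blur_width_box(on_fraction, v, T, samples=1000):
    """10%-90% blur width of a tracked step edge when each pixel
    emits light for on_fraction of the frame (ideal box aperture):
    the spatial blur kernel is a box of width on_fraction * v * T."""
    n = max(int(round(on_fraction * samples)), 1)
    dx = v * T / samples                  # spatial step per sample
    prof, acc = [], 0.0
    for _ in range(n):
        acc += 1.0 / n                    # cumulative box profile
        prof.append(acc)
    lo = next(i for i, p in enumerate(prof) if p >= 0.1)
    hi = next(i for i, p in enumerate(prof) if p >= 0.9)
    return (hi - lo) * dx

v, T = 19.0, 1.0                          # 19 pixels per frame
full  = blur_width_box(1.0, v, T)         # hold-type rendering: ~15 px
black = blur_width_box(0.5, v, T)         # black insertion:     ~7.6 px
flash = blur_width_box(0.125, v, T)       # 1/8 backlight flash: ~1.9 px
```

Halving the hold time halves the blur width, and a one-eighth flash cuts it by a factor of eight, which is why backlight flashing gives a much sharper perceived edge (Fig. 13).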

 

Other methods, not discussed here, include a motion adaptive deblurring filter, smooth frame insertion, and motion-compensated upconversion.

 

4. Measuring motion blur

4.1 Spatio-temporal integration method

One can use a camera that tracks a moving edge.  When the edge is tracked perfectly, the irradiance will be periodic in time, with a period equal to the LCD frame cycle.  The perceived image E(x) can then be found from the LCD irradiance E_LCD(x, t):

E(x) = (1/N) Σ_{n=1}^{N} E_LCD(x, n·Δt).

(2)

Here, Δt is the frame period of the camera and N is the ratio of the camera frame rate to the LCD frame rate.  Figure 14 shows a captured sequence of a moving edge.

Figure 14. Captured sequence of a moving edge (from [2]).
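Equation (2) amounts to averaging the N tracking-camera frames that span one LCD frame cycle.  A toy 1-D sketch (the data and N = 4 are assumptions): within the cycle the hold-type display slips relative to the tracking camera, so the captured edge shifts from frame to frame, and the average is the blurred profile.

```python
def perceived_from_capture(frames):
    """Eq. (2): average N camera frames spanning one LCD frame
    cycle to obtain the perceived (time-integrated) edge profile."""
    N = len(frames)
    width = len(frames[0])
    return [sum(f[x] for f in frames) / N for x in range(width)]

# toy captured irradiance of a tracked edge, N = 4 camera frames
frames = [
    [0, 0, 0, 1, 1, 1],
    [0, 0, 0, 0, 1, 1],
    [0, 0, 0, 0, 0, 1],
    [0, 0, 0, 0, 0, 0],
]
profile = perceived_from_capture(frames)   # a blurred ramp
```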

 

4.2 Temporal integration method

Alternatively, the temporal response for a moving edge can be measured.  Due to the motion blur, the output E_t(t) is no longer a step function but changes gradually from one value to the next (Fig. 15).  The perceived edge can be shown to be [3]:

E(x) = E_t(x / v_x),

(3)

from which the blur width can then be calculated.

Figure 15. Motion-induced temporal transition.

 

4.3 Subjective experiment

Motion blur can be measured in a subjective experiment as well.  In this case, the observer watches an edge moving on a screen (Fig. 16).  A still image with simulated blur is shown below the moving edge.  The observer can decrease or increase the motion blur in the still image to match the blur of the moving edge.

Figure 16.  Subjective experiment (from [2]).

 

Acknowledgements

I thank Joyce Farrell for suggesting this study.

 

References

[1]        H. Pan, X. Feng, and S. Daly, “LCD motion blur modeling and analysis”, IEEE International Conference on Image Processing, vol. II, 21-24 (2005).

[2]        X. Feng, H. Pan, and S. Daly, “Comparisons of motion-blur assessment strategies for newly emergent LCD and backlight driving technologies”, J. Soc. Info. Display, vol. 16, 981-988 (2008).

[3]        X. Feng, “LCD motion-blur analysis, perception, and reduction using synchronized backlight flashing”, Proc. SPIE, vol. 6057, 213-226 (2006).

[4]        X. Feng, H. Pan, and S. Daly, “Dynamic gamma: Application to LCD motion-blur reduction”, J. Soc. Info. Display, vol. 14, 949-956 (2006).

[5]        http://www.extremetech.com/article2/0,2845,2356408,00.asp