* Digital images: images that can be displayed and processed by computer.
* Digital image processing: analyzing and processing an image by computer to meet various purposes.
* Characteristics of digital images:
1. The image carries a very large amount of information.
2. The amount of data to be processed is very large.
3. There are many repetitive operations in the processing.
4. The processing techniques are comprehensive.
* Structure of human vision:
* Cone cells: sense both light and color; sensitive to color.
* Rod cells: sense only light, not color. (Their impairment is related to night blindness.)
* Brightness: the intensity of the light.
* Hue: the mix of the primary colors in a color model, e.g. the relative strengths of red, green, and blue in RGB.
* Saturation: the purity (intensity) of a color.
* Brightness contrast effects:
1. Simultaneous contrast effect: perceived brightness depends on the surrounding brightness it is compared against.
2. Mach band effect: the visual system perceives bright or dark bands that are not actually present where the brightness changes abruptly.
* Image digitization: converting a continuous analog signal into a discrete digital signal.
* Nyquist sampling theorem:
Conditions for a discrete signal to fully represent the continuous signal:
1. The original signal is band-limited.
2. The sampling frequency is at least twice the highest frequency in the signal.
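As a quick sanity check of condition 2, the sketch below (plain Python, with made-up frequencies) samples a 3 Hz sine at only 4 Hz and shows that the samples coincide with those of a 1 Hz alias:

```python
import math

# A 3 Hz sine sampled at fs = 4 Hz violates fs >= 2 * f_max,
# so its samples are indistinguishable from a (3 - 4) = -1 Hz alias.
fs = 4  # sampling frequency in Hz (too low for a 3 Hz signal)
samples_3hz = [math.sin(2 * math.pi * 3 * k / fs) for k in range(8)]
samples_alias = [math.sin(2 * math.pi * (3 - fs) * k / fs) for k in range(8)]

# The two discrete signals match sample for sample:
assert all(abs(a - b) < 1e-9 for a, b in zip(samples_3hz, samples_alias))
```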
* Spatial resolution:
Units: pixels/inch, pixels/cm, pixels × pixels.
Quantization of a digital image: converting gray values into integer representations.
For example, 8 bits can represent 2^8 = 256 gray levels (0-255).
Amplitude (gray-level) resolution: the more gray levels, the higher the resolution.
(False contours: when there are too few gray levels, discretization exaggerates the differences between neighboring levels, producing contour-like artifacts.)
* Calculating the data volume of a digital image:
With pixel resolution M×N and Q bits/pixel,
the data volume is M×N×Q/8 bytes.
(Number of quantization levels: 2^Q)
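The formula above as a one-line helper (a trivial sketch; the image sizes are arbitrary examples):

```python
# Data volume of an M x N image at Q bits/pixel, in bytes: M * N * Q / 8.
def image_bytes(m, n, q):
    return m * n * q // 8

print(image_bytes(1024, 768, 8))    # 8-bit grayscale: 786432 bytes (768 KB)
print(image_bytes(1024, 768, 24))   # 24-bit RGB: three times as much, 2359296 bytes
```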
* Classification of digital images:
1. Grayscale image: gray levels quantized from pure black to pure white.
2. Binary image: only black and white.
3. Color image: e.g. an RGB image, where each color channel is quantized with its own bits.
* Basic relationships between pixels:
* Positional relationships:
* Adjacency:
Adjacency conditions:
1. 4-adjacency or 8-adjacency.
2. The gray values are similar.
* Connectivity: the property produced by adjacency.
Connected set: generated by connectivity.
4-connected: 6; 8-connected: 2 (component counts from the example figure).
Region: R is a subset of pixels in the image; if R is a connected set, then R is a region.
Boundary: if one or more neighborhood pixels of a pixel in region R lie outside the region, that pixel belongs to the boundary. (The pixels shown in the figure above are all boundary pixels.)
Pixel distances:
1. Euclidean distance
2. City-block distance = |x1 - x2| + |y1 - y2|
3. Chessboard distance = max(|x1 - x2|, |y1 - y2|)
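The three distances can be sketched directly (plain Python; the sample points are arbitrary):

```python
import math

# Distances between pixels p1 = (x1, y1) and p2 = (x2, y2), as defined above.
def euclidean(p1, p2):
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def city_block(p1, p2):  # D4 distance
    return abs(p1[0] - p2[0]) + abs(p1[1] - p2[1])

def chessboard(p1, p2):  # D8 distance
    return max(abs(p1[0] - p2[0]), abs(p1[1] - p2[1]))

p, q = (0, 0), (3, 4)
print(euclidean(p, q), city_block(p, q), chessboard(p, q))  # 5.0 7 4
```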
Algebraic operations on digital images:
Applications:
Addition: removing additive noise (by averaging), image superposition.
Subtraction: detecting changes between images.
Multiplication: matting (masking), changing gray scale.
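The addition application can be illustrated by frame averaging (a toy sketch; the 1-D "frames" and their noise values are made up):

```python
# Averaging several frames of the same scene: the scene (gray 100) stays,
# while the zero-mean additive noise cancels out.
frames = [
    [100, 102, 99],
    [ 98, 101, 103],
    [102,  97, 98],
]
averaged = [sum(col) / len(frames) for col in zip(*frames)]
print(averaged)  # [100.0, 100.0, 100.0]
```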
* Point operations: transform each pixel individually.
* Spatial filtering: neighborhood-based processing.
* Gray-scale transformation:
original pixel -> mapping function -> transformed pixel
Applications:
1. Image inversion (negative effect)
Taking 8 bits as an example: transformed gray level = 255 - original gray level.
2. Linear transformation (1)
Stretching: enlarges the dynamic range of the image's gray levels (for underexposed or overexposed images), increasing contrast and making the image clearer.
Compression: the opposite; it softens the image.
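A minimal sketch of both transformations (the gray values and stretch limits are arbitrary examples):

```python
# 8-bit negative: s = 255 - r.
def invert(r):
    return 255 - r

# Linear contrast stretch mapping the range [low, high] onto [0, 255].
def stretch(r, low, high):
    return round((r - low) * 255 / (high - low))

print([invert(r) for r in (0, 100, 255)])             # [255, 155, 0]
print([stretch(r, 50, 200) for r in (50, 125, 200)])  # [0, 128, 255]
```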
* Piecewise linear transformation (2):
3. Nonlinear transformation:
Purpose: process pixels in different gray ranges to different degrees (e.g. dark areas vs. highlights), instead of uniformly enlarging the dynamic range of all gray values.
* Logarithmic stretching:
* Exponential stretching:
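Logarithmic stretching is commonly written as s = c·log(1 + r); below is a sketch with c chosen so the 8-bit range is preserved (the sample gray values are arbitrary):

```python
import math

# s = c * log(1 + r), with c scaled so that r = 255 maps back to 255.
c = 255 / math.log(1 + 255)

def log_stretch(r):
    return c * math.log(1 + r)

# Dark values are spread apart, bright values compressed:
print([round(log_stretch(r)) for r in (10, 100, 255)])  # [110, 212, 255]
```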
Gray histogram: reflects the distribution of gray levels.
Horizontal axis: gray level; vertical axis: pixel count or percentage.
* Calculation:
Histogram equalization
Example exercise:
Gray levels 0-7,
with distribution probabilities 0.19, 0.25, 0.21, 0.16, 0.08, 0.06, 0.03, 0.02.
Find the pixel distribution after histogram equalization.
Answer:
After equalization only five gray levels remain; the probabilities of levels 1, 3, 5, 6, 7 are:
1: 0.19, 3: 0.25, 5: 0.21, 6: 0.24, 7: 0.11
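The exercise above can be checked mechanically with the usual mapping rule round(cdf × (L-1)) (exact fractions are used so the rounding is deterministic):

```python
from fractions import Fraction

# Gray levels 0..7 with the probabilities from the exercise.
probs = [Fraction(n, 100) for n in (19, 25, 21, 16, 8, 6, 3, 2)]
L = len(probs)

# Cumulative distribution; level k maps to round(cdf_k * (L - 1)).
cdf, running = [], Fraction(0)
for p in probs:
    running += p
    cdf.append(running)
mapping = [round(c * (L - 1)) for c in cdf]

# Collect the equalized distribution.
new_probs = {}
for old_level, new_level in enumerate(mapping):
    new_probs[new_level] = new_probs.get(new_level, Fraction(0)) + probs[old_level]

print(mapping)                                      # [1, 3, 5, 6, 6, 7, 7, 7]
print({k: float(v) for k, v in new_probs.items()})  # {1: 0.19, 3: 0.25, 5: 0.21, 6: 0.24, 7: 0.11}
```

Levels 3 and 4 both land on 6, and levels 5, 6, 7 all land on 7, which is why only five levels survive.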
Histogram specification
In short: given a template histogram, transform the image so that the gray distribution of its pixels approximates the template's.
For example, in this problem gray level 0 has probability 0.19, close to the template's 0.2, so it maps to the template's gray level 3. Levels 1, 2, and 3 together account for 0.62, close to the template's 0.6, so they map to 5.
* Spatial filters/templates: small matrices.
* Filtering process:
1. Align the filter with each pixel of the image in turn.
2. Compute the convolution (multiply each pixel by the corresponding kernel coefficient and sum the products).
3. Assign the result to the image pixel under the center of the filter.
* Edge problem: because the filter cannot extend beyond the image, the border pixels cannot be filtered directly.
* Handling methods:
1. Ignore the border pixels.
2. Pad the border with virtual pixels that take the same gray value as the nearest edge pixels.
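A minimal sketch of neighborhood averaging with replicate padding (handling method 2 above); the 3×3 test image is made up:

```python
# 3x3 mean filter; out-of-range coordinates are clamped to the nearest
# edge pixel (replicate padding), so border pixels can be filtered too.
def mean_filter(img):
    h, w = len(img), len(img[0])
    clamp = lambda v, hi: max(0, min(v, hi - 1))
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            total = sum(img[clamp(y + dy, h)][clamp(x + dx, w)]
                        for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            row.append(total / 9)
        out.append(row)
    return out

img = [[0, 0, 0],
       [0, 90, 0],
       [0, 0, 0]]
print(mean_filter(img)[1][1])  # the isolated bright pixel is smoothed to 10.0
```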
Classification of spatial filtering:
1. Smoothing filtering: smooths the image by removing high-frequency components, so that gray values change more gently; reduces noise.
2. Sharpening filtering: removes low-frequency components, so that contrast increases and edges become prominent.
Smoothing methods:
1. Neighborhood averaging: reduces noise, but blurs the image.
2. Weighted averaging: gray values at different positions have different importance (weights); the center is most important and the weights decrease toward the edges.
3. Nonlinear smoothing filtering.
Sharpening principle:
1. Differences reflect the gray-level change between adjacent pixels (for a continuous signal the rate of change is the derivative; for a discrete one it is the difference; they are the same concept).
2. The gradient is obtained from the differences. (The gradient can detect edges, because the gray level of edge pixels changes sharply.)
3. Sharpened gray value = original gray value + sharpening coefficient × gradient.
Practical applications:
1. First-order difference templates (gradient operators)
2. Second-order difference template: the Laplacian operator
Computing the gradient:
Direct sharpening:
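A pointwise sketch of Laplacian sharpening, using the common 4-neighbor template [[0,1,0],[1,-4,1],[0,1,0]] (the tiny image is made up). With this sign convention the Laplacian is subtracted to enhance the detail:

```python
# Second-order difference at (y, x) using the 4-neighbor Laplacian template.
def laplacian_at(img, y, x):
    return (img[y - 1][x] + img[y + 1][x]
            + img[y][x - 1] + img[y][x + 1] - 4 * img[y][x])

img = [[10, 10, 10],
       [10, 50, 10],
       [10, 10, 10]]
lap = laplacian_at(img, 1, 1)    # 4 * 10 - 4 * 50 = -160
sharpened = img[1][1] - lap      # 50 - (-160) = 210: the peak is amplified
print(lap, sharpened)            # -160 210
```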
> The matrix filters used so far process images in the spatial domain; now we turn to the frequency domain.
> Readers unfamiliar with the frequency domain can search Zhihu for an introduction.
> Introduction:
> Fourier, the brilliant mathematician, found that any periodic signal can be expressed as a series of sine functions, and any aperiodic signal can be expressed as a weighted integral of sine signals.
> The distribution over these sine functions gives rise to the concept of the frequency domain.
After the two-dimensional discrete Fourier transform of an image:
the four corners hold the low-frequency part, and the center the highest frequencies.
The brightest region indicates the strongest low-frequency energy (see the picture: pixels with small gray changes, such as the black coat and the background, make up the majority and belong to the low-frequency components).
Because of the periodicity and conjugate symmetry of the 2-D DFT, the spectrum can be centered.
Shifting (centering) the spectrum:
* Basics of frequency-domain filtering
Steps:
1. Transform the image from the spatial domain to the frequency domain.
2. Multiply the spectrum by the frequency-domain filter.
3. Apply the inverse Fourier transform to obtain the filtered image.
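The three steps can be sketched with numpy's FFT (the toy image and the ideal low-pass mask are assumptions for illustration):

```python
import numpy as np

def lowpass_frequency(img, d0):
    # 1. spatial domain -> frequency domain, with the spectrum centered
    F = np.fft.fftshift(np.fft.fft2(img))
    rows, cols = img.shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    D = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)  # distance from the center
    H = (D <= d0).astype(float)                     # ideal low-pass transfer function
    # 2. multiply the spectrum by the filter; 3. inverse transform
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))

img = np.outer(np.linspace(0.0, 255.0, 8), np.ones(8))  # smooth toy "image"
smoothed = lowpass_frequency(img, d0=2)
print(smoothed.shape)  # (8, 8)
```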
* Classification of frequency-domain filtering:
1. Low-pass filtering
2. High-pass filtering
3. Band-pass and band-stop filtering
4. Homomorphic filtering
* Notch filter
Idea: noise and edges are high-frequency components; a low-pass filter, as the name implies, lets low frequencies pass and filters out the high frequencies.
Classification:
1. Ideal low-pass filter (ILPF)
where D0 is the manually chosen cutoff frequency.
Disadvantage: ringing may occur.
Reason for the ringing: the ideal filter's abrupt cutoff in the frequency domain corresponds to a sinc-shaped function in the spatial domain, whose oscillating side lobes show up as ripples near edges.
2. Butterworth low-pass filter (BLPF)
Disadvantage: the smoothing effect is not as strong as the ideal low-pass filter's.
As the Butterworth order n rises, ringing increases; still, it is better than the ideal low-pass filter because there is a smooth transition between low and high frequencies. The higher the order, the steeper (less smooth) the transition, so the ringing strengthens.
3. Gaussian low-pass filter (GLPF)
Disadvantage: the smoothing effect is not as strong as that of the first two.
Relationship between smoothing effect and cutoff frequency:
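For reference, the transfer functions H(D) of the three low-pass filters, written as functions of the distance D from the spectrum center (a plain-Python sketch; D0 and n are free parameters):

```python
import math

def ideal_lpf(D, d0):
    return 1.0 if D <= d0 else 0.0

def butterworth_lpf(D, d0, n):
    return 1.0 / (1.0 + (D / d0) ** (2 * n))

def gaussian_lpf(D, d0):
    return math.exp(-D * D / (2.0 * d0 * d0))

# At the cutoff D = D0: ideal still passes fully, Butterworth is down to 1/2,
# Gaussian to exp(-1/2) ~ 0.607 -- the transition gets smoother in that order,
# which is why the ringing weakens in the same order.
print(ideal_lpf(10, 10), butterworth_lpf(10, 10, 2), round(gaussian_lpf(10, 10), 3))  # 1.0 0.5 0.607
```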
High-pass filtering: high frequencies pass and low frequencies are filtered out, achieving sharpening.
High-pass filter template = 1 - low-pass filter template.
Effect:
Similarly, the ideal high-pass filter (IHPF) also exhibits ringing.
A high-pass filter keeps only the edge information; all non-edge regions turn black. To obtain an enhanced sharpened image, high-frequency emphasis filtering is used.
Method:
k × high-pass filter + c,
where k is a coefficient greater than 1 and c is a constant.
For an image with a large dynamic range (blacks very black, whites very white), the details lie in the dark or bright parts.
Gray-scale stretching improves contrast but further expands the image's dynamic range.
Compressing the gray scale reduces the dynamic range, but the details become even harder to distinguish.
Therefore frequency filtering must be combined with gray-scale transformation: homomorphic filtering.
* Theoretical basis:
An image is formed according to the illumination/reflectance model.
Illumination: sunlight or other light sources; it generally changes little, so it is low frequency.
Reflectance: determined by the surface material of the objects; it varies sharply, so it is high frequency.
(For example, looking out of a window: the sun shines almost evenly on everything, but the distinct details are determined by the reflectances of the flowers, grass, and houses.)
So:
weakening the incident light i(x, y) narrows the gray range,
and strengthening the reflected light r(x, y) improves the image contrast.
Process:
In this way, the homomorphic filter automatically attenuates the low-frequency incident light, reducing the dynamic range, and boosts the high frequencies, improving the contrast.
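A sketch of the process under the model f = i·r: take the log to turn the product into a sum, filter in the frequency domain with a curve that is below 1 at low frequencies and above 1 at high frequencies, then exponentiate. The Gaussian-shaped curve and the gains gamma_l/gamma_h below are assumed illustrative choices, not the notes' exact filter:

```python
import numpy as np

def homomorphic(img, d0=10.0, gamma_l=0.5, gamma_h=2.0):
    z = np.log1p(img)                       # ln f = ln i + ln r
    Z = np.fft.fftshift(np.fft.fft2(z))
    rows, cols = img.shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    D2 = u[:, None] ** 2 + v[None, :] ** 2
    # Rises from gamma_l (attenuate low-frequency illumination)
    # to gamma_h (boost high-frequency reflectance).
    H = gamma_l + (gamma_h - gamma_l) * (1.0 - np.exp(-D2 / (2.0 * d0 ** 2)))
    out = np.real(np.fft.ifft2(np.fft.ifftshift(Z * H)))
    return np.expm1(out)                    # undo the log

result = homomorphic(np.ones((8, 8)))
```

On a constant (pure-illumination) image only the gamma_l attenuation applies, shrinking the dynamic range, while sharp reflectance detail would be amplified by up to gamma_h.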
Image degradation: during image generation, storage, and transmission, image quality is damaged by imperfections of the equipment.
Image restoration: based on the image degradation model: a degradation model is established from prior knowledge, and the original image is then recovered by the inverse operation.
* Connection and difference between image enhancement and image restoration:
Connection: both aim to improve the visual quality of the image.
Difference: enhancement is subjective and ignores the cause of degradation; restoration is objective, aiming to recover the original appearance as faithfully as possible.
Image degradation model:
Noise is described by its probability density function.
Classification:
1. Gaussian noise
2. Rayleigh noise
3. Gamma noise
4. Uniformly distributed noise
5. Impulse noise (salt-and-pepper noise)
6. Periodic noise
Gray histograms of some noise types:
Case study:
Analysis:
Take a region that changes little and plot its histogram; it matches a Gaussian noise model.
Handling additive noise (Gaussian noise, uniformly distributed noise) with spatial filtering:
1. Arithmetic mean filter: the arithmetic average.
2. Geometric mean filter: the geometric average.
Advantage: the geometric mean filter retains more image detail, with smoothing similar to the arithmetic mean.
3. Harmonic mean filter:
works well on "salt" noise, but is not suitable for "pepper" noise.
4. Contraharmonic (inverse harmonic) mean filter,
with filter order Q:
Q > 0 handles "pepper" noise.
Q = 0 gives the arithmetic mean filter.
Q < 0 handles "salt" noise (Q = -1 gives the harmonic mean filter).
5. Order-statistic filters:
Median filter: for the same window size it blurs less than a general averaging filter; very effective against impulse noise, but repeated use blurs the image.
Max filter: effective against "pepper" noise, but removes some dark pixels along the edges of dark objects.
Min filter: effective against "salt" noise, but removes some bright pixels along the edges of bright objects.
Midpoint filter: the arithmetic mean of the maximum and minimum values in the filter window; works best on Gaussian and uniform noise.
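The contraharmonic mean above, sketched over a single filter window (the window values are a made-up example with one "pepper" pixel):

```python
# Contraharmonic mean of order Q over a window: sum(g^(Q+1)) / sum(g^Q).
def contraharmonic(window, q):
    num = sum(g ** (q + 1) for g in window)
    den = sum(g ** q for g in window)
    return num / den

window = [100, 100, 100, 0, 100, 100, 100, 100, 100]  # one "pepper" (0) pixel
print(contraharmonic(window, 1.5))  # Q > 0: the pepper pixel is suppressed (~100)
print(contraharmonic(window, 0))    # Q = 0: plain arithmetic mean (~88.9)
```

Note that for Q < 0 a zero (pepper) pixel in the window makes g^Q undefined, which is exactly why negative orders suit salt noise instead.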
6. Adaptive filter (determines the restoration strength on its own from the pixels currently being processed)
Effect:
7. Adaptive median filter:
Find the median within the window. If the median is not an impulse, check whether the center value Zxy is an impulse. The center value Zxy is