visionhdl.ChromaResampler

Downsample or upsample chrominance component

Description

The visionhdl.ChromaResampler System object™ downsamples or upsamples a pixel stream.

  • Downsampling reduces the bandwidth and storage requirements of a video system by combining chrominance components over multiple pixels. To prevent aliasing, you can apply a filter by selecting the default filter or by entering your own coefficients.

  • Upsampling restores a signal to its original rate. You can use interpolation or replication to calculate the extra sample.

The object accepts both the luma component and the chrominance components. It does not modify the luma component, but delays it to align with the resampled chrominance outputs. The rate of the output luma component is the same as the input rate.

To resample the chrominance component of a pixel stream:

  1. Create the visionhdl.ChromaResampler object and set its properties.

  2. Call the object with arguments, as if it were a function.

To learn more about how System objects work, see What Are System Objects?

Creation

Description

CR = visionhdl.ChromaResampler(Name,Value) returns a chroma resampler System object that resamples the chrominance component of a pixel stream. Set properties using one or more name-value pairs. Enclose each property name in single quotes.
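
For example, this sketch creates one object for downsampling and one for upsampling, using only property names and values documented on this page:

% Downsampler: 4:4:4 to 4:2:2 with the built-in antialiasing filter
downsampler = visionhdl.ChromaResampler( ...
      'Resampling','4:4:4 to 4:2:2', ...
      'AntialiasingFilterSource','Auto');

% Upsampler: 4:2:2 to 4:4:4 with default interpolation settings
upsampler = visionhdl.ChromaResampler('Resampling','4:2:2 to 4:4:4');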

Properties

Unless otherwise indicated, properties are nontunable, which means you cannot change their values after calling the object. Objects lock when you call them, and the release function unlocks them.

If a property is tunable, you can change its value at any time.

For more information on changing property values, see System Design in MATLAB Using System Objects.

Resampling format, specified as one of these values.

  • '4:4:4 to 4:2:2' — Perform a downsampling operation.

  • '4:2:2 to 4:4:4' — Perform an upsampling operation.

Lowpass filter to accompany a downsample operation, specified as one of these values.

  • 'Auto' — Use the built-in lowpass filter.

  • 'Property' — Filter using the coefficients in the HorizontalFilterCoefficients property.

  • 'None' — Do not filter the input signal.

Dependencies

This property applies when you set Resampling to '4:4:4 to 4:2:2'.

Coefficients for the antialiasing filter, specified as a vector.

Dependencies

This property applies when you set Resampling to '4:4:4 to 4:2:2' and AntialiasingFilterSource to 'Property'.
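
For example, this sketch selects 'Property' as the filter source and supplies a short symmetric lowpass kernel. The three-tap kernel is illustrative only and is not the built-in filter:

% Illustrative 3-tap lowpass kernel (not the built-in antialiasing filter)
lpf = [0.25 0.5 0.25];
customDownsampler = visionhdl.ChromaResampler( ...
      'Resampling','4:4:4 to 4:2:2', ...
      'AntialiasingFilterSource','Property', ...
      'HorizontalFilterCoefficients',lpf);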

Interpolation method for an upsample operation, specified as one of these values.

  • 'Linear' — Use linear interpolation to calculate the missing values.

  • 'Pixel replication' — Repeat the chrominance value of the preceding pixel to create the missing pixel.

Dependencies

This property applies when you set Resampling to '4:2:2 to 4:4:4'.
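
For example, assuming the interpolation method is set through a property named InterpolationFilter (that property name is an assumption; only the values are documented above), an upsampler that replicates pixels might be configured as in this sketch:

% 'InterpolationFilter' is an assumed property name; its values are documented above
upsampler = visionhdl.ChromaResampler( ...
      'Resampling','4:2:2 to 4:4:4', ...
      'InterpolationFilter','Pixel replication');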

When the input is any integer or fixed-point data type, the algorithm uses fixed-point arithmetic for internal calculations. This property does not apply when the input data type is single or double.

Data type for the antialiasing filter coefficients, specified as numerictype(S,WL,FL), where S is 1 (true) for signed and 0 (false) for unsigned, WL is the word length, and FL is the fraction length in bits.

Dependencies

This property applies when you set AntialiasingFilterSource to 'Property' or 'Auto'.
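
For example, this sketch constructs a signed coefficient data type with a 16-bit word length and a 15-bit fraction length (the variable name is illustrative):

% Signed (S = 1), 16-bit word length, 15-bit fraction length
coeffType = numerictype(1,16,15);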

Usage

Description

[pixelout,ctrlout] = CR(pixelin,ctrlin) computes the next output pixel, pixelout, in the resampled video stream. The pixel data arguments, pixelin and pixelout, are vectors of three values representing a pixel in Y'CbCr color space. The object passes through the luma component and control signals, ctrlin, aligned with the output pixel stream.

This object uses a streaming pixel interface with a structure for frame control signals. This interface enables the object to operate independently of image size and format and to connect with other Vision HDL Toolbox™ objects. The object accepts and returns a three-component vector that represents a single pixel and a structure that contains five control signals. The control signals indicate the validity of each pixel and its location in the frame. To convert a pixel matrix into a pixel stream and control signals, use the visionhdl.FrameToPixels object. For a full description of the interface, see Streaming Pixel Interface.
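
For example, this sketch processes one serialized frame with a default resampler. It assumes frm2pix is a visionhdl.FrameToPixels object configured as in the example below and frmYCbCr is a Y'CbCr image of matching size:

CR = visionhdl.ChromaResampler();                 % default: '4:4:4 to 4:2:2'
[pixIn,ctrlIn] = frm2pix(frmYCbCr);               % serialize the frame
[~,~,numPixelsPerFrame] = getparamfromfrm2pix(frm2pix);
pixOut = zeros(numPixelsPerFrame,3,'uint8');
ctrlOut = repmat(pixelcontrolstruct,numPixelsPerFrame,1);
for p = 1:numPixelsPerFrame
    [pixOut(p,:),ctrlOut(p)] = CR(pixIn(p,:),ctrlIn(p));
end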

Input Arguments

Input pixel in gamma-corrected Y'CbCr color space, specified as a vector of three values.

The software supports double and single data types for simulation, but not for HDL code generation.

Data Types: uint8 | uint16 | fixdt(0,N,0), where N = 8,9,...,16 | single | double

Control signals accompanying the input pixel stream, specified as a pixelcontrol structure containing five logical data type signals. The signals describe the validity of the pixel and its location in the frame. For more details, see Pixel Control Structure.

Data Types: struct

Output Arguments

Output pixel in gamma-corrected Y'CbCr color space, returned as a vector of three values.

The software supports double and single data types for simulation, but not for HDL code generation.

Data Types: uint8 | uint16 | fixdt(0,N,0), where N = 8,9,...,16 | single | double

Control signals accompanying the output pixel stream, returned as a pixelcontrol structure containing five logical data type signals. The signals describe the validity of the pixel and its location in the frame. For more details, see Pixel Control Structure.

Data Types: struct

Object Functions

To use an object function, specify the System object as the first input argument. For example, to release system resources of a System object named obj, use this syntax:

release(obj)

step — Run System object algorithm
release — Release resources and allow changes to System object property values and input characteristics
reset — Reset internal states of System object

Examples

Resample a 4:4:4 Y'CbCr image to 4:2:2. The example also shows how to convert an R'G'B' input image to Y'CbCr color space.

Prepare a test image by selecting a portion of an image file.

frmActivePixels = 64;
frmActiveLines = 48;
frmOrig = imread('fabric.png');
frmInput = frmOrig(1:frmActiveLines,1:frmActivePixels,:);

Create a serializer and specify the size of inactive pixel regions. The number of padding pixels on each line must be greater than the latency of each pixel-processing object.

frm2pix = visionhdl.FrameToPixels( ...
      'NumComponents',3, ...
      'VideoFormat','custom', ...
      'ActivePixelsPerLine',frmActivePixels, ...
      'ActiveVideoLines',frmActiveLines, ...
      'TotalPixelsPerLine',frmActivePixels+40, ...
      'TotalVideoLines',frmActiveLines+10, ...
      'StartingActiveLine',6, ...     
      'FrontPorch',5);

Create a color space converter and resampler, using the default property values. The default conversion is 'RGB to YCbCr'. The default resampling mode is '4:4:4 to 4:2:2'. The default antialiasing filter is a 29-tap lowpass filter. This filter gives the object a latency of 30 cycles.

convert2ycbcr = visionhdl.ColorSpaceConverter();
downsampler = visionhdl.ChromaResampler();

Serialize the test image using the serializer object. pixIn is a numPixelsPerFrame-by-3 matrix. ctrlIn is a vector of control signal structures. Preallocate vectors for the output signals.

[pixIn,ctrlIn] = frm2pix(frmInput);

[~,~,numPixelsPerFrame] = getparamfromfrm2pix(frm2pix);
pix444 = zeros(numPixelsPerFrame,3,'uint8');
ctrl444  = repmat(pixelcontrolstruct,numPixelsPerFrame,1);
pix422 = zeros(numPixelsPerFrame,3,'uint8');
ctrl422 = repmat(pixelcontrolstruct,numPixelsPerFrame,1);

For each pixel in the stream, convert to Y'CbCr, then downsample.

for p = 1:numPixelsPerFrame  
    [pix444(p,:),ctrl444(p)] = convert2ycbcr(pixIn(p,:),ctrlIn(p));
    [pix422(p,:),ctrl422(p)] = downsampler(pix444(p,:),ctrl444(p));
end

Create deserializers with a format matching that of the serializer. Convert the 4:4:4 and 4:2:2 pixel streams back to image frames.

pix2frm444 = visionhdl.PixelsToFrame( ...
      'NumComponents',3, ...
      'VideoFormat','custom', ...
      'ActivePixelsPerLine',frmActivePixels, ...
      'ActiveVideoLines',frmActiveLines, ...
      'TotalPixelsPerLine',frmActivePixels+40);

pix2frm422 = visionhdl.PixelsToFrame( ...
     'NumComponents',3, ...
     'VideoFormat','custom', ...
     'ActivePixelsPerLine',frmActivePixels, ...
     'ActiveVideoLines',frmActiveLines, ...
      'TotalPixelsPerLine',frmActivePixels+40);

[frm444,frmValid1] = pix2frm444(pix444,ctrl444);
[frm422,frmValid2] = pix2frm422(pix422,ctrl422);

The 4:2:2 and 4:4:4 pixel streams and frames have the same number of pixels. To examine the resampled data, regroup the pixel data for the first 8 pixels of the first line. The first row is the Y elements of the pixels, the second row is the Cb elements, and the third row is the Cr elements. In the 4:2:2 data, the Cb and Cr elements change only every second sample.

YCbCr444 = [frm444(1,1:8,1); frm444(1,1:8,2); frm444(1,1:8,3)]
YCbCr444 = 3x8 uint8 matrix

   132   134   129   124   125   122   118   119
   116   118   119   122   122   121   123   123
   135   131   125   121   119   116   118   118

YCbCr422 = [frm422(1,1:8,1); frm422(1,1:8,2); frm422(1,1:8,3)]
YCbCr422 = 3x8 uint8 matrix

   132   134   129   124   125   122   118   119
   116   116   120   120   122   122   123   123
   135   135   126   126   119   119   118   118


figure
imshow(frm422,'InitialMagnification',300)
title '4:2:2'

figure
imshow(frm444,'InitialMagnification',300)
title '4:4:4'

Algorithms

This object implements the algorithms described on the Chroma Resampler block reference page.

Version History

Introduced in R2015a
