diff -r 43e37759235e -r 51a74ef9ed63 Symbian3/SDK/Source/GUID-A578ECBB-28C5-51C6-A040-4AE65AD38C07.dita
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/Symbian3/SDK/Source/GUID-A578ECBB-28C5-51C6-A040-4AE65AD38C07.dita Wed Mar 31 11:11:55 2010 +0100
@@ -0,0 +1,247 @@

Stream Encoding And Stream Decoding

This document gives you more information about the stream encoding and stream decoding methods.

Purpose

This tutorial explains how to encode and decode an image by passing pixel data block by block.

Required Background

The image is decoded / encoded using imageconversion.dll. Currently only the JPEG codec supports the streaming block method, so jpegcodec.dll from the Imaging Plugins component is used.

Introduction

An image is compressed into an image frame. This is decoded / encoded in one go, which consumes more memory.

The Symbian JPEG codec now supports enhanced functionality during the encode / decode operation using the Stream Encoding and Stream Decoding methods. In these methods an image frame, which is part of a compressed image, can be divided into sub-blocks, and these are encoded / decoded block by block as YUV pixel data.

For the decoder, MImageConvStreamedDecode and TImageConvStreamedDecode are used to adapt the streaming functionality. For the encoder, MImageConvStreamedEncode and TImageConvStreamedEncode are used to adapt the streaming functionality.
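As an orientation, the sketch below shows one way a client might obtain these extensions. It is a minimal excerpt based on the full example at the end of this document, and assumes iFs, aSrcFileName and aDestFileName are provided by the surrounding code.

// Minimal sketch: create the decoder / encoder and request the block streamer extensions.
CImageDecoder* decoder = CImageDecoder::FileNewL(iFs, aSrcFileName);
CleanupStack::PushL(decoder);
CImageEncoder* encoder = CImageEncoder::FileNewL(iFs, aDestFileName, CImageEncoder::EOptionNone, KImageTypeJPGUid);
CleanupStack::PushL(encoder);

// If the JPEG codec supports the streaming extension, a T class pointer is returned.
TImageConvStreamedDecode* streamDecode = decoder->BlockStreamerL();
TImageConvStreamedEncode* streamEncode = encoder->BlockStreamerL();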

Note: Only the Symbian JPEG codec supports decoding / encoding of an image using the Stream Encoding and Stream Decoding methods, which consume less memory. No other Symbian codecs have been modified to provide this support.

The Stream Encoding and Stream Decoding methods also support cropping or scaling an image, in sequential or random order.

Setup and Configuration Requirements

For the encoder / decoder to perform streaming, you need to set up the navigation mode by using the streaming capabilities.

  • The streaming capabilities for decoding are supported by the Image Processor Adaptation Plug-in decoder. For example, you can obtain the optimum number of blocks to stream in a single request, for maximum performance, through the aOptimalBlocksPerRequest parameter.

  • The streaming capabilities for encoding are supported by the Image Processor Adaptation Plug-in encoder. For example, you can obtain the maximum number of blocks that can be streamed in a single request through the aMaxBlocksPerRequest parameter. A sketch of querying these capabilities is shown after this list.
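The following is a minimal sketch of querying the streaming capabilities before choosing a navigation mode. GetCapabilities(), MinBlockSizeInPixels() and the TEncodeStreamImgProcPlugin type are taken from the example at the end of this document; the OptimalBlocksPerRequest() and MaxBlocksPerRequest() accessors are assumed names corresponding to the aOptimalBlocksPerRequest and aMaxBlocksPerRequest parameters above, and streamEncode is assumed to be the TImageConvStreamedEncode* obtained from CImageEncoder::BlockStreamerL().

// Hedged sketch: query the encoder streaming capabilities for a given format code.
TEncodeStreamImgProcPlugin encodeCaps;
streamEncode->GetCapabilities(KUidFormatYUV422Interleaved, encodeCaps);

// Smallest block the codec accepts; used later when initializing the encoder frame.
TSize minBlockSizeInPixels(encodeCaps.MinBlockSizeInPixels());

// Assumed accessors for the capabilities described above (names are illustrative only):
// TInt optimalBlocks = decodeCaps.OptimalBlocksPerRequest(); // decode side
// TInt maxBlocks = encodeCaps.MaxBlocksPerRequest();         // encode side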

During the decode operation, the blocks or sub-frames can be navigated in the following order:

  • The sub-frames or blocks can be passed sequentially from the top left of the image, left to right and top to bottom.

  • The blocks can be passed in random order to access each block.

During the encode operation, the blocks or sub-frames can be navigated:

  • Sequentially from the top left of the image, left to right and top to bottom.

  • Randomly, from the top of the image to the bottom and from the bottom of the image to the top.
Using Stream Encoding And Stream Decoding

The following tasks are covered in this tutorial:

  • How to encode and decode an image by the streaming block method

Basic Procedure For Stream Encoding And Stream Decoding

The high level steps to perform streaming block during the encode and decode operations are as follows:

  1. To create the encoder call CImageEncoder::FileNewL() or CImageEncoder::DataNewL(), and to create the decoder call CImageDecoder::FileNewL() or CImageDecoder::DataNewL().

  2. For encoder streaming, request a streaming interface through CImageEncoder::BlockStreamerL(), and for decoder streaming, request an interface through CImageDecoder::BlockStreamerL().

     After requesting the streaming interface, if the streaming extension is supported then a T class pointer is returned which gives access to the JPEG codec extension. TImageConvStreamedDecode provides the extension functionality for stream decoding and TImageConvStreamedEncode provides the extension functionality for stream encoding.

  3. To set the navigation mode for encode streaming call the Image Processor Adaptation Plug-in encoder navigator, and for decode streaming call the Image Processor Adaptation Plug-in decoder navigator.

     For decode streaming, the navigation possibilities are:

     • The blocks are returned from first to last.

     • The blocks are returned from last to first.

     • The blocks are returned randomly, e.g. 18, 5, 20.

     • The blocks are returned in a random order but moving only from first to last, e.g. 1, 5, 18.

     • The blocks are returned in a random order but moving only from last to first, e.g. 18, 5, 1.

     The navigation modes are shown below:

     enum TNavigation
         {
         ENavigationSequentialForward = 0x01,   // sequential order from first to last
         ENavigationSequentialBackwards = 0x10, // sequential order from last to first
         ENavigationRandom = 0x08,              // random order
         ENavigationRandomForward = 0x02,       // random order from first to last
         ENavigationRandomBackwards = 0x04      // random order from last to first
         };

     For encode streaming, the navigation possibilities are:

     • The blocks are returned from first to last.

     • The blocks are returned in a random order but moving only from first to last, e.g. 1, 5, 18.

     • The blocks are returned in a random order but moving only from last to first, e.g. 18, 5, 1.

     enum TNavigation
         {
         ENavigationSequentialForward = 0x01, // sequential order from first to last
         ENavigationRandomForward = 0x02,     // random order from first to last
         ENavigationRandomBackwards = 0x04    // random order from last to first
         };

  4. To initialize the stream decoder use TImageConvStreamedDecode::InitFrameL() and set its parameters.

  5. To initialize the stream encoder use TImageConvStreamedEncode::InitFrameL() and set its parameters.

  6. During the decode operation, the memory for storing the CImageFrame must be large enough to contain the decoded frame. To obtain the buffer size required for a particular decode, call TImageConvStreamedDecode::GetBufferSize().

     The GetBufferSize() function returns:

     • The size required to store the image when using the Imaging plugins format code.

     • The size in pixels of the blocks returned from the stream, through the aBlockSizeInPixels parameter, when aNumBlocks blocks of minimum block size are requested.

  7. To store the image data in any format or layout described by a format code UID, create an empty image frame using CImageFrame.

  8. To set the image frame size in pixels call CImageFrame::SetFrameSizeInPixels(). The aFrameSize parameter is set to the aBlockSizeInPixels value returned by GetBufferSize().

  9. In decode streaming, to start the asynchronous call that returns blocks, use MImageConvStreamedDecode::GetNextBlocks().

  10. In encode streaming, to start the asynchronous call that appends blocks, use MImageConvStreamedEncode::AppendBlocks().

Note: The memory optimization is mainly achieved by GetNextBlocks() and AppendBlocks() applying effects to one image frame block at a time. Streaming is only supported for images whose dimensions are multiples of the Minimum Coded Unit (MCU).
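For instance, a simple pre-check could verify that the frame dimensions are streamable. This is a sketch only: it assumes that MinBlockSizeInPixels() reflects the MCU size, and it reuses the ImgProcPlugin and frameSizeInPixels variables from the example below.

// Hedged sketch: check that the image frame size is a multiple of the minimum block size
// (assumed here to correspond to the JPEG Minimum Coded Unit).
TSize minBlock(ImgProcPlugin.MinBlockSizeInPixels());
TBool isStreamable = (frameSizeInPixels.iWidth % minBlock.iWidth == 0) &&
                     (frameSizeInPixels.iHeight % minBlock.iHeight == 0);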

Example

The example below shows how to use the stream encoding and stream decoding methods:

void CIclExample::StreamDecodeAndEncodeYuvFrameL(const TDesC& aSrcFileName, const TDesC& aDestFileName)
    {
    const TInt KFrameNumber = 0; // first frame
    const TUid KFormat = KUidFormatYUV422Interleaved; // 422 sampling scheme
    const TInt KNumBlocksToGet = 1;
    RChunk chunk;
    TSize streamBlockSizeInPixels;
    TEncodeStreamImgProcPlugin ImgProcPlugin;
    TInt numBlocksRead = 0;
    TBool haveMoreBlocks = ETrue;

    // Create the decoder, passing the filename. The image is recognised by the
    // Image Conversion Library, an appropriate codec plugin loaded and the image headers parsed.
    // If the image is not recognised or valid then the call will leave with an error.
    CImageDecoder* jpegImageDecoder = static_cast<CJPEGImageFrameDecoder*>(CImageDecoder::FileNewL(iFs, aSrcFileName));
    CleanupStack::PushL(jpegImageDecoder);

    // Create the encoder, passing the filename. The image is recognised by the
    // Image Conversion Library, an appropriate codec plugin loaded and the image headers parsed.
    // If the image is not recognised or valid then the call will leave with an error.
    CImageEncoder* jpegImageEncoder = static_cast<CJPEGImageFrameEncoder*>(CImageEncoder::FileNewL(iFs, aDestFileName, CImageEncoder::EOptionNone, KImageTypeJPGUid));
    CleanupStack::PushL(jpegImageEncoder);

    // Create encode & decode Block Streamers
    TImageConvStreamedDecode* streamDecode = jpegImageDecoder->BlockStreamerL();
    TImageConvStreamedEncode* streamEncode = jpegImageEncoder->BlockStreamerL();

    TFrameInfo frameInfo = jpegImageDecoder->FrameInfo();
    TSize frameSizeInPixels(frameInfo.iOverallSizeInPixels); // NOTE: The image used for decoding should be a multiple of the MCU (Minimum Coded Unit)

    // Set the navigation mode and initialize the decoder frame
    TDecodeStreamImgProcPlugin::TNavigation decodeNavigation = TDecodeStreamImgProcPlugin::ENavigationSequentialForward;
    streamDecode->InitFrameL(KFormat, KFrameNumber, decodeNavigation);

    streamEncode->GetCapabilities(KFormat, ImgProcPlugin);
    TSize blockSizeInPixels = TSize(ImgProcPlugin.MinBlockSizeInPixels());

    // Set the navigation mode and initialize the encoder frame
    TEncodeStreamImgProcPlugin::TNavigation encodeNavigation = TEncodeStreamImgProcPlugin::ENavigationSequentialForward;
    streamEncode->InitFrameL(KFormat, KFrameNumber, frameSizeInPixels, blockSizeInPixels, encodeNavigation, NULL);

    // When decoding, the buffer wrapped by the destination CImageFrame must be large enough to contain the decoded frame.
    // GetBufferSize() should be used to obtain the buffer size required for a particular decode.
    TInt imageSizeInBytes = streamDecode->GetBufferSize(KFormat, streamBlockSizeInPixels, KNumBlocksToGet);

    User::LeaveIfError(chunk.CreateGlobal(KRChunk, imageSizeInBytes, imageSizeInBytes, EOwnerProcess));
    CleanupClosePushL(chunk);

    // Create an empty image frame
    CImageFrame* imageFrame = CImageFrame::NewL(&chunk, imageSizeInBytes, 0);
    CleanupStack::PushL(imageFrame);

    imageFrame->SetFrameSizeInPixels(streamBlockSizeInPixels);

    while (haveMoreBlocks)
        {
        // See Note 1
        CActiveListener* activeListener = CreateAndInitializeActiveListenerLC();

        // Decoder: get blocks
        streamDecode->GetNextBlocks(activeListener->iStatus, *imageFrame, KNumBlocksToGet, numBlocksRead, haveMoreBlocks);

        // See Note 2
        CActiveScheduler::Start();
        User::LeaveIfError(activeListener->iStatus.Int()); // decode complete

        // NOTE: Apply effects such as adjusting brightness in low memory conditions by applying them to the streamed image frame block

        // See Note 1
        activeListener->InitializeActiveListener();

        // Encoder: append blocks
        streamEncode->AppendBlocks(activeListener->iStatus, *imageFrame, numBlocksRead);

        // See Note 2
        CActiveScheduler::Start();
        User::LeaveIfError(activeListener->iStatus.Int()); // encode complete

        CleanupStack::PopAndDestroy(activeListener);
        }

    CleanupStack::PopAndDestroy(4); // imageFrame, chunk, jpegImageEncoder and jpegImageDecoder
    }
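The example relies on a CActiveListener helper, created by CreateAndInitializeActiveListenerLC(), to wait for each asynchronous request to complete; neither is defined in this document. The sketch below is one possible minimal implementation, assuming the conventional pattern of a CActive that simply stops the active scheduler when the request completes. The names and signatures follow the usage in the example and are otherwise illustrative.

// Hedged sketch of a minimal CActiveListener: waits for an asynchronous request
// to complete and stops the active scheduler so the caller can continue.
class CActiveListener : public CActive
    {
public:
    CActiveListener() : CActive(CActive::EPriorityIdle) { CActiveScheduler::Add(this); }
    void InitializeActiveListener() { iStatus = KRequestPending; SetActive(); }
protected:
    virtual void RunL() { CActiveScheduler::Stop(); }
    virtual void DoCancel() { }
    };

// Illustrative helper matching the usage in the example: creates a listener,
// leaves it on the cleanup stack and primes it for the next request.
CActiveListener* CIclExample::CreateAndInitializeActiveListenerLC()
    {
    CActiveListener* listener = new (ELeave) CActiveListener();
    CleanupStack::PushL(listener);
    listener->InitializeActiveListener();
    return listener;
    }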
Imaging Frameworks overview
Image Conversion Overview
Image Encoding Tutorial
Image Decoding Tutorial
Guide to Symbian supplied Codecs
\ No newline at end of file