diff -r ebc84c812384 -r 46218c8b8afa Symbian3/PDK/Source/GUID-0B240B41-652F-586A-915E-3D67BFEB8AE6.dita
--- a/Symbian3/PDK/Source/GUID-0B240B41-652F-586A-915E-3D67BFEB8AE6.dita Thu Mar 11 15:24:26 2010 +0000
+++ b/Symbian3/PDK/Source/GUID-0B240B41-652F-586A-915E-3D67BFEB8AE6.dita Thu Mar 11 18:02:22 2010 +0000

F32 Performance Test Suite

The F32 Performance test suite provides performance tests for the most frequently used public APIs in the RFile and RFs classes.
Test Suite Overview

This test suite provides an indication of the overall performance characteristics of the file server.

The following APIs are tested:

RFile APIs:

IMPORT_C TInt Read(TDes8 &aDes, TInt aLength) const;
IMPORT_C TInt Read(TInt aPos, TDes8 &aDes, TInt aLength) const;
IMPORT_C TInt Write(const TDesC8 &aDes, TInt aLength);
IMPORT_C TInt Write(TInt aPos, const TDesC8 &aDes, TInt aLength);
IMPORT_C TInt Seek(TSeek aMode, TInt &aPos) const;

RFs APIs:

IMPORT_C TInt Entry(const TDesC &aName, TEntry &anEntry) const;
Test approach

The suite benchmarks the performance of the APIs by recording the time taken for a given number of calls to each API to complete. The average time taken for each call is then calculated.

The suite consists of six test scripts: three for RFile and three for RFs.

The tests are classified as small, medium and large according to the number of times that an API is called: small indicates that the API is called 100 times, medium 500 times and large 1000 times.

Coverage Omissions

Not all the public Base F32 APIs are tested by the test suite. The scope is limited to the most frequently used APIs of RFs and RFile (see Test Suite Overview for details).

Test Suite Details

Test Script Source Tree Location

The test scripts, which contain descriptions of the test cases, can be found at the following locations:

  • ...\baseapitest\basesvs\performance\f32\t_perf\scripts\pbase-f32-rfile-performance-large.script
  • ...\baseapitest\basesvs\performance\f32\t_perf\scripts\pbase-f32-rfile-performance-medium.script
  • ...\baseapitest\basesvs\performance\f32\t_perf\scripts\pbase-f32-rfile-performance-small.script
  • ...\baseapitest\basesvs\performance\f32\t_perf\scripts\pbase-f32-rfs-performance-large.script
  • ...\baseapitest\basesvs\performance\f32\t_perf\scripts\pbase-f32-rfs-performance-medium.script
  • ...\baseapitest\basesvs\performance\f32\t_perf\scripts\pbase-f32-rfs-performance-small.script

Test Script EPOC Tree Location

When the tests are built for emulator or hardware (WINSCW or ARMV5), the scripts are exported into the following location in the epoc tree:

%EPOCROOT%\epoc32\data\Z\base\performance\f32\t_perf\

Test Script Build Location

When the tests are built, the scripts are built into the following location:

%EPOCROOT%\epoc32\release\<winscw|armv5>\<udeb|urel>\Z\base\performance\f32\t_perf\

When the tests are built to be executed on hardware, the files are built into the z: drive of the ROM.

Test Data Source Tree Location

The test suite contains the following test data files:

  • ...\baseapitest\basesvs\performance\f32\t_perf\testdata\pbase-f32-rfile-performance-large.ini
  • ...\baseapitest\basesvs\performance\f32\t_perf\testdata\pbase-f32-rfile-performance-medium.ini
  • ...\baseapitest\basesvs\performance\f32\t_perf\testdata\pbase-f32-rfile-performance-small.ini
  • ...\baseapitest\basesvs\performance\f32\t_perf\testdata\pbase-f32-rfile-performance-utils.ini
  • ...\baseapitest\basesvs\performance\f32\t_perf\testdata\pbase-f32-rfs-performance-large.ini
  • ...\baseapitest\basesvs\performance\f32\t_perf\testdata\pbase-f32-rfs-performance-medium.ini
  • ...\baseapitest\basesvs\performance\f32\t_perf\testdata\pbase-f32-rfs-performance-small.ini

The global environment file is located at:

...\baseapitest\basesvs\performance\f32\t_perf\testdata\<platform>\base_perf_f32_env.ini

where <platform> is either WINSCW or ARMV5.

Test Data Files EPOC Tree Location

When the tests are built for emulator or hardware (WINSCW or ARMV5), the data files are exported into the following location in the epoc tree:

%EPOCROOT%\epoc32\data\Z\base\performance\f32\t_perf\

Test Data Files Emulator Location

When the tests are built, the test data files are built into the following location:

%EPOCROOT%\epoc32\release\winscw\<udeb|urel>\Z\base\performance\f32\t_perf\

Note: When the tests are built to be executed on hardware, the files are built into the z: drive of the ROM.

Test .driver File

The base.driver file found in …\baseapitest\basesvs\testsuites\base\ is used by the test driver to construct the test suite tree structure and export all the appropriate files to the correct location in the epoc32 tree and on the device.

When the tests are built, the .driver file can be found in the following location:

%EPOCROOT%\epoc32\testdriver\testproduct\..

TCS File Source Location

The .tcs file can be found in the following location:

...\baseapitest\basesvs\config\t_base.tcs

TCS File Build Location

When the tests are built, the .tcs file is generated into the following location:

%EPOCROOT%\epoc32\release\<winscw|armv5>\<udeb|urel>\Z\base\performance\f32\t_perf\

When the tests are built to be executed on hardware, the files are built into the z: drive of the ROM.

Test configuration

Tests can be configured separately using the environment configuration file base_perf_f32_env.ini. This file contains configurable parameters, which you can modify to execute with different test data.

For example:

  • the name of the drive on which to create the test directory structure
  • the name of the file(s) to create
  • the size of the file(s) to create
  • the number of files to create in each directory
  • the name of each directory to create in the directory structure

The following parameters are within the default section of the base_perf_f32_env.ini configuration file:

  • driveName
  • fileBaseName
  • fileSize
  • numOfFiles
  • subDirName

You can also modify the values of these parameters, and of the other parameters available in the base_perf_f32_env.ini configuration file.

As the configuration file base_perf_f32_env.ini is used by all RFile and RFs tests, it is important to analyse all test scenarios before modifying it. For more information about the details that can be manipulated for each test scenario, see the Configuring the base_perf_f32_env.ini file section.
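As an illustration only, the default section might look something like the fragment below. The parameter names come from the list above, but every value shown here is a hypothetical example, not the shipped default; check the actual base_perf_f32_env.ini before editing.

```ini
; Hypothetical example values -- not the shipped defaults.
[default]
driveName    = d:
fileBaseName = perftestfile
fileSize     = 1024
numOfFiles   = 10
subDirName   = perfdir
```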

Test environment and execution

Device Setup

None

Test Execution

The following two ROM configurations can be used to run these tests on the target platform:

  • NAND configuration ROM
  • NOR configuration ROM

Both ROMs described in the following sections are TechView images with STAT built in and should launch STAT automatically during start-up.

For test execution instructions, refer to Base F32 Test Technology. To use a NAND or NOR configuration ROM, replace the buildrom step in the Base F32 Test Technology section with the steps listed in NAND configuration ROM or NOR configuration ROM respectively.

NAND configuration ROM

To use a NAND configuration ROM, perform the following steps:

  1. Navigate to the \epoc32\rom directory and run one of the following commands:

     If using TestDriver:

     buildrom -D_STARTUPMODE2 -D_NAND2 <h4hrp/h2> techview_statapi

     If not using TestDriver:

     buildrom -D_STARTUPMODE2 -D_NAND2 <h4hrp/h2> techview t_base_f32_perf

     This creates two image files, h4hrp_001.techview.nand.IMG and h4hrp_001.techview.nand.rofs.img.

  2. Rename the two image files h4hrp_001.techview.nand.IMG and h4hrp_001.techview.nand.rofs.img as core.img and rofs1.img respectively.

  3. Navigate to the sf/os/kernelhwsrv/kernel/eka/rombuild/... directory and run the following command to generate the nandloader:

     rom -v=h4hrp -b=urel -i=armv5 -t=nandloader -d=_NAND2

     Successful execution of this command generates the image file H4HRPARMV5.IMG.

  4. Zip the H4HRPARMV5.IMG file and rename the zip file to sys$rom.zip.

  5. Transfer all three files, core.img, rofs1.img and sys$rom.zip, to the hardware board using an MMC card or a USB cable.

     When the board is restarted with the NAND configuration, the STAT application should launch automatically into the waiting for data state. If this does not occur, check that the NAND configuration ROM is built correctly and that you have configured STAT correctly.

NOR configuration ROM

To use a NOR configuration ROM, perform the following steps:

  1. Navigate to the \epoc32\rom directory and run one of the following commands:

     If using TestDriver:

     buildrom -D_STARTUPMODE2 -D_NOR <h4hrp/h2> techview_statapi

     If not using TestDriver:

     buildrom -D_STARTUPMODE2 -D_NOR <h4hrp/h2> techview pbase-f32-performance

     This creates an image file, om_001.techview.IMG.

  2. Zip the om_001.techview.IMG file and rename the zip file to flashimg.zip.

  3. Transfer this file to the hardware board using an MMC card or a USB cable. The H4 board boots the TechView image and loads it into NOR-Flash. When the board is booted up, the STAT application should launch automatically into the waiting for data state. If this does not occur, check that the NOR configuration ROM is built correctly and that you have configured STAT correctly.
Test results

For the location of the TEF log files and general principles of their interpretation, refer to Base F32 Test Technology.

How to interpret test results

Interpreting the HTML logs

Each HTML file contains the following information for each performance test scenario:

  • Number of function calls is (X): The number of times the API has been called, where X is an integer.
  • Time taken for all function calls (X) microseconds: The total time taken for the API to complete X calls, where X is measured in microseconds.
  • Approximate Average Time Taken per call (X) microseconds: The average time taken for a call to the API to complete, where X is measured in microseconds.

RFile API tests additionally output the following information:

  • BlockSize (X) bytes: The buffer size that is read from or written to a file, where X is measured in bytes. Note: For RFile::Seek tests, this size is zero.
  • Bytes processed (X) bytes: The total number of bytes processed during the test.
  • Throughput (X) MB/sec: The amount of data processed per second during the test, where X is measured in megabytes (MB).

Interpreting the CSV file log

The .CSV file also contains the test results. By default, this file is created on the c: drive and is called f32-perfResults; both the file name and its creation location can be configured using the environment configuration file base_perf_f32_env.ini. For further details, see Test Configuration.

The CSV file does not indicate whether an individual test scenario has passed or failed. Instead, it contains user-friendly performance data associated with each API test scenario. This data is listed under the following field headers within the file:

  • Operation: Indicates the actual test scenario and lists the API that was tested. For example, ReadXSmallBytes indicates that the API to be tested is RFILE READ: Read(TDes8 &aDes, TInt aLength). The ReadXSmallBytes test scenario title indicates that it uses an extra small number of bytes (set to 16 bytes by default).
  • Calls: Indicates the number of times this API (for example, RFILE READ) is called.
  • Time Taken: Indicates the total time taken for all calls to that API using the test scenario specified in the Operation field. This value is represented in microseconds.
  • Average time: Indicates the average time taken for each individual call to that API using the test scenario listed in the Operation field. This value is represented in microseconds.

Interpreting Anomalies

Note that the following expected behaviour for the RFile::Read and RFile::Write APIs does not hold:

  • Read operation + Seek operation >= Read Seek operation
  • Write operation + Seek operation >= Write Seek operation

This is because the Symbian platform file system has client-server behaviour. The client creates a package for the file server; in the case of an RFile::Read or RFile::Write call that does not take a position argument, the client sends KCurrentPosition within that package.

On the server side, processing this package involves checking whether the position received is equal to KCurrentPosition; if this comparison is true, the position is assigned the current file position. This extra check and assignment adds an overhead, so in both cases (RFile::Read and RFile::Write) an operation without a position takes longer to complete than one with a position.
