This topic describes the Screen Driver Graphics test suite and explains how to set up the test environment.
The purpose of the test suite is to test specific functionality of the display, such as the display mode and display orientation. The tests are automated and do not require user intervention.
The following is the basic test structure:
NewScreenDeviceL()
InitScreen()
SetDisplayMode()
SetAutoUpdate()
OrientationsAvailable()
SetOrientation()
WriteLine() / WriteRGB() / WriteRGBAlphaLine() / ReadLine()
Destructor
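The sequence above can be sketched as ordinary C++ against a mock device. MockScreenDevice and RunScreenDriverStep below are hypothetical stand-ins for the real CFbsDrawDevice-based code, shown only to illustrate the order of calls a test step makes:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical stand-in for the screen device under test; the real suite
// constructs the device with NewScreenDeviceL() and drives the methods below.
class MockScreenDevice {
public:
    std::vector<std::string> calls;  // records the order of API calls

    void InitScreen()            { calls.push_back("InitScreen"); }
    void SetDisplayMode()        { calls.push_back("SetDisplayMode"); }
    void SetAutoUpdate(bool)     { calls.push_back("SetAutoUpdate"); }
    void OrientationsAvailable() { calls.push_back("OrientationsAvailable"); }
    void SetOrientation()        { calls.push_back("SetOrientation"); }
    void WriteLine()             { calls.push_back("WriteLine"); }
};

// One test step exercises the whole sequence; the device is then destroyed.
void RunScreenDriverStep(MockScreenDevice& dev) {
    dev.InitScreen();
    dev.SetDisplayMode();
    dev.SetAutoUpdate(false);
    dev.OrientationsAvailable();
    dev.SetOrientation();
    dev.WriteLine();
}
```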
Display Modes used: EColor16MA, EColor64K and EColor16MU (different display modes are used on different hardware devices).
Orientation Modes used: EOrientationNormal
Draw Modes used: EDrawModePEN, EDrawModeNOTPEN, EDrawModeAND, EDrawModeOR, EDrawModeNOTSCREEN and EDrawModeXOR.
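As an illustration of what these draw modes do, the sketch below combines a pen colour with the existing screen colour using the bitwise operations the mode names suggest. This is an assumed per-pixel semantics for illustration only, not the Symbian implementation:

```cpp
#include <cassert>
#include <cstdint>

// Assumed semantics: each mode combines the pen colour P with the
// current screen colour S bitwise, as the mode name suggests.
enum DrawMode { EDrawModePEN, EDrawModeNOTPEN, EDrawModeAND,
                EDrawModeOR, EDrawModeNOTSCREEN, EDrawModeXOR };

std::uint32_t Combine(DrawMode mode, std::uint32_t pen, std::uint32_t screen) {
    switch (mode) {
        case EDrawModePEN:       return pen;           // replace with pen colour
        case EDrawModeNOTPEN:    return ~pen;          // inverted pen colour
        case EDrawModeAND:       return pen & screen;  // bitwise AND
        case EDrawModeOR:        return pen | screen;  // bitwise OR
        case EDrawModeNOTSCREEN: return ~screen;       // invert the screen colour
        case EDrawModeXOR:       return pen ^ screen;  // bitwise XOR
    }
    return pen;  // unreachable for valid modes
}
```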
Screen Driver test cases cover basic functionality for graphics display.
The APIs covered by the test cases are those declared in the header file.
Test script source tree location
Descriptions of the test cases in this test suite can be found in the following location:
...\graphics\graphicsapitest\screendriverhaitest\screendriver\scripts\graphics-screendriver-cfbsdrawdevice-automated.script
Test script build location
When the tests are built, the scripts are built in the following location:
...\epoc32\data\z\graphics\graphics-screendriver-cfbsdrawdevice-automated.script
Test script hardware path location
When the tests are run on hardware, the test script is in the following location:
z:\graphics\graphics-screendriver-cfbsdrawdevice-automated.script
Test data source tree location
The test data files for this suite are located in the following locations:
...\graphics\graphicsapitest\screendriverhaitest\screendriver\testdata\graphics-screendriver-cfbsdrawdevice-automated.ini
...\graphics\graphicsapitest\screendriverhaitest\screendriver\testdata\t_screendriver_environment.ini
Test data build location
When the tests are built, the test data files are built in the following locations:
...\epoc32\data\z\graphics\graphics-screendriver-cfbsdrawdevice-automated.ini
...\epoc32\data\z\graphics\t_screendriver_environment.ini
Test data hardware path location
Descriptions of the data files for this suite can be found in the following location:
For information about the pre-build setup, see Test environment.
Device data
The t_screendriver.ini file is empty because Screen Driver does not require device data.
Excluding tests
Not all of the tests run on all devices. To run the device-specific tests, you need to exclude certain tests. These are listed in the file t_screendriver.tcs, which is used as an input to TEF.
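For illustration only, an exclusion file of this kind might list one test case identifier per line. The exact format is defined by TEF, and the identifiers below are hypothetical, not taken from the real suite:

```
GRAPHICS-SCREENDRIVER-CFBSDRAWDEVICE-0001
GRAPHICS-SCREENDRIVER-CFBSDRAWDEVICE-0002
```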
For more details, refer to Symbian Verification Suite » TestExecute Framework.
Source tree location
The following is the source tree location for the t_screendriver.ini and t_screendriver.tcs files:
...\graphics\graphicsapitest\screendriverhaitest\screendriver\testdata\<device>\t_screendriver.ini
...\graphics\graphicsapitest\screendriverhaitest\screendriver\testdata\<device>\t_screendriver.tcs
Build path location
The following is the build location for the t_screendriver.ini and t_screendriver.tcs files:
...\epoc32\data\z\graphics\t_screendriver.ini
...\epoc32\data\z\graphics\t_screendriver.tcs
Hardware path location
The following is the hardware location for the t_screendriver.ini and t_screendriver.tcs files:
General test environment
The test execution is controlled from the test PC and tests can be run on target environments, such as a hardware reference board.
A typical regression test environment consists of:
A PC with the Symbian platform source code, binaries and test source code. In addition, specific test execution tools and configuration scripts are required to run the test and prepare the test environment on the PC. These should all be readily available in the build.
For hardware testing, a prototype phone or reference board (such as H4HRP) must be connected to the PC through a serial cable for debugging purposes. The base ROM image is built on the PC and transferred to the hardware by flashing the device or by using an MMC card.
Remote side test environment setup
There is no remote side test environment, and the tests are not dependent on any external equipment.
Building
For details, refer to Building and execution.
Test execution
Manual test execution on hardware
After the image is successfully loaded onto the device, a command prompt appears at the system drive, which is C. There are several alternative methods for launching the tests.
Launching with the batch file
The batch file t_screendriver.bat executes graphics-screendriver-cfbsdrawdevice-automated.script.
Switch to the z:\graphics directory and run the tests with:
z:\graphics> t_screendriver.bat
Running script directly
On the system drive, launch TEF, passing the test script and the TCS file as command parameters. For example:
C:\> testexecute z:\graphics\graphics-screendriver-cfbsdrawdevice-automated.script -tcx z:\graphics\t_screendriver.tcs
Automated test execution on hardware (using TestDriver)
Run the following command to build the code for TestDriver:
testdriver build
To run only a specific automated test suite, use the -s switch:
>testdriver run -p [platform] -b [build] -s [<specific test suite path>]
where the <specific test suite path> can be screendriverhai or screendriverhai.screendriver.
Note: For details on how to run tests using TestDriver, refer to Steps to build and execute tests using TestDriver.
Test results log file location
When tests have finished executing on the hardware board, results can be found in c:\logs\testexecute on the device. You can copy these files onto an MMC card or other media and then view the test results in the HTML files. If using TestDriver to execute the tests, results can be found in %EPOCROOT%\testdriver\results on the PC.
Each test case in the script contains just one test step, which passes or fails. Each test step performs a number of actions on the API. The log displays the percentage of test cases that have passed successfully. The script should have a 100% pass rate.
How to interpret test results
Each test suite produces its own HTML log file containing the results of the test run. When examining the log file, check the results at the bottom of the log first. This gives a summary of the number of tests in the suite that have passed, failed or panicked. In the body of the log, passing tests are highlighted in green, failing tests in red, while tests which cause a panic are shown in blue.
Copyright ©2010 Nokia Corporation and/or its subsidiary(-ies).
All rights reserved. Unless otherwise stated, these materials are provided under the terms of the Eclipse Public License v1.0.