diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/architecture.rst
--- a/buildframework/helium/doc/src/architecture.rst Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/architecture.rst Mon Oct 11 11:16:47 2010 +0100
@@ -68,7 +68,7 @@
Practices
=========
-Files created in Ant, Perl, Python or XML syntax must follow the `Style guide `_.
+Files created in Ant, Perl, Python or XML syntax must follow the `Style guide `_.
.. index::
@@ -88,15 +88,8 @@
See the reference API documentation:
* `Helium API`_
-* `Java APIs`_
-* `Python APIs`_
-* `Custom Ant tasks`_
.. _`Helium API` : api/helium/index.html
-.. _`Java APIs` : api/java/index.html
-.. _`Python APIs` : api/python/index.html
-.. _`Custom Ant tasks` : api/ant/index.html
-
.. index::
single: Tools and scripts locations
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/development/coding_conventions.rst
--- a/buildframework/helium/doc/src/development/coding_conventions.rst Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/development/coding_conventions.rst Mon Oct 11 11:16:47 2010 +0100
@@ -46,7 +46,7 @@
Documentation
=============
-Standalone documents like this design document and the user guide are documented in reStructuredText__ format.
+Standalone documents like this design document and the user guide are documented in reStructuredText_ format.
__ http://docutils.sourceforge.net/rst.html
@@ -73,8 +73,31 @@
"``hlm-p``", "Properties"
"``hlm-m``", "Macros"
- It is **not** possible to link to the task or anything in the Java documentation.
+
+.. note:: It is **not** possible to link to tasks or anything in the Java documentation.
+A section of RST documentation might look like this::
+
+ The :hlm-t:`foo` target requires the :hlm-p:`bar` property to be defined. It uses the :hlm-m:`bazMacro` macro.
+
+Fields from the API elements can also be embedded in the RST documentation using an index-like syntax::
+
+ :hlm-p:`bar[summary]`
+
+This would extract the ``summary`` field of the ``bar`` property and insert it into the document. The available fields are:
+
+.. csv-table:: API element fields
+ :header: "Field", "Description"
+
+ "summary", "The first sentence or section of the documentation."
+ "documentation", "The whole documentation text."
+ "scope", "The visibility scope."
+ "defaultValue", "The default value if one is defined. Properties only."
+ "type", "The type of the element. Properties only."
+ "editable", "Whether definition is required or optional. Properties only."
+ "deprecated", "Deprecation message."
+
+
Creating Index References
`````````````````````````
@@ -129,7 +152,7 @@
:header: "Tag", "Applies to", "Description"
"scope", "All elements", "The scope or visibility of the element. Valid values are ``public`` (default), ``protected`` and ``private``."
- "editable", "All types", "Whether this element should be overridden or defined by the user. Valid values are ``required`` and ``optional``"
+ "editable", "All types", "Indicates whether the property must be defined or not. Valid values are ``required`` and ``optional``. ``required`` means it must be defined for the related feature to work. The user must define it if there is no default value, i.e. it is not already defined in Helium."
"type", "Properties", "The type of the property value. Valid values are ``string`` (default), ``integer``, ``boolean``."
"deprecated", "All elements", "Documents that the element is deprecated and may be removed in a future release. The text should describe what to use instead."
@@ -341,7 +364,7 @@
* Unit tests are written for each Python module.
* They should follow the Nose_ testing framework conventions.
-* The test suite is run by calling :hlm-t:`py-unittest`.
+* The test suite is run by calling ``bld test``.
.. _Nose : http://somethingaboutorange.com/mrl/projects/nose/
@@ -364,7 +387,7 @@
* `Twisted Coding Standard`_ (but with a grain of salt):
.. _`PEP 8 - Style Guide for Python Code` : http://www.python.org/dev/peps/pep-0008/
-.. _`Twisted Coding Standard` : http://twistedmatrix.com/trac/browser/trunk/doc/development/policy/coding-standard.xhtml?format=raw
+.. _`Twisted Coding Standard` : http://twistedmatrix.com/documents/current/core/development/policy/coding-standard.html
.. index::
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/development/developer_guide.rst
--- a/buildframework/helium/doc/src/development/developer_guide.rst Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/development/developer_guide.rst Mon Oct 11 11:16:47 2010 +0100
@@ -249,7 +249,7 @@
Also all setter methods visible through Ant must be documented properly using *@ant.required*
or *@ant.not-required* javadoc style attributes.
-You can find more information on how to document Ant tasks using the doclet plugin on http://doclet.neuroning.com/.
+You can find more information on how to document Ant tasks using the doclet plugin on http://antdoclet.neuroning.com/.
General coding guidelines
-------------------------
@@ -264,7 +264,7 @@
----------------------------
In order to match as must as configurability concepts, Helium custom types and tasks must follow development guidelines as
-much as possible. You can find then on http://.apache.org/_task_guidelines.html.
+much as possible. You can find them on http://ant.apache.org/ant_task_guidelines.html.
Logging
-------
@@ -414,6 +414,15 @@
Debug logs for component tests can be found at ``/build/components//xunit``.
+Filtering Python tests using nose
+---------------------------------
+
+Python unit tests are run through the nose testing framework. To run just a single Python test module, use::
+
+ bld test -Dcomponent=pythoncore -Dnose.args=amara
+
+The value of ``nose.args`` is passed through to nose.
+
.. index::
single: Assertions
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/blocks.rst
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/doc/src/manual/blocks.rst Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,110 @@
+.. ============================================================================
+ Name : blocks.rst
+ Part of : Helium
+
+ Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+ All rights reserved.
+ This component and the accompanying materials are made available
+ under the terms of the License "Eclipse Public License v1.0"
+ which accompanies this distribution, and is available
+ at the URL "http://www.eclipse.org/legal/epl-v10.html".
+
+ Initial Contributors:
+ Nokia Corporation - initial contribution.
+
+ Contributors:
+
+ Description:
+
+ ============================================================================
+
+.. index::
+ module: Blocks
+
+======
+Blocks
+======
+
+.. contents::
+
+.. _`Blocks-Intro-label`:
+
+Blocks Introduction
+====================
+
+Blocks is a packaging framework which allows you to create bundles
+with interdependencies (like rpm or deb packages) based on the outcome of the build.
+
+
+Enabling Blocks input generation
+================================
+
+The input generation consists of gathering data from build steps throughout the build to allow the generation
+of the future bundle. Not all steps are supported, so the build engineer must keep in mind that custom
+exports or modification of the binaries after a controlled build step might lead to bundles with inconsistent content.
+
+In order to enable Blocks input generation you simply need to set the **blocks.enabled** property to true. Intermediate
+configuration files will be generated under **blocks.config.dir**.
+
+e.g::
+
+ hlm -Dblocks.enabled=true....
+
+
+Currently supported steps are:
+ * SBSv2 compilation
+ * Configuration export using cMaker (only if cmaker-what is called)
+ * ROM image creation
+
+
+Bundle generation
+=================
+
+Once the data has been gathered during the build, it is then possible to create bundles. To do so you need to call the
+**blocks-create-bundles** target. Generated bundles will be created under **blocks.bundle.dir**.
+
+e.g::
+
+ hlm -Dblocks.enabled=true .... blocks-create-bundles
+
+
+Blocks workspace management with Helium
+=======================================
+
+Helium allows you to use any build environment as a Blocks workspace. The :hlm-t:`blocks-create-workspace` target handles the
+automatic creation of a workspace based on the current build.drive used. If the current build.drive represents an
+already existing workspace then it is reused. The :hlm-p:`blocks.workspace.id` property will contain the Blocks workspace
+id. Also, when a new workspace is created, some repositories can be automatically added using the **blocks.repositories.id** reference
+to an hlm:blocksRepositorySet object.
+
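+A minimal, illustrative sketch of such a configuration; the attribute and nested element names used for the repository definition are assumptions rather than the exact Blocks antlib syntax:
+
+.. code-block:: xml
+
+    <!-- Hypothetical repository set referenced through blocks.repositories.id. -->
+    <hlm:blocksRepositorySet id="blocks.repositories.id">
+        <!-- The repository element and its attributes are placeholders for illustration. -->
+        <repository name="example" url="http://blocks.example.com/repository"/>
+    </hlm:blocksRepositorySet>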
+
+Installing bundles
+==================
+The :hlm-t:`blocks-install-bundles` target allows you to install packages into the workspace. To do so, you can configure
+the following references using patternsets:
+
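+The snippet below is only a sketch; the include pattern is a placeholder:
+
+.. code-block:: xml
+
+    <!-- Only bundles whose names match the pattern are selected. -->
+    <patternset id="blocks.bundle.filter.id">
+        <include name="*helium*"/>
+    </patternset>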
+
+The **blocks.bundle.filter.id** patternset will allow you to filter bundles based on their name. And the **blocks.bundle.filter.id** patternset will allow you
+to install a selected group of bundles.
+
+Finally the workspace can be updated using the :hlm-t:`blocks-update-bundles` target.
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/configuring.rst
--- a/buildframework/helium/doc/src/manual/configuring.rst Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/manual/configuring.rst Mon Oct 11 11:16:47 2010 +0100
@@ -84,7 +84,7 @@
-Note that here the default target is :hlm-t:`product-build` so this would be used for a product build configuration. In reality it would need many more properties to be complete.
+Note that here the default target is ``product-build`` so this would be used for a product build configuration. In reality it would need many more properties to be complete.
Refer to the `configuration reference`_ for a full list of all Helium Ant properties.
@@ -188,9 +188,7 @@
Viewing target dependencies
===========================
-The :hlm-t:`deps` target can be used to display a list of the target dependencies for a given target. See the `manual page`_ for more information. Also the :hlm-t:`execlist` command works in a similar way but shows a dialog showing a separated list of all the dependent targets and then just the top-level of dependencies, to help with continuing a build on the command line.
-
-.. _`manual page`: ../api/helium/target-deps.html
+The :hlm-t:`deps` target can be used to display a list of the target dependencies for a given target. The :hlm-t:`execlist` command works in a similar way but shows a dialog with a separate list of all the dependent targets and then just the top level of dependencies, to help with continuing a build on the command line.
.. index::
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/configuring_features.rst.ftl
--- a/buildframework/helium/doc/src/manual/configuring_features.rst.ftl Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/manual/configuring_features.rst.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -18,12 +18,12 @@
============================================================================
-####################
+###########################
Configuring Helium Features
-####################
+###########################
Introduction
--------------------------------
+------------
This describes how to configure the Helium features.
@@ -36,15 +36,15 @@
Enabling blocks features.
Enabling to use dragonfly and many more.
-Properties need to be defined for enabling/disabling the features.
--------------------------------------------------------------
+Properties need to be defined for enabling/disabling the features
+-----------------------------------------------------------------
<#assign propertyCache = {}>
<#list doc.antDatabase.project.property as property>
<#assign propertyCache = propertyCache + {property.name: property}>
#list>
.. csv-table:: Feature properties
- :header: "Property name", "Description", "Allowed value", "Deprecated property"
+ :header: "Property name", "Description", "Default value", "Deprecated property"
<#list propertyCache?keys?sort as name>
<#assign property=propertyCache[name]>
@@ -58,7 +58,7 @@
<#assign deprecatedMessage="${deprecatedName.deprecated}">
#if>
#list>
- ":hlm-p:`${name}`", "${property.summary?replace("^", " ", "rm")?replace("\"", "\"\"", "rm")?trim}", "true/false", "${deprecatedProperty}${deprecatedMessage}"
+ ":hlm-p:`${name}`", "${property.summary?replace("^", " ", "rm")?replace("\"", "\"\"", "rm")?trim}", "${property.defaultValue}", "${deprecatedProperty}${deprecatedMessage}"
#if>
#list>
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/coverity.rst
--- a/buildframework/helium/doc/src/manual/coverity.rst Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/manual/coverity.rst Mon Oct 11 11:16:47 2010 +0100
@@ -217,6 +217,8 @@
machine coverity login password
+.. _`.netrc file`: configuring.html?highlight=netrc#passwords
+
.. csv-table:: Coverity feature flags
:header: "Flags to set", "Action performed", "Allowed value"
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/documentation.rst.ftl
--- a/buildframework/helium/doc/src/manual/documentation.rst.ftl Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/manual/documentation.rst.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -33,10 +33,6 @@
APIs
====
-* `Search API`_
-
-.. _`Search API`: ../api/index.html
-
* `Helium API`_
The `Helium API`_ specifies all the available Ant_ targets and their
@@ -50,19 +46,20 @@
.. _`Helium Antlib`: ../helium-antlib/index.html
+<#if !(ant?keys?seq_contains("sf"))>
+
* `Ant Tasks`_
-.. _`Ant Tasks`: ../api/ant/index.html
+.. _`Ant Tasks`: ../api/doclet/index.html
-<#if !(ant?keys?seq_contains("sf"))>
Customer APIs
-------------
* `IDO API`_
* `DFS70501 API`_
-.. _`IDO API`: ../ido/api/helium/index.html
-.. _`DFS70501 API`: ../dfs70501/api/helium/index.html
+.. _`IDO API`: http://helium.nmp.nokia.com/doc/ido/api/helium/index.html
+.. _`DFS70501 API`: http://helium.nmp.nokia.com/doc/dfs70501/api/helium/index.html
#if>
Building custom documentation
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/final.rst
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/doc/src/manual/final.rst Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,63 @@
+..
+ ============================================================================
+ Name : final.rst
+ Part of : Helium
+
+ Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+ All rights reserved.
+ This component and the accompanying materials are made available
+ under the terms of the License "Eclipse Public License v1.0"
+ which accompanies this distribution, and is available
+ at the URL "http://www.eclipse.org/legal/epl-v10.html".
+
+ Initial Contributors:
+ Nokia Corporation - initial contribution.
+
+ Contributors:
+
+ Description:
+
+ ============================================================================
+
+.. index::
+ single: Stage - Final operations
+
+Final operations
+================
+
+Final operations are steps which can happen at workflow completion.
+
+
+Running a target at build completion
+------------------------------------
+
+Helium offers the possibility to run a final target regardless of any error which could occur during the build.
+The target is configured using the **hlm.final.target** property.
+
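+For example, assuming a target named ``my-final-target`` exists in your configuration:
+
+.. code-block:: xml
+
+    <!-- The target name is a placeholder; point it to any target of your own. -->
+    <property name="hlm.final.target" value="my-final-target"/>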
+
+Running action on failure
+-------------------------
+
+The signaling framework will automatically run all signalExceptionConfig configurations in case of an Ant failure at the
+end of the build.
+
+This example shows how a simple task can be run in case of failure:
+::
+
+
+
+
+ Signal: ${r'$'}{signal.name}
+ Message: ${r'$'}{signal.message}
+
+
+
+
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/index.rst.ftl
--- a/buildframework/helium/doc/src/manual/index.rst.ftl Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/manual/index.rst.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -19,10 +19,15 @@
============================================================================
-->
-###################################
- Helium Manual
-###################################
+#############
+Helium Manual
+#############
+.. raw:: html
+
+
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/introduction.rst
--- a/buildframework/helium/doc/src/manual/introduction.rst Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/manual/introduction.rst Mon Oct 11 11:16:47 2010 +0100
@@ -40,7 +40,7 @@
It is recommended to read the Ant_ documentation before learning about Helium. An understanding of XML_ is also needed as Ant_ is configured using an XML_ format.
-.. _Ant: http://Ant.apache.org/
+.. _Ant: http://ant.apache.org/
.. _XML: http://www.w3.org/XML/
.. index::
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/metrics.rst
--- a/buildframework/helium/doc/src/manual/metrics.rst Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/manual/metrics.rst Mon Oct 11 11:16:47 2010 +0100
@@ -45,7 +45,7 @@
====================
To enable logging to diamonds from Helium one needs to ensure that:
-* The properties :hlm-p:`diamonds.host` and :hlm-p:`diamonds.port` are set correctly.
+* The properties ``diamonds.host`` and ``diamonds.port`` are set correctly.
* By default they are taken from ``helium/tools/common/companyproperties.ant.xml``, but can be overridden by using:
* **Command line**
@@ -83,8 +83,8 @@
"``diamonds.host``", "Diamonds server address"
"``diamonds.port``", "Server port number"
"``diamonds.path``", "Builds path in Diamonds server"
- ":hlm-p:`build.family`", "Category of product"
- ":hlm-p:`stages`", "Start and end target of a stages with logical stage name"
+ "``build.family``", "Category of product"
+ "``stages``", "Start and end target of a stages with logical stage name"
":hlm-p:`sysdef.configurations.list`", "System definition name list to log component faults"
":hlm-p:`build.name`", "Name of product"
":hlm-p:`release.label`", "Name of release"
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/stage_ats.rst.ftl
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/doc/src/manual/stage_ats.rst.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,712 @@
+.. ============================================================================
+ Name : stage_ats.rst.ftl
+ Part of : Helium
+
+ Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+ All rights reserved.
+ This component and the accompanying materials are made available
+ under the terms of the License "Eclipse Public License v1.0"
+ which accompanies this distribution, and is available
+ at the URL "http://www.eclipse.org/legal/epl-v10.html".
+
+ Initial Contributors:
+ Nokia Corporation - initial contribution.
+
+ Contributors:
+
+ Description:
+
+ ============================================================================
+
+.. index::
+ module: Testing
+
+=======
+Testing
+=======
+
+This is a good start for Helium users who want to set up test automation using ATS4. (**ATS3 users**, please `read here`_.)
+
+.. _`read here`: stage_ats_old.html
+
+
+
+.. contents::
+
+
+
+Helium Test Automation
+======================
+
+Helium can be used to automate testing. For this purpose, the test asset must be aligned with the standard guidelines for writing the tests.
+
+Helium supports several test frameworks including `STIF`_, `TEF`_, RTest, MTF, `SUT`_, QtTest, `EUnit`_, TDriver and ASTE. (`Description of test frameworks`_)
+
+Most of the above-mentioned test frameworks share a common configuration to set up the TA environment. However, there are a few exceptions, which are discussed below under the headings of each test framework.
+
+
+
+<#if !(ant?keys?seq_contains("sf"))>
+.. _`STIF`: http://s60wiki.nokia.com/S60Wiki/STIF
+.. _`TEF`: http://s60wiki.nokia.com/S60Wiki/TEF_%28TestExecute_Framework%29
+.. _`EUnit`: http://s60wiki.nokia.com/S60Wiki/EUnit
+#if>
+
+.. _`SUT`: http://developer.symbian.org/wiki/index.php/Symbian_Test_Tools#SymbianUnitTest
+.. _`Description of test frameworks`: http://developer.symbian.org/wiki/index.php/Symbian_Test_Tools
+
+
+
+Prerequisites
+-------------
+
+* `Harmonized Test Interface (HTI)`_ needs to be compiled and included in the image.
+* The reader is expected to already have a working ATS setup in which test cases can be executed. ATS server names,
+ access rights, authentication etc. are assumed to be already taken care of.
+
+<#if !(ant?keys?seq_contains("sf"))>
+.. _`Harmonized Test Interface (HTI)`: http://s60wiki.nokia.com/S60Wiki/HTI
+<#else>
+.. _`Harmonized Test Interface (HTI)`: http://developer.symbian.org/wiki/index.php/HTI_Tool
+#if>
+
+
+Setting up a Test Automation Environment with Helium
+====================================================
+
+Basic Test Automation step-by-step setup guide.
+
+
+Step 0: Structuring Test-source/test-asset
+------------------------------------------
+Test source usually lives in a component's ``tsrc`` directory. Test source components are created like any other Symbian SW component;
+there is a ``group`` directory with a ``bld.inf`` file for building, ``.mmp`` files for defining the targets, and so on.
+
+The test generation code expects a ``.pkg`` file in the ``group`` directory of the test component to be compiled, to get the paths of the files
+(data, configuration, initialization, etc.) to be installed and where to install them on the phone.
+
+**Please note** that not all components have ``tsrc`` and ``group`` directories. For instance, Qt, ASTE and TDriver do not have the same test asset structure as STIF, TEF and other test components. It is recommended to follow the test asset guidelines prior to setting up test automation with Helium.
+
+
+Step 1: Setting up system definition file
+-----------------------------------------
+**System Definition Files supporting layers.sysdef.xml**
+ **layers** in ``layers.sysdef.xml`` file and **configuration** in ``build.sysdef.xml`` file (`Structure of System Definition files version 1.4`_).
+
+ <#if !(ant?keys?seq_contains("sf"))>
+.. _`new API test automation guidelines`: http://s60wiki.nokia.com/S60Wiki/Test_Asset_Guidelines
+.. _`Structure of System Definition files version 1.4`: http://delivery.nmp.nokia.com/trac/helium/wiki/SystemDefinitionFiles
+#if>
+
+A template of a layer in ``layers.sysdef.xml`` for system definition files:
+
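+The sketch below is only illustrative; unit IDs, names and build-file paths are placeholders:
+
+.. code-block:: xml
+
+    <!-- Two modules: each module results in a separate test drop. -->
+    <layer name="api_test_layer">
+        <module name="module_one">
+            <unit unitID="comp1.tsrc" name="comp1_tsrc" bldFile="path/to/comp1/tsrc/group" mrp=""/>
+        </module>
+        <module name="module_two">
+            <unit unitID="comp2.tsrc" name="comp2_tsrc" bldFile="path/to/comp2/tsrc/group" mrp=""/>
+            <unit unitID="comp3.tsrc" name="comp3_tsrc" bldFile="path/to/comp3/tsrc/group" mrp=""/>
+        </module>
+    </layer>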
+
+* Layer name should end with **_test_layer**
+* Two standard names for ATS test layers are being used; ``unit_test_layer`` and ``api_test_layer``. Test components (the ``unit`` tags)
+ should be specified under these layers and grouped by ``module`` tag(s).
+* In the above, two modules means two drop files will be created; ``module`` may have one or more ``unit``
+* By using property ``exclude.test.layers``, complete layers can be excluded and the components inside that layer will not be included in the AtsDrop. This property is a comma (,) separated list
+
+**System Definition Files version 3.0 (SysDefs3)** (new Helium v.10.79)
+ The `structure of System Definition files version 3.0`_ is different from previous versions of system definition files. In SysDefs3, package definition files are used for component specification. Instead of layer naming conventions, filters are used to identify test components and test types, for example: "test, unit_test, !api_test" etc.
+
+<#if !(ant?keys?seq_contains("sf"))>
+.. _`structure of System Definition files version 3.0`: http://wikis.in.nokia.com/view/SWManageabilityTeamWiki/PkgdefUse
+<#else>
+.. _`structure of System Definition files version 3.0`: sysdef3.html
+#if>
+
+An example template for defining test components in a package definition file.
+
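+A sketch only; package, collection and component IDs and paths are placeholders, and grouping tc1 apart from tc2 and tc3 via collections is an assumption:
+
+.. code-block:: xml
+
+    <SystemDefinition schema="3.0.0">
+        <package id="examplepackage" name="Example Package">
+            <collection id="examplegroup1" name="Example Group 1">
+                <component id="tc1" name="tc1" filter="test,unit_test">
+                    <unit bldFile="path/to/tc1/group"/>
+                </component>
+            </collection>
+            <collection id="examplegroup2" name="Example Group 2">
+                <component id="tc2" name="tc2" filter="test,unit_test">
+                    <unit bldFile="path/to/tc2/group"/>
+                </component>
+                <component id="tc3" name="tc3" filter="test,api_test">
+                    <unit bldFile="path/to/tc3/group"/>
+                </component>
+            </collection>
+        </package>
+    </SystemDefinition>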
+
+* Filter "test" must be specified for every test component. If it is not specified, the component will not be considered as a test component.
+* / are now used to group test components; they work in the same way as ... in sysdef v1.4. The components having the same group name are grouped together.
+ Separate drop files are created for different groups. In the above example, if only 'test' is selected, then two drop files will be created, one with tc1 and the other one with tc2 and tc3.
+
+
+Step 2: Configure ATS Ant properties
+---------------------------------------
+The properties are categorized as
+
+* **Common** - Valid for all test frameworks (Table-1).
+* **API/Module** - Valid only for API/Module tests like STIF, STF, EUnit etc., and hence shared among many test frameworks (Table-2).
+
+
+Also, the edit status of the properties can be described as
+
+* [must] - must be set by user
+* [recommended] - should be set by user but not mandatory
+* [allowed] - should **not** normally be set by the user; however, it is possible.
+
+.. csv-table:: Table-1: ATS - Common Properties
+ :header: "Property name", "Edit status", "Description"
+
+ ":hlm-p:`ats.server`", "[must]", ":hlm-p:`ats.server[documentation]`"
+ ":hlm-p:`ats.drop.location`", "[allowed]", ":hlm-p:`ats.drop.location[documentation]`"
+ ":hlm-p:`ats.product.name`", "[must]", ":hlm-p:`ats.product.name[documentation]`"
+ ":hlm-p:`ats.email.list`", "[allowed]", ":hlm-p:`ats.email.list[documentation]`"
+ ":hlm-p:`ats.report.type`", "[allowed]", ":hlm-p:`ats.report.type[documentation]`"
+ ":hlm-p:`ats.flashfiles.minlimit`", "[allowed]", ":hlm-p:`ats.flashfiles.minlimit[documentation]`"
+ ":hlm-p:`ats.plan.name`", "[allowed]", ":hlm-p:`ats.plan.name[documentation]`"
+ ":hlm-p:`ats.product.hwid`", "[allowed]", ":hlm-p:`ats.product.hwid[documentation]`"
+ ":hlm-p:`ats.script.type`", "[allowed]", ":hlm-p:`ats.script.type[documentation]`"
+ ":hlm-p:`ats.test.timeout`", "[allowed]", ":hlm-p:`ats.test.timeout[documentation]`"
+ ":hlm-p:`ats.testrun.name`", "[allowed]", ":hlm-p:`ats.testrun.name[documentation]`"
+ ":hlm-p:`ats.report.location`", "[allowed]", ":hlm-p:`ats.report.location[documentation]`"
+ ":hlm-p:`ats.diamonds.signal`", "[allowed]", ":hlm-p:`ats.diamonds.signal[documentation]`"
+
+
+An example of setting up the common properties as in table-1:
+
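+All values below are placeholders (the server address, product name and e-mail address are assumptions); remember to declare the properties before importing helium.ant.xml:
+
+.. code-block:: xml
+
+    <property name="ats.server" value="ats4server.example.com:80"/>
+    <property name="ats.product.name" value="PRODUCT_NAME"/>
+    <property name="ats.drop.location" value="\\server\share\ats\drops"/>
+    <property name="ats.email.list" value="first.last@example.com"/>
+    <property name="ats.test.timeout" value="120"/>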
+
+.. csv-table:: Table-2: ATS - API/Module properties
+ :header: "Property name", "Edit status", "Description"
+
+ ":hlm-p:`ats.target.platform`", "[allowed]", ":hlm-p:`ats.target.platform[documentation]`"
+ ":hlm-p:`ats.obey.pkgfiles.rule`", "[allowed]", ":hlm-p:`ats.obey.pkgfiles.rule[documentation]`"
+ ":hlm-p:`ats.specific.pkg`", "[allowed]", ":hlm-p:`ats.specific.pkg[documentation]`"
+ ":hlm-p:`ats.test.filterset`", "[allowed]", ":hlm-p:`ats.test.filterset[documentation]`"
+
+
+An example of setting up API/Module testing properties as in table-2:
+
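+The values below are placeholders, except ``armv5 urel`` which is the documented default target platform:
+
+.. code-block:: xml
+
+    <property name="ats.target.platform" value="armv5 urel"/>
+    <property name="ats.obey.pkgfiles.rule" value="true"/>
+    <property name="ats.specific.pkg" value="sanity"/>
+    <property name="ats.test.filterset" value="unit_test"/>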
+
+
+Step 3: Configure or select ROM images (Optional)
+-------------------------------------------------
+Since Helium 10, images are picked up using :hlm-p:`ats.product.name` and iMaker ``iconfig.xml`` files. The property ``release.images.dir`` is searched for ``iconfig.xml`` files, and the ones where the product name is part of :hlm-p:`ats.product.name` are used.
+
+You should only build the images for each product you want to include in ATS. See the `Imaker`_ docs for more information, e.g.:
+
+.. _`Imaker`: ../helium-antlib/imaker.html
+
+.. code-block:: xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+For older products where there are no iconfig.xml, ``reference.ats.flash.images`` is used:
+
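+A sketch of such a fileset; the directory and include patterns are placeholders:
+
+.. code-block:: xml
+
+    <fileset id="reference.ats.flash.images" dir="${r'$'}{release.images.dir}">
+        <include name="**/*.fpsx"/>
+        <include name="**/*.C00"/>
+    </fileset>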
+
+.. Note::
+
+ Always declare *Properties* before and *filesets* after importing helium.ant.xml in order to overwrite the default values during the build.
+
+
+Step 4: Enabling or disabling test automation features
+------------------------------------------------------
+Helium supports a number of test automation features, which are discussed below. These features can be enabled or disabled by switching the values of the following properties to either *true* or *false*.
+
+
+.. csv-table:: Table-3: ATS - Switches/enablers
+ :header: "Property name", "Edit status", "Description"
+
+ ":hlm-p:`ats.enabled`", "[allowed]", ":hlm-p:`ats.enabled[documentation]`"
+ ":hlm-p:`ats4.enabled`", "[allowed]", ":hlm-p:`ats4.enabled[documentation]`"
+ ":hlm-p:`ats.stf.enabled`", "[allowed]", ":hlm-p:`ats.stf..enabled[documentation]`"
+ ":hlm-p:`aste.enabled`", "[allowed]", ":hlm-p:`aste.enabled[documentation]`"
+ ":hlm-p:`ats.ctc.enabled`", "[allowed]", ":hlm-p:`ats.ctc.enabled[documentation]`"
+ ":hlm-p:`ats.trace.enabled`", "[allowed]", ":hlm-p:`ats.trace.enabled[documentation]`"
+ ":hlm-p:`ats.emulator.enable`", "[allowed]", ":hlm-p:`ats.emulator.enable[documentation]`"
+ ":hlm-p:`ats.singledrop.enabled`", "[allowed]", ":hlm-p:`ats.singledrop.enabled[documentation]`"
+ ":hlm-p:`ats.multiset.enabled`", "[allowed]", ":hlm-p:`ats.multiset.enabled[documentation]`"
+ ":hlm-p:`ats.delta.enabled`", "[allowed]", ":hlm-p:`ats.delta.enabled[documentation]`"
+ ":hlm-p:`ats.java.importer.enabled`", "[allowed]", ":hlm-p:`ats.java.importer.enabled[documentation]`"
+ ":hlm-p:`ats.tdriver.enabled`", "[allowed]", ":hlm-p:`ats.tdriver.enabled[documentation]`"
+
+
+For example:
+
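+A sketch using properties from table-3; which switches you enable depends on your setup:
+
+.. code-block:: xml
+
+    <property name="ats.enabled" value="true"/>
+    <property name="ats4.enabled" value="true"/>
+    <property name="ats.ctc.enabled" value="false"/>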
+
+Supported Test Frameworks
+=========================
+In this section only Helium-specific properties, targets and other related issues for configuring the following test frameworks are discussed. However, as mentioned earlier, there are test asset guidelines for setting up test components for the different test frameworks.
+
+ASTE
+----
+* ASTE tests can be enabled by setting :hlm-p:`aste.enabled` (see table-3).
+* `SW Test Asset`_ location and type of test should be known as a prerequisite.
+* To configure the ASTE tests, ASTE-specific properties are required in addition to those in table-1.
+
+<#if !(ant?keys?seq_contains("sf"))>
+.. _`SW Test Asset`: http://s60wiki.nokia.com/S60Wiki/MC_SW_Test_Asset_documentation
+#if>
+
+.. csv-table:: Table: ATS - ASTE properties
+ :header: "Property name", "Edit status", "Description"
+
+ ":hlm-p:`ats.aste.testasset.location`", "[must]", ":hlm-p:`ats.aste.testasset.location[documentation]`"
+ ":hlm-p:`ats.aste.software.release`", "[must]", ":hlm-p:`ats.aste.software.release[documentation]`"
+ ":hlm-p:`ats.aste.software.version`", "[must]", ":hlm-p:`ats.aste.software.version[documentation]`"
+ ":hlm-p:`ats.aste.testasset.caseids`", "[recommended]", ":hlm-p:`ats.aste.testasset.caseids[documentation]`"
+ ":hlm-p:`ats.aste.language`", "[recommended]", ":hlm-p:`ats.aste.language[documentation]`"
+ ":hlm-p:`ats.aste.test.type`", "[recommended]", ":hlm-p:`ats.aste.test.type[documentation]`"
+ ":hlm-p:`ats.aste.plan.name`", "[recommended]", ":hlm-p:`ats.aste.plan.name[documentation]`"
+ ":hlm-p:`ats.aste.testrun.name`", "[recommended]", ":hlm-p:`ats.aste.testrun.name[documentation]`"
+ ":hlm-p:`ats.aste.email.list`", "[recommended]", ":hlm-p:`ats.aste.email.list[documentation]`"
+
+
+An example of setting up ASTE properties:
+
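+All values below are placeholders (the asset location, release and version strings are assumptions):
+
+.. code-block:: xml
+
+    <property name="ats.aste.testasset.location" value="\\server\share\test_asset"/>
+    <property name="ats.aste.software.release" value="EXAMPLE_RELEASE"/>
+    <property name="ats.aste.software.version" value="W10"/>
+    <property name="ats.aste.test.type" value="smoke"/>
+    <property name="ats.aste.email.list" value="first.last@example.com"/>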
+
+EUnit
+-----
+* The test framework is selected if there is a library ``eunit.lib`` in the ``.mmp`` file of a test component.
+* The following EUnit-specific properties are required in addition to those in table-1 and table-2.
+
+.. csv-table:: Table: ATS - EUnit properties
+ :header: "Property name", "Edit status", "Description"
+
+ ":hlm-p:`eunit.test.package`", "[allowed]", ":hlm-p:`eunit.test.package[documentation]`"
+ ":hlm-p:`eunitexerunner.flags`", "[allowed]", ":hlm-p:`eunitexerunner.flags[documentation]`"
+
+
+An example of setting up EUnit properties as in the above table:
+
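+A sketch only; the package name is a placeholder, while the flags shown are the documented defaults:
+
+.. code-block:: xml
+
+    <property name="eunit.test.package" value="eunit_testpackage.zip"/>
+    <property name="eunitexerunner.flags" value="/E S60AppEnv /R Off"/>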
+
+MTF
+---
+* The test framework is selected if there is a library ``testframeworkclient.lib`` in the ``.mmp`` file of a test component
+* There is no MTF specific configuration for Helium in addition to those in table-1 and table-2.
+
+
+QtTest
+------
+* The test framework is selected if there is a library ``QtTest.lib`` in the ``.mmp`` file of a test component
+* There are several ``.PKG`` files created after executing ``qmake``, but only one is selected, based on the target platform that is set. See the (:hlm-p:`ats.target.platform`) description in table-2.
+* Properties in table-1 and table-2 should also be configured.
+
+
+RTest
+-----
+* The test framework is selected if there is a library ``euser.lib`` and a comment ``//RTEST`` in the ``.mmp`` file of a test component.
+* There is no RTest specific configuration for Helium in addition to those in table-1 and table-2.
+
+
+STF
+---
+* The test framework is selected if there is ``ModuleName=TEFTESTMODULE`` in ``.ini`` file of a component.
+* There is no STF specific configuration for Helium in addition to those in table-1 and table-2.
+* To enable STF for ATS, set :hlm-p:`ats.stf.enabled` (see table-3). By default this is not enabled.
+
+
+STIF
+----
+* The test framework is selected if there is a library ``stiftestinterface.lib`` in the ``.mmp`` file of a test component
+* There is no STIF specific configuration for Helium in addition to those in table-1 and table-2.
+
+
+SUT
+---
+* The test framework is selected if there is a library ``symbianunittestfw.lib`` in the ``.mmp`` file of a test component
+* There is no SUT specific configuration for Helium in addition to those in table-1 and table-2.
+
+
+TEF
+---
+* The test framework is selected if there is a library ``testframeworkclient.lib`` in the ``.mmp`` file of a test component
+* There is no TEF specific configuration for Helium in addition to those in table-1 and table-2.
+
+
+TDriver
+-------
+* TDriver tests can be enabled by setting :hlm-p:`ats.tdriver.enabled` (see table-3).
+* TDriver Test Asset location should be known as a prerequisite.
+* The following TDriver-specific properties are required in addition to those in table-1.
+
+
+.. csv-table:: Table: ATS Ant properties
+ :header: "Property name", "Edit status", "Description"
+
+ ":hlm-p:`ats.tdriver.enabled`", "[must]", ":hlm-p:`ats.tdriver.enabled[documentation]`"
+ ":hlm-p:`tdriver.asset.location`", "[must]", ":hlm-p:`tdriver.asset.location[documentation]`"
+ ":hlm-p:`tdriver.test.profiles`", "[must]", ":hlm-p:`tdriver.test.profiles[documentation]`"
+ ":hlm-p:`tdriver.tdrunner.enabled`", "[must]", ":hlm-p:`tdriver.tdrunner.enabled[documentation]`"
+ ":hlm-p:`tdriver.test.timeout`", "[must]", ":hlm-p:`tdriver.test.timeout[documentation]`"
+ ":hlm-p:`tdriver.parameters`", "[must]", ":hlm-p:`tdriver.parameters[documentation]`"
+ ":hlm-p:`tdriver.sis.files`", "[must]", ":hlm-p:`tdriver.sis.files[documentation]`"
+ ":hlm-p:`tdriver.tdrunner.parameters`", "[must]", ":hlm-p:`tdriver.tdrunner.parameters[documentation]`"
+ ":hlm-p:`tdriver.template.file`", "[allowed]", ":hlm-p:`tdriver.template.file[documentation]`"
+
+
+An example of setting up TDriver properties:
+
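+All values below are placeholders (the asset location, profile name, parameter file and sis file paths are assumptions):
+
+.. code-block:: xml
+
+    <property name="ats.tdriver.enabled" value="true"/>
+    <property name="tdriver.asset.location" value="\\server\share\tdriver_asset"/>
+    <property name="tdriver.test.profiles" value="example_profile"/>
+    <property name="tdriver.tdrunner.enabled" value="true"/>
+    <property name="tdriver.test.timeout" value="1200"/>
+    <property name="tdriver.parameters" value="path/to/tdriver_parameters.xml"/>
+    <property name="tdriver.sis.files" value="path/to/application.sis"/>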
+
+* To execute the tests, :hlm-t:`tdriver-test` target should be called.
+* To create custom templates for TDriver, read `Instructions for creating TDriver custom template`_.
+
+
+.. _`Instructions for creating TDriver custom template`: tdriver_template_instructions.html
+
+
+
+Test Automation Features
+========================
+
+CTC (Code Coverage)
+-------------------
+
+* To enable CTC for ATS, set :hlm-p:`ats.ctc.enabled` (see table-3).
+* To compile components for CTC, see `configure CTC for SBS`_.
+
+.. _`configure CTC for SBS`: ../helium-antlib/sbsctc.html
+
+* Once ATS tests have finished, results for CTC will be shown in Diamonds.
+* The following are optional CTC properties:
+
+.. csv-table:: Table: ATS Ant properties
+ :header: "Property name", "Edit status", "Description"
+
+ "``ctc.instrument.type``", "[allowed]", "Sets the instrument type"
+ "``ctc.build.options``", "[allowed]", "Enables optional extra arguments for CTC, after importing a parent ant file."
+
+
+For example,
+
+.. code-block:: xml
+
+
+
+
+
+
+
+
+
+Or
+
+.. code-block:: xml
+
+
+
+
+
+
+See `more information on code coverage`_.
+
+<#if !(ant?keys?seq_contains("sf"))>
+.. _`more information on code coverage`: http://s60wiki.nokia.com/S60Wiki/CTC
+<#else>
+.. _`more information on code coverage`: http://developer.symbian.org/wiki/index.php/Testing_Guidelines_for_Package_Releases#Code_coverage
+#if>
+
+
+
+Customized test XML files
+-------------------------
+
+The user can customize the generated test.xml with files:
+
+* **preset_custom.xml** goes before first set
+* **postset_custom.xml** goes after last set
+* **precase_custom.xml** goes before first case
+* **postcase_custom.xml** goes after last case
+* **prestep_custom.xml** goes before first step
+* **poststep_custom.xml** goes after last step
+* **prerun_custom.xml** goes before first run or execute step
+* **postrun_custom.xml** goes after last run or execute step
+* **prepostaction.xml** goes before first postaction
+* **postpostaction.xml** goes after last postaction
+
+The files must be in the directory 'custom' under the 'tsrc' or 'group' folder to be processed.
+
+The files need to be proper XML snippets that fit their place. In case of an error, an error is logged and a comment is inserted into the generated XML file.
+
+A postaction section customization file (prepostaction.xml or postpostaction.xml) could look like this
+
+.. code-block:: xml
+
+
+ RunProcessAction
+
+
+
+
+
+
+The ``prestep_custom.xml`` file can be used to flash and install something custom.
+
+.. code-block:: xml
+
+
+ FileUploadTask
+
+
+
+
+
+
+
+
+
+
+And then the ``prerun_custom.xml`` can be used to execute a task.
+
+.. code-block:: xml
+
+
+ NonTestExecuteTask
+
+
+
+
+
+
+
+
+
+**Note:** The user is expected to check the generated test.xml manually, as there is no validation. Invalid XML input files will be disregarded and a comment will be inserted into the generated XML file.
+
+
+Custom templates/drops
+----------------------
+* If you need to send a static drop to ATS then you can call the target :hlm-t:`ats-custom-drop`.
+* An example template is in helium/tools/testing/ats/templates/ats4_naviengine_template.xml
+* Then set a property to your own template, as follows.
+
+.. code-block:: xml
+
+
+
+
+Overriding XML values
+---------------------
+* Set the property ``ats.config.file`` to the location of the config file.
+
+Example configuration:
+
+.. code-block:: xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Delta testing
+-------------
+
+* Delta testing can be enabled by setting :hlm-p:`ats.delta.enabled` to ``true`` (see table-3).
+* If enabled, only the ADOs changed during :hlm-t:`do-prep-work-area` are tested by ATS.
+
+Multiset support
+----------------
+* Enable the feature by setting property :hlm-p:`ats.multiset.enabled` to ``true``.
+* If enabled, a 'set' in test.xml, is used for each pkg file in a component, this allows tests to run in parallel on several devices.
+
+ROM Bootup Tests
+----------------
+* ROM images can be tested on ATS by executing the target ":hlm-t:`ats-bootup-test`". This feature is useful to test whether the created ROM images boot up a device or not.
+* To enable this feature, set the property ":hlm-p:`ats.bootuptest.enabled`" (see table-3).
+* In addition to enabling the feature, the properties in table-1 are also required.
+
+
+Single/Multiple test drops creation
+-----------------------------------
+* It is mentioned earlier in Step 1 that components can be grouped together.
+* During automation, separate TestDrops are created based on these groups.
+* This grouping can be ignored and a single test drop can be created by setting the property :hlm-p:`ats.singledrop.enabled` to ``true``. By default the value is ``false``. For example:
+
+
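+.. code-block:: xml
+
+    <property name="ats.singledrop.enabled" value="true"/>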
+
+Skip uploading test drops
+-------------------------
+* With this feature the ``ats-test`` target only creates a drop file and does not send the drop (or package) to the ATS server.
+* To use the feature, set the following property to ``false``.
+
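+Assuming the same ``ats.upload.enabled`` property that the ATS3 documentation uses for this purpose:
+
+.. code-block:: xml
+
+    <property name="ats.upload.enabled" value="false"/>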
+
+<#if !(ant?keys?seq_contains("sf"))>
+
+Support for multiple products (ROM images)
+------------------------------------------
+
+See: `Instructions for setting up multiple roms and executing specific tests`_.
+
+.. _`Instructions for setting up multiple roms and executing specific tests`: http://helium.nmp.nokia.com/doc/ido/romandtest.html
+
+
+#if>
+
+
+Testing with Winscw Emulator
+----------------------------
+* If enabled, the ``ats-test`` target creates a zip of the build area instead of images, for use by the emulator on the ATS server.
+* Set a property as follows.
+
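+.. code-block:: xml
+
+    <property name="ats.emulator.enable" value="true"/>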
+
+<#if !(ant?keys?seq_contains("sf"))>
+
+Tracing
+-------
+* Currently there isn't a single standard method of doing tracing in the Symbian platform.
+* Application, middleware, driver and kernel developers have used different methods for instrumenting their code. Due to the different methods used, it is inherently difficult to get a coherent overview of the whole platform when debugging and testing SW.
+* The current implementation of tracing in Helium is based on the instructions given `here`_.
+* Tracing can be enabled by setting :hlm-p:`ats.trace.enabled` to ``true`` (see table-3).
+
+.. _`here`: http://s60wiki.nokia.com/S60Wiki/Tracing
+
+#if>
+
+
+Troubleshooting TA
+==================
+
+.. csv-table:: Table: Trouble shooting test automation
+ :header: "Type", "Description", "Possible solution"
+
+ "Error", "'**' not found", "Either the PKG file does not exist or incorrect filename."
+ "Error", "No test modules found in '**'", "This error is raised when there is no test components available. Check that your components are in the SystemDefinition files, and that the filters are set accordingly to the test asset documentation and that the components actually exists in the asset."
+ "Error", "'**' - test source not found", "Path in the bld.inf file is either incorrect or the component does not exist."
+ "Error", "Not enough flash files: # defined, # needed", "Check property :hlm-p:`ats.flashfiles.minlimit`. Selected ROM images files # is lesser than the required no. of files. This error can also be eliminated by reducing the value of the property."
+ "Error", "'CPP failed: '' in: '**'", "Check the path and/or the file. There can be broken path in the file or mising directives and macros."
+ "Error", "** - test sets are empty", "missing/invalid deirectives and/or project macros. The .mmp file ca be missing."
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/stage_ats.rst.inc.ftl
--- a/buildframework/helium/doc/src/manual/stage_ats.rst.inc.ftl Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,506 +0,0 @@
-<#--
-============================================================================
-Name : stage_ats.rst.inc.ftl
-Part of : Helium
-
-Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
-All rights reserved.
-This component and the accompanying materials are made available
-under the terms of the License "Eclipse Public License v1.0"
-which accompanies this distribution, and is available
-at the URL "http://www.eclipse.org/legal/epl-v10.html".
-
-Initial Contributors:
-Nokia Corporation - initial contribution.
-
-Contributors:
-
-Description:
-
-============================================================================
--->
-
-.. index::
- single: ATS - STIF, TEF, RTEST, MTF and EUnit
-
-.. _`Stage-ATS-label`:
-
-Stage: ATS - STIF, TEF, RTEST, MTF and EUnit (also Qt)
-=======================================================
-
-ATS testing is the automatic testing of the phone code once it has been compiled and linked to create a ROM image.
-
-Explanation of the process for getting ATS (`STIF`_ and `EUnit`_) tests compiled and executed by Helium, through the use of the :hlm-t:`ats-test` target.
-
-http://developer.symbian.org/wiki/index.php/Symbian_Test_Tools
-
-<#if !(ant?keys?seq_contains("sf"))>
-.. _`STIF`: http://s60wiki.nokia.com/S60Wiki/STIF
-.. _`EUnit`: http://s60wiki.nokia.com/S60Wiki/EUnit
-#if>
-
-.. image:: ats.dot.png
-
-Prerequisites
-----------------
-
-* `Harmonized Test Interface (HTI)`_ needs to be compiled and into the image.
-* The reader is expected to already have a working ATS setup in which test cases can be executed. ATS server names,
- access rights and authentication etc. is supposed to be already taken care of.
-
-<#if !(ant?keys?seq_contains("sf"))>
-.. _`Harmonized Test Interface (HTI)`: http://s60wiki.nokia.com/S60Wiki/HTI
-<#else>
-.. _`Harmonized Test Interface (HTI)`: http://developer.symbian.org/wiki/index.php/HTI_Tool
-#if>
-
-Test source components
--------------------------
-
-Test source usually lives in a component's ``tsrc`` directory. Test source components are created like any other Symbian SW component;
-there is a ``group`` directory with a ``bld.inf`` file for building, ``.mmp`` files for defining the targets, and so on.
-
-The test generation code expects ``.pkg`` file in the ``group`` directory of test component to be compiled, to get the paths of the files
-(can be data, configuration, initialization, etc files) to be installed and where to install on the phone.
-
-
-Three STEPS to setup ATS with Helium
---------------------------------------
-
-**Step 1: Configure System Definition Files**
- If the tsrc directory structure meets the criteria defined in the `new API test automation guidelines`_, then test components
- should be included in the System Definition files.
-
-**System Definition Files supporting layers.sysdef.xml**
- **layers** in ``layers.sysdef.xml`` file and **configuration** in ``build.sysdef.xml`` file (`Structure of System Definition files version 1.4`_).
-
- <#if !(ant?keys?seq_contains("sf"))>
-.. _`new API test automation guidelines`: http://s60wiki.nokia.com/S60Wiki/Test_Asset_Guidelines
-.. _`Structure of System Definition files version 1.4`: http://delivery.nmp.nokia.com/trac/helium/wiki/SystemDefinitionFiles
-#if>
-
-A template of layer in layers.sysdef.xml for system definition files
-
-.. code-block:: xml
-
-
-
-
-
-
-
-
-
-
-
-* Layer name should end with **_test_layer**
-* Two standard names for ATS test layers are being used; ``unit_test_layer`` and ``api_test_layer``. Test components (the``unit`` tags)
- should be specified under these layers and grouped by ``module`` tag(s).
-* In the above, two modules means two drop files will be created; ``module`` may have one or more ``unit``
-* By using property ``exclude.test.layers``, complete layers can be excluded and the components inside that layer will not be included in the AtsDrop. This property is a comma (,) separated list
-
-**System Definition Files version 3.0 (SysDefs3)** (new Helium v.10.79)
- The `structure of System Definition files version 3.0`_ is different than previous versions of system definition files. In SysDefs3, package definition files are used for components specification. Instead of layers naming conventions, filters are used to identify test components and test types, for example: "test, unit_test, !api_test" etc.
-
-<#if !(ant?keys?seq_contains("sf"))>
-.. _`structure of System Definition files version 3.0`: http://wikis.in.nokia.com/view/SWManageabilityTeamWiki/PkgdefUse
-<#else>
-.. _`structure of System Definition files version 3.0`: sysdef3.rst
-#if>
-
-An example template for defining test components in a package definition file.
-
-.. code-block:: xml
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-* Filter "test" must be specified for every test component. If it is not specified, the component will not be considered as a test component.
-* / are now used to group test components, it work in the same way as ... in sysdef v1.4 works. The components having same group name are grouped together.
- Separate drop files are created for different groups. In the above example, if only 'test' is selected, then two drop files will be created, one with tc1 and the other one with tc2 and tc3.
-
-
-**Step 2: Configure ATS properties in build.xml**
-
-**(A)** Username and Password for the ATS should be set in the `.netrc file`_::
-
- machine ats login ats_user_name password ats_password
-
-Add the above line in the ``.netrc`` file and replace ``ats_user_name`` with your real ATS username and ``ats_password`` with ATS password.
-
-**(B)** The following properties are ATS dependent with their edit status
-
-* [must] - must be set by user
-* [recommended] - should be set by user but not mandatory
-* [allowed] - should **not** be set by user however, it is possible.
-
-.. csv-table:: ATS Ant properties
- :header: "Property name", "Edit status", "Description"
-
- ":hlm-p:`ats.server`", "[must]", "For example: ``4fix012345`` or ``catstresrv001.company.net:80``. Default server port is ``8080``, but it is not allowed between intra and Noklab. Because of this we need to define server port as 80. The host can be different depending on site and/or product."
- ":hlm-p:`ats.drop.location`", "[allowed]", "Server location (UNC path) to save the ATSDrop file, before sending to the ATS Server. For example: ``\\\\trwsem00\\some_folder\\``. In case, :hlm-p:`ats.script.type` is set to ``import``, ATS doesn't need to have access to :hlm-p:`ats.drop.location`, its value can be any local folder on build machine, for example ``c:/temp`` (no network share needed)."
- ":hlm-p:`ats.product.name`", "[must]", "Name of the product to be tested."
- ":hlm-p:`eunit.test.package`", "[allowed]", "The EUnit package name to be unzipped on the environment, for executing EUnit tests."
- ":hlm-p:`eunitexerunner.flags`", "[allowed]", "Flags for EUnit exerunner can be set by setting the value of this variable. The default flags are set to ``/E S60AppEnv /R Off``."
- ":hlm-p:`ats.email.list`", "[allowed]", "The property is needed if you want to get an email from ATS server after the tests are executed. There can be one to many semicolon-separated email addresses."
- ":hlm-p:`ats.report.type`", "[allowed]", "Value of the ats email report, for ATS4 set to 'no_attachment' so email size is reduced"
- ":hlm-p:`ats.flashfiles.minlimit`", "[allowed]", "Limit of minimum number of flash files to execute :hlm-t:`ats-test` target, otherwise ``ATSDrop.zip`` will not be generated. Default value is 2 files."
- ":hlm-p:`ats.plan.name`", "[allowed]", "Modify the plan name if you have understanding of ``test.xml`` file or leave it as it is. Default value is ``plan``."
- ":hlm-p:`ats.product.hwid`", "[allowed]", "Product HardWare ID (HWID) attached to ATS. By default the value of HWID is not set."
- ":hlm-p:`ats.script.type`", "[allowed]", "There are two types of ats script files to send drop to ATS server, ``runx`` and ``import``; only difference is that with ``import`` ATS doesn't have to have access rights to ``testdrop.zip`` file, as it is sent to the system over http and import doesn't need network shares. If that is not needed ``import`` should not be used. Default value is ``runx`` as ``import`` involves heavy processing on ATS server."
- ":hlm-p:`ats.target.platform`", "[allowed]", "Sets target platform for compiling test components. Default value is ``armv5 urel``."
- ":hlm-p:`ats.test.timeout`", "[allowed]", "To set test commands execution time limit on ATS server, in seconds. Default value is ``60``."
- ":hlm-p:`ats.testrun.name`", "[allowed]", "Modify the test-run name if you have understanding of ``test.xml`` file or leave it as it is. Default value is a string consist of build id, product name, major and minor versions."
- ":hlm-p:`ats.trace.enabled`", "[allowed]", "Should be ``true`` if tracing is needed during the tests running on ATS. Default value is ``false``, the values are case-sensitive. See http://s60wiki.nokia.com/S60Wiki/CATS/TraceTools."
- ":hlm-p:`ats.ctc.enabled`", "[allowed]", "Should be ``true`` if coverage measurement and dynamic analysis (CTC) tool support is to be used by ATS. Default value is ``false``. The values are case-sensitive."
- ":hlm-p:`ats.ctc.host`", "[allowed]", "CTC host, provided by CATS used to create coverage measurement reports. MON.sym files are copied to this location, for example ``10.0.0.1``. If not given, code coverage reports are not created"
- ":hlm-p:`ats.obey.pkgfiles.rule`", "[allowed]", "If the property is set to ``true``, then the only test components which will have PKG files, will be included into the ``test.xml`` as a test-set. Which means, even if there's a test component (executable) but there's no PKG file, it should not be considered as a test component and hence not included into the test.xml as a separate test. By default the property value is ``false``."
- "``reference.ats.flash.images``", "[allowed]", "Fileset for list of flash images (can be .fpsx, .C00, .V01 etc) It is recommended to set the fileset, default filset is given below which can be overwritten. set *dir=""* attribute of the filset to ``${r'$'}{build.output.dir}/variant_images`` if hlm-t:`variant-image-creation` target is being used."
- ":hlm-p:`tsrc.data.dir`", "[allowed]", "The default value is ``data`` and refers to the 'data' directory under 'tsrc' directory."
- ":hlm-p:`tsrc.path.list`", "[allowed]", "Contains list of the tsrc directories. Gets the list from system definition layer files. Assuming that the test components are defined already in te ``layers.sysdef.xml`` files to get compiled. Not recommended, but the property value can be set if there are no System Definition file(s), and tsrc directories paths to set manually."
- ":hlm-p:`ats.report.location`", "[allowed]", "Sets ATS reports store location. Default location is ``${r'$'}{publish.dir}/${r'$'}{publish.subdir}``."
- ":hlm-p:`ats.multiset.enabled`", "[allowed]", "Should be ``true`` so a set is used for each pkg file in a component, this allows tests to run in parallel on several devices."
- ":hlm-p:`ats.diamonds.signal`", "[allowed]", "Should be ``true`` so at end of the build diamonds is checked for test results and Helium fails if tests failed."
- ":hlm-p:`ats.delta.enabled`", "[allowed]", "Should be ``true`` so only ADOs changed during :hlm-t:`do-prep-work-area` are tested by ATS."
- ":hlm-p:`ats4.enabled`", "[allowed]", "Should be ``true`` if ATS4 is to be used."
- ":hlm-p:`ats.emulator.enable`", "[allowed]", "Should be ``true`` if ``WINSCW`` emulator is to be used."
- ":hlm-p:`ats.specific.pkg`", "[allowed]", "Text in name of PKG files to use eg. 'sanity' would only use xxxsanity.pkg files from components."
- ":hlm-p:`ats.singledrop.enabled`", "[allowed]", "If present and set to 'true', it will create one drop file, if set to any other value or not present it will create multiple drop files (defined by the sysdef file). This is to save traffic to the server."
- ":hlm-p:`ats.java.importer.enabled`", "[allowed]", "If set to 'true', for older uploader is used for ats3 which shows improved error message."
- ":hlm-p:`ats.test.filterset`", "[allowed]", "(new Helium v.10.79)Contains a name of test filterset (see example below). A filterset is used to select/unselect test components. The filter(s) is/are effective when the same filters are defined in the package definition file for component(s)."
-
-An example of setting up properties:
-
-.. code-block:: xml
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
- ...
-
- ...
-
-
-
-
-
-
-
-
-.. Note::
-
- Always declare *Properties* before and *filesets* after importing helium.ant.xml.
-
-**STEP 3: Call target ats-test**
-
-To execute the target, a property should be set(````).
-
-Then call :hlm-t:`ats-test`, which will create the ATSDrop.zip (test package).
-
-If property *ats.email.list* is set, an email (test report) will be sent when the tests are ready on ATS.
-
-CTC:
-----
-
-CTC code coverage measurements reports can be created as part of Test Automation process.
-
-1. Build the src using ``build_ctc`` configuration, which is in ``build.sysdef.xml`` file, to create ``MON.sym`` files. It means that a property ``sysdef.configurations.list`` should be modified either add or replace current build configuration with ``build_ctc``
-
-2. Set the property, ``ats.ctc.host``, as described above, for sending the ``MON.sym`` files to the network drive. *(Please contact ATS server administrator and ask for the value to set this property)*
-
-3. Enable CTC process by setting up property ``ats.ctc.enabled`` to "true"
-
-4. Test drops are sent to the ATS server, where, after executing tests ``ctcdata.txt`` files are created. ``ctcdata.txt`` and ``MON.sym`` files are then further processed to create code coverage reports.
-
-5. View or download the Code coverage reports by following the link provided in the ATS report email (sent after the tests are executed on ATS)
-
-*NOTE: After receiving the email notification, it may take a few minutes before the code coverage reports are available.*
-
-
-Qt Tests:
----------
-
-QtTest.lib is supported and the default harness is set to EUnit. If ``QtTest.lib`` is there in ``.mmp`` file, Helium sets the Harness to Eunit and ATS supported Qt steps are added to test.xml file
-
-In ``layers.sysdef.xml`` file, the layer name should end with "_test_layer" e.g. "qt_unit_test_layer".
-
-There are several ``.PKG`` files created after executing ``qmake``, but only one is selected based on which target platform is set. Please read the property (``ats.target.platform``) description above.
-
-.. _`Skip-Sending-AtsDrop-label`:
-
-Skip Sending AtsDrop to ATS
-----------------------------
-
-By setting property of ``ats.upload.enabled`` to ``false``, ``ats-test`` target only creates a drop file, and does not send the drop (or package) to ATS server.
-
-Customizing the test.xml in ATS
---------------------------------
-
-The user can customize the generated test.xml with files:
-
-* **preset_custom.xml** goes before first set
-* **postset_custom.xml** goes after last set
-* **precase_custom.xml** goes before first case
-* **postcase_custom.xml** goes after last case
-* **prestep_custom.xml** goes before first step
-* **poststep_custom.xml** goes after last step
-* **prerun_custom.xml** goes before first run or execute step
-* **postrun_custom.xml** goes after last run or execute step
-* **prepostaction.xml** goes before first postaction
-* **postpostaction.xml** goes after last postaction
-
-The files must be in the directory 'custom' under the 'tsrc' or 'group' folder to be processed.
-
-The files need to be proper XML snippets that fit to their place. In case of an error an error is logged and a comment inserted to the generated XML file.
-
-A postaction section customization file (prepostaction.xml or postpostaction.xml) could look like this
-
-.. code-block:: xml
-
-
- Pre PostAction from custom file
-
-
-
-
-
-
-
-The ``prestep_custom.xml`` can be used to flash and unstall something custom.
-
-.. code-block:: xml
-
-
-
- install
-
-
-
-
- ...
-
-
-
-And then the ``prerun_custom.xml`` can be used to start measuring.
-
-.. code-block:: xml
-
-
-
- execute
-
-
-
-
-
-
-
-
-
-**Note:** The users is expected to check the generated test.xml manually, as there is no validation. Invalid XML input files will be disregarded and a comment will be inserted to the generated XML file.
-
-Overriding Test xml values
---------------------------
-
-Set the property ``ats.config.file`` to the location of the config file.
-
-Example configuration:
-
-.. code-block:: xml
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-.. index::
- single: ATS - ASTE
-
-Stage: ATS - ASTE
-===================
-
-Explanation of the process for getting ATS `ASTE`_ tests compiled and executed by Helium, through the use of the :hlm-t:`ats-aste` target.
-
-<#if !(ant?keys?seq_contains("sf"))>
-.. _`ASTE`: http://s60wiki.nokia.com/S60Wiki/ASTE
-#if>
-
-Prerequisites
---------------
-
-* `Harmonized Test Interface (HTI)`_ needs to be compiled and into the image.
-* The reader is expected to already have a working ATS setup in which test cases can be executed. ATS server names, access rights and authentication etc. is supposed to be already taken care of.
-* `SW Test Asset`_ location and type of test should be known.
-
-<#if !(ant?keys?seq_contains("sf"))>
-.. _`Harmonized Test Interface (HTI)`: http://s60wiki.nokia.com/S60Wiki/HTI
-.. _`SW Test Asset`: http://s60wiki.nokia.com/S60Wiki/MC_SW_Test_Asset_documentation
-#if>
-
-Test source components
---------------------------
-
-Unlike STIF, EUnit etc tests, test source components (or ``tsrc`` structure) is not needed for `ASTE`_ tests.
-
-Two STEPS to setup ASTE with Helium
-------------------------------------
-
-**STEP 1: Configure ASTE properties in build.xml**
-
-**(A)** Username and Password for the ATS should be set in the `.netrc file`_
-
-.. code-block:: text
-
- machine ats login ats_user_name password ats_password
-
-Add the above line in the .netrc file and replace *ats_user_name* with your real ats username and "ats_password" with ats password.
-
-.. _`.netrc file`: configuring.html?highlight=netrc#passwords
-
-
-**(B)** The following properties are ASTE dependent with their edit status
-
-* [must] - must be set by user
-* [recommended] - should be set by user but not mandatory
-* [allowed] - should **not** be set by user however, it is possible.
-
-.. csv-table:: ATS Ant properties
- :header: "Property name", "Edit status", "Description"
-
- ":hlm-p:`ats.server`", "[must]", "For example: ``4fio00105`` or ``catstresrv001.company.net:80``. Default server port is ``8080``, but it is not allowed between intra and Noklab. Because of this we need to define server port as ``80``. The host can be different depending on site and/or product."
- ":hlm-p:`ats.drop.location`", "[must]", "Server location (UNC path) to save the ATSDrop file, before sending to the ATS. For example: ``\\\\trwsem00\\some_folder\\``. In case, ``ats.script.type`` is set to ``import``, ATS doesn't need to have access to :hlm-p:`ats.drop.location`, its value can be any local folder on build machine, for example ``c:/temp`` (no network share needed)."
- ":hlm-p:`ats.product.name`", "[must]", "Name of the product to be tested."
- ":hlm-p:`ats.aste.testasset.location`", "[must]", "Location of SW Test Assets, if the TestAsset is not packaged then it is first compressed to a ``.zip`` file. It should be a UNC path."
- ":hlm-p:`ats.aste.software.release`", "[must]", "Flash images releases, for example 'SPP 51.32'."
- ":hlm-p:`ats.aste.software.version`", "[must]", "Version of the software to be tested. For example: 'W810'"
- ":hlm-p:`ats.aste.email.list`", "[recommended]", "The property is needed if you want to get an email from ATS server after the tests are executed. There can be one to many semicolon(s) ";" separated email addresses."
- ":hlm-p:`ats.flashfiles.minlimit`", "[recommended]", "Limit of minimum number of flash files to execute ats-test target, otherwise ATSDrop.zip will not be generated. Default value is "2" files."
- ":hlm-p:`ats.aste.plan.name`", "[recommended]", "Modify the plan name if you have understanding of test.xml file or leave it as it is. Default value is "plan"."
- ":hlm-p:`ats.product.hwid`", "[recommended]", "Product HardWare ID (HWID) attached to ATS. By default the value of HWID is not set."
- ":hlm-p:`ats.test.timeout`", "[recommended]", "To set test commands execution time limit on ATS server, in seconds. Default value is '60'."
- ":hlm-p:`ats.aste.testrun.name`", "[recommended]", "Modify the test-run name if you have understanding of ``test.xml`` file or leave it as it is. Default value is a string consists of build id, product name, major and minor versions."
- ":hlm-p:`ats.aste.test.type`", "[recommended]", "Type of test to run. Default is 'smoke'."
- ":hlm-p:`ats.aste.testasset.caseids`", "[recommended]", "These are the cases that which tests should be run from the TestAsset. For example, value can be set as ``100,101,102,103,105,106,``. A comma is needed to separate case IDs"
- ":hlm-p:`ats.aste.language`", "[recommended]", "Variant Language to be tested. Default is 'English'"
- "``reference.ats.flash.images``", "[recommended]", "Fileset for list of flash images (can be .fpsx, .C00, .V01 etc) It is recommended to set the fileset, default filset is given below which can be overwritten. set *dir=\"\"* attribute of the filset to ``${r'$'}{build.output.dir}/variant_images`` if :hlm-t:`variant-image-creation` target is being used."
-
-
-An example of setting up properties:
-
-.. code-block:: xml
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
- ...
-
- ...
-
-
-
-
-
-
-
-
-*PLEASE NOTE:* Always declare *Properties* before and *filesets* after importing helium.ant.xml.
-
-**STEP 2: Call target ats-aste**
-
-To execute the target, a property should be set(````).
-
-Then call :hlm-t:`ats-aste`, which will create the ATSDrop.zip (test package).
-
-If property ``ats.aste.email.list`` is set, an email (test report) will be sent when the tests are ready on ATS/ASTE.
-
-
-Skip Sending AtsDrop to ATS
-------------------------------
-
-click :ref:`Skip-Sending-AtsDrop-label`:
-
-Stage: ATS - Custom Drop
-========================
-
-If you need to send a static drop to ATS then you can call the target :hlm-t:`ats-custom-drop` and set a property to your own template.
-
-A example template is in helium/tools/testing/ats/templates/ats4_naviengine_template.xml
-
-.. code-block:: xml
-
-
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/stage_ats_old.rst.ftl
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/doc/src/manual/stage_ats_old.rst.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,563 @@
+.. ============================================================================
+ Name : stage_ats_old.rst.ftl
+ Part of : Helium
+
+ Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+ All rights reserved.
+ This component and the accompanying materials are made available
+ under the terms of the License "Eclipse Public License v1.0"
+ which accompanies this distribution, and is available
+ at the URL "http://www.eclipse.org/legal/epl-v10.html".
+
+ Initial Contributors:
+ Nokia Corporation - initial contribution.
+
+ Contributors:
+
+ Description:
+
+ ============================================================================
+
+.. index::
+ module: TestingOld
+
+
+===========================
+Testing (ATS3/Old Document)
+===========================
+
+This is an old version of the test automation document, for **Helium users using ATS3**.
+
+**ATS4 users** should read `Helium Test Automation User Guide`_ (revised).
+
+.. _`Helium Test Automation User Guide`: stage_ats.html
+
+.. contents::
+
+
+Stage: ATS - STIF, TEF, RTEST, MTF, SUT and EUnit (also Qt)
+===========================================================
+
+ATS testing is the automatic testing of the phone code once it has been compiled and linked to create a ROM image.
+
+This section explains the process for getting ATS (`STIF`_ and `EUnit`_) tests compiled and executed by Helium, through the use of the :hlm-t:`ats-test` target.
+
+http://developer.symbian.org/wiki/index.php/Symbian_Test_Tools
+
+<#if !(ant?keys?seq_contains("sf"))>
+.. _`STIF`: http://s60wiki.nokia.com/S60Wiki/STIF
+.. _`EUnit`: http://s60wiki.nokia.com/S60Wiki/EUnit
+</#if>
+
+.. image:: ats.dot.png
+
+Prerequisites
+-------------
+
+* `Harmonized Test Interface (HTI)`_ needs to be compiled and included in the image.
+* The reader is expected to already have a working ATS setup in which test cases can be executed. ATS server names,
+  access rights, authentication etc. are assumed to be already taken care of.
+
+<#if !(ant?keys?seq_contains("sf"))>
+.. _`Harmonized Test Interface (HTI)`: http://s60wiki.nokia.com/S60Wiki/HTI
+<#else>
+.. _`Harmonized Test Interface (HTI)`: http://developer.symbian.org/wiki/index.php/HTI_Tool
+</#if>
+
+Test source components
+----------------------
+
+Test source usually lives in a component's ``tsrc`` directory. Test source components are created like any other Symbian SW component;
+there is a ``group`` directory with a ``bld.inf`` file for building, ``.mmp`` files for defining the targets, and so on.
+
+The test generation code expects a ``.pkg`` file in the ``group`` directory of the test component being compiled; it is used to get the paths of the files
+(data, configuration, initialization, etc. files) to be installed and the locations to install them to on the phone.
+
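+A minimal sketch of the kind of install lines such a ``.pkg`` file contains (the paths and file names here are placeholders only):
+
+.. code-block:: text
+
+   ; install the test data file and the test module binary
+   "..\data\testdata.cfg"-"e:\testing\data\testdata.cfg"
+   "..\..\..\epoc32\release\armv5\urel\mytestmodule.dll"-"c:\sys\bin\mytestmodule.dll"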
+
+Three STEPS to setup ATS with Helium
+------------------------------------
+
+**Step 1: Configure System Definition Files**
+ If the tsrc directory structure meets the criteria defined in the `new API test automation guidelines`_, then test components
+ should be included in the System Definition files.
+
+**System Definition Files supporting layers.sysdef.xml**
+ Test components are defined as **layers** in the ``layers.sysdef.xml`` file and as a **configuration** in the ``build.sysdef.xml`` file (`Structure of System Definition files version 1.4`_).
+
+ <#if !(ant?keys?seq_contains("sf"))>
+.. _`new API test automation guidelines`: http://s60wiki.nokia.com/S60Wiki/Test_Asset_Guidelines
+.. _`Structure of System Definition files version 1.4`: http://delivery.nmp.nokia.com/trac/helium/wiki/SystemDefinitionFiles
+</#if>
+
+A template of a test layer in ``layers.sysdef.xml`` could look like this (an illustrative sketch; module and unit names and paths are placeholders):
+
+.. code-block:: xml
+
+   <!-- Illustrative sketch only: module/unit names and bldFile paths are placeholders. -->
+   <layer name="unit_test_layer">
+       <module name="xxx_tests_one">
+           <unit unitID="xxx.tc1" name="tc1" bldFile="xxx/tsrc/tc1/group" mrp=""/>
+       </module>
+       <module name="xxx_tests_two">
+           <unit unitID="xxx.tc2" name="tc2" bldFile="xxx/tsrc/tc2/group" mrp=""/>
+       </module>
+   </layer>
+
+* The layer name should end with **_test_layer**.
+* Two standard names for ATS test layers are used: ``unit_test_layer`` and ``api_test_layer``. Test components (the ``unit`` tags)
+  should be specified under these layers and grouped by ``module`` tag(s).
+* In the above, two modules mean that two drop files will be created; a ``module`` may have one or more ``unit`` elements.
+* By using the property ``exclude.test.layers``, complete layers can be excluded so that the components inside them are not included in the ATSDrop. This property is a comma-separated list (see the example below).
+
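+For example, excluding complete test layers could look like this (the layer names are placeholders):
+
+.. code-block:: xml
+
+   <property name="exclude.test.layers" value="api_test_layer,xxx_unit_test_layer"/>
+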
+**System Definition Files version 3.0 (SysDefs3)** (new in Helium v.10.79)
+ The `structure of System Definition files version 3.0`_ is different from previous versions of the system definition files. In SysDefs3, package definition files are used for component specification. Instead of layer naming conventions, filters are used to identify test components and test types, for example: "test, unit_test, !api_test", etc.
+
+<#if !(ant?keys?seq_contains("sf"))>
+.. _`structure of System Definition files version 3.0`: http://wikis.in.nokia.com/view/SWManageabilityTeamWiki/PkgdefUse
+<#else>
+.. _`structure of System Definition files version 3.0`: sysdef3.html
+</#if>
+
+An example template for defining test components in a package definition file (an illustrative sketch; IDs, names and paths are placeholders):
+
+.. code-block:: xml
+
+   <!-- Illustrative sketch only: IDs, names and bldFile paths are placeholders,
+        and the grouping mechanism shown here (separate collections) is an assumption. -->
+   <SystemDefinition schema="3.0.0">
+       <package id="xxx" name="xxx">
+           <collection id="xxx_tests_one" name="xxx Tests One">
+               <component id="tc1" name="tc1" purpose="development" filter="test,unit_test">
+                   <unit bldFile="xxx/tsrc/tc1/group"/>
+               </component>
+           </collection>
+           <collection id="xxx_tests_two" name="xxx Tests Two">
+               <component id="tc2" name="tc2" purpose="development" filter="test,unit_test">
+                   <unit bldFile="xxx/tsrc/tc2/group"/>
+               </component>
+               <component id="tc3" name="tc3" purpose="development" filter="test,unit_test">
+                   <unit bldFile="xxx/tsrc/tc3/group"/>
+               </component>
+           </collection>
+       </package>
+   </SystemDefinition>
+
+* Filter "test" must be specified for every test component. If it is not specified, the component will not be considered as a test component.
+* / are now used to group test components, it work in the same way as ... in sysdef v1.4 works. The components having same group name are grouped together.
+ Separate drop files are created for different groups. In the above example, if only 'test' is selected, then two drop files will be created, one with tc1 and the other one with tc2 and tc3.
+
+
+**Step 2: Configure ATS properties in build.xml**
+
+**(A)** Username and Password for the ATS should be set in the `.netrc file`_::
+
+ machine ats login ats_user_name password ats_password
+
+Add the above line to the ``.netrc`` file and replace ``ats_user_name`` with your real ATS username and ``ats_password`` with your ATS password.
+
+**(B)** The following properties are ATS dependent with their edit status
+
+* [must] - must be set by user
+* [recommended] - should be set by user but not mandatory
+* [allowed] - should **not** be set by user however, it is possible.
+
+.. csv-table:: ATS Ant properties
+ :header: "Property name", "Edit status", "Description"
+
+ ":hlm-p:`ats.server`", "[must]", "For example: ``4fix012345`` or ``catstresrv001.company.net:80``. The default server port is ``8080``, but it is not allowed between the intranet and Noklab, so the server port needs to be defined as ``80``. The host can be different depending on site and/or product."
+ ":hlm-p:`ats.drop.location`", "[allowed]", "Server location (UNC path) to save the ATSDrop file, before sending to the ATS server. For example: ``\\\\trwsem00\\some_folder\\``. If :hlm-p:`ats.script.type` is set to ``import``, ATS does not need access to :hlm-p:`ats.drop.location`; its value can be any local folder on the build machine, for example ``c:/temp`` (no network share needed)."
+ ":hlm-p:`ats.product.name`", "[must]", "Name of the product to be tested."
+ ":hlm-p:`eunit.test.package`", "[allowed]", "The EUnit package name to be unzipped on the environment, for executing EUnit tests."
+ ":hlm-p:`eunitexerunner.flags`", "[allowed]", "Flags for EUnit exerunner can be set by setting the value of this variable. The default flags are set to ``/E S60AppEnv /R Off``."
+ ":hlm-p:`ats.email.list`", "[allowed]", "The property is needed if you want to get an email from ATS server after the tests are executed. There can be one to many semicolon-separated email addresses."
+ ":hlm-p:`ats.report.type`", "[allowed]", "Type of the ATS email report; for ATS4 set it to 'no_attachment' so that the email size is reduced."
+ ":hlm-p:`ats.flashfiles.minlimit`", "[allowed]", "Limit of minimum number of flash files to execute :hlm-t:`ats-test` target, otherwise ``ATSDrop.zip`` will not be generated. Default value is 2 files."
+ ":hlm-p:`ats.plan.name`", "[allowed]", "Modify the plan name if you understand the ``test.xml`` file, or leave it as it is. Default value is ``plan``."
+ ":hlm-p:`ats.product.hwid`", "[allowed]", "Product HardWare ID (HWID) attached to ATS. By default the value of HWID is not set."
+ ":hlm-p:`ats.script.type`", "[allowed]", "There are two types of ATS script files for sending a drop to the ATS server, ``runx`` and ``import``; the only difference is that with ``import`` ATS does not need access rights to the ``testdrop.zip`` file, as it is sent to the system over HTTP and no network shares are needed. If that is not needed, ``import`` should not be used. Default value is ``runx``, as ``import`` involves heavy processing on the ATS server."
+ ":hlm-p:`ats.target.platform`", "[allowed]", "Sets target platform for compiling test components. Default value is ``armv5 urel``."
+ ":hlm-p:`ats.test.timeout`", "[allowed]", "To set test commands execution time limit on ATS server, in seconds. Default value is ``60``."
+ ":hlm-p:`ats.testrun.name`", "[allowed]", "Modify the test-run name if you understand the ``test.xml`` file, or leave it as it is. Default value is a string consisting of the build id, product name, and major and minor versions."
+ ":hlm-p:`ats.trace.enabled`", "[allowed]", "Should be ``true`` if tracing is needed during the tests running on ATS. Default value is ``false``, the values are case-sensitive. See http://s60wiki.nokia.com/S60Wiki/CATS/TraceTools."
+ ":hlm-p:`ats.ctc.enabled`", "[allowed]", "Should be ``true`` if coverage measurement and dynamic analysis (CTC) tool support is to be used by ATS. Default value is ``false``. The values are case-sensitive."
+ ":hlm-p:`ats.ctc.host`", "[allowed]", "ATS3 only. CTC host, provided by CATS, used to create coverage measurement reports. ``MON.sym`` files are copied to this location, for example ``10.0.0.1``. If not given, code coverage reports are not created."
+ ":hlm-p:`ats.obey.pkgfiles.rule`", "[allowed]", "If the property is set to ``true``, then only test components that have PKG files will be included in the ``test.xml`` as a test set. This means that a test component (executable) without a PKG file is not considered a test component and is not included in the test.xml as a separate test. By default the property value is ``false``."
+ ":hlm-p:`tsrc.data.dir`", "[allowed]", "The default value is ``data`` and refers to the 'data' directory under 'tsrc' directory."
+ ":hlm-p:`tsrc.path.list`", "[allowed]", "Contains a list of the tsrc directories. The list is taken from the system definition layer files, assuming that the test components are already defined in the ``layers.sysdef.xml`` files so that they get compiled. Not recommended, but the property value can be set manually if there are no System Definition file(s) and the tsrc directory paths have to be set by hand."
+ ":hlm-p:`ats.report.location`", "[allowed]", "Sets ATS reports store location. Default location is ``${r'$'}{publish.dir}/${r'$'}{publish.subdir}``."
+ ":hlm-p:`ats.multiset.enabled`", "[allowed]", "Should be ``true`` so that a set is used for each PKG file in a component; this allows tests to run in parallel on several devices."
+ ":hlm-p:`ats.diamonds.signal`", "[allowed]", "Should be ``true`` so that at the end of the build Diamonds is checked for test results and Helium fails if any tests failed."
+ ":hlm-p:`ats.delta.enabled`", "[allowed]", "Should be ``true`` so only ADOs changed during :hlm-t:`do-prep-work-area` are tested by ATS."
+ ":hlm-p:`ats4.enabled`", "[allowed]", "Should be ``true`` if ATS4 is to be used."
+ ":hlm-p:`ats.emulator.enable`", "[allowed]", "Should be ``true`` if ``WINSCW`` emulator is to be used."
+ ":hlm-p:`ats.specific.pkg`", "[allowed]", "Text that must appear in the name of the PKG files to use, e.g. 'sanity' would only use xxxsanity.pkg files from components."
+ ":hlm-p:`ats.singledrop.enabled`", "[allowed]", "If present and set to 'true', one drop file is created; if set to any other value or not present, multiple drop files are created (as defined by the sysdef file). This is to save traffic to the server."
+ ":hlm-p:`ats.java.importer.enabled`", "[allowed]", "If set to 'true', the older Java importer is used for ATS3, which shows improved error messages."
+ ":hlm-p:`ats.test.filterset`", "[allowed]", "(New in Helium v.10.79.) Contains the name of a test filter set (see example below). A filter set is used to select/unselect test components. The filters are effective when the same filters are defined in the package definition file for the component(s)."
+
+An example of setting up properties (an illustrative sketch; the values are placeholders for your own ATS setup):
+
+.. code-block:: xml
+
+   <project name="my_ats_build">
+       <!-- Illustrative sketch only: server, paths and addresses are placeholders
+            for your own ATS setup. -->
+       <!-- Properties are declared before importing helium.ant.xml... -->
+       <property name="ats.server" value="catstresrv001.company.net:80"/>
+       <property name="ats.product.name" value="PRODUCT"/>
+       <property name="ats.drop.location" value="\\server\share\ats\"/>
+       <property name="ats.email.list" value="user1@example.com;user2@example.com"/>
+       <property name="ats.test.timeout" value="120"/>
+       <property name="ats.ctc.enabled" value="false"/>
+
+       <import file="${r'$'}{helium.dir}/helium.ant.xml"/>
+
+       <!-- ...and filesets after the import. -->
+       <fileset id="reference.ats.flash.images" dir="${r'$'}{release.images.dir}">
+           <include name="**/*.fpsx"/>
+           <include name="**/*.C00"/>
+       </fileset>
+   </project>
+
+.. Note::
+
+ Always declare *Properties* before and *filesets* after importing helium.ant.xml.
+
+**STEP 3: Call target ats-test**
+
+To execute the target, a property should be set (````).
+
+Then call :hlm-t:`ats-test`, which will create the ATSDrop.zip (test package).
+
+If property *ats.email.list* is set, an email (test report) will be sent when the tests are ready on ATS.
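+
+For example, an illustrative invocation (the build number and drive letter are placeholders):
+
+.. code-block:: text
+
+   hlm -Dbuild.number=001 -Dbuild.drive=z: ats-test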
+
+CTC
+---
+
+* To enable CTC for ATS, set :hlm-p:`ats.ctc.enabled` to ``true``, for example:
+
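+.. code-block:: xml
+
+   <property name="ats.ctc.enabled" value="true"/>
+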
+For ATS3 only, set the ftp hostname for the ATS server:
+
+.. code-block:: xml
+
+   <!-- Illustrative: ask your ATS server administrator for the real host. -->
+   <property name="ats.ctc.host" value="10.0.0.1"/>
+
+* To compile components for CTC see `configure CTC for SBS`_
+
+.. _`configure CTC for SBS`: ../helium-antlib/sbsctc.html
+
+
+* Once the ATS tests have finished, the results for CTC will be shown in Diamonds.
+* The following are optional CTC properties:
+
+.. csv-table:: Table: ATS Ant properties
+ :header: "Property name", "Edit status", "Description"
+
+ "``ctc.instrument.type``", "[allowed]", "Sets the instrument type"
+ "``ctc.build.options``", "[allowed]", "Enables optional extra arguments for CTC, after importing a parent ant file."
+
+
+For example:
+
+.. code-block:: xml
+
+   <!-- An illustrative sketch: the instrumentation type shown here is an assumption;
+        see the sbsctc manual and the code coverage link below for the available options. -->
+   <property name="ctc.instrument.type" value="m"/>
+
+
+See `more information on code coverage`_
+
+<#if !(ant?keys?seq_contains("sf"))>
+.. _`more information on code coverage`: http://s60wiki.nokia.com/S60Wiki/CTC
+<#else>
+.. _`more information on code coverage`: http://developer.symbian.org/wiki/index.php/Testing_Guidelines_for_Package_Releases#Code_coverage
+</#if>
+
+
+
+
+
+
+
+Qt Tests
+--------
+
+``QtTest.lib`` is supported and the default harness is set to EUnit. If ``QtTest.lib`` is listed in the ``.mmp`` file, Helium sets the harness to EUnit and the ATS-supported Qt steps are added to the ``test.xml`` file.
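+
+For example, the relevant line in a test component's ``.mmp`` file (a minimal sketch):
+
+.. code-block:: text
+
+   LIBRARY QtTest.lib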
+
+In ``layers.sysdef.xml`` file, the layer name should end with "_test_layer" e.g. "qt_unit_test_layer".
+
+There are several ``.PKG`` files created after executing ``qmake``, but only one is selected based on which target platform is set. Please read the property (:hlm-p:`ats.target.platform`) description above.
+
+.. _`Skip-Sending-AtsDrop-label`:
+
+Skip Sending AtsDrop to ATS
+---------------------------
+
+By setting the property :hlm-p:`ats.upload.enabled` to ``false``, the ``ats-test`` target only creates a drop file and does not send the drop (or package) to the ATS server.
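+
+For example, in the build file (or on the command line with ``-Dats.upload.enabled=false``):
+
+.. code-block:: xml
+
+   <property name="ats.upload.enabled" value="false"/>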
+
+Choosing images to send to ATS
+------------------------------
+
+Since Helium 10, images are picked up using :hlm-p:`ats.product.name` and the iMaker ``iconfig.xml`` files. ``release.images.dir`` is searched for ``iconfig.xml`` files, and the ones where the product name is part of :hlm-p:`ats.product.name` are used.
+
+You should only build the images for the products you want to include in ATS, e.g.
+
+.. code-block:: xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+For older products where there is no ``iconfig.xml``, ``reference.ats.flash.images`` is used:
+
+.. code-block:: xml
+
+   <!-- An illustrative sketch: adjust the dir attribute and include patterns to your image locations. -->
+   <fileset id="reference.ats.flash.images" dir="${r'$'}{release.images.dir}">
+       <include name="**/*.fpsx"/>
+       <include name="**/*.C00"/>
+   </fileset>
+
+Customizing the test.xml in ATS
+-------------------------------
+
+The user can customize the generated test.xml with files:
+
+* **preset_custom.xml** goes before first set
+* **postset_custom.xml** goes after last set
+* **precase_custom.xml** goes before first case
+* **postcase_custom.xml** goes after last case
+* **prestep_custom.xml** goes before first step
+* **poststep_custom.xml** goes after last step
+* **prerun_custom.xml** goes before first run or execute step
+* **postrun_custom.xml** goes after last run or execute step
+* **prepostaction.xml** goes before first postaction
+* **postpostaction.xml** goes after last postaction
+
+The files must be in the directory 'custom' under the 'tsrc' or 'group' folder to be processed.
+
+The files need to be proper XML snippets that fit their place. In case of an error, the error is logged and a comment is inserted into the generated XML file.
+
+A postaction section customization file (prepostaction.xml or postpostaction.xml) could look like this:
+
+.. code-block:: xml
+
+
+ Pre PostAction from custom file
+
+
+
+
+
+
+
+The ``prestep_custom.xml`` can be used to flash and install something custom.
+
+.. code-block:: xml
+
+
+
+ install
+
+
+
+
+ ...
+
+
+
+And then the ``prerun_custom.xml`` can be used to start measuring.
+
+.. code-block:: xml
+
+
+
+ execute
+
+
+
+
+
+
+
+
+
+**Note:** The user is expected to check the generated test.xml manually, as there is no validation. Invalid XML input files will be disregarded and a comment will be inserted into the generated XML file.
+
+Overriding Test xml values
+--------------------------
+
+Set the property ``ats.config.file`` to the location of the config file.
+
+Example configuration:
+
+.. code-block:: xml
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+.. index::
+ single: ATS - ASTE
+
+Stage: ATS - ASTE
+=================
+
+This section explains the process for getting ATS `ASTE`_ tests compiled and executed by Helium, through the use of the :hlm-t:`ats-aste` target.
+
+<#if !(ant?keys?seq_contains("sf"))>
+.. _`ASTE`: http://s60wiki.nokia.com/S60Wiki/ASTE
+</#if>
+
+Prerequisites
+-------------
+
+* `Harmonized Test Interface (HTI)`_ needs to be compiled and included in the image.
+* The reader is expected to already have a working ATS setup in which test cases can be executed. ATS server names, access rights, authentication etc. are assumed to be already taken care of.
+* `SW Test Asset`_ location and type of test should be known.
+
+<#if !(ant?keys?seq_contains("sf"))>
+.. _`Harmonized Test Interface (HTI)`: http://s60wiki.nokia.com/S60Wiki/HTI
+.. _`SW Test Asset`: http://s60wiki.nokia.com/S60Wiki/MC_SW_Test_Asset_documentation
+</#if>
+
+Test source components
+----------------------
+
+Unlike STIF, EUnit etc. tests, test source components (or the ``tsrc`` structure) are not needed for `ASTE`_ tests.
+
+Two STEPS to setup ASTE with Helium
+-----------------------------------
+
+**STEP 1: Configure ASTE properties in build.xml**
+
+**(A)** Username and Password for the ATS should be set in the `.netrc file`_
+
+.. code-block:: text
+
+ machine ats login ats_user_name password ats_password
+
+Add the above line to the .netrc file and replace *ats_user_name* with your real ATS username and *ats_password* with your ATS password.
+
+.. _`.netrc file`: configuring.html?highlight=netrc#passwords
+
+
+**(B)** The following properties are ASTE dependent with their edit status
+
+* [must] - must be set by user
+* [recommended] - should be set by user but not mandatory
+* [allowed] - should **not** be set by user however, it is possible.
+
+.. csv-table:: ATS Ant properties
+ :header: "Property name", "Edit status", "Description"
+
+ ":hlm-p:`ats.server`", "[must]", "For example: ``4fio00105`` or ``catstresrv001.company.net:80``. The default server port is ``8080``, but it is not allowed between the intranet and Noklab, so the server port needs to be defined as ``80``. The host can be different depending on site and/or product."
+ ":hlm-p:`ats.drop.location`", "[must]", "Server location (UNC path) to save the ATSDrop file, before sending to the ATS. For example: ``\\\\trwsem00\\some_folder\\``. If ``ats.script.type`` is set to ``import``, ATS does not need access to :hlm-p:`ats.drop.location`; its value can be any local folder on the build machine, for example ``c:/temp`` (no network share needed)."
+ ":hlm-p:`ats.product.name`", "[must]", "Name of the product to be tested."
+ ":hlm-p:`ats.aste.testasset.location`", "[must]", "Location of the SW Test Assets; if the TestAsset is not packaged, it is first compressed to a ``.zip`` file. It should be a UNC path."
+ ":hlm-p:`ats.aste.software.release`", "[must]", "Flash images releases, for example 'SPP 51.32'."
+ ":hlm-p:`ats.aste.software.version`", "[must]", "Version of the software to be tested. For example: 'W810'"
+ ":hlm-p:`ats.aste.email.list`", "[recommended]", "The property is needed if you want to get an email from the ATS server after the tests are executed. There can be one or many semicolon-separated email addresses."
+ ":hlm-p:`ats.flashfiles.minlimit`", "[recommended]", "Minimum number of flash files required to execute the ats-test target, otherwise ATSDrop.zip will not be generated. Default value is 2 files."
+ ":hlm-p:`ats.aste.plan.name`", "[recommended]", "Modify the plan name if you understand the test.xml file, or leave it as it is. Default value is ``plan``."
+ ":hlm-p:`ats.product.hwid`", "[recommended]", "Product HardWare ID (HWID) attached to ATS. By default the value of HWID is not set."
+ ":hlm-p:`ats.test.timeout`", "[recommended]", "To set test commands execution time limit on ATS server, in seconds. Default value is '60'."
+ ":hlm-p:`ats.aste.testrun.name`", "[recommended]", "Modify the test-run name if you understand the ``test.xml`` file, or leave it as it is. Default value is a string consisting of the build id, product name, and major and minor versions."
+ ":hlm-p:`ats.aste.test.type`", "[recommended]", "Type of test to run. Default is 'smoke'."
+ ":hlm-p:`ats.aste.testasset.caseids`", "[recommended]", "The IDs of the test cases to be run from the TestAsset. For example, the value can be set as ``100,101,102,103,105,106,``. A comma is needed to separate case IDs."
+ ":hlm-p:`ats.aste.language`", "[recommended]", "Variant Language to be tested. Default is 'English'"
+
+
+An example of setting up properties (an illustrative sketch; the values are placeholders for your own ATS/ASTE setup):
+
+.. code-block:: xml
+
+   <project name="my_aste_build">
+       <!-- Illustrative sketch only: server, paths, releases and versions are placeholders. -->
+       <property name="ats.server" value="catstresrv001.company.net:80"/>
+       <property name="ats.product.name" value="PRODUCT"/>
+       <property name="ats.drop.location" value="\\server\share\aste\"/>
+       <property name="ats.aste.testasset.location" value="\\server\share\TestAsset\"/>
+       <property name="ats.aste.software.release" value="SPP 51.32"/>
+       <property name="ats.aste.software.version" value="W810"/>
+       <property name="ats.aste.email.list" value="user1@example.com;user2@example.com"/>
+
+       <import file="${r'$'}{helium.dir}/helium.ant.xml"/>
+
+       <fileset id="reference.ats.flash.images" dir="${r'$'}{release.images.dir}">
+           <include name="**/*.fpsx"/>
+       </fileset>
+   </project>
+
+
+*PLEASE NOTE:* Always declare *Properties* before and *filesets* after importing helium.ant.xml.
+
+**STEP 2: Call target ats-aste**
+
+To execute the target, a property should be set (````).
+
+Then call :hlm-t:`ats-aste`, which will create the ATSDrop.zip (test package).
+
+If property ``ats.aste.email.list`` is set, an email (test report) will be sent when the tests are ready on ATS/ASTE.
+
+
+Skip Sending AtsDrop to ATS
+---------------------------
+
+See :ref:`Skip-Sending-AtsDrop-label`.
+
+
+
+
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/stage_final.rst.inc.ftl
--- a/buildframework/helium/doc/src/manual/stage_final.rst.inc.ftl Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,64 +0,0 @@
-<#--
-============================================================================
-Name : stage_final.rst.inc.ftl
-Part of : Helium
-
-Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
-All rights reserved.
-This component and the accompanying materials are made available
-under the terms of the License "Eclipse Public License v1.0"
-which accompanies this distribution, and is available
-at the URL "http://www.eclipse.org/legal/epl-v10.html".
-
-Initial Contributors:
-Nokia Corporation - initial contribution.
-
-Contributors:
-
-Description:
-
-============================================================================
--->
-
-.. index::
- single: Stage - Final operations
-
-Stage: Final operations
-=======================
-
-Final operation are steps which could happen at the workflow completion.
-
-
-Running a target at build completion
-------------------------------------
-
-Helium offers the possibility to run a final target despite any error which could occur during the build.
-The configuration of the target is done using the **hlm.final.target** property.
-
-e.g:
-::
-
-
-
-
-Running action on failure
--------------------------
-
-The signaling framework will automatically run all signalExceptionConfig in case of Ant failure at the
-end of the build.
-
-This example shows how simple task can be run in case of failure:
-::
-
-
-
-
- Signal: ${r'$'}{signal.name}
- Message: ${r'$'}{signal.message}
-
-
-
-
-
-
-
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/stage_matti.rst.inc.ftl
--- a/buildframework/helium/doc/src/manual/stage_matti.rst.inc.ftl Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,80 +0,0 @@
-<#--
-============================================================================
-Name : stage_matti.rst.inc.ftl
-Part of : Helium
-
-Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
-All rights reserved.
-This component and the accompanying materials are made available
-under the terms of the License "Eclipse Public License v1.0"
-which accompanies this distribution, and is available
-at the URL "http://www.eclipse.org/legal/epl-v10.html".
-
-Initial Contributors:
-Nokia Corporation - initial contribution.
-
-Contributors:
-
-Description:
-
-============================================================================
--->
-
-.. index::
- single: MATTI
-
-Stage: MATTI
-=============
-
-MATTI testing is very similar to ATS testing, so for details of how it all links together see :ref:`Stage-ATS-label`: `and the matti website`_.
-
-<#if !(ant?keys?seq_contains("sf"))>
-.. _`and the matti website`: http://trmatti1.nmp.nokia.com/help/
-#if>
-
-The set up of parameters is very similar (a few less parameters and it mostly uses ATS values). The main difference is that once the drop file has been uploaded to the ATS server it uses MATTI to perform the tests (the drop file contains the flash files, the ruby tests/sip profiles, data files, sis files and/or parameters file in xml format).
-
-The following parameters are the ones that are not listed in the ATS parameters, all other parameters required are as listed in the ATS section above, which include :hlm-p:`ats.server`, :hlm-p:`ats.email.list`, :hlm-p:`ats.email.format`, :hlm-p:`ats.email.subject`, :hlm-p:`ats.testrun.name`, :hlm-p:`ats.product.name`, :hlm-p:`ats.flashfiles.minlimit`, :hlm-p:`ats.flash.images` and :hlm-p:`ats.upload.enabled`.
-
-* [must] - must be set by user
-* [recommended] - should be set by user but not mandatory
-* [allowed] - should **not** be set by user however, it is possible.
-
-.. csv-table:: ATS Ant properties
- :header: "Property name", "Edit status", "Description"
-
- ":hlm-p:`matti.enabled`", "[must]", "Enable MATTI testing to occur, if not present the target :hlm-t:`matti-test` will not run."
- ":hlm-p:`matti.asset.location`", "[must]", "The location of the test asset where ruby test files, sip profiles, hardware data etc are located."
- ":hlm-p:`matti.test.profiles`", "[must]", "Test profiles to be executed should be mentioned in this comma separated list e.g., 'bat, fute'."
- ":hlm-p:`matti.sierra.enabled`", "[must]", "Mustbe set to 'true' if sierra is engine is to be used. If true .sip files are used otherwise .rb (ruby) files are used to execute tests-"
- ":hlm-p:`matti.test.timeout`", "[must]", "Separate but similar property to ats.test.timeout for matti tests."
- ":hlm-p:`matti.parameters`", "[must]", "Matti test parameters can be given through Matti parameters xml file."
- ":hlm-p:`matti.sis.files`", "[must]", "There are special sis files required to execute with test execution. This is a comma separated list in which several sis files can be deifned in a certain format like '##' e.g. "
- ":hlm-p:`matti.sierra.parameters`", "[must]", "Sierra parameters are set using this property. e.g. '--teardown --ordered'"
- ":hlm-p:`matti.template.file`", "[allowed]", "Location of the matti template file."
-
-
-All you need to do is setup the following parameters:
-
-.. code-block:: xml
-
-
-
-
-
-
-
-
-
-
-
-
-
-In order to upload and view the test run you need to have a valid user ID and password that matches that in your ``.netrc`` file. To create the account open a web browser window and enter the name of the ats.server with /ATS at the end e.g. http://123456:80/ATS. Click on the link in the top right hand corner to create the account. To view the test run once your account is active you need to click on the 'test runs' tab.
-
-To run the tests call the target :hlm-t:`matti-test` (you will need to define the :hlm-p:`build.drive`, :hlm-p:`build.number` and it is best to create the :hlm-p:`core.build.version` on the command line as well if you do not add it to the list of targets run that create the ROM image). e.g.
-::
-
- hlm -Dbuild.number=001 -Dbuild.drive=z: -Dcore.build.version=001 matti-test
-
-If it displays the message 'Matti testdrop created successfully!', script has done what it needs to do. The next thing to check is that the drop file has been uploaded to the ATS server OK. If that is performed successfully then the rest of the testing needs to be performed by the ATS server. There is also a ``test.xml`` file created that contains details needed for debugging any problems that might occur. To determine if the tests have run correctly you need to read the test run details from the server.
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/stage_preparation.rst.inc.ftl
--- a/buildframework/helium/doc/src/manual/stage_preparation.rst.inc.ftl Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/manual/stage_preparation.rst.inc.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -36,24 +36,37 @@
Helium supports the creation of an environment based on a release store in a network drive. The main requirement from that release is to publish release metadata with the content.
-.. csv-table:: Ant properties to modify
+.. csv-table:: Ant properties to modify for Helium 11 and older
+ :header: "Property", "Description", "Values"
+
+ ":hlm-p:`s60.grace.server`", ":hlm-p:`s60.grace.server[summary]`", ":hlm-p:`s60.grace.server[defaultValue]`"
+ ":hlm-p:`s60.grace.service`", ":hlm-p:`s60.grace.service[summary]`", ":hlm-p:`s60.grace.service[defaultValue]`"
+ ":hlm-p:`s60.grace.product`", ":hlm-p:`s60.grace.product[summary]`", ":hlm-p:`s60.grace.product[defaultValue]`"
+ ":hlm-p:`s60.grace.release`", ":hlm-p:`s60.grace.release[summary]`", ":hlm-p:`s60.grace.release[defaultValue]`"
+ ":hlm-p:`s60.grace.revision`", ":hlm-p:`s60.grace.revision[summary]`", ":hlm-p:`s60.grace.revision[defaultValue]`"
+ ":hlm-p:`s60.grace.cache`", ":hlm-p:`s60.grace.cache[summary]`", ":hlm-p:`s60.grace.cache[defaultValue]`"
+ ":hlm-p:`s60.grace.checkmd5.enabled`", ":hlm-p:`s60.grace.checkmd5.enabled[summary]`", ":hlm-p:`s60.grace.checkmd5.enabled[defaultValue]`"
+ ":hlm-p:`s60.grace.usetickler`", ":hlm-p:`s60.grace.usetickler[summary]`", ":hlm-p:`s60.grace.usetickler[defaultValue]`"
+
+
+.. csv-table:: Ant properties to modify for Helium 12
:header: "Property", "Description", "Values"
- ":hlm-p:`s60.grace.server`", "UNC path to network drive.", ""
- ":hlm-p:`s60.grace.service`", "Service name.", ""
- ":hlm-p:`s60.grace.product`", "Product name.", ""
- ":hlm-p:`s60.grace.release`", "Regular expression to match release under the product directory.", ""
- ":hlm-p:`s60.grace.revision`", "Regular expresion to match a new build revision", "e.g: (_\d+)?"
- ":hlm-p:`s60.grace.cache`",
- ":hlm-p:`s60.grace.checkmd5.enabled`",
- ":hlm-p:`s60.grace.usetickler`", "Validate the release based on the tickler.", "true, false(default)"
+ ":hlm-p:`download.release.server`", ":hlm-p:`download.release.server[summary]`", ":hlm-p:`download.release.server[defaultValue]`"
+ ":hlm-p:`download.release.service`", ":hlm-p:`download.release.service[summary]`", ":hlm-p:`download.release.service[defaultValue]`"
+ ":hlm-p:`download.release.product`", ":hlm-p:`download.release.product[summary]`", ":hlm-p:`download.release.product[defaultValue]`"
+ ":hlm-p:`download.release.regex`", ":hlm-p:`download.release.regex[summary]`", ":hlm-p:`download.release.regex[defaultValue]`"
+ ":hlm-p:`download.release.revision`", ":hlm-p:`download.release.revision[summary]`", ":hlm-p:`download.release.revision[defaultValue]`"
+ ":hlm-p:`download.release.cache`", ":hlm-p:`download.release.cache[summary]`", ":hlm-p:`download.release.cache[defaultValue]`"
+ ":hlm-p:`download.release.checkmd5.enabled`", ":hlm-p:`download.release.checkmd5.enabled[summary]`", ":hlm-p:`download.release.checkmd5.enabled[defaultValue]`"
+ ":hlm-p:`download.release.usetickler`", ":hlm-p:`download.release.usetickler[summary]`", ":hlm-p:`download.release.usetickler[defaultValue]`"
-Once configured you can invoke Helium:
+Once configured you can invoke Helium::
- > hlm -Dbuild.number=1 -Dbuild.drive=X: ido-update-build-area-grace
+ hlm -Dbuild.number=1 -Dbuild.drive=X: ido-update-build-area
- > dir X:
- ...
- ...
+You should then have the latest release extracted to the X: drive.
-You should then have the latest/mentioned release un-archived under the X: drive.
\ No newline at end of file
+<#if !ant?keys?seq_contains("sf")>
+.. include:: stage_nokia_preparation.rst.inc
+</#if>
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/stage_publishing.rst.inc.ftl
--- a/buildframework/helium/doc/src/manual/stage_publishing.rst.inc.ftl Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/manual/stage_publishing.rst.inc.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -34,7 +34,7 @@
Diamonds is a utility tool that keeps track of build and release information. See the **Metrics** manual under section `Helium Configuration`_ for more info.
-.. _Helium Configuration: ../metrics.html#helium-configuration
+.. _Helium Configuration: metrics.html#helium-configuration
.. index::
@@ -103,6 +103,7 @@
+
@@ -114,6 +115,7 @@
+
@@ -179,6 +181,8 @@
"``archives.dir``", "The directory where the zip files are saved to.", ""
"``policy.csv``", "This property defines the location of the policy definition file.", ""
"``policy.default.value``", "This property defines the policy value when policy file is missing or invalid (e.g. wrong format).", "9999"
+ "``split.on.uncompressed.size.enabled``", "To enable/disable splitting the zip files depending on source file size.", "true/false"
+
The policy mapper enables the sorting of the content compare to its policy value. The mapper is looking for a policy file in the file to archive directory.
If the distribution policy file is missing then the file will go to the ``policy.default.value`` archive. Else it tries to open the file which
@@ -226,6 +230,9 @@
They support the same set of configuration properties as the default ``policy.remover``.
+<#if !ant?keys?seq_contains("sf")>
+.. include:: stage_metadata.rst.inc
+</#if>
.. index::
single: Zipping SUBCON
@@ -233,5 +240,12 @@
Subcon zipping
--------------
-Subcon zipping is also configured using the same XML format as :hlm-t:`zip-ee` and implemented in the :hlm-t:`zip-subcon` target. A :hlm-p:`zips.subcon.spec.name` property must be defined but currently it is still a separate configuration file.
+Subcon zipping is also configured using the same XML format as :hlm-t:`zip-ee` and implemented in the :hlm-t:`zip-subcon` target. A ``zips.subcon.spec.name`` property must be defined but currently it is still a separate configuration file.
+
+Stage: Blocks packaging
+=======================
+
+Refer to the `Blocks integration manual`_
+
+.. _`Blocks integration manual`: blocks.html
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/stage_source_preparation.rst.inc.ftl
--- a/buildframework/helium/doc/src/manual/stage_source_preparation.rst.inc.ftl Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/manual/stage_source_preparation.rst.inc.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -136,6 +136,10 @@
The following properties are required:
- database: the name of the synergy database you want to use.
+<#if !ant?keys?seq_contains("sf")>
+.. include:: stage_nokia_ccm.rst.inc
+</#if>
+
Mercurial
---------
@@ -150,6 +154,8 @@
+<#if !ant?keys?seq_contains("sf")>
For more information see API_
-.. _API: ../helium-antlib/api/doclet/index.SCM.html
+.. _API: ../api/doclet/index.SCM.html
+</#if>
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/stages.rst.ftl
--- a/buildframework/helium/doc/src/manual/stages.rst.ftl Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/manual/stages.rst.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -23,6 +23,7 @@
.. index::
module: Stages
+
=============
Helium stages
=============
@@ -47,7 +48,7 @@
.. include:: stage_releasing.rst.inc
Stage: Cenrep creation (S60 3.2.3 - 5.x)
-=================================
+========================================
<#if !(ant?keys?seq_contains("sf"))>
See: http://configurationtools.nmp.nokia.com/builds/cone/docs/cli/generate.html?highlight=generate
#if>
@@ -57,7 +58,7 @@
* IDO can use the ido-gen-cenrep to generate the cenreps which are IDO specific.
* We should pass the sysdef.configurations.list as parameter to ido-gen-cenrep target. Else it will use the defualt one of helium.
-Example:
+Example
-------
Below example will generate the cenrep only for IDO specific confml files.
@@ -105,17 +106,35 @@
.. include:: stage_integration.rst.inc
-.. include:: stage_ats.rst.inc
+
+Stage: Testing
+==============
+
+The test sources or test asset are maintained by test developers, who follow certain rules and standards for creating the directory/file structure and writing the tests.
-.. include:: stage_matti.rst.inc
+Testing is performed automatically by the ATS server, which receives a zipped test drop containing the test.xml file (required by ATS), ROM images, test cases, DLLs, executables and other supporting files. This test drop is created by the Helium Test Automation system.
+
+Read more: `Helium Test Automation User Guide`_
+
+.. _`Helium Test Automation User Guide`: stage_ats.html
-Stage: Check EPL License header.
-=================================
+
+
+
+Stage: Check EPL License header
+===============================
The target ``check-sf-source-header`` could be used to run to validate the source files for EPL license header.
* Include the target ``check-sf-source-header`` in the target sequence.
-* This will validate source files present on the build area to contain EPL license.
+* This will validate that the source files present in the build area do not contain the SFL license.
+* Target could be enabled by setting ``sfvalidate.enabled`` to ``true``.
+
+The target ``ido-check-sf-source-header`` can be used to validate the source files for the EPL license header at IDO/package level.
+
+* Include the target ``ido-check-sf-source-header`` in the IDO target sequence.
+* This will validate that the source files present in the build area contain the EPL license, by extracting values from ``distribution.policy.S60`` files.
+* Target could be enabled by setting ``sfvalidate.enabled`` to ``true``.
.. index::
single: Compatibility Analyser (CA)
@@ -123,32 +142,40 @@
Stage: Compatibility Analyser
=============================
-The Compatibility Analyser is a tool used to compare **binary** header and library files to ensure that the version being checked has not made any changes to the interfaces which may cause the code to not work correctly. Helium supplies a target that calls this Compatibility Analyser. Users who wish to use this tool first need to read the CA user guide found under SW DOcMan at: http://bhlns002.apac.nokia.com/symbian/symbiandevdm.nsf/WebAllByID2/DSX05526-EN/s60_compatibility_analyser_users_guide.doc.
+The Compatibility Analyser is a tool used to compare **binary** header and library files to ensure that the version being checked has not made any changes to the interfaces which may cause the code to not work correctly. Helium supplies a target that calls this Compatibility Analyser.
+Users who wish to use this tool first need to read the CA user guide found under: /epoc32/tools/s60rndtools/bctools/doc/S60_Compatibility_Analyser_Users_Guide.doc.
-The Compatibility Analyser is supplied as part of SymSEE, there is a wiki page for the tool found at http://s60wiki.nokia.com/S60Wiki/Compatibility_Analyser. As part of the configuration a default BC template file has been provided at Helium\tools\quality\CompatibilityAnalyser\config_template.txt make the necessary changes to this file (as described in the user guide). The supplied example file works with CA versions 2.0.0 and above which is available in SymSEE version 12.1.0 and above. The configurations that will need changing are:
+<#if !(ant?keys?seq_contains("sf"))>
+The Compatibility Analyser is supplied as part of SymSEE, there is a wiki page for the tool found at: http://s60wiki.nokia.com/S60Wiki/Compatibility_Analyser.
+</#if>
+As part of the configuration, a default BC template file has been provided at helium/tools/quality/compatibility_analyser/ca.cfg.xml; make the necessary changes to this file (as described in the user guide). The supplied example file works with CA versions 2.0.0 and above.
+
+The minimum configurations that will need changing are:
* BASELINE_SDK_DIR
* BASELINE_SDK_S60_VERSION
* CURRENT_SDK_DIR
* REPORT_FILE_HEADERS
* REPORT_FILE_LIBRARIES
-The default configuration is supplied as part of tools\quality\CompatibilityAnalyser\compatibilty.ant.xml where there are a few properties that need to be set (overriding of these is recommended in your own config file):
+The default configuration is supplied as part of tools/quality/compatibility_analyser/compatibilty.ant.xml where there are a few properties that need to be set (overriding of these is recommended in your own config file):
.. csv-table:: Compatibility Analyser Ant properties
:header: "Property name", "Edit status", "Description"
":hlm-p:`ca.enabled`", "[must]", "Enables the bc-check and ca-generate-diamond-summary targets to be executed, when set to true."
- ":hlm-p:`bctools.root`", "[must]", "Place where the CheckBC and FilterBC tools are e.g. C:/APPS/carbide/plugins/com.nokia.s60tools.compatibilityanalyser.corecomponents_2.0.0/BCTools"
- ":hlm-p:`default.bc.config`", "[must]", "Place where the CheckBC default configuration file is, it is copied from this location to the output folder for use by checkBC.py e.g. helium/tools/quality/compatibility_analyser/ca_config_template.txt"
- ":hlm-p:`bc.config.dir`", "[must]", "The bc_config_template.txt file (default configuration file) will be copied from the folder it is saved in within helium to the location named in this property where it will be used ( in conjunction with the bc.config.file property). e.g. build.log.dir/bc"
- ":hlm-p:`bc.config.file`", "[must]", "The bc_config_template.txt file (default configuration file) will be copied from the folder it is saved in within helium to the location named and named as defined in this property where it will be used. You need to make sure this is not the same name as any other IDO or person using the build area. e.g. bc.config.dir/bc.config"
- ":hlm-p:`bc.check.libraries.enabled`", "[must]", "Enables the Binary Comparison for libraries when set to 'true'."
- ":hlm-p:`lib.param.val`", "[must]", "Defines the parameter that checkBC.py is called with -la (all libraries checked) or -ls lib (single library checked) (lib = the name of library to check) or -lm file.name (multiple libraries checked) the file.name is a file that contains the names of the library(ies) to be checked."
- ":hlm-p:`bc.check.headers.enabled`", "[must]", "Enables the Binary Comparison for headers when set to 'true'."
- ":hlm-p:`head.param.val`", "[must]", "Defines the parameter that checkBC.py is called with -ha (all headers checked) or -hs file (single header checked) (file= name of header file to check) or -hm file.name (multiple headers checked) the file.name is a file that contains the names of the header(s) to be checked"
+ ":hlm-p:`bc.prep.ca.file`", "[must]", "The name and location of the file that contains all the CA configuration values like, 'BASELINE_SDK_DIR=C:\Symbian\9.2\S60_3rd_FP1_2': an example file can be found at helium/tools/quality/compatibility_analyser/test/ca.cfg.xml "
+ ":hlm-p:`bc.tools.root`", "[must]", "Place where the CheckBC and FilterBC tools are e.g. /epoc32/tools/s60rndtools/bctools"
+ ":hlm-p:`bc.build.dir`", "[must]", "The place that all the files created during the running of the CA tool will be placed."
+ ":hlm-p:`bc.config.file`", "[must]", "The 'ca.ant.config.file' file (configuration file) will be copied from the folder it is saved in within helium to the location named as defined in this property where it will be used. You need to make sure this is not the same name as any other IDO or person using the build area. e.g. bc.config.dir/bc.config"
+ ":hlm-p:`bc.check.libraries.enabled`", "[must]", "Enables the compatibility analyser for libraries when set to 'true' (default value is 'false')."
+ ":hlm-p:`bc.lib.param.val`", "[optional]", "Defines the parameter that checkBC.py is called with -la (all libraries checked) (default value) or -ls lib (single library checked) (lib = the name of library to check) or -lm file.name (multiple libraries checked) the file.name is a file that contains the names of the library(ies) to be checked. If the 'bc.what.log.entry.enabled' property is set this variable must not be set."
+ ":hlm-p:`bc.check.headers.enabled`", "[must]", "Enables the compatibility analyser for headers when set to 'true' (default value is 'false')."
+ ":hlm-p:`bc.head.param.val`", "[optional]", "Defines the parameter that checkBC.py is called with -ha (all headers checked) (default value) or -hs file (single header checked) (file= name of header file to check) or -hm file.name (multiple headers checked) the file.name is a file that contains the names of the header(s) to be checked. If the 'bc.what.log.entry.enabled' property is set this variable must not be set."
":hlm-p:`bc.check.report.id`", "[must]", "Adds this to the CA output file name to give it a unique name."
- ":hlm-p:`ido.ca.html.output.dir`", "[must]", "Defines the location of CA output and the input for the diamonds creation target. e.g. build.log.dir/build.id_ca"
+ ":hlm-p:`bc.log.file.to.scan`", "[must]", "This must be set if the 'bc.what.log.entry.enabled' property is set otherwise it is not required. It is the name of the log file that was created during the build that will be scanned in order to determine which headers or library files will be compared."
+ ":hlm-p:`bc.what.log.entry.enabled`", "[optional]", "If set to true the 'whatlog' will be scanned for the list of header and/or library files that will be compared. The default is 'false'"
+ ":hlm-p:`bc.fail.on.error`", "[optional]", "If set to true the build will fail if there is an error with the binary compatibility analyser (including the conversion to diamonds XML files). If set to false it will not fail the build if there is a problem with CA."
and then run the target:
@@ -158,8 +185,6 @@
where nnn is the build number and n: is the substed drive letter.
-The results of the output from the analysis are placed in the \output\logs\BC folder under the substed build drive and are called libraries_report_?.xml and headers_report_?.xml, the reports can be viewed in Web-formatted layout, based on the BBCResults.xsl stylesheet which is copied to the \output\logs\BC folder on the build drive.
-
+The results of the output from the analysis are placed in the /output/logs/bc folder under the substed build drive and are called 'libraries_report_{bc.check.report.id}' and 'headers_report_{bc.check.report.id}'; the reports can be viewed in Web-formatted layout, based on the BBCResults.xsl stylesheet which is copied to the /output/logs/bc folder on the build drive.
-.. include:: stage_final.rst.inc
-
+By running the target 'ca-generate-diamond-summary' the output is summarised and passed to Diamonds, where it is displayed in the 'Quality Aspects' section.
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/manual/tdriver_template_instructions.rst.ftl
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/doc/src/manual/tdriver_template_instructions.rst.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,302 @@
+<#--
+============================================================================
+Name : tdriver_template_instructions.rst.ftl
+Part of : Helium
+
+Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+All rights reserved.
+This component and the accompanying materials are made available
+under the terms of the License "Eclipse Public License v1.0"
+which accompanies this distribution, and is available
+at the URL "http://www.eclipse.org/legal/epl-v10.html".
+
+Initial Contributors:
+Nokia Corporation - initial contribution.
+
+Contributors:
+
+Description:
+
+============================================================================
+-->
+
+.. index::
+ single: TDriver
+
+
+=======================
+TDriver Custom Template
+=======================
+
+
+.. contents::
+
+
+Instructions for creating custom templates for TDriver
+======================================================
+
+Creating a custom template for TDriver is straightforward. Some knowledge of `Python`_, the `Python dictionary`_ type and `Jinja templates`_ helps, although it is not mandatory. The example template below may help in understanding how to create a new TDriver template or modify an existing one.
+
+.. _`Python`: http://wiki.python.org/moin/BeginnersGuide
+.. _`Python dictionary`: http://docs.python.org/tutorial/datastructures.html#dictionaries
+.. _`Jinja templates`: http://jinja.pocoo.org/2/documentation/templates
+
+
+The test.xml template consists of two parts:
+ - Explicit (hardcoded part of the test.xml) and
+ - Implicit (logical/processed data from the TDriver scripts)
+
+
+Explicit template data
+----------------------
+
+This consists of the normal template structure: the properties, attributes and values that are fixed in the template.
+
+For example:
+
+.. code-block:: xml
+
+
+
+
+
+
+
+
+
+On its own it does not make much sense without parameters and values. However, explicit data does not require any logic, nor does it come from any script.
+
+
+Implicit template data
+----------------------
+
+- This is fully processed data gathered from several sources.
+- In the case of the TDriver template, it is a dictionary, ``xml_dict``, which has a hierarchical data structure.
+- The contents of the dictionary can be categorized as follows:
+
+**Pre-Data** (non-iterative data that comes before the execution block, at the beginning of the test.xml)
+
+
+.. csv-table:: Pre-Data (Data structure)
+ :header: "Variable name", "Description", "Usage example"
+
+ "diamonds_build_url", "Non-iterative - string", "xml_dict['diamonds_build_url']"
+ "testrun_name", "Non-iterative - string", "xml_dict['testrun_name']"
+ "device_type", "Non-iterative - string", "xml_dict['device_type']"
+ "alias_name", "Non-iterative - string", "xml_dict['alias_name']"
+
+
+
+**Execution-Block** (iterative; depends on the number of execution blocks. Please see the template example for exact usage)
+
+
+.. csv-table:: Execution-Block (Data structure)
+ :header: "Variable name", "Description", "Usage example"
+
+ "execution_blocks", "Iterative - dictionary. It has the following members", "for exe_block in xml_dict['execution_blocks']"
+ "image_files", "Iterative - list of ROM images.", "for image_file in exe_block['image_files']"
+ "install_files", "Iterative - list of files to be installed", "for file in exe_block['install_files']"
+ "tdriver_sis_files", "Iterative - list of sisfiles to be installed. This unpacks three values of sisfiles (src, dst_on_ats_server, dst_on_phone).", "for sisfile in exe_block['tdriver_sis_files']"
+ "tdriver_task_files", "Iterative - list of task files, .pro or .rb files, depending on the value of :hlm-p:`tdriver.tdrunner.enabled`.", "for task_file in exe_block['tdriver_task_files']"
+ "asset_path", "Non-iterative - string", "exe_block['asset_path']"
+ "test_timeout", "Non-iterative - string", "exe_block['test_timeout']"
+ "tdriver_parameters", "Non-iterative - string", "exe_block['tdriver_parameters']"
+ "tdrunner_enabled", "Non-iterative - boolean", "exe_block['tdrunner_enabled']"
+ "tdrunner_parameters", "Non-iterative - string", "exe_block['tdrunner_parameters']"
+ "ctc_enabled", "Non-iterative - boolean", "exe_block['ctc_enabled']"
+
+
+
+**Post-Data** (non-iterative data that comes after the execution block, at the end of the test.xml)
+
+
+.. csv-table:: Post-Data (Data structure)
+ :header: "Variable name", "Description", "Usage example"
+
+ "report_email", "Non-iterative - string", "xml_dict['report_email']"
+ "email_format", "Non-iterative - string", "xml_dict['email_format']"
+ "email_subject", "Non-iterative - string", "xml_dict['email_subject']"
+ "report_location", "Non-iterative - string", "xml_dict['report_location']"
+
+
+
+Example template
+================
+
+
+.. code-block:: xml
+
+ {% import 'ats4_macros.xml' as macros with context %}
+
+
+
+ {% if xml_dict['diamonds_build_url'] -%}
+ {{ xml_dict['diamonds_build_url'] }}
+ Smoke
+ {% endif %}
+ {{ xml_dict['testrun_name'] }}
+
+
+
+
+
+
+
+
+
+ {% for exe_block in xml_dict['execution_blocks'] -%}
+
+
+
+ {% if exe_block['image_files'] -%}
+
+ FlashTask
+
+ {% set i = 1 %}
+ {% for img in exe_block['image_files'] -%}
+
+ {% set i = i + 1 %}
+ {% endfor -%}
+
+
+ {% endif %}
+
+ {% if exe_block['install_files'] != [] -%}
+ {% for file in exe_block['install_files'] -%}
+
+ FileUploadTask
+
+
+
+
+
+ {% endfor -%}
+ {% endif %}
+
+ {% if exe_block['tdriver_sis_files'] != [] -%}
+ {% for sisfile in exe_block['tdriver_sis_files'] -%}
+
+ FileUploadTask
+
+
+
+
+
+ {% endfor -%}
+ {% endif %}
+
+ {% for sis_file in exe_block["tdriver_sis_files"] -%}
+
+ InstallSisTask
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ {%- endfor -%}
+
+
+ RebootTask
+
+
+
+ CreateDirTask
+
+
+
+
+
+ {% if exe_block["ctc_enabled"] == "True" -%}
+ {{ macros.ctc_initialization(exe_block) }}
+ {%- endif %}
+
+
+ {% if exe_block["tdriver_task_files"] -%}
+ {% for task_file in exe_block["tdriver_task_files"] -%}
+
+ TestabilityTask
+
+
+
+
+
+
+
+
+ {% endfor -%}
+ {% endif %}
+
+
+ {% if exe_block["ctc_enabled"] == "True" -%}
+ {{ macros.ctc_finalization(exe_block) }}
+ {%- endif %}
+
+
+ CleanupTask
+
+
+
+
+
+
+ {% endfor -%}
+
+
+
+ EmailAction
+
+
+
+
+
+
+ {% if xml_dict['report_location'] -%}
+
+ FileStoreAction
+
+
+
+
+
+ {% endif %}
+ {% if xml_dict['diamonds_build_url'] -%}
+
+ DiamondsAction
+ {% if xml_dict['execution_blocks'] != [] and xml_dict['execution_blocks'][0]["ctc_enabled"] == "True" -%}
+
+
+
+ {%- endif %}
+
+ {%- endif %}
+
+
+
+
+
+
+Setting Custom Template for execution
+=====================================
+
+To execute a custom template, set the :hlm-p:`tdriver.template.file` property, for example:
+
+.. code-block:: xml
+
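+    <!-- The template path shown here is illustrative, not a value shipped with Helium. -->
+    <property name="tdriver.template.file" location="${basedir}/templates/custom_tdriver_template.xml"/>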
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/new_user_tutorial.rst
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/doc/src/new_user_tutorial.rst Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,280 @@
+.. ============================================================================
+ Name : new_user_tutorial.rst
+ Part of : Helium
+
+ Copyright (c) 2010 Nokia Corporation and/or its subsidiary(-ies).
+ All rights reserved.
+ This component and the accompanying materials are made available
+ under the terms of the License "Eclipse Public License v1.0"
+ which accompanies this distribution, and is available
+ at the URL "http://www.eclipse.org/legal/epl-v10.html".
+
+ Initial Contributors:
+ Nokia Corporation - initial contribution.
+
+ Contributors:
+
+ Description:
+
+ ============================================================================
+
+########################
+Helium New User Tutorial
+########################
+
+.. index::
+ module: Helium New User Tutorial
+
+.. contents::
+
+Introduction
+============
+
+This tutorial covers the basic information to get up and running using the Helium build framework. Check the `Helium manual`_ for more detailed information.
+
+.. _`Helium manual`: manual/index.html
+
+
+Setting up a simple build file
+===============================
+
+Helium is based on `Apache Ant`_, a build tool written in Java that uses XML configuration files. A Helium build configuration must start with a ``build.xml`` file in the directory where Helium commands are run from::
+
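+    <?xml version="1.0"?>
+    <!-- Minimal sketch only: the project name is illustrative, and helium.dir is assumed to point at the local Helium installation. -->
+    <project name="my_project">
+        <property name="helium.dir" location="../helium"/>
+        <!-- Pull in all of the Helium targets, properties and macros. -->
+        <import file="${helium.dir}/helium.ant.xml"/>
+    </project>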
+
+
+
+
+
+
+
+
+
+
+
+
+.. _`Apache Ant`: http://ant.apache.org/
+.. _`Ant manual`: http://ant.apache.org/manual
+
+Helium looks for a ``build.xml`` project file in the current directory. It will parse this and additional imported Ant files to create the project configuration.
+
+
+Basic structure
+===============
+
+The main components of Ant project files are targets, types and tasks.
+
+Targets define a set of tasks to be run as a build step. A target can depend on other targets, so a complete build process can be built up using a chain of targets. A simple target (the name here is arbitrary) to echo some text to the console might look like this::
+
+    <target name="greeting">
+        <echo>Hello!</echo>
+    </target>
+
+Types are information elements for configuring the build. The most common are properties that define a single value or location::
+
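+    <!-- Illustrative property name and value. -->
+    <property name="my.property" value="some value"/>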
+
+
+Properties representing locations are normalised to full paths when the ``location`` attribute is used::
+
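+    <!-- Illustrative: the relative path is resolved to a full path. -->
+    <property name="my.output.dir" location="output/logs"/>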
+
+
+.. note:: Once a property is defined it is immutable. The value of a property is defined by the first definition that is found.
+
+Another common type is a fileset that represents a collection of files, typically using wildcard selection::
+
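+    <!-- Illustrative id; selects all XML files under the directory held in helium.build.dir. -->
+    <fileset id="my.xml.files" dir="${helium.build.dir}" includes="**/*.xml"/>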
+
+
+
+
+This will select all XML files under the ``helium.build.dir`` directory. Note the use of ``${}`` to insert the value of a property.
+
+There are a number of other types such as dirset, filelist, patternset which may be used for some configuration. See the "Concepts and Types" section of the `Ant manual`_ for more details.
+
+
+Import statements
+-----------------
+
+Import statements are used to pull additional Ant project file content into the main project::
+
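+    <!-- Assumes helium.dir points at the Helium installation. -->
+    <import file="${helium.dir}/helium.ant.xml"/>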
+
+
+Here the order of elements is significant:
+
+Properties
+ Must be defined before the import to override a property value in an imported file. See the `properties list `_ for default values.
+
+Types
+ Other types such as filesets, dirsets that are referenced by ``ID`` must be defined after the import.
+
+Targets
+ Can be defined anywhere in the file.
+
+``helium.ant.xml`` is the root file to import from Helium, which will pull in all the Helium content.
+
+
+Run a command
+=============
+
+Make sure Helium is on the ``PATH``. Then the ``hlm`` command can be run from the project directory containing the ``build.xml`` file. Try a quick test::
+
+ hlm hello
+
+This should echo "Hi!" to the console, which shows that Helium can be imported successfully.
+
+A target can be run using its name as a command::
+
+ hlm [target]
+
+Often it can be useful to define or override property values on the command line, like this::
+
+ hlm [target] -Dname=value
+
+
+Setting up a build
+==================
+
+An actual build process is defined by chaining together a number of major build stages, e.g. preparation, compilation, ROM building, etc. So a top-level build process target called from the command line might look like this::
+
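+    <!-- Sketch only: apart from the custom target, the stage targets chained here are placeholders for the configured build stages. -->
+    <target name="do-something-custom">
+        <echo>Extra step between preparation and compilation.</echo>
+    </target>
+
+    <target name="full-build" depends="prep, do-something-custom, build-ebs-main"/>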
+
+
+
+
+
+
+
+
+In this case an additional target is defined and run after prep but before compilation. The full build is then run by calling::
+
+ hlm full-build -Dbuild.number=1
+
+Configuring build stages
+------------------------
+
+Configuring each build stage typically involves defining or overriding properties and other types that are needed for that stage. In some cases special XML file formats are used. Please refer to the `appropriate sections `_ of the manual for information on configuring each stage.
+
+There are a number of individual features that can be enabled or disabled using flag properties. See `this list `_.
+
+
+Overriding and extending targets
+================================
+
+If the build sequence needs customizing or extending, it is useful to be able to define new targets and potentially override existing Helium targets. Targets can be defined anywhere within the XML file. If multiple targets have the same name, the first one parsed in the order of importing Ant files will be executed when called by name. Any target can be called explicitly by using its fully-qualified name, which is constructed by prepending the name of the enclosing project, e.g.::
+
+ hlm common.hello
+
+This calls the ``hello`` target which is located in the common project file. It can be seen in the `API documentation`_.
+
+.. _`API documentation`: api/helium/project-common.html#hello
+
+Any existing target can be extended by overriding it and adding custom steps at the start or the end. To add steps to the start of a target, override it, defining a new custom target and the original one (by its fully-qualified name) as dependencies, e.g. to run a step before preparation::
+
+    <!-- Sketch only: replace "common.prep" with the fully-qualified name of the target being overridden. -->
+    <target name="custom-pre-step">
+        <echo>Run before original target.</echo>
+    </target>
+
+    <target name="prep" depends="custom-pre-step, common.prep"/>
+
+Additional steps could be added to the end of a target using a similar method, or just include them in the overriding target thus::
+
+    <!-- Sketch only: the overriding target runs the original (fully-qualified) target first, then the extra step. -->
+    <target name="prep" depends="common.prep">
+        <echo>Run after original target.</echo>
+    </target>
+
+Basic operations
+================
+
+Simple file-based tasks
+-----------------------
+
+Ant has core support for a wide range of file-based tasks. Here are a few simple examples:
+
+Copying all HTML log files by wildcard::
+
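+    <!-- Log directory property and destination are illustrative. -->
+    <copy todir="${build.log.dir}/html_logs">
+        <fileset dir="${build.log.dir}" includes="**/*.html"/>
+    </copy>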
+
+
+
+
+
+
+Zip all the log files::
+
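+    <!-- Archive name is illustrative. -->
+    <zip destfile="${build.log.dir}/logs.zip">
+        <fileset dir="${build.log.dir}" includes="**/*.log"/>
+    </zip>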
+
+
+
+
+
+
+
+
+Deleting text log files::
+
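+    <!-- Deletes plain-text log files under the (illustrative) log directory. -->
+    <delete>
+        <fileset dir="${build.log.dir}" includes="**/*.txt"/>
+    </delete>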
+
+
+
+
+
+
+See the Ant Tasks section of the `Ant manual`_ for a full list of available tasks.
+
+
+Running an external tool
+------------------------
+
+The ``exec`` task can be used to run an external tool::
+
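+    <!-- Executable name and arguments are illustrative. -->
+    <exec executable="mytool.exe" failonerror="true">
+        <arg value="--version"/>
+    </exec>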
+
+
+
+
+
+
+See the `Ant manual entry `_ for more details on how to use the ``exec`` task. Use ``exec`` along with the customisation methods above to call additional tools at suitable places during the build. The `Setting up a build`_ section shows how a custom tool target could be called during a full build process.
+
+External scripts can be run by calling the appropriate runtime executable and providing the script as an argument::
+
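+    <!-- The script path is illustrative. -->
+    <exec executable="python">
+        <arg value="${basedir}/scripts/my_script.py"/>
+    </exec>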
+
+
+
+
+
+Simple macros
+-------------
+
+Defining a macro is a useful method of combining a set of task steps to avoid repetition. This example defines a macro called ``testing`` and calls it::
+
+    <macrodef name="testing">
+        <text name="v" trim="true"/>
+        <sequential>
+            <echo>v is @{v}</echo>
+        </sequential>
+    </macrodef>
+
+    <testing>
+        this is a test
+    </testing>
+
+
+Getting help
+============
+
+There are several sources of further information:
+
+ * The `Helium manual`_.
+ * The `Helium API`_ of `targets`_, `properties`_ and `macros`_.
+ * Command line help. Try running::
+
+ hlm help [name]
+
+ to get help on a specific target or property.
+
+.. _`Helium API`: api/helium/index.html
+.. _`targets`: api/helium/targets_list.html
+.. _`properties`: api/helium/properties_list.html
+.. _`macros`: api/helium/macros_list.html
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/quick_start_guide.rst.ftl
--- a/buildframework/helium/doc/src/quick_start_guide.rst.ftl Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/quick_start_guide.rst.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -70,8 +70,7 @@
- `Using Ant `_: specifically the Projects and Properties sections.
- `Configure Helium `_: `common configuration format `_ and `Helium stages `_.
- - `Helium glossary `_: lists the specific properties used in Helium.
-
+ - `Helium glossary `_: lists the specific properties used in Helium.
.. index::
single: Running builds with Helium
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/tutorials/imaker/buildinfo_creation.rst
--- a/buildframework/helium/doc/src/tutorials/imaker/buildinfo_creation.rst Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/tutorials/imaker/buildinfo_creation.rst Mon Oct 11 11:16:47 2010 +0100
@@ -34,7 +34,6 @@
-
FTL template
~~~~~~~~~~~~
@@ -51,31 +50,6 @@
::
> hlm.bat -Dbuild.drive=Z: rombuild-imaker-create-buildinfo
- *** WARNING: Can't find vcvars32.bat - MS Visual Studio not present.
- Buildfile: build.xml
-
- create-data-model-db:
-
- validate-at-startup:
- [python] ERROR: Description has no content for 'read.build.int'
- [python] WARNING: Required property not defined: sysdef.configurations.list
- [python] WARNING: Required property not defined: tsrc.data.dir
- [python] WARNING: Required property not defined: ats3.pathToDrop
- [python] WARNING: Required property not defined: ats3.host
- [python] WARNING: Required property not defined: ats.flash.images
- [python] WARNING: Required property not defined: ats.image.type
- [python] WARNING: Required property not defined: ats.drop.file
- [python] WARNING: Required property not defined: ats3.username
- [python] WARNING: Required property not defined: cache.drive
- [python] WARNING: Required property not defined: ats.product.name
- [python] WARNING: Required property not defined: ats3.password
-
- rombuild-imaker-create-buildinfo:
- [fmpp] File processed.
-
- BUILD SUCCESSFUL
- Total time: 3 seconds
-
The output
~~~~~~~~~~
@@ -89,25 +63,24 @@
##########################################################################
BUILD_LOGGING_KEY_STAGES = prep,build-ebs-main,postbuild,flashfiles,java-certification-rom,zip-main,publish-generic,variants-core,variants-elaf,variants-china,variants-thai,variants-japan,variants,mobilecrash-prep,localise-tutorial-content,hdd-images,zip-flashfiles,zip-localisation,data-packaging-prep
- BUILD_SUMMARY_FILE_2 = Z:\output\logs\summary\pf_5250_16_wk2008_summary.log2.xml
- BUILD_LOG = Z:\output\logs\pf_5250_16_wk2008_ant_build.log
- BUILD_NAME = pf_5250
- BUILD_CACHE_LOG_DIR = C:\DOCUME~1\wbernard\LOCALS~1\Temp\helium\pf_5250_16_wk2008\logs
+ BUILD_SUMMARY_FILE_2 = Z:\output\logs\summary\x_16_wk2008_summary.log2.xml
+ BUILD_LOG = Z:\output\logs\x_16_wk2008_ant_build.log
+ BUILD_NAME = x
+ BUILD_CACHE_LOG_DIR = C:\DOCUME~1\x\LOCALS~1\Temp\helium\x_16_wk2008\logs
BUILD_SYSTEM = ebs
BUILD_LOG_DIR = Z:\output\logs
- BUILD_CACHE_DIR = C:\DOCUME~1\wbernard\LOCALS~1\Temp\helium\pf_5250_16_wk2008
+ BUILD_CACHE_DIR = C:\DOCUME~1\x\LOCALS~1\Temp\helium\x_16_wk2008
BUILD_OUTPUT_DIR = Z:\output
- BUILD_SUMMARY_FILE = Z:\output\logs\pf_5250_16_wk2008_build_summary.xml
+ BUILD_SUMMARY_FILE = Z:\output\logs\x_16_wk2008_build_summary.xml
BUILD_VERSION = 0.0.1
BUILD_SYSTEM_EBS = Not used
BUILD_SISFILES_DIR = Z:\output\sisfiles
BUILD_ERRORS_LIMIT = 0
BUILD_DRIVE = Z:
BUILD_NUMBER = 1
- BUILD_DUPLICATES_LOG = Z:\output\logs\pf_5250_16_wk2008_build_duplicates.xml
+ BUILD_DUPLICATES_LOG = Z:\output\logs\x_16_wk2008_build_duplicates.xml
BUILD_LOGGING_START_STAGE = check-env-prep
- BUILD_ID = pf_5250_16_wk2008
-
+ BUILD_ID = x_16_wk2008
Download the example: `buildinfo_creation.zip `_
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/doc/src/user_graph.dot.ftl
--- a/buildframework/helium/doc/src/user_graph.dot.ftl Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/doc/src/user_graph.dot.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -107,7 +107,7 @@
"DOS Scripting" -> NOSE[dir=none, lhead=cluster_4_1, ltail=cluster_4];
- Start [fontcolor=navyblue,fontsize=12,style=filled,href="introduction.html"];
+ Start [fontcolor=navyblue,fontsize=12,style=filled,href="manual/introduction.html"];
Ant [fontcolor=navyblue,fontsize=12,shape=box,href="http://ant.apache.org/manual/"];
"Running Helium" [fontcolor=navyblue,fontsize=12,shape=box,href="manual/running.html"];
@@ -123,7 +123,7 @@
"ROM Image" [fontcolor=navyblue,fontsize=12,shape=box,href="tutorials/rom_image.html"];
<#if !(ant?keys?seq_contains("sf"))>
- "Setting up Helium at Nokia" [fontcolor=navyblue,fontsize=12,shape=box,href="nokia/nokia.html"];
+ "Setting up Helium at Nokia" [fontcolor=navyblue,fontsize=12,shape=box,href="manual/retrieving.html"];
"Helium Nokia Stages" [fontcolor=navyblue,fontsize=12,shape=box,href="manual/nokiastages.html"];
Diamonds [fontcolor=navyblue,fontsize=12,shape=box,href="http://diamonds.nmp.nokia.com/diamonds/"];
"Helium Wiki" [fontcolor=navyblue,fontsize=12,shape=box,href="http://delivery.nmp.nokia.com/trac/helium/wiki"];
@@ -131,7 +131,7 @@
MCL [fontcolor=navyblue,fontsize=12,shape=box,href="http://s60wiki.nokia.com/S60Wiki/S60_Software_Asset_Management/Organization/Delivery_Services/Howto_build_DFS70.91.91_/_S60.MCL_with_Helium"];
IDO [fontcolor=navyblue,fontsize=12,shape=box,href="http://helium.nmp.nokia.com/doc/ido"];
TeamCI [fontcolor=navyblue,fontsize=12,shape=box,href="http://helium.nmp.nokia.com/doc/teamci"];
- "Helium Test Plan" [fontcolor=navyblue,fontsize=12,shape=box,href="manual/testing.html"];
+ "Helium Test Plan" [fontcolor=navyblue,fontsize=12,shape=box,href="development/testing.html"];
#if>
"Helium Developer Guide" [fontcolor=navyblue,fontsize=12,shape=box,href="development/developer_guide.html"];
@@ -139,7 +139,7 @@
Python [fontcolor=navyblue,fontsize=12,shape=box,href="http://www.python.org/"];
Java [fontcolor=navyblue,fontsize=12,shape=box,href="http://java.sun.com/j2se/"];
FMPP [fontcolor=navyblue,fontsize=12,shape=box,href="http://fmpp.sourceforge.net/"];
- "DOS Scripting" [fontcolor=navyblue,fontsize=12,shape=box,href="http://en.wikipedia.org/wiki/Batch_script"];
+ "DOS Scripting" [fontcolor=navyblue,fontsize=12,shape=box,href="http://en.wikipedia.org/wiki/Batch_file"];
ANTUnit [fontcolor=navyblue,fontsize=12,shape=box,href="http://ant.apache.org/antlibs/antunit/"];
NOSE [fontcolor=navyblue,fontsize=12,shape=box,href="http://ivory.idyll.org/articles/nose-intro.html"];
}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/helium.ant.xml
--- a/buildframework/helium/helium.ant.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/helium.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -22,11 +22,11 @@
-->
- Main full build targets and properties
+ Main starting point to import Helium. Imports the other main Ant files.
-
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/deps/net.sourceforge.docutils/docutils/0.5/docutils-0.5.py2.5.egg
Binary file buildframework/helium/sf/deps/net.sourceforge.docutils/docutils/0.5/docutils-0.5.py2.5.egg has changed
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/deps/net.sourceforge.docutils/docutils/0.5/ivy.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/deps/net.sourceforge.docutils/docutils/0.5/ivy.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,33 @@
+
+
+
+
+
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/doc/src/index.rst
--- a/buildframework/helium/sf/doc/src/index.rst Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,19 +0,0 @@
-=============
-Helium Antlib
-=============
-.. _Helium Antlib Java API: api/javadoc/index.html
-.. _Helium Antlib AntDoclet: api/doclet/index.html
-
-Developer Documentation
-=======================
-.. toctree::
- :maxdepth: 1
- :glob:
-
- *
-
-API Documentation
-=================
- * `Helium Antlib AntDoclet`_
- * `Helium Antlib Java API`_
-
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/doc/src/index.rst.ftl
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/doc/src/index.rst.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,43 @@
+<#--
+============================================================================
+Name : .ftl
+Part of : Helium
+
+Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+All rights reserved.
+This component and the accompanying materials are made available
+under the terms of the License "Eclipse Public License v1.0"
+which accompanies this distribution, and is available
+at the URL "http://www.eclipse.org/legal/epl-v10.html".
+
+Initial Contributors:
+Nokia Corporation - initial contribution.
+
+Contributors:
+
+Description:
+
+============================================================================
+-->
+=============
+Helium Antlib
+=============
+
+Developer Documentation
+=======================
+.. toctree::
+ :maxdepth: 1
+ :glob:
+
+ *
+
+<#if !(ant?keys?seq_contains("sf"))>
+
+API Documentation
+=================
+ * `Helium Antlib AntDoclet`_
+ * `Helium Antlib Java API`_
+
+.. _Helium Antlib Java API: ../api/javadoc/index.html
+.. _Helium Antlib AntDoclet: ../api/doclet/index.html
+#if>
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/AntFile.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/AntFile.java Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/AntFile.java Mon Oct 11 11:16:47 2010 +0100
@@ -76,10 +76,14 @@
doc = contentHandler.getDocument();
}
catch (SAXException e) {
- throw new IOException(e.getMessage());
+ throw new IOException("Error parsing: " + path + " at " + e.getMessage(), e);
}
}
+ public String getName() {
+ return getFile().getName();
+ }
+
public Project getProject() {
return rootProject;
}
@@ -129,4 +133,6 @@
}
return antlibFiles;
}
+
+
}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/AntObjectMeta.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/AntObjectMeta.java Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/AntObjectMeta.java Mon Oct 11 11:16:47 2010 +0100
@@ -17,6 +17,7 @@
package com.nokia.helium.ant.data;
+import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
@@ -34,7 +35,7 @@
public class AntObjectMeta {
public static final Map SCOPES;
-
+
static {
Map tempMap = new HashMap();
tempMap.put("public", new Integer(1));
@@ -42,7 +43,7 @@
tempMap.put("private", new Integer(3));
SCOPES = Collections.unmodifiableMap(tempMap);
}
-
+
/** The default scope if an element does not have a defined scope. */
public static final String DEFAULT_SCOPE = "public";
@@ -88,7 +89,7 @@
* @param name Attribute name.
* @return Attribute value.
*/
- protected String getAttr(String name) {
+ public String getAttr(String name) {
if (node.getNodeType() == Node.ELEMENT_NODE) {
String value = ((Element) node).attributeValue(name);
if (value != null) {
@@ -116,6 +117,10 @@
return parent.getRootMeta();
}
+ public AntObjectMeta getParent() {
+ return parent;
+ }
+
/**
* Returns the Ant file this Ant object is contained in.
*/
@@ -150,11 +155,15 @@
*/
public String getLocation() {
RootAntObjectMeta rootMeta = getRootMeta();
- String location = rootMeta.getFilePath();
- if (node instanceof ElementWithLocation) {
- location += ":" + ((ElementWithLocation)node).getLineNumber();
+ return rootMeta.getFilePath() + ":" + getLineNumber();
+ }
+
+ public Integer getLineNumber() {
+ int lineNum = 0;
+ if (node != null && node instanceof ElementWithLocation) {
+ lineNum = ((ElementWithLocation) node).getLineNumber();
}
- return location;
+ return lineNum;
}
/**
@@ -210,10 +219,10 @@
public String getDeprecated() {
return comment.getTagValue("deprecated");
}
-
+
/**
- * Returns the content of the "since" tag that should indicate which release this feature
- * was first added.
+ * Returns the content of the "since" tag that should indicate which release
+ * this feature was first added.
*
* @return Since release number.
*/
@@ -247,7 +256,7 @@
this.comment = comment;
}
- private void processComment() {
+ private void processComment() {
Comment commentNode = getCommentNode();
if (commentNode != null) {
comment = new AntComment(commentNode);
@@ -259,8 +268,7 @@
Node commentNode = null;
if (node.getNodeType() == Node.COMMENT_NODE) {
commentNode = node;
- }
- else {
+ } else {
List children = node.selectNodes("preceding-sibling::node()");
if (children.size() > 0) {
// Scan past the text nodess, which are most likely whitespace
@@ -275,13 +283,12 @@
if (child.getNodeType() == Node.COMMENT_NODE) {
commentNode = child;
log("Node has comment: " + node.getStringValue(), Project.MSG_DEBUG);
- }
- else {
+ } else {
log("Node has no comment: " + node.toString(), Project.MSG_WARN);
}
}
}
- return (Comment)commentNode;
+ return (Comment) commentNode;
}
public void log(String text, int level) {
@@ -290,8 +297,36 @@
project.log(text, level);
}
}
-
+
public String toString() {
return getName();
}
+
+ public List<TaskMeta> getTasks(String taskType) {
+ return getTaskDefinitions(".//" + taskType);
+ }
+
+ @SuppressWarnings("unchecked")
+ public List<MacroMeta> getScriptDefinitions(String xpathExpression) {
+ List<Element> nodes = getNode().selectNodes(xpathExpression);
+ List<MacroMeta> scripts = new ArrayList<MacroMeta>();
+ for (Element node : nodes) {
+ MacroMeta macroMeta = new MacroMeta(this, node);
+ macroMeta.setRuntimeProject(getRuntimeProject());
+ scripts.add(macroMeta);
+ }
+ return scripts;
+ }
+
+ @SuppressWarnings("unchecked")
+ public List<TaskMeta> getTaskDefinitions(String xpathExpression) {
+ List<Element> nodes = getNode().selectNodes(xpathExpression);
+ List<TaskMeta> tasks = new ArrayList<TaskMeta>();
+ for (Element node : nodes) {
+ TaskMeta taskMeta = new TaskMeta(this, node);
+ taskMeta.setRuntimeProject(getRuntimeProject());
+ tasks.add(taskMeta);
+ }
+ return tasks;
+ }
}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/Database.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/Database.java Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/Database.java Mon Oct 11 11:16:47 2010 +0100
@@ -23,13 +23,10 @@
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
-import java.util.Iterator;
import java.util.List;
import java.util.Map;
-
import org.apache.tools.ant.Project;
-import org.apache.tools.ant.Target;
-
+import org.apache.tools.ant.ProjectHelper;
import com.nokia.helium.freemarker.WikiMethod;
import freemarker.cache.ClassTemplateLoader;
@@ -50,6 +47,8 @@
public static final String DEFAULT_SCOPE = "public";
public static final Map NAMESPACE_MAP;
+ private List<PropertyMeta> propertiesList;
+ private List<PropertyCommentMeta> commentPropertiesList;
private Project rootProject;
private Map antfilesMap;
private Map packagesMap;
@@ -65,23 +64,17 @@
this(project, DEFAULT_SCOPE);
}
- @SuppressWarnings("unchecked")
- public Database(Project project, String scopeFilter) throws IOException {
- this.rootProject = project;
+ public Database(Project rootProject, String scopeFilter) throws IOException {
+ this.rootProject = rootProject;
this.scopeFilter = scopeFilter;
antfilesMap = new HashMap();
packagesMap = new HashMap();
- if (project != null) {
- Map targets = project.getTargets();
- Iterator targetsIter = targets.values().iterator();
-
- while (targetsIter.hasNext()) {
- Target target = targetsIter.next();
- String antFilePath = new File(target.getLocation().getFileName()).getCanonicalPath();
-
- if (!antfilesMap.containsKey(antFilePath)) {
- addAntFile(antFilePath);
+ if (rootProject != null) {
+ ProjectHelper helper = (ProjectHelper) rootProject.getReference(ProjectHelper.PROJECTHELPER_REFERENCE);
+ for (Object antFilename : helper.getImportStack()) {
+ if (antFilename instanceof File) {
+ addAntFile(((File) antFilename).getCanonicalPath());
}
}
}
@@ -101,7 +94,7 @@
private void addAntFile(String antFilePath) throws IOException {
if (!antfilesMap.containsKey(antFilePath)) {
- log("Adding project to database: " + antFilePath, Project.MSG_DEBUG);
+ log("Adding project to database: " + antFilePath, Project.MSG_VERBOSE);
AntFile antfile = new AntFile(this, antFilePath, scopeFilter);
antfile.setProject(rootProject);
antfilesMap.put(antFilePath, antfile);
@@ -193,25 +186,29 @@
}
public List getProperties() {
- List propertiesList = new ArrayList();
- for (AntFile antfile : antfilesMap.values()) {
- RootAntObjectMeta rootMeta = antfile.getRootObjectMeta();
- if (rootMeta instanceof ProjectMeta) {
- propertiesList.addAll(((ProjectMeta) rootMeta).getProperties());
+ if (propertiesList == null) {
+ propertiesList = new ArrayList<PropertyMeta>();
+ for (AntFile antfile : antfilesMap.values()) {
+ RootAntObjectMeta rootMeta = antfile.getRootObjectMeta();
+ if (rootMeta instanceof ProjectMeta) {
+ propertiesList.addAll(((ProjectMeta) rootMeta).getProperties());
+ }
}
}
return propertiesList;
}
-
+
public List getCommentProperties() {
- List propertiesList = new ArrayList();
- for (AntFile antfile : antfilesMap.values()) {
- RootAntObjectMeta rootMeta = antfile.getRootObjectMeta();
- if (rootMeta instanceof ProjectMeta) {
- propertiesList.addAll(((ProjectMeta) rootMeta).getPropertyCommentBlocks());
+ if (commentPropertiesList == null) {
+ commentPropertiesList = new ArrayList<PropertyCommentMeta>();
+ for (AntFile antfile : antfilesMap.values()) {
+ RootAntObjectMeta rootMeta = antfile.getRootObjectMeta();
+ if (rootMeta instanceof ProjectMeta) {
+ commentPropertiesList.addAll(((ProjectMeta) rootMeta).getPropertyCommentBlocks());
+ }
}
}
- return propertiesList;
+ return commentPropertiesList;
}
public List getPackages() throws IOException {
@@ -221,4 +218,15 @@
}
return packages;
}
+
+ public List<TargetMeta> getTargets() {
+ List<TargetMeta> targets = new ArrayList<TargetMeta>();
+ for (AntFile antFile : antfilesMap.values()) {
+ RootAntObjectMeta rootMeta = antFile.getRootObjectMeta();
+ if (rootMeta instanceof ProjectMeta) {
+ targets.addAll(((ProjectMeta)rootMeta).getTargets());
+ }
+ }
+ return targets;
+ }
}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/MacroMeta.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/MacroMeta.java Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/MacroMeta.java Mon Oct 11 11:16:47 2010 +0100
@@ -17,6 +17,7 @@
package com.nokia.helium.ant.data;
+import java.util.ArrayList;
import java.util.List;
import org.dom4j.Element;
@@ -34,6 +35,10 @@
public String getDescription() {
return getAttr("description");
}
+
+ public String getText() {
+ return getNode().getText();
+ }
@SuppressWarnings("unchecked")
public String getUsage() {
@@ -66,4 +71,17 @@
return "\n" + macroElements + " ";
}
}
+
+ @SuppressWarnings("unchecked")
+ public List<String> getAttributes() {
+ List<String> attributes = new ArrayList<String>();
+ if (getNode().getNodeType() == Node.ELEMENT_NODE) {
+ Element element = (Element)getNode();
+ List<Element> attributeNodes = element.elements("attribute");
+ for (Element attributeNode : attributeNodes) {
+ attributes.add(attributeNode.attributeValue("name"));
+ }
+ }
+ return attributes;
+ }
}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/ProjectMeta.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/ProjectMeta.java Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/ProjectMeta.java Mon Oct 11 11:16:47 2010 +0100
@@ -111,19 +111,6 @@
return properties;
}
- @SuppressWarnings("unchecked")
- public List getMacros() {
- ArrayList objects = new ArrayList();
- List nodes = getNode().selectNodes("//macrodef | //scriptdef");
- for (Element node : nodes) {
- MacroMeta macroMeta = new MacroMeta(this, node);
- macroMeta.setRuntimeProject(getRuntimeProject());
- if (macroMeta.matchesScope(getScopeFilter())) {
- objects.add(macroMeta);
- }
- }
- return objects;
- }
@SuppressWarnings("unchecked")
public List getProjectDependencies() {
@@ -181,13 +168,12 @@
}
}
- @SuppressWarnings("unchecked")
+ @SuppressWarnings("rawtypes")
private String findSignalFailMode(String signalid, Document antDoc) {
XPath xpath2 = DocumentHelper.createXPath("//hlm:signalListenerConfig[@id='" + signalid
+ "']/signalNotifierInput/signalInput");
xpath2.setNamespaceURIs(Database.NAMESPACE_MAP);
List signalNodes3 = xpath2.selectNodes(antDoc);
-
for (Iterator iterator3 = signalNodes3.iterator(); iterator3.hasNext();) {
Element propertyNode3 = (Element) iterator3.next();
String signalinputid = propertyNode3.attributeValue("refid");
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/PropertyMeta.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/PropertyMeta.java Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/PropertyMeta.java Mon Oct 11 11:16:47 2010 +0100
@@ -29,6 +29,7 @@
public static final String STRING_TYPE = "string";
public static final String INTEGER_TYPE = "integer";
public static final String BOOLEAN_TYPE = "boolean";
+ public static final String FLOAT_TYPE = "float";
public static final String DEFAULT_TYPE = STRING_TYPE;
public PropertyMeta(AntObjectMeta parent, Node propNode) {
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/RootAntObjectMeta.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/RootAntObjectMeta.java Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/RootAntObjectMeta.java Mon Oct 11 11:16:47 2010 +0100
@@ -51,7 +51,7 @@
public AntFile getAntFile() {
return antFile;
}
-
+
/**
* Returns the location path of the object.
*
@@ -73,19 +73,15 @@
return this;
}
- @SuppressWarnings("unchecked")
- public List getMacros() throws IOException {
- ArrayList objects = new ArrayList();
- List nodes = getNode().selectNodes("//macrodef | //scriptdef");
- for (Element node : nodes) {
- MacroMeta macroMeta = new MacroMeta(this, node);
- macroMeta.setRuntimeProject(getRuntimeProject());
- if (macroMeta.matchesScope(scopeFilter)) {
- objects.add(macroMeta);
+ public List<MacroMeta> getMacros() {
+ List<MacroMeta> objects = getScriptDefinitions("//macrodef | //scriptdef");
+ List<MacroMeta> filteredList = new ArrayList<MacroMeta>();
+ for (MacroMeta macroMeta : objects) {
+ if (macroMeta.matchesScope(getScopeFilter())) {
+ filteredList.add(macroMeta);
}
}
- return objects;
+ return filteredList;
}
+
}
-
-
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/TargetMeta.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/TargetMeta.java Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/TargetMeta.java Mon Oct 11 11:16:47 2010 +0100
@@ -43,12 +43,36 @@
super(parent, node);
}
+ private String getPropertyFromDb(String prop) {
+ for (PropertyMeta property : getDatabase().getProperties()) {
+ if (property.getName().equals(prop)
+ && property.matchesScope(getRootMeta().getScopeFilter())) {
+ return prop;
+ }
+ }
+ for (PropertyCommentMeta property : getDatabase().getCommentProperties()) {
+ if (property.getName().equals(prop)
+ && property.matchesScope(getRootMeta().getScopeFilter())) {
+ return prop;
+ }
+ }
+ return null;
+ }
+
public String getIf() {
- return getAttr("if");
+ String propertyIf = getAttr("if");
+ if (!propertyIf.isEmpty() && getPropertyFromDb(propertyIf) != null) {
+ return propertyIf;
+ }
+ return "";
}
public String getUnless() {
- return getAttr("unless");
+ String propertyUnless = getAttr("unless");
+ if (!propertyUnless.isEmpty() && getPropertyFromDb(propertyUnless) != null) {
+ return propertyUnless;
+ }
+ return "";
}
public String getDescription() {
@@ -84,7 +108,7 @@
AntFile antFile = (AntFile) iterator.next();
RootAntObjectMeta rootObjectMeta = antFile.getRootObjectMeta();
if (rootObjectMeta instanceof ProjectMeta) {
- ProjectMeta projectMeta = (ProjectMeta)rootObjectMeta;
+ ProjectMeta projectMeta = (ProjectMeta) rootObjectMeta;
projectMeta.getConfigSignals(getName(), signals);
}
}
@@ -95,7 +119,17 @@
ArrayList properties = new ArrayList();
Visitor visitor = new AntPropertyVisitor(properties);
getNode().accept(visitor);
- return properties;
+ return filterPropertyDependencies(properties);
+ }
+
+ private List<String> filterPropertyDependencies(ArrayList<String> properties) {
+ List<String> propertiesFiltered = new ArrayList<String>();
+ for (String string : properties) {
+ if (getPropertyFromDb(string) != null) {
+ propertiesFiltered.add(string);
+ }
+ }
+ return propertiesFiltered;
}
private class AntPropertyVisitor extends VisitorSupport {
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/TaskContainerMeta.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/TaskContainerMeta.java Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/TaskContainerMeta.java Mon Oct 11 11:16:47 2010 +0100
@@ -106,7 +106,7 @@
if (name.endsWith("signal") || name.endsWith("execSignal")) {
String signalid = node.attributeValue("name");
- if (signalList != null) {
+ if (signalid != null && signalid.length() > 0) {
signalList.add(signalid);
}
}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/TaskMeta.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/TaskMeta.java Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,49 @@
+/*
+ * Copyright (c) 2007-2008 Nokia Corporation and/or its subsidiary(-ies).
+ * All rights reserved.
+ * This component and the accompanying materials are made available
+ * under the terms of the License "Eclipse Public License v1.0"
+ * which accompanies this distribution, and is available
+ * at the URL "http://www.eclipse.org/legal/epl-v10.html".
+ *
+ * Initial Contributors:
+ * Nokia Corporation - initial contribution.
+ *
+ * Contributors:
+ *
+ * Description:
+ *
+ */
+package com.nokia.helium.ant.data;
+
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import org.dom4j.Element;
+import org.dom4j.Node;
+
+/**
+ * Meta object representing an Ant task.
+ *
+ */
+public class TaskMeta extends AntObjectMeta {
+
+ public TaskMeta(AntObjectMeta parent, Node node) {
+ super(parent, node);
+ }
+
+ @SuppressWarnings("unchecked")
+ public Map<String, String> getParams() {
+ Map<String, String> params = new HashMap<String, String>();
+
+ if (getNode().getNodeType() == Node.ELEMENT_NODE) {
+ Element element = (Element)getNode();
+ List<Element> paramNodes = element.elements("param");
+ for (Element paramNode : paramNodes) {
+ params.put(paramNode.attributeValue("name"), paramNode.attributeValue("value"));
+ }
+ }
+ return params;
+ }
+}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/antlib.xml
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/antlib.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/antlib.xml Mon Oct 11 11:16:47 2010 +0100
@@ -23,6 +23,5 @@
-
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/taskdefs/AntConfigLintTask.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/taskdefs/AntConfigLintTask.java Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,119 +0,0 @@
-/*
- * Copyright (c) 2007-2008 Nokia Corporation and/or its subsidiary(-ies).
- * All rights reserved.
- * This component and the accompanying materials are made available
- * under the terms of the License "Eclipse Public License v1.0"
- * which accompanies this distribution, and is available
- * at the URL "http://www.eclipse.org/legal/epl-v10.html".
- *
- * Initial Contributors:
- * Nokia Corporation - initial contribution.
- *
- * Contributors:
- *
- * Description:
- *
- */
-
-package com.nokia.helium.ant.data.taskdefs;
-
-import java.io.IOException;
-import java.util.ArrayList;
-import java.util.HashMap;
-import java.util.List;
-import java.util.Map;
-
-import org.apache.tools.ant.BuildException;
-import org.apache.tools.ant.DynamicElement;
-import org.apache.tools.ant.Project;
-import org.apache.tools.ant.Task;
-
-import com.nokia.helium.ant.data.Database;
-import com.nokia.helium.ant.data.types.AntLintCheck;
-import com.nokia.helium.ant.data.types.LintIssue;
-
-/**
- * Another version of AntLint that fits with the antdata API more closely.
- */
-@SuppressWarnings("serial")
-public class AntConfigLintTask extends Task implements DynamicElement {
-
- // Temporary solution, should change to scanning jar
- public static final Map CHECKS = new HashMap() {
- {
- put("wrongtypepropertycheck", "WrongTypePropertyCheck");
- // put("protected", new Integer(2));
- // put("private", new Integer(3));
- }
- };
- private List checks = new ArrayList();
- private List issues = new ArrayList();
- private Database db;
- private int errorsTotal;
-
- public AntConfigLintTask() throws IOException {
- setTaskName("antconfiglint");
- }
-
- @SuppressWarnings("unchecked")
- public Object createDynamicElement(String name) {
- AntLintCheck check = null;
- String className = "com.nokia.helium.ant.data.types." + CHECKS.get(name);
- log("Creating check: " + className, Project.MSG_DEBUG);
- try {
- Class clazz = (Class) Class.forName(className);
- if (clazz != null) {
- check = (AntLintCheck) clazz.newInstance();
- checks.add(check);
- }
- }
- catch (ClassNotFoundException th) {
- th.printStackTrace();
- throw new BuildException("Error in Antlint configuration: " + th.getMessage());
- } catch (InstantiationException th) {
- th.printStackTrace();
- throw new BuildException("Error in Antlint configuration: " + th.getMessage());
- } catch (IllegalAccessException th) {
- th.printStackTrace();
- throw new BuildException("Error in Antlint configuration: " + th.getMessage());
- }
- return check;
- }
-
- public void execute() {
- errorsTotal = 0;
- try {
- db = new Database(getProject());
- if (checks.size() == 0) {
- throw new BuildException("No checks defined.");
- }
- for (AntLintCheck check : checks) {
- check.setTask(this);
-
- log("Running check: " + check, Project.MSG_DEBUG);
- check.run();
-
- }
- for (LintIssue issue : issues) {
- log(issue.toString());
- }
- if (errorsTotal > 0) {
- throw new BuildException("AntLint errors found: " + errorsTotal);
- }
- }
- catch (IOException e) {
- throw new BuildException(e.getMessage());
- }
- }
-
- public void addLintIssue(LintIssue issue) {
- issues.add(issue);
- if (issue.getSeverity() == AntLintCheck.SEVERITY_ERROR) {
- errorsTotal++;
- }
- }
-
- public Database getDatabase() {
- return db;
- }
-}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/types/AntLintCheck.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/types/AntLintCheck.java Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,108 +0,0 @@
-/*
- * Copyright (c) 2007-2008 Nokia Corporation and/or its subsidiary(-ies).
- * All rights reserved.
- * This component and the accompanying materials are made available
- * under the terms of the License "Eclipse Public License v1.0"
- * which accompanies this distribution, and is available
- * at the URL "http://www.eclipse.org/legal/epl-v10.html".
- *
- * Initial Contributors:
- * Nokia Corporation - initial contribution.
- *
- * Contributors:
- *
- * Description:
- *
- */
-
-package com.nokia.helium.ant.data.types;
-
-import java.io.IOException;
-
-import org.apache.tools.ant.types.DataType;
-
-import com.nokia.helium.ant.data.Database;
-import com.nokia.helium.ant.data.taskdefs.AntConfigLintTask;
-
-/**
- * An Ant Lint coding conventions check.
- */
-public abstract class AntLintCheck extends DataType {
- public static final int SEVERITY_ERROR = 0;
- public static final int DEFAULT_SEVERITY = SEVERITY_ERROR;
-
- private AntConfigLintTask task;
- private String name;
- private String text;
- private int severity = DEFAULT_SEVERITY;
-
- /**
- * Set the pattern text.
- *
- * @param text is the pattern text to set.
- */
- public void addText(String text) {
- this.text = text;
- }
-
- /**
- * Get the name of the Checker.
- *
- * @return name of the checker.
- */
- public String getName() {
- return name;
- }
-
- /**
- * Set the name of the Checker.
- *
- * @param name is the name of the checker to set.
- * @ant.required
- */
- public void setName(String name) {
- this.name = name;
- }
-
- /**
- * Get the pattern set for this Checker.
- *
- * @return the pattern.
- */
- public String getPattern() {
- return text;
- }
-
- /**
- * Get the severity.
- *
- * @return the severity
- */
- public int getSeverity() {
- return severity;
- }
-
- /**
- * Set the severity. (Valid values : error|warning)
- *
- * @param severity is the severity to set.
- * @ant.required
- */
- public void setSeverity(int severity) {
- this.severity = severity;
- }
-
- public AntConfigLintTask getTask() {
- return task;
- }
-
- public void setTask(AntConfigLintTask task) {
- this.task = task;
- }
-
- protected Database getDb() {
- return task.getDatabase();
- }
-
- public abstract void run() throws IOException;
-}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/types/LintIssue.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/types/LintIssue.java Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,69 +0,0 @@
-/*
- * Copyright (c) 2007-2008 Nokia Corporation and/or its subsidiary(-ies).
- * All rights reserved.
- * This component and the accompanying materials are made available
- * under the terms of the License "Eclipse Public License v1.0"
- * which accompanies this distribution, and is available
- * at the URL "http://www.eclipse.org/legal/epl-v10.html".
- *
- * Initial Contributors:
- * Nokia Corporation - initial contribution.
- *
- * Contributors:
- *
- * Description:
- *
- */
-
-package com.nokia.helium.ant.data.types;
-
-import com.nokia.helium.ant.data.AntFile;
-
-/**
- * An Ant lint issue.
- */
-public class LintIssue {
- private String description;
- private int level;
- private AntFile antfile;
- private String location;
-
- public LintIssue(String description, int level, String location) {
- super();
- this.description = description;
- this.level = level;
- this.location = location;
- }
-
- public String getDescription() {
- return description;
- }
-
- public void setDescription(String description) {
- this.description = description;
- }
-
- public int getSeverity() {
- return level;
- }
-
- public void setLevel(int level) {
- this.level = level;
- }
-
- public AntFile getAntfile() {
- return antfile;
- }
-
- public void setAntfile(AntFile antfile) {
- this.antfile = antfile;
- }
-
- public String getLocation() {
- return location;
- }
-
- public void setLocation(String location) {
- this.location = location;
- }
-}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/types/PropertyNameCheck.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/types/PropertyNameCheck.java Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,25 +0,0 @@
-/*
- * Copyright (c) 2007-2008 Nokia Corporation and/or its subsidiary(-ies).
- * All rights reserved.
- * This component and the accompanying materials are made available
- * under the terms of the License "Eclipse Public License v1.0"
- * which accompanies this distribution, and is available
- * at the URL "http://www.eclipse.org/legal/epl-v10.html".
- *
- * Initial Contributors:
- * Nokia Corporation - initial contribution.
- *
- * Contributors:
- *
- * Description:
- *
- */
-
-package com.nokia.helium.ant.data.types;
-
-/**
- * A check for the name of a property.
- */
-public class PropertyNameCheck {
-
-}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/types/WrongTypePropertyCheck.java
--- a/buildframework/helium/sf/java/antdata/src/com/nokia/helium/ant/data/types/WrongTypePropertyCheck.java Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,59 +0,0 @@
-/*
- * Copyright (c) 2007-2008 Nokia Corporation and/or its subsidiary(-ies).
- * All rights reserved.
- * This component and the accompanying materials are made available
- * under the terms of the License "Eclipse Public License v1.0"
- * which accompanies this distribution, and is available
- * at the URL "http://www.eclipse.org/legal/epl-v10.html".
- *
- * Initial Contributors:
- * Nokia Corporation - initial contribution.
- *
- * Contributors:
- *
- * Description:
- *
- */
-
-package com.nokia.helium.ant.data.types;
-
-import java.io.IOException;
-import java.util.List;
-
-import org.apache.tools.ant.Project;
-
-import com.nokia.helium.ant.data.PropertyMeta;
-
-/**
- * An AntLint check of the type of a property.
- */
-public class WrongTypePropertyCheck extends AntLintCheck {
- public static final String DESCRIPTION = "Property value does not match type";
-
- @Override
- public void run() throws IOException {
- List properties = getDb().getProperties();
- log("Properties total: " + properties.size(), Project.MSG_DEBUG);
- for (PropertyMeta propertyMeta : properties) {
- String type = propertyMeta.getType();
- String value = propertyMeta.getValue();
- if (value != null) {
- log("Testing for wrong type: " + propertyMeta.getName() + ", type: " + type, Project.MSG_DEBUG);
- if (type.equals("integer")) {
- try {
- Integer.decode(value);
- }
- catch (NumberFormatException e) {
- getTask().addLintIssue(new LintIssue(DESCRIPTION, getSeverity(), propertyMeta.getLocation()));
- }
- }
- }
- else {
- log("Testing for wrong type: " + propertyMeta.getName() + ": value cannot be found", Project.MSG_DEBUG);
- }
- }
- }
-}
-
-
-
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antdata/tests/data/test_project.ant.xml
--- a/buildframework/helium/sf/java/antdata/tests/data/test_project.ant.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antdata/tests/data/test_project.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -50,8 +50,18 @@
-
+
+
+
+
+
${property1}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antlint/ivy.xml
--- a/buildframework/helium/sf/java/antlint/ivy.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antlint/ivy.xml Mon Oct 11 11:16:47 2010 +0100
@@ -30,6 +30,6 @@
-
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antlint/src/com/nokia/helium/antlint/AntLintHandler.java
--- a/buildframework/helium/sf/java/antlint/src/com/nokia/helium/antlint/AntLintHandler.java Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,171 +0,0 @@
-/*
- * Copyright (c) 2007-2008 Nokia Corporation and/or its subsidiary(-ies).
- * All rights reserved.
- * This component and the accompanying materials are made available
- * under the terms of the License "Eclipse Public License v1.0"
- * which accompanies this distribution, and is available
- * at the URL "http://www.eclipse.org/legal/epl-v10.html".
- *
- * Initial Contributors:
- * Nokia Corporation - initial contribution.
- *
- * Contributors:
- *
- * Description:
- *
- */
-package com.nokia.helium.antlint;
-
-import org.xml.sax.Attributes;
-import org.xml.sax.Locator;
-import org.xml.sax.helpers.DefaultHandler;
-
-import com.nokia.helium.antlint.ant.types.AbstractCheck;
-import com.nokia.helium.antlint.ant.types.Check;
-
-/**
- * AntLintHandler is an SAX2 event handler class used to check for
- * tab characters and indents inside the xml elements.
- *
- */
-public class AntLintHandler extends DefaultHandler {
- private int indentLevel;
- private int indentSpace;
- private Locator locator;
- private boolean textElement;
- private int currentLine;
- private StringBuffer strBuff = new StringBuffer();
-
- private boolean indentationCheck;
- private boolean tabCharacterCheck;
-
- private Check check;
-
- /**
- * Create an instance of {@link AntLintHandler}.
- *
- * @param check
- * is the check to be performed.
- */
- public AntLintHandler(AbstractCheck check) {
- super();
- this.check = check;
- }
-
- /**
- * {@inheritDoc}
- */
- public void setDocumentLocator(Locator locator) {
- this.locator = locator;
- }
-
- /**
- * {@inheritDoc}
- */
- public void startDocument() {
- indentLevel -= 4;
- }
-
- /**
- * Set whether the handler should check for indentation or not.
- *
- * @param indentationCheck
- * a boolean value to set.
- */
- public void setIndentationCheck(boolean indentationCheck) {
- this.indentationCheck = indentationCheck;
- }
-
- /**
- * Set whether the handler should check for tab characters or not.
- *
- * @param tabCharacterCheck
- * is a boolean value to set.
- */
- public void setTabCharacterCheck(boolean tabCharacterCheck) {
- this.tabCharacterCheck = tabCharacterCheck;
- }
-
- /**
- * {@inheritDoc}
- */
- public void startElement(String uri, String name, String qName,
- Attributes atts) {
- countSpaces();
- indentLevel += 4; // When an element start tag is encountered,
- // indentLevel is increased 4 spaces.
- checkIndent();
- currentLine = locator.getLineNumber();
- }
-
- /**
- * {@inheritDoc}
- */
- public void endElement(String uri, String name, String qName) {
- countSpaces();
- // Ignore end tags in the same line
- if (currentLine != locator.getLineNumber()) {
- checkIndent();
- }
- indentLevel -= 4; // When an element end tag is encountered,
- // indentLevel is decreased 4 spaces.
- textElement = false;
- }
-
- /**
- * Check for indentation.
- *
- */
- private void checkIndent() {
- if (indentationCheck) {
- if ((indentSpace != indentLevel) && !textElement) {
- check.getReporter().report(check.getSeverity(),
- "Bad indentation", check.getAntFile(),
- locator.getLineNumber());
- }
- }
- }
-
- /**
- * {@inheritDoc}
- */
- public void characters(char[] ch, int start, int length) {
- for (int i = start; i < start + length; i++) {
- strBuff.append(ch[i]);
- }
- }
-
- /**
- * Method counts the number of spaces.
- */
- public void countSpaces() {
- // Counts spaces and tabs in every newline.
- int numSpaces = 0;
- for (int i = 0; i < strBuff.length(); i++) {
- switch (strBuff.charAt(i)) {
- case '\t':
- numSpaces += 4;
- if (tabCharacterCheck) {
- check.getReporter().report(check.getSeverity(),
- "Tabs should not be used!", check.getAntFile(),
- locator.getLineNumber());
- }
- break;
- case '\n':
- numSpaces = 0;
- break;
- case '\r':
- break;
- case ' ':
- numSpaces++;
- break;
- default:
- textElement = true;
- break;
- }
- }
- indentSpace = numSpaces;
- strBuff.delete(0, strBuff.length());
- }
-
-}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antlint/src/com/nokia/helium/antlint/ant/antlib.xml
--- a/buildframework/helium/sf/java/antlint/src/com/nokia/helium/antlint/ant/antlib.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antlint/src/com/nokia/helium/antlint/ant/antlib.xml Mon Oct 11 11:16:47 2010 +0100
@@ -27,8 +27,8 @@
-
+
@@ -39,16 +39,19 @@
-
-
+
+
+
+
+
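A minimal sketch of how the refactored checks might be wired in a build file, assuming the 'hlm' antlib prefix, nested fileset support and a severity attribute per check; the check element names follow the AntLintTask documentation below, and the property names are placeholders:

    <hlm:antlint>
        <!-- Ant files to verify; nested fileset support is an assumption. -->
        <fileset dir="${helium.dir}" includes="**/*.ant.xml"/>
        <!-- Each check is enabled individually; the severity attribute is an assumption. -->
        <hlm:checkTabCharacter severity="error"/>
        <hlm:checkIndentation severity="warning"/>
        <hlm:checkTargetName severity="warning"/>
    </hlm:antlint>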
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/antlint/src/com/nokia/helium/antlint/ant/taskdefs/AntLintTask.java
--- a/buildframework/helium/sf/java/antlint/src/com/nokia/helium/antlint/ant/taskdefs/AntLintTask.java Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/antlint/src/com/nokia/helium/antlint/ant/taskdefs/AntLintTask.java Mon Oct 11 11:16:47 2010 +0100
@@ -18,7 +18,9 @@
package com.nokia.helium.antlint.ant.taskdefs;
import java.io.File;
+import java.io.IOException;
import java.util.ArrayList;
+import java.util.Collection;
import java.util.List;
import java.util.Vector;
@@ -27,11 +29,14 @@
import org.apache.tools.ant.Task;
import org.apache.tools.ant.types.FileSet;
+import com.nokia.helium.ant.data.AntFile;
+import com.nokia.helium.ant.data.Database;
import com.nokia.helium.antlint.ant.AntlintException;
import com.nokia.helium.antlint.ant.Reporter;
import com.nokia.helium.antlint.ant.Severity;
import com.nokia.helium.antlint.ant.types.Check;
import com.nokia.helium.antlint.ant.types.ConsoleReporter;
+import com.nokia.helium.antlint.ant.types.Executor;
/**
* AntLint Task. This task checks for common coding conventions and errors in
@@ -42,30 +47,32 @@
*
 * CheckAntCall : checks whether antcall is used with no param elements and
 * calls target with no dependencies
- * CheckDescription : checks for project description
- * CheckDuplicateNames : checks for duplicate macros
- * CheckFileName : checks the naming convention of ant xml files
- * CheckIndentation : checks indentation
- * CheckJepJythonScript : checks the coding convention in Jep and Jython
- * scripts
- * CheckPresetDefMacroDefName : checks the naming convention of presetdef
+ * checkDescription : checks for project description
+ * checkDuplicateNames : checks for duplicate macros and task names
+ * checkFileName : checks the naming convention of ant xml files
+ * checkIndentation : checks indentation
+ * checkJythonScript : checks the coding convention Jython scripts
+ * checkPresetDefMacroDefName : checks the naming convention of presetdef
 * and macrodef
- * CheckProjectName : checks the naming convention of project
- * CheckPropertyName : checks the naming convention of properties
- *
+ * checkProjectName : checks the naming convention of project
+ * checkPropertyName : checks the naming convention of properties
+ * checkPropertyTypeAndValueMismatch : checks for property type and value
+ * mismatch
 * CheckPythonTasks : checks the coding convention of python tasks
- * CheckRunTarget : checks whether runtarget calls a target that has
+ * checkRunTarget : checks whether runtarget calls a target that has
 * dependencies
- * CheckScriptCondition : checks the coding convention in script condition
- * CheckScriptDef : checks the coding convention in scriptdef
- * CheckScriptDefNameAttributes - checks the naming convention of scriptdef
- * name attributes
- * CheckScriptDefStyle : checks the coding style of scriptdef
- * CheckScriptSize : checks the size of scripts
- * CheckTabCharacter : checks for tab characters
- * CheckTargetName : checks the naming convention of targets
- * CheckUseOfEqualsTask : checks the usage of equals task
- * CheckUseOfIfInTargets : checks the usage of if task inside targets
+ * checkScriptCondition : checks the coding convention in script condition
+ * CheckScriptDef : checks the coding convention in scriptdef and attributes
+ * used any
+ * checkScriptSize : checks the size of scripts
+ * checkTabCharacter : checks for tab characters
+ * checkTargetName : checks the naming convention of targets
+ * checkTryCatchBlock : checks for empty or more than one catch element in a
+ * try-catch block
+ * checkUseOfEqualsTask : checks the usage of equals task
+ * checkUseOfIfInTargets : checks the usage of if task inside targets
+ * checkVariableTask : checks whether value attribute for ant-contrib
+ * variable task is set or not when unset is set to true
+ *
+ * @ant.type name="blocksRepositoryExists" category="Blocks"
+ */
+
+public class RepositoryExists extends AbstractBlocksTask implements Condition {
+ private String name;
+
+ /**
+ * The name of the repository whose existence should be checked.
+ * @param name
+ * @ant.not-required
+ */
+ public void setName(String name) {
+ this.name = name;
+ }
+ /**
+ * {@inheritDoc}
+ */
+ @Override
+ public boolean eval() {
+ Blocks blocks = getBlocks();
+ try {
+ for (Repository repository : blocks.listRepository(getWsid())) {
+ if (name != null && name.equals(repository.getName())) {
+ return true;
+ } else if (name == null) {
+ return true;
+ }
+
+ }
+ } catch (BlocksException e) {
+ throw new BuildException(e);
+ }
+ return false;
+ }
+}
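A minimal usage sketch for this condition, assuming the 'hlm' antlib prefix and a wsid attribute inherited from AbstractBlocksTask; the property names are placeholders:

    <condition property="repo.exists">
        <!-- name is optional: without it the condition is true if any repository exists in the workspace. -->
        <hlm:blocksRepositoryExists wsid="${blocks.wsid}" name="myrepo"/>
    </condition>
    <echo message="Repository found: ${repo.exists}"/>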
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/blocks/src/com/nokia/helium/blocks/ant/conditions/WorkspaceExists.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/java/blocks/src/com/nokia/helium/blocks/ant/conditions/WorkspaceExists.java Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,112 @@
+/*
+* Copyright (c) 2007-2008 Nokia Corporation and/or its subsidiary(-ies).
+* All rights reserved.
+* This component and the accompanying materials are made available
+* under the terms of the License "Eclipse Public License v1.0"
+* which accompanies this distribution, and is available
+* at the URL "http://www.eclipse.org/legal/epl-v10.html".
+*
+* Initial Contributors:
+* Nokia Corporation - initial contribution.
+*
+* Contributors:
+*
+* Description:
+*
+*/
+package com.nokia.helium.blocks.ant.conditions;
+
+import hidden.org.codehaus.plexus.interpolation.os.Os;
+
+import java.io.File;
+import java.io.IOException;
+
+import org.apache.tools.ant.BuildException;
+import org.apache.tools.ant.taskdefs.condition.Condition;
+
+import com.nokia.helium.blocks.Blocks;
+import com.nokia.helium.blocks.BlocksException;
+import com.nokia.helium.blocks.Workspace;
+import com.nokia.helium.blocks.ant.AbstractBlocksTask;
+import com.nokia.helium.core.filesystem.windows.Subst;
+
+/**
+ * The blocksWorkspaceExists condition helps you to check the existence of a workspace
+ * based on its name or location.
+ *
+ * In the following example the property 'exists' is set if the blocks workspace named 'workspace_name' exists:
+ *
+ *
+ * @ant.task name="blocksAddWorkspace" category="Blocks"
+ */
+public class AddWorkspaceTask extends AbstractBlocksTask {
+
+ private String name;
+ private File dir;
+ private String wsidproperty;
+
+ /**
+ * {@inheritDoc}
+ */
+ @Override
+ public void execute() {
+ if (dir == null) {
+ throw new BuildException("dir attribute is not defined.");
+ }
+ if (name == null) {
+ throw new BuildException("name attribute is not defined.");
+ }
+ try {
+ Workspace workspace = getBlocks().addWorkspace(dir, name);
+ log("Workspace " + workspace.getWsid() + " has been created successfully.");
+ if (wsidproperty != null) {
+ getProject().setNewProperty(wsidproperty, "" + workspace.getWsid());
+ }
+ } catch (BlocksException exc) {
+ throw new BuildException(exc);
+ }
+ }
+
+ /**
+ * The name of the output property.
+ * @param wsidproperty
+ * @ant.required
+ */
+ public void setWsidproperty(String wsidproperty) {
+ this.wsidproperty = wsidproperty;
+ }
+
+ /**
+ * The name of the workspace.
+ * @param name
+ * @ant.required
+ */
+ public void setName(String name) {
+ this.name = name;
+ }
+
+
+ /**
+ * The location of the workspace.
+ * @param dir
+ * @ant.required
+ */
+ public void setDir(File dir) {
+ this.dir = dir;
+ }
+
+}
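A minimal usage sketch, assuming the 'hlm' antlib prefix; the dir, name and wsidproperty attributes map directly to the setters above, and the property names are placeholders:

    <!-- Creates a new Blocks workspace and stores its id in the blocks.wsid property. -->
    <hlm:blocksAddWorkspace dir="${build.drive}/workspace" name="my_workspace" wsidproperty="blocks.wsid"/>
    <echo message="New workspace id: ${blocks.wsid}"/>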
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/blocks/src/com/nokia/helium/blocks/ant/taskdefs/CreateRepositoryIndexTask.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/java/blocks/src/com/nokia/helium/blocks/ant/taskdefs/CreateRepositoryIndexTask.java Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,149 @@
+/*
+* Copyright (c) 2007-2008 Nokia Corporation and/or its subsidiary(-ies).
+* All rights reserved.
+* This component and the accompanying materials are made available
+* under the terms of the License "Eclipse Public License v1.0"
+* which accompanies this distribution, and is available
+* at the URL "http://www.eclipse.org/legal/epl-v10.html".
+*
+* Initial Contributors:
+* Nokia Corporation - initial contribution.
+*
+* Contributors:
+*
+* Description:
+*
+*/
+package com.nokia.helium.blocks.ant.taskdefs;
+
+import java.io.File;
+
+import org.apache.tools.ant.BuildException;
+import org.apache.tools.ant.Project;
+import org.apache.tools.ant.Task;
+
+import com.nokia.helium.blocks.Bundle;
+import com.nokia.helium.blocks.BundleException;
+import com.nokia.helium.core.plexus.AntStreamConsumer;
+
+/**
+ * This task will help you to create a Debian repository index.
+ * To generate the index you need to set the dest attribute to point
+ * to a directory containing .deb packages. The outcome of the process
+ * will be a Package file under the mentioned directory.
+ *
+ *
+ * @ant.task name="blocksCreateRepositoryIndex" category="Blocks"
+ */
+public class CreateRepositoryIndexTask extends Task {
+
+ private File dest;
+ private boolean sign;
+ private boolean failOnError = true;
+ private boolean verbose;
+
+ /**
+ * Location where to create the repository index.
+ * @param dest
+ * @ant.required
+ */
+ public void setDest(File dest) {
+ this.dest = dest;
+ }
+
+ /**
+ * Get the location where to create the repository index.
+ * @return the dest folder.
+ */
+ public File getDest() {
+ return dest;
+ }
+
+ /**
+ * Defines whether the repository index should be signed.
+ * @param sign If true the repository will be signed
+ * @ant.not-required Default is false
+ */
+ public void setSign(boolean sign) {
+ this.sign = sign;
+ }
+
+ /**
+ * Shall we sign the repository?
+ * @return true if the repository should be signed.
+ */
+ public boolean isSign() {
+ return sign;
+ }
+
+ /**
+ * If defined as true it will fail the build in case of error,
+ * else it will keep going.
+ * @param failOnError
+ * @ant.not-required Default is true.
+ */
+ public void setFailOnError(boolean failOnError) {
+ this.failOnError = failOnError;
+ }
+
+ /**
+ * Shall we fail on error?
+ * @return true if the build should fail on error.
+ */
+ public boolean isFailOnError() {
+ return failOnError;
+ }
+
+ /**
+ * Set true to show the output from blocks command execution.
+ * @param verbose
+ * @ant.not-required Default to false.
+ */
+ public void setVerbose(boolean verbose) {
+ this.verbose = verbose;
+ }
+
+ /**
+ * Are we executing the task in verbose mode?
+ * @return
+ */
+ public boolean isVerbose() {
+ return verbose;
+ }
+
+ /**
+ * {@inheritDoc}
+ */
+ public void execute() {
+ if (getDest() == null) {
+ throw new BuildException("'dest' attribute must be defined.");
+ }
+ try {
+ getBundle().createRepositoryIndex(getDest(), isSign());
+ } catch (BundleException e) {
+ if (isFailOnError()) {
+ throw new BuildException(e.getMessage(), e);
+ } else {
+ log(e.getMessage(), Project.MSG_ERR);
+ }
+ }
+ }
+
+ /**
+ * Get a pre-configured Bundle application wrapper instance.
+ * @return a new bundle object.
+ */
+ protected Bundle getBundle() {
+ Bundle bundle = new Bundle();
+ if (isVerbose()) {
+ bundle.addOutputLineHandler(new AntStreamConsumer(this, Project.MSG_INFO));
+ } else {
+ bundle.addOutputLineHandler(new AntStreamConsumer(this, Project.MSG_DEBUG));
+ }
+ bundle.addErrorLineHandler(new AntStreamConsumer(this, Project.MSG_ERR));
+ return bundle;
+ }
+}
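A minimal usage sketch, assuming the 'hlm' antlib prefix; the attribute names are the lower-case forms of the setters above, and the property names are placeholders:

    <!-- Generates the index for all .deb packages found under ${repo.dir}; failures are only logged. -->
    <hlm:blocksCreateRepositoryIndex dest="${repo.dir}" sign="false" verbose="true" failonerror="false"/>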
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/blocks/src/com/nokia/helium/blocks/ant/taskdefs/GetWorkspaceIdTask.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/java/blocks/src/com/nokia/helium/blocks/ant/taskdefs/GetWorkspaceIdTask.java Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,119 @@
+/*
+* Copyright (c) 2007-2008 Nokia Corporation and/or its subsidiary(-ies).
+* All rights reserved.
+* This component and the accompanying materials are made available
+* under the terms of the License "Eclipse Public License v1.0"
+* which accompanies this distribution, and is available
+* at the URL "http://www.eclipse.org/legal/epl-v10.html".
+*
+* Initial Contributors:
+* Nokia Corporation - initial contribution.
+*
+* Contributors:
+*
+* Description:
+*
+*/
+package com.nokia.helium.blocks.ant.taskdefs;
+
+import hidden.org.codehaus.plexus.interpolation.os.Os;
+
+import java.io.File;
+import java.io.IOException;
+
+import org.apache.tools.ant.BuildException;
+
+import com.nokia.helium.blocks.BlocksException;
+import com.nokia.helium.blocks.Workspace;
+import com.nokia.helium.blocks.ant.AbstractBlocksTask;
+import com.nokia.helium.core.filesystem.windows.Subst;
+
+/**
+ * Get the workspace based on a name or a directory. The task fails if it can't find the information.
+ *
+ * This call will set the id of the workspace named myworkspace into the wsid property:
+ *
+ *
+ *
+ * @ant.task name="stagerecord" category="Logging"
+ */
+public class StageRecord extends DataType {
+
+ private int logLevel = Project.MSG_INFO;
+ private File logFile;
+ private File defaultLogFile;
+ private boolean append = true;
+ private String stageRefId;
+
+ /**
+ * Sets output log file name.
+ * @param output the file to log into
+ * @ant.required
+ */
+
+ public void setOutput(File output) {
+ this.logFile = output;
+ }
+
+ /**
+ * Returns output log file name.
+ * @return
+ */
+
+ public File getOutput() {
+ return this.logFile;
+ }
+
+ /**
+ * Sets log level for respective stage.
+ * @param logLevel
+ * @ant.not-required
+ */
+
+ public void setLogLevel(VerbosityLevelChoices logLevel) {
+ this.logLevel = logLevel.getLevel();
+ }
+
+ /**
+ * Returns log level of respective stage.
+ * @return
+ */
+
+ public int getLogLevel() {
+ return this.logLevel;
+ }
+
+ /**
+ * Get the name of this StageRefID.
+ *
+ * @return name of the stage reference.
+ */
+ public String getStageRefID() {
+ return this.stageRefId;
+ }
+
+ /**
+ * Set the name of the StageRefID.
+ *
+ * @param name
+ * is the name to set.
+ * @ant.required
+ */
+ public void setStageRefId(String name) {
+ this.stageRefId = name;
+ }
+
+ /**
+ * Return default ant log file name.
+ * @return
+ */
+ public File getDefaultOutput() {
+ return this.defaultLogFile;
+ }
+
+ /**
+ * Set the default ant log name.
+ * @param name
+ * @ant.required
+ */
+ public void setDefaultOutput(File name) {
+ this.defaultLogFile = name;
+ }
+
+ /**
+ * Set append value.
+ * @param append
+ * @ant.not-required Default is true
+ */
+ public void setAppend(boolean append) {
+ this.append = append;
+ }
+
+ /**
+ * Return the append value.
+ * @return the append value.
+ */
+ public boolean getAppend() {
+ return this.append;
+ }
+
+
+}
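A minimal configuration sketch for the stagerecord type, assuming the 'hlm' antlib prefix, an id so the type can be referenced, and a stage named 'compile' declared elsewhere; the attribute names are the lower-case forms of the setters above, the loglevel values are assumed to follow the usual Ant verbosity names, and the property names are placeholders:

    <hlm:stagerecord id="record.compile" stagerefid="compile"
                     defaultoutput="${build.log.dir}/${build.id}_main.log"
                     output="${build.log.dir}/${build.id}_compile.log"
                     loglevel="verbose" append="true"/>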
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/src/com/nokia/helium/logger/ant/types/StageSummary.java
--- a/buildframework/helium/sf/java/logging/src/com/nokia/helium/logger/ant/types/StageSummary.java Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/src/com/nokia/helium/logger/ant/types/StageSummary.java Mon Oct 11 11:16:47 2010 +0100
@@ -18,12 +18,11 @@
import java.io.File;
-import org.apache.log4j.Logger;
-import org.apache.tools.ant.Project;
import org.apache.tools.ant.types.DataType;
+import com.nokia.helium.logger.ant.listener.CommonListenerRegister;
import com.nokia.helium.logger.ant.listener.StageSummaryHandler;
-import com.nokia.helium.logger.ant.listener.StatusAndLogListener;
+import com.nokia.helium.logger.ant.listener.CommonListener;
/**
* StageSummary is a Data type when set a build summary is
@@ -38,21 +37,9 @@
* @ant.task name="stagesummary" category="Logging"
*
*/
-public class StageSummary extends DataType {
-
- private static boolean isStageSummaryHandlerRegistered;
- private File template;
- private Logger log = Logger.getLogger(getClass());
+public class StageSummary extends DataType implements CommonListenerRegister {
- public void setProject(Project project)
- {
- super.setProject(project);
- if ( !isStageSummaryHandlerRegistered && StatusAndLogListener.getStatusAndLogListener() != null) {
- log.debug("Registering stage summary to the StatusAndLogListener listener");
- StatusAndLogListener.getStatusAndLogListener().register( new StageSummaryHandler() );
- isStageSummaryHandlerRegistered = true;
- }
- }
+ private File template;
/**
* Get the template used for displaying build stage summary.
@@ -73,4 +60,13 @@
public void setTemplate( File template ) {
this.template = template;
}
+
+ @Override
+ public void register(CommonListener commonListener) {
+ if (commonListener.getHandler(StageSummaryHandler.class) != null) {
+ log("Only one stageSummary configuration element should be used. Ignoring type at " + this.getLocation());
+ } else {
+ commonListener.register(new StageSummaryHandler(getTemplate()));
+ }
+ }
}
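A minimal configuration sketch for the stagesummary type, assuming the 'hlm' antlib prefix; template is the only attribute defined by this class, and ${stage.summary.template} is a placeholder for a FreeMarker template file:

    <hlm:stagesummary template="${stage.summary.template}"/>

As the register() method above shows, only the first stagesummary element is honoured; any further element is ignored with a log message.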
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/antunit/run-scenario.ant.xml
--- a/buildframework/helium/sf/java/logging/tests/antunit/run-scenario.ant.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/antunit/run-scenario.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -21,36 +21,36 @@
============================================================================
-->
- Helium Antlib logger macro.
+ Helium Antlib logger macro.
-
-
-
- --------------------------------------------
-
+
+
+
+ --------------------------------------------
+
-
- --------------------------------------------
-
-
+
+ --------------------------------------------
+
+
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/antunit/test_recorder.ant.xml
--- a/buildframework/helium/sf/java/logging/tests/antunit/test_recorder.ant.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/antunit/test_recorder.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -45,4 +45,19 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/antunit/test_stageslogging.ant.xml
--- a/buildframework/helium/sf/java/logging/tests/antunit/test_stageslogging.ant.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/antunit/test_stageslogging.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -27,6 +27,10 @@
+
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/scenarii/build_failure/build.xml
--- a/buildframework/helium/sf/java/logging/tests/scenarii/build_failure/build.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/scenarii/build_failure/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -28,6 +28,7 @@
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/scenarii/build_status/build.xml
--- a/buildframework/helium/sf/java/logging/tests/scenarii/build_status/build.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/scenarii/build_status/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -28,7 +28,8 @@
-
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/scenarii/inavlid_stage_refid_object/build.xml
--- a/buildframework/helium/sf/java/logging/tests/scenarii/inavlid_stage_refid_object/build.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/scenarii/inavlid_stage_refid_object/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -28,6 +28,7 @@
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/scenarii/invalid_stage_refid/build.xml
--- a/buildframework/helium/sf/java/logging/tests/scenarii/invalid_stage_refid/build.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/scenarii/invalid_stage_refid/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -28,6 +28,7 @@
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/scenarii/logger/build.xml
--- a/buildframework/helium/sf/java/logging/tests/scenarii/logger/build.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/scenarii/logger/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -127,5 +127,17 @@
+
+
+
+
+
+
+
+ test1
+
+ Not in mainlog
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/scenarii/missing_default_config/build.xml
--- a/buildframework/helium/sf/java/logging/tests/scenarii/missing_default_config/build.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/scenarii/missing_default_config/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -28,6 +28,7 @@
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/scenarii/missing_stage_refid/build.xml
--- a/buildframework/helium/sf/java/logging/tests/scenarii/missing_stage_refid/build.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/scenarii/missing_stage_refid/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -28,6 +28,7 @@
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/scenarii/override_scenario/build.xml
--- a/buildframework/helium/sf/java/logging/tests/scenarii/override_scenario/build.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/scenarii/override_scenario/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -28,6 +28,7 @@
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/scenarii/valid_build/build.xml
--- a/buildframework/helium/sf/java/logging/tests/scenarii/valid_build/build.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/scenarii/valid_build/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -39,8 +39,7 @@
-
-
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/scenarii/valid_build/stages_config.ant.xml
--- a/buildframework/helium/sf/java/logging/tests/scenarii/valid_build/stages_config.ant.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/scenarii/valid_build/stages_config.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -27,7 +27,8 @@
-
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/logging/tests/stages_config.ant.xml
--- a/buildframework/helium/sf/java/logging/tests/stages_config.ant.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/logging/tests/stages_config.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -27,7 +27,8 @@
-
+
+
@@ -38,6 +39,9 @@
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/metadata/doc/metadata.rst
--- a/buildframework/helium/sf/java/metadata/doc/metadata.rst Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/metadata/doc/metadata.rst Mon Oct 11 11:16:47 2010 +0100
@@ -49,8 +49,8 @@
Overview
--------
-Metadata filters are set of regular expressions used to match the text of the build output and process the errors, categorize it,
-and used to generate the output for diamonds, summary file, email output. Predefined set of ids are defined for each stage of the
+Metadata filters are a set of regular expressions used to match the text of the build output, process the errors and categorize them;
+they are used to generate the output for diamonds, the summary file and the email output. A predefined set of ids is defined for each stage of the
build. For example, for raptor compilation the filter is defined as below,
The default definition of filterset.sbs is
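The full default definition lives in helium/config/metadata_filter_config_default.xml; a purely illustrative sketch of the shape such a filter set takes (the element and attribute names here are assumptions) is::

    <hlm:metadatafilterset id="filterset.sbs">
        <!-- Each filter maps a regular expression to a severity and a short description. -->
        <metadatafilter priority="error"   regex="^\s*(fatal\s+)?error:.*"   description="compile error"/>
        <metadatafilter priority="warning" regex="^\s*warning:.*"            description="compile warning"/>
    </hlm:metadatafilterset>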
@@ -71,11 +71,11 @@
-The complete list of predefined ids for various stages of the build are defined in this file,
+The complete list of predefined ids for the various stages of the build is defined in the file
helium/config/metadata_filter_config_default.xml
-Each ID can be overridden to provide additional regular expression to control the results of the build for different stages.
+Each ID can be overridden to provide additional regular expressions to control the results of the build for different stages.
Two ways to add the regular expressions
---------------------------------------
@@ -104,9 +104,9 @@
Note
----
-1. The order of metadatafilter / metadatafilterset is important, so the first one takes precedence than the second one.
+1. The order of metadatafilter / metadatafilterset is important, so the first one takes precedence over the second one.
-2. Order is also preserved in the csv file, the expressions which are defined first get precedence than the later one.
+2. Order is also preserved in the csv file; the expressions which are defined first take precedence over the later ones.
3. All the regular expressions are JAVA patterns.
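As an illustration of the precedence rules above (again with assumed element and attribute names), the most specific expression should be declared first so that it wins over the generic one::

    <hlm:metadatafilterset id="filterset.custom">
        <!-- The first matching expression takes precedence, so the specific pattern comes first. -->
        <metadatafilter priority="error"   regex="^ERROR: fatal .*" description="fatal error"/>
        <metadatafilter priority="warning" regex="^ERROR: .*"       description="generic error"/>
    </hlm:metadatafilterset>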
@@ -181,14 +181,14 @@
Using the Metadata framework with FMPP
======================================
-The Metadata framework gives an efficient opportunity to record huge amount or data in a fast and reliable way (timewise and memory consumption-wise).
-Thanks to the ORMFMPPLoader database loader it is really simple to access those data and render then in any other format: HTML for easy to read build summary,
-XML to communicated with other tools, text file...
+The Metadata framework provides an efficient way to record huge amounts of data quickly and reliably (both timewise and memory consumption-wise).
+Thanks to the ORMFMPPLoader database loader it is really simple to access those data and render them in another format: HTML for an easy-to-read build summary,
+XML to communicate with other tools, a plain text file...
Loading a database
------------------
-A database can be load and assigned to a template variable using the pp.loadData functionnality from the FMPP task. The 'com.nokia.helium.metadata.ORMFMPPLoader'
+A database can be loaded and assigned to a template variable using the pp.loadData functionality from the FMPP task. The 'com.nokia.helium.metadata.ORMFMPPLoader'
accepts one argument, which is the path to the database.
Example::
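A sketch of what such a call might look like from Ant; the fmpp task attributes, the way the loader is invoked by its class name inside the data block, and the property names are assumptions::

    <fmpp sourceFile="${template.dir}/summary.html.ftl"
          outputFile="${build.log.dir}/summary.html">
        <data expandProperties="yes">
            dbdata: com.nokia.helium.metadata.ORMFMPPLoader(${metadata.dbfile})
        </data>
    </fmpp>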
@@ -206,7 +206,7 @@
---------------------------------------
The 'jpasingle' is the best way to access results from single values like count of entities. The jpasingle queries must be written in JPQL,
-please check the valid database schema in the previous section (case matter!).
+please check the valid database schema in the previous section (case matters!).
Example of a template that will return the number of log files recorded in the database::
@@ -217,7 +217,7 @@
--------------------------------
The JPA query allows you to perform a query and use the JPA entity objects directly inside the template. The jpa queries must be written in JPQL,
-please check the valid database schema in the previous section (case matter!).
+please check the valid database schema in the previous section (case matters!).
In the following example the query loops through the available log files::
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/metadata/src/com/nokia/helium/metadata/ant/antlib.xml
--- a/buildframework/helium/sf/java/metadata/src/com/nokia/helium/metadata/ant/antlib.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/metadata/src/com/nokia/helium/metadata/ant/antlib.xml Mon Oct 11 11:16:47 2010 +0100
@@ -26,6 +26,7 @@
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/metadata/src/com/nokia/helium/metadata/ant/types/CoverityLogMetaDataInput.java
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/java/metadata/src/com/nokia/helium/metadata/ant/types/CoverityLogMetaDataInput.java Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,103 @@
+/*
+* Copyright (c) 2007-2008 Nokia Corporation and/or its subsidiary(-ies).
+* All rights reserved.
+* This component and the accompanying materials are made available
+* under the terms of the License "Eclipse Public License v1.0"
+* which accompanies this distribution, and is available
+* at the URL "http://www.eclipse.org/legal/epl-v10.html".
+*
+* Initial Contributors:
+* Nokia Corporation - initial contribution.
+*
+* Contributors:
+*
+* Description:
+*
+*/
+package com.nokia.helium.metadata.ant.types;
+
+import java.io.BufferedReader;
+import java.io.File;
+import java.io.FileNotFoundException;
+import java.io.FileReader;
+import java.io.IOException;
+import java.util.Map;
+
+import javax.persistence.EntityManager;
+import javax.persistence.EntityManagerFactory;
+
+import com.nokia.helium.metadata.AutoCommitEntityManager;
+import com.nokia.helium.metadata.MetadataException;
+import com.nokia.helium.metadata.model.metadata.LogFile;
+import com.nokia.helium.metadata.model.metadata.MetadataEntry;
+import com.nokia.helium.metadata.model.metadata.Severity;
+import com.nokia.helium.metadata.model.metadata.SeverityDAO;
+
+/**
+ * This type specifies and uses the text logparser type to parse and store the data.
+ * This type will not replace any string of format [ababa] with ""
+ *
- <#list lognode.build[".//message[@priority='error']"] as message>
- ${message}
- #list>
-
-
- <#else>
- <#if (lognode.@filename[0])?exists>${lognode.@filename[0]}...OK #if>
- #if>
-#list>
-
-
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/src/templates/email_default.html.ftl
--- a/buildframework/helium/sf/java/signaling/src/templates/email_default.html.ftl Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,88 +0,0 @@
-<#--
-============================================================================
-Name : email.html.ftl
-Part of : Helium
-
-Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
-All rights reserved.
-This component and the accompanying materials are made available
-under the terms of the License "Eclipse Public License v1.0"
-which accompanies this distribution, and is available
-at the URL "http://www.eclipse.org/legal/epl-v10.html".
-
-Initial Contributors:
-Nokia Corporation - initial contribution.
-
-Contributors:
-
-Description:
-
-============================================================================
--->
-
-
-
-
-
-
- Build result e-mail from ${ant["env.COMPUTERNAME"]}.
-
-
-
-
-
-
-
This is an e-mail notification that a build has been completed on ${ant["env.COMPUTERNAME"]}
-
- ${signaling['signal.name']} is finished.
-
-
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/src/templates/email_subject.txt.ftl
--- a/buildframework/helium/sf/java/signaling/src/templates/email_subject.txt.ftl Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,22 +0,0 @@
-<#--
-============================================================================
-Name : email_subject.txt.ftl
-Part of : Helium
-
-Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
-All rights reserved.
-This component and the accompanying materials are made available
-under the terms of the License "Eclipse Public License v1.0"
-which accompanies this distribution, and is available
-at the URL "http://www.eclipse.org/legal/epl-v10.html".
-
-Initial Contributors:
-Nokia Corporation - initial contribution.
-
-Contributors:
-
-Description:
-
-============================================================================
--->
-${ant["build.id"]}:${ant["env.COMPUTERNAME"]}: ${signalname} alert
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/tests/antunit/signaling_test.ant.xml
--- a/buildframework/helium/sf/java/signaling/tests/antunit/signaling_test.ant.xml Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,272 +0,0 @@
-
-
-
-
- Test all the helium signals
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/tests/antunit/test_defered_failure.ant.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/java/signaling/tests/antunit/test_defered_failure.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,38 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/tests/antunit/test_executetasknotifier.ant.xml
--- a/buildframework/helium/sf/java/signaling/tests/antunit/test_executetasknotifier.ant.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/signaling/tests/antunit/test_executetasknotifier.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -65,20 +65,20 @@
-
+
-
+
-
+
-
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/tests/antunit/test_hasdeferredfailure.ant.xml
--- a/buildframework/helium/sf/java/signaling/tests/antunit/test_hasdeferredfailure.ant.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/signaling/tests/antunit/test_hasdeferredfailure.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -77,11 +77,7 @@
-
-
-
-
-
+
@@ -92,7 +88,7 @@
-
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/tests/antunit/test_signal_listener.ant.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/java/signaling/tests/antunit/test_signal_listener.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,55 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/tests/antunit/test_signaling_config.ant.xml
--- a/buildframework/helium/sf/java/signaling/tests/antunit/test_signaling_config.ant.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/signaling/tests/antunit/test_signaling_config.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -229,8 +229,8 @@
-
-
+
+
@@ -239,7 +239,7 @@
-
+
@@ -248,7 +248,7 @@
-
+
@@ -274,12 +274,12 @@
-
+
@@ -320,7 +320,7 @@
-
+
@@ -332,4 +332,39 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/tests/antunit/test_signaltask.ant.xml
--- a/buildframework/helium/sf/java/signaling/tests/antunit/test_signaltask.ant.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/signaling/tests/antunit/test_signaltask.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -64,7 +64,12 @@
-
+
+
+
+
+
+
@@ -101,13 +106,25 @@
+
+
+
+
+
+
+
+
+
+
+
+
-
+
@@ -120,13 +137,14 @@
-
+
+
-
+
@@ -136,4 +154,24 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/tests/build.xml
--- a/buildframework/helium/sf/java/signaling/tests/build.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/signaling/tests/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -24,17 +24,5 @@
Helium Antlib signaling tests.
-
-
-
-
-
-
-
-
-
-
-
-
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/tests/scenarii/dual-cond-config-failure/build.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/java/signaling/tests/scenarii/dual-cond-config-failure/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,41 @@
+
+
+
+
+
+
+
+
+
+ Signal: ${signal.name}
+ Message: ${signal.message}
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/tests/scenarii/signal-listener-test/build.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/java/signaling/tests/scenarii/signal-listener-test/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,94 @@
+
+
+
+
+
+
+
+
+
+ Signal: ${signal.name}
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/tests/scenarii/test-deferred-failure/build.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/java/signaling/tests/scenarii/test-deferred-failure/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,47 @@
+
+
+
+
+
+
+
+
+
+ Signal: ${signal.name}
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/signaling/tests/src/com/nokia/helium/signaling/tests/TestEmailSender.java
--- a/buildframework/helium/sf/java/signaling/tests/src/com/nokia/helium/signaling/tests/TestEmailSender.java Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/signaling/tests/src/com/nokia/helium/signaling/tests/TestEmailSender.java Mon Oct 11 11:16:47 2010 +0100
@@ -51,7 +51,7 @@
en.setTitle("test");
en.setSmtp("test");
en.setLdap("test");
- NotifierInput input = new NotifierInput();
+ NotifierInput input = new NotifierInput(p);
input.setFile(new File(System.getProperty("testdir") + "/tests/data/test.log_status.html"));
en.sendData("test", true, input, "Test Message");
}
@@ -70,7 +70,7 @@
en.setTitle("test");
en.setSmtp("test");
en.setLdap("test");
- NotifierInput input = new NotifierInput();
+ NotifierInput input = new NotifierInput(p);
en.sendData("test", true, input, "Test Message");
}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/sysdef/demo/data/sf/os/buildtools/bldsystemtools/sysdeftools/joinsysdef.pl
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/java/sysdef/demo/data/sf/os/buildtools/bldsystemtools/sysdeftools/joinsysdef.pl Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,20 @@
+#============================================================================
+#Name : .pl
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#============================================================================
+use FindBin '$Bin';
+system("python $Bin/joinsysdef_mock.py @ARGV");
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/java/sysdef/src/com/nokia/helium/sysdef/templates/root_sysdef_model.xml.ftl
--- a/buildframework/helium/sf/java/sysdef/src/com/nokia/helium/sysdef/templates/root_sysdef_model.xml.ftl Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/java/sysdef/src/com/nokia/helium/sysdef/templates/root_sysdef_model.xml.ftl Mon Oct 11 11:16:47 2010 +0100
@@ -142,7 +142,7 @@
]>
xmlns:vendor="${idnamespace}"#if>>
-<#list layers?keys as layer>
+<#list layers?keys?sort as layer>
<#list roots?keys as root>
<#if roots[root]?keys?seq_contains(layer)>
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/build.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/build.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,27 @@
+
+
+
+
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/ivy.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/ivy.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,36 @@
+
+
+
+
+
+
+
+
+
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/blockspackagercpythontests/__init__.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/blockspackagercpythontests/__init__.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,19 @@
+#============================================================================
+#Name : __init__.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/blockspackagertests/__init__.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/blockspackagertests/__init__.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,18 @@
+#============================================================================
+#Name : __init__.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/blockspackagertests/test_packager_cli.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/blockspackagertests/test_packager_cli.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,44 @@
+#============================================================================
+#Name : test_packager_cli.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+import unittest
+import unittestadditions
+skipTest = False
+try:
+ import packager.cli
+except ImportError:
+ skipTest = True
+import logging
+
+
+#logging.basicConfig(level=logging.DEBUG)
+logger = logging.getLogger('nokiatest.datasources')
+
+
+class CliTest(unittest.TestCase):
+ """ Verifying the datasource interface. """
+
+ @unittestadditions.skip(skipTest)
+ def test_cli(self):
+ """ Check that --help-datasource works. """
+ app = packager.cli.PackagerApp()
+ ret = app.execute(['--help-datasource'])
+ print ret
+ assert ret == 0, "Return value for help must be 0."
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/blockspackagertests/test_packager_datasources.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/blockspackagertests/test_packager_datasources.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,377 @@
+#============================================================================
+#Name : test_packager_datasources.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+import unittest
+from unittestadditions import skip
+skipTest = False
+try:
+ import packager.datasources
+except ImportError:
+ skipTest = True
+import os
+from StringIO import StringIO
+import tempfile
+import xml.sax
+import logging
+import sys
+
+#logging.basicConfig(level=logging.DEBUG)
+logger = logging.getLogger('nokiatest.datasources')
+
+class DataSourceInterfaceTest(unittest.TestCase):
+ """ Verifying the datasource interface. """
+
+ @skip(skipTest)
+ def test_datasource_getComponent(self):
+ """ Check that getComponent is not implemented. """
+ ds = packager.datasources.DataSource('/')
+ self.assertRaises(NotImplementedError, ds.getComponents)
+
+ @skip(skipTest)
+ def test_datasource_getHelp(self):
+ """ Check that no help is defined. """
+ ds = packager.datasources.DataSource('/')
+ self.assertEqual(None, ds.getHelp())
+ self.assertEqual(ds.help, ds.getHelp())
+
+class CMakerDataSourceTest(unittest.TestCase):
+ """ Unit test for CMakerDataSource """
+ @skip(skipTest)
+ def test_whatlog_missing(self):
+ """ getComponent should fail if whatlog is missing. """
+ data = {}
+ ds = packager.datasources.CMakerDataSource('/', data)
+ self.assertRaises(packager.datasources.MissingProperty, ds.getComponents)
+
+ @skip(skipTest)
+ def test_configdir_missing(self):
+ """ getComponent should fail if configdir is missing. """
+ data = {'whatlog': 'somevalue'}
+ ds = packager.datasources.CMakerDataSource('/', data)
+ self.assertRaises(packager.datasources.MissingProperty, ds.getComponents)
+
+ @skip(skipTest)
+ def test_invalid_whatlog_invalid_configdir(self):
+ """ getComponent should fail because whatlog doesn't exists. """
+ data = {'whatlog': 'somevalue', 'configdir': 'somevalue'}
+ ds = packager.datasources.CMakerDataSource('/', data)
+ self.assertRaises(Exception, ds.getComponents)
+
+ @skip(skipTest)
+ def test_valid_whatlog_invalid_configdir(self):
+ """ getComponent should fail because configdir doesn't exists. """
+ data = {'whatlog': __file__, 'configdir': 'somevalue'}
+ ds = packager.datasources.CMakerDataSource('/', data)
+ self.assertRaises(Exception, ds.getComponents)
+
+ @skip(skipTest)
+ def test_install_log_parsing(self):
+ """ Test the parsing of a regular cmaker install log. """
+ log = r"""C:\APPS\actperl\bin\perl.exe -e 'use File::Copy; copy(q(src/env.mk),q(/epoc32/tools/cmaker/env.mk))'
+C:\APPS\actperl\bin\perl.exe -e 'use File::Copy; copy(q(src/functions.mk),q(/epoc32/tools/cmaker/functions.mk))'
+C:\APPS\actperl\bin\perl.exe -e 'use File::Copy; copy(q(src/include_template.mk),q(/epoc32/tools/cmaker/include_template.mk))'
+C:\APPS\actperl\bin\perl.exe -e 'use File::Copy; copy(q(src/settings.mk),q(/epoc32/tools/cmaker/settings.mk))'
+C:\APPS\actperl\bin\perl.exe -e 'use File::Copy; copy(q(src/tools.mk),q(/epoc32/tools/cmaker/tools.mk))'
+C:\APPS\actperl\bin\perl.exe -e 'use File::Copy; copy(q(src/utils.mk),q(/epoc32/tools/cmaker/utils.mk))'
+C:\APPS\actperl\bin\perl.exe -e 'use File::Copy; copy(q(bin/mingw_make.exe),q(/epoc32/tools/rom/mingw_make.exe))'
+C:\APPS\actperl\bin\perl.exe -e 'use File::Copy; copy(q(src/cmaker.cmd),q(/epoc32/tools/cmaker.cmd))'
+"""
+ (handle, filename) = tempfile.mkstemp()
+ os.write(handle, log)
+ os.close(handle)
+
+ data = {'whatlog': filename, 'configdir': os.path.dirname(__file__)}
+ ds = packager.datasources.CMakerDataSource('/', data)
+ components = ds.getComponents()
+ assert len(components) == 1
+ assert len(components[0].getTargetFiles()) == 8
+ assert 'epoc32/tools/rom/mingw_make.exe' in components[0].getTargetFiles()
+
+ os.remove(filename)
+
+
+ @skip(skipTest)
+ def test_what_log_parsing_windows(self):
+ """ Test the parsing of a regular cmaker what log (windows). """
+ if sys.platform == 'win32':
+ log = r"""\epoc32\tools\rom\image.txt
+\CreateImage.cmd
+cd \config\overlay && xcopy *.* \ /F /R /Y /S
+0 File(s) copied
+cd \tools\toolsmodTB92 && xcopy *.* \ /F /R /Y /S
+Y:\tools\toolsmodTB92\epoc32\tools\abld.pl -> Y:\epoc32\tools\abld.pl
+Y:\tools\toolsmodTB92\epoc32\tools\bldmake.pl -> Y:\epoc32\tools\bldmake.pl
+"""
+ (handle, filename) = tempfile.mkstemp()
+ os.write(handle, log)
+ os.close(handle)
+
+ data = {'whatlog': filename, 'configdir': os.path.dirname(__file__)}
+ ds = packager.datasources.CMakerDataSource('/', data)
+ components = ds.getComponents()
+ assert len(components) == 1
+ assert len(components[0].getTargetFiles()) == 2
+ print components[0].getTargetFiles()
+ assert 'CreateImage.cmd' in components[0].getTargetFiles()
+ assert 'epoc32/tools/rom/image.txt' in components[0].getTargetFiles()
+ assert 'epoc32/tools/abld.pl' not in components[0].getTargetFiles()
+ assert 'epoc32/tools/bldmake.pl' not in components[0].getTargetFiles()
+
+ os.remove(filename)
+
+ @skip(skipTest)
+ def test_what_log_parsing_linux(self):
+ """ Test the parsing of a regular cmaker what log (linux). """
+ if sys.platform != 'win32':
+ log = r"""/epoc32/tools/rom/image.txt
+/CreateImage.cmd
+"""
+ (handle, filename) = tempfile.mkstemp()
+ os.write(handle, log)
+ os.close(handle)
+
+ data = {'whatlog': filename, 'configdir': os.path.dirname(__file__)}
+ ds = packager.datasources.CMakerDataSource('/', data)
+ components = ds.getComponents()
+ assert len(components) == 1
+ assert len(components[0].getTargetFiles()) == 2
+ print components[0].getTargetFiles()
+ assert 'CreateImage.cmd' in components[0].getTargetFiles()
+ assert 'epoc32/tools/rom/image.txt' in components[0].getTargetFiles()
+
+ os.remove(filename)
+
+
+ @skip(skipTest)
+ def test_getHelp(self):
+ """ Check that help is defined for CMakerDataSource. """
+ ds = packager.datasources.CMakerDataSource('/', {})
+ self.assertNotEqual(None, ds.getHelp())
+ self.assertEqual(ds.help, ds.getHelp())
+
+
+class SBSDataSourceTest(unittest.TestCase):
+ """ Unit test case for SBSDataSource """
+ @skip(skipTest)
+ def test_getHelp(self):
+ """ Check that help is defined for SBSDataSource. """
+ ds = packager.datasources.SBSDataSource('/', {})
+ self.assertNotEqual(None, ds.getHelp())
+ self.assertEqual(ds.help, ds.getHelp())
+
+
+class SysdefComponentListTest(unittest.TestCase):
+ """ Unit test case for packager.datasources.sbs.SysdefComponentList """
+ sysdef = None
+
+ def setUp(self):
+ self.sysdef = StringIO("""
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+""")
+
+ @skip(skipTest)
+ def test_unit_parsing(self):
+ """ SysdefComponentList extract correctly the units... """
+ cl = packager.datasources.sbs.SysdefComponentList('/')
+ p = xml.sax.make_parser()
+ p.setContentHandler(cl)
+ p.parse(self.sysdef)
+ assert len(cl) == 2
+ assert cl['unit1_name']['path'] == os.path.normpath('/path/to/component1')
+ assert cl['unit2_name']['path'] == os.path.normpath('/path/to/Component2')
+ assert cl['unit2_name']['name'] == 'unit2_name'
+
+ @skip(skipTest)
+ def test_get_component_name_by_path(self):
+ """ Check if get_component_name_by_path is case unsensitive. """
+ cl = packager.datasources.sbs.SysdefComponentList('/')
+ p = xml.sax.make_parser()
+ p.setContentHandler(cl)
+ p.parse(self.sysdef)
+
+ # reading path should be case independent.
+ assert cl.get_component_name_by_path(os.path.normpath('/path/to/Component2')) == 'unit2_name'
+ assert cl.get_component_name_by_path(os.path.normpath('/path/to/component2')) == 'unit2_name'
+
+ @skip(skipTest)
+ def test_get_component_name_by_path_invalid(self):
+ """ Check that get_component_name_by_path is raising an exception if """
+ cl = packager.datasources.sbs.SysdefComponentList('/')
+ p = xml.sax.make_parser()
+ p.setContentHandler(cl)
+ p.parse(self.sysdef)
+
+ # reading path should be case independent.
+ try:
+ cl.get_component_name_by_path(os.path.normpath('/path/to/invalid'))
+ except packager.datasources.sbs.ComponentNotFound:
+ pass
+ else:
+ self.fail("Expected get_component_name_by_path to raise an exception in case of non-existing component definition.")
+
+
+class SysdefComponentListSysdef3ParsingTest(unittest.TestCase):
+ """ Unit test case for packager.datasources.sbs.SysdefComponentList """
+ sysdef = None
+
+ def setUp(self):
+ self.sysdef = StringIO("""
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+""")
+
+ @skip(skipTest)
+ def test_unit_parsing(self):
+ """ SysdefComponentList extract correctly the units... """
+ cl = packager.datasources.sbs.SysdefComponentList('/')
+ p = xml.sax.make_parser()
+ p.setContentHandler(cl)
+ p.parse(self.sysdef)
+ assert len(cl) == 2
+ print cl
+ assert cl['helloworldcons_app_sf_app_helloworldcons_group']['path'] == os.path.normpath('/sf/app/helloworldcons/group')
+ assert cl['helloworld_api_sf_mw_helloworldapi_group']['path'] == os.path.normpath('/sf/mw/helloworldapi/group')
+ assert cl['helloworld_api_sf_mw_helloworldapi_group']['name'] == 'helloworld_api'
+
+ @skip(skipTest)
+ def test_get_component_name_by_path(self):
+ """ Check if get_component_name_by_path is case unsensitive. """
+ cl = packager.datasources.sbs.SysdefComponentList('/')
+ p = xml.sax.make_parser()
+ p.setContentHandler(cl)
+ p.parse(self.sysdef)
+
+ # reading path should be case independent.
+ assert cl.get_component_name_by_path(os.path.normpath('/sf/app/helloworldcons/group')) == 'helloworldcons_app_sf_app_helloworldcons_group'
+ assert cl.get_component_name_by_path(os.path.normpath('/sf/mw/helloworldapi/group')) == 'helloworld_api_sf_mw_helloworldapi_group'
+
+ @skip(skipTest)
+ def test_get_component_name_by_path_invalid(self):
+ """ Check that get_component_name_by_path is raising an exception if """
+ cl = packager.datasources.sbs.SysdefComponentList('/')
+ p = xml.sax.make_parser()
+ p.setContentHandler(cl)
+ p.parse(self.sysdef)
+
+ # reading path should be case independent.
+ try:
+ cl.get_component_name_by_path(os.path.normpath('/path/to/invalid'))
+ except packager.datasources.sbs.ComponentNotFound:
+ pass
+ else:
+ self.fail("Expected get_component_name_by_path to raise an exception in case of non-existing component definition.")
+
+
+class ObyParserTest(unittest.TestCase):
+ """ Unit test case for packager.datasources.imaker.ObyParser """
+ oby = None
+
+ def setUp(self):
+ (hld, filename) = tempfile.mkstemp(".oby", "datasource_test")
+ os.write(hld, """
+rofssize=0x10000000
+# file=\\epoc32\\release\\ARMV5\\urel\\COMMENT.DLL "Sys\\Bin\\EDISP.DLL"
+file=\\epoc32\\release\\ARMV5\\urel\\edisp.dll "Sys\\Bin\\EDISP.DLL"
+data=\\epoc32\\data\\Z\\Private\\10202BE9\\20000585.txt "Private\\10202BE9\\20000585.txt"
+extension[0x09080004]=\\epoc32\\release\ARMV5\urel\power_resources.dll "Sys\\Bin\\power_resources.dll"
+variant[0x09080004]=\\epoc32\\release\\ARMV5\\urel\\ecust.b23b7726cf4b5801b0dc14102b245fb8.dll "Sys\\Bin\\ecust.dll"
+# file="\\epoc32\\release\\ARMV5\\urel\\edisp.dll" "Sys\\Bin\\EDISP.DLL"
+data="/output/release_flash_images/langpack_01/rofs2/variant/private/10202be9/10281872.txt" "private\10202be9\10281872.txt"
+""")
+ os.close(hld)
+ self.oby = filename
+
+ def tearDown(self):
+ os.remove(self.oby)
+
+ @skip(skipTest)
+ def test_oby(self):
+ """ Testing the extraction of source files from an processed Oby file. """
+ print self.oby
+ p = packager.datasources.imaker.ObyParser('/', self.oby)
+ files = p.getSourceFiles()
+ print files
+ assert len(files) == 5
+ assert os.path.normpath(r'\epoc32\release\ARMV5\urel\edisp.dll'.replace('\\', os.sep).replace('/', os.sep)) in files
+ assert os.path.normpath(r'\epoc32\data\Z\Private\10202BE9\20000585.txt'.replace('\\', os.sep).replace('/', os.sep)) in files
+ assert os.path.normpath(r'\epoc32\release\ARMV5\urel\power_resources.dll'.replace('\\', os.sep).replace('/', os.sep)) in files
+ assert os.path.normpath(r'\epoc32\release\ARMV5\urel\ecust.b23b7726cf4b5801b0dc14102b245fb8.dll'.replace('\\', os.sep).replace('/', os.sep)) in files
+ assert os.path.normpath(r'/output/release_flash_images/langpack_01/rofs2/variant/private/10202be9/10281872.txt'.replace('\\', os.sep).replace('/', os.sep)) in files
+
+
+class ConEDataSourceTest(unittest.TestCase):
+ """ ConfToolDataSource unittest. """
+
+ @skip(skipTest)
+ def test_cone_input(self):
+ """ Testing ConE output log parsing. """
+ log = """ Generating file '\\epoc32\\release\\winscw\\urel\\z\\private\\10202BE9\\10208dd7.txt'...
+DEBUG : cone.crml(assets/symbianos/implml/usbmanager_10285c46.crml)
+ Generating file '\\epoc32\\release\\winscw\\urel\\z\\private\\10202BE9\\10285c46.txt'...
+DEBUG : cone.crml(assets/symbianos/implml/usbmanager_10286a43.crml)
+ Generating file '\\epoc32\\release\\winscw\\urel\\z\\private\\10202BE9\\10286a43.txt'...
+INFO : cone
+ Adding impl CrmlImpl(ref='assets/symbianos/implml/apputils_100048aa.crml', type='crml', index=0)
+INFO : cone
+"""
+ (handle, filename) = tempfile.mkstemp()
+ os.write(handle, log)
+ os.close(handle)
+ data = {'filename': filename, 'name': 'cone', 'version': '1.0'}
+ ds = packager.datasources.ConEDataSource('/', data)
+ components = ds.getComponents()
+ assert len(components) == 1
+ print components[0].getTargetFiles()
+ assert len(components[0].getTargetFiles()) == 3
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/blockspackagertests/test_packager_io.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/blockspackagertests/test_packager_io.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,113 @@
+#============================================================================
+#Name : test_packager_io.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+# pylint: disable=E0602
+import unittest
+import logging
+from unittestadditions import skip
+skipTest = False
+try:
+ import packager.io
+ from Blocks.Packaging.BuildData import *
+except ImportError:
+ skipTest = True
+
+#logging.basicConfig(level=logging.DEBUG)
+logger = logging.getLogger('nokiatest.datasources')
+
+class BuildDataSerializerTest(unittest.TestCase):
+ """ Check the de/serialisation of PlainBuildData objects. """
+
+ @skip(skipTest)
+ def test_serialization_deserialization(self):
+ """ Check if a serialized PlainBuildData can be deserialized correctly. """
+ bd = PlainBuildData()
+ bd.setComponentName("foobar")
+ bd.setComponentVersion("99")
+ bd.setSourceRoot('/src')
+ bd.setTargetRoot('/epoc32')
+ bd.addSourceFiles(['src.txt', 'cmp/src.txt'])
+ bd.addTargetFiles(['release/armv5/urel/target.dll', 'release/armv5/lib/target.lib'])
+ data_xml = packager.io.BuildDataSerializer(bd).toXml()
+ bdx = packager.io.BuildDataSerializer().fromXml(data_xml)
+ self.assertEquals(bd.getComponentName(), bdx.getComponentName())
+ self.assertEquals(bd.getComponentVersion(), bdx.getComponentVersion())
+ self.assertEquals(bd.getSourceRoot(), bdx.getSourceRoot())
+ self.assertEquals(bd.getTargetRoot(), bdx.getTargetRoot())
+ self.assertEquals(bd.getSourceFiles(), bdx.getSourceFiles())
+ self.assertEquals(bd.getTargetFiles(), bdx.getTargetFiles())
+ self.assertEquals(len(bdx.getSourceFiles()), 2)
+ self.assertEquals(len(bdx.getTargetFiles()), 2)
+ assert 'release/armv5/urel/target.dll' in bdx.getTargetFiles()
+ assert 'release/armv5/lib/target.lib' in bdx.getTargetFiles()
+ assert 'src.txt' in bdx.getSourceFiles()
+ assert 'cmp/src.txt' in bdx.getSourceFiles()
+
+class BdFileSerializerTest(unittest.TestCase):
+ """ Verifying the datasource interface. """
+
+ @skip(skipTest)
+ def test_serialization_deserialization(self):
+ """ Check if a serialized BdFile can be deserialized correctly. """
+ bd = BdFile("epoc32/release/armv5/urel/target.dll")
+ bd.getVariantPlatform()
+ bd.addSourceDependency("/src/src.txt")
+ bd.addOwnerDependency("/epoc32/release/armv5/urel/target.dll")
+ data_xml = packager.io.BdFileSerializer(bd).toXml()
+ bdx = packager.io.BdFileSerializer().fromXml(data_xml)
+ self.assertEquals(bd.getPath(), bdx.getPath())
+ self.assertEquals(bd.getVariantPlatform(), bdx.getVariantPlatform())
+ self.assertEquals(bd.getVariantType(), bdx.getVariantType())
+ self.assertEquals(bd.getSourceDependencies(), bdx.getSourceDependencies())
+ self.assertEquals(bd.getOwnerDependencies(), bdx.getOwnerDependencies())
+
+ assert len(bd.getSourceDependencies()) == 1
+ assert len(bd.getOwnerDependencies()) == 1
+
+ assert "/src/src.txt" in bd.getSourceDependencies()
+ assert '/epoc32/release/armv5/urel/target.dll' in bd.getOwnerDependencies()
+
+
+class BuildDataMergerTest(unittest.TestCase):
+ """ Unit test case for packager.io.BuildDataMerger """
+ @skip(skipTest)
+ def test_merge(self):
+ """ Testing a simple merge. """
+ bd = PlainBuildData()
+ bd.setComponentName("foobar")
+ bd.setComponentVersion("99")
+ bd.setSourceRoot('/src')
+ bd.setTargetRoot('/epoc32')
+ bd.addSourceFiles(['src.txt', 'cmp/src.txt'])
+ bd.addTargetFiles(['release/armv5/urel/target.dll', 'release/armv5/lib/target.lib'])
+
+ bd2 = PlainBuildData()
+ bd2.setComponentName("foobar")
+ bd2.setComponentVersion("99")
+ bd2.setSourceRoot('/src')
+ bd2.setTargetRoot('/epoc32')
+ bd2.addSourceFiles(['src.txt', 'cmp/src.txt', 'cmp2/src.txt'])
+ bd2.addTargetFiles(['release/armv5/urel/target2.dll'])
+
+ m = packager.io.BuildDataMerger(bd)
+ m.merge(bd2)
+
+ assert len(bd.getSourceFiles()) == 3
+ assert len(bd.getTargetFiles()) == 3
+
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/packager/__init__.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/packager/__init__.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,18 @@
+#============================================================================
+#Name : __init__.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/packager/cli.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/packager/cli.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,199 @@
+#============================================================================
+#Name : cli.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+import sys
+import re
+import os
+import xml.dom.minidom
+from optparse import OptionParser
+import Blocks
+from packager.io import BuildDataSerializer, BuildDataMerger
+from Blocks.Packaging.DependencyProcessors.DefaultProcessors import BuildDataDependencyProcessor
+import packager.datasources
+import logging
+logging.basicConfig(level=logging.INFO)
+
+
+
+
+class PackagerApp:
+ """ The packager CLI implementation. """
+ def __init__(self):
+ self.logger = logging.getLogger("packager")
+ self.cli = OptionParser(usage="%prog [options]")
+ self.cli.add_option("--epocroot", metavar="DIR", help="Epocroot location (must be an absolute path)")
+ self.cli.add_option("--config", metavar="DIR")
+ self.cli.add_option("--outputdir", metavar="DIR")
+ self.cli.add_option("--datasource", metavar="NAME")
+ self.cli.add_option("--metadatadir", metavar="DIR")
+ self.cli.add_option("--updateData", action="store_true", dest="action_update", default=False)
+ self.cli.add_option("--createBundles", action="store_true", dest="action_bundle", default=False)
+ self.cli.add_option("--help-datasource", action="store_true", dest="action_help_datasource", default=False)
+ self.cli.add_option("--workers", type="int", dest="workers", default=4)
+ self.cli.add_option("--writer", dest="writer", default='deb')
+ self.cli.add_option("--sourceRules", dest="sourceRules")
+ self.cli.add_option("--targetRules", dest="targetRules")
+ self.cli.add_option("--pkgDirectives", dest="pkgDirectives")
+ self.cli.add_option("--debug", action="store_true", default=False)
+ self.cli.add_option("--interdeps", choices=['true', 'false'], dest="interdeps", default='false')
+ self.__workers = 4
+ self.__writer = "deb"
+ self.__config = None
+ self.__epocroot = None
+ self.__datasource = None
+ self.__update = False
+ self.__bundle = False
+ self.__help_datasource = False
+ self.__outputdir = None
+ self.__source_rules = None
+ self.__target_rules = None
+ self.__directives = None
+ self.__metadatadir = None
+ self.__interdeps = None
+ self.__writerOptions = None
+ self.__data = {}
+
+ def __readoptions(self, argv=sys.argv):
+ # removing -Dxxx=xxx
+ args = []
+ for arg in argv:
+ res = re.match("-D(.+)=(.*)", arg)
+ if res is not None:
+ self.logger.debug("property: %s=%s" % (res.group(1), res.group(2)))
+ self.__data[res.group(1)] = res.group(2)
+ else:
+ args.append(arg)
+
+ opts, dummy_args = self.cli.parse_args(args)
+ self.__config = opts.config
+ self.__epocroot = opts.epocroot
+ self.__outputdir = opts.outputdir
+ self.__update = opts.action_update
+ self.__bundle = opts.action_bundle
+ self.__help_datasource = opts.action_help_datasource
+ self.__datasource = opts.datasource
+ self.__workers = opts.workers
+ self.__writer = opts.writer
+ self.__source_rules = opts.sourceRules
+ self.__target_rules = opts.targetRules
+ self.__directives = opts.pkgDirectives
+ self.__metadatadir = opts.metadatadir
+ self.__interdeps = opts.interdeps
+ if opts.debug:
+ logging.getLogger().setLevel(logging.DEBUG)
+
+ def __update_data(self):
+ if self.__config is None:
+ raise Exception("--config argument is missing.")
+ if self.__epocroot is None:
+ raise Exception("--epocroot argument is missing.")
+ if not os.path.exists(self.__config) or not os.path.isdir(self.__config):
+ raise Exception("Could not find directory: %s." % self.__config)
+ if not os.path.exists(self.__epocroot) or not os.path.isdir(self.__epocroot) or not os.path.isabs(self.__epocroot):
+ raise Exception("Could not find directory: %s." % self.__epocroot)
+ if self.__datasource is None:
+ raise Exception("--datasource argument is missing.")
+
+ self.logger.info("Retrieving components information...")
+ datasource = packager.datasources.getDataSource(self.__datasource, self.__epocroot, self.__data)
+ for component in datasource.getComponents():
+ outfilename = os.path.join(self.__config, component.getComponentName() + ".blocks_component.xml")
+ if os.path.exists(outfilename):
+ bd = BuildDataSerializer().fromXml(open(outfilename).read())
+ self.logger.info("Merging with previous data...")
+ component = BuildDataMerger(bd).merge(component)
+ serializer = BuildDataSerializer(component)
+ self.logger.info("Writing %s" % outfilename)
+ output = open(outfilename , 'wb')
+ output.write(xml.dom.minidom.parseString(serializer.toXml()).toprettyxml())
+ output.close()
+
+ def __create_bundles(self):
+ if self.__config is None:
+ raise Exception("--config argument is missing.")
+ if self.__epocroot is None:
+ raise Exception("--epocroot argument is missing.")
+ if self.__metadatadir is None:
+ raise Exception("--metadatadir argument is missing.")
+ if not os.path.exists(self.__config) or not os.path.isdir(self.__config):
+ raise Exception("Could not find directory: %s." % self.__config)
+ if not os.path.exists(self.__epocroot) or not os.path.isdir(self.__epocroot) or not os.path.isabs(self.__epocroot):
+ raise Exception("Could not find directory: %s." % self.__epocroot)
+ if self.__outputdir is None:
+ raise Exception("--outputdir argument is missing.")
+ if not os.path.exists(self.__outputdir) or not os.path.isdir(self.__outputdir):
+ raise Exception("Could not find directory: %s." % self.__epocroot)
+ if not os.path.exists(self.__metadatadir) or not os.path.isdir(self.__metadatadir):
+ raise Exception("Could not find directory: %s." % self.__metadatadir)
+
+ if self.__interdeps == 'false':
+ self.__writerOptions = {'STRONG_DEP_MAPPING': None}
+
+ # Creating the packager.
+ storage = Blocks.Packaging.OneoffStorage(self.__metadatadir)
+ packager_obj = Blocks.Packaging.Packager(storage,
+ self.__outputdir,
+ maxWorkers = self.__workers,
+ writer = self.__writer,
+ targetRules = self.__target_rules,
+ sourceRules = self.__source_rules,
+ directives = self.__directives,
+ writerOptions = self.__writerOptions
+ #startNow=False
+ )
+ # Adding processors
+ packager_obj.addProcessor(BuildDataDependencyProcessor)
+ try:
+ from Blocks.Packaging.DependencyProcessors.RaptorDependencyProcessor import DotDeeDependencyProcessor
+ packager_obj.addProcessor(DotDeeDependencyProcessor)
+ except ImportError:
+ logging.warning("Could not load DotDeeDependencyProcessor.")
+
+ for filename in os.listdir(self.__config):
+ filename = os.path.normpath(os.path.join(self.__config, filename))
+ if not filename.endswith('.blocks_component.xml'):
+ continue
+ self.logger.info("Loading %s" % filename)
+ packager_obj.addComponent(BuildDataSerializer().fromXml(open(filename, 'r').read()))
+
+ packager_obj.wait()
+
+ def execute(self, argv=sys.argv):
+ """ Run the CLI. """
+ try:
+ self.__readoptions(argv)
+ if self.__help_datasource:
+ print packager.datasources.getDataSourceHelp()
+ elif self.__update:
+ self.__update_data()
+ elif self.__bundle:
+ self.__create_bundles()
+ else:
+ self.cli.print_help()
+ except IOError, exc:
+ if self.logger.getEffectiveLevel() == logging.DEBUG:
+ self.logger.exception(exc)
+ self.logger.error(str(exc))
+ return -1
+ return 0
+
+
+if __name__ == "__main__":
+ app = PackagerApp()
+ sys.exit(app.execute())
+
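+# A minimal usage sketch (hypothetical paths, for illustration only). The same
+# options accepted on the command line can be passed to PackagerApp.execute(),
+# e.g. to create the bundles from previously collected component metadata:
+#
+#   app = PackagerApp()
+#   app.execute(['--createBundles',
+#                '--epocroot', '/path/to/epocroot',
+#                '--config', '/path/to/config',
+#                '--metadatadir', '/path/to/metadata',
+#                '--outputdir', '/path/to/output'])
+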
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/packager/datasources/__init__.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/packager/datasources/__init__.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,29 @@
+#============================================================================
+#Name : __init__.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+"""
+Definitions of the datasource.
+"""
+from packager.datasources.api import *
+from packager.datasources.cmaker import CMakerDataSource
+from packager.datasources.imaker import IMakerDataSource
+from packager.datasources.sbs import SBSDataSource
+from packager.datasources.cone import ConEDataSource
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/packager/datasources/api.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/packager/datasources/api.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,73 @@
+#============================================================================
+#Name : api.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+import logging
+logger = logging.getLogger("datasources.api")
+
+class MissingProperty(Exception):
+ """ An exception to indicate about a missing property """
+ pass
+
+class DataSource(object):
+ """ This abstract class defines a DataSource for the packager application. """
+ def __init__(self, epocroot, data=None):
+ self.epocroot = epocroot
+ self._data = data
+ if data is None:
+ self._data = {}
+
+ def getComponents(self):
+ """ The getComponents method must return a list of BuildData object (one per component).
+ In case of error (e.g incomplete configuration) the method will raise an Exception.
+ """
+ raise NotImplementedError
+
+ def getHelp(self):
+ return None
+
+ help = property(lambda self: self.getHelp())
+
+
+DATASOURCES = {}
+
+def getDataSource(name, epocroot, data):
+ if name in DATASOURCES:
+ logger.debug("Creating datasource for %s." % name)
+ return DATASOURCES[name](epocroot, data)
+ else:
+ logger.info("Loading %s." % name)
+ def class_import(name):
+ try:
+ components = name.split('.')
+ klassname = components.pop()
+ mod = __import__('.'.join(components), globals(), locals(), [klassname])
+ return getattr(mod, klassname)
+ except:
+ raise Exception("Could not load %s" % name)
+ return class_import(name)(epocroot, data)
+
+
+def getDataSourceHelp():
+ doc = ""
+ for name in DATASOURCES:
+ dshelp = DATASOURCES[name](None, None).help
+ if dshelp is not None:
+ doc = doc + "--- %s -----------------------------------\n" % name + dshelp +\
+ "\n------------------------------------------\n"
+ return doc
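+# A minimal sketch of a custom datasource (hypothetical names, for illustration
+# only): a subclass implements getComponents() and registers itself in
+# DATASOURCES so that getDataSource() can resolve it by name.
+#
+#   from Blocks.Packaging.BuildData import PlainBuildData
+#
+#   class MyDataSource(DataSource):
+#       def getComponents(self):
+#           if 'name' not in self._data:
+#               raise MissingProperty("The name property has not been defined.")
+#           build_data = PlainBuildData()
+#           build_data.setComponentName(self._data['name'])
+#           build_data.setComponentVersion(self._data.get('version', '1'))
+#           build_data.setSourceRoot(self.epocroot)
+#           build_data.setTargetRoot(self.epocroot)
+#           return [build_data]
+#
+#   DATASOURCES['mydatasource'] = MyDataSource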
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/packager/datasources/cmaker.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/packager/datasources/cmaker.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,106 @@
+#============================================================================
+#Name : cmaker.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+import os
+import re
+from packager.datasources.api import DataSource, MissingProperty, DATASOURCES
+from Blocks.Packaging.BuildData import PlainBuildData
+import logging
+
+logger = logging.getLogger('packager.datasources.cmaker')
+
+class CMakerDataSource(DataSource):
+ """ Extract information from cMaker logs """
+ def __init__(self, epocroot, data):
+ DataSource.__init__(self, epocroot, data)
+
+ def getComponents(self):
+ if 'whatlog' not in self._data:
+ raise MissingProperty("The whatlog property has not be defined.")
+ if 'configdir' not in self._data:
+ raise MissingProperty("The configdir property has not be defined.")
+ component_name = "cmaker"
+ if 'name' in self._data:
+ component_name = self._data['name']
+ version = "1"
+ if 'version' in self._data:
+ version = self._data['version']
+
+ # validating the inputs
+ if not os.path.exists(self._data['whatlog']) or not os.path.isfile(self._data['whatlog']):
+ raise Exception("Could not find %s file." % self._data['whatlog'])
+ cdir = os.path.abspath(self._data['configdir'])
+ if not os.path.exists(cdir) or not os.path.isdir(cdir):
+ raise Exception("Could not find %s directory." % cdir)
+
+
+ build_data = PlainBuildData()
+ build_data.setComponentName(component_name)
+ build_data.setComponentVersion(version) # need to get it from the sysdef file
+ build_data.setSourceRoot(self.epocroot)
+ build_data.setTargetRoot(self.epocroot)
+
+ targets = [path[len(self.epocroot):].lstrip(os.sep) for path in self.getExportedFiles()]
+ build_data.addTargetFiles(targets)
+ sources = [path[len(self.epocroot):].lstrip(os.sep) for path in self.getSourceFiles()]
+ build_data.addSourceFiles(sources)
+ return [build_data]
+
+
+ def getExportedFiles(self):
+ """ Get the list of exported file from the log. The parser will recognize cMaker what output and
+ cMaker install log. The usage of xcopy will get warn to the user as its output will not be consider
+ and the target file will get dismissed. """
+ log = open(self._data['whatlog'], 'r')
+ for line in log:
+ line = line.rstrip()
+ rcopy = re.match(r'^.*\s+copy\(q\((.+)\),q\((.*)\)\)\'', line)
+ rxcopy = re.match(r'^(.*)\s+\-\>\s+(.+)$', line)
+ if ':' not in line and line.startswith(os.sep):
+ yield os.path.normpath(os.path.join(self.epocroot, line))
+ elif rcopy is not None:
+ yield os.path.normpath(os.path.join(self.epocroot, rcopy.group(2)))
+ elif rxcopy is not None:
+ logger.warning('This looks like an xcopy output! Make sure you use cmaker correctly: %s' % line)
+
+
+ def getSourceFiles(self):
+ """ Get the list of source file using the call dir and the whatdeps log if available. """
+ cdir = os.path.abspath(self._data['configdir'])
+ for (path, dirpath, namelist) in os.walk(cdir):
+ for name in namelist:
+ yield os.path.join(path, name)
+ if 'whatdepslog' in self._data:
+ log = open(self._data['whatdepslog'], 'r')
+ for line in log:
+ line = line.rstrip()
+ if ':' not in line and line.startswith(os.sep):
+ yield os.path.normpath(os.path.join(self.epocroot, line))
+
+ def getHelp(self):
+ help_ = """This datasource will gather information from the cMaker output logs.
+Plugin property configuration:
+whatlog Defines the location of the whatlog.
+configdir Defines cMaker calling location.
+whatdepslog Defines the location of the cMaker whatdeps log (optional).
+ """
+ return help_
+
+
+DATASOURCES['cmaker'] = CMakerDataSource
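+# Hypothetical usage sketch (illustrative paths only), matching the plugin
+# properties described in getHelp() above:
+#
+#   data = {'whatlog': '/path/to/cmaker_what.log',
+#           'configdir': '/path/to/cmaker/config',
+#           'name': 'mycomponent', 'version': '1'}
+#   for build_data in CMakerDataSource('/path/to/epocroot', data).getComponents():
+#       print build_data.getComponentName(), len(build_data.getTargetFiles())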
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/packager/datasources/cone.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/packager/datasources/cone.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,74 @@
+#============================================================================
+#Name : cone.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+from packager.datasources.api import DataSource, DATASOURCES
+from Blocks.Packaging.BuildData import PlainBuildData
+import logging
+import re
+import os
+
+logger = logging.getLogger('packager.datasources.conftool')
+
+class ConEDataSource(DataSource):
+ """ Extract information from ConE logs """
+ def __init__(self, epocroot, data=None):
+ DataSource.__init__(self, epocroot, data)
+
+ def getTargetFiles(self):
+ """ Get the generated files from the log output. """
+ result = []
+ txtFile = open(self._data['filename'], 'r')
+ matcher = re.compile(r"^\s*Generating file '(.+)'\.\.\.\s*$")
+ for line in txtFile:
+ res = matcher.match(line)
+ if res:
+ result.append(os.path.normpath(os.path.join(self.epocroot,
+ res.group(1))))
+ txtFile.close()
+ return result
+
+ def getComponents(self):
+ """ Get the components list from the cli input. """
+ if 'name' not in self._data:
+ raise Exception("The name property has not be defined.")
+ if 'version' not in self._data:
+ raise Exception("The version property has not be defined.")
+
+ if 'filename' not in self._data:
+ raise Exception("The input conftool log file is not defined")
+
+ #todo: add the source iby / path for conftool input
+ build_data = PlainBuildData()
+ build_data.setComponentName(self._data['name'])
+ build_data.setComponentVersion(self._data['version'])
+ build_data.setSourceRoot(self.epocroot)
+ build_data.setTargetRoot(self.epocroot)
+ build_data.addTargetFiles([path[len(self.epocroot):].lstrip(os.sep) for path in self.getTargetFiles()])
+ return [build_data]
+
+ def getHelp(self):
+ """ Returns the help. """
+ return """
+name Defines the name of the component
+version Defines the version of the component
+filename Defines the log file name of the ConE tool
+"""
+
+
+DATASOURCES['cone'] = ConEDataSource
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/packager/datasources/imaker.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/packager/datasources/imaker.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,117 @@
+#============================================================================
+#Name : imaker.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+from packager.datasources.api import DataSource, MissingProperty, DATASOURCES
+from Blocks.Packaging.BuildData import PlainBuildData, BdFile
+import logging
+import re
+import os
+
+logger = logging.getLogger('packager.datasources.imaker')
+
+class ObyParser:
+ """ Simplistic Oby file parser. """
+ def __init__(self, epocroot, filename):
+ self.epocroot = epocroot
+ self.filename = filename
+
+ def getSourceFiles(self):
+ """ Return the list of source file needed to create the image. """
+ result = []
+ logger.debug("Analyzing %s" % self.filename)
+ oby = open(self.filename, 'r')
+ for line in oby:
+ res = re.match(r'\s*(file|data|variant.+|bootbinary|primary.+|extension.+)\s*=\s*\"?(.+?)\"?\s+\".+\"', line)
+ if res is not None:
+ result.append(os.path.normpath(os.path.join(self.epocroot, res.group(2).strip().replace('\\', os.sep).replace('/', os.sep))))
+ oby.close()
+ return result
+
+
+class IMakerDataSource(DataSource):
+ """ Extract information from iMaker logs - iMaker integrated version """
+ def __init__(self, epocroot, data=None):
+ DataSource.__init__(self, epocroot, data)
+
+ def getComponents(self):
+ if 'name' not in self._data:
+ raise Exception("The name property has not be defined.")
+ if 'version' not in self._data:
+ raise Exception("The version property has not be defined.")
+ obys = [self._data[key] for key in self._data.keys() if key.startswith('oby')]
+ targets = [os.path.normpath(self._data[key]) for key in self._data.keys() if key.startswith('target')]
+ build_data = PlainBuildData()
+ build_data.setComponentName(self._data['name'])
+ build_data.setComponentVersion(self._data['version'])
+ build_data.setSourceRoot(self.epocroot)
+ build_data.setTargetRoot(self.epocroot)
+
+ build_data.addTargetFiles([path[len(self.epocroot):].lstrip(os.sep) for path in targets])
+
+ deps = []
+ for oby in obys:
+ deps.extend(ObyParser(self.epocroot, oby).getSourceFiles())
+ for target in targets:
+ print target
+ if target.endswith(".fpsx"):
+ target = os.path.normpath(target)
+ bdfile = BdFile(target[len(self.epocroot):].lstrip(os.sep))
+ bdfile.setOwnerDependencies([path[len(self.epocroot):].lstrip(os.sep) for path in deps])
+ build_data.addDeliverable(bdfile)
+ return [build_data]
+
+class IMakerRomDirDataSource(DataSource):
+ """ Extract information from iMaker logs - guess content of the package from the rom output dir. """
+ def __init__(self, epocroot, data=None):
+ DataSource.__init__(self, epocroot, data)
+
+ def getComponents(self):
+ if 'name' not in self._data:
+ raise MissingProperty("The name property has not be defined.")
+ if 'version' not in self._data:
+ raise MissingProperty("The version property has not be defined.")
+ if 'dir' not in self._data:
+ raise MissingProperty("The dir property has not be defined.")
+ cdir = os.path.normpath(self._data['dir'])
+ obys = []
+ targets = []
+ for (path, dirpath, namelist) in os.walk(cdir):
+ for name in namelist:
+ if name.endswith(".oby"):
+ obys.append(os.path.join(path, name)[len(self.epocroot):].lstrip(os.sep))
+ targets.append(os.path.join(path, name)[len(self.epocroot):].lstrip(os.sep))
+ build_data = PlainBuildData()
+ build_data.setComponentName(self._data['name'])
+ build_data.setComponentVersion(self._data['version'])
+ build_data.setSourceRoot(self.epocroot)
+ build_data.setTargetRoot(self.epocroot)
+ build_data.addTargetFiles(targets)
+ return [build_data]
+
+ def getHelp(self):
+ return """
+name Defines the name of the component
+version Defines the version of the component
+dir Defines the root location of ROM images.
+"""
+
+
+DATASOURCES['imaker'] = IMakerDataSource
+DATASOURCES['imaker-romdir'] = IMakerRomDirDataSource
+
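+# Hypothetical usage sketch (illustrative values only): through the packager CLI
+# the plugin properties are passed as -D definitions, for example
+#
+#   --updateData --datasource imaker-romdir -Dname=roms -Dversion=1
+#   -Ddir=/path/to/output/release_flash_images
+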
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/packager/datasources/sbs.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/packager/datasources/sbs.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,299 @@
+#============================================================================
+#Name : sbs.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+import xml.sax
+import time
+import os
+import re
+import fileutils
+import sys
+from packager.datasources.api import DataSource, MissingProperty, DATASOURCES
+from Blocks.Packaging.DataSources.LinkInfoToBuildData import LinkInfoXmlReader # pylint: disable=F0401
+try:
+ from Blocks.Packaging.DataSources.SbsLinkInfoReader import LinkInfoReader # pylint: disable=E0611
+except ImportError:
+ if os.path.sep == '\\':
+ raptor_cmd = fileutils.which("sbs.bat")
+ else:
+ raptor_cmd = fileutils.which("sbs")
+ sbs_home = os.path.dirname(os.path.dirname(raptor_cmd))
+ os.environ['SBS_HOME'] = sbs_home
+ sys.path.append(os.path.join(sbs_home, 'python'))
+ sys.path.append(os.path.join(sbs_home, 'python', 'plugins'))
+ # loading as raptor plugin - loading also raptor.
+ if os.path.sep == '\\':
+ os.environ['HOSTPLATFORM'] = 'win 32'
+ os.environ['HOSTPLATFORM_DIR'] = 'win32'
+ else:
+ os.environ['HOSTPLATFORM'] = 'linux'
+ os.environ['HOSTPLATFORM_DIR'] = 'linux'
+ from filter_blocks import LinkInfoReader # pylint: disable=F0401
+
+from Blocks.Packaging.BuildData import PlainBuildData
+from Blocks.Packaging.DataSources.WhatLog import WhatLogReader as LogReader
+import logging
+from Queue import Queue
+from threading import Thread
+
+class ComponentNotFound(Exception):
+ """ Error raised in case of not found component. """
+
+ def __init__(self, message):
+ Exception.__init__(self, message)
+
+class SysdefComponentList(xml.sax.ContentHandler):
+ """ Simplistic sysdef data extractor, it will only get data from unit elements."""
+
+ def __init__(self, epocroot, version="1"):
+ xml.sax.ContentHandler.__init__(self)
+ self.__data = {}
+ self.__epocroot = epocroot
+ self.__version = version
+ self.__component = None
+
+ def startElement(self, tag, attributes):
+ if tag == "component" and attributes.get("id"):
+ self.__component = attributes.get("id")
+ elif tag == "unit" and self.__component and not attributes.get("unitID") and not attributes.get("name") and attributes.get("bldFile"):
+ data = {}
+ data['path'] = os.path.normpath(os.path.join(self.__epocroot, attributes.get("bldFile")).replace('\\', os.sep).replace('/', os.sep))
+ if attributes.get("version") is None:
+ data['version'] = self.__version
+ else:
+ data['version'] = attributes.get("version")
+ data['name'] = self.__cleanup_name(self.__component)
+ self.__data[self.__component + attributes.get("bldFile").replace('\\', '_').replace('/', '_')] = data
+ elif tag == "unit" and attributes.get("name") is not None and attributes.get("bldFile") is not None:
+ data = {}
+ data['path'] = os.path.normpath(os.path.join(self.__epocroot, attributes.get("bldFile")).replace('\\', os.sep).replace('/', os.sep))
+ if attributes.get("version") is None:
+ data['version'] = self.__version
+ else:
+ data['version'] = attributes.get("version")
+ data['name'] = self.__cleanup_name(attributes.get("name"))
+ self.__data[self.__cleanup_name(attributes.get("name"))] = data
+
+ def endElement(self, tag):
+ if tag == "component":
+ self.__component = None
+
+ def __cleanup_name(self, name):
+ return re.sub(r'[^a-zA-Z0-9_-]', '', re.sub(r'\.', '_', name))
+
+ def keys(self):
+ return self.__data.keys()
+
+ def __getitem__(self, key):
+ return self.__data[key]
+
+ def __contains__(self, key):
+ for data in self.__data.values():
+ if key in data['name']:
+ return True
+ return False
+
+ def __len__(self):
+ return self.__data.__len__()
+
+ def get_component_name_by_path(self, dir_):
+ dir_ = os.path.normpath(dir_)
+ for key in self.__data.keys():
+ if dir_.lower() == self.__data[key]['path'].lower():
+ return key
+ raise ComponentNotFound("Could not find component name for dir %s" % dir_)
+
+ def __str__(self):
+ return "<%s: %s>" % (type(self), self.__data)
+
+
+class BldInfWorker(Thread):
+ """ SBS component worker. """
+ def __init__(self, inqueue, outqueue, datasource, whatlog, cl, link_info):
+ Thread.__init__(self)
+ self.logger = logging.getLogger(self.__class__.__name__)
+ self.inqueue = inqueue
+ self.outqueue = outqueue
+ self.whatlog = whatlog
+ self.cl = cl
+ self.link_info = link_info
+ self.datasource = datasource
+
+ def run(self):
+ """ Thread implementation. """
+ while True:
+ tag, bldinf = self.inqueue.get()
+ if tag == 'STOP':
+ self.logger.debug("Builder thread exiting..." )
+ return
+ else:
+ try:
+ tick = time.time()
+ self.outqueue.put(self.datasource.getBuildData(bldinf, self.whatlog, self.cl, self.link_info))
+ tock = time.time()
+ self.logger.info("Analyzed component %s in %s seconds" % (bldinf, tock - tick))
+ except IOError, exc:
+ self.logger.error('Error happened in thread execution %s' % exc)
+ import traceback
+ self.logger.debug(traceback.format_exc())
+
+
+class SBSDataSource(DataSource):
+ """ That class implements the DataSource API"""
+
+ def __init__(self, epocroot, data=None):
+ DataSource.__init__(self, epocroot, data)
+ self.logger = logging.getLogger(self.__class__.__name__)
+
+ def _get_sysdef_info(self):
+ """ Returns a SysdefComponentList containing the result of sysdef parsing. """
+ self.logger.debug("Reading the component information from the sysdef (%s)." % self._data['sysdef'])
+ p = xml.sax.make_parser()
+ cl = SysdefComponentList(self.epocroot)
+ p.setContentHandler(cl)
+ p.parse(open(self._data['sysdef']))
+ return cl
+
+ def _get_whatlog(self):
+ self.logger.debug("Extracting whatlog data (%s)..." % self._data['sbslog'])
+ parser = xml.sax.make_parser()
+ lr = LogReader()
+ parser.setContentHandler(lr)
+ file_ = open(self._data['sbslog'])
+ while True:
+ data = file_.read()
+ if not data:
+ break
+ parser.feed(data)
+ file_.close()
+ parser.close()
+ return lr
+
+ def _generate_link_info(self, output=None):
+ """ Generate the link.info file from the build log. It returns the generated xml filename. """
+ self.logger.info("Generating the link information from the %s log." % self._data['sbslog'])
+ parser = xml.sax.make_parser()
+ reader = LinkInfoReader(self.epocroot)
+ parser.setContentHandler(reader)
+ parser.parse(open(self._data['sbslog'], 'r'))
+ if output is None:
+ output = self._data['sbslog'] + ".link.xml"
+ self.logger.info("Writing %s." % output)
+ out = open(output, 'wb')
+ reader.writeXml(out=out)
+ out.close()
+ return output
+
+ def getComponents(self):
+ if 'sbslog' not in self._data:
+ raise MissingProperty("The sbslog property has not be defined.")
+ if 'sysdef' not in self._data:
+ raise MissingProperty("The sysdef property has not be defined.")
+
+ # generating link info
+ link_info = LinkInfoXmlReader.getBuildData(self._generate_link_info())
+
+ # Read the component list
+ cl = self._get_sysdef_info()
+
+ # Get the whatlog
+ whatlog = self._get_whatlog()
+
+ result = []
+ if 'threads' in self._data and self._data['threads'].isdigit():
+ inqueue = Queue()
+ outqueue = Queue()
+ workers = []
+ # Work to be done
+
+ for bldinf in whatlog.getInfs():
+ inqueue.put(('', bldinf))
+ # Creating the builders
+ for i in range(int(self._data['threads'])):
+ b = BldInfWorker(inqueue, outqueue, self, \
+ whatlog, cl, link_info)
+ workers.append(b)
+ b.start()
+ # Wait for the work to finish.
+ for w in workers:
+ inqueue.put(('STOP', None))
+ for w in workers:
+ w.join()
+ self.logger.info("All done.")
+ while not outqueue.empty():
+ result.append(outqueue.get())
+ else:
+ for bldinf in whatlog.getInfs():
+ result.append(self.getBuildData(bldinf, whatlog, cl, link_info))
+ return result
+
+ def getBuildData(self, bldinf, whatlog, cl, link_info):
+ """ Get the build data from a bldinf name. """
+ tick = time.time()
+ src_walk_path = ""
+ abs_bldinf = os.path.abspath(bldinf)
+ self.logger.debug("component location: %s" % abs_bldinf)
+ component_name = cl.get_component_name_by_path(os.path.normpath(os.path.dirname(abs_bldinf)))
+ build_data = PlainBuildData()
+ self.logger.debug("component name: %s" % cl[component_name]['name'])
+ build_data.setComponentName(cl[component_name]['name'])
+ self.logger.debug("component version: %s" % cl[component_name]['version'])
+ build_data.setComponentVersion(cl[component_name]['version']) # need to get it from the sysdef file
+ build_data.setSourceRoot(self.epocroot)
+ build_data.setTargetRoot(self.epocroot)
+
+ targets = [path[len(self.epocroot):].lstrip(os.sep) for path in whatlog.getFilePaths(abs_bldinf)]
+ build_data.addTargetFiles(targets)
+
+ # If the path ends with a group folder then the grandparent directory is required, else the parent folder is enough
+ if os.path.dirname(abs_bldinf).endswith("group"):
+ src_walk_path = os.path.dirname(os.path.dirname(abs_bldinf))
+ else:
+ src_walk_path = os.path.dirname(abs_bldinf)
+
+ sources = []
+ for (path, dirpath, namelist) in os.walk(src_walk_path):
+ for name in namelist:
+ sources.append(os.path.join(path, name)[len(self.epocroot):].lstrip(os.sep))
+ build_data.addSourceFiles(sources)
+ tock = time.time()
+ self.logger.info(" + Content analysis %s in %s seconds" % (bldinf, tock - tick))
+
+ tick = time.time()
+ key_bldinf = abs_bldinf.replace(os.sep, '/')
+ if link_info.has_key(key_bldinf):
+ self.logger.debug("Found deps for %s" % key_bldinf)
+ for bdfile in link_info[key_bldinf].getDependencies():
+ if bdfile.getPath() in build_data.getTargetFiles():
+ # no dependency data above, only paths - OK to overwrite
+ build_data.addDeliverable(bdfile)
+ else:
+ self.logger.warning("Link data from %s has unlisted target %s" % (abs_bldinf, bdfile.getPath()))
+ tock = time.time()
+ self.logger.info(" + Dependency analysis for %s in %s seconds" % (bldinf, tock - tick))
+ return build_data
+
+ def getHelp(self):
+ help_ = """The sbs datasource will extract component information from the sbs logs. You need a recent version of raptor: e.g 2.8.4.
+Plugin property configuration:
+sbslog Location of the sbs log.
+sysdef Location of the canonical system definition file.
+ """
+ return help_
+
+DATASOURCES['sbs'] = SBSDataSource
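+# Hypothetical usage sketch (illustrative paths only): the 'threads' property is
+# optional and enables the multithreaded analysis of the bld.inf files.
+#
+#   import packager.datasources
+#   data = {'sbslog': '/path/to/compile.log', 'sysdef': '/path/to/sysdef.xml',
+#           'threads': '4'}
+#   ds = packager.datasources.getDataSource('sbs', '/path/to/epocroot', data)
+#   components = ds.getComponents()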
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/lib/packager/io.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/lib/packager/io.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,143 @@
+#============================================================================
+#Name : io.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+import xml.dom.minidom
+import logging
+from Blocks.Packaging.BuildData import BdFile, PlainBuildData
+logger = logging.getLogger('io')
+
+class BdFileSerializer:
+ """ Class used to serialize or deserialize the DBFile """
+ def __init__(self, bdfile=None):
+ self.bdfile = bdfile
+
+ def toXml(self):
+ logger.debug("Serializing DBFile.")
+ document = xml.dom.minidom.Document()
+ component = document.createElement('bdfile')
+ component.setAttribute('path', self.bdfile.getPath())
+ if self.bdfile.variantType is not None:
+ component.setAttribute('variantType', self.bdfile.variantType)
+ if self.bdfile.variantPlatform is not None:
+ component.setAttribute('variantPlatform', self.bdfile.variantPlatform)
+ # Owner reqs
+ ownerReqs = document.createElement('ownerRequirements')
+ for path in self.bdfile.ownerRequirements:
+ req = document.createElement("ownerRequirement")
+ req.setAttribute('path', path)
+ ownerReqs.appendChild(req)
+ component.appendChild(ownerReqs)
+ # source Requirements
+ srcReqs = document.createElement('sourceRequirements')
+ for path in self.bdfile.sourceRequirements:
+ req = document.createElement("sourceRequirement")
+ req.setAttribute('path', path)
+ srcReqs.appendChild(req)
+ component.appendChild(srcReqs)
+ return component.toxml()
+
+ def fromXml(self, data):
+ logger.debug("Deserializing DBFile.")
+ node = xml.dom.minidom.parseString(data).childNodes[0]
+ if self.bdfile == None:
+ self.bdfile = BdFile(node.getAttribute('path'))
+
+ self.bdfile.path = node.getAttribute('path')
+ self.bdfile.variantPlatform = node.getAttribute('variantPlatform')
+ self.bdfile.variantType = node.getAttribute('variantType')
+ for src in node.getElementsByTagName('ownerRequirements')[0].getElementsByTagName('ownerRequirement'):
+ self.bdfile.ownerRequirements.append(src.getAttribute('path'))
+ for src in node.getElementsByTagName('sourceRequirements')[0].getElementsByTagName('sourceRequirement'):
+ self.bdfile.sourceRequirements.append(src.getAttribute('path'))
+ return self.bdfile
+
+class BuildDataSerializer:
+ """ Class used to serialize or deserialize the plain build data """
+ def __init__(self, builddata=None):
+ self.builddata = builddata
+ if self.builddata is None:
+ self.builddata = PlainBuildData()
+
+ def toXml(self):
+ logger.debug("Serializing PlainBuildData.")
+ document = xml.dom.minidom.Document()
+ component = document.createElement('component')
+ component.setAttribute('name', self.builddata.getComponentName())
+ component.setAttribute('version', self.builddata.getComponentVersion())
+ # sources
+ sources = document.createElement('sources')
+ sources.setAttribute('root', self.builddata.getSourceRoot())
+ for path in self.builddata.getSourceFiles():
+ source = document.createElement("source")
+ source.setAttribute('path', path)
+ sources.appendChild(source)
+ component.appendChild(sources)
+ # targets
+ targets = document.createElement('targets')
+ targets.setAttribute('root', self.builddata.getTargetRoot())
+ for path in self.builddata.targetFiles.keys():
+ target = document.createElement("target")
+ target.setAttribute('path', path)
+ if self.builddata.targetFiles[path] is not None:
+ target.appendChild(document.importNode(xml.dom.minidom.parseString(BdFileSerializer(self.builddata.targetFiles[path]).toXml()).childNodes[0], deep=1))
+ targets.appendChild(target)
+ component.appendChild(targets)
+ return component.toxml()
+
+ def fromXml(self, data):
+ logger.debug("Deserializing PlainBuildData.")
+ node = xml.dom.minidom.parseString(data).childNodes[0]
+ self.builddata.setComponentName(node.getAttribute('name'))
+ self.builddata.setComponentVersion(node.getAttribute('version'))
+ self.builddata.setSourceRoot(node.getElementsByTagName('sources')[0].getAttribute('root'))
+ self.builddata.setTargetRoot(node.getElementsByTagName('targets')[0].getAttribute('root'))
+ files = []
+ for src in node.getElementsByTagName('sources')[0].getElementsByTagName('source'):
+ files.append(src.getAttribute('path'))
+ self.builddata.addSourceFiles(files)
+
+ files = []
+ for target in node.getElementsByTagName('targets')[0].getElementsByTagName('target'):
+ files.append(target.getAttribute('path'))
+ self.builddata.addTargetFiles(files)
+ for target in node.getElementsByTagName('targets')[0].getElementsByTagName('target'):
+ for bdfile in target.getElementsByTagName('bdfile'):
+ self.builddata.addDeliverable(BdFileSerializer().fromXml(bdfile.toxml()))
+ return self.builddata
+
+
+class BuildDataMerger:
+ """ Class used to merge contents of build data """
+ def __init__(self, output):
+ self.output = output
+
+ def merge(self, bd):
+ """ Merge the content of bd into output. """
+ if bd.getComponentName() != self.output.getComponentName():
+ raise Exception("Trying to merger two different components (different name)")
+ if bd.getComponentVersion() != self.output.getComponentVersion():
+ raise Exception("Trying to merger two different components (different version)")
+ if bd.getSourceRoot() != self.output.getSourceRoot():
+ raise Exception("Trying to merger two different components (different source root)")
+ if bd.getTargetRoot() != self.output.getTargetRoot():
+ raise Exception("Trying to merger two different components (different target root)")
+ self.output.addSourceFiles(bd.getSourceFiles())
+ self.output.addTargetFiles(bd.getTargetFiles())
+ for dep in bd.getDependencies():
+ self.output.addDeliverable(dep)
+ return self.output
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/setup.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/setup.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,37 @@
+#============================================================================
+#Name : setup.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+import os
+from setuptools import setup, find_packages
+pyfiles = []
+for x in os.listdir('lib'):
+ if x.endswith('.py'):
+ pyfiles.append(x.replace('.py', ''))
+
+setup(
+ name = 'blockspackager',
+ version = '0.1',
+ description = "blockspackager",
+ license = 'EPL',
+ package_dir = {'': 'lib'},
+ py_modules = pyfiles,
+ packages = find_packages('lib', exclude=["*tests"]),
+ test_suite = 'nose.collector',
+ package_data = {'': ['*.xml', '*.conf', '*.xsd', '*.nsi']},
+ zip_safe = False,
+ )
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/blockspackager/tests.ant.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/blockspackager/tests.ant.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,28 @@
+
+
+
+ Helium unittests.
+
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/ivy.xml
--- a/buildframework/helium/sf/python/pythoncore/ivy.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/ivy.xml Mon Oct 11 11:16:47 2010 +0100
@@ -39,5 +39,6 @@
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/amara.py
--- a/buildframework/helium/sf/python/pythoncore/lib/amara.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/amara.py Mon Oct 11 11:16:47 2010 +0100
@@ -105,24 +105,26 @@
def __setitem__(self, key, value):
self.xml_set_attribute(key, value)
- def __getattr__(self, attr):
+ def __getattr__(self, attr):
+ if attr == 'xml_child_elements':
+ return self._getxml_child_elements()
if isinstance(attr, basestring):
res = self.dom.getElementsByTagName(attr)
if len(res) == 0:
if hasattr(self.dom, 'documentElement'):
val = self.dom.documentElement.getAttribute(attr)
if not self.dom.documentElement.hasAttribute(attr):
- raise Exception(attr + ' not found')
+ raise AttributeError(attr + ' not found')
else:
val = self.dom.getAttribute(attr)
if not self.dom.hasAttribute(attr):
- raise Exception(attr + ' not found')
+ raise AttributeError(attr + ' not found')
return val
return MinidomAmara(res[0], self.dom)
if self.parent:
return MinidomAmara(self.parent.getElementsByTagName(self.dom.tagName)[attr])
else:
- raise Exception(str(attr) + ' not found')
+ raise AttributeError(str(attr) + ' not found')
def __setattr__(self, name, value):
if isinstance(value, basestring):
@@ -134,13 +136,20 @@
for entry in self.parent.getElementsByTagName(self.dom.tagName):
yield MinidomAmara(entry)
- def __str__(self):
+ def _get_text(self, node):
+ """ Recursive method to collate sub-node strings. """
text = ''
- for t_text in self.dom.childNodes:
- if t_text.nodeType == t_text.TEXT_NODE and t_text.data != None:
- text = text + t_text.data
- return text
+ for child in node.childNodes:
+ if child.nodeType == child.TEXT_NODE and child.data != None:
+ text = text + ' ' + child.data
+ else:
+ text += self._get_text(child)
+ return text.strip()
+ def __str__(self):
+ """ Output a string representing the XML node. """
+ return self._get_text(self.dom)
+
def xml(self, out=None, indent=False, omitXmlDeclaration=False, encoding='utf-8'):
"""xml"""
if omitXmlDeclaration:
@@ -169,6 +178,14 @@
if elem.nodeType == elem.ELEMENT_NODE:
l_attrib.append(MinidomAmara(elem))
return l_attrib
+
+ def _getxml_child_elements(self):
+ """get xml children"""
+ l_attrib = {}
+ for elem in self.dom.childNodes:
+ if elem.nodeType == elem.ELEMENT_NODE:
+ l_attrib[elem.tagName] = MinidomAmara(elem)
+ return l_attrib
def _getxml_attributes(self):
"""get aml attributes"""
@@ -209,6 +226,7 @@
childNodes = property(_getxml_children)
xml_children = property(_getxml_children)
xml_attributes = property(_getxml_attributes)
+ xml_child_elements = property(_getxml_child_elements)
def __eq__(self, obj):
return str(self) == obj
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/archive/mappers.py
--- a/buildframework/helium/sf/python/pythoncore/lib/archive/mappers.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/archive/mappers.py Mon Oct 11 11:16:47 2010 +0100
@@ -51,21 +51,25 @@
self._metadata = None
if not os.path.exists(self._config['archives.dir']):
os.makedirs(self._config['archives.dir'])
- if self._config.has_key("grace.metadata") and self._config.get_boolean("grace.metadata", False):
- if self._config.has_key("grace.template") and os.path.exists(self._config["grace.template"]) and \
+
+ if self._config.has_key("grace.metadata"):
+ raise Exception('grace.metadata not supported, see documentation for correct configuration')
+
+ if self._config.has_key("release.metadata") and self._config.get_boolean("release.metadata", False):
+ if self._config.has_key("release.template") and os.path.exists(self._config["release.template"]) and \
not os.path.exists(os.path.join(self._config['archives.dir'], self._config['name'] + ".metadata.xml")):
- shutil.copy(config["grace.template"], os.path.join(self._config['archives.dir'], self._config['name'] + ".metadata.xml"))
+ shutil.copy(config["release.template"], os.path.join(self._config['archives.dir'], self._config['name'] + ".metadata.xml"))
self._metadata = symrec.ReleaseMetadata(os.path.join(self._config['archives.dir'], self._config['name']+ ".metadata.xml"),
- service=self._config['grace.service'],
- product=self._config['grace.product'],
- release=self._config['grace.release'])
+ service=self._config['release.service'],
+ product=self._config['release.product'],
+ release=self._config['release.name'])
self._metadata.save()
def declare_package(self, filename, extract="single"):
""" Add a package to the metadata file. """
if self._metadata is None:
return
- self._metadata.add_package(os.path.basename(filename), extract=extract, filters=self._config.get_list('grace.filters', None), default=self._config.get_boolean('grace.default', True))
+ self._metadata.add_package(os.path.basename(filename), extract=extract, filters=self._config.get_list('release.filters', None), default=self._config.get_boolean('release.default', True))
self._metadata.save()
def create_commands(self, manifest):
@@ -99,12 +103,12 @@
if len(manifests) == 1:
filename = os.path.join(self._config['archives.dir'], self._config['name'])
_logger.info(" * " + filename + self._tool.extension())
- self.declare_package(filename + self._tool.extension(), self._config.get('grace.extract', 'single'))
+ self.declare_package(filename + self._tool.extension(), self._config.get('release.extract', 'single'))
result.extend(self._tool.create_command(self._config.get('zip.root.dir', self._config['root.dir']), filename, manifests=[manifest]))
return [result]
- def _split_manifest_file(self, name, manifest_file_path):
+ def _split_manifest_file(self, name, manifest_file_path, key=None):
""" This method return a list of files that contain the content of the zip parts to create. """
filenames = []
@@ -137,7 +141,10 @@
files = 0
part += 1
- filename = "%s_part%02d" % (name, part)
+ if key is not None:
+ filename = "%s_part%02d_%s" % (name, part, key)
+ else:
+ filename = "%s_part%02d" % (name, part)
filenames.append(os.path.join(self._config['temp.build.dir'], filename + ".txt"))
output = codecs.open(os.path.join(self._config['temp.build.dir'], filename + ".txt"), 'w', "utf-8" )
@@ -156,7 +163,10 @@
files = 0
part += 1
- filename = "%s_part%02d" % (name, part)
+ if key is not None:
+ filename = "%s_part%02d_%s" % (name, part, key)
+ else:
+ filename = "%s_part%02d" % (name, part)
filenames.append(os.path.join(self._config['temp.build.dir'], filename + ".txt"))
output = open(os.path.abspath(os.path.join(self._config['temp.build.dir'], filename + ".txt")), 'w')
@@ -178,7 +188,7 @@
return filenames
-class PolicyMapper(Mapper):
+class PolicyMapper(DefaultMapper):
""" Implements a policy content mapper.
It transforms a list of files into a list of commands with their inputs.
@@ -188,7 +198,7 @@
def __init__(self, config, archiver):
""" Initialization. """
- Mapper.__init__(self, config, archiver)
+ DefaultMapper.__init__(self, config, archiver)
self._policies = {}
self._policy_cache = {}
self._binary = {}
@@ -233,11 +243,22 @@
# Generating sublists.
for key in self._policies.keys():
+ manifests = []
self._policies[key].close()
manifest = os.path.join(self._config['temp.build.dir'], self._config['name'] + "_%s" % key + ".txt")
- filename = os.path.join(self._config['archives.dir'], self._config['name'] + "_%s" % key)
- _logger.info(" * " + filename + self._tool.extension())
- result.extend(self._tool.create_command(self._config.get('zip.root.dir', self._config['root.dir']), filename, manifests=[manifest]))
+ _logger.info(" * Input manifest: " + manifest)
+ if self._config.has_key("split.on.uncompressed.size.enabled") and self._config.get_boolean("split.on.uncompressed.size.enabled", "false"):
+ manifests = self._split_manifest_file(self._config['name'], manifest, key)
+ else:
+ manifests.append(manifest)
+ for manifest in manifests:
+ _logger.info(" * Creating command for manifest: " + manifest)
+ filename = os.path.join(self._config['archives.dir'], os.path.splitext(os.path.basename(manifest))[0])
+ if len(manifests) == 1:
+ filename = os.path.join(self._config['archives.dir'], self._config['name'] + "_%s" % key)
+ _logger.info(" * " + filename + self._tool.extension())
+ self.declare_package(filename + self._tool.extension(), self._config.get('release.extract', 'single'))
+ result.extend(self._tool.create_command(self._config.get('zip.root.dir', self._config['root.dir']), filename, manifests=[manifest]))
stages.append(result)
# See if any internal archives need to be created
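For reference, the part-file naming the mapper now produces when a per-policy manifest is split on uncompressed size, reduced to a minimal stand-alone sketch (part_filename is a hypothetical helper, not code from this changeset):

    def part_filename(name, part, key=None):
        # "<name>_part<NN>_<key>.txt" is used when a policy key is given,
        # "<name>_part<NN>.txt" otherwise (the ".txt" suffix is added by the caller).
        if key is not None:
            return "%s_part%02d_%s" % (name, part, key)
        return "%s_part%02d" % (name, part)

    assert part_filename("release", 1) == "release_part01"
    assert part_filename("release", 2, key="0") == "release_part02_0"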
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/archive/scanners.py
--- a/buildframework/helium/sf/python/pythoncore/lib/archive/scanners.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/archive/scanners.py Mon Oct 11 11:16:47 2010 +0100
@@ -48,11 +48,10 @@
[self.add_exclude_lst(filename) for filename in self._config.get_list('exclude.lst', [])]
[self.add_filetype(filetype) for filetype in self._config.get_list('filetype', [])]
[self.add_selector(archive.selectors.get_selector(selector, self._config)) for selector in self._config.get_list('selector', [])]
- # To support old features.
- # TODO: inform customers and remove.
+
if 'distribution.policy.s60' in self._config:
self.add_selector(archive.selectors.get_selector('distribution.policy.s60', self._config))
-
+
def add_exclude_lst(self, filename):
""" Adding excludes from exclude list. """
if not os.path.exists(filename):
@@ -76,7 +75,7 @@
This method needs to be overloaded by the specialized class.
Returns the full path name.
"""
- raise NotImplementedError()
+ raise NotImplementedError()
class AbldWhatScanner(Scanner):
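The overload contract described in that docstring, shown as a generic sketch (class and method names here are illustrative, not the actual scanners.py API):

    class BaseScanner(object):
        def scan(self):
            """ Needs to be overloaded by the specialized class; returns full path names. """
            raise NotImplementedError()

    class ListScanner(BaseScanner):
        """ Trivial specialization that just returns a fixed list of paths. """
        def __init__(self, paths):
            self.paths = paths

        def scan(self):
            return list(self.paths)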
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/__init__.py
--- a/buildframework/helium/sf/python/pythoncore/lib/ats3/__init__.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/ats3/__init__.py Mon Oct 11 11:16:47 2010 +0100
@@ -54,7 +54,11 @@
# Customize some attributes from how optparse leaves them.
if hasattr(self._opts, 'build_drive'):
self.build_drive = path(self._opts.build_drive)
- self.file_store = path(self._opts.file_store)
+ if os.path.exists(self._opts.file_store):
+ self.file_store = path(self._opts.file_store)
+ else:
+ self.file_store = ''
+ _logger.info(self._opts.file_store + ' not found')
self.flash_images = split_paths(self._opts.flash_images)
if hasattr(self._opts, 'sis_files'):
self.sis_files = split_paths(self._opts.sis_files)
@@ -62,6 +66,8 @@
self.config_file = self._opts.config
if hasattr(self._opts, 'obey_pkgfiles'):
self.obey_pkgfiles = to_bool(self._opts.obey_pkgfiles)
+ if hasattr(self._opts, 'minimum_execution_blocks'):
+ self.minimum_execution_blocks = (self._opts.minimum_execution_blocks == 'true')
if hasattr(self._opts, 'hti'):
self.hti = to_bool(self._opts.hti)
if hasattr(self._opts, 'test_type'):
@@ -91,7 +97,7 @@
for t_key, t_value in temp_dict.items():
self.tsrc_paths_dict[t_key] = t_value
else:
- _logger.error(tsrc + ' not found')
+ _logger.error(tsrc + ' - test source not found')
#preparing a list of main components
for main_component in self.tsrc_paths_dict.keys():
@@ -166,6 +172,9 @@
self.ctc_enabled = 'False'
if hasattr(config, 'ctc_enabled'):
self.ctc_enabled = to_bool(config.ctc_enabled)
+ self.ats_stf_enabled = 'False'
+ if hasattr(config, 'ats_stf_enabled'):
+ self.ats_stf_enabled = to_bool(config.ats_stf_enabled)
if hasattr(config, 'multiset_enabled'):
self.multiset_enabled = to_bool(config.multiset_enabled)
if hasattr(config, 'monsym_files'):
@@ -180,6 +189,9 @@
self.flash_images = config.flash_images
if hasattr(config, 'test_type'):
self.test_type = config.test_type
+ self.minimum_execution_blocks = False
+ if hasattr(config, 'minimum_execution_blocks'):
+ self.minimum_execution_blocks = config.minimum_execution_blocks
def insert_set(self, data_files=None, config_files=None,
engine_ini_file=None, image_files=None, sis_files=None,
@@ -230,12 +242,39 @@
if self.trace_enabled != "":
if self.trace_enabled.lower() == "true":
setd = dict(setd, pmd_files=pmd_files,
- trace_path=self.file_store.joinpath(self.REPORT_PATH, "traces", setd["name"], "tracelog.blx"),
+ trace_path=os.path.join(self.file_store, self.REPORT_PATH, "traces", setd["name"], "tracelog.blx"),
trace_activation_files=trace_activation_files)
else:
setd = dict(setd, pmd_files=[],
trace_path="",trace_activation_files=[])
- self.sets.append(setd)
+
+ if self.minimum_execution_blocks:
+ if self.sets == []:
+ self.sets = [setd]
+ else:
+ files = ['component_path',
+ 'trace_activation_files',
+ 'src_dst',
+ #'trace_path',
+ #'custom_dir',
+ 'pmd_files',
+ 'dll_files',
+ 'config_files',
+ 'data_files',
+ 'testmodule_files']
+
+ if self.sets[0]['test_harness'] == setd['test_harness'] and setd['engine_ini_file'] == None:
+ for param in files:
+ if setd[param]:
+ if type(setd[param]) == dict:
+ for key in setd[param].keys():
+ self.sets[0][param][key] = setd[param][key]
+ else:
+ self.sets[0][param] = self.sets[0][param] + setd[param]
+ else:
+ self.sets.append(setd)
+ else:
+ self.sets.append(setd)
def set_plan_harness(self):
"""setting up test harness for a plan"""
@@ -272,7 +311,7 @@
actions = []
temp_var = ""
include_ctc_runprocess = False
- report_path = self.file_store.joinpath(self.REPORT_PATH)
+ report_path = os.path.join(self.file_store, self.REPORT_PATH)
if self.ctc_enabled and adg.CTC_PATHS_LIST != [] and self.monsym_files != "" and not "${" in self.monsym_files:
include_ctc_runprocess = True
@@ -311,18 +350,18 @@
("send-files", "true"),
("to", self.report_email)))
ats3_report = ("FileStoreAction",
- (("to-folder", report_path.joinpath("ATS3_REPORT")),
+ (("to-folder", os.path.join(report_path, "ATS3_REPORT")),
("report-type", "ATS3_REPORT"),
("date-format", "yyyyMMdd"),
("time-format", "HHmmss")))
stif_report = ("FileStoreAction",
- (("to-folder", report_path.joinpath("STIF_REPORT")),
+ (("to-folder", os.path.join(report_path, "STIF_REPORT")),
("report-type", "STIF_COMPONENT_REPORT_ALL_CASES"),
("run-log", "true"),
("date-format", "yyyyMMdd"),
("time-format", "HHmmss")))
eunit_report = ("FileStoreAction",
- (("to-folder", report_path.joinpath("EUNIT_REPORT")),
+ (("to-folder", os.path.join(report_path, "EUNIT_REPORT")),
("report-type", "EUNIT_COMPONENT_REPORT_ALL_CASES"),
("run-log", "true"),
("date-format", "yyyyMMdd"),
@@ -405,7 +444,9 @@
tesplan_counter += 1
exe_flag = False
for srcanddst in plan_sets['src_dst']:
- _ext = srcanddst[0].rsplit(".")[1]
+ _ext = ''
+ if '.' in srcanddst[0]:
+ _ext = srcanddst[0].rsplit(".")[1]
#the list below are the files which are executable
#if none exists, set is not executable
for mat in ["dll", "ini", "cfg", "exe", "script"]:
@@ -488,9 +529,11 @@
default="")
cli.add_option("--specific-pkg", help="Text in name of pkg files to use", default='')
cli.add_option("--ats4-enabled", help="ATS4 enabled", default="False")
+ cli.add_option("--ats-stf-enabled", help="ATS STF enabled", default="False")
cli.add_option("--obey-pkgfiles", help="If this option is True, then only test components having PKG file are executable and if the compnents don't have PKG files they will be ignored.", default="False")
cli.add_option("--verbose", help="Increase output verbosity", action="store_true", default=False)
cli.add_option("--hti", help="HTI enabled", default="True")
+ cli.add_option("--minimum-execution-blocks", help="Create as few as possible execution blocks", default="false")
opts, tsrc_paths = cli.parse_args()
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/aste_template.xml
--- a/buildframework/helium/sf/python/pythoncore/lib/ats3/aste_template.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/ats3/aste_template.xml Mon Oct 11 11:16:47 2010 +0100
@@ -139,7 +139,7 @@
InstallSisTask
-
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/ats4_template.xml
--- a/buildframework/helium/sf/python/pythoncore/lib/ats3/ats4_template.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/ats3/ats4_template.xml Mon Oct 11 11:16:47 2010 +0100
@@ -77,7 +77,7 @@
{%- endif %}
{% if setd["ctc_enabled"] == "True" -%}
- {{ macros.ctc_initialization() }}
+ {{ macros.ctc_initialization(test_plan) }}
{%- endif %}
@@ -168,7 +168,7 @@
InstallSisTask
-
+
@@ -272,7 +272,7 @@
{% if setd["ctc_enabled"] == "True" -%}
- {{ macros.ctc_finalization(setd) }}
+ {{ macros.ctc_finalization(test_plan) }}
{%- endif %}
@@ -311,7 +311,7 @@
EmailAction
-
+
{% if test_plan['report_type'].lower() == 'no_attachment' -%}
@@ -319,6 +319,15 @@
{%- endif %}
+ {% if test_plan['file_store'] -%}
+
+ FileStoreAction
+
+
+
+
+
+ {% endif %}
{% if test_plan['diamonds_build_url'] -%}
DiamondsAction
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/atsconfigparser.py
--- a/buildframework/helium/sf/python/pythoncore/lib/ats3/atsconfigparser.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/ats3/atsconfigparser.py Mon Oct 11 11:16:47 2010 +0100
@@ -51,8 +51,11 @@
p_temp.value = value
changed = True
if not changed:
- for device in self.doc.test.target.device:
- device.xml_append(self.doc.xml_create_element(u"setting", attributes = {u'name': unicode(name), u'value': unicode(value)}))
+ if hasattr(self.doc, 'test'):
+ for device in self.doc.test.target.device:
+ device.xml_append(self.doc.xml_create_element(u"setting", attributes = {u'name': unicode(name), u'value': unicode(value)}))
+ else:
+ raise Exception("You can't add a setting with ats4")
def containsattribute(self, name, value):
""" returns true or false """
@@ -74,9 +77,12 @@
p_temp.value = value
changed = True
if not changed:
- for device in self.doc.test.target.device:
- device.xml_append(self.doc.xml_create_element(u"property", attributes = {u'name': unicode(name), u'value': unicode(value)}))
-
+ if hasattr(self.doc, 'test'):
+ for device in self.doc.test.target.device:
+ device.xml_append(self.doc.xml_create_element(u"property", attributes = {u'name': unicode(name), u'value': unicode(value)}))
+ else:
+ for device in self.doc.testrun.agents.agent:
+ device.xml_append(self.doc.xml_create_element(u"property", attributes = {u'name': unicode(name), u'value': unicode(value)}))
class ATSConfigParser:
""" ATS configuration parser"""
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/bootup_testing.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/lib/ats3/bootup_testing.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,361 @@
+# -*- encoding: latin-1 -*-
+
+#============================================================================
+#Name : bootup_testing.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+"""Bootup test drop generation."""
+
+#W0142 => * and ** were used
+#R* removed during refactoring
+
+from optparse import OptionParser
+from xml.etree import ElementTree as et
+import logging
+import os
+import re
+import tempfile
+import zipfile
+import pkg_resources # pylint: disable-msg=F0401
+from path import path # pylint: disable-msg=F0401
+import amara
+import ntpath as atspath
+import jinja2 # pylint: disable-msg=F0401
+import ats3.parsers as parser
+
+_logger = logging.getLogger('bootup-testing')
+
+# Shortcuts
+E = et.Element
+SE = et.SubElement
+
+class Configuration(object):
+ """
+ Bootup test drop generation configuration.
+ """
+
+ def __init__(self, opts):
+ """
+ Initialize from optparse configuration options.
+ """
+
+ self._opts = opts
+ # Customize some attributes from how optparse leaves them.
+ self.build_drive = path(self._opts.build_drive)
+ self.file_store = path(self._opts.file_store)
+ self.flash_images = self.split_paths(self._opts.flash_images)
+ self.template_loc = path(self._opts.template_loc)
+
+
+ def split_paths(self, arg, delim=","):
+ """
+ Split the string by delim, removing extra whitespace.
+ """
+ return [path(part.strip())
+ for part in arg.split(delim) if part.strip()]
+
+ def __getattr__(self, attr):
+ return getattr(self._opts, attr)
+
+ def __str__(self):
+ dump = "Configuration:\n"
+ seen = set()
+ for key, value in vars(self).items():
+ if not key.startswith("_"):
+ dump += "\t%s = %s\n" % (key, value)
+ seen.add(key)
+ for key, value in vars(self._opts).items():
+ if key not in seen:
+ dump += "\t%s = %s\n" % (key, value)
+ seen.add(key)
+ return dump
+
+
+class BootupTestPlan(object):
+ """
+ Tells the ATS server to test whether the images can boot the device.
+ """
+
+ def __init__(self, config):
+ self.pkg_parser = parser.PkgFileParser()
+ self.file_store = config.file_store
+ self.build_drive = config.build_drive
+
+ def insert_execution_block(self, block_count=1, image_files=None):
+ """
+ Insert task and flash files into the execution block
+ """
+ if image_files is None:
+ image_files = []
+
+ exe_dict = dict(name="exe%d" % block_count, image_files=image_files)
+
+ return exe_dict
+
+
+ def __getitem__(self, key):
+ return self.__dict__[key]
+
+
+
+class BootupTestDropGenerator(object):
+ """
+ Generate test drop zip file for Bootup testing.
+
+ The main responsibility of this class is to create the test drop and
+ the test.xml file.
+
+ """
+
+ def __init__(self):
+ self.drop_path_root = path("BootupDrop")
+ self.drop_path = None
+ self.defaults = {}
+
+ def generate(self, xml_dict, output_file, template_loc=None):
+ """Generate a test drop file."""
+ xml = self.generate_xml(xml_dict, template_loc)
+ return self.generate_drop(xml_dict, xml, output_file)
+
+ def generate_drop(self, xml_dict, xml, output_file):
+ """Generate test drop zip file."""
+
+ zfile = zipfile.ZipFile(output_file, "w", zipfile.ZIP_DEFLATED)
+ try:
+ for drop_file, src_file in self.drop_files(xml_dict):
+
+ _logger.info(" + Adding: %s" % src_file.strip())
+ try:
+ zfile.write(src_file.strip(), drop_file.encode('utf-8'))
+ except OSError, expr:
+ _logger.error(expr)
+ doc = amara.parse(et.tostring(xml.getroot()))
+ _logger.debug("XML output: %s" % doc.xml(indent=u"yes", encoding="ISO-8859-1"))
+ zfile.writestr("test.xml", doc.xml(indent="yes", encoding="ISO-8859-1"))
+ finally:
+ _logger.info("Testdrop for bootup testing created successfully!")
+ zfile.close()
+
+ def generate_xml(self, xml_dict, template_loc):
+ """ generate an XML file"""
+ template_loc = path(template_loc).normpath()
+ loader = jinja2.ChoiceLoader([jinja2.PackageLoader(__name__, 'templates')])
+ env = jinja2.Environment(loader=loader)
+ if template_loc is None or not ".xml" in template_loc.lower():
+ template = env.from_string(pkg_resources.resource_string(__name__, 'bootup_testing_template.xml'))# pylint: disable-msg=E1101
+ else:
+ template = env.from_string(open(template_loc).read())# pylint: disable-msg=E1101
+
+ xmltext = template.render(xml_dict=xml_dict, os=os, atspath=atspath, atsself=self).encode('ISO-8859-1')
+ #print xmltext
+ return et.ElementTree(et.XML(xmltext))
+
+
+ def drop_files(self, xml_dict):
+ """Yield a list of drop files."""
+
+ drop_set = set()
+ drop_files = []
+ #Adding test asset, there's an execution block for every test asset
+ for execution_block in xml_dict["execution_blocks"]:
+ drop_dir = path(execution_block["name"])
+ drop_files = execution_block["image_files"]
+ for file_path in drop_files:
+ if file_path != None:
+ #Adding image files to the top level,
+ drop_file = drop_dir.joinpath("images", file_path.name)
+
+ drop_file = drop_file.normpath()
+ if drop_file not in drop_set:
+ drop_set.add(drop_file)
+ yield (drop_file, file_path.normpath())
+
+
+class ComponentParser(object):
+ """
+ Add information to the XML dictionary
+ """
+ def __init__(self, config):
+ self.flash_images = [path(p) for p in config.flash_images]
+ self.build_drive = config.build_drive
+ self.diamonds_build_url = config.diamonds_build_url
+ self.testrun_name = config.testrun_name
+ self.alias_name = config.alias_name
+ self.device_type = config.device_type
+ self.report_email = config.report_email
+ self.email_format = config.email_format
+ self.email_subject = config.email_subject
+
+ self.xml_dict = {}
+
+
+ def insert_pre_data(self):
+ """
+ Creates a dictionary for the data before
+ the block starts.
+ """
+ self.xml_dict = dict(self.xml_dict, temp_directory=path(tempfile.mkdtemp()))
+ self.xml_dict = dict(self.xml_dict, diamonds_build_url=self.diamonds_build_url)
+ self.xml_dict = dict(self.xml_dict, testrun_name=self.testrun_name)
+ self.xml_dict = dict(self.xml_dict, alias_name=self.alias_name)
+ self.xml_dict = dict(self.xml_dict, device_type=self.device_type)
+
+ def create_execution_block(self, config):
+ """Parse flash images """
+ execution_block_list = []
+ test_plan = BootupTestPlan(config)
+ block_count = 1
+ self.flash_images = self.get_sorted_images(self.flash_images)
+ execution_block_list.append(test_plan.insert_execution_block(block_count, self.flash_images))
+
+
+ self.xml_dict = dict(self.xml_dict, execution_blocks=execution_block_list)
+
+ def insert_post_data(self):
+ """
+ Creates a dictionary for the data after
+ the block ends (post-action data).
+ """
+ self.xml_dict = dict(self.xml_dict, report_email=self.report_email)
+ self.xml_dict = dict(self.xml_dict, email_format=self.email_format)
+ self.xml_dict = dict(self.xml_dict, email_subject=self.email_subject)
+
+ return self.xml_dict
+
+ def get_sorted_images(self, image_files):
+ """sort the images """
+ sorted_images = []
+ for image_file in image_files:
+ if 'core' in image_file.name:
+ sorted_images.append(image_file)
+ for image_file in image_files:
+ if 'rofs2' in image_file.name:
+ sorted_images.append(image_file)
+ for image_file in image_files:
+ if 'rofs3' in image_file.name:
+ sorted_images.append(image_file)
+ for image_file in image_files:
+ if 'udaerase' in image_file.name:
+ sorted_images.append(image_file)
+ for image_file in image_files:
+ if 'core' not in image_file.name and 'rofs2' not in image_file.name and 'rofs3' not in image_file.name and 'udaerase' not in image_file.name.lower():
+ sorted_images.append(image_file)
+ if len(sorted_images) > 0 and "rofs" in sorted_images[0]:
+ return image_files
+ return sorted_images
+
+def create_drop(config):
+ """Create a test drop."""
+ xml_dict = {}
+
+ _logger.debug("initialize configuration dictionary")
+ drop_parser = ComponentParser(config)
+
+ #Inserting data for test run and global through out the dictionary
+ drop_parser.insert_pre_data()
+
+ #for every asset path there should be a
+ #separate execution block
+ drop_parser.create_execution_block(config)
+
+ #Inserting reporting and email data (post actions)
+ xml_dict = drop_parser.insert_post_data()
+
+
+# print "-------------------------------------------------"
+# keys = xml_dict
+# for key in xml_dict.keys():
+# if key == "execution_blocks":
+# for exe in xml_dict[key]:
+# print key, "->"
+# print exe['name']
+# print exe['image_files']
+# for file1 in exe['image_files']:
+# print file1
+#
+# else:
+# print key, "->", xml_dict[key]
+#
+# print xml_dict['diamonds_build_url']
+# print xml_dict['testrun_name']
+# print xml_dict['email_format']
+# print xml_dict['email_subject']
+#
+# print "-------------------------------------------------"
+
+
+
+ generator = BootupTestDropGenerator()
+ _logger.info("generating drop file: %s" % config.drop_file)
+ generator.generate(xml_dict, output_file=config.drop_file, template_loc=config.template_loc)
+
+
+
+def to_bool(param):
+ """setting a true or false based on a param value"""
+ param = str(param).lower()
+ if "true" == param or "t" == param or "1" == param:
+ return "True"
+ else:
+ return "False"
+
+def main():
+ """Main entry point."""
+
+
+ cli = OptionParser(usage="%prog [options] PATH1 [PATH2 [PATH3 ...]]")
+ cli.add_option("--ats4-enabled", help="ATS4 enabled", default="True")
+ cli.add_option("--build-drive", help="Build area root drive")
+ cli.add_option("--drop-file", help="Name for the final drop zip file", default="ATSBootupDrop.zip")
+
+ cli.add_option("--minimum-flash-images", help="Minimum amount of flash images", default=2)
+ cli.add_option("--flash-images", help="Paths to the flash image files", default="")
+
+ cli.add_option("--template-loc", help="Custom template location", default="")
+
+ cli.add_option("--file-store", help="Destination path for reports.", default="")
+ cli.add_option("--report-email", help="Email notification receivers", default="")
+ cli.add_option("--testrun-name", help="Name of the test run", default="bootup_test")
+ cli.add_option("--alias-name", help="Name of the alias", default="alias")
+ cli.add_option("--device-type", help="Device type (e.g. 'PRODUCT')", default="unknown")
+ cli.add_option("--diamonds-build-url", help="Diamonds build url")
+ cli.add_option("--email-format", help="Format of an email", default="")
+ cli.add_option("--email-subject", help="Subject of an email", default="Bootup Testing")
+ cli.add_option("--verbose", help="Increase output verbosity", action="store_true", default=False)
+
+ opts, _ = cli.parse_args()
+
+ ats4_enabled = to_bool(opts.ats4_enabled)
+
+ if ats4_enabled == "False":
+ cli.error("Bootup test executes on ATS4. Set property 'ats4.enabled'")
+
+ if not opts.flash_images:
+ cli.error("no flash image files given")
+ if len(opts.flash_images.split(",")) < int(opts.minimum_flash_images):
+ cli.error("Not enough flash files: %i defined, %i needed" % (len(opts.flash_images.split(",")), int(opts.minimum_flash_images) ))
+
+ if opts.verbose:
+ _logger.setLevel(logging.DEBUG)
+ logging.basicConfig(level=logging.DEBUG)
+ _ = tempfile.mkdtemp()
+ config = Configuration(opts)
+ create_drop(config)
+
+if __name__ == "__main__":
+ main()
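For illustration, a simplified stand-alone approximation of the ordering get_sorted_images produces (file names are made up; the real method uses several passes and only lower-cases the udaerase check):

    def sort_images(names):
        order = ['core', 'rofs2', 'rofs3', 'udaerase']
        def rank(name):
            for index, marker in enumerate(order):
                if marker in name.lower():
                    return index
            return len(order)
        # a stable sort keeps the original order within each rank
        return sorted(names, key=rank)

    assert sort_images(["x_rofs2.fpsx", "extra.fpsx", "x_core.fpsx"]) == \
           ["x_core.fpsx", "x_rofs2.fpsx", "extra.fpsx"]

The method above additionally falls back to the original, unsorted list when the first sorted image turns out to be a rofs image.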
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/bootup_testing_template.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/lib/ats3/bootup_testing_template.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,92 @@
+
+
+
+
+
+ {% if xml_dict['diamonds_build_url'] -%}
+ {{ xml_dict['diamonds_build_url'] }}
+ Smoke
+ {% endif %}
+ {{ xml_dict['testrun_name'] }}
+
+
+
+
+
+
+
+
+
+ {% for exe_block in xml_dict['execution_blocks'] -%}
+
+
+
+ {% if exe_block['image_files'] -%}
+
+ FlashTask
+
+ {% set i = 1 %}
+ {% for img in exe_block['image_files'] -%}
+
+ {% set i = i + 1 %}
+ {% endfor -%}
+
+
+ {% endif %}
+
+
+ RebootTask
+
+
+
+ CreateDirTask
+
+
+
+
+
+
+
+
+ CleanupTask
+
+
+
+
+
+
+ {% endfor -%}
+
+
+
+ EmailAction
+
+
+
+
+
+
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/dropgenerator.py
--- a/buildframework/helium/sf/python/pythoncore/lib/ats3/dropgenerator.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/ats3/dropgenerator.py Mon Oct 11 11:16:47 2010 +0100
@@ -440,7 +440,7 @@
if 'rofs3' in image_file.name:
sorted_images.append(image_file)
for image_file in setd["image_files"]:
- if 'core' not in image_file.name and 'rofs2' not in image_file.name and 'rofs3' not in image_file.name and 'udaerase' not in image_file.name.lower():
+ if 'core' not in image_file.name and 'rofs2' not in image_file.name and 'rofs3' not in image_file.name:
sorted_images.append(image_file)
if len(sorted_images) > 0 and "rofs" in sorted_images[0]:
return setd["image_files"]
@@ -1029,7 +1029,11 @@
template = env.from_string(pkg_resources.resource_string(__name__, 'ats4_template.xml'))# pylint: disable=E1101
xmltext = template.render(test_plan=test_plan, os=os, atspath=atspath, atsself=self).encode('ISO-8859-1')
- return et.ElementTree(et.XML(xmltext))
+ try:
+ xml = et.ElementTree(et.XML(xmltext))
+ except ExpatError, err:
+ raise ExpatError(str(err) + xmltext)
+ return xml
def get_template(self, directory, template_name):
if directory:
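A minimal sketch of the error-reporting pattern used above; it assumes ExpatError is imported in dropgenerator.py (for example from xml.parsers.expat) so the re-raise can carry the rendered template text along with the parser error:

    from xml.etree import ElementTree as et
    from xml.parsers.expat import ExpatError

    def parse_rendered(xmltext):
        try:
            return et.ElementTree(et.XML(xmltext))
        except ExpatError, err:
            # include the offending XML so template problems show up in the log
            raise ExpatError("%s\n%s" % (err, xmltext))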
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/matti/MattiDrops.py
--- a/buildframework/helium/sf/python/pythoncore/lib/ats3/matti/MattiDrops.py Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,254 +0,0 @@
-# -*- encoding: latin-1 -*-
-
-#============================================================================
-#Name : MattiDrops.py
-#Part of : Helium
-
-#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
-#All rights reserved.
-#This component and the accompanying materials are made available
-#under the terms of the License "Eclipse Public License v1.0"
-#which accompanies this distribution, and is available
-#at the URL "http://www.eclipse.org/legal/epl-v10.html".
-#
-#Initial Contributors:
-#Nokia Corporation - initial contribution.
-#
-#Contributors:
-#
-#Description: Script for test drop generation and sending to execution to
-#ATS3-system
-#===============================================================================
-
-""" create the MATTI test drop file for use on the test server """
-# pylint: disable=R0902, R0903, R0912
-
-import os
-import re
-import sys
-import string
-import zipfile
-import logging
-from optparse import OptionParser
-from xml.etree import ElementTree as et
-from jinja2 import Environment, PackageLoader # pylint: disable=F0401
-
-# Shortcuts
-E = et.Element
-SE = et.SubElement
-
-_logger = logging.getLogger('matti')
-
-class Configuration(object):
- """
- ATS3 drop generation configuration.
- """
-
- def __init__(self, opts):
- """
- Initialize from optparse configuration options.
- """
- self._opts = opts
-
- # Customize some attributes from how optparse leaves them.
- self.build_drive = os.path.normpath(self._opts.build_drive)
- self.file_store = os.path.normpath(self._opts.file_store)
- self.matti_scripts = os.path.normpath(self._opts.matti_scripts)
- self.template_location = os.path.normpath(self._opts.template_loc)
- if self._opts.flash_images:
- self.flash_images = self._opts.flash_images.split(',')
- else:
- self.flash_images = []
- if not re.search(r'\A\s*?\Z', self._opts.sis_files):
- self.sis_files = self._opts.sis_files.split(',')
- else:
- self.sis_files = None
- self.step_list = []
- self.filelist = []
- self.image_list = []
- self.sis_list = []
- self.device_type = self._opts.device_type
- self.device_hwid = self._opts.device_hwid
- self.drop_file = self._opts.drop_file
- self.minimum_flash_images = self._opts.minimum_flash_images
- self.plan_name = self._opts.plan_name
- self.test_timeout = self._opts.test_timeout
- self.diamonds_build_url = self._opts.diamonds_build_url
- self.testrun_name = self._opts.testrun_name
- self.report_email = self._opts.report_email
- self.harness = self._opts.harness
- self.sis_enabled = False
- if self.sis_files:
- if len(self.sis_files) >= 1:
- self.sis_enabled = True
-
-
- def __getattr__(self, attr):
- return getattr(self._opts, attr)
-
- def __str__(self):
- dump = "Configuration:\n"
- seen = set()
- for key, value in vars(self).items():
- if not key.startswith("_"):
- dump += "\t%s = %s\n" % (key, value)
- seen.add(key)
- for key, value in vars(self._opts).items():
- if key not in seen:
- dump += "\t%s = %s\n" % (key, value)
- seen.add(key)
- return dump
-
-class MattiDrop(object):
- """
- ATS3 testdrop generation for MATTI tool
- """
-
- def __init__(self, config=None):
- self.configuration = config
- self.matti_cases = {}
- self.tmp_path = os.getcwd()
- self.files = []
- self.test_files = []
-
- def fetch_testfiles(self):
- """Needed flash files, sis-files and testscripts from given matti scripts -folder are added to file list."""
- tmp_case_list = []
-# tmp_image_list = []
- os.chdir(os.path.normpath(self.configuration.matti_scripts))
- try:
- for path, _, names in os.walk(os.getcwd()):
- for name in names:
- if re.search(r'.*?[.]rb\Z', name):
- tmp_case_list.append((os.path.normpath(os.path.join(path, name)), os.path.join("ats3", "matti", "script", name)))
- if tmp_case_list:
- for tmp_case in tmp_case_list:
- self.configuration.step_list.append(dict(path=os.path.join("§TEST_RUN_ROOT§", str(tmp_case[1])), name="Test case"))
- if self.configuration.flash_images:
- for image in self.configuration.flash_images:
- tmp = string.rsplit(image, os.sep)
- image_name = tmp[len(tmp)-1]
- self.configuration.image_list.append(os.path.join("ATS3Drop", "images", image_name))
- if self.configuration.sis_files:
- for sis in self.configuration.sis_files:
- tmp = string.rsplit(sis, os.sep)
- sis_name = tmp[len(tmp)-1]
- self.configuration.sis_list.append(dict(path=os.path.join("ATS3Drop", "sis", sis_name), dest=sis_name))
- except KeyError, error:
- _logger.error("Error in file reading / fetching!")
- sys.stderr.write(error)
- if tmp_case_list:
- for tmp_case in tmp_case_list:
- self.configuration.filelist.append(tmp_case[1])
- return tmp_case_list
- else:
- _logger.error("No test cases/files available!")
- return None
-
-
- def create_testxml(self):
- """This method will use Jinja2 template engine for test.xml creation"""
- os.chdir(self.tmp_path)
- env = Environment(loader=PackageLoader('ats3.matti', 'template'))
- if os.path.isfile(self.configuration.template_location):
- template = env.from_string(open(self.configuration.template_location).read())
- xml_file = open("test.xml", 'w')
- xml_file.write(template.render(configuration=self.configuration))
- xml_file.close()
- else:
- _logger.error("No template file found")
-
- def create_testdrop(self, output_file=None, file_list=None):
- """Creates testdrop zip-file to given location."""
- #env = Environment(loader=PackageLoader('MattiDrops', 'template'))
- os.chdir(self.tmp_path)
- if output_file and file_list:
- zfile = zipfile.ZipFile(output_file, "w", zipfile.ZIP_DEFLATED)
- try:
- _logger.info("Adding files to testdrop:")
- for src_file, drop_file in file_list:
- _logger.info(" + Adding: %s" % src_file.strip())
- if os.path.isfile(src_file):
- zfile.write(str(src_file.strip()), str(drop_file))
- else:
- _logger.error("invalid test file name supplied %s " % drop_file)
- if self.configuration.flash_images:
- for image in self.configuration.flash_images:
- tmp = string.rsplit(image, os.sep)
- image_name = tmp[len(tmp)-1]
- _logger.info(" + Adding: %s" % image_name)
- if os.path.isfile(image):
- zfile.write(image, os.path.join("ATS3Drop", "images", image_name))
- else:
- _logger.error("invalid flash file name supplied %s " % image_name)
- if self.configuration.sis_enabled:
- if self.configuration.sis_files:
- for sis in self.configuration.sis_files:
- tmp = string.rsplit(sis, os.sep)
- sis_name = tmp[len(tmp)-1]
- _logger.info(" + Adding: %s" % sis_name)
- if os.path.isfile(sis):
- zfile.write(sis, os.path.join("ATS3Drop", "sis", sis_name))
- else:
- _logger.error("invalid sis file name supplied %s " % sis_name)
- zfile.write(os.path.normpath(os.path.join(os.getcwd(),"test.xml")), "test.xml")
- finally:
- _logger.info("Testdrop created! %s" % output_file)
- zfile.close()
- return zfile
-
-def create_drop(configuration):
- """Testdrop creation"""
- if configuration:
- m_drop = MattiDrop(configuration)
- m_drop.fetch_testfiles()
- m_drop.create_testxml()
- return m_drop.create_testdrop(configuration.drop_file, m_drop.fetch_testfiles())
- else:
- _logger.error("No configuration available for test drop creation")
-
-def main():
- """Main entry point."""
- cli = OptionParser(usage="%prog [options] TSRC1 [TSRC2 [TSRC3 ...]]")
- cli.add_option("--build-drive", help="Build area root drive", default='X:')
- cli.add_option("--matti-scripts", help="Path to the directory where the MATTI test scripts are saved.", default="")
- cli.add_option("--flash-images", help="Flash image files as a list",
- default="")
- cli.add_option("--report-email", help="Email notification receivers",
- default="")
- cli.add_option("--harness", help="Test harness (default: %default)",
- default="unknown")
- cli.add_option("--file-store", help="Destination path for reports.",
- default="")
- cli.add_option("--testrun-name", help="Name of the test run",
- default="run")
- cli.add_option("--device-type", help="Device type (e.g. 'PRODUCT')",
- default="unknown")
- cli.add_option("--device-hwid", help="Device hwid",
- default="")
- cli.add_option("--diamonds-build-url", help="Diamonds build url")
- cli.add_option("--drop-file", help="Name for the final drop zip file",
- default="")
- cli.add_option("--minimum-flash-images", help="Minimum amount of flash images",
- default=2)
- cli.add_option("--plan-name", help="Name of the test plan",
- default="plan")
- cli.add_option("--sis-files", help="Sis files as a list",
- default="")
- cli.add_option("--template-loc", help="location of template file",
- default="..\template")
- cli.add_option("--test-timeout", help="Test execution timeout value (default: %default)",
- default="60")
- cli.add_option("--verbose", help="Increase output verbosity",
- action="store_true", default=True)
- opts, _ = cli.parse_args()
-
- if opts.verbose:
- _logger.setLevel(logging.DEBUG)
- logging.basicConfig(level=logging.DEBUG)
- config = Configuration(opts)
- create_drop(config)
-
-
-if __name__ == "__main__":
- main()
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/matti/__init__.py
--- a/buildframework/helium/sf/python/pythoncore/lib/ats3/matti/__init__.py Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,21 +0,0 @@
-# -*- coding: latin-1 -*-
-
-#============================================================================
-#Name : __init__.py
-#Part of : Helium
-
-#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
-#All rights reserved.
-#This component and the accompanying materials are made available
-#under the terms of the License "Eclipse Public License v1.0"
-#which accompanies this distribution, and is available
-#at the URL "http://www.eclipse.org/legal/epl-v10.html".
-#
-#Initial Contributors:
-#Nokia Corporation - initial contribution.
-#
-#Contributors:
-#
-#Description:
-#===============================================================================
-""" nothing needed here"""
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/matti/template/matti_demo.xml
--- a/buildframework/helium/sf/python/pythoncore/lib/ats3/matti/template/matti_demo.xml Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,128 +0,0 @@
-
-
-{{ configuration.testrun_name }}
-
-
-
-
-
-
-
-
-
- {%- set i = 0 -%}
-
-
-
-
-
- {% if configuration.image_list -%}
- {% for flash in configuration.image_list -%}
-
- {% endfor -%}
- {% endif %}
-
- makedir
-
-
-
-
- {% if configuration.sis_list -%}
- {% for sis in configuration.sis_list -%}
-
- install
-
-
-
-
-
-
-
-
- install-software
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
- {% endfor -%}
- {% endif %}
- {% for step in configuration.step_list -%}
-
- execute
-
-
-
-
-
-
-
-
-
-
- {% endfor -%}
-
-
- {%- set i = i + 1 -%}
-
-
- {% if configuration.report_email %}
-
- SendEmailAction
-
-
-
-
-
-
-
- {% endif %}
- {% if configuration.filelist %}
-
- {% for img in configuration.image_list -%}
- {{ img }}
- {% endfor -%}
- {% for sis in configuration.sis_list -%}
- {{ sis['path'] }}
- {% endfor -%}
- {% for file in configuration.filelist -%}
- {{ file }}
- {% endfor -%}
-
- {% endif %}
-
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/matti2.py
--- a/buildframework/helium/sf/python/pythoncore/lib/ats3/matti2.py Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,446 +0,0 @@
-# -*- encoding: latin-1 -*-
-
-#============================================================================
-#Name : matti.py
-#Part of : Helium
-
-#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
-#All rights reserved.
-#This component and the accompanying materials are made available
-#under the terms of the License "Eclipse Public License v1.0"
-#which accompanies this distribution, and is available
-#at the URL "http://www.eclipse.org/legal/epl-v10.html".
-#
-#Initial Contributors:
-#Nokia Corporation - initial contribution.
-#
-#Contributors:
-#
-#Description:
-#===============================================================================
-
-"""MATTI test drop generation."""
-
-# pylint: disable=R0201,R0903,R0902,W0142
-#W0142 => * and ** were used
-#R* removed during refactoring
-
-from optparse import OptionParser
-from xml.etree import ElementTree as et
-import logging
-import os
-import re
-import tempfile
-import zipfile
-import pkg_resources # pylint: disable=F0401
-from path import path # pylint: disable=F0401
-import amara
-import ntpath as atspath
-import jinja2 # pylint: disable=F0401
-import ats3.parsers as parser
-
-_logger = logging.getLogger('matti')
-
-# Shortcuts
-E = et.Element
-SE = et.SubElement
-
-class Configuration(object):
- """
- MATTI drop generation configuration.
- """
-
- def __init__(self, opts):
- """
- Initialize from optparse configuration options.
- """
-
- self._opts = opts
- # Customize some attributes from how optparse leaves them.
- self.build_drive = path(self._opts.build_drive)
- self.file_store = path(self._opts.file_store)
- self.flash_images = self.split_paths(self._opts.flash_images)
- self.matti_sis_files = self.split_paths(self._opts.matti_sis_files)
- self.test_assets = self.split_paths(self._opts.testasset_location)
- self.template_loc = path(self._opts.template_loc)
-
-
- def split_paths(self, arg, delim=","):
- """
- Split the string by delim, removing extra whitespace.
- """
- return [path(part.strip())
- for part in arg.split(delim) if part.strip()]
-
- def __getattr__(self, attr):
- return getattr(self._opts, attr)
-
- def __str__(self):
- dump = "Configuration:\n"
- seen = set()
- for key, value in vars(self).items():
- if not key.startswith("_"):
- dump += "\t%s = %s\n" % (key, value)
- seen.add(key)
- for key, value in vars(self._opts).items():
- if key not in seen:
- dump += "\t%s = %s\n" % (key, value)
- seen.add(key)
- return dump
-
-
-class MattiTestPlan(object):
- """
- Tells MATTI server what to test and how.
-
- The MATTI test plan from which the test.xml file can be written. The test
- plan requires TestAsset(s) to perform the tests
- """
-
- def __init__(self, config):
- self.pkg_parser = parser.PkgFileParser()
- self.file_store = config.file_store
- self.matti_timeout = config.matti_timeout
- self.test_profiles = config.test_profiles
- self.sierra_enabled = to_bool(config.sierra_enabled)
- self.sierra_parameters = config.sierra_parameters
- self.test_profiles = config.test_profiles.strip().split(",")
- self.build_drive = config.build_drive
- self.matti_sis_files = ""
- self.install_files = []
- self.matti_task_files = None
-
- def insert_execution_block(self, block_count=1, image_files=None, matti_sis_files=None, test_asset_path=None, matti_parameters=None):
- """
- Insert Matti tasks and test data files into execution block
- """
- self.matti_sis_files = matti_sis_files
- temp_sis_files = []
- if self.matti_sis_files != None:
- for sis_file in self.matti_sis_files:
- temp_sis_files.append(sis_file.split("#"))
-
- test_asset_path = test_asset_path
- if image_files is None:
- image_files = []
-
- exe_dict = dict(name="exe%d" % block_count, asset_path=test_asset_path, image_files=image_files, matti_sis_files=temp_sis_files)
- exe_dict = dict(exe_dict, test_timeout=self.matti_timeout)
- exe_dict = dict(exe_dict, matti_parameters=matti_parameters)
- exe_dict = dict(exe_dict, sierra_enabled=self.sierra_enabled.lower())
- exe_dict = dict(exe_dict, sierra_parameters=self.sierra_parameters)
-
-
- self.matti_task_files = self.create_matti_task_files_list(self.sierra_enabled, test_asset_path)
- exe_dict = dict(exe_dict, matti_task_files=self.matti_task_files)
-
- self.install_files = self.create_install_files_list(test_asset_path)
- exe_dict = dict(exe_dict, install_files=self.install_files)
- return exe_dict
-
- def create_matti_task_files_list(self, enabler=None, asset_path=None):
- """
- Creates list of files needed to include in MATTI execution tasks
- if sierra.enabled then
- profiles (.sip files) are included
- else
- all ruby (.rb) files are included
- """
-
- profiles = []
- rb_files = []
-
- #If sierra engine is enabled (set to True)
- if self.sierra_enabled.lower() == "true":
- profile_path = path(os.path.join(asset_path, "profile"))
- if os.path.exists(profile_path):
- for profile_name in self.test_profiles:
- item = list(profile_path.walkfiles("%s.sip"%profile_name.lower().strip()))
- if len(item) > 0:
- #profiles.append(os.path.join(profile_path, item[0]))
- profiles.append(asset_path.rsplit(os.sep, 1)[1] + "/" + "profile" + "/" + item[0].rsplit(os.sep, 1)[1])
- return profiles
- else: #If sierra engine is not enabled (set to False)
- if os.path.exists(asset_path):
- #returns list(asset_path.walkfiles("*.rb")):
- for item in list(asset_path.walkfiles("*.rb")):
- rb_files.append(asset_path.rsplit(os.sep, 1)[1] + "/" + item.rsplit(os.sep, 1)[1])
- # Sorting the result, so we ensure they are always in similar order.
- rb_files.sort()
- return rb_files
-
- def create_install_files_list(self, asset_path=None):
- """
- Collects all the .pkg files and extract data
- Creates a list of src, dst files.
- """
- pkg_files = []
- if os.path.exists(asset_path):
- pkg_files = list(asset_path.walkfiles("*.pkg"))
- return self.pkg_parser.get_data_files(pkg_files, self.build_drive)
- else:
- return None
-
- def __getitem__(self, key):
- return self.__dict__[key]
-
-
-
-class MattiTestDropGenerator(object):
- """
- Generate test drop zip file for Matti.
-
- Generates drop zip files file from Test Assets. The main
- responsibility of this class is to create testdrop and test.xml
- file and build a zip file for the MATTI drop.
-
- """
-
- def __init__(self):
- self.drop_path_root = path("MATTIDrop")
- self.drop_path = None
- self.defaults = {}
-
- def generate(self, xml_dict, output_file, template_loc=None):
- """Generate a test drop file."""
- xml = self.generate_xml(xml_dict, template_loc)
- return self.generate_drop(xml_dict, xml, output_file)
-
- def generate_drop(self, xml_dict, xml, output_file):
- """Generate test drop zip file."""
-
- zfile = zipfile.ZipFile(output_file, "w", zipfile.ZIP_DEFLATED)
- try:
- for drop_file, src_file in self.drop_files(xml_dict):
-
- _logger.info(" + Adding: %s" % src_file.strip())
- try:
- zfile.write(src_file.strip(), drop_file.encode('utf-8'))
- except OSError, expr:
- _logger.error(expr)
- doc = amara.parse(et.tostring(xml.getroot()))
- _logger.debug("XML output: %s" % doc.xml(indent=u"yes", encoding="ISO-8859-1"))
- zfile.writestr("test.xml", doc.xml(indent="yes", encoding="ISO-8859-1"))
- finally:
- _logger.info("Matti testdrop created successfully!")
- zfile.close()
-
- def generate_xml(self, xml_dict, template_loc):
- """ generate an XML file"""
- template_loc = path(template_loc).normpath()
- loader = jinja2.ChoiceLoader([jinja2.PackageLoader(__name__, 'templates')])
- env = jinja2.Environment(loader=loader)
- if template_loc is None or not ".xml" in template_loc.lower():
- template = env.from_string(pkg_resources.resource_string(__name__, 'matti_template.xml'))# pylint: disable=E1101
- else:
- template = env.from_string(open(template_loc).read())# pylint: disable=E1101
-
- xmltext = template.render(xml_dict=xml_dict, os=os, atspath=atspath, atsself=self).encode('ISO-8859-1')
- #print xmltext
- return et.ElementTree(et.XML(xmltext))
-
-
- def generate_testasset_zip(self, xml_dict, output_file=None):
- """Generate TestAsset.zip for the MATTI server"""
- filename = xml_dict["temp_directory"].joinpath(r"TestAsset.zip")
-
- if output_file != None:
- filename = output_file
-
- for exe_block in xml_dict["execution_blocks"]:
- testasset_location = path(exe_block["asset_path"])
-
- zfile = zipfile.ZipFile(filename, "w", zipfile.ZIP_DEFLATED)
- try:
- for file_ in list(testasset_location.walkfiles()):
- file_mod = file_.replace(testasset_location, "")
- zfile.write(file_, file_mod.encode('utf-8'))
- finally:
- zfile.close()
- return filename
-
- def drop_files(self, xml_dict):
- """Yield a list of drop files."""
-
- drop_set = set()
- drop_files = []
-
- #Adding test asset, there's an execution block for every test asset
- for execution_block in xml_dict["execution_blocks"]:
- testasset_location = path(execution_block["asset_path"])
- asset_files = list(testasset_location.walkfiles())
-
- drop_path = path(execution_block["name"])
-
- drop_files = ((drop_path.parent, "images", execution_block["image_files"]),
- (drop_path.parent, "sisfiles", execution_block["matti_sis_files"]),
- (drop_path.parent, "mattiparameters", execution_block["matti_parameters"]),
- (drop_path.parent, execution_block["name"], asset_files))
-
- for drop_dir, sub_dir, files in drop_files:
- for file_path in files:
- if file_path != None:
-
- #Adding image files to the top level,
- #Also adding mattiparameters.xml file
- if sub_dir.lower() == "images" or sub_dir.lower() == "mattiparameters":
- drop_file = drop_dir.joinpath(sub_dir, file_path.name)
-
- #Adding sisfiles, installation of matti sisfiles is a bit different
- #than normal sisfiles
- elif sub_dir.lower() == "sisfiles":
- drop_file = drop_dir.joinpath(sub_dir, path(file_path[0]).name)
- file_path = path(file_path[0])
-
- #Adding test asset files
- else:
- temp_file = file_path.rsplit(os.sep, 1)[0]
- replace_string = testasset_location.rsplit(os.sep, 1)[0]
- drop_file = drop_dir.joinpath(sub_dir + "\\" + temp_file.replace(replace_string, ""), file_path.name)
-
- drop_file = drop_file.normpath()
- if drop_file not in drop_set:
- drop_set.add(drop_file)
- yield (drop_file, file_path.normpath())
-
-
-class MattiComponentParser(object):
- """
- Add information to the XML dictionary
- """
- def __init__(self, config):
- self.flash_images = [path(p) for p in config.flash_images]
- self.matti_parameters = [path(config.matti_parameters).normpath()]
- self.matti_sis_files = config.matti_sis_files
- self.build_drive = config.build_drive
- self.test_timeout = config.matti_timeout
- self.diamonds_build_url = config.diamonds_build_url
- self.testrun_name = config.testrun_name
- self.alias_name = config.alias_name
- self.device_type = config.device_type
- self.report_email = config.report_email
- self.email_format = config.email_format
- self.email_subject = config.email_subject
-
- self.xml_dict = {}
-
-
- def insert_pre_data(self):
- """
- Creates a dictionary for the data before
- the block starts.
- """
- self.xml_dict = dict(self.xml_dict, temp_directory=path(tempfile.mkdtemp()))
- self.xml_dict = dict(self.xml_dict, diamonds_build_url=self.diamonds_build_url)
- self.xml_dict = dict(self.xml_dict, testrun_name=self.testrun_name)
- self.xml_dict = dict(self.xml_dict, alias_name=self.alias_name)
- self.xml_dict = dict(self.xml_dict, device_type=self.device_type)
-
- def create_execution_block(self, config):
- """Parse flash images and creates execution block for matti"""
- execution_block_list = []
- block_count = 0
- for test_asset in config.test_assets:
- if os.path.exists(test_asset):
- test_plan = MattiTestPlan(config)
- block_count += 1
- execution_block_list.append(test_plan.insert_execution_block(block_count, self.flash_images, self.matti_sis_files, test_asset, self.matti_parameters))
-
-
- self.xml_dict = dict(self.xml_dict, execution_blocks=execution_block_list)
-
- def insert_post_data(self):
- """
- Creates a dictionary for the data after
- the block ends. Or, Postaction data
- """
- self.xml_dict = dict(self.xml_dict, report_email=self.report_email)
- self.xml_dict = dict(self.xml_dict, email_format=self.email_format)
- self.xml_dict = dict(self.xml_dict, email_subject=self.email_subject)
-
- return self.xml_dict
-
-def create_drop(config):
- """Create a test drop."""
- xml_dict = {}
-
- _logger.debug("initialize Matti dictionary")
- drop_parser = MattiComponentParser(config)
-
- #Inserting data for test run and global through out the dictionary
- drop_parser.insert_pre_data()
-
- #for every asset path there should be a
- #separate execution block
- drop_parser.create_execution_block(config)
-
- #Inserting reporting and email data (post actions)
- xml_dict = drop_parser.insert_post_data()
-
- generator = MattiTestDropGenerator()
-
- _logger.info("generating drop file: %s" % config.drop_file)
- generator.generate(xml_dict, output_file=config.drop_file, template_loc=config.template_loc)
-
-def to_bool(param):
- """setting a true or false based on a param value"""
- param = str(param).lower()
- if "true" == param or "t" == param or "1" == param:
- return "True"
- else:
- return "False"
-
-def main():
- """Main entry point."""
-
-
- cli = OptionParser(usage="%prog [options] PATH1 [PATH2 [PATH3 ...]]")
- cli.add_option("--ats4-enabled", help="ATS4 enabled", default="True")
- cli.add_option("--build-drive", help="Build area root drive")
- cli.add_option("--drop-file", help="Name for the final drop zip file", default="MATTIDrop.zip")
-
- cli.add_option("--minimum-flash-images", help="Minimum amount of flash images", default=2)
- cli.add_option("--flash-images", help="Paths to the flash image files", default="")
- cli.add_option("--matti-sis-files", help="Sis files location", default="")
-
- cli.add_option("--testasset-location", help="MATTI test assets location", default="")
- cli.add_option("--template-loc", help="Custom template location", default="")
- cli.add_option("--sierra-enabled", help="Enabled or disabled Sierra", default=True)
- cli.add_option("--test-profiles", help="Test profiles e.g. bat, fute", default="")
- cli.add_option("--matti-parameters", help="Location of xml file contains additional parameters for Matti", default="")
-
- cli.add_option("--matti-timeout", help="Test execution timeout value (default: %default)", default="60")
- cli.add_option("--sierra-parameters", help="Additional sierra parameters for matti task", default="")
- cli.add_option("--file-store", help="Destination path for reports.", default="")
- cli.add_option("--report-email", help="Email notification receivers", default="")
- cli.add_option("--testrun-name", help="Name of the test run", default="run")
- cli.add_option("--alias-name", help="Name of the alias", default="sut_s60")
- cli.add_option("--device-type", help="Device type (e.g. 'PRODUCT')", default="unknown")
- cli.add_option("--diamonds-build-url", help="Diamonds build url")
- cli.add_option("--email-format", help="Format of an email", default="")
- cli.add_option("--email-subject", help="Subject of an email", default="Matti Testing")
-
-
- cli.add_option("--verbose", help="Increase output verbosity", action="store_true", default=False)
-
- opts, _ = cli.parse_args()
-
- ats4_enabled = to_bool(opts.ats4_enabled)
-
- if ats4_enabled == "False":
- cli.error("MATTI tests execute on ATS4. Set property 'ats4.enabled'")
-
- if not opts.flash_images:
- cli.error("no flash image files given")
- if len(opts.flash_images.split(",")) < int(opts.minimum_flash_images):
- cli.error("Not enough flash files: %i defined, %i needed" % (len(opts.flash_images.split(",")), int(opts.minimum_flash_images) ))
-
- if opts.verbose:
- _logger.setLevel(logging.DEBUG)
- logging.basicConfig(level=logging.DEBUG)
- _ = tempfile.mkdtemp()
- config = Configuration(opts)
- create_drop(config)
-
-if __name__ == "__main__":
- main()
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/matti_template.xml
--- a/buildframework/helium/sf/python/pythoncore/lib/ats3/matti_template.xml Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,154 +0,0 @@
-
-
-
-
-
- {% if xml_dict['diamonds_build_url'] -%}
- {{ xml_dict['diamonds_build_url'] }}
- Smoke
- {% endif %}
- {{ xml_dict['testrun_name'] }}
-
-
-
-
-
-
-
-
-
- {% for exe_block in xml_dict['execution_blocks'] -%}
-
-
-
- {% if exe_block['image_files'] -%}
-
- FlashTask
-
- {% set i = 1 %}
- {% for img in exe_block['image_files'] -%}
-
- {% set i = i + 1 %}
- {% endfor -%}
-
-
- {% endif %}
-
-
- {% if exe_block['install_files'] != [] -%}
- {% for file in exe_block['install_files'] -%}
-
- FileUploadTask
-
-
-
-
-
- {% endfor -%}
- {% endif %}
-
- {% if exe_block['matti_sis_files'] != [] -%}
- {% for sisfile in exe_block['matti_sis_files'] -%}
-
- FileUploadTask
-
-
-
-
-
- {% endfor -%}
- {% endif %}
-
- {% for sis_file in exe_block["matti_sis_files"] -%}
-
- InstallSisTask
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
- {%- endfor -%}
-
-
- RebootTask
-
-
-
- CreateDirTask
-
-
-
-
-
-
- {% for task_file in exe_block["matti_task_files"] -%}
-
- MATTITask
-
-
-
-
-
-
-
-
- {% endfor -%}
-
-
-
- CleanupTask
-
-
-
-
-
-
- {% endfor -%}
-
-
-
- EmailAction
-
-
-
-
-
-
-
-
-
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/parsers.py
--- a/buildframework/helium/sf/python/pythoncore/lib/ats3/parsers.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/ats3/parsers.py Mon Oct 11 11:16:47 2010 +0100
@@ -222,7 +222,7 @@
_logger.debug(itm)
for itm in test_cases:
_logger.debug(itm)
- _logger.error(path_to_bld + ' test_sets are empty')
+ _logger.error(path_to_bld + ' - test sets are empty')
return test_sets
@@ -553,7 +553,7 @@
elif harness is "STIF":
dll_type = "executable"
- except:
+ except IOError:
traceback.print_exc()
else:
returnvals = None
@@ -596,7 +596,6 @@
self.drive = drive
self._files = []
self.pkg_files = []
- self.pkg_file_path = None
self.exclude = ""
self.location = None
self.bldpath = bldpath
@@ -638,7 +637,7 @@
return self.pkg_files
- def get_data_files(self, location = [], drive = "", exclude = ""):
+ def get_data_files(self, location=None, drive="", exclude=""):
"""
Returns data files, source and destination of the files to be installed
on the phone
@@ -656,7 +655,9 @@
If no pkg file is given, the function will try to find the file(s) with the ".pkg" extension in the given location.
"""
-
+ if location == None:
+ location = []
+
self.drive = drive
self.exclude = exclude
self._files = []
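The signature change from location=[] to location=None avoids Python's shared mutable default pitfall; a stand-alone illustration (collect and collect_fixed are hypothetical):

    def collect(item, bucket=[]):            # old style: one list shared by every call
        bucket.append(item)
        return bucket

    assert collect("a.pkg") == ["a.pkg"]
    assert collect("b.pkg") == ["a.pkg", "b.pkg"]   # the previous call leaks into this one

    def collect_fixed(item, bucket=None):    # new style used above
        if bucket is None:
            bucket = []
        bucket.append(item)
        return bucket

    assert collect_fixed("a.pkg") == ["a.pkg"]
    assert collect_fixed("b.pkg") == ["b.pkg"]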
@@ -681,15 +682,13 @@
return self.read_pkg_file(self._files)
- def __map_pkg_path(self, pkg_line, pkg_file_path, pkg_file):
+ def __map_pkg_path(self, pkg_line, pkg_file_path, pkg_file, test_type, libraries):
"""Parse package file to get the src and dst paths" for installing files"""
- mmp_parser = MmpFileParser(self.bldpath)
ext = ""
val1 = ""
val2 = ""
map_src = ""
map_dst = ""
- self.pkg_file_path = pkg_file_path
if not self.exclude == "":
if re.search(r'%s' % self.exclude, pkg_line) is not None:
@@ -712,19 +711,20 @@
if "$(target)" in val1.lower() and self.build_target is not None:
val1 = val1.lower().replace("$(target)", self.build_target)
- #For MATTI PKG files in which location of the data files are unknown or can be changed
+ #For TDriver PKG files in which location of the data files are unknown or can be changed
if "[PKG_LOC]" in val1.upper():
- val1 = val1.replace("[PKG_LOC]", self.pkg_file_path)
+ val1 = val1.replace("[PKG_LOC]", pkg_file_path)
- if os.path.exists(val1):
- map_src = os.path.abspath(val1)
+ if os.path.isabs(os.path.normpath(val1)):
+ map_src = os.path.normpath(os.path.join(self.drive, val1))
+ elif re.search(r"\A\w", val1, 1):
+ map_src = os.path.normpath(os.path.join(pkg_file_path + os.sep, os.path.normpath(val1)))
else:
- if os.path.isabs(os.path.normpath(val1)):
- map_src = os.path.normpath(os.path.join(self.drive, val1))
- elif re.search(r"\A\w", val1, 1):
- map_src = os.path.normpath(os.path.join(self.pkg_file_path + os.sep, os.path.normpath(val1)))
- else:
- map_src = os.path.normpath(os.path.join(self.pkg_file_path, val1))
+ map_src = os.path.normpath(os.path.join(pkg_file_path, val1))
+
+ if os.sep == '\\':
+ if os.path.splitunc(val1)[0].strip() != "":
+ map_src = os.path.normpath(val1)
map_dst = os.path.normpath(val2)
else:
map_src, map_dst = val1, val2
@@ -739,14 +739,6 @@
ext = indx[1]
else:
_logger.warning("File extension not found for " + map_dst)
-
- _test_type_ = ""
- _target_filename_ = ""
-
- _target_filename_ = mmp_parser.get_target_filename(self.pkg_file_path)
- _test_type_ = mmp_parser.get_dll_type(self.pkg_file_path)
- _harness_ = mmp_parser.get_harness(self.pkg_file_path)
- _libraries_ = mmp_parser.get_libraries(self.pkg_file_path)
if ext == "ini":
file_type = "engine_ini"
@@ -754,24 +746,24 @@
file_type = "conf"
elif ext == "dll":
#adding type of dll (executable or dependent), if file type is dll
- if _test_type_ == "dependent":
- file_type = "data" + ":%s" % _test_type_
+ if test_type == "dependent":
+ file_type = "data" + ":%s" % test_type
else:
- if "qttest.lib" in _libraries_:
+ if "qttest.lib" in libraries:
file_type = "data" + ":qt:dependent"
else:
- if 'symbianunittestfw.lib' in _libraries_:
+ if 'symbianunittestfw.lib' in libraries:
file_type = "testmodule:sut"
else:
file_type = "testmodule"
- elif ext == 'exe' and 'rtest' in _libraries_:
+ elif ext == 'exe' and 'rtest' in libraries:
file_type = "testmodule:rtest"
elif ext == "exe":
- if _test_type_ == "dependent":
- file_type = "data" + ":%s" % _test_type_
+ if test_type == "dependent":
+ file_type = "data" + ":%s" % test_type
else:
- if "qttest.lib" in _libraries_:
+ if "qttest.lib" in libraries:
file_type = "testmodule:qt"
else:
file_type = "testmodule"
@@ -783,7 +775,7 @@
elif ext == "pmd":
file_type = "pmd"
elif ext == "script":
- if "testframeworkclient.lib" in _libraries_:
+ if "testframeworkclient.lib" in libraries:
file_type = "testscript:mtf:testframework.exe"
else:
file_type = "testscript:testexecute.exe"
@@ -796,7 +788,7 @@
elif exename == 'testexecute.exe':
file_type = "testscript:" + exename
else:
- if "testframeworkclient.lib" in _libraries_:
+ if "testframeworkclient.lib" in libraries:
file_type = "testscript:mtf:" + exename
else:
file_type = "testscript:" + exename
@@ -817,8 +809,8 @@
pkg_files = [pkg_files]
for pkg_file in pkg_files:
- if not os.path.exists( pkg_file ):
- _logger.error("No PKG -file in path specified")
+ if not os.path.exists(pkg_file):
+ _logger.error(pkg_file + ' not found')
continue
else:
file1 = codecs.open(pkg_file, 'r', 'utf16')
@@ -829,11 +821,13 @@
lines = file1.readlines()
pkg_file_path = path((pkg_file.rsplit(os.sep, 1))[0])
+
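+ #Parse the MMP file once per PKG file; the results are passed to __map_pkg_path for each line.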
+ mmp_parser = MmpFileParser(self.bldpath)
+ test_type = mmp_parser.get_dll_type(pkg_file_path)
+ libraries = mmp_parser.get_libraries(pkg_file_path)
for line in lines:
- pkg_path = self.__map_pkg_path(line, pkg_file_path, os.path.basename(pkg_file))
- if pkg_path is None:
- continue
- else:
+ pkg_path = self.__map_pkg_path(line, pkg_file_path, os.path.basename(pkg_file), test_type, libraries)
+ if pkg_path:
pkg_paths.append(pkg_path)
return pkg_paths
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/tdriver.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/lib/ats3/tdriver.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,454 @@
+# -*- encoding: latin-1 -*-
+
+#============================================================================
+#Name : tdriver.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+"""TDriver test drop generation."""
+
+
+#W0142 => * and ** were used
+#R* removed during refactoring
+
+from optparse import OptionParser
+from xml.etree import ElementTree as et
+import pkg_resources
+from path import path # pylint: disable=F0401
+import logging
+import os
+import re
+import tempfile
+import zipfile
+import amara
+import ntpath as atspath
+import jinja2 # pylint: disable=F0401
+import ats3.parsers
+
+_logger = logging.getLogger('tdriver')
+
+# Shortcuts
+E = et.Element
+SE = et.SubElement
+
+class Configuration(object):
+ """
+ TDriver drop generation configuration.
+ """
+
+ def __init__(self, opts):
+ """
+ Initialize from optparse configuration options.
+ """
+
+ self._opts = opts
+ # Customize some attributes from how optparse leaves them.
+ self.build_drive = path(self._opts.build_drive)
+ self.file_store = path(self._opts.file_store)
+ self.flash_images = self.split_paths(self._opts.flash_images)
+ self.tdriver_sis_files = self.split_paths(self._opts.tdriver_sis_files)
+ self.test_assets = self.split_paths(self._opts.testasset_location)
+ self.template_loc = path(self._opts.template_loc)
+
+
+ def split_paths(self, arg, delim=","):
+ """
+ Split the string by delim, removing extra whitespace.
+ """
+ return [path(part.strip())
+ for part in arg.split(delim) if part.strip()]
+
+ def __getattr__(self, attr):
+ return getattr(self._opts, attr)
+
+ def __str__(self):
+ dump = "Configuration:\n"
+ seen = set()
+ for key, value in vars(self).items():
+ if not key.startswith("_"):
+ dump += "\t%s = %s\n" % (key, value)
+ seen.add(key)
+ for key, value in vars(self._opts).items():
+ if key not in seen:
+ dump += "\t%s = %s\n" % (key, value)
+ seen.add(key)
+ return dump
+
+
+class TDriverTestPlan(object):
+ """ Tells TDriver server what to test and how.
+
+ The TDriver test plan from which the test.xml file can be written. The test
+ plan requires TestAsset(s) to perform the tests
+ """
+
+ def __init__(self, config):
+ self.pkg_parser = ats3.parsers.PkgFileParser()
+ self.file_store = config.file_store
+ self.tdriver_timeout = config.tdriver_timeout
+ self.test_profiles = config.test_profiles
+ self.tdrunner_enabled = to_bool(config.tdrunner_enabled)
+ self.tdrunner_parameters = config.tdrunner_parameters
+ self.test_profiles = config.test_profiles.strip().split(",")
+ self.build_drive = config.build_drive
+ self.tdriver_sis_files = ""
+ self.install_files = []
+ self.tdriver_task_files = None
+ self.ctc_enabled = 'False'
+ if hasattr(config, 'ctc_enabled'):
+ self.ctc_enabled = to_bool(config.ctc_enabled)
+
+ def insert_execution_block(self, block_count=1, image_files=None, tdriver_sis_files=None, test_asset_path=None, tdriver_parameters=None):
+ """
+ Insert TDriver tasks and test data files into execution block
+ """
+ self.tdriver_sis_files = tdriver_sis_files
+ temp_sis_files = []
+ if self.tdriver_sis_files != None:
+ for sis_file in self.tdriver_sis_files:
+ temp_sis_files.append(sis_file.split("#"))
+
+ if image_files is None:
+ image_files = []
+
+ exe_dict = dict(name="exe%d" % block_count, asset_path=test_asset_path, image_files=image_files, tdriver_sis_files=temp_sis_files, ctc_enabled=self.ctc_enabled)
+ exe_dict = dict(exe_dict, test_timeout=self.tdriver_timeout)
+ exe_dict = dict(exe_dict, tdriver_parameters=tdriver_parameters)
+ exe_dict = dict(exe_dict, tdrunner_enabled=self.tdrunner_enabled.lower())
+ exe_dict = dict(exe_dict, tdrunner_parameters=self.tdrunner_parameters)
+
+
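+ #Collect the TDriver task files (.sip profiles or .rb scripts) and the data files to install from the test asset.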
+ self.tdriver_task_files = self._create_tdriver_task_files_list(test_asset_path)
+ exe_dict = dict(exe_dict, tdriver_task_files=self.tdriver_task_files)
+
+ self.install_files = self.create_install_files_list(test_asset_path)
+ exe_dict = dict(exe_dict, install_files=self.install_files)
+ return exe_dict
+
+ def _create_tdriver_task_files_list(self, asset_path=None):
+ """
+ Creates the list of files to include in the TDriver execution tasks.
+ If TDrunner is enabled, the test profiles (.sip files) are included;
+ otherwise all Ruby (.rb) files are included.
+ """
+
+ profiles = []
+ rb_files = []
+
+ #If TDrunner engine is enabled (set to True)
+ if self.tdrunner_enabled.lower() == "true":
+ profile_path = path(os.path.join(asset_path, "profile"))
+ if os.path.exists(profile_path):
+ for profile_name in self.test_profiles:
+ item = list(profile_path.walkfiles("%s.sip"%profile_name.lower().strip()))
+ if len(item) > 0:
+ #profiles.append(os.path.join(profile_path, item[0]))
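+ #Store the profile as a relative path of the form "assetdirname/profile/profilename.sip".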
+ profiles.append(asset_path.rsplit(os.sep, 1)[1] + "/" + "profile" + "/" + item[0].rsplit(os.sep, 1)[1])
+ return profiles
+ else:
+ _logger.warning(profile_path + ' not found')
+ else: #If TDrunner engine is not enabled (set to False)
+ if os.path.exists(asset_path):
+ #Collect all Ruby (.rb) files found under the test asset.
+ for item in list(asset_path.walkfiles("*.rb")):
+ rb_files.append(asset_path.rsplit(os.sep, 1)[1] + "/" + item.rsplit(os.sep, 1)[1])
+ # Sorting the result, so we ensure they are always in similar order.
+ rb_files.sort()
+ return rb_files
+ else:
+ _logger.warning(asset_path + ' not found')
+
+ def create_install_files_list(self, asset_path=None):
+ """
+ Collects all the .pkg files, extracts their data and
+ creates a list of (src, dst) files.
+ """
+ pkg_files = []
+ if os.path.exists(asset_path):
+ pkg_files = list(asset_path.walkfiles("*.pkg"))
+ return self.pkg_parser.get_data_files(pkg_files, self.build_drive)
+ else:
+ return None
+
+ def __getitem__(self, key):
+ return self.__dict__[key]
+
+
+
+class TDriverTestDropGenerator(object):
+ """
+ Generate test drop zip file for TDriver.
+
+ Generates drop zip files from Test Assets. The main
+ responsibility of this class is to create testdrop and test.xml
+ file and build a zip file for the TDriver drop.
+
+ """
+
+ def __init__(self):
+ self.drop_path_root = path("TDriverDrop")
+ self.drop_path = None
+ self.defaults = {}
+ self.CTC_LOG_DIR = r"c:\data\ctc"
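+ #Default directory for CTC coverage data.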
+
+ def generate(self, xml_dict, output_file, template_loc=None):
+ """Generate a test drop file."""
+ xml = self.generate_xml(xml_dict, template_loc)
+ return self.generate_drop(xml_dict, xml, output_file)
+
+ def generate_drop(self, xml_dict, xml, output_file):
+ """Generate test drop zip file."""
+
+ zfile = zipfile.ZipFile(output_file, "w", zipfile.ZIP_DEFLATED)
+ try:
+ for drop_file, src_file in self.drop_files(xml_dict):
+
+ _logger.info(" + Adding: %s" % src_file.strip())
+ try:
+ zfile.write(src_file.strip(), drop_file.encode('utf-8'))
+ except OSError, expr:
+ _logger.error(expr)
+ doc = amara.parse(et.tostring(xml.getroot()))
+ _logger.debug("XML output: %s" % doc.xml(indent=u"yes", encoding="ISO-8859-1"))
+ zfile.writestr("test.xml", doc.xml(indent="yes", encoding="ISO-8859-1"))
+ finally:
+ _logger.info("TDriver testdrop created successfully!")
+ zfile.close()
+
+ def generate_xml(self, xml_dict, template_loc):
+ """ generate an XML file"""
+ template_loc = path(template_loc).normpath()
+ loader = jinja2.ChoiceLoader([jinja2.PackageLoader(__package__, 'templates')])
+ env = jinja2.Environment(loader=loader)
+ if template_loc is None or not ".xml" in template_loc.lower():
+ template = env.from_string(pkg_resources.resource_string(__name__, 'tdriver_template.xml'))# pylint: disable=E1101
+ else:
+ template = env.from_string(open(template_loc).read())# pylint: disable=E1101
+
+ xmltext = template.render(xml_dict=xml_dict, test_plan=xml_dict, os=os, atspath=atspath, atsself=self).encode('ISO-8859-1')
+ _logger.info(xmltext)
+ return et.ElementTree(et.XML(xmltext))
+
+
+ def generate_testasset_zip(self, xml_dict, output_file=None):
+ """Generate TestAsset.zip for the TDriver server"""
+ filename = xml_dict["temp_directory"].joinpath(r"TestAsset.zip")
+
+ if output_file != None:
+ filename = output_file
+
+ for exe_block in xml_dict["execution_blocks"]:
+ testasset_location = path(exe_block["asset_path"])
+
+ zfile = zipfile.ZipFile(filename, "w", zipfile.ZIP_DEFLATED)
+ try:
+ for file_ in list(testasset_location.walkfiles()):
+ file_mod = file_.replace(testasset_location, "")
+ zfile.write(file_, file_mod.encode('utf-8'))
+ finally:
+ zfile.close()
+ return filename
+
+ def drop_files(self, xml_dict):
+ """Yield a list of drop files."""
+
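+ #drop_set is used so that each file is added to the drop only once.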
+ drop_set = set()
+ drop_files = []
+
+ #Adding test asset, there's an execution block for every test asset
+ for execution_block in xml_dict["execution_blocks"]:
+ testasset_location = path(execution_block["asset_path"])
+ asset_files = list(testasset_location.walkfiles())
+
+ drop_path = path(execution_block["name"])
+
+ drop_files = ((drop_path.parent, "images", execution_block["image_files"]),
+ (drop_path.parent, "sisfiles", execution_block["tdriver_sis_files"]),
+ (drop_path.parent, "tdriverparameters", execution_block["tdriver_parameters"]),
+ (drop_path.parent, execution_block["name"], asset_files))
+
+ for drop_dir, sub_dir, files in drop_files:
+ for file_path in files:
+ if file_path != None:
+
+ #Adding image files to the top level,
+ #Also adding tdriverparameters.xml file
+ if sub_dir.lower() == "images" or sub_dir.lower() == "tdriverparameters":
+ drop_file = drop_dir.joinpath(sub_dir, file_path.name)
+
+ #Adding sisfiles, installation of tdriver sisfiles is a bit different
+ #than normal sisfiles
+ elif sub_dir.lower() == "sisfiles":
+ drop_file = drop_dir.joinpath(sub_dir, path(file_path[0]).name)
+ file_path = path(file_path[0])
+
+ #Adding test asset files
+ else:
+ temp_file = file_path.rsplit(os.sep, 1)[0]
+ replace_string = testasset_location.rsplit(os.sep, 1)[0]
+ drop_file = drop_dir.joinpath(sub_dir + "\\" + temp_file.replace(replace_string, ""), file_path.name)
+
+ drop_file = drop_file.normpath()
+ if drop_file not in drop_set:
+ drop_set.add(drop_file)
+ yield (drop_file, file_path.normpath())
+
+
+class TDriverComponentParser(object):
+ """
+ Add information to the XML dictionary
+ """
+ def __init__(self, config):
+ self.flash_images = [path(p) for p in config.flash_images]
+ self.tdriver_parameters = [path(config.tdriver_parameters).normpath()]
+ self.tdriver_sis_files = config.tdriver_sis_files
+ self.build_drive = config.build_drive
+ self.test_timeout = config.tdriver_timeout
+ self.diamonds_build_url = config.diamonds_build_url
+ self.testrun_name = config.testrun_name
+ self.alias_name = config.alias_name
+ self.device_type = config.device_type
+ self.report_email = config.report_email
+ self.email_format = config.email_format
+ self.email_subject = config.email_subject
+ self.file_store = config.file_store
+
+ self.xml_dict = {}
+
+
+ def insert_pre_data(self):
+ """
+ Creates a dictionary for the data before
+ the block starts.
+ """
+ self.xml_dict = dict(self.xml_dict, temp_directory=path(tempfile.mkdtemp()))
+ self.xml_dict = dict(self.xml_dict, diamonds_build_url=self.diamonds_build_url)
+ self.xml_dict = dict(self.xml_dict, testrun_name=self.testrun_name)
+ self.xml_dict = dict(self.xml_dict, alias_name=self.alias_name)
+ self.xml_dict = dict(self.xml_dict, device_type=self.device_type)
+
+ def create_execution_block(self, config):
+ """Parse flash images and creates execution block for TDriver"""
+ execution_block_list = []
+ block_count = 0
+ for test_asset in config.test_assets:
+ if os.path.exists(test_asset):
+ test_plan = TDriverTestPlan(config)
+ block_count += 1
+ execution_block_list.append(test_plan.insert_execution_block(block_count, self.flash_images, self.tdriver_sis_files, test_asset, self.tdriver_parameters))
+
+
+ self.xml_dict = dict(self.xml_dict, execution_blocks=execution_block_list)
+
+ def insert_post_data(self):
+ """
+ Creates a dictionary for the data after
+ the block ends, i.e. the post-action data.
+ """
+ self.xml_dict = dict(self.xml_dict, report_email=self.report_email)
+ self.xml_dict = dict(self.xml_dict, email_format=self.email_format)
+ self.xml_dict = dict(self.xml_dict, email_subject=self.email_subject)
+ self.xml_dict = dict(self.xml_dict, report_location=self.file_store)
+
+ return self.xml_dict
+
+def create_drop(config):
+ """Create a test drop."""
+ xml_dict = {}
+
+ _logger.debug("initialize TDriver dictionary")
+ drop_parser = TDriverComponentParser(config)
+
+ #Inserting data for the test run that is global throughout the dictionary
+ drop_parser.insert_pre_data()
+
+ #for every asset path there should be a
+ #separate execution block
+ drop_parser.create_execution_block(config)
+
+ #Inserting reporting and email data (post actions)
+ xml_dict = drop_parser.insert_post_data()
+
+ generator = TDriverTestDropGenerator()
+
+ _logger.info("generating drop file: %s" % config.drop_file)
+ generator.generate(xml_dict, output_file=config.drop_file, template_loc=config.template_loc)
+
+def to_bool(param):
+ """setting a true or false based on a param value"""
+ param = str(param).lower()
+ if "true" == param or "t" == param or "1" == param:
+ return "True"
+ else:
+ return "False"
+
+def main():
+ """Main entry point."""
+
+
+ cli = OptionParser(usage="%prog [options] PATH1 [PATH2 [PATH3 ...]]")
+ cli.add_option("--ats4-enabled", help="ATS4 enabled", default="True")
+ cli.add_option("--build-drive", help="Build area root drive")
+ cli.add_option("--drop-file", help="Name for the final drop zip file", default="TDriverDrop.zip")
+
+ cli.add_option("--minimum-flash-images", help="Minimum amount of flash images", default=2)
+ cli.add_option("--flash-images", help="Paths to the flash image files", default="")
+ cli.add_option("--tdriver-sis-files", help="Sis files location", default="")
+
+ cli.add_option("--testasset-location", help="TDriver test assets location", default="")
+ cli.add_option("--template-loc", help="Custom template location", default="")
+ cli.add_option("--tdrunner-enabled", help="Enabled or disabled TDrunner", default=True)
+ cli.add_option("--test-profiles", help="Test profiles e.g. bat, fute", default="")
+ cli.add_option("--tdriver-parameters", help="Location of xml file contains additional parameters for TDriver", default="")
+
+ cli.add_option("--tdriver-timeout", help="Test execution timeout value (default: %default)", default="60")
+ cli.add_option("--tdrunner-parameters", help="Additional TDrunner parameters for TDriver task", default="")
+ cli.add_option("--file-store", help="Destination path for reports.", default="")
+ cli.add_option("--report-email", help="Email notification receivers", default="")
+ cli.add_option("--testrun-name", help="Name of the test run", default="run")
+ cli.add_option("--alias-name", help="Name of the alias", default="sut_s60")
+ cli.add_option("--device-type", help="Device type (e.g. 'PRODUCT')", default="unknown")
+ cli.add_option("--diamonds-build-url", help="Diamonds build url")
+ cli.add_option("--email-format", help="Format of an email", default="")
+ cli.add_option("--email-subject", help="Subject of an email", default="TDriver Testing")
+ cli.add_option("--ctc-enabled", help="CTC enabled", default="False")
+
+ cli.add_option("--verbose", help="Increase output verbosity", action="store_true", default=False)
+
+ opts, _ = cli.parse_args()
+
+ ats4_enabled = to_bool(opts.ats4_enabled)
+
+ if ats4_enabled == "False":
+ cli.error("TDriver tests execute on ATS4. Set property 'ats4.enabled'")
+
+ if not opts.flash_images:
+ cli.error("no flash image files given")
+ if len(opts.flash_images.split(",")) < int(opts.minimum_flash_images):
+ cli.error("Not enough flash files: %i defined, %i needed" % (len(opts.flash_images.split(",")), int(opts.minimum_flash_images) ))
+
+ if opts.verbose:
+ _logger.setLevel(logging.DEBUG)
+ logging.basicConfig(level=logging.DEBUG)
+ config = Configuration(opts)
+ create_drop(config)
+
+if __name__ == "__main__":
+ main()
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/tdriver_template.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/lib/ats3/tdriver_template.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,184 @@
+
+
+
+{% import 'ats4_macros.xml' as macros with context %}
+
+
+
+ {% if xml_dict['diamonds_build_url'] -%}
+ {{ xml_dict['diamonds_build_url'] }}
+ Smoke
+ {% endif %}
+ {{ xml_dict['testrun_name'] }}
+
+
+
+
+
+
+
+
+
+ {% for exe_block in xml_dict['execution_blocks'] -%}
+
+
+
+ {% if exe_block['image_files'] -%}
+
+ FlashTask
+
+ {% set i = 1 %}
+ {% for img in exe_block['image_files'] -%}
+
+ {% set i = i + 1 %}
+ {% endfor -%}
+
+
+ {% endif %}
+
+ {% if exe_block['install_files'] != [] -%}
+ {% for file in exe_block['install_files'] -%}
+
+ FileUploadTask
+
+
+
+
+
+ {% endfor -%}
+ {% endif %}
+
+ {% if exe_block['tdriver_sis_files'] != [] -%}
+ {% for sisfile in exe_block['tdriver_sis_files'] -%}
+
+ FileUploadTask
+
+
+
+
+
+ {% endfor -%}
+ {% endif %}
+
+ {% for sis_file in exe_block["tdriver_sis_files"] -%}
+
+ InstallSisTask
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ {%- endfor -%}
+
+
+ RebootTask
+
+
+
+ CreateDirTask
+
+
+
+
+
+ {% if exe_block["ctc_enabled"] == "True" -%}
+ {{ macros.ctc_initialization(exe_block) }}
+ {%- endif %}
+
+
+ {% if exe_block["tdriver_task_files"] -%}
+ {% for task_file in exe_block["tdriver_task_files"] -%}
+
+ TestabilityTask
+
+
+
+
+
+
+
+
+ {% endfor -%}
+ {% endif %}
+
+
+ {% if exe_block["ctc_enabled"] == "True" -%}
+ {{ macros.ctc_finalization(exe_block) }}
+ {%- endif %}
+
+
+ CleanupTask
+
+
+
+
+
+
+ {% endfor -%}
+
+
+
+ EmailAction
+
+
+
+
+
+
+ {% if xml_dict['report_location'] -%}
+
+ FileStoreAction
+
+
+
+
+
+ {% endif %}
+ {% if xml_dict['diamonds_build_url'] -%}
+
+ DiamondsAction
+ {% if xml_dict['execution_blocks'] != [] and xml_dict['execution_blocks'][0]["ctc_enabled"] == "True" -%}
+
+
+
+ {%- endif %}
+
+ {%- endif %}
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ats3/templates/ats4_macros.xml
--- a/buildframework/helium/sf/python/pythoncore/lib/ats3/templates/ats4_macros.xml Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/ats3/templates/ats4_macros.xml Mon Oct 11 11:16:47 2010 +0100
@@ -19,7 +19,7 @@
============================================================================
-->
-{% macro ctc_initialization() -%}
+{% macro ctc_initialization(test_plan) -%}
CreateDirTask
@@ -35,7 +35,7 @@
{%- endmacro %}
-{% macro ctc_finalization(setd) -%}
+{% macro ctc_finalization(test_plan) -%}
NonTestExecuteTask
@@ -68,6 +68,7 @@
{% macro generate_runsteps_stif(setd) -%}
{% set ini_file = atsself.stif_init_file(setd['src_dst']) %}
{% if ini_file -%}
+ {% set ini_file_module_name = atsself.stifmodulename(ini_file[0]) %}
{% if test_plan['hti'] == 'True' -%}
StifRunCasesTask
@@ -90,7 +91,8 @@
{%- endif %}
- {% else -%}
+ {%- endif %}
+
{% for file in setd['src_dst'] -%}
{% if setd["test_harness"] == "STIF" or setd["test_harness"] == "STIFUNIT" -%}
{% if file[2] == "conf" and ".dll" not in file[1].lower() -%}
@@ -98,10 +100,17 @@
StifRunCasesTask
+ {% if ini_file_module_name.upper() == 'TEFTESTMODULE' -%}
+
+ {% else -%}
+ {%- endif %}
+ {% if ini_file_module_name.upper() == 'TEFTESTMODULE' and test_plan['ats_stf_enabled'].lower() == "true" -%}
+
+ {%- endif %}
{% else -%}
@@ -110,8 +119,13 @@
+ {% if ini_file_module_name.upper() == 'TEFTESTMODULE' -%}
+
+
+ {% else -%}
+ {%- endif %}
@@ -123,7 +137,7 @@
StifRunCasesTask
-
+
@@ -144,11 +158,13 @@
{%- endif %}
{%- endif %}
{%- endfor %}
- {%- endif %}
{%- endmacro %}
{% macro generate_runsteps_stif_single_set(setd) -%}
{% if setd["engine_ini_file"] != None -%}
+
+ {% set ini_file_module_name = atsself.stifmodulename(setd["engine_ini_file"]) %}
+
{% if test_plan['hti'] == 'True' -%}
StifRunCasesTask
@@ -171,16 +187,24 @@
{%- endif %}
- {% elif setd["config_files"] != [] -%}
+ {%- endif %}
+ {% if setd["config_files"] != [] -%}
{% for config_file in setd["config_files"] -%}
{% if test_plan['hti'] == 'True' -%}
StifRunCasesTask
+ {% if ini_file_module_name.upper() == 'TEFTESTMODULE' -%}
+
+ {% else -%}
+ {%- endif %}
+ {% if ini_file_module_name.upper() == 'TEFTESTMODULE' and test_plan['ats_stf_enabled'].lower() == "true" -%}
+
+ {%- endif %}
{% else -%}
@@ -223,4 +247,4 @@
{%- endif %}
{%- endfor %}
{%- endif %}
-{%- endmacro %}
\ No newline at end of file
+{%- endmacro %}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/atsant.py
--- a/buildframework/helium/sf/python/pythoncore/lib/atsant.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/atsant.py Mon Oct 11 11:16:47 2010 +0100
@@ -56,7 +56,7 @@
return None
if noncust:
return noncust
- raise Exception('iconfig not found in ' + self.imagesdir)
+ raise IOError('iconfig not found in ' + self.imagesdir)
def getimage(self, name):
"""get image"""
@@ -64,7 +64,7 @@
for fname in files:
if fname.lower() == name.lower():
return os.path.join(root, fname)
- raise Exception(name + ' not found in ' + self.imagesdir)
+ raise IOError(name + ' not found in ' + self.imagesdir)
def findimages(self):
"""find images"""
@@ -82,10 +82,10 @@
if os.path.exists(image):
output = output + image + ','
else:
- raise Exception(image + ' not found')
+ raise IOError(image + ' not found')
else:
if imagetype == 'core':
- raise Exception(imagetypename + '_FLASH not found in iconfig.xml in ' + self.imagesdir)
+ raise IOError(imagetypename + '_FLASH not found in iconfig.xml in ' + self.imagesdir)
print imagetypename + '_FLASH not found in iconfig.xml'
return output
@@ -120,7 +120,10 @@
for unit in component.unit:
if group not in modules:
modules[group] = []
- modules[group].append(builddrive + os.sep + unit.bldFile)
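+ # On Windows prepend the build drive to the bldFile path; on other platforms use the path as-is.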
+ if os.sep == '\\':
+ modules[group].append(builddrive + os.sep + unit.bldFile)
+ else:
+ modules[group].append(unit.bldFile)
else:
sdf = sysdef.api.SystemDefinition(canonicalsysdeffile)
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/build/model.py
--- a/buildframework/helium/sf/python/pythoncore/lib/build/model.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/build/model.py Mon Oct 11 11:16:47 2010 +0100
@@ -199,13 +199,7 @@
""" Initialisation. """
self._ccm_project = ccm_project
self._baselines = {}
- #TODO : could querying release attribute return the ccm object? Or add a release attribute to Project
- # class
- release = self._ccm_project['release']
- _logger.debug("Project release: '%s'" % release)
- self._ccm_release = None
- if release != '':
- self._ccm_project.session.create(release)
+ _logger.debug("Project release: '%s'" % self._ccm_project.release)
# capturing the frozen baseline.
_logger.debug('Capture baselines')
@@ -309,8 +303,8 @@
def _getsupplier(self):
"""get supplier"""
- if self._ccm_release != None:
- component = self._ccm_release.component
+ if self._ccm_project.release != None:
+ component = self._ccm_project.release.component
comparisons = {'MC': '^mc',
'S60': 'S60',
'SPP/NCP': '^spp_config|spp_psw|spp_tools|ncp_sw$',
@@ -458,12 +452,12 @@
for name in folder.name:
folder_name = unicode(name)
_logger.debug('folder_name: %s' % folder_name)
- if not old_folders.has_key(unicode(folder_name)):
- old_folders[unicode(folder_name)] = {}
- if hasattr(name, 'xml_attributes'):
- for attr_name, _ in sorted(name.xml_attributes.iteritems()):
- _logger.debug('attr_name: %s' % attr_name)
- old_folders[unicode(folder_name)][unicode(attr_name)] = unicode(getattr(name, attr_name))
+ if not old_folders.has_key(unicode(folder_name)):
+ old_folders[unicode(folder_name)] = {}
+ if hasattr(name, 'xml_attributes'):
+ for attr_name, _ in sorted(name.xml_attributes.iteritems()):
+ _logger.debug('attr_name: %s' % attr_name)
+ old_folders[unicode(folder_name)][unicode(attr_name)] = unicode(getattr(name, attr_name))
for task in recursive_node_scan(bom_log.bom.content, u'task'):
_logger.debug('task: %s' % task)
_logger.debug('task: %s' % task.id)
@@ -629,6 +623,14 @@
fix_node = doc.xml_create_element(u'fix', content=(unicode(task)), attributes = {u'type': unicode(fix.__class__.__name__)})
project_node.xml_append(fix_node)
+ self.write_icd_icfs(doc)
+ self.write_release_info(doc)
+
+ out = open(path, 'w')
+ doc.xml(out, indent='yes')
+ out.close()
+
+ def write_icd_icfs(self, doc):
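+ """ Append the ICD and ICF information to the BOM document. """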
if self._bom.icd_icfs != []:
# Add ICD info to BOM
doc.bom.content.xml_append(doc.xml_create_element(u'input'))
@@ -642,12 +644,13 @@
doc.bom.content.input.xml_append(doc.xml_create_element(u'version', content=(unicode(empty_bom_str))))
doc.bom.content.input.xml_append(doc.xml_create_element(u'icds'))
-
- # pylint: disable=R0914
- for i, icd in enumerate(self._bom.icd_icfs):
- doc.bom.content.input.icds.xml_append(doc.xml_create_element(u'icd'))
- doc.bom.content.input.icds.icd[i].xml_append(doc.xml_create_element(u'name', content=(unicode(icd))))
- #If currentRelease.xml exists then send s60 tag to diamonds
+
+ for i, icd in enumerate(self._bom.icd_icfs):
+ doc.bom.content.input.icds.xml_append(doc.xml_create_element(u'icd'))
+ doc.bom.content.input.icds.icd[i].xml_append(doc.xml_create_element(u'name', content=(unicode(icd))))
+
+ def write_release_info(self, doc):
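+ """ Append the release information (S60 data) to the BOM document. """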
+ # If currentRelease.xml exists then send s60 tag to diamonds
current_release_xml_path = self._bom.config['currentRelease.xml']
# data from the metadata will go first as they must be safer than the one
# given by the user
@@ -730,12 +733,6 @@
s60_input_source.xml_append(doc.xml_create_element(u'type', content=(unicode("unknown"))))
s60_input_node.xml_append(s60_input_source)
doc.bom.content.xml_append(s60_input_node)
-
-
- out = open(path, 'w')
- doc.xml(out, indent='yes')
- out.close()
-
def parse_status_log(self, log):
"""parse status log"""
_log_array = log.split('\r')
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ccm/__init__.py
--- a/buildframework/helium/sf/python/pythoncore/lib/ccm/__init__.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/ccm/__init__.py Mon Oct 11 11:16:47 2010 +0100
@@ -202,10 +202,11 @@
if mresult != None:
project = self._session.create(mresult.group(1))
self._output[project] = []
- mresult = re.match(r"^(.*)\s+(\w+#\d+)\s+(.+)$", line)
+ mresult = re.match(r"^(.*?)\s+(\w+#\d+(?:,\s+\w+#\d+)*)\s+(.+)$", line)
if mresult != None and project != None:
- self._output[project].append({'object': self._session.create(mresult.group(1)),
- 'task': self._session.create("Task %s" % mresult.group(2)),
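+ # The line may list several comma-separated tasks; create one entry per task.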
+ for task in mresult.group(2).split(','):
+ self._output[project].append({'object': self._session.create(mresult.group(1)),
+ 'task': self._session.create("Task %s" % task.strip()),
'comment': mresult.group(3)})
mresult = re.match(r"^(\w+#\d+)\s+(.+)$", line)
if mresult != None and project != None:
@@ -340,7 +341,6 @@
match_warning = re.compile(r"^Warning:(.*)")
match_failed = re.compile(r"(Update failed)")
- # TODO: cleanup the parsing to do that in a more efficient way.
for line in output.splitlines():
_logger.info(line)
res = match_object_update.match(line)
@@ -723,9 +723,6 @@
def __repr__(self):
return self.__str__()
-
- def __del__(self):
- self.close()
def purposes(self, role=None):
""" Returns available purposes. """
@@ -922,7 +919,7 @@
try:
for session in self._free_sessions:
- session.role = session._set_role(role)
+ session.role = role
finally:
self._lock_pool = False
self._pool_lock.notifyAll()
@@ -1349,11 +1346,13 @@
def _getrelease(self):
""" Get the release of the current object. Returns a Releasedef object. """
- self._release = Releasedef(self._session, self['release'])
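+ # Create the Releasedef object lazily, and only if a release is set.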
+ if self._release == None and (self['release'] != None and self['release'] != ''):
+ self._release = Releasedef(self._session, self['release'])
return self._release
def _setrelease(self, release):
""" Set the release of the current object. """
+ self._release = release
self['release'] = release['displayname']
def refresh(self):
@@ -1381,7 +1380,7 @@
if result.status != None and result.status != 0:
raise CCMException("Error setting basline of project '%s'\n%s" % (self.objectname, result.output))
- def set_update_method(self, name, recurse = False):
+ def set_update_method(self, name, recurse=False):
""" Set the update method for the project (and subproject if recurse is True). """
assert name != None, "name must not be None."
assert len(name) > 0, "name must not be an empty string."
@@ -1392,7 +1391,7 @@
if result.status != None and result.status != 0:
raise CCMException("Error setting reconfigure properties to %s for project '%s'\nStatus: %s\n%s" % (name, self.objectname, result.status, result.output))
- def apply_update_properties(self, baseline = True, tasks_and_folders = True, recurse=True):
+ def apply_update_properties(self, baseline=True, tasks_and_folders=True, recurse=True):
""" Apply update properties to subprojects. """
args = ""
if not baseline:
@@ -1423,7 +1422,7 @@
return result.output
raise CCMException("Error creation snapshot of %s,\n%s" % (self.objectname, result.output), result)
- def checkout(self, release, version=None, purpose=None, subprojects=True):
+ def checkout(self, release, version=None, purpose=None, subprojects=True, path=None):
""" Create a checkout of this project.
This will only checkout the project in Synergy. It does not create a work area.
@@ -1448,6 +1447,9 @@
self._session.role = get_role_for_purpose(self._session, purpose)
args += " -purpose \"%s\"" % purpose
+ if path:
+ args += " -path \"%s\"" % path
+
if subprojects:
args += " -subprojects"
result = self._session.execute("checkout -project \"%s\" -release \"%s\" -no_wa %s" \
@@ -1456,47 +1458,7 @@
self._session.role = role
if result.project == None:
raise CCMException("Error checking out project %s,\n%s" % (self.objectname, result.output), result)
- return result
-
- def create_release_tag(self, release, new_tag):
- """ creates new release tag """
- role = self._session.role
-
- if role is None:
- self._session.role = "developer"
- role = self._session.role
-
- args = "release -create %s -from %s -bl %s -active -allow_parallel_check_out" % (new_tag, release, release)
- self._session.role = "build_mgr"
-
- result = self._session.execute(" %s" \
- % (args), Result(self._session))
- self._session.role = role
-
- return result.output
-
- def delete_release_tag(self, release, new_tag):
- """ deletes new release tag """
-
- role = self._session.role
- if role is None:
- self._session.role = "developer"
- role = self._session.role
-
-
- self._session.role = "build_mgr"
-
- result = self._session.execute("pg -l -r %s -u" \
- % (new_tag), Result(self._session))
- result = self._session.execute("pg -d \"%s\" -m" \
- % (result.output), Result(self._session))
- result = self._session.execute("release -d %s -force" \
- % (new_tag), Result(self._session))
-
- self._session.role = role
-
- return result.output
-
+ return result
def work_area(self, maintain, recursive=None, relative=None, path=None, pst=None, wat=False):
""" Configure the work area. This allow to enable it or disable it, set the path, recursion... """
@@ -1629,6 +1591,44 @@
return self.name
component = property(_getcomponent)
+
+ def create_tag(self, new_tag):
+ """ creates new release tag """
+ role = self._session.role
+
+ if role is None:
+ self._session.role = "developer"
+ role = self._session.role
+
+ args = "release -create %s -from %s -bl %s -active -allow_parallel_check_out" % (new_tag, self.objectname, self.objectname)
+ self._session.role = "build_mgr"
+
+ result = self._session.execute(" %s" \
+ % (args), Result(self._session))
+ self._session.role = role
+
+ return result.output
+
+ def delete_tag(self, new_tag):
+ """ deletes new release tag """
+
+ role = self._session.role
+ if role is None:
+ self._session.role = "developer"
+ role = self._session.role
+
+ self._session.role = "build_mgr"
+
+ result = self._session.execute("pg -l -r %s -u" \
+ % (new_tag), Result(self._session))
+ result = self._session.execute("pg -d \"%s\" -m" \
+ % (result.output), Result(self._session))
+ result = self._session.execute("release -d \"%s\" -force" \
+ % (new_tag), Result(self._session))
+
+ self._session.role = role
+
+ return result.output
class Folder(CCMObject):
@@ -1751,7 +1751,6 @@
objects = property(_getobjects)
def __unicode__(self):
- # TODO: use optimised query that makes only 1 ccm query with suitable format
if self.__unicode_str_text == None:
self.__unicode_str_text = u'%s: %s' % (self['displayname'], self['task_synopsis'])
return self.__unicode_str_text
@@ -1771,6 +1770,7 @@
release = property(get_release_tag, set_release_tag)
+
class UpdateTemplate:
""" Allow to access Update Template property using Release and Purpose. """
def __init__(self, releasedef, purpose):
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/ccm/extra.py
--- a/buildframework/helium/sf/python/pythoncore/lib/ccm/extra.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/ccm/extra.py Mon Oct 11 11:16:47 2010 +0100
@@ -105,7 +105,7 @@
_logger.error( "Exception occurred in request #%s: %s" % (request.requestID, exc_info[1]))
exceptions.append(exc_info[1])
- def handle_result(result):
+ def handle_result(_, result):
""" append the result"""
results.append(result)
@@ -133,7 +133,7 @@
_logger.error( "Exception occured in request #%s: %s\n%s" % (request.requestID, exc_info[1], traceback.format_exception(exc_info[0], exc_info[1], exc_info[2])))
exceptions.append(exc_info[1])
- def handle_result(request, result):
+ def handle_result(_, result):
"""append the result"""
results.append(result)
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/configuration.py
--- a/buildframework/helium/sf/python/pythoncore/lib/configuration.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/configuration.py Mon Oct 11 11:16:47 2010 +0100
@@ -558,10 +558,7 @@
A ConfigurationSet represents a number of Configuration objects
that all may need to be processed.
"""
- try:
- dom = xml.dom.minidom.parse(self.inputfile)
- except Exception, exc:
- raise Exception("XML file '%s' cannot be parsed properly: %s" % (self.inputfile, exc))
+ dom = xml.dom.minidom.parse(self.inputfile)
# The root element is typically but can be anything
self.rootNode = dom.documentElement
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/convertpkg.py
--- a/buildframework/helium/sf/python/pythoncore/lib/convertpkg.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/convertpkg.py Mon Oct 11 11:16:47 2010 +0100
@@ -64,6 +64,8 @@
submmpfile.write('//rtest\n')
elif testtype == 'stif':
submmpfile.write('LIBRARY stiftestinterface.lib\n')
+ elif testtype == 'sut':
+ submmpfile.write('LIBRARY symbianunittestfw.lib\n')
else:
raise Exception('Test type unknown: ' + testtype)
submmpfile.close()
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/dependancygraph.py
--- a/buildframework/helium/sf/python/pythoncore/lib/dependancygraph.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/dependancygraph.py Mon Oct 11 11:16:47 2010 +0100
@@ -139,17 +139,18 @@
if os.path.isfile(filename) and fname.endswith('.egg'):
eggfile = zipfile.ZipFile(filename, 'r', zipfile.ZIP_DEFLATED)
-
- data = eggfile.read('EGG-INFO/PKG-INFO')
-
- library = readPkgInfo(data.split('\n'))
-
- if 'EGG-INFO/requires.txt' in eggfile.namelist():
- requiresdata = eggfile.read('EGG-INFO/requires.txt')
- readRequiresFile(requiresdata.split('\n'), library)
+ if 'EGG-INFO/PKG-INFO' in eggfile.namelist():
+ data = eggfile.read('EGG-INFO/PKG-INFO')
+
+ library = readPkgInfo(data.split('\n'))
- libraries.addLibrary(notinsubcon, library)
-
+ if 'EGG-INFO/requires.txt' in eggfile.namelist():
+ requiresdata = eggfile.read('EGG-INFO/requires.txt')
+ readRequiresFile(requiresdata.split('\n'), library)
+
+ libraries.addLibrary(notinsubcon, library)
+ else:
+ print 'EGG-INFO/PKG-INFO not in ' + filename
eggfile.close()
def readRequiresFile(data, library):
@@ -326,8 +327,9 @@
if macro:
output.append("\"%s\" [fontname=\"Times-Italic\"];" % str(targ.name))
output.append('subgraph \"cluster%s\" {label = \"%s\"; \"%s\"}\n' % (str(proj.name), str(proj.name), str(targ.name)))
- splt = str(signal).split(',')
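+ # Split on '(' so the mode ("now" or "defer") can be read between the parentheses.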
+ splt = str(signal).split('(')
if len(splt) > 1:
+ splt[1] = splt[1].replace(')', '')
if splt[1] == 'now':
color = 'red'
elif splt[1] == 'defer':
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/fileutils.py
--- a/buildframework/helium/sf/python/pythoncore/lib/fileutils.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/fileutils.py Mon Oct 11 11:16:47 2010 +0100
@@ -310,7 +310,7 @@
except OSError:
if os.path.isdir(src):
if destinsrc(src, dst):
- raise Exception, "Cannot move a directory '%s' into itself '%s'." % (src, dst)
+ raise OSError, "Cannot move a directory '%s' into itself '%s'." % (src, dst)
shutil.copytree(src, dst, symlinks=True)
rmtree(src)
else:
@@ -376,10 +376,15 @@
except os.error:
continue
# Check if the path is a regular file
- if stat.S_ISREG(status[stat.ST_MODE]):
- mode = stat.S_IMODE(status[stat.ST_MODE])
- if mode & 0111:
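+ # On Windows it is enough that a matching regular file exists on the PATH.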
+ if os.sep == '\\':
+ if os.path.isfile(filename):
return os.path.normpath(filename)
+ else:
+ # On Unix also check the executable rights
+ if stat.S_ISREG(status[stat.ST_MODE]):
+ mode = stat.S_IMODE(status[stat.ST_MODE])
+ if mode & 0111:
+ return os.path.normpath(filename)
return None
@@ -407,8 +412,8 @@
def load_policy_content(filename):
""" Testing policy content loading. """
data = ''
+ fileh = codecs.open(filename, 'r', 'ascii')
try:
- fileh = codecs.open(filename, 'r', 'ascii')
data = fileh.read()
except ValueError:
raise IOError("Error loading '%s' as an ASCII file." % filename)
@@ -607,7 +612,7 @@
if drive_type == win32con.DRIVE_REMOTE:
win32wnet.WNetCancelConnection2(drive, win32netcon.CONNECT_UPDATE_PROFILE, 1)
else:
- raise Exception("%s couldn't be umount." % drive)
+ raise OSError("%s couldn't be umount." % drive)
else:
def rmdir(path):
@@ -653,14 +658,14 @@
p_subst = subprocess.Popen("subst %s %s" % (drive, path), shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
errmsg = p_subst.communicate()[0]
if p_subst.returncode != 0:
- raise Exception("Error substing '%s' under '%s': %s" % (path, drive, errmsg))
+ raise OSError("Error substing '%s' under '%s': %s" % (path, drive, errmsg))
def unsubst(drive):
""" Unsubsting the drive. """
p_subst = subprocess.Popen("subst /D %s" % (drive), shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
errmsg = p_subst.communicate()[0]
if p_subst.returncode != 0:
- raise Exception("Error unsubsting '%s': %s" % (drive, errmsg))
+ raise OSError("Error unsubsting '%s': %s" % (drive, errmsg))
def getSubstedDrives():
"""get substituted drive"""
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/fileutils.py.orig
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/lib/fileutils.py.orig Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,698 @@
+#============================================================================
+#Name : fileutils.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+"""
+File manipulation related functionalities:
+ * Filescanner
+ * rmtree (fixed version)
+ * move (fixed version)
+"""
+import codecs
+import locale
+import logging
+import os
+import re
+import sys
+import shutil
+import hashlib
+import subprocess
+import string
+
+import pathaddition.match
+import stat
+
+if os.name == 'nt':
+ import win32api
+
+LOGGER = logging.getLogger('fileutils')
+LOGGER_LOCK = logging.getLogger('fileutils.lock')
+#LOGGER.addHandler(logging.FileHandler('default.log'))
+#logging.basicConfig(level=logging.DEBUG)
+#LOGGER.setLevel(logging.DEBUG)
+
+class AbstractScanner(object):
+ """ This class implements all the required infrastructure for filescanning. """
+
+ def __init__(self):
+ """ Initialization. """
+ self.includes = []
+ self.excludes = []
+ self.includes_files = []
+ self.excludes_files = []
+ self.selectors = []
+ self.filetypes = []
+
+ def add_include(self, include):
+ """ Adds an include path to the scanner. """
+ if include.endswith('/') or include.endswith('\\'):
+ include = include + '**'
+
+ self.includes.append(include)
+
+ def add_exclude(self, exclude):
+ """ Adds an exclude path to the scanner. """
+ if exclude.endswith('/') or exclude.endswith('\\'):
+ exclude = exclude + '**'
+
+ self.excludes.append(exclude)
+
+ def add_exclude_file(self, exclude):
+ """ Adds an exclude file to the scanner. """
+ self.excludes_files.append(exclude)
+
+ def add_selector(self, selector):
+ """ Add selector to the scanner. """
+ self.selectors.append(selector)
+
+ def add_filetype(self, filetype):
+ """ Adds a filetype selection to the scanner. """
+ self.filetypes.append(filetype)
+
+ def is_included(self, path):
+ """ Returns if path is included by the scanner. """
+ LOGGER.debug("is_included: path = " + path)
+ if path.replace('\\', '/') in self.includes_files or path in self.includes_files:
+ return True
+ for inc in self.includes:
+ if self.match(path, inc):
+ LOGGER.debug("Included: " + path + " by " + inc)
+ return True
+ return False
+
+ def is_excluded(self, path):
+ """ Returns if path is excluded by the scanner. """
+ LOGGER.debug("is_excluded: path = " + path)
+ if path.replace('\\', '/') in self.excludes_files or path in self.excludes_files:
+ return True
+ for ex in self.excludes:
+ if self.match(path, ex):
+ LOGGER.debug("Excluded: " + path + " by " + ex)
+ return True
+ return False
+
+ def is_selected(self, path):
+ """ Returns if path is selected by all selectors in the scanner. """
+ LOGGER.debug("is_selected: path = " + path)
+ for selector in self.selectors:
+ if not selector.is_selected(path):
+ return False
+ LOGGER.debug("Selected: " + path)
+ return True
+
+ def is_filetype(self, path):
+ """ Test if a file matches one filetype. """
+ if len(self.filetypes) == 0:
+ return True
+ LOGGER.debug("is_filetype: path = " + path)
+ for filetype in self.filetypes:
+ if self.match(path, filetype):
+ LOGGER.debug("Filetype: " + path + " by " + filetype)
+ return True
+ return False
+
+ def match(self, filename, pattern):
+ """ Is filename matching pattern? """
+ return pathaddition.match.ant_match(filename, pattern, casesensitive=(os.sep != '\\'))
+
+ def test_path(self, root, relpath):
+ """ Test if a path matches filetype, include, exclude, and selection process."""
+ return self.is_filetype(relpath) and self.is_included(relpath) \
+ and not self.is_excluded(relpath) and \
+ self.is_selected(os.path.join(root, relpath))
+
+ def __str__(self):
+ """ Returns a string representing this instance. """
+ content = []
+ for inc in self.includes:
+ content.append('include:' + os.path.normpath(inc))
+ for ex in self.excludes:
+ content.append('exclude:' + os.path.normpath(ex))
+ return ';'.join(content)
+
+ def __repr__(self):
+ """ Returns a string representing this instance. """
+ return self.__str__()
+
+ def scan(self):
+ """ Abstract method which much be overriden to implement the scanning process. """
+ raise Exception("scan method must be overriden")
+
+
+class FileScanner(AbstractScanner):
+ """Scans the filesystem for files that match the selection paths.
+
+ The scanner is configured with a root directory. Any number of include
+ and exclude paths can be added. The scan() method is a generator that
+ returns matching files one at a time when called as an iterator.
+
+ This is a revisited implementation of the filescanner. It now relies on
+ the module pathaddition.match that implements a Ant-like regular expression matcher.
+
+ Rules:
+ - Includes and excludes should not start with *
+ - Includes and excludes should not have wildcard searches ending with ** (e.g. wildcard**)
+
+ Supported includes and excludes:
+ - filename.txt
+ - filename.*
+ - dir/
+ - dir/*
+ - dir/**
+ """
+ def __init__(self, root_dir):
+ """ Initialization. """
+ AbstractScanner.__init__(self)
+ self.root_dir = os.path.normpath(root_dir)
+ if not self.root_dir.endswith(os.sep):
+ self.root_dir = self.root_dir + os.sep
+ # Add 1 so the final path separator is removed
+ #self.root_dirLength = len(self.root_dir) + 1
+
+ def scan(self):
+ """ Scans the files required to zip"""
+ #paths_cache = []
+
+ excludescopy = self.excludes[:]
+ for f_file in excludescopy:
+ if os.path.exists(os.path.normpath(os.path.join(self.root_dir, f_file))):
+ self.excludes_files.append(f_file)
+ self.excludes.remove(f_file)
+
+ includescopy = self.includes[:]
+ for f_file in includescopy:
+ if os.path.exists(os.path.normpath(os.path.join(self.root_dir, f_file))):
+ self.includes_files.append(f_file)
+ self.includes.remove(f_file)
+
+ LOGGER.debug('Scanning sub-root directories')
+ for root_dir in self.find_subroots():
+ for dirpath, subdirs, files in os.walk(unicode(root_dir)):
+ # Let's save the len before it's getting modified.
+ subdirsLen = len(subdirs)
+ subroot = dirpath[len(self.root_dir):]
+
+ dirs_to_remove = []
+ for subdir in subdirs:
+ if self.is_excluded(os.path.join(subroot, subdir)):
+ dirs_to_remove.append(subdir)
+
+ for dir_remove in dirs_to_remove:
+ subdirs.remove(dir_remove)
+
+ LOGGER.debug('Scanning directory: ' + dirpath)
+ for file_ in files:
+ path = os.path.join(subroot, file_)
+ if self.is_filetype(path) and self.is_included(path) and \
+ self.is_selected(os.path.join(dirpath, file_)) and not self.is_excluded(path):
+ ret_path = os.path.join(dirpath, file_)
+ yield ret_path
+
+ LOGGER.debug('Checking for empty directory: ' + dirpath)
+ # Check for including empty directories
+ if self.is_included(subroot) and not self.is_excluded(subroot):
+ if len(files) == 0 and subdirsLen == 0:
+ LOGGER.debug('Including empty dir: ' + dirpath)
+ yield dirpath
+
+
+ def find_subroots(self):
+ """Finds all the subdirectory roots based on the include paths.
+
+ Often large archive operations define a number of archives from the root
+ of the drive. Walking the tree from the root is very time-consuming, so
+ selecting more specific subdirectory roots improves performance.
+ """
+ def splitpath(path):
+ """ Returns the splitted path"""
+ return path.split(os.sep)
+
+ root_dirs = []
+
+ # Look for includes that start with wildcards.
+ subdirs_not_usable = False
+ for inc in self.includes + self.includes_files:
+ first_path_segment = splitpath(os.path.normpath(inc))[0]
+ if first_path_segment.find('*') != -1:
+ subdirs_not_usable = True
+
+ # Parse all includes for sub-roots
+ if not subdirs_not_usable:
+ for inc in self.includes + self.includes_files:
+ include = None
+ LOGGER.debug("===> inc %s" % inc)
+ contains_globs = False
+ for pathcomp in splitpath(os.path.normpath(inc)):
+ if pathcomp.find('*') != -1:
+ contains_globs = True
+ break
+ else:
+ if include == None:
+ include = pathcomp
+ else:
+ include = os.path.join(include, pathcomp)
+ if not contains_globs:
+ include = os.path.dirname(include)
+
+ LOGGER.debug("include %s" % include)
+ if include != None:
+ root_dir = os.path.normpath(os.path.join(self.root_dir, include))
+ is_new_root = True
+ for root in root_dirs[:]:
+ if destinsrc(root, root_dir):
+ LOGGER.debug("root contains include, skip it")
+ is_new_root = False
+ break
+ if destinsrc(root_dir, root):
+ LOGGER.debug("include contains root, so remove root")
+ root_dirs.remove(root)
+ if is_new_root:
+ root_dirs.append(root_dir)
+
+ if len(root_dirs) == 0:
+ root_dirs = [os.path.normpath(self.root_dir)]
+ LOGGER.debug('Roots = ' + str(root_dirs))
+ return root_dirs
+
+ def __str__(self):
+ return os.path.normpath(self.root_dir) + ';' + AbstractScanner.__str__(self)
+
+ def __repr__(self):
+ return self.__str__()
+
+
+def move(src, dst):
+ """Recursively move a file or directory to another location.
+
+ If the destination is on our current filesystem, then simply use
+ rename. Otherwise, copy src to the dst and then remove src.
+ A lot more could be done here... A look at a mv.c shows a lot of
+ the issues this implementation glosses over.
+
+ """
+ try:
+ os.rename(src, dst)
+ except OSError:
+ if os.path.isdir(src):
+ if destinsrc(src, dst):
+ raise Exception, "Cannot move a directory '%s' into itself '%s'." % (src, dst)
+ shutil.copytree(src, dst, symlinks=True)
+ rmtree(src)
+ else:
+ shutil.copy2(src, dst)
+ os.unlink(src)
+
+def rmtree(rootdir):
+ """ Catch shutil.rmtree failures on Windows when files are read-only. Thanks Google!"""
+ if sys.platform == 'win32':
+ rootdir = os.path.normpath(rootdir)
+ if not os.path.isabs(rootdir):
+ rootdir = os.path.join(os.path.abspath('.'), rootdir)
+ if not rootdir.startswith('\\\\'):
+ rootdir = u"\\\\?\\" + rootdir
+
+ def cb_handle_error(fcn, path, excinfo):
+ """ Error handler, removing readonly and deleting the file. """
+ os.chmod(path, 0666)
+ if os.path.isdir(path):
+ rmdir(path)
+ elif os.path.isfile(path):
+ remove(path)
+ else:
+ fcn(path)
+
+ if 'java' in sys.platform:
+ import java.io
+ import org.apache.commons.io.FileUtils
+ f_file = java.io.File(rootdir)
+ org.apache.commons.io.FileUtils.deleteDirectory(f_file)
+ else:
+ return shutil.rmtree(rootdir, onerror=cb_handle_error)
+
+def destinsrc(src, dst):
+ """ Fixed version of destinscr, that doesn't match dst with same root name."""
+ if os.sep == '\\':
+ src = src.lower()
+ dst = dst.lower()
+ src = os.path.abspath(src)
+ dst = os.path.abspath(dst)
+ if not src.endswith(os.path.sep):
+ src += os.path.sep
+ if not dst.endswith(os.path.sep):
+ dst += os.path.sep
+ return dst.startswith(src)
+
+
+def which(executable):
+ """ Search for executable in the PATH."""
+ pathlist = os.environ['PATH'].split(os.pathsep)
+ pathexts = ['']
+ if os.sep == '\\':
+ pathexts = os.environ['PATHEXT'].split(os.pathsep)
+
+ for folder in pathlist:
+ for pathext in pathexts:
+ exename = executable
+ if os.sep == '\\' and not exename.lower().endswith(pathext.lower()):
+ exename = exename + pathext
+ filename = os.path.join(folder, exename)
+ try:
+ status = os.stat(filename)
+ except os.error:
+ continue
+ # Check if the path is a regular file
+ if stat.S_ISREG(status[stat.ST_MODE]):
+ mode = stat.S_IMODE(status[stat.ST_MODE])
+ if mode & 0111:
+ return os.path.normpath(filename)
+ return None
+
+
+def read_policy_content(filename):
+ """ Read the policy number from the policy file.
+ strict allows to activate the new policy scanning.
+ """
+ value = None
+ error = ""
+ try:
+ LOGGER.debug('Opening policy file: ' + filename)
+ policy_data = load_policy_content(filename)
+ match = re.match(r'^((?:\d+)|(?:0842[0-9a-zA-Z]{3}))\s*$', policy_data, re.M|re.DOTALL)
+ if match != None:
+ value = match.group(1)
+ else:
+ error = "Content of '%s' doesn't match r'^\d+|0842[0-9a-zA-Z]{3}\s*$'." % filename
+ except Exception, exc:
+ error = str(exc)
+ if value is not None:
+ return value
+ # worse case....
+ raise Exception(error)
+
+def load_policy_content(filename):
+ """ Testing policy content loading. """
+ data = ''
+ try:
+ fileh = codecs.open(filename, 'r', 'ascii')
+ data = fileh.read()
+ except:
+ raise Exception("Error loading '%s' as an ASCII file." % filename)
+ finally:
+ fileh.close()
+ return data
+
+ENCODING_MATRIX = {
+ codecs.BOM_UTF8: 'utf_8',
+ codecs.BOM_UTF16: 'utf_16',
+ codecs.BOM_UTF16_BE: 'utf_16_be',
+ codecs.BOM_UTF16_LE: 'utf_16_le',
+}
+
+def guess_encoding(data):
+ """Given a byte string, guess the encoding.
+
+ First it tries for UTF8/UTF16 BOM.
+
+ Next it tries the standard 'UTF8', 'ISO-8859-1', and 'cp1252' encodings,
+ Plus several gathered from locale information.
+
+ The calling program *must* first call locale.setlocale(locale.LC_ALL, '')
+
+ If successful it returns (decoded_unicode, successful_encoding)
+ If unsuccessful it raises a ``UnicodeError``.
+
+ This was taken from http://www.voidspace.org.uk/python/articles/guessing_encoding.shtml
+ """
+ for bom, enc in ENCODING_MATRIX.items():
+ if data.startswith(bom):
+ return data.decode(enc), enc
+ encodings = ['ascii', 'UTF-8']
+ successful_encoding = None
+ try:
+ encodings.append(locale.getlocale()[1])
+ except (AttributeError, IndexError):
+ pass
+ try:
+ encodings.append(locale.getdefaultlocale()[1])
+ except (AttributeError, IndexError):
+ pass
+ # latin-1
+ encodings.append('ISO8859-1')
+ encodings.append('cp1252')
+ for enc in encodings:
+ if not enc:
+ continue
+ try:
+ decoded = unicode(data, enc)
+ successful_encoding = enc
+ break
+ except (UnicodeError, LookupError):
+ pass
+ if successful_encoding is None:
+ raise UnicodeError('Unable to decode input data. Tried the'
+ ' following encodings: %s.' %
+ ', '.join([repr(enc) for enc in encodings if enc]))
+ else:
+ if successful_encoding == 'ascii':
+ # our default ascii encoding
+ successful_encoding = 'ISO8859-1'
+ return (decoded, successful_encoding)
+
+def getmd5(fullpath, chunk_size=2**16):
+ """ returns the md5 value"""
+ file_handle = open(fullpath, "rb")
+ file_handle.seek(0, os.SEEK_END)
+ size = file_handle.tell()
+ file_handle.seek(0, os.SEEK_SET)
+ md5 = hashlib.md5()
+ while size > 0:
+ toread = chunk_size
+ if size < chunk_size:
+ toread = size
+ chunk = file_handle.read(toread)
+ size = size - len(chunk)
+ md5.update(chunk)
+ file_handle.close()
+ return md5.hexdigest()
+
+def read_symbian_policy_content(filename):
+ """ Read the policy category from the policy file. """
+ value = None
+ error = ""
+ try:
+ LOGGER.debug('Opening symbian policy file: ' + filename)
+ try:
+ fileh = codecs.open(filename, 'r', 'ascii')
+ except:
+ raise Exception("Error loading '%s' as an ASCII file." % filename)
+ for line in fileh:
+ match = re.match(r'^Category\s+([A-Z])\s*$', line, re.M|re.DOTALL)
+ if match != None:
+ value = match.group(1)
+ fileh.close()
+ return value
+ fileh.close()
+ # Reaching this point means no line matched (a match returns above).
+ error = "Content of '%s' doesn't match r'^Category\s+([A-Z])\s*$'." % filename
+ except Exception, exc:
+ error = str(exc)
+ if value is not None:
+ return value
+ # worst case: raise the collected error.
+ raise Exception(error)
+
+
+class LockFailedException(Exception):
+ """ Raised when a file lock cannot be obtained. """
+ pass
+
+if os.name == 'nt':
+ import win32file
+ import win32con
+ import winerror
+ import time
+ import win32netcon
+ import win32wnet
+
+ class Lock:
+ """ This object implements file locking on Windows. """
+
+ def __init__(self, filename):
+ LOGGER_LOCK.debug("__init__")
+ self._filename = filename
+ self.f_desc = None
+
+ def lock(self, wait=False):
+ """lock the file"""
+ LOGGER_LOCK.debug("lock")
+ # Open the file
+ if self.f_desc == None:
+ self.f_desc = open(self._filename, "w+")
+ wfd = win32file._get_osfhandle(self.f_desc.fileno())
+ if not wait:
+ try:
+ win32file.LockFile(wfd, 0, 0, 0xffff, 0)
+ except:
+ raise LockFailedException()
+ else:
+ while True:
+ try:
+ win32file.LockFile(wfd, 0, 0, 0xffff, 0)
+ break
+ except win32file.error, exc:
+ if exc[0] != winerror.ERROR_LOCK_VIOLATION:
+ raise exc
+ LOGGER_LOCK.debug("waiting")
+ time.sleep(1)
+
+ def unlock(self):
+ """unlock the file"""
+ LOGGER_LOCK.debug("unlock")
+ if self.f_desc == None:
+ LOGGER_LOCK.debug("already unlocked")
+ return
+ wfd = win32file._get_osfhandle(self.f_desc.fileno())
+ try:
+ # pylint: disable-msg=E1101
+ win32file.UnlockFile(wfd, 0 , 0, 0xffff, 0)
+ self.f_desc.close()
+ self.f_desc = None
+ except win32file.error, exc:
+ if exc[0] != 158:
+ raise
+
+
+ def __del__(self):
+ LOGGER_LOCK.debug("__del__")
+ self.unlock()
+
+ def rmdir(path):
+ """ Catch os.rmdir failures on Windows when path is too long (more than 256 chars)."""
+ path = win32api.GetShortPathName(path)
+ win32file.RemoveDirectory(path)
+
+ def remove(filename):
+ """ Catch os.remove failures on Windows when the path is too long (more than 256 chars)."""
+ filename = win32api.GetShortPathName(filename)
+ filename = filename.lstrip("\\\\?\\")
+ os.remove(filename)
+
+ def mount(drive, unc, username=None, password=None, persistent=False):
+ """ Windows helper function to map a network drive. """
+ flags = 0
+ if persistent:
+ flags = win32netcon.CONNECT_UPDATE_PROFILE
+ win32wnet.WNetAddConnection2(win32netcon.RESOURCETYPE_DISK, drive, unc, None, username, password, flags)
+
+
+ def umount(drive):
+ """ Windows helper function to unmap a network drive. """
+ drive_type = win32file.GetDriveType(drive)
+ if drive_type == win32con.DRIVE_REMOTE:
+ win32wnet.WNetCancelConnection2(drive, win32netcon.CONNECT_UPDATE_PROFILE, 1)
+ else:
+ raise Exception("%s couldn't be unmounted." % drive)
+
+else:
+ def rmdir(path):
+ """remove directory"""
+ return os.rmdir(path)
+
+ def remove(path):
+ """remove the files and folders"""
+ return os.remove(path)
+
+ class Lock:
+ """ This class represents a dummy lock """
+ def __init__(self, filename):
+ pass
+ def lock(self, wait=False):
+ """lock file - do nothing """
+ pass
+ def unlock(self):
+ """unlock file - do nothing """
+ pass
+
+if os.sep == '\\':
+ def get_next_free_drive():
+ """ Return the first free drive found, otherwise raise an exception. """
+ if os.name == 'nt':
+ drive_labels = sorted(list(set(string.ascii_uppercase) - set(win32api.GetLogicalDriveStrings())), reverse=True)
+ if len(drive_labels) != 0 :
+ return drive_labels[0] + ":"
+ raise OSError("No free drive left.")
+ if 'java' in sys.platform:
+ import java.io
+ used = []
+ for _xx in java.io.File.listRoots():
+ used.append(str(_xx).replace(':\\', ''))
+ drive_labels = sorted(list(set(string.ascii_uppercase) - set(used)), reverse=True)
+ if len(drive_labels) != 0 :
+ return drive_labels[0] + ":"
+ raise OSError("No free drive left.")
+
+ def subst(drive, path):
+ """ Subst the given path as a drive. """
+ path = os.path.normpath(path)
+ p_subst = subprocess.Popen("subst %s %s" % (drive, path), shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+ errmsg = p_subst.communicate()[0]
+ if p_subst.returncode != 0:
+ raise Exception("Error substing '%s' under '%s': %s" % (path, drive, errmsg))
+
+ def unsubst(drive):
+ """ Remove the drive substitution. """
+ p_subst = subprocess.Popen("subst /D %s" % (drive), shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+ errmsg = p_subst.communicate()[0]
+ if p_subst.returncode != 0:
+ raise Exception("Error unsubsting '%s': %s" % (drive, errmsg))
+
+ def getSubstedDrives():
+ """Return a mapping of substituted drives to their target paths."""
+ driveInformation = {}
+ subStedDriveList = []
+ p_subst = subprocess.Popen("subst", shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+ subStedDriveList = re.split('\\n', p_subst.communicate()[0])
+ del subStedDriveList[len(subStedDriveList)-1]
+ for path in subStedDriveList:
+ subStedDrivePath = []
+ if(re.search(r'UNC', path) is not None):
+ subStedDrivePath = re.split('=>', path)
+ (drive_to_unsubst, _) = os.path.splitdrive(os.path.normpath(subStedDrivePath[0]))
+ uncPath = re.sub('UNC', r'\\', subStedDrivePath[1].strip())
+ if(uncPath != subStedDrivePath[1].strip()):
+ driveInformation[drive_to_unsubst] = uncPath
+ else:
+ subStedDrivePath = re.split('=>', path)
+ (drive_to_unsubst, _) = os.path.splitdrive(os.path.normpath(subStedDrivePath[0]))
+ driveInformation[drive_to_unsubst] = os.path.normpath(subStedDrivePath[1].strip())
+
+ return driveInformation
+
+def touch(srcdir):
+ """
+ Recursively touches all the files in the source path mentioned.
+ It does not touch the directories.
+ """
+ srcnames = os.listdir(srcdir)
+ for name in srcnames:
+ srcfname = os.path.join(srcdir, name)
+ if os.path.isdir(srcfname):
+ touch(srcfname)
+ else:
+ if os.path.exists(srcfname):
+ os.utime(srcfname, None)
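
A minimal usage sketch of the fileutils helpers added above; the file paths are placeholders and the snippet assumes the module is importable as fileutils from the Helium lib directory:

    import locale
    import fileutils

    # The guess_encoding() docstring asks callers to initialise the locale first.
    locale.setlocale(locale.LC_ALL, '')

    # MD5 hex digest of a file, read in 64 KB chunks.
    digest = fileutils.getmd5('output/build.log')

    # Decode bytes of unknown encoding: BOM first, then ASCII/UTF-8/locale/latin-1.
    raw_data = open('output/build.log', 'rb').read()
    text, used_encoding = fileutils.guess_encoding(raw_data)

    # Serialize access to a shared file; on non-Windows platforms Lock is a no-op.
    lock = fileutils.Lock('output/build.lock')
    lock.lock(wait=True)
    try:
        pass  # critical section
    finally:
        lock.unlock()
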
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/idoprep.py
--- a/buildframework/helium/sf/python/pythoncore/lib/idoprep.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/idoprep.py Mon Oct 11 11:16:47 2010 +0100
@@ -28,23 +28,9 @@
logging.basicConfig(level=logging.INFO)
_logger = logging.getLogger("check_latest_release")
-
-def validate(grace, service, product, release):
- """ Validate s60 grace server, s60 grace service, s60 grace product and
- s60 grace release are set.
- """
- if not grace:
- raise EnvironmentError("Property 's60.grace.server' is not defined.")
- if not service:
- raise EnvironmentError("Property 's60.grace.service' is not defined.")
- if not product:
- raise EnvironmentError("Property 's60.grace.product' is not defined.")
- if not release:
- raise EnvironmentError("Property 's60.grace.release' is not defined.")
-def get_s60_env_details(grace, service, product, release, rev, cachefilename, s60gracecheckmd5, s60graceusetickler):
+def get_s60_env_details(server, service, product, release, rev, cachefilename, checkmd5, usetickler):
""" Return s60 environ details """
- validate(grace, service, product, release)
revision = r'(_\d{3})?'
if rev != None:
revision = rev
@@ -53,11 +39,11 @@
_logger.info(str("Using cache file: %s" % cachefilename))
checkmd5 = False
- if s60gracecheckmd5 != None:
- checkmd5 = str(s60gracecheckmd5).lower()
+ if checkmd5 != None:
+ checkmd5 = str(checkmd5).lower()
checkmd5 = ((checkmd5 == "true") or (checkmd5 == "1") or (checkmd5 == "on"))
- branch = os.path.join(grace, service, product)
+ branch = os.path.join(server, service, product)
if not os.path.exists(branch):
raise IOError("Error occurred: Could not find directory %s" % branch)
@@ -71,7 +57,7 @@
result.append(relpath)
result.sort(reverse=True)
use_tickler = False
- tickler_validation = str(s60graceusetickler).lower()
+ tickler_validation = str(usetickler).lower()
if tickler_validation != None:
use_tickler = ((tickler_validation == "true") or (tickler_validation == "1"))
validresults = []
@@ -99,7 +85,7 @@
result = validresults
if len(result) == 0:
- raise EnvironmentError("Error finding GRACE release.")
+ raise EnvironmentError("Error finding release.")
print result[0]
return result
@@ -117,11 +103,11 @@
_logger.info("Version file not found getting new environment...")
return version
-def create_ado_mapping(sysdefconfig, adomappingfile, adoqualitymappingfile, builddrive, adoqualitydirs):
+def create_ado_mapping(sysdefconfig, adomappingfile, qualityMapping, builddrive, adoqualitydirs):
""" Creates ado mapping and ado quality mapping files """
input_ = open(sysdefconfig, 'r')
output = open(adomappingfile, 'w')
- outputquality = open(adoqualitymappingfile, 'w')
+ print "ado mapping file: %s" % adomappingfile
for sysdef in input_.readlines():
sysdef = sysdef.strip()
if len(sysdef) > 0:
@@ -139,18 +125,11 @@
else:
component = os.path.normpath(os.path.join(builddrive, location)).replace('\\','/')
print "%s=%s\n" % (sysdef, component)
- output.write("%s=%s\n" % (sysdef, component))
- if adoqualitydirs == None:
- outputquality.write("%s=%s\n" % (sysdef, component))
+ if adoqualitydirs == None or qualityMapping == 'false':
+ output.write("%s=%s\n" % (sysdef, component))
else:
for dir_ in adoqualitydirs.split(','):
if os.path.normpath(dir_) == os.path.normpath(os.path.join(builddrive, os.environ['EPOCROOT'], location)):
- outputquality.write("%s=%s\n" % (sysdef, component))
-
-
- outputquality.close()
+ output.write("%s=%s\n" % (sysdef, component))
output.close()
- input_.close()
-
-
-
+ input_.close()
\ No newline at end of file
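
With the quality mapping folded into a single output file, a call to the reworked create_ado_mapping might look like the following sketch (all paths and values are illustrative placeholders, not taken from a real build):

    import idoprep

    idoprep.create_ado_mapping(
        'sysdef_config.txt',   # file listing one system definition path per line
        'ado_mapping.txt',     # mapping file written by the function
        'false',               # qualityMapping: 'false' writes every component mapping
        'X:',                  # build drive
        None)                  # adoqualitydirs: comma-separated directory list, or None
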
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/integration/quality.py
--- a/buildframework/helium/sf/python/pythoncore/lib/integration/quality.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/integration/quality.py Mon Oct 11 11:16:47 2010 +0100
@@ -33,6 +33,7 @@
import fileutils
import pathaddition.match
import logging
+import traceback
#logging.basicConfig(level=logging.DEBUG)
_logger = logging.getLogger("integration.quality")
@@ -111,7 +112,7 @@
class PolicyValidator(object):
""" Validate policy files on a hierarchy. """
- def __init__(self, policyfiles=None, csvfile=None, ignoreroot=False, excludes=None):
+ def __init__(self, policyfiles=None, ignoreroot=False, excludes=None):
"""The constructor """
if policyfiles is None:
policyfiles = ['distribution.policy.s60']
@@ -135,7 +136,16 @@
self._ids[row[0]] = row
if row[1].lower() != "yes" and row[1].lower() != "no" and row[1].lower() != "bin":
yield ["unknownstatus", row[0], row[2]]
-
+
+ def epl_load_policy_ids(self, csvfile):
+ """ Load the IDs from the CSV file for the EPL check. """
+ self._ids = {}
+ reader = csv.reader(open(csvfile, "rU"))
+ for row in reader:
+ if len(row)>=3 and re.match(r"^\s*\d+\s*$", row[0]):
+ if row[1].lower() == "yes" or row[1].lower() == "bin":
+ self._ids[row[0]] = row
+
def validate_content(self, filename):
""" Validating the policy file content. If it cannot be decoded,
it reports an 'invalidencoding'.
@@ -151,6 +161,20 @@
if value not in self._ids:
yield ["notinidlist", filename, value]
+ def epl_validate_content(self, filename):
+ """ Validate the policy file content for the EPL check. """
+ value = None
+ try:
+ value = fileutils.read_policy_content(filename)
+ except IOError, exc:
+ traceback.print_exc()
+ raise exc
+ if value is not None:
+ if self._ids != None:
+ if value not in self._ids:
+ return False
+ return True
+
def find_policy(self, path):
""" find the policy file under path using filenames under the list. """
for filename in self._policyfiles:
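
The new EPL helpers can be exercised on their own; a minimal sketch assuming the integration package is importable and using placeholder file names:

    from integration.quality import PolicyValidator

    validator = PolicyValidator()                    # defaults to ['distribution.policy.s60']
    validator.epl_load_policy_ids('id_status.csv')   # keeps only rows whose status is yes/bin
    ok = validator.epl_validate_content('s60/component/distribution.policy.s60')
    assert ok, "policy id is not in the allowed EPL list"
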
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/log2xml.py
--- a/buildframework/helium/sf/python/pythoncore/lib/log2xml.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/log2xml.py Mon Oct 11 11:16:47 2010 +0100
@@ -174,9 +174,11 @@
print exc
-def convert(inputfile, outputfile, fulllogging=True, configuration=DEFAULT_CONFIGURATION):
+def convert(inputfile, outputfile, fulllogging=True, configuration=None):
""" Convert an input log into an XML log and write an outputfile. """
-
+ if configuration == None:
+ configuration = DEFAULT_CONFIGURATION
+
# Compiling the regexp
built_config = {}
for category in configuration.keys():
@@ -243,9 +245,11 @@
# end file
xmllog.close()
-def convert_old(inputfile, outputfile, fulllogging=True, configuration=DEFAULT_CONFIGURATION):
+def convert_old(inputfile, outputfile, fulllogging=True, configuration=None):
""" Convert an input log into an XML log and write an outputfile. """
-
+ if configuration == None:
+ configuration = DEFAULT_CONFIGURATION
+
# Compiling the regexp
built_config = {}
for category in configuration.keys():
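
With the configuration default now resolved inside the function body, DEFAULT_CONFIGURATION is only looked up when no configuration argument is supplied; a typical call is unchanged (file names below are placeholders):

    import log2xml

    # Falls back to log2xml.DEFAULT_CONFIGURATION because no configuration is given.
    log2xml.convert('make.log', 'make.log.xml', fulllogging=False)
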
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/nokia/gscm.py
--- a/buildframework/helium/sf/python/pythoncore/lib/nokia/gscm.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/nokia/gscm.py Mon Oct 11 11:16:47 2010 +0100
@@ -35,7 +35,7 @@
""" Runs a command and returns the result data. """
process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output = process.stdout.read()
- process.poll()
+ process.wait()
status = process.returncode
return (output, status)
@@ -53,9 +53,11 @@
_logger.debug("Status: %s" % status)
_logger.debug("Output: %s" % output)
if status == 0 or status == None and not ("Can't locate" in output):
+ _logger.debug("Returning output")
return output.strip()
if not 'HLM_SUBCON' in os.environ:
- raise Exception("Error retrieving get_db_path info for '%s' database.\nOUTPUT:%s" % (dbname, output.strip()))
+ _logger.debug("Raising exception")
+ raise IOError("Error retrieving get_db_path info for '%s' database.\nOUTPUT:%s" % (dbname, output.strip()))
return None
def get_db_path(dbname):
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pkg2iby.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pkg2iby.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pkg2iby.py Mon Oct 11 11:16:47 2010 +0100
@@ -52,10 +52,16 @@
atsautoexec.write(r'md c:\logs\testresults' + '\n')
atsautoexec.write(r'md c:\logs\testexecute' + '\n')
+ for _, dst, _, _ in pkgfiles:
+ (_, dstsplit) = os.path.splitdrive(dst)
+ dst_nodrive = 'atsdata' + dstsplit
+ zdst = 'z:\\' + dst_nodrive
+ atsautoexec.write(r'md ' + os.path.dirname(dst) + '\n')
+ atsautoexec.write(r'copy ' + zdst + ' ' + dst + '\n')
+
for src, dst, filetype, _ in pkgfiles:
- (_, dst) = os.path.splitdrive(dst)
- dst_nodrive = 'atsdata' + dst
- dst = r'z:\atsdata' + dst
+ (_, dstsplit) = os.path.splitdrive(dst)
+ dst_nodrive = 'atsdata' + dstsplit
myiby.write('data=' + src + ' ' + dst_nodrive + '\n')
if 'testscript' in filetype and testtype == 'tef':
atsautoexec.write('testexecute.exe ' + dst + '\n')
@@ -75,7 +81,6 @@
atsautoexec.write(r'runtests \sys\bin\atsrtestexec.bat' + '\n')
myiby.write(r'data=' + rtestexecfilename + r' \sys\bin\atsrtestexec.bat' + '\n')
-
myiby.write(r'data=' + dummyexecfilename + r' z:\dummytest.txt' + '\n')
atsautoexec.write(r'RUNTESTS z:\dummytest.txt -p')
myiby.write("#endif\n")
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/preparation.py
--- a/buildframework/helium/sf/python/pythoncore/lib/preparation.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/preparation.py Mon Oct 11 11:16:47 2010 +0100
@@ -125,6 +125,10 @@
session = self.get_session()
project = session.create(self._config.name)
+ session.home = self._config['dir']
+ path = os.path.join(session.home, project.name)
+ project.work_area(False, True, True, path=path)
+
target_dir = os.path.normpath(os.path.join(self._config['dir'], project.name))
_logger.info("Deleting snapshot under %s." % target_dir)
if os.path.exists(target_dir):
@@ -209,13 +213,9 @@
for project in self.__get_subbaselines():
self._check_object(project)
-
- try:
- if (not os.path.exists(self._config['dir'])):
- os.makedirs(self._config['dir'])
- except Exception:
- _logger.info("ERROR: Not able to create the synergy workarea %s " % (self._config['dir']))
- raise Exception("ERROR: Not able to create the synergy workarea %s" % self._config.name)
+
+ if (not os.path.exists(self._config['dir'])):
+ os.makedirs(self._config['dir'])
# checking if the purpose exists
if self._config.has_key('purpose'):
@@ -241,6 +241,10 @@
session.home = self._config['dir']
result = self.__find_project(project)
+
+ path = os.path.join(session.home, project.name)
+ project.work_area(False, True, True, path=path)
+
if (result != None):
_logger.info("Project found: '%s'" % result)
role = session.role
@@ -308,9 +312,14 @@
_logger.info("Using version: '%s'" % version)
try:
+ if (not self._config.get_boolean('use.default_wa_path', True)):
+ wa_path = self._config['dir']
+ _logger.info("Using work area path to checkout directly")
+ result = project.checkout(session.create(self._config['release']), version=version, purpose=purpose, path=wa_path)
+ else:
+ result = project.checkout(session.create(self._config['release']), version=version, purpose=purpose)
+ ccm.log_result(result, ccm.CHECKOUT_LOG_RULES, _logger)
self.__setRole(session)
- result = project.checkout(session.create(self._config['release']), version=version, purpose=purpose)
- ccm.log_result(result, ccm.CHECKOUT_LOG_RULES, _logger)
except ccm.CCMException, exc:
ccm.log_result(exc.result, ccm.CHECKOUT_LOG_RULES, _logger)
raise exc
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_archive.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_archive.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_archive.py Mon Oct 11 11:16:47 2010 +0100
@@ -266,13 +266,14 @@
content = [s.strip() for s in content]
content.sort()
- print content
if os.sep == '\\':
expected_paths = [s.strip().lower() for s in expected_paths]
else:
expected_paths = [s.strip() for s in expected_paths]
expected_paths.sort()
- print expected_paths
+
+ _logger.info("expected_paths:\n" + str("\n".join(expected_paths)))
+ _logger.info("content:\n" + str("\n".join(content)))
assert content == expected_paths
def test_split_manifest_file_unicode(self):
@@ -890,6 +891,66 @@
self.assert_(content == expectedPaths)
self.assert_(content1 == expectedPaths1)
self.assert_(content2 == expectedPaths2)
+
+ def test_split_on_uncompressed_size_enabled(self):
+ """ Testing the policy mapper with split on uncompressed size enabled. """
+ configDict = {'root.dir': root_test_dir,
+ 'temp.build.dir': os.path.abspath(os.path.join(root_test_dir, 'temp_build_files')),
+ 'archives.dir': root_test_dir,
+ 'name': 's60_policy_mapper_test',
+ 'include': 's60/',
+ 'archive.tool': '7za',
+ 'mapper': 'policy',
+ 'max.files.per.archive': '1',
+ 'split.on.uncompressed.size.enabled': 'true',
+ 'policy.csv': os.path.join(os.environ['TEST_DATA'], 'data/distribution.policy.id_status.csv'),
+ }
+ config = configuration.Configuration(configDict)
+
+ builder = archive.ArchivePreBuilder(configuration.ConfigurationSet([config]), "config", index=0)
+ manifest_file_path = builder.build_manifest(config)
+ cmds = builder.manifest_to_commands(config, manifest_file_path)
+
+ expectedPaths0 = ['s60' + os.sep + 'component_private' + os.sep + 'Distribution.Policy.S60\n']
+ expectedPaths1 = ['s60' + os.sep + 'component_private' + os.sep + 'component_private_file.txt\n']
+ expectedPaths2 = ['s60' + os.sep + 'Distribution.Policy.S60\n']
+ expectedPaths3 = ['s60' + os.sep + 'component_public' + os.sep + 'Distribution.Policy.S60\n']
+ expectedPaths4 = ['s60' + os.sep + 'component_public' + os.sep + 'component_public_file.txt\n']
+ expectedPaths5 = ['s60' + os.sep + 'missing' + os.sep + 'subdir' + os.sep + 'Distribution.Policy.S60\n']
+
+ includeFilePath0 = os.path.join(root_test_dir, 'temp_build_files/s60_policy_mapper_test_part01_1.txt')
+ includeFilePath1 = os.path.join(root_test_dir, 'temp_build_files/s60_policy_mapper_test_part02_1.txt')
+ includeFilePath2 = os.path.join(root_test_dir, 'temp_build_files/s60_policy_mapper_test_part01_0.txt')
+ includeFilePath3 = os.path.join(root_test_dir, 'temp_build_files/s60_policy_mapper_test_part02_0.txt')
+ includeFilePath4 = os.path.join(root_test_dir, 'temp_build_files/s60_policy_mapper_test_part03_0.txt')
+ includeFilePath5 = os.path.join(root_test_dir, 'temp_build_files/s60_policy_mapper_test_part04_0.txt')
+
+ with open(includeFilePath0) as f_file:
+ content0 = f_file.readlines()
+ with open(includeFilePath1) as f_file:
+ content1 = f_file.readlines()
+ with open(includeFilePath2) as f_file:
+ content2 = f_file.readlines()
+ with open(includeFilePath3) as f_file:
+ content3 = f_file.readlines()
+ with open(includeFilePath4) as f_file:
+ content4 = f_file.readlines()
+ with open(includeFilePath5) as f_file:
+ content5 = f_file.readlines()
+
+ print "content0: ", content0
+ print "content1: ", content1
+ print "content2: ", content2
+ print "content3: ", content3
+ print "content4: ", content4
+ print "content5: ", content5
+
+ self.assert_(content0 == expectedPaths0 or content0 == expectedPaths1)
+ self.assert_(content1 == expectedPaths1 or content1 == expectedPaths0)
+ self.assert_(content2 == expectedPaths2)
+ self.assert_(content3 == expectedPaths3 or content3 == expectedPaths4)
+ self.assert_(content4 == expectedPaths4 or content4 == expectedPaths3)
+ self.assert_(content5 == expectedPaths5)
class CheckRootDirValueTest(unittest.TestCase):
"""test root drive value"""
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_ats3.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_ats3.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_ats3.py Mon Oct 11 11:16:47 2010 +0100
@@ -20,7 +20,6 @@
#===============================================================================
""" Testing ATS3 framework. """
-# pylint: disable=E1101,C0302,w0142,w0603,R0912,R0902,R0903,R0201,W0404, R0915
#w0142 => * and ** were used
#w0603 => global variables used TSRC_PATH etc
#R* => will be fixed while refactoring
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_ats3_aste.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_ats3_aste.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_ats3_aste.py Mon Oct 11 11:16:47 2010 +0100
@@ -20,8 +20,6 @@
#===============================================================================
""" Testing ATS3 ASTE framework. """
-
-# pylint: disable=W0603,W0142,R0903,R0911,R0912,R0902,R0901,R0201
# pylint: disable=E1101
#E1101 => Mocker shows mockery
#R* remove during refactoring
@@ -61,6 +59,7 @@
self.__dict__.update(kwargs)
+# pylint: disable=R0911
def equal_xml(xml1, xml2):
"""Check the equality of the given XML snippets.
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_ats4.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_ats4.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_ats4.py Mon Oct 11 11:16:47 2010 +0100
@@ -20,7 +20,6 @@
#===============================================================================
""" Testing ats4 framework. """
-# pylint: disable=E1101, C0302, W0142, W0603, R0902,R0903,R0912,R0915
#E1101 => Mocker shows mockery
#C0302 => too many lines
#W0142 => used * or ** magic
@@ -271,6 +270,11 @@
TestReportFileName= TestReport
TestReportFormat= TXT # Possible values: TXT or HTML
+[End_Defaults]
+# -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
+
+[New_Module]
+ModuleName= testscripter
""")
@@ -294,7 +298,7 @@
assert params[0].get("value") == "writefile"
assert params[1].get("value") == path(r"z:\sys\bin\ctcman.exe")
-def check_ctc_log(steps, testtype=""):
+def check_ctc_log(steps):
"""Fetches CTC Log"""
#For the ctcdata.txt to be published on the ATS network drive
step = steps.next()
@@ -772,6 +776,7 @@
mocker.expect(test_plan["report_email"]).result(self.report_email)
mocker.expect(test_plan["ctc_run_process_params"]).result(self.ctc_run_process_params)
mocker.expect(test_plan["report_type"]).result("")
+ mocker.expect(test_plan["file_store"]).result("")
if self.trace_enabled.lower() == "true":
mocker.expect(test_plan["trace_enabled"]).result("True")
@@ -937,6 +942,20 @@
assert params[0].get("value") == "*"
assert params[1].get("value") == "60"
assert params[2].get("value") == "c:\\testframework\\" + ntpath.basename(self.engine_ini_file)
+ step = steps.next()
+ assert step.findtext("./type") == "StifRunCasesTask"
+ params = step.findall("./parameters/parameter")
+ assert params[0].get("value") == "TESTSCRIPTER"
+ assert params[1].get("value") == "*"
+ assert params[2].get("value") == "60"
+ assert params[3].get("value") == r"e:\testing\conf\file1.cfg"
+ step = steps.next()
+ assert step.findtext("./type") == "StifRunCasesTask"
+ params = step.findall("./parameters/parameter")
+ assert params[0].get("value") == "TESTSCRIPTER"
+ assert params[1].get("value") == "*"
+ assert params[2].get("value") == "60"
+ assert params[3].get("value") == r"e:\testing\conf\file2.cfg"
def test_steps_trace_enabled(self):
""" Test steps trace enabled. """
@@ -1113,8 +1132,9 @@
self.custom_files = None
self.component_path = None
self.ctc_run_process_params = None
+ self.ats_stf_enabled = None
- def generate_xml(self, harness, trace_enabled="False"):
+ def generate_xml(self, harness, trace_enabled="False", tef_test_module=None, ats_stf_enabled="False"):
"""Generates XML"""
def files(*paths):
"""generates paths for the files"""
@@ -1125,6 +1145,59 @@
self.config_files = files("conf/file1.cfg", "conf/file2.cfg")
self.testmodule_files = files("testmodules/file1.dll", "testmodules/file2.dll")
self.image_files = files("output/images/file1.fpsx", "output/images/file2.fpsx")
+ if tef_test_module:
+ TEST_PATH.joinpath(r"tsrc" + os.sep + "init" + os.sep + "TestFramework.ini").write_text(
+ r"""
+# - Sets a device reset module's dll name(Reboot).
+# + If Nokia specific reset module is not available or it is not correct one
+# StifHWResetStub module may use as a template for user specific reset
+# module.
+
+[Engine_Defaults]
+
+TestReportMode= FullReport # Possible values are: 'Empty', 'Summary', 'Environment',
+ 'TestCases' or 'FullReport'
+
+CreateTestReport= YES # Possible values: YES or NO
+
+TestReportFilePath= C:\LOGS\TestFramework\
+TestReportFileName= TestReport
+
+TestReportFormat= TXT # Possible values: TXT or HTML
+[End_Defaults]
+# -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
+
+[New_Module]
+ModuleName= teftestmodule
+
+ """)
+ else:
+ TEST_PATH.joinpath(r"tsrc" + os.sep + "init" + os.sep + "TestFramework.ini").write_text(
+ r"""
+# - Sets a device reset module's dll name(Reboot).
+# + If Nokia specific reset module is not available or it is not correct one
+# StifHWResetStub module may use as a template for user specific reset
+# module.
+
+[Engine_Defaults]
+
+TestReportMode= FullReport # Possible values are: 'Empty', 'Summary', 'Environment',
+ 'TestCases' or 'FullReport'
+
+CreateTestReport= YES # Possible values: YES or NO
+
+TestReportFilePath= C:\LOGS\TestFramework\
+TestReportFileName= TestReport
+
+TestReportFormat= TXT # Possible values: TXT or HTML
+[End_Defaults]
+# -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
+
+[New_Module]
+ModuleName= testscripter
+
+ """)
+
self.engine_ini_file = files("init/TestFramework.ini")[0]
self.report_email = "test.receiver@company.com"
self.file_store = path("path/to/reports")
@@ -1133,6 +1206,7 @@
self.pmd_files = TEST_FILES["pmd_file"]
self.trace_activation_files = files("trace_init/trace_activation_1.xml")
self.ctc_enabled = "True"
+ self.ats_stf_enabled = ats_stf_enabled
self.eunitexerunner_flags = "/E S60AppEnv /R Off"
self.custom_dir = "custom"
self.custom_files = files("custom/postpostaction.xml", "custom/prepostaction.xml")
@@ -1184,11 +1258,13 @@
mocker.expect(test_plan["device_hwid"]).result("5425")
mocker.expect(test_plan["trace_enabled"]).result(self.trace_enabled)
mocker.expect(test_plan["ctc_enabled"]).result(self.ctc_enabled)
+ mocker.expect(test_plan["ats_stf_enabled"]).result(self.ats_stf_enabled)
mocker.expect(test_plan["custom_dir"]).result("custom1A")
mocker.expect(test_plan.custom_dir).result(path(r"self.custom_dir"))
mocker.expect(test_plan["ctc_run_process_params"]).result(self.ctc_run_process_params)
mocker.expect(test_plan["report_email"]).result(self.report_email)
mocker.expect(test_plan["report_type"]).result("")
+ mocker.expect(test_plan["file_store"]).result("")
if self.trace_enabled == "False":
mocker.expect(test_plan.sets).result([
dict(name="set0", image_files=self.image_files, data_files=self.data_files,
@@ -1314,7 +1390,7 @@
self.check_install_step(steps, "EUNIT", set_count="1")
self.check_run_cases(steps, "EUNIT")
check_ctc_write(steps)
- check_ctc_log(steps, "withpkgfiles")
+ check_ctc_log(steps)
check_fetch_logs(steps, "EUNIT")
else:
self.check_install_step(steps, thar)
@@ -1322,6 +1398,66 @@
check_ctc_write(steps)
check_ctc_log(steps)
check_fetch_logs(steps, thar)
+
+ def test_case_steps_teftestmodule(self):
+ """Checks cases in steps in the test.xml file for TEFTESTMODULE"""
+ test_harness = ["STIF", "EUNIT", "MULTI_HARNESS"]
+ for thar in test_harness:
+ xml = self.generate_xml(thar, tef_test_module=True, ats_stf_enabled="True")
+ #print et.tostring(xml.getroot())
+ steps = iter(xml.findall(".//task"))
+ steps.next() # Flash images
+ check_ctc_start(steps)
+ check_log_dir(steps)
+ if "MULTI_HARNESS" in thar:
+ self.check_install_step(steps, "STIF")
+ self.check_run_cases(steps, "STIF", tef_test_module=True, ats_stf_enabled="True")
+ check_ctc_write(steps)
+ check_ctc_log(steps)
+ check_fetch_logs(steps, "STIF")
+
+ steps.next() # Flash images
+ check_ctc_start(steps)
+ check_log_dir(steps)
+ self.check_install_step(steps, "EUNIT", set_count="1")
+ self.check_run_cases(steps, "EUNIT")
+ check_ctc_write(steps)
+ check_ctc_log(steps)
+ check_fetch_logs(steps, "EUNIT")
+ else:
+ self.check_install_step(steps, thar)
+ self.check_run_cases(steps, thar, tef_test_module=True)
+ check_ctc_write(steps)
+ check_ctc_log(steps)
+ check_fetch_logs(steps, thar)
+ for thar in test_harness:
+ xml = self.generate_xml(thar, tef_test_module=True)
+ #print et.tostring(xml.getroot())
+ steps = iter(xml.findall(".//task"))
+ steps.next() # Flash images
+ check_ctc_start(steps)
+ check_log_dir(steps)
+ if "MULTI_HARNESS" in thar:
+ self.check_install_step(steps, "STIF")
+ self.check_run_cases(steps, "STIF", tef_test_module=True)
+ check_ctc_write(steps)
+ check_ctc_log(steps)
+ check_fetch_logs(steps, "STIF")
+
+ steps.next() # Flash images
+ check_ctc_start(steps)
+ check_log_dir(steps)
+ self.check_install_step(steps, "EUNIT", set_count="1")
+ self.check_run_cases(steps, "EUNIT")
+ check_ctc_write(steps)
+ check_ctc_log(steps)
+ check_fetch_logs(steps, "EUNIT")
+ else:
+ self.check_install_step(steps, thar)
+ self.check_run_cases(steps, thar, tef_test_module=True)
+ check_ctc_write(steps)
+ check_ctc_log(steps)
+ check_fetch_logs(steps, thar)
def check_install_step(self, steps, harness, set_count="0"):
"""Checks install steps in the test.xml file"""
@@ -1348,7 +1484,7 @@
assert ntpath.basename(dst) == filename
assert ntpath.dirname(dst) == drive + "\\sys\\bin"
- def check_run_cases(self, steps, harness="STIF"):
+ def check_run_cases(self, steps, harness="STIF", tef_test_module=None, ats_stf_enabled="False"):
"""Checks run cases in the test.xml file"""
step = steps.next()
if harness == "STIF":
@@ -1358,6 +1494,45 @@
assert params[0].get("value") == "*"
assert params[1].get("value") == "60"
assert params[2].get("value") == "c:\\sys\\bin\\" + ntpath.basename(self.engine_ini_file)
+ step = steps.next()
+ assert step.findtext("./type") == "StifRunCasesTask"
+ params = step.findall("./parameters/parameter")
+ assert params[0].get("value") == "file1.dll"
+ assert params[1].get("value") == "*"
+ assert params[2].get("value") == "60"
+ step = steps.next()
+ assert step.findtext("./type") == "StifRunCasesTask"
+ params = step.findall("./parameters/parameter")
+ assert params[0].get("value") == "file2.dll"
+ assert params[1].get("value") == "*"
+ assert params[2].get("value") == "60"
+ step = steps.next()
+ assert step.findtext("./type") == "StifRunCasesTask"
+ params = step.findall("./parameters/parameter")
+ if tef_test_module:
+ assert params[0].get("value") == "teftestmodule"
+ else:
+ assert params[0].get("value") == "TESTSCRIPTER"
+ assert params[1].get("value") == "*"
+ assert params[2].get("value") == "60"
+ assert params[3].get("value") == r"c:\sys\bin\file1.cfg"
+ if tef_test_module and ats_stf_enabled.lower() == "true":
+ assert params[4].get("value") == r"c:\spd_logs\xml\teftestmodule.xml"
+
+ step = steps.next()
+ assert step.findtext("./type") == "StifRunCasesTask"
+ params = step.findall("./parameters/parameter")
+ if tef_test_module:
+ assert params[0].get("value") == "teftestmodule"
+ else:
+ assert params[0].get("value") == "TESTSCRIPTER"
+ assert params[1].get("value") == "*"
+ assert params[2].get("value") == "60"
+ assert params[3].get("value") == r"c:\sys\bin\file2.cfg"
+ if tef_test_module and ats_stf_enabled.lower() == "true":
+ assert params[4].get("value") == r"c:\spd_logs\xml\teftestmodule.xml"
+
+
elif harness == "EUNIT":
_ = self.testmodule_files[0]
assert step.findtext("./type") == "EUnitTask"
@@ -1569,6 +1744,7 @@
mocker.expect(test_plan["ctc_run_process_params"]).result(self.ctc_run_process_params)
mocker.expect(test_plan["report_email"]).result(self.report_email)
mocker.expect(test_plan["report_type"]).result("")
+ mocker.expect(test_plan["file_store"]).result("")
mocker.expect(test_plan.sets).result([
dict(name="set0", image_files=self.image_files, sis_files=self.sis_files,
engine_ini_file=self.engine_ini_file, test_harness=self.harness, ctc_enabled="False", component_path=self.component_path, custom_dir=None),
@@ -1613,6 +1789,7 @@
assert params[-1].get("value") == "c:\\testframework\\" + ntpath.basename(filename)
def test_ats_sut():
+ """Test SymbianUnitTest"""
opts = Bunch(file_store='', flash_images='', diamonds_build_url='', testrun_name='', device_type='', report_email='', test_timeout='', drop_file='', config_file='', target_platform='', data_dir='', build_drive='', sis_files='', harness='', trace_enabled='', specific_pkg='', ats4_enabled='true', device_hwid='')
test_plan = ats3.Ats3TestPlan(opts)
@@ -1631,4 +1808,4 @@
step = steps.next()
assert step.findtext("./type") == "SymbianUnitTestTask"
params = step.findall("./parameters/parameter")
- assert params[1].get("value") == r"-tests=c:\sys\bin\file1.dll -noprompt"
\ No newline at end of file
+ assert params[1].get("value") == r"-tests=c:\sys\bin\file1.dll -noprompt"
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_ats4_aste.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_ats4_aste.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_ats4_aste.py Mon Oct 11 11:16:47 2010 +0100
@@ -21,7 +21,6 @@
""" Testing ATS4 ASTE framework. """
-# pylint: disable=E1101, R0903, R0911, R0912, W0603, W0142, R0902, R0201
#E1101 => Mocker shows mockery
#C0302 => too many lines
#W0142 => used * or ** magic
@@ -44,6 +43,8 @@
import ats3.aste
+import pythoncorecpythontests.test_ats3_aste
+
TEST_PATH = None
TEST_FILES = {}
TEST_ASSET_FILES = {}
@@ -63,75 +64,7 @@
self.__dict__.update(kwargs)
def equal_xml(xml1, xml2):
- """Check the equality of the given XML snippets.
-
- Tag name equality:
-
- >>> equal_xml('', '')
- True
- >>> equal_xml('', '')
- False
-
- Attribute equality:
-
- >>> equal_xml('', '')
- True
- >>> equal_xml('', '')
- False
-
- Text content equality:
-
- >>> equal_xml('v', 'v')
- True
- >>> equal_xml('v', 'w')
- False
- >>> equal_xml('v', '')
- False
-
- Text content equality when whitespace differs:
- >>> equal_xml('v', 'v ')
- True
-
- Equality of child elements:
-
- >>> equal_xml('', '')
- True
- >>> equal_xml('', '')
- False
- >>> equal_xml('v', 'w')
- False
- >>> equal_xml('v', 'v ')
- True
-
- """
- if isinstance(xml1, basestring):
- xml1 = fromstring(xml1)
- if isinstance(xml2, basestring):
- xml2 = fromstring(xml2)
- if xml1.tag != xml2.tag:
- return False
- if xml1.attrib != xml2.attrib:
- return False
- if xml1.text:
- if not xml2.text:
- return False
- if xml2.text:
- if not xml1.text:
- return False
- if xml1.text and xml2.text and xml1.text.strip() != xml2.text.strip():
- return False
- if xml1.tail is not None and xml2.tail is not None:
- if xml1.tail.strip() != xml2.tail.strip():
- return False
- elif xml1.tail != xml2.tail:
- return False
- children1 = list(xml1.getchildren())
- children2 = list(xml2.getchildren())
- if len(children1) != len(children2):
- return False
- for child1, child2 in zip(children1, children2):
- return equal_xml(child1, child2)
- return True
+ return pythoncorecpythontests.test_ats3_aste.equal_xml(xml1, xml2)
def setup_module():
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_atsconfigparser.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_atsconfigparser.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_atsconfigparser.py Mon Oct 11 11:16:47 2010 +0100
@@ -100,3 +100,64 @@
self.assert_( '' in output)
self.assert_( '' not in output)
self.assert_( '' in output)
+
+
+ def test_converttestxml_ats4(self):
+ spectext = """
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ """
+ testxmldataats4 = """
+
+ mybuild
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ """
+
+ (file_descriptor, filename) = tempfile.mkstemp()
+ file_handle = os.fdopen(file_descriptor, 'w')
+ file_handle.write(spectext)
+ file_handle.close()
+
+ output = ats3.atsconfigparser.converttestxml(filename, testxmldataats4)
+ os.remove(filename)
+
+ self.assert_( '' in output)
+ self.assert_( '' in output)
+ self.assert_( '' not in output)
+ self.assert_( '' in output)
+ self.assert_( '' not in output)
+ self.assert_( '' in output)
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_bootup_testing.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_bootup_testing.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,147 @@
+# -*- coding: latin-1 -*-
+
+#============================================================================
+#Name : test_bootup_testing.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+""" Testing Bootup tests framework. """
+
+# pylint: disable=E1101
+
+import logging
+logging.getLogger().setLevel(logging.INFO)
+import os
+#import shutil
+from path import path
+import ats3.bootup_testing
+import tempfile
+import zipfile
+import platform
+
+TEST_PATH = None
+TEST_FILES = {}
+OUTPUT = None
+TOTAL_TESTS_COUNT = 3
+
+
+class Bunch(object):
+ """ Configuration object. Arguments from the constructor are converted into class attributes. """
+ def __init__(self, **kwargs):
+ self.__dict__.update(kwargs)
+
+class SetUp(object):
+ """ Setup the module. """
+
+ def __init__(self):
+ """ Setup test environment. """
+ global TEST_PATH, OUTPUT
+
+ TEST_PATH = path(tempfile.mkdtemp())
+ component = TEST_PATH
+ component.joinpath("ats_build_drive").makedirs()
+ for path_parts in (("output", "images", "image1.fpsx"),
+ ("output", "images", "image2.fpsx"),
+ ("output", "ats", "temp.txt")):
+ filepath = component.joinpath(*path_parts)
+ if not filepath.parent.exists():
+ filepath.parent.makedirs()
+ filepath.touch()
+ TEST_FILES.setdefault(path_parts[1], []).append(filepath)
+
+ OUTPUT = component.joinpath(r"output")
+
+ if not filepath.parent.exists():
+ filepath.parent.makedirs()
+ filepath.touch()
+
+
+def teardown_module(test_run_count):
+ """ stuff to do after running the tests """
+
+ if test_run_count == 0:
+ path(TEST_PATH).rmtree()
+
+class TestBootupTestPlan(SetUp):
+ """ test BootupTestDrop.py """
+
+ def __init__(self):
+ """initialize bootup Tests"""
+ SetUp.__init__(self)
+ self.file_store = OUTPUT
+ self.build_drive = "j:"
+ self.drop_file = path(r"%s/ats/ATSBootupDrop.zip" %OUTPUT).normpath()
+
+ image_files = r"%s/images/image1.fpsx, %s/images/image2.fpsx " % (OUTPUT, OUTPUT)
+ self.flash_images = image_files
+ self.config = None
+
+ def read_xml(self, file_location, zip_file=False):
+ """reads test.xml file if a path is given"""
+
+ xml_text = ""
+ file_location = path(file_location)
+ if zip_file:
+ if zipfile.is_zipfile(file_location):
+ myzip = zipfile.ZipFile(file_location, 'r')
+ xml_text = myzip.read('test.xml')
+ myzip.close()
+
+ else:
+ hnd = open(file_location, 'r')
+ for line in hnd.readlines():
+ xml_text = xml_text + line
+
+ return xml_text
+
+ def test_xml_file(self):
+ """ test bootup_testing.py generates correct test.xml file"""
+ global TOTAL_TESTS_COUNT
+ opts = Bunch(build_drive=self.build_drive,
+ drop_file=path(r"%s/ats/ATSBootupDrop.zip" %OUTPUT).normpath(),
+ flash_images=self.flash_images,
+ template_loc="",
+ file_store=self.file_store,
+ report_email="firstname.lastname@domain.com",
+ testrun_name="Bootup test run",
+ alias_name="alias",
+ device_type="new_device",
+ diamonds_build_url="http://diamonds.com/1234",
+ email_format="simplelogger",
+ email_subject="Bootup test report",
+ verbose="false")
+
+ self.config = ats3.bootup_testing.Configuration(opts)
+ ats3.bootup_testing.create_drop(self.config)
+
+ xml_loc = os.path.join(os.environ['TEST_DATA'], 'data/bootup_testing/test_bootup.xml')
+ stored_xml = self.read_xml(xml_loc, False).strip()
+ drop_loc = os.path.join(OUTPUT, 'ats/ATSBootupDrop.zip')
+ generated_xml = self.read_xml(drop_loc, True).strip()
+
+ if platform.system().lower() == "linux":
+ assert stored_xml.replace('\r', '') in generated_xml
+ else:
+ assert stored_xml in generated_xml
+
+ TOTAL_TESTS_COUNT -= 1
+ teardown_module(TOTAL_TESTS_COUNT)
+
+
+
+
+
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_buildmodel.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_buildmodel.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_buildmodel.py Mon Oct 11 11:16:47 2010 +0100
@@ -77,7 +77,7 @@
class BOMTest(unittest.TestCase):
""" Test BOM and related classes. """
-# TODO - removed until non-Synergy dependent tests can be provided.
+# Removed until non-Synergy dependent tests can be provided.
# def test_bom_output(self):
# """ Test basic BOM execution. Only new spec format will be covered!"""
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_logger.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_logger.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_logger.py Mon Oct 11 11:16:47 2010 +0100
@@ -21,6 +21,7 @@
import logging
import os
import unittest
+import urllib2
import helium.logger
import helium.outputer
@@ -88,10 +89,13 @@
mclogger.WriteToFile('log.xml')
_logger.info(mclogger)
-
- out = helium.outputer.XML2XHTML('log.xml')
- out.generate()
- out.WriteToFile('log.html')
+
+ try:
+ out = helium.outputer.XML2XHTML('log.xml')
+ out.generate()
+ out.WriteToFile('log.html')
+ except urllib2.URLError:
+ _logger.warning('Test cannot run properly as the configuration URL cannot be accessed.')
os.unlink('log.xml')
os.unlink('log.html')
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_matti.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_matti.py Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,209 +0,0 @@
-# -*- coding: latin-1 -*-
-
-#============================================================================
-#Name : test_matti.py
-#Part of : Helium
-
-#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
-#All rights reserved.
-#This component and the accompanying materials are made available
-#under the terms of the License "Eclipse Public License v1.0"
-#which accompanies this distribution, and is available
-#at the URL "http://www.eclipse.org/legal/epl-v10.html".
-#
-#Initial Contributors:
-#Nokia Corporation - initial contribution.
-#
-#Contributors:
-#
-#Description:
-#===============================================================================
-
-""" Testing MATTI framework. """
-
-# pylint: disable=E1101
-
-import logging
-logging.getLogger().setLevel(logging.ERROR)
-import os
-#import shutil
-from path import path
-import ats3.aste
-import ats3.matti.MattiDrops
-import tempfile
-
-TEST_FILE_NAME = 'test.xml'
-ZIP_FILE_NAME = os.path.join(tempfile.mkdtemp(), 'MATTIDrop.zip')
-
-class Bunch(object):
- """do something with the paramerters passed to it"""
- def __init__(self, **kwargs):
- self.__dict__.update(kwargs)
-
-
-def equal_xml(result, expect):
- """Check the equality of the given XML snippets. """
-# logging.info(" expect %s" % expect)
-# xml1 = objectify.fromstring(expect)
-# expect1 = etree.tostring(xml1)
-# logging.info(" expect1 %s" % expect1)
-# logging.info(" expect2 -------------%s" % expect2)
-#
-# xml2 = objectify.fromstring(result)
-# result2 = etree.tostring(xml2)
-# self.assertEquals(expect1, result1)
-#
-# if xml1.tag != xml2.tag:
-# return False
-# if xml1.attrib != xml2.attrib:
-# return False
-# if xml1.text:
-# if not xml2.text:
-# return False
-# if xml2.text:
-# if not xml1.text:
-# return False
-# if xml1.text and xml2.text and xml1.text.strip() != xml2.text.strip():
-# return False
-# if xml1.tail is not None and xml2.tail is not None:
-# if xml1.tail.strip() != xml2.tail.strip():
-# return False
-# elif xml1.tail != xml2.tail:
-# return False
-# children1 = list(xml1.getchildren())
-# children2 = list(xml2.getchildren())
-# if len(children1) != len(children2):
-# return False
-# for child1, child2 in zip(children1, children2):
-# return equal_xml(child1, child2)
-# return True
- if expect:
- return result
-
-
-def setup_module():
- """ stuff to do before running the tests """
- pass
-
-def teardown_module():
- """ stuff to do after running the tests """
- if os.path.exists(TEST_FILE_NAME):
- os.remove(TEST_FILE_NAME)
- if os.path.exists(ZIP_FILE_NAME):
- os.remove(ZIP_FILE_NAME)
-
-
-class TestPlanMatti():
- """ test MattiDrop.py """
- def __init__(self):
- self.config = None
- self.tp_result = None
-
- (_, self.image1) = tempfile.mkstemp()
- (_, self.image2) = tempfile.mkstemp()
- (_, self.image3) = tempfile.mkstemp()
- (_, self.sis1) = tempfile.mkstemp()
- (_, self.sis2) = tempfile.mkstemp()
-
- def test_all_present(self):
- """ test mattiDrops.py with all parameters present and correct"""
- teardown_module()
- opts = Bunch(build_drive="z:",
- matti_scripts=os.path.join(os.environ['TEST_DATA'], 'data/matti'),
- flash_images = '%s,%s,%s' % (self.image1, self.image2, self.image3),
- report_email="", harness="STIF",
- file_store=path(), testrun_name="testrun",
- device_type="product", device_hwid="5425", diamonds_build_url="", drop_file=ZIP_FILE_NAME,
- minimum_flash_images="2", plan_name="matti_test_plan",
- sis_files = '%s,%s' % (self.sis1, self.sis2),
- template_loc=os.path.join(os.path.dirname(__file__), '../ats3/matti/template/matti_demo.xml'),
- test_timeout="60", verbose="false")
-
- self.config = ats3.matti.MattiDrops.Configuration(opts)
- self.tp_result = ats3.matti.MattiDrops.create_drop(self.config)
- assert os.path.exists(ZIP_FILE_NAME)
- assert os.path.exists(TEST_FILE_NAME)
- #shutil.copy(TEST_FILE_NAME, os.path.join(TMPDIR, 'test_all_present.xml'))
- #equal_xml(TEST_FILE_NAME, os.path.join(TMPDIR, 'test_all_present.xml'))
-
- def test_no_sis_or_flash_files(self):
- """test mattiDrops.py with no sis or flash files in the parameters"""
- teardown_module()
- opts = Bunch(build_drive="z:",
- matti_scripts=os.path.join(os.environ['TEST_DATA'], 'data/matti'),
- flash_images = "",
- report_email="", harness="STIF",
- file_store=path(), testrun_name="testrun",
- device_type="product", device_hwid="5425", diamonds_build_url="", drop_file=ZIP_FILE_NAME,
- minimum_flash_images="2", plan_name="matti_test_plan",
- sis_files= "",
- template_loc=os.path.join(os.path.dirname(__file__), '../ats3/matti/template/matti_demo.xml'),
- test_timeout="60", verbose="true")
-
- self.config = ats3.matti.MattiDrops.Configuration(opts)
- self.tp_result = ats3.matti.MattiDrops.create_drop(self.config)
- assert os.path.exists(ZIP_FILE_NAME)
- assert os.path.exists(TEST_FILE_NAME)
- #shutil.copy(TEST_FILE_NAME, os.path.join(TMPDIR, 'test_no_sis_or_flash.xml'))
- #equal_xml(TEST_FILE_NAME, os.path.join(TMPDIR, 'test_no_sis_or_flash.xml'))
-
-
- def test_no_files(self):
- """ test mattiDtops.py with no filespresent at all"""
- teardown_module()
- opts = Bunch(build_drive="z:",
- matti_scripts=tempfile.mkdtemp(),
- flash_images = "",
- report_email="", harness="STIF",
- file_store=path(), testrun_name="testrun",
- device_type="product", device_hwid="5425", diamonds_build_url="", drop_file=ZIP_FILE_NAME,
- minimum_flash_images="2", plan_name="matti_test_plan",
- sis_files= "",
- template_loc=os.path.join(os.path.dirname(__file__), '../ats3/matti/template/matti_demo.xml'),
- test_timeout="60", verbose="true")
- self.config = ats3.matti.MattiDrops.Configuration(opts)
- self.tp_result = ats3.matti.MattiDrops.create_drop(self.config)
- assert not os.path.exists(ZIP_FILE_NAME)
- assert os.path.exists(TEST_FILE_NAME)
- #shutil.copy(TEST_FILE_NAME, os.path.join(TMPDIR, 'test_no_files.xml'))
- #equal_xml(TEST_FILE_NAME, os.path.join(TMPDIR, 'test_no_files.xml'))
-
- def test_no_params(self):
- """test MattiDrops.py with no parameters present at all"""
- teardown_module()
- opts = Bunch(build_drive="",
- matti_scripts="",
- flash_images = "",
- report_email="", harness="",
- file_store="", testrun_name="",
- device_type="", device_hwid="", diamonds_build_url="", drop_file="",
- minimum_flash_images="", plan_name="",
- sis_files= "",
- template_loc="",
- test_timeout="", verbose="true")
-
- self.config = ats3.matti.MattiDrops.Configuration(opts)
- self.tp_result = ats3.matti.MattiDrops.create_drop(self.config)
- assert not os.path.exists(ZIP_FILE_NAME)
- assert not os.path.exists(TEST_FILE_NAME)
-
- def test_some_not_present(self):
- """ test MattiDrops.py with an extra file not present in the dir"""
- teardown_module()
- opts = Bunch(build_drive="z:",
- matti_scripts=os.path.join(os.environ['TEST_DATA'], 'data/matti'),
- flash_images = '%s,%s,%s' % (self.image1, self.image2, self.image3),
- report_email="", harness="STIF",
- file_store=path(), testrun_name="testrun",
- device_type="product", device_hwid="5425", diamonds_build_url="", drop_file=ZIP_FILE_NAME,
- minimum_flash_images="2", plan_name="matti_test_plan",
- sis_files = '%s,%s' % (self.sis1, self.sis2),
- template_loc=os.path.join(os.path.dirname(__file__), '../ats3/matti/template/matti_demo.xml'),
- test_timeout="60", verbose="false")
-
- self.config = ats3.matti.MattiDrops.Configuration(opts)
- self.tp_result = ats3.matti.MattiDrops.create_drop(self.config)
- assert os.path.exists(ZIP_FILE_NAME)
- assert os.path.exists(TEST_FILE_NAME)
- #shutil.copy(TEST_FILE_NAME, os.path.join(TMPDIR, 'test_some_not_present.xml'))
- #equal_xml(TEST_FILE_NAME, os.path.join(TMPDIR, 'test_some_not_present.xml'))
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_matti2.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_matti2.py Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,294 +0,0 @@
-# -*- coding: latin-1 -*-
-
-#============================================================================
-#Name : test_matti2.py
-#Part of : Helium
-
-#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
-#All rights reserved.
-#This component and the accompanying materials are made available
-#under the terms of the License "Eclipse Public License v1.0"
-#which accompanies this distribution, and is available
-#at the URL "http://www.eclipse.org/legal/epl-v10.html".
-#
-#Initial Contributors:
-#Nokia Corporation - initial contribution.
-#
-#Contributors:
-#
-#Description:
-#===============================================================================
-
-""" Testing MATTI framework. """
-
-# pylint: disable=E1101
-
-import logging
-logging.getLogger().setLevel(logging.INFO)
-import os
-#import shutil
-from path import path
-import ats3.matti2
-import tempfile
-import zipfile
-import platform
-
-TEST_PATH = None
-TEST_FILES = {}
-MATTI = None
-OUTPUT = None
-SISFILES = None
-TOTAL_TESTS_COUNT = 3
-
-
-class Bunch(object):
- """ Configuration object. Argument from constructor are converted into class attributes. """
- def __init__(self, **kwargs):
- self.__dict__.update(kwargs)
-
-class SetUp(object):
- """ Setup the module. """
-
- def __init__(self):
- """ Setup test environment. """
- global TEST_PATH, MATTI, OUTPUT, SISFILES
-
- TEST_PATH = path(tempfile.mkdtemp())
- component = TEST_PATH
- component.joinpath("matti").makedirs()
- for path_parts in (("matti_testcases", "profile", "all.sip"),
- ("matti_testcases", "profile", "bat.sip"),
- ("matti_testcases", "profile", "fute.sip"),
- ("matti_testcases", "hwdata", "paths.pkg"),
- ("matti_testcases", "hwdata", "file1.txt"),
- ("matti_testcases", "hwdata", "settings.ini"),
- ("matti_testcases", "matti_parameters", "matti_parameters.xml"),
- ("matti_testcases", "unit_test1.rb"),
- ("matti_testcases", "unit_test2.rb"),
- ("output", "images", "image1.fpsx"),
- ("output", "images", "image2.fpsx"),
- ("sisfiles", "abc.sis"),
- ("sisfiles", "xyz.sis"),
- ("output", "ats", "temp.txt")):
- filepath = component.joinpath(*path_parts)
- if not filepath.parent.exists():
- filepath.parent.makedirs()
- filepath.touch()
- TEST_FILES.setdefault(path_parts[1], []).append(filepath)
-
- OUTPUT = component.joinpath(r"output")
- MATTI = component.joinpath("matti_testcases")
- SISFILES = component.joinpath(r"sisfiles")
-
- if not filepath.parent.exists():
- filepath.parent.makedirs()
- filepath.touch()
-
- #mtc => matti_testcases
- mtc = component.joinpath("matti_testcases")
- mtc.joinpath("unit_test1.rb").write_text("unit_tests")
- mtc.joinpath("unit_test2.rb").write_text("unit_tests")
-
- # profiles
- profiles = component.joinpath("matti_testcases", "profile")
- profiles.joinpath("all.sip").write_text("sip profile")
- profiles.joinpath("bat.sip").write_text("sip profile")
- profiles.joinpath("fute.sip").write_text("sip profile")
-
- #hwdata => hardware data
- profiles = component.joinpath("matti_testcases", "hwdata")
- profiles.joinpath("file1.txt").write_text("data file")
- profiles.joinpath("settings.ini").write_text("settings initialization file")
- profiles.joinpath("paths.pkg").write_text(
- r"""
- ;Language - standard language definitions
- &EN
-
- ; standard SIS file header
- #{"BTEngTestApp"},(0x04DA27D5),1,0,0
-
- ;Supports Series 60 v 3.0
- (0x101F7961), 0, 0, 0, {"Series60ProductID"}
-
- ;Localized Vendor Name
- %{"BTEngTestApp"}
-
- ;Unique Vendor name
- :"Nokia"
-
- ; Files to copy
-
- "[PKG_LOC]\file1.txt"-"C:\Private\10202BE9\PERSISTS\file1.txt"
- "[PKG_LOC]\settings.ini"-"c:\sys\settings.ini"
- """.replace('\\', os.sep))
-
-
-def teardown_module(test_run_count):
- """ stuff to do after running the tests """
-
- if test_run_count == 0:
- path(TEST_PATH).rmtree()
-
-class TestMattiTestPlan(SetUp):
- """ test MattiDrop.py """
- global OUTPUT, MATTI, SISFILES
-
- def __init__(self):
- """initialize Matti Tests"""
- SetUp.__init__(self)
- self.file_store = OUTPUT
- self.test_asset_path = MATTI
- self.matti_sis_files = r"%s/abc.sis#f:\data\abc.sis#c:\abc.sis, %s/xyz.sis#f:\data\abc.sis#f:\xyz.sis" % (SISFILES, SISFILES)
- self.build_drive = "j:"
- self.drop_file = path(r"%s/ats/ATSMattiDrop.zip" %OUTPUT).normpath()
-
- image_files = r"%s/images/image1.fpsx, %s/images/image2.fpsx " % (OUTPUT, OUTPUT)
- self.flash_images = image_files
-
- self.template_loc = os.path.join(os.environ['TEST_DATA'], 'data/matti/matti_template.xml')
- self.template_loc = os.path.normpath(self.template_loc)
- self.matti_parameters = ""
- self.config = None
-
- def read_xml(self, file_location, zip_file=False):
- """reads test.xml file if a path is given"""
-
- xml_text = ""
- file_location = path(file_location)
- if zip_file:
- if zipfile.is_zipfile(file_location):
- myzip = zipfile.ZipFile(file_location, 'r')
- xml_text = myzip.read('test.xml')
- myzip.close()
-
- else:
- hnd = open(file_location, 'r')
- for line in hnd.readlines():
- xml_text = xml_text + line
-
- return xml_text
-
- def test_xml_with_all_parameters(self):
- """ test Matti2.py with all parameters present and correct and sierra is enabled"""
- global TOTAL_TESTS_COUNT
- opts = Bunch(build_drive=self.build_drive,
- drop_file=path(r"%s/ats/ATSMattiDrop1.zip" %OUTPUT).normpath(),
- flash_images=self.flash_images,
- matti_sis_files=self.matti_sis_files,
- testasset_location=self.test_asset_path,
- template_loc=self.template_loc,
- sierra_enabled="True",
- test_profiles="bat, fute",
- matti_parameters="",
- matti_timeout="1200",
- sierra_parameters="--teardown",
- file_store=self.file_store,
- report_email="firstname.lastname@domain.com",
- testrun_name="matti test run",
- alias_name="alias",
- device_type="new_device",
- diamonds_build_url="http://diamonds.com/1234",
- email_format="simplelogger",
- email_subject="Matti test report",
- verbode="false")
-
- self.config = ats3.matti2.Configuration(opts)
- ats3.matti2.create_drop(self.config)
-
- xml_loc = os.path.join(os.environ['TEST_DATA'], 'data/matti/test_all_present.xml')
- stored_xml = self.read_xml(xml_loc, False).strip()
- drop_loc = os.path.join(OUTPUT, 'ats/ATSMattiDrop1.zip')
- generated_xml = self.read_xml(drop_loc, True).strip()
-
- if platform.system().lower() == "linux":
- assert stored_xml.replace('\r', '') in generated_xml
- else:
- assert stored_xml in generated_xml
-
- TOTAL_TESTS_COUNT -= 1
- teardown_module(TOTAL_TESTS_COUNT)
-
- def test_xml_if_sierra_is_not_enabled(self):
- """ test Matti2.py with all parameters present and correct and sierra is not enabled (or false)"""
- global TOTAL_TESTS_COUNT
- opts = Bunch(build_drive=self.build_drive,
- drop_file=path(r"%s/ats/ATSMattiDrop2.zip" %OUTPUT).normpath(),
- flash_images=self.flash_images,
- matti_sis_files=self.matti_sis_files,
- testasset_location=self.test_asset_path,
- template_loc=self.template_loc,
- sierra_enabled="False",
- test_profiles="bat, fute",
- matti_parameters="",
- matti_timeout="1200",
- sierra_parameters="--teardown",
- file_store=self.file_store,
- report_email="firstname.lastname@domain.com",
- testrun_name="matti test run",
- alias_name="alias",
- device_type="new_device",
- diamonds_build_url="http://diamonds.com/1234",
- email_format="simplelogger",
- email_subject="Matti test report",
- verbode="false")
-
- self.config = ats3.matti2.Configuration(opts)
- ats3.matti2.create_drop(self.config)
-
- xml_loc = os.path.join(os.environ['TEST_DATA'], 'data/matti/test_all_present_sierra_disabled.xml')
- stored_xml = self.read_xml(xml_loc, False).strip()
- drop_loc = os.path.join(OUTPUT, 'ats/ATSMattiDrop2.zip')
- generated_xml = self.read_xml(drop_loc, True).strip()
-
- if platform.system().lower() == "linux":
- assert stored_xml.replace('\r', '') in generated_xml
- else:
- assert stored_xml in generated_xml
-
- TOTAL_TESTS_COUNT -= 1
- teardown_module(TOTAL_TESTS_COUNT)
-
- def test_xml_if_sierra_is_enabled_template_location_is_missing(self):
- """ test Matti2.py with all parameters present and correct and if sierra is enabled but template location is used as default one"""
- global TOTAL_TESTS_COUNT
- opts = Bunch(build_drive=self.build_drive,
- drop_file=path(r"%s/ats/ATSMattiDrop3.zip" %OUTPUT).normpath(),
- flash_images=self.flash_images,
- matti_sis_files=self.matti_sis_files,
- testasset_location=self.test_asset_path,
- template_loc="",
- sierra_enabled="True",
- test_profiles="bat, fute",
- matti_parameters="",
- matti_timeout="1200",
- sierra_parameters="--teardown",
- file_store=self.file_store,
- report_email="firstname.lastname@domain.com",
- testrun_name="matti test run",
- alias_name="alias",
- device_type="new_device",
- diamonds_build_url="http://diamonds.com/1234",
- email_format="simplelogger",
- email_subject="Matti test report",
- verbode="false")
-
- self.config = ats3.matti2.Configuration(opts)
- ats3.matti2.create_drop(self.config)
-
- xml_loc = os.path.join(os.environ['TEST_DATA'], 'data/matti/test_all_present.xml')
- stored_xml = self.read_xml(xml_loc, False).strip()
- drop_loc = os.path.join(OUTPUT, 'ats/ATSMattiDrop3.zip')
- generated_xml = self.read_xml(drop_loc, True).strip()
-
- if platform.system().lower() == "linux":
- assert stored_xml.replace('\r', '') in generated_xml
- else:
- assert stored_xml in generated_xml
-
- TOTAL_TESTS_COUNT -= 1
- teardown_module(TOTAL_TESTS_COUNT)
-
-
-
-
-
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_quality.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_quality.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,59 @@
+#============================================================================
+#Name : test_quality.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+"""Test the archive.py module."""
+
+from __future__ import with_statement
+import os
+import unittest
+import logging
+import fileutils
+import pythoncorecpythontests.test_fileutils
+import integration.quality
+
+_logger = logging.getLogger('test.quality')
+
+
+root_test_dir = pythoncorecpythontests.test_fileutils.root_test_dir
+
+def setup_module():
+ """ Creates some test data files for file-related testing. """
+ pythoncorecpythontests.test_fileutils.setup_module()
+
+def teardown_module():
+ """ Cleans up test data files for file-related testing. """
+ pythoncorecpythontests.test_fileutils.teardown_module()
+
+
+class QualityTest(unittest.TestCase):
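+    """ Tests for the integration.quality policy validators. """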
+
+ def test_epl_validate_content(self):
+ """Tests loading policy ID's from CSV file for EPL"""
+ pattern = "distribution.policy.s60,distribution.policy,distribution.policy.pp"
+ ignoreroot = False
+ excludes = ".static_wa,_ccmwaid.inf"
+ validator = integration.quality.PolicyValidator(pattern, ignoreroot=ignoreroot, excludes=excludes)
+
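+        # Load the known policy IDs from the CSV test data, then validate one compliant and one non-compliant policy file.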
+ validator.epl_load_policy_ids(os.path.join(os.environ['TEST_DATA'], 'data/distribution.policy.extended_for_sf.id_status.csv'))
+
+ assert validator.epl_validate_content(os.path.join(os.environ['TEST_DATA'], 'data/distribution.policy.S60')) == True
+ assert validator.epl_validate_content(os.path.join(os.environ['TEST_DATA'], 'data/Invalid_distribution.policy.S60')) == False
+
+if __name__ == "__main__":
+ unittest.main()
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_sphinx_ext.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_sphinx_ext.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,121 @@
+#============================================================================
+#Name : test_sphinx_ext.py
+#Part of : Helium
+
+#Copyright (c) 2010 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+""" Test sphinx_ext module. """
+import re
+
+import logging
+import os
+import time
+import unittest
+import sys
+import mocker
+import sphinx_ext
+
+_logger = logging.getLogger('test.sphinx_ext')
+random_number = 10
+
+class SphinxTest(mocker.MockerTestCase):
+ """ Class for testing sphinx_ext module """
+ def __init__(self, methodName="runTest"):
+ mocker.MockerTestCase.__init__(self, methodName)
+
+ def setUp(self):
+ # some dummy input
+ self.inlineDocument = r'Macros list'
+ sphinx_ext.exit_with_failure = 0
+ sphinx_ext.database_path = os.path.join(os.environ['TEST_DATA'], "data", "test_database.xml")
+
+ def test_handle_hlm_role_callback(self):
+ """ Check roles and description unit."""
+ obj = _MockApp()
+ sphinx_ext.setup(obj)
+ assert 'hlm-t' in obj.dict.keys()
+ assert 'hlm-p' in obj.dict.keys()
+ assert 'hlm-m' in obj.dict.keys()
+ assert sphinx_ext.handle_hlm_role == obj.dict['hlm-t']
+ assert sphinx_ext.handle_hlm_role == obj.dict['hlm-p']
+ assert sphinx_ext.handle_hlm_role == obj.dict['hlm-m']
+ assert ['property', 'ant-prop', 'pair: %s; property'] in obj.descUnit
+ assert ['target', 'ant-target', 'pair: %s; target'] in obj.descUnit
+
+ def test_handle_hlm_role_target(self):
+ """ Check target to build the link """
+ obj = self.mocker.mock(count=False)
+ mocker.expect(obj.document).result(self.inlineDocument)
+ self.mocker.replay()
+ response = sphinx_ext.handle_hlm_role('hlm-t' , "", 'cmaker-install', random_number, obj)
+ assert "../../api/helium/project-compile.cmaker.html#cmaker-install" in response[0][0].children[0].attributes['refuri']
+
+ def test_handle_hlm_role_property(self):
+ """ Check property to build the link """
+ obj = self.mocker.mock(count=False)
+ mocker.expect(obj.document).result(self.inlineDocument)
+ self.mocker.replay()
+ response = sphinx_ext.handle_hlm_role('hlm-p' , "", 'cmaker-export', random_number, obj)
+ assert "../../api/helium/project-compile.cmaker.html#cmaker-export" in response[0][0].children[0].attributes['refuri']
+
+
+ def test_handle_hlm_role_macro(self):
+ """ Check macro to build the link """
+ obj = self.mocker.mock(count=False)
+ mocker.expect(obj.document).result(self.inlineDocument)
+ self.mocker.replay()
+ response = sphinx_ext.handle_hlm_role('hlm-m' , "", 'cmaker-export', random_number, obj)
+ assert "../../api/helium/project-compile.cmaker.html#cmaker-export" in response[0][0].children[0].attributes['refuri']
+
+ def test_handle_hlm_role_missing_api(self):
+ """ Check for failure when there are missing api's """
+ error = ""
+ line = ""
+ obj = self.mocker.mock(count=False)
+ mocker.expect(obj.document).result(self.inlineDocument)
+ mocker.expect(obj.reporter.error('Missing API doc for "cmaker-clean".', line=random_number)).result('Missing API doc for "cmaker-clean".')
+ self.mocker.replay()
+ sphinx_ext.handle_hlm_role('hlm-t' , "", 'cmaker-clean', random_number, obj)
+
+ def test_handle_hlm_role_missing_field_value(self):
+ """ Check for failure when there are missing fields for api's """
+ error = ""
+ line = ""
+ obj = self.mocker.mock(count=False)
+ mocker.expect(obj.document).result(self.inlineDocument)
+ mocker.expect(obj.reporter.error('Field value cannot be found for API field: "cmaker-export[summary]".', line=random_number)).result('Field value cannot be found for API field: "cmaker-export[summary]".')
+ self.mocker.replay()
+ sphinx_ext.handle_hlm_role('hlm-t' , "", 'cmaker-export[summary]', random_number, obj)
+
+ def test_handle_hlm_role_valid_field_value(self):
+ """ Check when there is '[' present """
+ obj = self.mocker.mock(count=False)
+ mocker.expect(obj.document).result(self.inlineDocument)
+ self.mocker.replay()
+ response = sphinx_ext.handle_hlm_role('hlm-t' , "", 'cmaker-export[location]', random_number, obj)
+ assert r"C:\Helium_svn\helium\tools\compile\cmaker.ant.xml:87:" in response[0][0].data
+
+class _MockApp:
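+    """ Minimal stand-in for the Sphinx application object; records registered roles and description units. """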
+
+ def __init__(self):
+ self.dict = {}
+ self.descUnit = []
+
+ def add_role(self, role, ref):
+ self.dict[role] = ref
+
+ def add_description_unit(self, text1, text2, text3):
+ self.descUnit.append([text1, text2, text3])
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_tdriver.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_tdriver.py Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,314 @@
+# -*- coding: latin-1 -*-
+
+#============================================================================
+#Name : test_tdriver.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+""" Testing TDriver framework. """
+
+# pylint: disable=E1101
+
+import logging
+logging.getLogger().setLevel(logging.INFO)
+import os
+#import shutil
+from path import path
+import ats3.tdriver
+import tempfile
+import zipfile
+import platform
+import unittest
+
+TEST_PATH = None
+TEST_FILES = {}
+TDRIVER = None
+OUTPUT = None
+SISFILES = None
+
+
+class Bunch(object):
+ """ Configuration object. Argument from constructor are converted into class attributes. """
+ def __init__(self, **kwargs):
+ self.__dict__.update(kwargs)
+
+def setup_module(_):
+ """ Setup the module. """
+ global TEST_PATH
+ global OUTPUT
+ global TDRIVER
+ global SISFILES
+
+ TEST_PATH = path(tempfile.mkdtemp())
+ component = TEST_PATH
+ component.joinpath("tdriver").makedirs()
+ for path_parts in (("tdriver_testcases", "profile", "all.sip"),
+ ("tdriver_testcases", "profile", "bat.sip"),
+ ("tdriver_testcases", "profile", "fute.sip"),
+ ("tdriver_testcases", "hwdata", "paths.pkg"),
+ ("tdriver_testcases", "hwdata", "file1.txt"),
+ ("tdriver_testcases", "hwdata", "settings.ini"),
+ ("tdriver_testcases", "tdriver_parameters", "tdriver_parameters.xml"),
+ ("tdriver_testcases", "unit_test1.rb"),
+ ("tdriver_testcases", "unit_test2.rb"),
+ ("output", "images", "image1.fpsx"),
+ ("output", "images", "image2.fpsx"),
+ ("sisfiles", "abc.sis"),
+ ("sisfiles", "xyz.sis"),
+ ("output", "ats", "temp.txt")):
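+        # Create an empty placeholder file (and its parent directories) for each test asset and record it in TEST_FILES.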
+ filepath = component.joinpath(*path_parts)
+ if not filepath.parent.exists():
+ filepath.parent.makedirs()
+ filepath.touch()
+ TEST_FILES.setdefault(path_parts[1], []).append(filepath)
+
+ OUTPUT = component.joinpath(r"output")
+ TDRIVER = component.joinpath("tdriver_testcases")
+ SISFILES = component.joinpath(r"sisfiles")
+
+ if not filepath.parent.exists():
+ filepath.parent.makedirs()
+ filepath.touch()
+
+ #mtc => tdriver_testcases
+ mtc = component.joinpath("tdriver_testcases")
+ mtc.joinpath("unit_test1.rb").write_text("unit_tests")
+ mtc.joinpath("unit_test2.rb").write_text("unit_tests")
+
+ # profiles
+ profiles = component.joinpath("tdriver_testcases", "profile")
+ profiles.joinpath("all.sip").write_text("sip profile")
+ profiles.joinpath("bat.sip").write_text("sip profile")
+ profiles.joinpath("fute.sip").write_text("sip profile")
+
+ #hwdata => hardware data
+ profiles = component.joinpath("tdriver_testcases", "hwdata")
+ profiles.joinpath("file1.txt").write_text("data file")
+ profiles.joinpath("settings.ini").write_text("settings initialization file")
+ profiles.joinpath("paths.pkg").write_text(
+ r"""
+ ;Language - standard language definitions
+ &EN
+
+ ; standard SIS file header
+ #{"BTEngTestApp"},(0x04DA27D5),1,0,0
+
+ ;Supports Series 60 v 3.0
+ (0x101F7961), 0, 0, 0, {"Series60ProductID"}
+
+ ;Localized Vendor Name
+ %{"BTEngTestApp"}
+
+ ;Unique Vendor name
+ :"Nokia"
+
+ ; Files to copy
+
+ "[PKG_LOC]\file1.txt"-"C:\Private\10202BE9\PERSISTS\file1.txt"
+ "[PKG_LOC]\settings.ini"-"c:\sys\settings.ini"
+ """.replace('\\', os.sep))
+
+
+def teardown_module():
+ """ stuff to do after running the tests """
+ path(TEST_PATH).rmtree()
+
+class TestTDriverTestPlan(unittest.TestCase):
+ """ test TDriverDrop.py """
+
+ def setUp(self):
+ """initialize TDriver Tests"""
+ self.file_store = OUTPUT
+ self.test_asset_path = TDRIVER
+ self.tdriver_sis_files = r"%s/abc.sis#f:\data\abc.sis#c:\abc.sis, %s/xyz.sis#f:\data\abc.sis#f:\xyz.sis" % (SISFILES, SISFILES)
+ self.build_drive = "j:"
+ self.drop_file = path(r"%s/ats/ATSTDriverDrop.zip" %OUTPUT).normpath()
+
+ image_files = r"%s/images/image1.fpsx, %s/images/image2.fpsx " % (OUTPUT, OUTPUT)
+ self.flash_images = image_files
+
+ self.template_loc = os.path.join(os.environ['TEST_DATA'], 'data/tdriver/tdriver_template.xml')
+ self.template_loc = os.path.normpath(self.template_loc)
+ self.tdriver_parameters = ""
+ self.config = None
+
+ def read_xml(self, file_location, zip_file=False):
+ """reads test.xml file if a path is given"""
+
+ xml_text = ""
+ file_location = path(file_location)
+ if zip_file:
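+            # Read test.xml from inside the generated drop archive; newlines and tabs are stripped to ease comparison.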
+ if zipfile.is_zipfile(file_location):
+ myzip = zipfile.ZipFile(file_location, 'r')
+ xml_text = myzip.read('test.xml')
+ myzip.close()
+ xml_text = xml_text.replace("\n", "")
+ xml_text = xml_text.replace("\t", "")
+
+ else:
+ hnd = open(file_location, 'r')
+ for line in hnd.readlines():
+ xml_text = xml_text + line.strip()
+
+ return xml_text
+
+ def test_xml_with_all_parameters(self):
+ """ test tdriver.py with all parameters present and correct and tdrunner is enabled"""
+ opts = Bunch(build_drive=self.build_drive,
+ drop_file=path(r"%s/ats/ATSTDriverDrop1.zip" %OUTPUT).normpath(),
+ flash_images=self.flash_images,
+ tdriver_sis_files=self.tdriver_sis_files,
+ testasset_location=self.test_asset_path,
+ template_loc=self.template_loc,
+ tdrunner_enabled="True",
+ test_profiles="bat, fute",
+ tdriver_parameters="",
+ tdriver_timeout="1200",
+ tdrunner_parameters="--teardown",
+ file_store="",
+ report_email="firstname.lastname@domain.com",
+ testrun_name="TDriver test run",
+ alias_name="alias",
+ device_type="new_device",
+ diamonds_build_url="http://diamonds.com/1234",
+ email_format="simplelogger",
+ email_subject="TDriver test report",
+ verbode="false")
+
+ self.config = ats3.tdriver.Configuration(opts)
+ ats3.tdriver.create_drop(self.config)
+
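+        # Compare the test.xml generated inside the drop with the expected XML stored under TEST_DATA.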
+ xml_loc = os.path.join(os.environ['TEST_DATA'], 'data/tdriver/test_all_present.xml')
+ stored_xml = self.read_xml(xml_loc, False).strip()
+ drop_loc = os.path.join(OUTPUT, 'ats/ATSTDriverDrop1.zip')
+ generated_xml = self.read_xml(drop_loc, True).strip()
+
+ if platform.system().lower() == "linux":
+ assert stored_xml.replace('\r', '') in generated_xml
+ else:
+ assert stored_xml in generated_xml
+
+
+ def test_xml_if_tdrunner_is_not_enabled(self):
+ """ test tdriver.py with all parameters present and correct and tdrunner is not enabled (or false)"""
+ opts = Bunch(build_drive=self.build_drive,
+ drop_file=path(r"%s/ats/ATSTDriverDrop2.zip" %OUTPUT).normpath(),
+ flash_images=self.flash_images,
+ tdriver_sis_files=self.tdriver_sis_files,
+ testasset_location=self.test_asset_path,
+ template_loc=self.template_loc,
+ tdrunner_enabled="False",
+ test_profiles="bat, fute",
+ tdriver_parameters="",
+ tdriver_timeout="1200",
+ tdrunner_parameters="--teardown",
+ file_store=r"\\network\drive",
+ report_email="firstname.lastname@domain.com",
+ testrun_name="TDriver test run",
+ alias_name="alias",
+ device_type="new_device",
+ diamonds_build_url="http://diamonds.com/1234",
+ email_format="simplelogger",
+ email_subject="TDriver test report",
+ verbode="false")
+
+ self.config = ats3.tdriver.Configuration(opts)
+ ats3.tdriver.create_drop(self.config)
+
+ xml_loc = os.path.join(os.environ['TEST_DATA'], 'data/tdriver/test_all_present_tdrunner_disabled.xml')
+ stored_xml = self.read_xml(xml_loc, False).strip()
+ drop_loc = os.path.join(OUTPUT, 'ats/ATSTDriverDrop2.zip')
+ generated_xml = self.read_xml(drop_loc, True).strip()
+
+ if platform.system().lower() == "linux":
+ assert stored_xml.replace('\r', '') in generated_xml
+ else:
+ assert stored_xml in generated_xml
+
+
+ def test_xml_if_tdrunner_is_enabled_template_location_is_missing(self):
+ """ test tdriver.py with all parameters present and correct and if tdrunner is enabled but template location is used as default one"""
+ opts = Bunch(build_drive=self.build_drive,
+ drop_file=path(r"%s/ats/ATSTDriverDrop3.zip" %OUTPUT).normpath(),
+ flash_images=self.flash_images,
+ tdriver_sis_files=self.tdriver_sis_files,
+ testasset_location=self.test_asset_path,
+ template_loc="",
+ tdrunner_enabled="True",
+ test_profiles="bat, fute",
+ tdriver_parameters="",
+ tdriver_timeout="1200",
+ tdrunner_parameters="--teardown",
+ file_store="",
+ report_email="firstname.lastname@domain.com",
+ testrun_name="TDriver test run",
+ alias_name="alias",
+ device_type="new_device",
+ diamonds_build_url="http://diamonds.com/1234",
+ email_format="simplelogger",
+ email_subject="TDriver test report",
+ verbode="false")
+
+ self.config = ats3.tdriver.Configuration(opts)
+ ats3.tdriver.create_drop(self.config)
+
+ xml_loc = os.path.join(os.environ['TEST_DATA'], 'data/tdriver/test_all_present.xml')
+ stored_xml = self.read_xml(xml_loc, False).strip()
+ drop_loc = os.path.join(OUTPUT, 'ats/ATSTDriverDrop3.zip')
+ generated_xml = self.read_xml(drop_loc, True).strip()
+
+ if platform.system().lower() == "linux":
+ assert stored_xml.replace('\r', '') in generated_xml
+ else:
+ assert stored_xml in generated_xml
+
+ def test_ctc(self):
+ """ test ctc """
+ opts = Bunch(build_drive=self.build_drive,
+ drop_file=path(r"%s/ats/ATSTDriverDrop3.zip" %OUTPUT).normpath(),
+ flash_images=self.flash_images,
+ tdriver_sis_files=self.tdriver_sis_files,
+ testasset_location=self.test_asset_path,
+ template_loc="",
+ tdrunner_enabled="True",
+ test_profiles="bat, fute",
+ tdriver_parameters="",
+ tdriver_timeout="1200",
+ tdrunner_parameters="--teardown",
+ file_store="",
+ report_email="firstname.lastname@domain.com",
+ testrun_name="TDriver test run",
+ alias_name="alias",
+ device_type="new_device",
+ diamonds_build_url="http://diamonds.com/1234",
+ email_format="simplelogger",
+ email_subject="TDriver test report",
+ verbode="false",
+ ctc_enabled=True)
+
+ self.config = ats3.tdriver.Configuration(opts)
+ ats3.tdriver.create_drop(self.config)
+
+ xml_loc = os.path.join(os.environ['TEST_DATA'], 'data/tdriver/test_ctc.xml')
+ stored_xml = self.read_xml(xml_loc, False).strip()
+ drop_loc = os.path.join(OUTPUT, 'ats/ATSTDriverDrop3.zip')
+ generated_xml = self.read_xml(drop_loc, True).strip()
+
+ if platform.system().lower() == "linux":
+ assert stored_xml.replace('\r', '') in generated_xml
+ else:
+ assert stored_xml in generated_xml
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_timeout_launcher.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_timeout_launcher.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncorecpythontests/test_timeout_launcher.py Mon Oct 11 11:16:47 2010 +0100
@@ -32,9 +32,9 @@
# Platform
WINDOWS = False
if sys.platform == "win32":
- import win32process
+# import win32process
import win32con
- import win32api
+# import win32api
WINDOWS = True
@@ -175,6 +175,6 @@
failed = False
try:
timeout_launcher.main()
- except Exception:
+ except (OSError, IOError):
failed = True
assert failed
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_amara.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_amara.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_amara.py Mon Oct 11 11:16:47 2010 +0100
@@ -31,12 +31,16 @@
"""test amara"""
 xxx = amara.parse(r'<commentLog><branchInfo>Add rofsfiles for usage in paged images</branchInfo></commentLog>')
assert str(xxx.commentLog.branchInfo) == 'Add rofsfiles for usage in paged images'
+ print "xxx: '" + str(xxx) + "'"
+ print xxx.xml()
 xxx = amara.parse(r'<commentLog><branchInfo>1</branchInfo><branchInfo>2</branchInfo></commentLog>')
for yyy in xxx.commentLog.branchInfo:
assert str(yyy) == '1'
break
-
+ print "xxx: '" + str(xxx) + "'"
+ print xxx.xml()
+
myxml = """"""
xcf = amara.parse(myxml)
assert xcf.DpComponent['name'] == 'dp.cfg.xml'
@@ -118,6 +122,8 @@
newppxml = amara.parse(ppxml)
oldppxml = amara.parse(ppxml)
+ assert 'SettingsData' in newppxml.xml_child_elements
+
oldppdata = {}
for oldfeature in oldppxml.SettingsData.ProductProfile.Feature:
oldppdata[str(oldfeature.Index)] = oldfeature.Value
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_ccm_results.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_ccm_results.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_ccm_results.py Mon Oct 11 11:16:47 2010 +0100
@@ -332,6 +332,30 @@
assert len(result.output[subproj]) == 2, "%s should contain 2 conflicts" % subproj.objectname
+
+ def test_ConflictsResult_object_result(self):
+ """ Validating ConflictsResult with object checking output."""
+ behave = {'test_update' : """
+Project: Cartman-Release_v4
+
+header.h-8.1.1:incl:tr1test1#1 tr1test1#30010, tr1test1#42792 Implicitly required by multiple tasks - parallel
+header2.h-5.2.1:prj_spec:tr1test1#2 tr1test1#28554 Implicitly required but not included - parallel
+
+Project: Cartman_sub03-next
+
+ No conflicts detected.
+
+ """}
+ session = MockResultSession(behave)
+ result = session.execute('test_update', ccm.ConflictsResult(session))
+ #_logger.debug(result.output)
+ # pylint: disable=E1103
+ assert len(result.output.keys()) == 2, "Should detect 2 projects."
+ subproj = session.create("Cartman-Release_v4:project:%s#1" % session.database())
+        # 3 conflicts will be detected, one per task.
+ assert len(result.output[subproj]) == 3, "%s should contain 3 conflicts" % subproj.objectname
+
+
def test_DataMapperListResult_result(self):
""" Validating DataMapperListResult."""
behave = {'test_query' : """>>>objectname>>>task5204-1:task:tr1test1>>>task_synopsis>>>Create Cartman_sub03>>>
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_ccmutil.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_ccmutil.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_ccmutil.py Mon Oct 11 11:16:47 2010 +0100
@@ -27,9 +27,8 @@
_logger = logging.getLogger('test.ccmutil')
logging.basicConfig(level=logging.INFO)
-def open_session(username=None, password=None, engine=None, dbpath=None, database=None, reuse=True):
+def open_session(username=None, password=None, engine=None, dbpath=None, database=None):
"""open session"""
- reuse = True #just for pylint
return MockSession(None, username, password, engine, dbpath, database)
nokia.nokiaccm.open_session = open_session
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_configuration.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_configuration.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_configuration.py Mon Oct 11 11:16:47 2010 +0100
@@ -157,8 +157,8 @@
-
-
+
+
@@ -166,8 +166,8 @@
-
-
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_gscm.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_gscm.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_gscm.py Mon Oct 11 11:16:47 2010 +0100
@@ -53,7 +53,7 @@
try:
_logger.info("get_db_path('not_valid_db'): %s" % nokia.gscm.get_db_path('not_valid_db'))
assert False, "Should raise Exception when giving unexisting db.'"
- except Exception, exc:
+ except IOError, exc:
_logger.info(exc)
def test_get_engine_host(self):
@@ -66,7 +66,7 @@
try:
_logger.info("get_engine_host('not_valid_db'): %s" % nokia.gscm.get_engine_host('not_valid_db'))
assert False, "Should raise Exception when giving unexisting db.'"
- except Exception, exc:
+ except IOError, exc:
_logger.info(exc)
def test_get_router_address(self):
@@ -78,5 +78,5 @@
try:
_logger.info("get_router_address('not_valid_db'): %s" % nokia.gscm.get_router_address('not_valid_db'))
assert False, "Should raise Exception when giving unexisting db.'"
- except Exception, exc:
+ except IOError, exc:
_logger.info(exc)
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_idoprep.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_idoprep.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_idoprep.py Mon Oct 11 11:16:47 2010 +0100
@@ -34,22 +34,6 @@
"""called before any of the tests are run"""
self.server = os.path.join(os.environ['TEST_DATA'], "data/symrec/GRACE/")
- def test_validate_grace(self):
- """Verifiying validate(grace) method"""
- self.assertRaises(Exception, idoprep.validate, None, 'test', 'test', 'test')
-
- def test_validate_service(self):
- """Verifiying validate(service) method"""
- self.assertRaises(Exception, idoprep.validate, 'test', None, 'test', 'test')
-
- def test_validate_product(self):
- """Verifiying validate(product) method"""
- self.assertRaises(Exception, idoprep.validate, 'test', 'test', None, 'test')
-
- def test_validate_release(self):
- """Verifiying validate(release) method"""
- self.assertRaises(Exception, idoprep.validate, 'test', 'test', 'test', None)
-
def test_get_s60_env_details_valid(self):
"""Verifiying get_s60_env_details(valid args) method"""
(fileDes, cacheFilename) = tempfile.mkstemp()
@@ -91,47 +75,16 @@
"""Verifiying create_ado_mapping method"""
(sysdefFileDes, sysdefConfig) = tempfile.mkstemp()
(adoFileDes, adoMappingFile) = tempfile.mkstemp()
- (adoqtyFileDes, adoQualityMappingFile) = tempfile.mkstemp()
buildDrive = tempfile.gettempdir()
adoQualityDirs = None
testSysdefFile = os.path.join(os.environ['TEST_DATA'], 'data', 'packageiad', 'layers.sysdef.xml')
os.write(sysdefFileDes, testSysdefFile)
os.close(sysdefFileDes)
- idoprep.create_ado_mapping(sysdefConfig, adoMappingFile, adoQualityMappingFile, buildDrive, adoQualityDirs)
+ idoprep.create_ado_mapping(sysdefConfig, adoMappingFile, 'false', buildDrive, adoQualityDirs)
os.unlink(sysdefConfig)
os.close(adoFileDes)
- os.close(adoqtyFileDes)
adoFile = open(adoMappingFile, 'r')
adoMappingFileContents = adoFile.readlines()
adoFile.close()
- adoQtyFile = open(adoQualityMappingFile, 'r')
- adoQualityMappingFileContents = adoQtyFile.readlines()
- adoQtyFile.close()
os.unlink(adoMappingFile)
- os.unlink(adoQualityMappingFile)
- assert len(adoMappingFileContents) >= 1 and len(adoQualityMappingFileContents) >= 1
-
- def test_create_ado_mapping_adoqualitydirs(self):
- """Verifiying create_ado_mapping (with valid adoqualitydirs) method"""
- (sysdefFileDes, sysdefConfig) = tempfile.mkstemp()
- (adoFileDes, adoMappingFile) = tempfile.mkstemp()
- (adoqtyFileDes, adoQualityMappingFile) = tempfile.mkstemp()
- buildDrive = tempfile.gettempdir()
- testSysdefFile = os.path.join(os.environ['TEST_DATA'], 'data', 'packageiad', 'layers.sysdef.xml')
- location = ido.get_sysdef_location(testSysdefFile)
- adoQualityDirs = (os.path.normpath(os.path.join(buildDrive, os.environ['EPOCROOT'], location)))
- os.write(sysdefFileDes, testSysdefFile)
- os.close(sysdefFileDes)
- idoprep.create_ado_mapping(sysdefConfig, adoMappingFile, adoQualityMappingFile, buildDrive, adoQualityDirs)
- os.unlink(sysdefConfig)
- os.close(adoFileDes)
- os.close(adoqtyFileDes)
- adoFile = open(adoMappingFile, 'r')
- adoMappingFileContents = adoFile.readlines()
- adoFile.close()
- adoQtyFile = open(adoQualityMappingFile, 'r')
- adoQualityMappingFileContents = adoQtyFile.readlines()
- adoQtyFile.close()
- os.unlink(adoMappingFile)
- os.unlink(adoQualityMappingFile)
- assert len(adoMappingFileContents) >= 1 and len(adoQualityMappingFileContents) >= 1
+ assert len(adoMappingFileContents) >= 1
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_preparation.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_preparation.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_preparation.py Mon Oct 11 11:16:47 2010 +0100
@@ -18,8 +18,6 @@
#===============================================================================
""" Testing preparation module """
-# pylint: disable=R0201
-
import tempfile
from shutil import rmtree
import os
@@ -496,9 +494,9 @@
"""Emulating project.exists method"""
return True
- def snapshot(self, target_dir, status):
+ def snapshot(self, target_dir, _):
"""Emulating project.snapshot method"""
- print "Snapshot created"
+ print "Snapshot created: target_dir = " + str(target_dir)
def update(self, status, replace_subprojects, update_keepgoing, result):
"""Emulating project.update method"""
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_sysdef_io.py
--- a/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_sysdef_io.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/pythoncoretests/test_sysdef_io.py Mon Oct 11 11:16:47 2010 +0100
@@ -26,41 +26,45 @@
import unittest
from sysdef.io import FlashImageSizeWriter
+
_logger = logging.getLogger('test.sysdef.io')
logging.basicConfig(level=logging.INFO)
+
class FlashImageSizeWriterTest(unittest.TestCase):
"""Verifiying sysdef/io module"""
def test_write(self):
"""Verifiying write method"""
- (fileDes, filename) = tempfile.mkstemp()
- flashWriter = FlashImageSizeWriter(filename)
- oldOut = flashWriter._out
- flashWriter._out = duppedOut = StringIO()
+ output = StringIO()
+ flashWriter = FlashImageSizeWriter(output)
config_list = ("testconfig1","testconfig2")
flashWriter.write(_sysdef(), config_list)
- flashWriter._out = oldOut
+ assert len(output.getvalue().splitlines()) == 9
flashWriter.close()
- os.close(fileDes)
- os.unlink(filename)
- assert len(duppedOut.getvalue().splitlines()) == 9
+
# dummy classes to emulate sysdef configuration
class _sysdef():
"""Emulate sysdef """
def __init__(self):
self.configurations = {"name1": _config("testconfig1"), "name2" : _config("testconfig2")}
+
+
class _config():
"""Emulate config"""
def __init__(self, name):
self.name = name
self.units = (_unit(), _unit())
+
+
class _unit():
"""Emulate unit"""
def __init__(self):
self.name = "testUnit"
self.binaries = (_binary(), _binary())
+
+
class _binary():
"""Emulate binary"""
def __init__(self):
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/sphinx_ext.py
--- a/buildframework/helium/sf/python/pythoncore/lib/sphinx_ext.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/sphinx_ext.py Mon Oct 11 11:16:47 2010 +0100
@@ -1,7 +1,7 @@
#============================================================================
#Name : sphinx_ext.py
#Part of : Helium
-
+#
#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
#All rights reserved.
#This component and the accompanying materials are made available
@@ -16,61 +16,126 @@
#
#Description:
#===============================================================================
-""" aids to creating the API documentation"""
+""" Custom Sphinx operations to help with Helium doc linking. """
+
import os
import re
+import atexit
from docutils import nodes, utils
from docutils.parsers.rst import directives
import amara
+tree = None
treecache = None
+database_path = os.path.abspath(os.path.join(os.getcwd() + '/build', 'public_database.xml'))
+
+# Error count for custom sphinx operations
+exit_with_failure = 0
-def handle_hlm_role(role, rawtext, text, lineno, inliner,
- options=None, content=None):
+def check_cached_database():
+ """ Check the Ant database XML data is cached as needed. """
+ global tree
+ global treecache
+
+ if tree == None or treecache == None:
+ f = open(database_path)
+ tree = amara.parse(f)
+
+ treecache = {}
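+        # Map each named child element (target, property, macro) to its owning project or antlib.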
+ for project in tree.antDatabase.project:
+ for x in project.xml_children:
+ if hasattr(x, 'name'):
+ treecache[str(x.name)] = [str(project.name),'project']
+ if hasattr(tree.antDatabase, "antlib"):
+ for antlib in tree.antDatabase.antlib:
+ for x in antlib.xml_children:
+ if hasattr(x, 'name'):
+ treecache[str(x.name)] = [str(antlib.name),'antlib']
+
+def handle_hlm_role(role, _, text, lineno, inliner, options=None, content=None): # pylint: disable=W0613
""" Process a custom Helium ReStructuredText role to link to a target, property or macro. """
if options == None:
options = {}
if content == None:
content = []
+
+ # See if the role is used to embed a API element field
+ if '[' in text:
+ role_data = _embed_role_field(role, text, lineno, inliner)
+ else:
+ role_data = _build_link(text, lineno, inliner, options)
+
+ return role_data
+
+def _embed_role_field(role, text, lineno, inliner):
+ """ Insert the contents of an element field.
+
+ These take the form of e.g. hlm-p:`build.drive[summary]`
+ """
+ messages = []
+ node = nodes.Text('', '')
+
+ field_match = re.search("(.*?)\[(.*?)\]", text)
+ if field_match != None:
+ element_name = field_match.group(1)
+ field_name = field_match.group(2)
+ if field_name != None and len(field_name) > 0:
+ field_value = find_field_value(role, element_name, field_name)
+ if field_value != None and len(field_value) > 0:
+ node = nodes.Text(field_value, utils.unescape(field_value))
+ else:
+ messages.append(inliner.reporter.error(('Field value cannot be found for API field: "%s".' % text), line=lineno))
+ else:
+ messages.append(inliner.reporter.error(('Invalid field name for API value replacement: "%s".' % text), line=lineno))
+ return [node], messages
+
+def find_field_value(role, element_name, field_name):
+ """ Gets the value of a field from an API element. """
+ check_cached_database()
+
+ field_value = None
+ element = tree.xml_xpath('//' + roles[role] + "[name='" + element_name + "']")
+
+ if element != None and len(element) == 1:
+ field_value_list = element[0].xml_xpath(field_name)
+ if field_value_list != None and len(field_value_list) == 1:
+ field_value = str(field_value_list[0])
+ return field_value
+
+
+def _build_link(text, lineno, inliner, options):
+ """ Build an HTML link to the API doc location for API element. """
+ global exit_with_failure
full_path_match = re.search(r"'" % (exit_with_failure) )
+
+# Register a cleanup routine to handle exit with failure
+atexit.register(check_for_failure)
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/symrec.py
--- a/buildframework/helium/sf/python/pythoncore/lib/symrec.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/symrec.py Mon Oct 11 11:16:47 2010 +0100
@@ -306,7 +306,10 @@
result.append(ServicePack(spack))
return result
- filename = property(lambda self:self._filename)
+ @property
+ def filename(self):
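+        """ The metadata file name. """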
+ return self._filename
+
service = property(lambda self:self.get_releasedetails_info('service'), lambda self, value:self.set_releasedetails_info('service', value))
product = property(lambda self:self.get_releasedetails_info('product'), lambda self, value:self.set_releasedetails_info('product', value))
release = property(lambda self:self.get_releasedetails_info('release'), lambda self, value:self.set_releasedetails_info('release', value))
@@ -346,6 +349,8 @@
def is_valid(self, checkmd5=True, checkPath=True):
""" Run the validation mechanism. """
+ valid = True
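+        # Track validity in this flag rather than returning early, so the caller gets a single combined result.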
+
status = os.path.join(os.path.dirname(self._filename), 'HYDRASTATUS.xml')
if os.path.exists(status):
hydraxml = xml.dom.minidom.parse(open(status, "r"))
@@ -353,53 +358,63 @@
if t_name.nodeType == t_name.TEXT_NODE:
if t_name.nodeValue != 'Ready':
LOGGER.error("HYDRASTATUS.xml is not ready")
- return False
- if checkPath:
+ valid = False
+
+ if valid and checkPath:
if os.path.basename(self.location) != self.release:
LOGGER.error("Release doesn't match.")
- return False
+ valid = False
if os.path.basename(os.path.dirname(self.location)) != self.product:
LOGGER.error("Product doesn't match.")
- return False
+ valid = False
if os.path.basename(os.path.dirname(os.path.dirname(self.location))) != self.service:
LOGGER.error("Service doesn't match.")
- return False
+ valid = False
- for name in self.keys():
- path = os.path.join(self.location, name)
- if not os.path.exists(path):
- LOGGER.error("%s doesn't exist." % path)
- return False
- try:
- LOGGER.debug("Trying to open %s" % path)
- content_file = open(path)
- content_file.read(1)
- except IOError:
- LOGGER.error("%s is not available yet" % path)
- return False
-
- if checkmd5 and self[name].has_key('md5checksum'):
- if self[name]['md5checksum'] != None:
- if fileutils.getmd5(path).lower() != self[name]['md5checksum']:
- LOGGER.error("%s md5checksum missmatch." % path)
- return False
-
- for spack in self.servicepacks:
- for name in spack.files:
+ if valid:
+ for name in self.keys():
path = os.path.join(self.location, name)
if not os.path.exists(path):
LOGGER.error("%s doesn't exist." % path)
- return False
- for name in spack.instructions:
- path = os.path.join(self.location, name)
- if not os.path.exists(path):
- LOGGER.error("%s doesn't exist." % path)
- return False
+ valid = False
+ break
+ try:
+ LOGGER.debug("Trying to open %s" % path)
+ content_file = open(path)
+ content_file.read(1)
+ except IOError:
+ LOGGER.error("%s is not available yet" % path)
+ valid = False
+ break
+
+ if checkmd5 and self[name].has_key('md5checksum'):
+ if self[name]['md5checksum'] != None:
+ if fileutils.getmd5(path).lower() != self[name]['md5checksum']:
+ LOGGER.error("%s md5checksum missmatch." % path)
+ valid = False
+
+ if valid:
+ for spack in self.servicepacks:
+ if valid:
+ for name in spack.files:
+ path = os.path.join(self.location, name)
+ if not os.path.exists(path):
+ LOGGER.error("%s doesn't exist." % path)
+ valid = False
+ break
+ for name in spack.instructions:
+ path = os.path.join(self.location, name)
+ if not os.path.exists(path):
+ LOGGER.error("%s doesn't exist." % path)
+ valid = False
+ break
- dependency = self.get_dependsof()
- if dependency != None:
- return ValidateReleaseMetadata(dependency.filename).is_valid(checkmd5)
- return True
+ if valid:
+ dependency = self.get_dependsof()
+ if dependency != None:
+ return ValidateReleaseMetadata(dependency.filename).is_valid(checkmd5)
+
+ return valid
class MetadataMerger(object):
@@ -549,11 +564,11 @@
ValidateReleaseMetadataCached.__init__(self, filename)
self.location = os.path.dirname(filename)
- def is_valid(self, checkmd5=True):
+ def is_valid(self, checkmd5=True, checkPath=True):
""" Run the validation mechanism. """
tickler_path = os.path.join(self.location,"TICKLER")
if not os.path.exists(tickler_path):
LOGGER.error("Release not available yet")
return False
else:
- return ValidateReleaseMetadataCached.is_valid(self, checkmd5)
+ return ValidateReleaseMetadataCached.is_valid(self, checkmd5, checkPath)
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/symrec.py.orig
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/lib/symrec.py.orig Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,564 @@
+#============================================================================
+#Name : symrec.py
+#Part of : Helium
+
+#Copyright (c) 2009 Nokia Corporation and/or its subsidiary(-ies).
+#All rights reserved.
+#This component and the accompanying materials are made available
+#under the terms of the License "Eclipse Public License v1.0"
+#which accompanies this distribution, and is available
+#at the URL "http://www.eclipse.org/legal/epl-v10.html".
+#
+#Initial Contributors:
+#Nokia Corporation - initial contribution.
+#
+#Contributors:
+#
+#Description:
+#===============================================================================
+
+""" SYMREC metadata file generation. """
+import xml.dom.minidom
+import codecs
+import os
+import re
+import logging
+import fileutils
+import csv
+
+LOGGER = logging.getLogger("symrec")
+logging.basicConfig(level=logging.INFO)
+
+def _cleanup_list(input):
+ """cleanup list"""
+ result = []
+ for chars in input:
+ if chars is not None and chars.strip() != "":
+ result.append(chars)
+ return result
+
+def xml_setattr(node, attr, value):
+ """ Create the attribute if needed. """
+ node.setAttribute(attr, value)
+
+def is_child_text_only(node):
+ """ Returns true if child node are all from TEXT_NODE type. """
+ for child in node.childNodes:
+ if child.nodeType != xml.dom.minidom.Node.TEXT_NODE:
+ return False
+ return True
+
+
+def ignore_whitespace_writexml(self, writer, indent="", addindent="", newl=""):
+ """ This version of writexml will ignore whitespace text to alway render
+ the output in a structure way.
+ indent = current indentation
+ addindent = indentation to add to higher levels
+ newl = newline string
+ """
+ writer.write(indent + "<" + self.tagName)
+
+ attrs = self._get_attributes()
+ a_names = attrs.keys()
+ a_names.sort()
+
+ for a_name in a_names:
+ writer.write(" %s=\"" % a_name)
+ xml.dom.minidom._write_data(writer, attrs[a_name].value)
+ writer.write("\"")
+ if self.childNodes:
+ writer.write(">")
+ if is_child_text_only(self):
+ for node in self.childNodes:
+ node.writexml(writer, '', '', '')
+ writer.write("%s>%s" % (self.tagName, newl))
+ else:
+ writer.write(newl)
+ for node in self.childNodes:
+ if node.nodeType == xml.dom.minidom.Node.TEXT_NODE and node.data.isspace():
+ pass
+ else:
+ node.writexml(writer, indent + addindent, addindent, newl)
+ writer.write("%s%s>%s" % (indent, self.tagName, newl))
+ else:
+ writer.write("/>%s" % (newl))
+
+xml.dom.minidom.Element.writexml = ignore_whitespace_writexml
+
+
+class ServicePack(object):
+ """ Create a ServicePack """
+ def __init__(self, node):
+ self.__xml = node
+
+ @property
+ def name(self):
+ """name"""
+ return self.__xml.getAttribute('name')
+
+ @property
+ def files(self):
+ """files"""
+ result = []
+ for filen in self.__xml.getElementsByTagName('file'):
+ result.append(filen.getAttribute('name'))
+ return result
+
+ @property
+ def instructions(self):
+ """instructions"""
+ result = []
+ for instr in self.__xml.getElementsByTagName('instructions'):
+ result.append(instr.getAttribute('name'))
+ return result
+
+class ReleaseMetadata(object):
+ """ Create or read Metadata XML from SYMREC/SYMDEC. """
+
+ def __init__(self, filename, service=None, product=None, release=None):
+ self._filename = filename
+ if filename and os.path.exists(filename):
+ self._xml = xml.dom.minidom.parse(open(filename, "r"))
+ releaseInformation = self._xml.getElementsByTagName(u"releaseInformation")
+ if releaseInformation != []:
+ self._releaseInformation = releaseInformation[0]
+ else:
+ self._releaseInformation = self._xml.createElement(u"releaseInformation")
+ releaseDetails = self._xml.getElementsByTagName(u'releaseDetails')
+ if releaseDetails != []:
+ self._releaseDetails = releaseDetails[0]
+ else:
+ self._releaseDetails = self._xml.createElement(u'releaseDetails')
+ releaseFiles = self._xml.getElementsByTagName(u'releaseFiles')
+ if releaseFiles != []:
+ self._releaseFiles = releaseFiles[0]
+ else:
+ self._releaseFiles = self._xml.createElement(u'releaseFiles')
+
+ if service != None:
+ self.service = service
+ if product != None:
+ self.product = product
+ if release != None:
+ self.release = release
+ elif service!=None and product!=None and release!=None:
+ self._xml = xml.dom.minidom.Document()
+ self._releaseInformation = self._xml.createElement(u"releaseInformation")
+ self._xml.appendChild(self._releaseInformation)
+ self._releaseDetails = self._xml.createElement(u'releaseDetails')
+ self._releaseInformation.appendChild(self._releaseDetails)
+ releaseID = self._xml.createElement(u'releaseID')
+ self._releaseDetails.appendChild(releaseID)
+
+ # service
+ serv = self._xml.createElement(u'service')
+ xml_setattr(serv, 'name', unicode(service))
+ releaseID.appendChild(serv)
+ # product
+ prod = self._xml.createElement(u'product')
+ xml_setattr(prod, 'name', unicode(product))
+ releaseID.appendChild(prod)
+ # release
+ rel = self._xml.createElement(u'release')
+ xml_setattr(rel, 'name', unicode(release))
+ releaseID.appendChild(rel)
+
+ # releaseFiles
+ self._releaseFiles = self._xml.createElement(u'releaseFiles')
+ self._releaseInformation.appendChild(self._releaseFiles)
+
+ # releaseFiles
+ self._releaseInformation.appendChild(self._xml.createElement(u'externalFiles'))
+ else:
+ raise Exception("Error metadata file doesn't exists.")
+
+
+ def get_dependsof(self):
+ """ Return a ReleaseMetada object pointing to the dependency release. """
+ if self.dependsof_service != None and self.dependsof_product != None and self.dependsof_release != None:
+ filename = os.path.join(os.path.dirname(self._filename), "../../..",
+ self.dependsof_service,
+ self.dependsof_product,
+ self.dependsof_release)
+ return ReleaseMetadata(find_latest_metadata(filename))
+ else:
+ return None
+
+
+ def set_dependsof(self, filename):
+ """ Setting the dependency release. """
+ metadata = ReleaseMetadata(filename)
+ self.dependsof_service = metadata.service
+ self.dependsof_product = metadata.product
+ self.dependsof_release = metadata.release
+
+ def add_package(self, name, type=None, default=True, filters=None, extract="single", md5checksum=None, size=None):
+ """ Adding a package to the metadata file. """
+ # check if update mode
+ package = None
+
+ for pkg in self._xml.getElementsByTagName('package'):
+ if (pkg.getAttribute('name').lower() == os.path.basename(name).lower()):
+ package = pkg
+ break
+
+ # if not found create new package.
+ if package is None:
+ package = self._xml.createElement(u'package')
+ self._releaseFiles.appendChild(package)
+
+ xml_setattr(package, 'name', os.path.basename(name))
+ if type != None:
+ xml_setattr(package, 'type', type)
+ else:
+ xml_setattr(package, 'type', os.path.splitext(name)[1].lstrip('.'))
+ xml_setattr(package, 'default', str(default).lower())
+ xml_setattr(package, 'extract', extract)
+ if filters and len(filters)>0:
+ xml_setattr(package, 'filters', ','.join(filters))
+ xml_setattr(package, 's60filter', ','.join(filters))
+ else:
+ xml_setattr(package, 'filters', '')
+ xml_setattr(package, 's60filter', '')
+ if md5checksum != None:
+ xml_setattr(package, unicode("md5checksum"), unicode(md5checksum))
+ if size != None:
+ xml_setattr(package, unicode("size"), unicode(size))
+
+
+ def keys(self):
+ """keys"""
+ keys = []
+ for pkg in self._releaseFiles.getElementsByTagName('package'):
+ keys.append(pkg.getAttribute('name'))
+ return keys
+
+ def __getitem__(self, key):
+ for pkg in self._releaseFiles.getElementsByTagName('package'):
+ if pkg.getAttribute('name').lower() == key.lower():
+ filters = []
+ s60filters = []
+ md5checksum = None
+ size = None
+ if pkg.hasAttribute(u'filters'):
+ filters = _cleanup_list(pkg.getAttribute('filters').split(','))
+ if pkg.hasAttribute(u's60filter'):
+ s60filters = _cleanup_list(pkg.getAttribute('s60filter').split(','))
+ if pkg.hasAttribute(u'md5checksum'):
+ md5checksum = pkg.getAttribute('md5checksum')
+ if pkg.hasAttribute(u'size'):
+ size = pkg.getAttribute('size')
+ return {'type': pkg.getAttribute('type'), 'extract': pkg.getAttribute('extract'), 'default': (pkg.getAttribute('default')=="true"), \
+ 'filters': filters, 's60filter': s60filters, 'md5checksum': md5checksum, 'size': size}
+ raise Exception("Key '%s' not found." % key)
+
+ def __setitem__(self, key, value):
+ self.add_package(key, value['type'], value['default'], value['filters'], value['extract'], value['md5checksum'], value['size'])
+
+ def set_releasedetails_info(self, name, value, details="releaseID"):
+ """ Generic function to set releaseid info. """
+ detailsnode = None
+ if self._releaseDetails.getElementsByTagName(details) == []:
+ detailsnode = self._xml.createElement(details)
+ self._releaseDetails.appendChild(detailsnode)
+ else:
+ detailsnode = self._releaseDetails.getElementsByTagName(details)[0]
+ namenode = None
+ if detailsnode.getElementsByTagName(name) == []:
+ namenode = self._xml.createElement(name)
+ namenode.setAttribute(u'name', unicode(value))
+ detailsnode.appendChild(namenode)
+ else:
+ namenode = detailsnode.getElementsByTagName(name)[0]
+ namenode.setAttribute('name', value)
+
+
+ def get_releasedetails_info(self, name, details="releaseID"):
+ """ Generic function to extract releaseid info. """
+ for group in self._releaseDetails.getElementsByTagName(details):
+ for i in group.getElementsByTagName(name):
+ return i.getAttribute('name')
+ return None
+
+ def getVariantPackage(self, variant_name):
+ """get variant package"""
+ for variant in self._xml.getElementsByTagName('variant'):
+ if variant.getAttribute('name').lower() == variant_name.lower():
+ for xxx in variant.getElementsByTagName('file'):
+ return xxx.getAttribute('name')
+
+ def xml(self):
+ """ Returning the XML as a string. """
+ return self._xml.toprettyxml()
+
+ def save(self, filename = None):
+ """ Saving the XML into the provided filename. """
+ if filename == None:
+ filename = self._filename
+ file_object = codecs.open(os.path.join(filename), 'w', "utf_8")
+ file_object.write(self.xml())
+ file_object.close()
+
+ @property
+ def servicepacks(self):
+ """ Getting the service pack names. """
+ result = []
+ for spack in self._releaseInformation.getElementsByTagName('servicePack'):
+ result.append(ServicePack(spack))
+ return result
+
+ filename = property(lambda self:self._filename)
+ service = property(lambda self:self.get_releasedetails_info('service'), lambda self, value:self.set_releasedetails_info('service', value))
+ product = property(lambda self:self.get_releasedetails_info('product'), lambda self, value:self.set_releasedetails_info('product', value))
+ release = property(lambda self:self.get_releasedetails_info('release'), lambda self, value:self.set_releasedetails_info('release', value))
+ dependsof_service = property(lambda self:self.get_releasedetails_info('service', 'dependsOf'), lambda self, value:self.set_releasedetails_info('service', value, 'dependsOf'))
+ dependsof_product = property(lambda self:self.get_releasedetails_info('product', 'dependsOf'), lambda self, value:self.set_releasedetails_info('product', value, 'dependsOf'))
+ dependsof_release = property(lambda self:self.get_releasedetails_info('release', 'dependsOf'), lambda self, value:self.set_releasedetails_info('release', value, 'dependsOf'))
+ baseline_service = property(lambda self:self.get_releasedetails_info('service', 'previousBaseline'), lambda self, value:self.set_releasedetails_info('service', value, 'previousBaseline'))
+ baseline_product = property(lambda self:self.get_releasedetails_info('product', 'previousBaseline'), lambda self, value:self.set_releasedetails_info('product', value, 'previousBaseline'))
+ baseline_release = property(lambda self:self.get_releasedetails_info('release', 'previousBaseline'), lambda self, value:self.set_releasedetails_info('release', value, 'previousBaseline'))
+
+
+class MD5Updater(ReleaseMetadata):
+ """ Update Metadata XML already created from SYMREC/SYMDEC. """
+ def __init__(self, filename):
+ ReleaseMetadata.__init__(self, filename)
+ self._filepath = os.path.dirname(filename)
+
+ def update(self):
+ """ Update each existing package md5checksum and size attribute."""
+ for name in self.keys():
+ fullname = os.path.join(self._filepath, name)
+ if os.path.exists(fullname):
+ LOGGER.info("Updating %s MD5." % fullname)
+ md5value = None
+ for trial in range(3):
+ try:
+ md5value = fileutils.getmd5(fullname)
+ result = self[name]
+ result['md5checksum'] = unicode(md5value)
+ result['size'] = unicode(os.path.getsize(fullname))
+ self[name] = result
+ break
+ except Exception, e:
+ LOGGER.warning(str(e))
+ else:
+ raise Exception('Error determining %s MD5' % fullname)
+
+class ValidateReleaseMetadata(ReleaseMetadata):
+    """ This class validates that a metadata file is stored in the correct location and
+        that all its dependencies exist.
+ """
+ def __init__(self, filename):
+ ReleaseMetadata.__init__(self, filename)
+ self.location = os.path.dirname(filename)
+
+ def is_valid(self, checkmd5=True, checkPath=True):
+ """ Run the validation mechanism. """
+ status = os.path.join(os.path.dirname(self._filename), 'HYDRASTATUS.xml')
+ if os.path.exists(status):
+ hydraxml = xml.dom.minidom.parse(open(status, "r"))
+ for t_name in hydraxml.getElementsByTagName('state')[0].childNodes:
+ if t_name.nodeType == t_name.TEXT_NODE:
+ if t_name.nodeValue != 'Ready':
+ LOGGER.error("HYDRASTATUS.xml is not ready")
+ return False
+ if checkPath:
+ if os.path.basename(self.location) != self.release:
+ LOGGER.error("Release doesn't match.")
+ return False
+ if os.path.basename(os.path.dirname(self.location)) != self.product:
+ LOGGER.error("Product doesn't match.")
+ return False
+ if os.path.basename(os.path.dirname(os.path.dirname(self.location))) != self.service:
+ LOGGER.error("Service doesn't match.")
+ return False
+
+ for name in self.keys():
+ path = os.path.join(self.location, name)
+ if not os.path.exists(path):
+ LOGGER.error("%s doesn't exist." % path)
+ return False
+ try:
+ LOGGER.debug("Trying to open %s" % path)
+ content_file = open(path)
+ content_file.read(1)
+ except IOError:
+ LOGGER.error("%s is not available yet" % path)
+ return False
+
+ if checkmd5 and self[name].has_key('md5checksum'):
+ if self[name]['md5checksum'] != None:
+ if fileutils.getmd5(path).lower() != self[name]['md5checksum']:
+                        LOGGER.error("%s md5checksum mismatch." % path)
+ return False
+
+ for spack in self.servicepacks:
+ for name in spack.files:
+ path = os.path.join(self.location, name)
+ if not os.path.exists(path):
+ LOGGER.error("%s doesn't exist." % path)
+ return False
+ for name in spack.instructions:
+ path = os.path.join(self.location, name)
+ if not os.path.exists(path):
+ LOGGER.error("%s doesn't exist." % path)
+ return False
+
+ dependency = self.get_dependsof()
+ if dependency != None:
+ return ValidateReleaseMetadata(dependency.filename).is_valid(checkmd5)
+ return True
+
+class MetadataMerger(object):
+ """ Merge packages definition to the root metadata. """
+
+ def __init__(self, metadata):
+ """ Construct a metadata merger providing root metadata filename. """
+ self._metadata = ReleaseMetadata(metadata)
+
+ def merge(self, filename):
+ """ Merge the content of filename into the root metadata. """
+ metadata = ReleaseMetadata(filename)
+ for name in metadata.keys():
+ if name in self._metadata.keys():
+ LOGGER.warning('Package %s already declared, overriding previous definition!' % name)
+ self._metadata[name] = metadata[name]
+
+ def xml(self):
+ """ Returning the XML as a string. """
+ return self._metadata.xml()
+
+ def save(self, filename = None):
+ """ Saving the XML into the provided filename. """
+ return self._metadata.save(filename)
+
+class Metadata2TDD(ReleaseMetadata):
+ """ Convert Metadata to a TDD file """
+ def __init__(self, filename, includes=None, excludes=None):
+ ReleaseMetadata.__init__(self, filename)
+ if includes is None:
+ includes = []
+ if excludes is None:
+ excludes = []
+ self.location = os.path.dirname(filename)
+ self.includes = includes
+ self.excludes = excludes
+
+ def archives_to_tdd(self, metadata):
+        """ Generate unarchiving steps for the packages declared in the given metadata. """
+ tdd = "\t[\n"
+ for name in metadata.keys():
+ path_ = os.path.join(os.path.dirname(metadata.filename), name)
+ if (((len(self.includes) == 0) and metadata[name]['extract']) or (self.includes in metadata[name]['s60filter'])) and self.excludes not in metadata[name]['s60filter']:
+ tdd += "\t\t{\n"
+ tdd += "\t\t\t\"command\": \"unzip_%s\",\n" % metadata[name]['extract']
+ tdd += "\t\t\t\"src\": \"%s\",\n" % os.path.normpath(path_).replace('\\', '/')
+ tdd += "\t\t},\n"
+ tdd += "\t],\n"
+ return tdd
+
+ def to_tdd(self):
+        """ Generate the TDD content: a list of the unarchiving steps for this release and its dependency. """
+ tdd = "[\n"
+ # generates unarchiving steps for dependency
+ dependency = self.get_dependsof()
+ if dependency != None:
+ tdd += self.archives_to_tdd(dependency)
+ # generates unarchiving steps
+ tdd += self.archives_to_tdd(self)
+ tdd += "]\n"
+ return tdd
+
+
+
+def find_latest_metadata(releasedir):
+    """ Find the latest release metadata file under the given release directory. """
+ try:
+ metadatas = []
+ for filename in os.listdir(releasedir):
+ if re.match(r'^release_metadata(_\d+)?\.xml$', filename, re.I) is not None:
+ LOGGER.debug("Found %s" % filename)
+ metadatas.append(filename)
+        # sort in reverse order so the newest metadata file comes first
+ metadatas.sort(reverse=True)
+ if len(metadatas) > 0:
+ return os.path.normpath(os.path.join(releasedir, metadatas[0]))
+ except Exception, exc:
+ LOGGER.error(exc)
+ return None
+ return None
+
+class ValidateReleaseMetadataCached(ValidateReleaseMetadata):
+ """ Cached version of the metadata validation. """
+ def __init__(self, filename, cachefile=None):
+ ValidateReleaseMetadata.__init__(self, filename)
+ self.__cachefile = cachefile
+
+ def is_valid(self, checkmd5=True, checkPath=True):
+ """ Check if file is in the local cache.
+ Add valid release to the cache.
+ """
+ metadatas = self.load_cache()
+ if self.in_cache(metadatas, os.path.normpath(self._filename)):
+ LOGGER.debug("Release found in cache.")
+ return self.value_from_cache(metadatas, os.path.normpath(self._filename))
+ else:
+ result = ValidateReleaseMetadata.is_valid(self, checkmd5, checkPath)
+ LOGGER.debug("Updating the cache.")
+ metadatas.append([os.path.normpath(self._filename), result])
+ self.update_cache(metadatas)
+ return result
+
+ def in_cache(self, metadatas, key):
+        """ Return True if the given key is present in the cached metadata list. """
+ for metadata in metadatas:
+ if metadata[0] == key:
+ return True
+ return False
+
+ def value_from_cache(self, metadatas, key):
+        """ Return the cached validation result for the given key, or None if not cached. """
+ for metadata in metadatas:
+ if metadata[0] == key:
+ return metadata[1]
+ return None
+
+ def load_cache(self):
+        """ Load the cached validation results from the cache file. """
+ metadatas = []
+ if self.__cachefile is not None and os.path.exists(self.__cachefile):
+ f_file = open(self.__cachefile, "rb")
+ for row in csv.reader(f_file):
+ if len(row) == 2:
+ metadatas.append([os.path.normpath(row[0]), row[1].lower() == "true"])
+ elif len(row) == 1:
+ # backward compatibility with old cache.
+ metadatas.append([os.path.normpath(row[0]), True])
+ f_file.close()
+ return metadatas
+
+ def update_cache(self, metadatas):
+        """ Write the validation results back to the cache file. """
+ if self.__cachefile is not None and os.path.exists(os.path.dirname(self.__cachefile)):
+ f_file = open(self.__cachefile, "wb")
+ writer = csv.writer(f_file)
+ writer.writerows(metadatas)
+ f_file.close()
+
+class ValidateTicklerReleaseMetadata(ValidateReleaseMetadataCached):
+    """ This class validates that a metadata file is stored in the correct location,
+        that the TICKLER marker is present and that all its dependencies exist.
+ """
+ def __init__(self, filename):
+ ReleaseMetadata.__init__(self, filename)
+ self.location = os.path.dirname(filename)
+
+ def is_valid(self, checkmd5=True):
+ """ Run the validation mechanism. """
+ tickler_path = os.path.join(self.location,"TICKLER")
+ if not os.path.exists(tickler_path):
+ LOGGER.error("Release not available yet")
+ return False
+ else:
+ return ValidateReleaseMetadataCached.is_valid(self, checkmd5)
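A minimal usage sketch for the metadata classes added above; the import name and the release paths below are assumptions for illustration only and are not part of the changeset:

    # Sketch only: 'releasemetadata' and the paths are hypothetical names.
    from releasemetadata import find_latest_metadata, ValidateReleaseMetadataCached, MetadataMerger

    latest = find_latest_metadata('/releases/pf_1234/product/release_01')
    if latest is not None:
        validator = ValidateReleaseMetadataCached(latest, cachefile='metadata_validation.cache')
        if validator.is_valid(checkmd5=True):
            # merge an extra package definition into the root metadata and save the result
            merger = MetadataMerger(latest)
            merger.merge('package_metadata.xml')
            merger.save('merged_metadata.xml')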
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/sysdef/api.py
--- a/buildframework/helium/sf/python/pythoncore/lib/sysdef/api.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/sysdef/api.py Mon Oct 11 11:16:47 2010 +0100
@@ -382,7 +382,6 @@
reason = filter_out(self.filters, unit.filters)
if reason == None:
# Get the unit object from the cache if this is a string
- # TODO - remove once unitlist returns list of Unit objects
if isinstance(unit, types.UnicodeType):
unit = self._sysDef[unit]
result.append(unit)
@@ -458,7 +457,6 @@
""" Initialisation """
self.__xml = xml.dom.minidom.parse(open(filename, "r"))
self._cache = {}
- #TODO - why store these as hashes?
self._units = {}
self._layers = {}
self._modules = {}
@@ -551,7 +549,6 @@
def addElement(self, element):
""" Adds SysDef element to cache. """
- #TODO - handle duplicate names of different types
if not self._cache.has_key(element.get_id()):
self._cache[element.get_id()] = element
#_logger.info('Adding SysDef element to cache: %s' % str(element))
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/sysdef/io.py
--- a/buildframework/helium/sf/python/pythoncore/lib/sysdef/io.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/sysdef/io.py Mon Oct 11 11:16:47 2010 +0100
@@ -26,8 +26,11 @@
def __init__(self, output):
""" Initialisation. """
self.output = output
- self._out = file(output, 'w')
-
+ if isinstance(output, basestring):
+ self._out = file(output, 'w')
+ else:
+ self._out = output
+
def write(self, sys_def, config_list):
""" Write the .csv data to a file for the given System Definition and configuration name. """
self._out.write('component,binary,rom,rofs1,rofs2,rofs3\n')
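The change above lets the writer accept either a filename or an already-open stream. A hedged sketch of the second call style, where the writer class name and the sysdef object are assumptions taken from the surrounding context:

    # Sketch only: 'CSVWriter' and 'sys_def' stand in for the surrounding sysdef API.
    import StringIO

    buf = StringIO.StringIO()
    writer = CSVWriter(buf)                  # pass an open file-like object instead of a path
    writer.write(sys_def, ['build_config'])  # emits the 'component,binary,rom,...' header plus rows
    print buf.getvalue()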
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/timeout_launcher.py
--- a/buildframework/helium/sf/python/pythoncore/lib/timeout_launcher.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/timeout_launcher.py Mon Oct 11 11:16:47 2010 +0100
@@ -61,14 +61,15 @@
print "e.g: timeout_launcher.py --timeout=1 -- cmd /c sleep 10"
sys.exit(-1)
else:
- _logger.debug("Start command")
+ command = ' '.join(cmdline)
+ _logger.debug("Start command: " + command)
shell = True
if _windows:
shell = False
if timeout != None:
finish = time.time() + timeout
timedout = False
- p_file = subprocess.Popen(' '.join(cmdline), stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=shell)
+ p_file = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=shell)
while (p_file.poll() == None):
if time.time() > finish:
timedout = True
@@ -82,12 +83,12 @@
handle = win32api.OpenProcess(True, win32con.PROCESS_TERMINATE, p_file.pid)
win32process.TerminateProcess(handle, -1)
print "ERROR: Process killed..."
- except Exception, exc:
+ except Exception, exc: # pylint: disable=W0703
print "ERROR: %s" % exc
else:
os.kill(p_file.pid, 9) # pylint: disable=E1101
print "ERROR: exiting..."
- raise Exception("Timeout exception.")
+ raise IOError("Timeout exception.")
else:
print p_file.communicate()[0]
sys.exit(p_file.returncode)
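For reference, the timeout handling above boils down to polling the child process until a deadline passes; a simplified, standalone sketch of that pattern (not the script itself, and without the Windows-specific termination path):

    # Simplified sketch of the poll-until-deadline pattern used by timeout_launcher.py.
    import subprocess
    import time

    def run_with_timeout(cmdline, timeout):
        finish = time.time() + timeout
        proc = subprocess.Popen(' '.join(cmdline), stdout=subprocess.PIPE,
                                stderr=subprocess.STDOUT, shell=True)
        while proc.poll() is None:
            if time.time() > finish:
                proc.kill()                        # stop the child once the deadline passes
                raise IOError("Timeout exception.")
            time.sleep(0.1)
        return proc.returncode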
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/lib/unittestadditions.py
--- a/buildframework/helium/sf/python/pythoncore/lib/unittestadditions.py Fri Oct 08 21:02:28 2010 +0100
+++ b/buildframework/helium/sf/python/pythoncore/lib/unittestadditions.py Mon Oct 11 11:16:47 2010 +0100
@@ -41,7 +41,7 @@
def __call__(self, f_file):
""" Returns the function f_file if shouldSkip is False. Else a stub function is returned. """
- def __skiptest(*args, **kargs):
+ def __skiptest(*args, **kargs): # pylint: disable=W0613
"""skip test"""
_logger.warning("Skipping test %s" % f_file.__name__)
return self.returns
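The decorator being touched here replaces a test with a stub when skipping is requested. A self-contained sketch of the same pattern; the class name 'SkipIf' below is hypothetical and is not the project's actual decorator:

    # Hypothetical 'SkipIf' decorator illustrating the pattern from unittestadditions.py.
    import logging
    _logger = logging.getLogger('test')

    class SkipIf(object):
        def __init__(self, should_skip, returns=None):
            self.should_skip = should_skip
            self.returns = returns

        def __call__(self, f_file):
            if not self.should_skip:
                return f_file
            def __skiptest(*args, **kargs):   # pylint: disable=W0613
                _logger.warning("Skipping test %s" % f_file.__name__)
                return self.returns
            return __skiptest

    @SkipIf(True, returns=None)
    def test_example():
        return "only runs when not skipped"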
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/tests/data/Invalid_distribution.policy.S60
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/tests/data/Invalid_distribution.policy.S60 Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,1 @@
+1
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/tests/data/bootup_testing/test_bootup.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/tests/data/bootup_testing/test_bootup.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,52 @@
+
+
+
+ http://diamonds.com/1234
+ Smoke
+ Bootup test run
+
+
+
+
+
+
+
+
+
+ FlashTask
+
+
+
+
+
+
+ RebootTask
+
+
+
+ CreateDirTask
+
+
+
+
+
+
+
+ CleanupTask
+
+
+
+
+
+
+
+
+ EmailAction
+
+
+
+
+
+
+
+
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/tests/data/matti/matti_template.xml
--- a/buildframework/helium/sf/python/pythoncore/tests/data/matti/matti_template.xml Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,154 +0,0 @@
-
-
-
-
-
- {% if xml_dict['diamonds_build_url'] -%}
- {{ xml_dict['diamonds_build_url'] }}
- Smoke
- {% endif %}
- {{ xml_dict['testrun_name'] }}
-
-
-
-
-
-
-
-
-
- {% for exe_block in xml_dict['execution_blocks'] -%}
-
-
-
- {% if exe_block['image_files'] -%}
-
- FlashTask
-
- {% set i = 1 %}
- {% for img in exe_block['image_files'] -%}
-
- {% set i = i + 1 %}
- {% endfor -%}
-
-
- {% endif %}
-
-
- {% if exe_block['install_files'] != [] -%}
- {% for file in exe_block['install_files'] -%}
-
- FileUploadTask
-
-
-
-
-
- {% endfor -%}
- {% endif %}
-
- {% if exe_block['matti_sis_files'] != [] -%}
- {% for sisfile in exe_block['matti_sis_files'] -%}
-
- FileUploadTask
-
-
-
-
-
- {% endfor -%}
- {% endif %}
-
- {% for sis_file in exe_block["matti_sis_files"] -%}
-
- InstallSisTask
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
- {%- endfor -%}
-
-
- RebootTask
-
-
-
- CreateDirTask
-
-
-
-
-
-
- {% for task_file in exe_block["matti_task_files"] -%}
-
- MATTITask
-
-
-
-
-
-
-
-
- {% endfor -%}
-
-
-
- CleanupTask
-
-
-
-
-
-
- {% endfor -%}
-
-
-
- EmailAction
-
-
-
-
-
-
-
-
-
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/tests/data/matti/test.rb
--- a/buildframework/helium/sf/python/pythoncore/tests/data/matti/test.rb Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,34 +0,0 @@
-# require needed Ruby, MATTI and Orbit files
-require 'test/unit'
-require 'otest/testcase'
-
-
- #TODO: Give suitable name for test class
-class TestClassName < Test::Unit::TestCase
-
- #no need to do anything for initialize method
- def initialize (args)
- super(args)
- # TODO define application name
- app_path("hbinputtest.exe")
- end
-
- # Test case method
- #TODO: name test method with suitable name
- # Must: Test method must start test_
- # Recomended: really descriping name
- def test_do_something
-
- # create test object app from defined sut
- app = @sut.run(:name => @app_name)
- sleep(10)
-
- #Application is closed after test
- app.close
- #Verifies it is closed
-
- end #End of test case test_do_something
-
-
-end#Testsuite
-
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/tests/data/matti/test_all_present.xml
--- a/buildframework/helium/sf/python/pythoncore/tests/data/matti/test_all_present.xml Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,142 +0,0 @@
-
-
-
- http://diamonds.com/1234
- Smoke
- matti test run
-
-
-
-
-
-
-
-
-
- FlashTask
-
-
-
-
-
-
- FileUploadTask
-
-
-
-
-
-
- FileUploadTask
-
-
-
-
-
-
- FileUploadTask
-
-
-
-
-
-
- FileUploadTask
-
-
-
-
-
-
- InstallSisTask
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
- InstallSisTask
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
- RebootTask
-
-
-
- CreateDirTask
-
-
-
-
-
-
- MATTITask
-
-
-
-
-
-
-
-
-
- MATTITask
-
-
-
-
-
-
-
-
-
-
- CleanupTask
-
-
-
-
-
-
-
-
- EmailAction
-
-
-
-
-
-
-
-
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/tests/data/matti/test_all_present_sierra_disabled.xml
--- a/buildframework/helium/sf/python/pythoncore/tests/data/matti/test_all_present_sierra_disabled.xml Fri Oct 08 21:02:28 2010 +0100
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,142 +0,0 @@
-
-
-
- http://diamonds.com/1234
- Smoke
- matti test run
-
-
-
-
-
-
-
-
-
- FlashTask
-
-
-
-
-
-
- FileUploadTask
-
-
-
-
-
-
- FileUploadTask
-
-
-
-
-
-
- FileUploadTask
-
-
-
-
-
-
- FileUploadTask
-
-
-
-
-
-
- InstallSisTask
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
- InstallSisTask
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
- RebootTask
-
-
-
- CreateDirTask
-
-
-
-
-
-
- MATTITask
-
-
-
-
-
-
-
-
-
- MATTITask
-
-
-
-
-
-
-
-
-
-
- CleanupTask
-
-
-
-
-
-
-
-
- EmailAction
-
-
-
-
-
-
-
-
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/tests/data/packageiad/sis/testPackage.zip
Binary file buildframework/helium/sf/python/pythoncore/tests/data/packageiad/sis/testPackage.zip has changed
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/tests/data/tdriver/tdriver_template.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/tests/data/tdriver/tdriver_template.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,168 @@
+
+
+
+
+
+ {% if xml_dict['diamonds_build_url'] -%}
+ {{ xml_dict['diamonds_build_url'] }}
+ Smoke
+ {% endif %}
+ {{ xml_dict['testrun_name'] }}
+
+
+
+
+
+
+
+
+
+ {% for exe_block in xml_dict['execution_blocks'] -%}
+
+
+
+ {% if exe_block['image_files'] -%}
+
+ FlashTask
+
+ {% set i = 1 %}
+ {% for img in exe_block['image_files'] -%}
+
+ {% set i = i + 1 %}
+ {% endfor -%}
+
+
+ {% endif %}
+
+
+ {% if exe_block['install_files'] != [] -%}
+ {% for file in exe_block['install_files'] -%}
+
+ FileUploadTask
+
+
+
+
+
+ {% endfor -%}
+ {% endif %}
+
+ {% if exe_block['tdriver_sis_files'] != [] -%}
+ {% for sisfile in exe_block['tdriver_sis_files'] -%}
+
+ FileUploadTask
+
+
+
+
+
+ {% endfor -%}
+ {% endif %}
+
+ {% for sis_file in exe_block["tdriver_sis_files"] -%}
+
+ InstallSisTask
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ {%- endfor -%}
+
+
+ RebootTask
+
+
+
+ CreateDirTask
+
+
+
+
+
+
+ {% for task_file in exe_block["tdriver_task_files"] -%}
+
+ TestabilityTask
+
+
+
+
+
+
+
+
+ {% endfor -%}
+
+
+
+ CleanupTask
+
+
+
+
+
+
+ {% endfor -%}
+
+
+
+ EmailAction
+
+
+
+
+
+
+ {% if xml_dict['report_location'] -%}
+
+ FileStoreAction
+
+
+
+
+
+ {% endif %}
+ {% if xml_dict['diamonds_build_url'] -%}
+
+ DiamondsAction
+
+ {% endif %}
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/tests/data/tdriver/test.rb
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/tests/data/tdriver/test.rb Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,34 @@
+# require needed Ruby, MATTI and Orbit files
+require 'test/unit'
+require 'otest/testcase'
+
+
+ #TODO: Give suitable name for test class
+class TestClassName < Test::Unit::TestCase
+
+ #no need to do anything for initialize method
+ def initialize (args)
+ super(args)
+ # TODO define application name
+ app_path("hbinputtest.exe")
+ end
+
+ # Test case method
+    #TODO: give the test method a suitable name
+    # Must: the test method name must start with test_
+    # Recommended: a really descriptive name
+ def test_do_something
+
+ # create test object app from defined sut
+ app = @sut.run(:name => @app_name)
+ sleep(10)
+
+ #Application is closed after test
+ app.close
+ #Verifies it is closed
+
+ end #End of test case test_do_something
+
+
+end#Testsuite
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/tests/data/tdriver/test_all_present.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/tests/data/tdriver/test_all_present.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,145 @@
+
+
+
+ http://diamonds.com/1234
+ Smoke
+ TDriver test run
+
+
+
+
+
+
+
+
+
+ FlashTask
+
+
+
+
+
+
+ FileUploadTask
+
+
+
+
+
+
+ FileUploadTask
+
+
+
+
+
+
+ FileUploadTask
+
+
+
+
+
+
+ FileUploadTask
+
+
+
+
+
+
+ InstallSisTask
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ InstallSisTask
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ RebootTask
+
+
+
+ CreateDirTask
+
+
+
+
+
+
+ TestabilityTask
+
+
+
+
+
+
+
+
+
+ TestabilityTask
+
+
+
+
+
+
+
+
+
+
+ CleanupTask
+
+
+
+
+
+
+
+
+ EmailAction
+
+
+
+
+
+
+
+ DiamondsAction
+
+
+
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/tests/data/tdriver/test_all_present_tdrunner_disabled.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/tests/data/tdriver/test_all_present_tdrunner_disabled.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,152 @@
+
+
+
+ http://diamonds.com/1234
+ Smoke
+ TDriver test run
+
+
+
+
+
+
+
+
+
+ FlashTask
+
+
+
+
+
+
+ FileUploadTask
+
+
+
+
+
+
+ FileUploadTask
+
+
+
+
+
+
+ FileUploadTask
+
+
+
+
+
+
+ FileUploadTask
+
+
+
+
+
+
+ InstallSisTask
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ InstallSisTask
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ RebootTask
+
+
+
+ CreateDirTask
+
+
+
+
+
+
+ TestabilityTask
+
+
+
+
+
+
+
+
+
+ TestabilityTask
+
+
+
+
+
+
+
+
+
+
+ CleanupTask
+
+
+
+
+
+
+
+
+ EmailAction
+
+
+
+
+
+
+
+ FileStoreAction
+
+
+
+
+
+
+ DiamondsAction
+
+
+
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/sf/python/pythoncore/tests/data/tdriver/test_ctc.xml
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/sf/python/pythoncore/tests/data/tdriver/test_ctc.xml Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,190 @@
+
+
+
+ http://diamonds.com/1234
+ Smoke
+ TDriver test run
+
+
+
+
+
+
+
+
+
+ FlashTask
+
+
+
+
+
+
+ FileUploadTask
+
+
+
+
+
+
+ FileUploadTask
+
+
+
+
+
+
+ FileUploadTask
+
+
+
+
+
+
+ FileUploadTask
+
+
+
+
+
+
+ InstallSisTask
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ InstallSisTask
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+ RebootTask
+
+
+
+ CreateDirTask
+
+
+
+
+
+ CreateDirTask
+
+
+
+
+
+ NonTestExecuteTask
+
+
+
+
+
+
+
+ TestabilityTask
+
+
+
+
+
+
+
+
+
+ TestabilityTask
+
+
+
+
+
+
+
+
+
+
+ NonTestExecuteTask
+
+
+
+
+
+
+
+
+ NonTestExecuteTask
+
+
+
+
+
+
+
+
+ CTCDATA
+
+ FileDownloadTask
+
+
+
+
+
+
+
+
+ CleanupTask
+
+
+
+
+
+
+
+
+ EmailAction
+
+
+
+
+
+
+
+ DiamondsAction
+
+
+
+
+
+
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/tests/data/diamonds/build_roms_sample.log
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/tests/data/diamonds/build_roms_sample.log Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,281 @@
+Starting build: 75090
+mkdir epoc32\rombuild\temp
+Number of threads: 4
+
+++ Started at Thu May 6 13:08:38 2010
++++ HiRes Start 1273147718
+-- cmd /c imaker WORKDIR=epoc32\rombuild\temp/config_482 -f /epoc32/rom/config/platform/productexample/image_conf_productexample_ui.mk TYPE=prd rofs3-dir
+iMaker 10.12.01, 23-Mar-2010.
+===============================================================================
+Target: rofs3-dir Duration: 00:02 Status: OK
+iMaker log = `Y:/output/release_flash_images/productexample/prd/customer/vanilla/rofs3/RM-XXX_010.014_00.01_79.92_prd_imaker_rofs3-dir.log'
+ROFS3 dir = `y:/output/release_flash_images/productexample/prd/customer/vanilla/rofs3'
+ROFS3 image = `y:/output/release_flash_images/productexample/prd/customer/vanilla/rofs3/RM-XXX_010.014_00.01_79.92_prd.rofs3.img' - DOESN'T EXIST
+ROFS3 symbols = `y:/output/release_flash_images/productexample/prd/customer/vanilla/rofs3/RM-XXX_010.014_00.01_79.92_prd.rofs3.symbol' - DOESN'T EXIST
+ROFS3 flash = `y:/output/release_flash_images/productexample/prd/customer/vanilla/RM-XXX_010.014_00.01_79.92_prd.rofs3.fpsx' - DOESN'T EXIST
+-------------------------------------------------------------------------------
+Total duration: 00:03 Status: OK
+===============================================================================
++++ HiRes End 1273147721
+++ Finished at Thu May 6 13:08:41 2010
+++ Started at Thu May 6 13:08:41 2010
++++ HiRes Start 1273147721
+-- cmd /c imaker WORKDIR=epoc32\rombuild\temp/config_484 -f /epoc32/rom/config/platform/productexample/image_conf_productexample_ui.mk TYPE=rnd udaerase-dir
+iMaker 10.12.01, 23-Mar-2010.
+===============================================================================
+Target: udaerase-dir Duration: 00:01 Status: OK
+iMaker log = `Y:/output/release_flash_images/productexample/rnd/uda/udaerase/udadata/RM-XXX_010.014_79.92.2010.15_rnd_imaker_udaerase-dir.log'
+UDA Erase flash = `y:/output/release_flash_images/productexample/rnd/uda/RM-XXX_010.014_79.92.2010.15_rnd.udaerase.fpsx' - DOESN'T EXIST
+-------------------------------------------------------------------------------
+Total duration: 00:02 Status: OK
+===============================================================================
++++ HiRes End 1273147723
+++ Finished at Thu May 6 13:08:43 2010
+++ Started at Thu May 6 13:08:41 2010
++++ HiRes Start 1273147721
+-- cmd /c imaker WORKDIR=epoc32\rombuild\temp/config_485 -f /epoc32/rom/config/platform/productexample/image_conf_productexample_ui.mk TYPE=prd udaerase-dir
+iMaker 10.12.01, 23-Mar-2010.
+===============================================================================
+Target: udaerase-dir Duration: 00:01 Status: OK
+iMaker log = `Y:/output/release_flash_images/productexample/prd/uda/udaerase/udadata/RM-XXX_010.014_79.92_prd_imaker_udaerase-dir.log'
+UDA Erase flash = `y:/output/release_flash_images/productexample/prd/uda/RM-XXX_010.014_79.92_prd.udaerase.fpsx' - DOESN'T EXIST
+-------------------------------------------------------------------------------
+Total duration: 00:02 Status: OK
+===============================================================================
++++ HiRes End 1273147723
+++ Finished at Thu May 6 13:08:43 2010
+++ Started at Thu May 6 13:08:41 2010
++++ HiRes Start 1273147721
+-- cmd /c imaker WORKDIR=epoc32\rombuild\temp/config_486 -f /epoc32/rom/config/platform/productexample/image_conf_productexample_ui.mk TYPE=subcon udaerase-dir
+iMaker 10.12.01, 23-Mar-2010.
+===============================================================================
+Target: udaerase-dir Duration: 00:01 Status: OK
+iMaker log = `Y:/output/release_flash_images/productexample/subcon/uda/udaerase/udadata/RM-XXX_010.014_79.92_subcon_imaker_udaerase-dir.log'
+UDA Erase flash = `y:/output/release_flash_images/productexample/subcon/uda/RM-XXX_010.014_79.92_subcon.udaerase.fpsx' - DOESN'T EXIST
+-------------------------------------------------------------------------------
+Total duration: 00:02 Status: OK
+===============================================================================
++++ HiRes End 1273147723
+++ Finished at Thu May 6 13:08:43 2010
+++ Started at Thu May 6 13:08:40 2010
++++ HiRes Start 1273147720
+-- cmd /c imaker WORKDIR=epoc32\rombuild\temp/config_483 -f /epoc32/rom/config/platform/productexample/image_conf_productexample_ui.mk TYPE=subcon rofs3-dir
+iMaker 10.12.01, 23-Mar-2010.
+===============================================================================
+Target: rofs3-dir Duration: 00:02 Status: OK
+iMaker log = `Y:/output/release_flash_images/productexample/subcon/customer/vanilla/rofs3/RM-XXX_010.014_00.01_79.92_subcon_imaker_rofs3-dir.log'
+ROFS3 dir = `y:/output/release_flash_images/productexample/subcon/customer/vanilla/rofs3'
+ROFS3 image = `y:/output/release_flash_images/productexample/subcon/customer/vanilla/rofs3/RM-XXX_010.014_00.01_79.92_subcon.rofs3.img' - DOESN'T EXIST
+ROFS3 symbols = `y:/output/release_flash_images/productexample/subcon/customer/vanilla/rofs3/RM-XXX_010.014_00.01_79.92_subcon.rofs3.symbol' - DOESN'T EXIST
+ROFS3 flash = `y:/output/release_flash_images/productexample/subcon/customer/vanilla/RM-XXX_010.014_00.01_79.92_subcon.rofs3.fpsx' - DOESN'T EXIST
+-------------------------------------------------------------------------------
+Total duration: 00:03 Status: OK
+===============================================================================
++++ HiRes End 1273147724
+++ Finished at Thu May 6 13:08:44 2010
+-- imaker -f /epoc32/rom/config/platform/productexample/image_conf_productexample_ui.mk TYPE=rnd core-image
+++ Started at Thu May 6 13:08:45 2010
++++ HiRes Start 1273147725.83
+imaker WORKDIR=epoc32\rombuild\temp/config_16 -f /epoc32/rom/config/platform/productexample/image_conf_productexample_ui.mk TYPE=rnd core-image
+iMaker 10.12.01, 23-Mar-2010.
+Generating file(s) for Core (ROM & ROFS1) image creation
+Generating Feature manager file(s)
+Creating Core (ROM & ROFS1) SOS image
+
+Missing file(s):
+1) y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd_core_master.oby(1686): Missing file: '/epoc32/data/Z/resource/apps/MPSettingsROPModel.rsc' in statement 'data='
+2) y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd_core_master.oby(1824): Missing file: '/epoc32/data/Z/private/20021377/backup_registration.xml' in statement 'data='
+3) y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd_core_master.oby(2596): Missing file: '/epoc32/data/z/data/system/application/licenses/product/data.xml' in statement 'data='
+4) y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd_core_master.oby(2597): Missing file: '/epoc32/data/z/data/system/application/licenses/product/key' in statement 'data='
+5) y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd_core_master.oby(2614): Missing file: '/epoc32/data/Z/Resource/wappush/si.dtd' in statement 'data='
+6) y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd_core_master.oby(2615): Missing file: '/epoc32/data/Z/Resource/wappush/sl.dtd' in statement 'data='
+7) y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd_core_master.oby(2616): Missing file: '/epoc32/data/Z/Resource/wappush/si10.tok' in statement 'data='
+8) y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd_core_master.oby(2617): Missing file: '/epoc32/data/Z/Resource/wappush/sl10.tok' in statement 'data='
+9) y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd_core_master.oby(4905): Missing file: '/epoc32/data/Z/resource/help/juicehelp.hlp' in statement 'data='
+
+Warning(s):
+ 1) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x000003f5
+ 2) WARNING: the value of attribute statusflags has been overridden in original feature 0x000003f5
+ 3) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x0000000b
+ 4) WARNING: the value of attribute statusflags has been overridden in original feature 0x0000000b
+ 5) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x00000584
+ 6) WARNING: the value of attribute statusflags has been overridden in original feature 0x00000584
+ 7) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x0000019b
+ 8) WARNING: the value of attribute statusflags has been overridden in original feature 0x0000019b
+ 9) WARNING: the value of attribute statusflags has been overridden in original feature 0x00000007
+ 10) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x000005ff
+ 11) WARNING: the value of attribute statusflags has been overridden in original feature 0x000005ff
+ 12) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x000001f8
+ 13) WARNING: the value of attribute statusflags has been overridden in original feature 0x000001f8
+ 14) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x0000000c
+ 15) WARNING: the value of attribute statusflags has been overridden in original feature 0x0000000c
+ 16) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x0000007a
+ 17) WARNING: the value of attribute statusflags has been overridden in original feature 0x0000007a
+ 18) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x00000059
+ 19) WARNING: the value of attribute statusflags has been overridden in original feature 0x00000059
+ 20) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x00000003
+ 21) WARNING: the value of attribute statusflags has been overridden in original feature 0x00000003
+ 22) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x00000126
+ 23) WARNING: the value of attribute statusflags has been overridden in original feature 0x00000126
+ 24) WARNING: the value of attribute statusflags has been overridden in original feature 0x0000005b
+ 25) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x00000066
+ 26) WARNING: the value of attribute statusflags has been overridden in original feature 0x00000066
+ 27) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x00000001
+ 28) WARNING: the value of attribute statusflags has been overridden in original feature 0x00000001
+ 29) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x0000006b
+ 30) WARNING: the value of attribute statusflags has been overridden in original feature 0x0000006b
+ 31) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x000005f9
+ 32) WARNING: the value of attribute statusflags has been overridden in original feature 0x000005f9
+ 33) WARNING: the value of attribute statusflags has been overridden in original feature 0x00000072
+ 34) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x0000000d
+ 35) WARNING: the value of attribute statusflags has been overridden in original feature 0x0000000d
+ 36) WARNING: the value of attribute hrhmacro has been overridden in original feature 0x000006a9
+ 37) WARNING: the value of attribute statusflags has been overridden in original feature 0x000006a9
+ 38) WARNING: Error in reading features database file "/epoc32/include/s60customswfeatures.xml"
+ 39) WARNING: Unknown keyword 'LOCALISE_ALL_RESOURCES_BEGIN'. Line 2133 ignored
+ 40) WARNING: Unknown keyword 'LOCALISE_ALL_RESOURCES_END'. Line 2134 ignored
+ 41) WARNING: Unknown keyword '-----------------------------------------------------------'. Line 2210 ignored
+ 42) WARNING: Unknown keyword '-----------------------------------------------------------'. Line 2337 ignored
+ 43) WARNING: Unknown keyword '-----------------------------------------------------------'. Line 2338 ignored
+ 44) WARNING: Unknown keyword '***'. Line 2602 ignored
+ 45) WARNING: Unknown keyword 'LOCALISE_ALL_RESOURCES_BEGIN'. Line 2133 ignored
+ 46) WARNING: Unknown keyword 'LOCALISE_ALL_RESOURCES_END'. Line 2134 ignored
+ 47) WARNING: Unknown keyword '-----------------------------------------------------------'. Line 2210 ignored
+ 48) WARNING: Unknown keyword '-----------------------------------------------------------'. Line 2337 ignored
+ 49) WARNING: Unknown keyword '-----------------------------------------------------------'. Line 2338 ignored
+ 50) WARNING: Unknown keyword '***'. Line 2602 ignored
+ 51) WARNING: Kernel/variant/extension
+ 52) WARNING: Kernel/variant/extension
+ 53) WARNING: Kernel/variant/extension
+ 54) Warning: Can't open "\epoc32\release\ARMV5\urel\AR_LServer.exe.map" or "\epoc32\release\ARMV5\urel\AR_LServer.map"
+ 55) Warning: Can't open "\epoc32\release\ARMV5\urel\AP_CES_CoreComponents.dll.map" or "\epoc32\release\ARMV5\urel\AP_CES_CoreComponents.map"
+ 56) Warning: Can't open "\epoc32\release\ARMV5\urel\AP_HDMetafile2.dll.map" or "\epoc32\release\ARMV5\urel\AP_HDMetafile2.map"
+ 57) Warning: Can't open "\epoc32\release\ARMV5\urel\AP_HDOfficeCommon2.dll.map" or "\epoc32\release\ARMV5\urel\AP_HDOfficeCommon2.map"
+ 58) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_QORecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ap_CES_QORecognizer.map"
+ 59) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_S60_CoreComponents.dll.map" or "\epoc32\release\ARMV5\urel\ap_CES_S60_CoreComponents.map"
+ 60) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_DeviceTransport.DLL.map" or "\epoc32\release\ARMV5\urel\ap_CES_DeviceTransport.map"
+ 61) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_zip.dll.map" or "\epoc32\release\ARMV5\urel\ap_zip.map"
+ 62) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_syb_XMLarchive.dll.map" or "\epoc32\release\ARMV5\urel\ap_syb_XMLarchive.map"
+ 63) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_FMPluginBase.dll.map" or "\epoc32\release\ARMV5\urel\ap_FMPluginBase.map"
+ 64) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_FILELISTFMPLUGIN.DLL.map" or "\epoc32\release\ARMV5\urel\ap_CES_FILELISTFMPLUGIN.map"
+ 65) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_QO_FileManager.exe.map" or "\epoc32\release\ARMV5\urel\ap_QO_FileManager.map"
+ 66) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_ces_ziprecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ap_ces_ziprecognizer.map"
+ 67) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_PptRecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ap_CES_PptRecognizer.map"
+ 68) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_PptXRecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ap_CES_PptXRecognizer.map"
+ 69) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_XlsRecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ap_CES_XlsRecognizer.map"
+ 70) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_XlsXRecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ap_CES_XlsXRecognizer.map"
+ 71) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_TxtRecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ap_CES_TxtRecognizer.map"
+ 72) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_WrdRecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ap_CES_WrdRecognizer.map"
+ 73) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_WrdXRecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ap_CES_WrdXRecognizer.map"
+ 74) Warning: Can't open "\epoc32\release\ARMV5\urel\AP_CES_QPDFDescriptor.DLL.map" or "\epoc32\release\ARMV5\urel\AP_CES_QPDFDescriptor.map"
+ 75) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_PdfRecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ap_CES_PdfRecognizer.map"
+ 76) Warning: Can't open "\epoc32\release\ARMV5\urel\AP_CES_PDFFILTERINFO.DLL.map" or "\epoc32\release\ARMV5\urel\AP_CES_PDFFILTERINFO.map"
+ 77) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_applicationStub.exe.map" or "\epoc32\release\ARMV5\urel\ap_applicationStub.map"
+ 78) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_quickpdf.exe.map" or "\epoc32\release\ARMV5\urel\ap_quickpdf.map"
+ 79) WARNING: File \epoc32\data\Z\Resource\APPS\ap_registration.rom does not exist or is 0 bytes in length.
+ 80) Warning: Can't open "\epoc32\release\ARMV5\urel\ap_CES_HTTP.dll.map" or "\epoc32\release\ARMV5\urel\ap_CES_HTTP.map"
+ 81) WARNING: File \epoc32\data\Z\private\102033E6\installer\inst_plugins.cfg does not exist or is 0 bytes in length.
+ 82) Warning: Can't open "\epoc32\release\ARMV5\urel\arcimagefundamental.dll.map" or "\epoc32\release\ARMV5\urel\arcimagefundamental.map"
+ 83) Warning: Can't open "\epoc32\release\ARMV5\urel\arcplatform.dll.map" or "\epoc32\release\ARMV5\urel\arcplatform.map"
+ 84) Warning: Can't open "\epoc32\release\ARMV5\urel\arcimagecodecs.dll.map" or "\epoc32\release\ARMV5\urel\arcimagecodecs.map"
+ 85) Warning: Can't open "\epoc32\release\ARMV5\urel\photoeditor.exe.map" or "\epoc32\release\ARMV5\urel\photoeditor.map"
+ 86) Warning: Can't open "\epoc32\release\ARMV5\urel\arcamui.dll.map" or "\epoc32\release\ARMV5\urel\arcamui.map"
+ 87) Warning: Can't open "\epoc32\release\ARMV5\urel\arcpebasictool.dll.map" or "\epoc32\release\ARMV5\urel\arcpebasictool.map"
+ 88) Warning: Can't open "\epoc32\release\ARMV5\urel\arcpebase.dll.map" or "\epoc32\release\ARMV5\urel\arcpebase.map"
+ 89) Warning: Can't open "\epoc32\release\ARMV5\urel\arcpemanager.dll.map" or "\epoc32\release\ARMV5\urel\arcpemanager.map"
+ 90) Warning: Can't open "\epoc32\release\ARMV5\urel\arcampe.dll.map" or "\epoc32\release\ARMV5\urel\arcampe.map"
+ 91) Warning: Can't open "\epoc32\release\ARMV5\urel\photoeditoraiwplugin.dll.map" or "\epoc32\release\ARMV5\urel\photoeditoraiwplugin.map"
+ 92) WARNING: File \epoc32\data\Z\Resource\APPS\registration.rom does not exist or is 0 bytes in length.
+ 93) Warning: Can't open "\epoc32\release\ARMV5\urel\QO_LServer.exe.map" or "\epoc32\release\ARMV5\urel\QO_LServer.map"
+ 94) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_devicetransport.dll.map" or "\epoc32\release\ARMV5\urel\ces_devicetransport.map"
+ 95) Warning: Can't open "\epoc32\release\ARMV5\urel\CES_CoreComponents.dll.map" or "\epoc32\release\ARMV5\urel\CES_CoreComponents.map"
+ 96) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_pdfrecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ces_pdfrecognizer.map"
+ 97) Warning: Can't open "\epoc32\release\ARMV5\urel\CES_OGLES_Charts.dll.map" or "\epoc32\release\ARMV5\urel\CES_OGLES_Charts.map"
+ 98) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_filelistfmplugin.dll.map" or "\epoc32\release\ARMV5\urel\ces_filelistfmplugin.map"
+ 99) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_pptfilterinfo.dll.map" or "\epoc32\release\ARMV5\urel\ces_pptfilterinfo.map"
+100) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_pptfilter.dll.map" or "\epoc32\release\ARMV5\urel\ces_pptfilter.map"
+101) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_pptrecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ces_pptrecognizer.map"
+102) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_pptxfilterinfo.dll.map" or "\epoc32\release\ARMV5\urel\ces_pptxfilterinfo.map"
+103) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_ziprecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ces_ziprecognizer.map"
+104) Warning: Can't open "\epoc32\release\ARMV5\urel\QO_PPTX_Engine_Conv.dll.map" or "\epoc32\release\ARMV5\urel\QO_PPTX_Engine_Conv.map"
+105) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_pptxrecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ces_pptxrecognizer.map"
+106) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_qorecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ces_qorecognizer.map"
+107) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_qshdescriptor.dll.map" or "\epoc32\release\ARMV5\urel\ces_qshdescriptor.map"
+108) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_qwddescriptor.dll.map" or "\epoc32\release\ARMV5\urel\ces_qwddescriptor.map"
+109) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_qptdescriptor.dll.map" or "\epoc32\release\ARMV5\urel\ces_qptdescriptor.map"
+110) Warning: Can't open "\epoc32\release\ARMV5\urel\CES_PPTXFILTER.DLL.map" or "\epoc32\release\ARMV5\urel\CES_PPTXFILTER.map"
+111) Warning: Can't open "\epoc32\release\ARMV5\urel\CES_QWS_SpellChecker.dll.map" or "\epoc32\release\ARMV5\urel\CES_QWS_SpellChecker.map"
+112) Warning: Can't open "\epoc32\release\ARMV5\urel\CES_RichText_Engine.dll.map" or "\epoc32\release\ARMV5\urel\CES_RichText_Engine.map"
+113) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_txtfilter.dll.map" or "\epoc32\release\ARMV5\urel\ces_txtfilter.map"
+114) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_txtfilterinfo.dll.map" or "\epoc32\release\ARMV5\urel\ces_txtfilterinfo.map"
+115) Warning: Can't open "\epoc32\release\ARMV5\urel\CES_S60_CoreComponents.dll.map" or "\epoc32\release\ARMV5\urel\CES_S60_CoreComponents.map"
+116) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_txtrecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ces_txtrecognizer.map"
+117) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_wrdfilter.dll.map" or "\epoc32\release\ARMV5\urel\ces_wrdfilter.map"
+118) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_wrdfilterinfo.dll.map" or "\epoc32\release\ARMV5\urel\ces_wrdfilterinfo.map"
+119) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_wrdrecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ces_wrdrecognizer.map"
+120) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_wrdxfilterinfo.dll.map" or "\epoc32\release\ARMV5\urel\ces_wrdxfilterinfo.map"
+121) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_wrdxrecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ces_wrdxrecognizer.map"
+122) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_wrdxfilter.dll.map" or "\epoc32\release\ARMV5\urel\ces_wrdxfilter.map"
+123) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_xlsfilter.dll.map" or "\epoc32\release\ARMV5\urel\ces_xlsfilter.map"
+124) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_xlsfilterinfo.dll.map" or "\epoc32\release\ARMV5\urel\ces_xlsfilterinfo.map"
+125) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_xlsrecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ces_xlsrecognizer.map"
+126) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_xlsxfilterinfo.dll.map" or "\epoc32\release\ARMV5\urel\ces_xlsxfilterinfo.map"
+127) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_xlsxrecognizer.dll.map" or "\epoc32\release\ARMV5\urel\ces_xlsxrecognizer.map"
+128) Warning: Can't open "\epoc32\release\ARMV5\urel\FMPluginBase.dll.map" or "\epoc32\release\ARMV5\urel\FMPluginBase.map"
+129) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_xlsxfilter.dll.map" or "\epoc32\release\ARMV5\urel\ces_xlsxfilter.map"
+130) Warning: Can't open "\epoc32\release\ARMV5\urel\HDExcel2.dll.map" or "\epoc32\release\ARMV5\urel\HDExcel2.map"
+131) Warning: Can't open "\epoc32\release\ARMV5\urel\HDMetafile2.dll.map" or "\epoc32\release\ARMV5\urel\HDMetafile2.map"
+132) Warning: Can't open "\epoc32\release\ARMV5\urel\HDOfficeCommon2.dll.map" or "\epoc32\release\ARMV5\urel\HDOfficeCommon2.map"
+133) Warning: Can't open "\epoc32\release\ARMV5\urel\QO_FileManager.exe.map" or "\epoc32\release\ARMV5\urel\QO_FileManager.map"
+134) Warning: Can't open "\epoc32\release\ARMV5\urel\HDPowerPoint2.dll.map" or "\epoc32\release\ARMV5\urel\HDPowerPoint2.map"
+135) Warning: Can't open "\epoc32\release\ARMV5\urel\HDWord2.dll.map" or "\epoc32\release\ARMV5\urel\HDWord2.map"
+136) Warning: Can't open "\epoc32\release\ARMV5\urel\quickpoint.exe.map" or "\epoc32\release\ARMV5\urel\quickpoint.map"
+137) Warning: Can't open "\epoc32\release\ARMV5\urel\quicksheet.exe.map" or "\epoc32\release\ARMV5\urel\quicksheet.map"
+138) Warning: Can't open "\epoc32\release\ARMV5\urel\Quickpoint_DOM.dll.map" or "\epoc32\release\ARMV5\urel\Quickpoint_DOM.map"
+139) Warning: Can't open "\epoc32\release\ARMV5\urel\Quicksheet_DOM.dll.map" or "\epoc32\release\ARMV5\urel\Quicksheet_DOM.map"
+140) Warning: Can't open "\epoc32\release\ARMV5\urel\Quickword.exe.map" or "\epoc32\release\ARMV5\urel\Quickword.map"
+141) Warning: Can't open "\epoc32\release\ARMV5\urel\SYB_XMLArchive.dll.map" or "\epoc32\release\ARMV5\urel\SYB_XMLArchive.map"
+142) Warning: Can't open "\epoc32\release\ARMV5\urel\Quickword_DOM.dll.map" or "\epoc32\release\ARMV5\urel\Quickword_DOM.map"
+143) Warning: Can't open "\epoc32\release\ARMV5\urel\QuickPDFStub.exe.map" or "\epoc32\release\ARMV5\urel\QuickPDFStub.map"
+144) Warning: Can't open "\epoc32\release\ARMV5\urel\CES_HTTP.dll.map" or "\epoc32\release\ARMV5\urel\CES_HTTP.map"
+145) Warning: Can't open "\epoc32\release\ARMV5\urel\ces_pdffilterinfostub.dll.map" or "\epoc32\release\ARMV5\urel\ces_pdffilterinfostub.map"
+146) Warning: Can't open "\epoc32\release\ARMV5\urel\zip.dll.map" or "\epoc32\release\ARMV5\urel\zip.map"
+147) Warning: Can't open "\epoc32\release\ARMV5\urel\videoeditor.exe.map" or "\epoc32\release\ARMV5\urel\videoeditor.map"
+148) Warning: Can't open "\epoc32\release\ARMV5\urel\amur.dll.map" or "\epoc32\release\ARMV5\urel\amur.map"
+149) Warning: Can't open "\epoc32\release\ARMV5\urel\arcavcodecs.dll.map" or "\epoc32\release\ARMV5\urel\arcavcodecs.map"
+150) Warning: Can't open "\epoc32\release\ARMV5\urel\amvesession.dll.map" or "\epoc32\release\ARMV5\urel\amvesession.map"
+151) Warning: Can't open "\epoc32\release\ARMV5\urel\videoeditoraiwplugin.dll.map" or "\epoc32\release\ARMV5\urel\videoeditoraiwplugin.map"
+152) Warning: Can't open "\epoc32\release\ARMV5\urel\3GPExtParser.dll.map" or "\epoc32\release\ARMV5\urel\3GPExtParser.map"
+153) Warning: Can't open "\epoc32\release\ARMV5\urel\VtcpCmpFilter.dll.map" or "\epoc32\release\ARMV5\urel\VtcpCmpFilter.map"
+154) Warning: Can't open "\epoc32\release\ARMV5\urel\SenVtcpTransport.dll.map" or "\epoc32\release\ARMV5\urel\SenVtcpTransport.map"
+155) Warning: Can't open "\epoc32\release\ARMV5\urel\wmdrmkeystorage.dll.map" or "\epoc32\release\ARMV5\urel\wmdrmkeystorage.map"
+156) Warning: Can't open "\epoc32\release\ARMV5\urel\hxwmdrmplugin.dll.map" or "\epoc32\release\ARMV5\urel\hxwmdrmplugin.map"
+157) Warning: Can't open "\epoc32\release\ARMV5\urel\wmdrmota.dll.map" or "\epoc32\release\ARMV5\urel\wmdrmota.map"
+158) Warning: Can't open "\epoc32\release\ARMV5\urel\wmdrmpkclient.dll.map" or "\epoc32\release\ARMV5\urel\wmdrmpkclient.map"
+159) Warning: Can't open "\epoc32\release\ARMV5\urel\wmdrmpkserver.exe.map" or "\epoc32\release\ARMV5\urel\wmdrmpkserver.map"
+160) Warning: Can't open "\epoc32\release\ARMV5\urel\wmdrmpkclientwrapper.dll.map" or "\epoc32\release\ARMV5\urel\wmdrmpkclientwrapper.map"
+161) Warning: Can't open "\epoc32\release\ARMV5\urel\cameseutility.dll.map" or "\epoc32\release\ARMV5\urel\cameseutility.map"
+162) Warning: Can't open "\epoc32\release\ARMV5\urel\wmdrmagent.dll.map" or "\epoc32\release\ARMV5\urel\wmdrmagent.map"
+163) Warning: Can't open "\epoc32\release\ARMV5\urel\wmdrmdla.dll.map" or "\epoc32\release\ARMV5\urel\wmdrmdla.map"
+164) Warning: Can't open "\epoc32\release\ARMV5\urel\wmdrmpd.dll.map" or "\epoc32\release\ARMV5\urel\wmdrmpd.map"
+165) Warning: Can't open "\epoc32\release\gcce\urel\zipmanager.exe.map" or "\epoc32\release\gcce\urel\zipmanager.map"
+166) Warning: Can't open "\epoc32\release\armv5\udeb\ctcmangui.exe.map" or "\epoc32\release\armv5\udeb\ctcmangui.map"
+167) Warning: Can't open "\epoc32\release\ARMV5\urel\eunits60gui.exe.map" or "\epoc32\release\ARMV5\urel\eunits60gui.map"
+168) Warning: Can't open "\epoc32\release\ARMV5\urel\eunitappenvironment.exe.map" or "\epoc32\release\ARMV5\urel\eunitappenvironment.map"
+169) Warning: Can't open "\epoc32\release\ARMV5\urel\euniteikappenvironment.exe.map" or "\epoc32\release\ARMV5\urel\euniteikappenvironment.map"
+170) Warning: Can't open "\epoc32\release\ARMV5\urel\qakitcommonui.dll.map" or "\epoc32\release\ARMV5\urel\qakitcommonui.map"
+171) Warning: Can't open "\epoc32\release\ARMV5\urel\digiaconnect.exe.map" or "\epoc32\release\ARMV5\urel\digiaconnect.map"
+
+Error(s):
+1) ERROR: (/epoc32/include/s60customswfeatures.xml) Feature "KFEATUREIDFFMOBILITYMANAGEMENTERRORS" already exists
+===============================================================================
+Target: core-image Duration: 14:26 Status: OK
+iMaker log = `Y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd_imaker_core-image.log'
+Core (ROM & ROFS1) dir = `y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd'
+Core ROM image = `y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd.rom.img'
+Core ROM symbols = `y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd.rom.symbol'
+Core ROFS1 image = `y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd.rofs1.img'
+Core ROFS1 symbols = `y:/output/release_flash_images/productexample/rnd/core/RM-XXX_010.014_79.92.2010.15_rnd/RM-XXX_010.014_79.92.2010.15_rnd.rofs1.symbol'
+-------------------------------------------------------------------------------
+Total duration: 14:29 Status: OK
+===============================================================================
\ No newline at end of file
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/tests/data/doc/input_for_failure/.static/default.css
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/tests/data/doc/input_for_failure/.static/default.css Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,843 @@
+/**
+ * Sphinx Doc Design
+ */
+
+body {
+ font-family: sans-serif;
+ font-size: 100%;
+ background-color: #11303d;
+ color: #000;
+ margin: 0;
+ padding: 0;
+}
+
+/* :::: LAYOUT :::: */
+
+div.document {
+ background-color: #1c4e63;
+}
+
+div.documentwrapper {
+ float: left;
+ width: 100%;
+}
+
+div.bodywrapper {
+ margin: 0 0 0 230px;
+}
+
+div.body {
+ background-color: white;
+ padding: 0 20px 30px 20px;
+}
+
+div.sphinxsidebarwrapper {
+ padding: 10px 5px 0 10px;
+}
+
+div.sphinxsidebar {
+ float: left;
+ width: 230px;
+ margin-left: -100%;
+ font-size: 90%;
+}
+
+div.clearer {
+ clear: both;
+}
+
+div.footer {
+ color: #fff;
+ width: 100%;
+ padding: 9px 0 9px 0;
+ text-align: center;
+ font-size: 75%;
+}
+
+div.footer a {
+ color: #fff;
+ text-decoration: underline;
+}
+
+div.related {
+ background-color: #133f52;
+ color: #fff;
+ width: 100%;
+ height: 30px;
+ line-height: 30px;
+ font-size: 90%;
+}
+
+div.related h3 {
+ display: none;
+}
+
+div.related ul {
+ margin: 0;
+ padding: 0 0 0 10px;
+ list-style: none;
+}
+
+div.related li {
+ display: inline;
+}
+
+div.related li.right {
+ float: right;
+ margin-right: 5px;
+}
+
+div.related a {
+ color: white;
+}
+
+/* ::: TOC :::: */
+div.sphinxsidebar h3 {
+ font-family: 'Trebuchet MS', sans-serif;
+ color: white;
+ font-size: 1.4em;
+ font-weight: normal;
+ margin: 0;
+ padding: 0;
+}
+
+div.sphinxsidebar h4 {
+ font-family: 'Trebuchet MS', sans-serif;
+ color: white;
+ font-size: 1.3em;
+ font-weight: normal;
+ margin: 5px 0 0 0;
+ padding: 0;
+}
+
+div.sphinxsidebar p {
+ color: white;
+}
+
+div.sphinxsidebar p.topless {
+ margin: 5px 10px 10px 10px;
+}
+
+div.sphinxsidebar ul {
+ margin: 10px;
+ padding: 0;
+ list-style: none;
+ color: white;
+}
+
+div.sphinxsidebar ul ul,
+div.sphinxsidebar ul.want-points {
+ margin-left: 20px;
+ list-style: square;
+}
+
+div.sphinxsidebar ul ul {
+ margin-top: 0;
+ margin-bottom: 0;
+}
+
+div.sphinxsidebar a {
+ color: #98dbcc;
+}
+
+div.sphinxsidebar form {
+ margin-top: 10px;
+}
+
+div.sphinxsidebar input {
+ border: 1px solid #98dbcc;
+ font-family: sans-serif;
+ font-size: 1em;
+}
+
+/* :::: MODULE CLOUD :::: */
+div.modulecloud {
+ margin: -5px 10px 5px 10px;
+ padding: 10px;
+ line-height: 160%;
+ border: 1px solid #cbe7e5;
+ background-color: #f2fbfd;
+}
+
+div.modulecloud a {
+ padding: 0 5px 0 5px;
+}
+
+/* :::: SEARCH :::: */
+ul.search {
+ margin: 10px 0 0 20px;
+ padding: 0;
+}
+
+ul.search li {
+ padding: 5px 0 5px 20px;
+ background-image: url(file.png);
+ background-repeat: no-repeat;
+ background-position: 0 7px;
+}
+
+ul.search li a {
+ font-weight: bold;
+}
+
+ul.search li div.context {
+ color: #888;
+ margin: 2px 0 0 30px;
+ text-align: left;
+}
+
+ul.keywordmatches li.goodmatch a {
+ font-weight: bold;
+}
+
+/* :::: COMMON FORM STYLES :::: */
+
+div.actions {
+ padding: 5px 10px 5px 10px;
+ border-top: 1px solid #cbe7e5;
+ border-bottom: 1px solid #cbe7e5;
+ background-color: #e0f6f4;
+}
+
+form dl {
+ color: #333;
+}
+
+form dt {
+ clear: both;
+ float: left;
+ min-width: 110px;
+ margin-right: 10px;
+ padding-top: 2px;
+}
+
+input#homepage {
+ display: none;
+}
+
+div.error {
+ margin: 5px 20px 0 0;
+ padding: 5px;
+ border: 1px solid #d00;
+ font-weight: bold;
+}
+
+/* :::: INLINE COMMENTS :::: */
+
+div.inlinecomments {
+ position: absolute;
+ right: 20px;
+}
+
+div.inlinecomments a.bubble {
+ display: block;
+ float: right;
+ background-image: url(style/comment.png);
+ background-repeat: no-repeat;
+ width: 25px;
+ height: 25px;
+ text-align: center;
+ padding-top: 3px;
+ font-size: 0.9em;
+ line-height: 14px;
+ font-weight: bold;
+ color: black;
+}
+
+div.inlinecomments a.bubble span {
+ display: none;
+}
+
+div.inlinecomments a.emptybubble {
+ background-image: url(style/nocomment.png);
+}
+
+div.inlinecomments a.bubble:hover {
+ background-image: url(style/hovercomment.png);
+ text-decoration: none;
+ color: #3ca0a4;
+}
+
+div.inlinecomments div.comments {
+ float: right;
+ margin: 25px 5px 0 0;
+ max-width: 50em;
+ min-width: 30em;
+ border: 1px solid #2eabb0;
+ background-color: #f2fbfd;
+ z-index: 150;
+}
+
+div#comments {
+ border: 1px solid #2eabb0;
+ margin-top: 20px;
+}
+
+div#comments div.nocomments {
+ padding: 10px;
+ font-weight: bold;
+}
+
+div.inlinecomments div.comments h3,
+div#comments h3 {
+ margin: 0;
+ padding: 0;
+ background-color: #2eabb0;
+ color: white;
+ border: none;
+ padding: 3px;
+}
+
+div.inlinecomments div.comments div.actions {
+ padding: 4px;
+ margin: 0;
+ border-top: none;
+}
+
+div#comments div.comment {
+ margin: 10px;
+ border: 1px solid #2eabb0;
+}
+
+div.inlinecomments div.comment h4,
+div.commentwindow div.comment h4,
+div#comments div.comment h4 {
+ margin: 10px 0 0 0;
+ background-color: #2eabb0;
+ color: white;
+ border: none;
+ padding: 1px 4px 1px 4px;
+}
+
+div#comments div.comment h4 {
+ margin: 0;
+}
+
+div#comments div.comment h4 a {
+ color: #d5f4f4;
+}
+
+div.inlinecomments div.comment div.text,
+div.commentwindow div.comment div.text,
+div#comments div.comment div.text {
+ margin: -5px 0 -5px 0;
+ padding: 0 10px 0 10px;
+}
+
+div.inlinecomments div.comment div.meta,
+div.commentwindow div.comment div.meta,
+div#comments div.comment div.meta {
+ text-align: right;
+ padding: 2px 10px 2px 0;
+ font-size: 95%;
+ color: #538893;
+ border-top: 1px solid #cbe7e5;
+ background-color: #e0f6f4;
+}
+
+div.commentwindow {
+ position: absolute;
+ width: 500px;
+ border: 1px solid #cbe7e5;
+ background-color: #f2fbfd;
+ display: none;
+ z-index: 130;
+}
+
+div.commentwindow h3 {
+ margin: 0;
+ background-color: #2eabb0;
+ color: white;
+ border: none;
+ padding: 5px;
+ font-size: 1.5em;
+ cursor: pointer;
+}
+
+div.commentwindow div.actions {
+ margin: 10px -10px 0 -10px;
+ padding: 4px 10px 4px 10px;
+ color: #538893;
+}
+
+div.commentwindow div.actions input {
+ border: 1px solid #2eabb0;
+ background-color: white;
+ color: #135355;
+ cursor: pointer;
+}
+
+div.commentwindow div.form {
+ padding: 0 10px 0 10px;
+}
+
+div.commentwindow div.form input,
+div.commentwindow div.form textarea {
+ border: 1px solid #3c9ea2;
+ background-color: white;
+ color: black;
+}
+
+div.commentwindow div.error {
+ margin: 10px 5px 10px 5px;
+ background-color: #fbe5dc;
+ display: none;
+}
+
+div.commentwindow div.form textarea {
+ width: 99%;
+}
+
+div.commentwindow div.preview {
+ margin: 10px 0 10px 0;
+ background-color: #70d0d4;
+ padding: 0 1px 1px 25px;
+}
+
+div.commentwindow div.preview h4 {
+ margin: 0 0 -5px -20px;
+ padding: 4px 0 0 4px;
+ color: white;
+ font-size: 1.3em;
+}
+
+div.commentwindow div.preview div.comment {
+ background-color: #f2fbfd;
+}
+
+div.commentwindow div.preview div.comment h4 {
+ margin: 10px 0 0 0!important;
+ padding: 1px 4px 1px 4px!important;
+ font-size: 1.2em;
+}
+
+/* :::: SUGGEST CHANGES :::: */
+div#suggest-changes-box input, div#suggest-changes-box textarea {
+ border: 1px solid #ccc;
+ background-color: white;
+ color: black;
+}
+
+div#suggest-changes-box textarea {
+ width: 99%;
+ height: 400px;
+}
+
+
+/* :::: PREVIEW :::: */
+div.preview {
+ background-image: url(style/preview.png);
+ padding: 0 20px 20px 20px;
+ margin-bottom: 30px;
+}
+
+
+/* :::: INDEX PAGE :::: */
+
+table.contentstable {
+ width: 90%;
+}
+
+table.contentstable p.biglink {
+ line-height: 150%;
+}
+
+a.biglink {
+ font-size: 1.3em;
+}
+
+span.linkdescr {
+ font-style: italic;
+ padding-top: 5px;
+ font-size: 90%;
+}
+
+/* :::: INDEX STYLES :::: */
+
+table.indextable td {
+ text-align: left;
+ vertical-align: top;
+}
+
+table.indextable dl, table.indextable dd {
+ margin-top: 0;
+ margin-bottom: 0;
+}
+
+table.indextable tr.pcap {
+ height: 10px;
+}
+
+table.indextable tr.cap {
+ margin-top: 10px;
+ background-color: #f2f2f2;
+}
+
+img.toggler {
+ margin-right: 3px;
+ margin-top: 3px;
+ cursor: pointer;
+}
+
+form.pfform {
+ margin: 10px 0 20px 0;
+}
+
+/* :::: GLOBAL STYLES :::: */
+
+.docwarning {
+ background-color: #ffe4e4;
+ padding: 10px;
+ margin: 0 -20px 0 -20px;
+ border-bottom: 1px solid #f66;
+}
+
+p.subhead {
+ font-weight: bold;
+ margin-top: 20px;
+}
+
+/* BMT added 16/10/08 */
+a:link {
+ color: blue;
+ text-decoration: none;
+}
+
+a:visited {
+ color: navy;
+ text-decoration: none;
+}
+
+a:hover {
+ color: purple;
+ text-decoration: underline;
+}
+/* BMT end added 16/10/08 */
+
+
+div.body h1,
+div.body h2,
+div.body h3,
+div.body h4,
+div.body h5,
+div.body h6 {
+ font-family: 'Trebuchet MS', sans-serif;
+ background-color: #f2f2f2;
+ font-weight: normal;
+ color: #20435c;
+ border-bottom: 1px solid #ccc;
+ margin: 20px -20px 10px -20px;
+ padding: 3px 0 3px 10px;
+}
+
+div.body h1 { margin-top: 0; font-size: 250%; color:black;}
+div.body h2 { font-size: 190%; color:#36237f;}
+div.body h3 { font-size: 150%; color:#4933af;}
+div.body h4 { font-size: 120%; color:#6223df;}
+div.body h5 { font-size: 100%; color:#6f23ef;}
+div.body h6 { font-size: 80%; color:#5a62ff;}
+
+a.headerlink {
+ color: #c60f0f;
+ font-size: 0.8em;
+ padding: 0 4px 0 4px;
+ text-decoration: none;
+ visibility: hidden;
+}
+
+h1:hover > a.headerlink,
+h2:hover > a.headerlink,
+h3:hover > a.headerlink,
+h4:hover > a.headerlink,
+h5:hover > a.headerlink,
+h6:hover > a.headerlink,
+dt:hover > a.headerlink {
+ visibility: visible;
+}
+
+a.headerlink:hover {
+ background-color: #c60f0f;
+ color: white;
+}
+
+div.body p, div.body dd, div.body li {
+ text-align: left;
+ line-height: 130%;
+}
+
+div.body p.caption {
+ text-align: inherit;
+}
+
+div.body td {
+ text-align: left;
+}
+
+ul.fakelist {
+ list-style: none;
+ margin: 10px 0 10px 20px;
+ padding: 0;
+}
+
+.field-list ul {
+ padding-left: 1em;
+}
+
+.first {
+ margin-top: 0 !important;
+}
+
+/* "Footnotes" heading */
+p.rubric {
+ margin-top: 30px;
+ font-weight: bold;
+}
+
+/* "Topics" */
+
+div.topic {
+ background-color: #eee;
+ border: 1px solid #ccc;
+ padding: 0 7px 0 7px;
+ margin: 10px 0 10px 0;
+}
+
+p.topic-title {
+ font-size: 1.1em;
+ font-weight: bold;
+ margin-top: 10px;
+}
+
+/* Admonitions */
+
+div.admonition {
+ margin-top: 10px;
+ margin-bottom: 10px;
+ padding: 7px;
+}
+
+div.admonition dt {
+ font-weight: bold;
+}
+
+div.admonition dl {
+ margin-bottom: 0;
+}
+
+div.admonition p {
+ display: inline;
+}
+
+div.seealso {
+ background-color: #ffc;
+ border: 1px solid #ff6;
+}
+
+div.warning {
+ background-color: #ffe4e4;
+ border: 1px solid #f66;
+}
+
+div.note {
+ background-color: #eee;
+ border: 1px solid #ccc;
+}
+
+p.admonition-title {
+ margin: 0px 10px 5px 0px;
+ font-weight: bold;
+ display: inline;
+}
+
+p.admonition-title:after {
+ content: ":";
+}
+
+div.body p.centered {
+ text-align: center;
+ margin-top: 25px;
+}
+
+table.docutils {
+ border: 0;
+}
+
+table.docutils td, table.docutils th {
+ padding: 1px 8px 1px 0;
+ border-top: 0;
+ border-left: 0;
+ border-right: 0;
+ border-bottom: 1px solid #aaa;
+}
+
+table.field-list td, table.field-list th {
+ border: 0 !important;
+}
+
+table.footnote td, table.footnote th {
+ border: 0 !important;
+}
+
+.field-list ul {
+ margin: 0;
+ padding-left: 1em;
+}
+
+.field-list p {
+ margin: 0;
+}
+
+dl {
+ margin-bottom: 15px;
+ clear: both;
+}
+
+dd p {
+ margin-top: 0px;
+}
+
+dd ul, dd table {
+ margin-bottom: 10px;
+}
+
+dd {
+ margin-top: 3px;
+ margin-bottom: 10px;
+ margin-left: 30px;
+}
+
+.refcount {
+ color: #060;
+}
+
+dt:target,
+.highlight {
+ background-color: #fbe54e;
+}
+
+dl.glossary dt {
+ font-weight: bold;
+ font-size: 1.1em;
+}
+
+th {
+ text-align: left;
+ padding-right: 5px;
+}
+
+pre {
+ padding: 5px;
+ background-color: #efc;
+ color: #333;
+ border: 1px solid #ac9;
+ border-left: none;
+ border-right: none;
+ overflow: auto;
+ word-wrap: break-word;
+}
+
+td.linenos pre {
+ padding: 5px 0px;
+ border: 0;
+ background-color: transparent;
+ color: #aaa;
+}
+
+table.highlighttable {
+ margin-left: 0.5em;
+}
+
+table.highlighttable td {
+ padding: 0 0.5em 0 0.5em;
+}
+
+tt {
+ background-color: #ecf0f3;
+ padding: 0 1px 0 1px;
+ font-size: 0.95em;
+}
+
+tt.descname {
+ background-color: transparent;
+ font-weight: bold;
+ font-size: 1.2em;
+}
+
+tt.descclassname {
+ background-color: transparent;
+}
+
+tt.xref, a tt {
+ background-color: transparent;
+ font-weight: bold;
+}
+
+.footnote:target { background-color: #ffa; }
+
+h1 tt, h2 tt, h3 tt, h4 tt, h5 tt, h6 tt {
+ background-color: transparent;
+}
+
+.optional {
+ font-size: 1.3em;
+}
+
+.versionmodified {
+ font-style: italic;
+}
+
+form.comment {
+ margin: 0;
+ padding: 10px 30px 10px 30px;
+ background-color: #eee;
+}
+
+form.comment h3 {
+ background-color: #326591;
+ color: white;
+ margin: -10px -30px 10px -30px;
+ padding: 5px;
+ font-size: 1.4em;
+}
+
+form.comment input,
+form.comment textarea {
+ border: 1px solid #ccc;
+ padding: 2px;
+ font-family: sans-serif;
+ font-size: 100%;
+}
+
+form.comment input[type="text"] {
+ width: 240px;
+}
+
+form.comment textarea {
+ width: 100%;
+ height: 200px;
+ margin-bottom: 10px;
+}
+
+.system-message {
+ background-color: #fda;
+ padding: 5px;
+ border: 3px solid red;
+}
+
+/* :::: PRINT :::: */
+@media print {
+ div.document,
+ div.documentwrapper,
+ div.bodywrapper {
+ margin: 0;
+ width : 100%;
+ }
+
+ div.sphinxsidebar,
+ div.related,
+ div.footer,
+ div#comments div.new-comment-box,
+ #top-link {
+ display: none;
+ }
+}
diff -r 01667c882e63 -r a010554f8551 buildframework/helium/tests/data/doc/input_for_failure/.templates/indexcontent.html
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/buildframework/helium/tests/data/doc/input_for_failure/.templates/indexcontent.html Mon Oct 11 11:16:47 2010 +0100
@@ -0,0 +1,38 @@
+{% extends "defindex.html" %}
+{% block tables %}
+