.. index::
  module: Configuring Metadata

====================
Configuring Metadata
====================

.. contents::

This document describes the purpose of metadata, how it is used in Helium, and
how it can be used by the customer.

Overview
========

Metadata processing extracts the errors and warnings from the output logs of the
different sections of the build and stores them in a database. This data can then be used
during each stage of the build for efficient processing, for sending details to Diamonds,
for signaling, and for generating the summary file.


Metadata Details
================

1. metadatarecord: used to store the error and warning information in the database.

1.1 It takes a metadatainput (the type of log parsing to be used). The currently supported inputs are:

a. sbsmetadatainput - sbs log processing (based on XML processing)

b. textmetadatainput - general text log processing (based on text processing)

c. policylogmetadatainput - policy log output processing (based on XML processing)

d. antlogmetadatainput - ant log output processing (based on text processing)

e. abldlogmetadatainput - abld log output processing (based on text processing)

Please see the Ant documentation for more details on metadatarecord.

1.2 It takes a fileset containing the list of log files.

1.3 It takes a metadata filterset, a list of regular expressions used to search for strings.

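A minimal sketch combining the three pieces above, using textmetadatainput for a plain-text log. The database and log directory properties come from this document; the fileset pattern is an illustrative assumption, not a Helium default:

.. code-block:: xml

    <hlm:metadatarecord database="${metadata.dbfile}">
        <!-- 1.1: the metadatainput selects the parsing type (general text here) -->
        <hlm:textmetadatainput>
            <!-- 1.2: the fileset listing the log files to parse (pattern is assumed) -->
            <fileset dir="${build.log.dir}" includes="*_build.log" />
            <!-- 1.3: the filterset holding the regular expressions to search for -->
            <metadatafilterset refid="filterset.common" />
        </hlm:textmetadatainput>
    </hlm:metadatarecord>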
Metadata Filters
================

This section describes the usage of metadata filters to change the severity level during different stages of the build.

Overview
--------

Metadata filters are sets of regular expressions used to match the text of the build output,
process the errors, and categorize them; the results are used to generate the output for
Diamonds, the summary file, and email output. A predefined set of ids is defined for each
stage of the build. For example, the filter for Raptor (sbs) compilation is defined as below.

The default definition of filterset.sbs is:

.. code-block:: xml

    <hlm:metadatafilterset id="filterset.sbs">
        <metadatafilterset refid="filterset.common" />
    </hlm:metadatafilterset>


which reuses the common definition:

.. code-block:: xml

    <hlm:metadatafilterset id="filterset.common">
        <metadatafilterset filterfile="${helium.dir}/config/metadata_regex.csv" />
    </hlm:metadatafilterset>


The complete list of predefined ids for the various stages of the build is defined in this file:

helium/config/metadata_filter_config_default.xml

Each id can be overridden to provide additional regular expressions that control the results of the build for different stages.

Two ways to add regular expressions
-----------------------------------

- Adding more than one regular expression

Define your own CSV file and override the id in your configuration as below (add this after importing the helium.ant.xml file):

.. code-block:: xml

    <hlm:metadatafilterset id="filterset.sbs">
        <metadatafilterset filterfile="${s60.config}/config/metadata_regex.csv" />
    </hlm:metadatafilterset>

- Adding just one regular expression

This can be done as below:

.. code-block:: xml

    <hlm:metadatafilterset id="filterset.sbs">
        <metadatafilter severity="error" regex=".*Error\s*:\s+.*" description="sbs compilation error" />
        <metadatafilterset filterfile="${helium.dir}/config/metadata_regex.csv" />
    </hlm:metadatafilterset>

Note
----

1. The order of metadatafilter / metadatafilterset elements is important: the first one takes precedence over the second.

2. Order is also preserved in the CSV file: the expressions defined first take precedence over later ones.

3. All the regular expressions are Java patterns.


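Since the filters are standard Java patterns (note 3), a candidate regular expression can be checked with java.util.regex before being added to a filterset. This standalone check is an illustration, not part of Helium; the sample log lines are invented:

.. code-block:: java

    import java.util.regex.Pattern;

    public class FilterRegexCheck {
        public static void main(String[] args) {
            // The sbs error filter regex from the example above, as a Java pattern.
            // Backslashes are doubled only because this is a Java string literal;
            // in the XML configuration the regex is written as .*Error\s*:\s+.*
            Pattern sbsError = Pattern.compile(".*Error\\s*:\\s+.*");

            // A line that should be categorized as an error
            System.out.println(sbsError.matcher("make: Error : target not found").matches());   // true
            // A line that should not match
            System.out.println(sbsError.matcher("compiled 42 files successfully").matches());   // false
        }
    }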
Usage in Helium
===============

For each build stage, Helium identifies the type of output and adds a metadatarecord
task that captures the output in the database. Once the data is stored, FMPP templates
process the error information from the database to send it to Diamonds and to raise
signals accordingly.

Usage
=====

Examples:

Recording SBS compilation output in the database:

.. code-block:: xml

    <hlm:metadatarecord database="${metadata.dbfile}">
        <hlm:sbsmetadatainput cleanLogFile="${sbs.clean.log.file}">
            <fileset casesensitive="false" file="${sbs.log.file}"/>
            <metadatafilterset refid="filterset.sbs" />
        </hlm:sbsmetadatainput>
    </hlm:metadatarecord>

This example processes sbs output. The metadatainput is sbsmetadatainput, which processes
the sbs log file: it takes sbs.log.file as input and uses the regular expressions defined by filterset.sbs.

.. code-block:: xml

    <hlm:metadatarecord database="${metadata.dbfile}">
        <hlm:abldmetadatainput>
            <fileset casesensitive="false" file="${build.log.dir}/${build.id}${cmaker.log.label}.export.cmaker.log" />
            <metadatafilterset refid="filterset.compile" />
        </hlm:abldmetadatainput>
    </hlm:metadatarecord>

This example processes the cmaker output as an abld output log. It takes abldmetadatainput
as the metadatainput, the log file ${build.log.dir}/${build.id}${cmaker.log.label}.export.cmaker.log,
and the regular expressions referenced by filterset.compile.

Any other log output file can be processed in a similar way.

Database schema
===============

The following diagram describes the current database schema (for SQL-based queries).

.. image:: metadata_schema.png
   :align: center


It is also possible to use the JPQL language, which allows the usage of the Java ORM mapping
classes. This means that database tables are represented by their Java model classes, and
table fields by the class attributes. This diagram describes the JPQL schema:

.. image:: metadata_jpql_schema.png
   :align: center


Example queries:

SQL::

   select * from metadataentry as e, severity as s where e.severity_id = s.severity_id and s.severity = 'error'

JPQL::

   select e from MetadataEntry e JOIN e.severity s WHERE s.severity = 'error'


Using the Metadata framework with FMPP
======================================

The Metadata framework makes it possible to record huge amounts of data in a fast and
reliable way (in terms of both time and memory consumption). Thanks to the ORMFMPPLoader
database loader, it is simple to access that data and render it in any other format: HTML
for an easy-to-read build summary, XML to communicate with other tools, plain text files,
and so on.

Loading a database
------------------

A database can be loaded and assigned to a template variable using the pp.loadData
functionality of the FMPP task. The 'com.nokia.helium.metadata.ORMFMPPLoader' loader
accepts one argument: the path to the database.

Example::

   <#assign database = pp.loadData('com.nokia.helium.metadata.ORMFMPPLoader', "C:/path/to/database_db") >


The database variable can then be used to access the database in the following ways:

- jpasingle: a query with a single result, e.g. select count(s) from Severity s
- jpa: allows iteration over the JPA object results, e.g. select s from Severity s
- native:<type>: a native SQL query; type determines the object used to hold the results

Accessing data using a JPA single query
---------------------------------------

The 'jpasingle' mode is the best way to access single-value results such as entity counts.
jpasingle queries must be written in JPQL; please check the database schema in the previous
section (case matters!).

Example of a template that will return the number of log files recorded in the database::

   <#assign database = pp.loadData('com.nokia.helium.metadata.ORMFMPPLoader', "C:/path/to/database_db") >
   Number of logfiles: ${database['jpasingle']['select count(l) from LogFile l'][0]}

Accessing data using a JPA query
--------------------------------

The 'jpa' mode allows you to perform a query and use the JPA entity objects directly inside
the template. jpa queries must be written in JPQL; please check the database schema in the
previous section (case matters!).

In the following example, the query loops through the available log files::

   <#assign database = pp.loadData('com.nokia.helium.metadata.ORMFMPPLoader', "C:/path/to/database_db") >
   <#list database['jpa']['select l from LogFile l'] as l>
   ${l.id}: ${l.path}
   </#list>


Accessing data using a native query
-----------------------------------

The native query enables you to perform SQL queries through the JDBC interface of the
database. If native mode is used, make sure you use the SQL schema.

In the following example, the query loops through the paths of the available log files::

   <#assign database = pp.loadData('com.nokia.helium.metadata.ORMFMPPLoader', "C:/path/to/database_db") >
   <#list database['native:java.lang.String']['SELECT l.PATH FROM LOGFILE as l'] as l>
   ${l}
   </#list>