Test Case Title
|TC6.3 - Creating ETL processes for automated data ingestion and data transformation|
Modify ETL processes for data transformation
Develop new ETL processes for data transformation with Pentaho Kettle, using: external services, direct access to the data store, and the Advanced Smart City API. This can be performed by: (i) downloading the provided VM, putting it in execution and developing on it; or (ii) accessing a ready-to-use VM via VNC for direct development.
Put the processes in execution, on demand or periodically, on the back office via DISCES.
Get the results from the executed processes.
Use a PC or mobile device with a web browser. Acquire a minimal skill in producing ETL processes; see the provided user manuals (Pentaho Kettle tool).
The following functionalities are available only for specific Snap4city users with specific privileges.
Expected successful result
|Please note that to correctly perform this Test Case you need access to the ETL Virtual Machine as described below. To have access to a Virtual Machine to perform ETL, please contact firstname.lastname@example.org.|
Users roles for which the test case is not available: Manager
User used to describe the Example: ‘snap4city’
- Collect data from external sources, including external services of third parties.
- Perform data transformation and analytics.
- Produce data to an external location.
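The three capabilities above follow the classic extract-transform-load pattern. A minimal sketch in Python is shown below; the URL, field names and record shape are illustrative assumptions, not actual Snap4city endpoints.

```python
# Minimal sketch of the extract-transform-load pattern described above.
# The URL and field names are illustrative placeholders, not Snap4city endpoints.
import csv
import json
import urllib.request

def extract(url):
    """Collect raw records from an external service (JSON assumed)."""
    with urllib.request.urlopen(url) as response:
        return json.load(response)

def transform(records):
    """Keep only the fields of interest and normalise values."""
    return [
        {"id": r["id"], "value": float(r["value"])}
        for r in records
        if "value" in r
    ]

def load(rows, path):
    """Produce the transformed data to an external location (a CSV file)."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "value"])
        writer.writeheader()
        writer.writerows(rows)
```

In a real Kettle job each of these functions corresponds to one or more transformation steps; the sketch only illustrates the flow of data between them.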
The referential data could be produced from:
- external sources via ETL, for example: traffic flows, parking status, etc.;
- real-time data arriving at the infrastructure via IoT Brokers: for example, sensors, actuators, etc.;
- results from Data Analytics processes in: ETL, C/C++, Java, Python, etc.;
- collective traces from mobile applications, via the Smart City API;
- crowd sourcing from the city users: votes, ranks, images, etc.;
- social media data: for example, Facebook, Twitter;
- events in the city: entertainment, traffic, accidents, etc.;
- Knowledge Base of the smart city;
- DataGate: producing data from those collected;
- Advanced Smart City API: producing data from mobile usage;
- results from Snap4City Applications;
- logs collected by the EventLogger.
See the following video to learn how to work in the development environment:
- VIDEO: 05-ETL-process-builder-VM.mp4, in Google Drive and FTP
- Open the Data Processing Development Environment into the VM
- Open the ETL process editor, Pentaho Kettle
- Open the DISCES Scheduler
- Load a new process and set its execution to periodic
- Open the R stat tool
- Edit an R file for reading Solr data of IoT devices or Phoenix/HBase data (such as the parking prediction), for example
- To test, click on the link to see a live example: https://www.km4city.org/webapp-new/?serviceuri=CarParkStazioneFirenzeS.M.N.
- Perform some descriptive analysis, correlation, PCA, etc.
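The descriptive analysis, correlation and PCA mentioned above can be sketched with plain NumPy; the matrix below is synthetic illustrative data, not real Snap4city measurements from Solr or HBase.

```python
# Sketch of descriptive statistics, correlation and PCA on a small synthetic
# dataset (illustrative values standing in for, e.g., parking sensor readings).
import numpy as np

data = np.array([
    [120,  80, 3],
    [150,  95, 5],
    [ 90,  60, 2],
    [200, 130, 8],
], dtype=float)  # rows: observations; columns: three hypothetical sensors

# Descriptive statistics per column
means = data.mean(axis=0)
stds = data.std(axis=0)

# Correlation matrix between the three sensors
corr = np.corrcoef(data, rowvar=False)

# PCA via eigen-decomposition of the covariance matrix
centered = data - means
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort components by explained variance
components = eigvecs[:, order]
projected = centered @ components[:, :2]   # project onto the first two components
```

An R script on the VM would perform the same steps with `cor()` and `prcomp()`; the NumPy version is shown only to make the computation explicit.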
- To develop ETLs with the Virtual Machine, two methodologies can be adopted (using the credentials reported in Section 3.4, Development Environment):
- CASE 1) Open the VM from the snap4city home page
- CASE 2) Open the VM directly from Teamviewer (deprecated)
CASE 1) Open the VM from the snap4city home page
- Go to the snap4city home page
- Log in
- If you have the developer permissions, you find the menu: ‘Development Tools > ETL development’
- Click on the menu and insert the correct password (the same credentials twice, Fig. a1, b)
Fig.a1: vnc connection.
Fig.b: Virtual machine connection.
For the snap4city public users or for the snap4city users with the Manager role, the Development tools are not available. For example, if a Manager (user name: ‘finaluser1’) goes to the snap4city home page, he/she cannot view the ‘Development’ menu. In fact, Managers are users with no permissions related to development activities.
Fig. a2: snap4city home page for Manager and public users.
The following steps apply both to CASE 1) and to CASE 2).
- Locate the dataset of interest; it can be static (e.g., ‘http://www.nk.hel.fi/avoindata/avoin_data-avustukset.xls’) or real-time. Then you can create the ETL (Extract, Transform and Load) process to ingest the data. The ETLs are realized with the Pentaho Kettle tool. It is possible to realize an ETL by: (i) downloading the provided VM, putting it in execution and developing on it; or (ii) accessing a ready-to-use VM via VNC for direct development. Procedure (ii), using the Virtual Machine (VM) via VNC, is now described in detail.
- Open a Terminal emulator (search for the icon at the bottom of the screen, or click on the ‘Application’ menu > ‘Terminal Emulator’), then write the command ‘spoon.sh’ and press Enter, Fig. d.
Fig. d – Virtual Machine (VM) Terminal Emulator (command line).
- The Spoon interface will open (Fig. e); now it is possible to create/load ETL processes.
Fig. e – Spoon Interface.
- In the case of the dataset ‘Helsinki_youth_subsidies’, we have already created an ETL capable of: i) downloading the dataset; ii) transforming it; iii) saving the new data in a file (into the VM). To verify the ETL functionalities, follow these steps:
- Click on the menu and select ‘File > Open’, then select the file from ‘Ubuntu/Desktop/snap4cityETL/Helsinki_youth_subsidies_XLS/Ingestion/Maink.kjb’, Fig. f.
- In the Spoon Interface the Main Spoon Job appears, Fig. g.
Fig. f – Spoon Interface: open the ‘Helsinki_youth_subsidies_XLS’ ETL.
Fig. g – Spoon Interface: the ‘Helsinki_youth_subsidies_XLS’ ETL opened.
- To run the ETL process, click on the main ‘Run’ button; a popup will open with a set of default values (including the URL from which to download the dataset: ‘’); click on the second ‘Run’ button, Fig. h.
Fig. h – Spoon Interface: launch an ETL (‘Run button’).
- Once the process is launched, a set of windows will open. A relevant one is the ‘Logging’ window, from which it is possible to read the logs. When the logging window reports the sentence ‘Spoon – job has ended’, the job has ended with no errors, Fig. i.
Fig. i – Spoon Interface: ETL executes, logging details.
- What are the actions performed by this process, and how is it possible to verify them? Some blocks visible in Fig. i can be expanded with a double left click or a single right click of the mouse:
- block ‘Start’: starts the process (see its details with ‘right click > edit’);
- block ‘Create original data folder’: if it does not exist, creates a new folder for the downloaded file: ‘/home/ubuntu/Desktop/ETL_Output/Helsinki_youth_subsidies_XLS/Original_data’;
- block ‘HTTP’: makes the HTTP REST request (to ‘http://www.nk.hel.fi/avoindata/avoin_data-avustukset.xls’) and saves the XLS file (named ‘avoin_data-avustukset.xls’) in the folder created in the previous block;
- block ‘File Exists’: checks whether the downloaded file is present;
- block ‘Create Elaborated data folder’: creates ‘/home/ubuntu/Desktop/ETL_Output/Helsinki_youth_subsidies_XLS/Elaborated_data’;
- block ‘DUMMY’: does nothing;
- block ‘process data’: reads the downloaded file, performs the data analysis, and saves the result in the new file ‘/home/ubuntu/Desktop/ETL_Output/Helsinki_youth_subsidies_XLS/Elaborated_data/Result_data.csv’;
- block ‘Success’: ends the process.
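The chain of blocks described above can be sketched as a plain Python pipeline. The folder paths mirror the ones reported in the text; the download and ‘process data’ steps are simplified illustrations of what the Kettle blocks do (the elaboration here only records the file size), not the blocks themselves.

```python
# Sketch of the Helsinki_youth_subsidies_XLS job as a plain Python pipeline.
# Paths mirror those in the text; the 'process data' step is a simplified
# stand-in for the real elaboration performed by the Kettle transformation.
import csv
import os
import urllib.request

BASE = "/home/ubuntu/Desktop/ETL_Output/Helsinki_youth_subsidies_XLS"

def run_job(url, base=BASE):
    original = os.path.join(base, "Original_data")
    elaborated = os.path.join(base, "Elaborated_data")
    # block 'Create original data folder': create it only if missing
    os.makedirs(original, exist_ok=True)
    # block 'HTTP': download the source file into the folder just created
    target = os.path.join(original, "avoin_data-avustukset.xls")
    urllib.request.urlretrieve(url, target)
    # block 'File Exists': check that the downloaded file is actually there
    if not os.path.exists(target):
        raise FileNotFoundError(target)
    # block 'Create Elaborated data folder' + 'process data': read the file,
    # apply some elaboration (here: record its size), write Result_data.csv
    os.makedirs(elaborated, exist_ok=True)
    result = os.path.join(elaborated, "Result_data.csv")
    with open(result, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["source_file", "size_bytes"])
        writer.writerow([target, os.path.getsize(target)])
    return result  # block 'Success': the job ends here
```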
- How to verify, Fig. l:
- Search for the folder in the file system and verify the presence of the files; you can also delete them and launch the ETL process again, Fig. l.
Fig. l – How to verify the Helsinki_youth_subsidies_XLS results.
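The verification step can also be performed with a short script that reports whether the expected output files are present; the paths are the ones used by the Helsinki_youth_subsidies_XLS job.

```python
# Report which of the expected ETL output files are present on the VM.
import os

EXPECTED = [
    "/home/ubuntu/Desktop/ETL_Output/Helsinki_youth_subsidies_XLS/"
    "Original_data/avoin_data-avustukset.xls",
    "/home/ubuntu/Desktop/ETL_Output/Helsinki_youth_subsidies_XLS/"
    "Elaborated_data/Result_data.csv",
]

def check(paths):
    """Return a {path: exists} map for the expected ETL outputs."""
    return {p: os.path.exists(p) for p in paths}
```

Deleting the listed files and relaunching the ETL should make `check(EXPECTED)` report both paths present again.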
Example 3: Access a large library of ETL processes for smart cities from GitHub:
- An ample collection of ETL processes can be obtained from the GitHub repository of the DISIT lab. Link: https://github.com/disit/smart-city-etl, Fig. n.
Fig. n – Process Loader: List of Public Processes.
Example 4: Modify ETL processes for data transformation
- To modify a process, it is possible to follow the steps explained in the previous point 1 (from Fig. a to Fig. l). For example, it is possible to open one of the ETLs present in the Virtual Machine and modify it. It is possible to open the ETL called ‘Florence_Pharmacies_csv’, capable of: (i) downloading a dataset; (ii) transforming it; (iii) saving the transformed data in a file (into the VM):
- Launch the Spoon tool (see point) and open the ETL (click on the menu and select ‘File > Open’ > search for the file ‘Desktop/snap4cityETL/Florence_Pharmacies_csv/Ingestion/Main.kjb’), Fig. o.