version 2.0 init
author Tuomas Peltonen <tuomas.peltonen@stuk.fi>
Fri, 13 Dec 2013 10:22:09 +0000 (12:22 +0200)
committer Tuomas Peltonen <tuomas.peltonen@stuk.fi>
Fri, 13 Dec 2013 10:22:09 +0000 (12:22 +0200)
18 files changed:
INSTALL
README
baltrad_singlelayer.kml
baltrad_wms.cfg [deleted file]
baltrad_wms.cfg.template [new file with mode: 0644]
baltrad_wms.map
baltrad_wms.py
baltrad_wms.wsgi
baltrad_wms_tools.py
cleaner.py [new file with mode: 0644]
clear_baltrad_wms_data.py
configurator.py [new file with mode: 0644]
db_setup.py [new file with mode: 0644]
demo/demo.html
demo/demo.js
fmi_open.py [new file with mode: 0755]
h5togeotiff.py
update_baltrad_wms.py

diff --git a/INSTALL b/INSTALL
index f5294d0..da0bef0 100644 (file)
--- a/INSTALL
+++ b/INSTALL
-Updating existing installion
+Installation
+============
+
+Installation of dependencies
 ----------------------------
-- Backup your baltrad_wms.cfg
-- Extract new version and replace baltrad_wms.cfg file there
-- Set config paths to "baltrad_wms.py" and "update_baltrad_wms.py"
-- Run clear_baltrad_wms_data.py
-- Run update_baltrad_wms.py to generate tiffs
+In principle everything here is platform independent, but the latest version has been tested only on Ubuntu 12.04. If you are not using Ubuntu, check the README file for dependencies and install the required software.
+
+In Ubuntu version 12.04 the installation of dependencies is easy. Just run:
+# sudo apt-get install python-mapscript python-h5py python-pyproj python-numpy python-gdal python-sqlalchemy python-sqlite
+(python-sqlite is not required but is recommended for testing DB storage; alternatively you can use PostgreSQL or MySQL as the DB engine)
 
-The first installation
+Set up a web server to provide the service using either the Python CGI or the WSGI script. For testing purposes you can run 'python baltrad_wms.wsgi' from the command line, after which the WMS service is available at http://localhost:8080/. This setup is recommended only for testing; set up Apache or some other web server software if you plan to deploy. To test the stand-alone WSGI script you can try to access http://localhost:8080/?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetCapabilities
+
+Apache can be installed in Ubuntu with command:
+sudo apt-get install apache2
+
+Configuration of files, DB and Web service locations 
+----------------------------------------------------
+Copy baltrad_wms.cfg.template to baltrad_wms.cfg
+# cp {baltrad_wms_dir}/baltrad_wms.cfg.template {baltrad_wms_dir}/baltrad_wms.cfg
+
+Open baltrad_wms.cfg for editing. 
+
+The locations section is the most important one; nothing works without it. The settings are explained below:
+* db_uri (required)
+    * DB connection URI. For SQLite this is something like sqlite:////home/user/baltrad/radar_datasets.db; for PostgreSQL something like postgres://postgres:model_postgres@localhost:5432/radardb. SQLite can create the database itself, but with PostgreSQL it must be created beforehand. More information about the DB engines supported by SQLAlchemy: http://docs.sqlalchemy.org/en/rel_0_9/core/engines.html.
+* baltrad_data_dir (optional; required when using BALTRAD HDF5 data)
+    * Path to directory where BALTRAD HDF5 files are located. Files must be in this directory - not in subdirectories
+* wms_data_dir (required)
+    * Path to the directory where TIFF files are located. These are the files that are visualized; they are stored here, not in the DB. If this directory doesn't exist it will be created by the DB population scripts.
+* mapfile (required)
+    * Absolute path to baltrad_wms.map file. This file is located in installation directory
+* online_resource (required)
+    * URL of WMS script for clients. If stand-alone wsgi script is used this is http://localhost:8080/. In the case of Apache this could be something like http://yourdomain.com/baltrad/baltrad_wms.py
+* tmpdir (optional; required when the baltrad_wms_tools.py script is used)
+    * Directory for temporary files. For Linux environment this can be set to /tmp
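The db_uri value is a standard SQLAlchemy engine URL. As a rough illustration (a hypothetical helper, not part of baltrad-wms), the dialect can be sanity-checked from the URI scheme before you commit it to the config file:

```python
# Hypothetical helper for checking a db_uri value before putting it
# into baltrad_wms.cfg. Not part of baltrad-wms; stdlib only.
from urllib.parse import urlsplit

def check_db_uri(db_uri):
    """Return the DB dialect name, or raise ValueError for an unusable URI."""
    scheme = urlsplit(db_uri).scheme
    # SQLAlchemy also accepts dialect+driver schemes such as "postgresql+psycopg2"
    dialect = scheme.split("+")[0]
    if not dialect:
        raise ValueError("db_uri must start with a dialect, e.g. sqlite:// or postgres://")
    return dialect

print(check_db_uri("sqlite:////home/user/baltrad/radar_datasets.db"))     # sqlite
print(check_db_uri("postgres://postgres:model_postgres@localhost:5432/radardb"))  # postgres
```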
+
+Configuration of other settings and logging level
+--------------------------------------------------
+Next in the baltrad_wms.cfg file are settings and logging sections.
+
+It is recommended to set enable_contour_maps to false. The contour map capability requires the very latest MapServer version (6.4.0).
+
+There is also the optional fmi_api_key setting. It is required if you use FMI Open data, but not for users who use only BALTRAD data.
+
+In the logging section there is the level setting. For testing, info is probably enough, but if you want to see what is really happening set it to debug, which is very verbose. In production this must be set to critical or error.
+
+Configuration of styles
+-----------------------
+The next four sections in the template file are styles. Style sections follow this syntax:
+[Style_name]
+[number] = [name],[from_value],[to_value],[R],[G],[B]
+
+Styles have been adopted from FMI Opendata resources project (https://github.com/fmidev/opendata-resources) and you don't have to touch them if you are satisfied with the colours.
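As an illustration of the syntax above, a single style entry can be decomposed like this (a hypothetical sketch; the actual parsing is done inside the baltrad-wms scripts):

```python
# Hypothetical sketch of how one style entry decomposes, for illustration only.
# Syntax: number = name,from_value,to_value,R,G,B
def parse_style_entry(value):
    name, lo, hi, r, g, b = value.split(",")
    return {"name": name,
            "from": float(lo), "to": float(hi),
            "rgb": (int(r), int(g), int(b))}

entry = parse_style_entry("heavy,112,124,241,243,90")
print(entry)  # {'name': 'heavy', 'from': 112.0, 'to': 124.0, 'rgb': (241, 243, 90)}
```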
+
+Dataset sections
+----------------
+Finally you have to configure the datasets. Dataset section names are dataset_1, dataset_2, etc. This is very important: if you give them any other name the scripts will not find them.
+
+In the template file you can find templates for both BALTRAD data and FMI Open data. With BALTRAD data you must check the contents of your HDF5 files to get the parameters.
+
+Dataset parameters are:
+* name (required)
+    * Unique name for the dataset. This is the layer name in WMS requests. Avoid any special characters here (including whitespace).
+* title (required)
+    * Set the dataset title. This can be the same as name.
+* dataset_type (required)
+    * Either hdf or geotiff. For BALTRAD HDF5 files set this to hdf; for FMI Open data set this to geotiff.
+* hdf_dataset (optional; required for BALTRAD data)
+    * Path of HDF dataset in HDF5 file. In ODIM_H5 file this is dataset1/data1, dataset1/data2 etc.
+* unit (required)
+    * Set unit name. Possible values are dBZ, mm/h and mm
+* style (required)
+    * This refers to style section name. The style section with this name must exist in config file
+* cleanup_time (required)
+    * Dataset cleanup time in hours. Set this to -1 for no clean-up. With a constant data flow the file storage grows rapidly, so setting this is highly recommended; for testing purposes the value -1 can be used. The script reads the time from the system clock, so make sure it is correct.
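The clean-up rule described above can be sketched as follows (assumed logic, for illustration only; the real decision is made inside the update scripts):

```python
# Sketch of the cleanup_time rule: a dataset older than cleanup_time hours
# is removed; -1 disables clean-up entirely. Assumed logic, for illustration.
from datetime import datetime, timedelta

def should_clean(file_time, cleanup_time_hours, now=None):
    if cleanup_time_hours == -1:
        return False  # no clean-up
    now = now or datetime.utcnow()
    return file_time < now - timedelta(hours=cleanup_time_hours)

now = datetime(2013, 12, 13, 12, 0)
print(should_clean(datetime(2013, 12, 1), 120, now))        # True: older than 120 h
print(should_clean(datetime(2013, 12, 13, 11, 0), 120, now))  # False: only 1 h old
```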
+
+Setup configurator.py
+---------------------
+Open configurator.py and modify the variable path to point to your baltrad_wms.cfg.
+
+Setting up of database
 ----------------------
-1. Install dependencies, see sections below
+Run 
+# python db_setup.py
+
+You are first asked for confirmation. Answer 'y' here. 
+
+If you are not using SQLite you have to create the database by hand; in PostgreSQL this can be done with the createdb command.
+
+Setting up of BALTRAD datasets
+------------------------------
+NOTE: If you are using only BALTRAD data you can delete dataset sections 3-7 from baltrad_wms.cfg file
+
+Make sure you have HDF5 files in your baltrad_data_dir (see baltrad_wms.cfg). Then run 
+# python update_baltrad_wms.py
+
+The script populates the database with the files and cleans up old ones.
+
+In a deployment setup this script can be run as a cron job with logging level 'critical'.
+
+Setting up of FMI Open datasets
+-------------------------------
+NOTE: If you are using only FMI open data you can delete dataset sections 1-2 from baltrad_wms.cfg file
+
+Make sure you have set your fmi_api_key in your config file.
 
-2. Configure files
+Then run:
+# python fmi_open.py
+
+The script populates the database with new datasets and cleans up old ones.
+
+In a deployment setup this script can be run as a cron job with logging level 'critical'.
+
+Setting up of a Web server
+--------------------------
+To serve images over HTTP or HTTPS you have to set up a web server. You have two options for serving: Python CGI (the traditional way) or Python WSGI (the modern way). The Apache2 configurations for both are explained here.
 
-Open file "baltrad_wms.cfg" for editing. There are five values that must be set.
-* baltrad_data_dir: Directory where Baltrad HDF5 files are stored. The files must be in this directory, not in subdirectories.
-* wms_data_dir: Directory where GeoTIFF files converted from HDF5 files are stored. This can be in a temporary directory that is not cleaned up too often.
-* mapfile: Location of Mapserver mapfile. Set this to directory where baltrad_wms is installed.
-* datasets: Location of datasets file that stores metadata (this is like database but text-based). Set this to directory where baltrad_wms installed. 
-* online_resource: URL of baltrad wms script
-* tmpdir: directory for temporary files, optional (used only by baltrad wms tools)
+In Ubuntu 12.04 Apache configuration is in the file /etc/apache2/httpd.conf.
 
-Optionally you can also set datasets:
-* dataset_1, dataset_2 etc. are the products in HDF5 to be visualized.
-  * name: unique dataset name. names must match with layer names in baltrad_wms.map file
-  * hdf_dataset: dataset path in HDF5 file
-  * unit: Unit of data
+To serve via Python CGI add these lines to your Apache2 config (NOTE: modify baltrad-wms directory path!):
+       ScriptAlias /baltrad/ /home/user/baltrad-wms/
+       <Directory "/home/user/baltrad-wms">
+               AllowOverride None
+               Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
+               Order allow,deny
+               Allow from all
+       </Directory>
 
-Open files "baltrad_wms.py" and "update_baltrad_wms.py" and set the variable config_path located in the line number 3 in both files. This is the path to the file that was edited in the previous stage.
+Alternatively you can use the WSGI script baltrad_wms.wsgi.
 
-If you want to use WSGI script instead of CGI script then edit baltrad_wms.wsgi
+To run WSGI programs you have to install mod_wsgi:
+#  sudo apt-get install libapache2-mod-wsgi
 
-3. Test update_baltrad_wms.py
+Then you can add the line
+WSGIScriptAlias /baltrad_wsgi /home/user/baltrad-wms/baltrad_wms.wsgi
 
-Run "python update_baltrad_wms.py". If everything is successfully installed it shouldn't output anything. This script checks if there are new of deprecated files in HDF5 directory and converts them to GeoTIFFs and updates baltrad_dataset.dat file. The script is designed to be run from cron. So you can install this to crontab if you want datasets regulary updated.
+to your Apache config
 
-4. Test baltrad_wms.py or baltrad_wms.wsgi
+Restart Apache:
+# sudo service apache2 restart
 
-You can run WMS script using CGI or WSGI handler. WSGI script includes possibility to run without Web server. Simply run ./baltrad.wsgi to do that. This is not recommended for production environment but it is useful for testing purposes. 
+Now you should be able to access http://localhost/baltrad/baltrad_wms.py or http://localhost/baltrad_wsgi via Web browser.
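The test URLs above can be built with the standard library, which avoids typos in the WMS query string (the /baltrad_wsgi path comes from the WSGIScriptAlias example above):

```python
# Build a WMS GetCapabilities test URL with the stdlib. The host and the
# /baltrad_wsgi path are taken from the Apache example above.
from urllib.parse import urlencode

params = {"SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetCapabilities"}
url = "http://localhost/baltrad_wsgi?" + urlencode(params)
print(url)  # http://localhost/baltrad_wsgi?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetCapabilities
```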
 
-This is a CGI script. Copy this file to directory where it's allowed to run CGI scripts (By default in Apache2/Ubuntu it is /usr/lib/cgi-bin). Test if it works with Web browser (for example http://localhost/cgi-bin/baltrad_wms.py). If everything is OK, it should output something like "No query information to decode. QUERY_STRING is set, but empty"
+Other installation issues
+-------------------------
+For a fresh start, run
+# python clear_baltrad_wms_data.py
+
+Then go back to topic 'Setting up of database'. The script removes TIFF files and clears DB entries. This can be useful for example when migrating from one DB engine to another.
 
 Setting up of a demo
---------------------
-To test wms script with OpenLayers demo (or any other WMS client) you must have at least one dataset in your HDF5 directory and update_baltrad_wms.py executed after adding dataset(s). Running update_baltrad_wms.py might take for while if you have a large number of HDF5 files.
+====================
+To test the WMS script with the OpenLayers demo (or any other WMS client) you must have either BALTRAD data or FMI Open data in your DB.
+
+Copy or symlink the demo directory to a location the web server can serve, or make it available with an alias. For example, with Apache you can do it like this:
+Alias /baltrad_demo/ /home/user/baltrad-wms/demo/
+<Directory "/home/user/baltrad-wms/demo">
+    Order allow,deny
+    Allow from all
+</Directory>
+
+Now your demo is available at http://localhost/baltrad_demo/demo.html
 
-Copy or symlink demo directory to the directory available for Web server. Edit demo.js and set variable wms_url (at the beginning of the code) to the URL of baltrad_wms.py. It should be same as "online_resource" value in baltrad_wms.cfg file. You can't run demo locally because the application uses AJAX call to fetch timestamps and it's allowed only from the same domain for security reasons. If you try the demo locally you can only see the newest dataset.
+Edit demo.js and set the variable wms_url (at the beginning of the code) to the URL of the baltrad_wms script. It should be the same as the "online_resource" value in the baltrad_wms.cfg file. You can't run the demo locally because the application uses an AJAX call to fetch timestamps, which is allowed only from the same domain for security reasons. If you try the demo locally you will only see the newest dataset.
 
 You can also test the WMS service with Google Earth (see http://earth.google.com/support/bin/static.py?page=guide.cs&guide=22373&topic=22376&answer=148100#wms) or any other WMS client. Unfortunately the time dimension is not supported by common WMS clients, in which case only the newest dataset is shown.
 
+Deployment examples
+===================
+The scripts update_baltrad_wms.py and fmi_open.py are designed to be run as cron jobs. Deployment examples for both data types are explained below.
 
-Installation of dependencies on Centos5
----------------------------------------
-Test platform:
-Centos 5.5 64-bin where Python 2.6.7 installed from sctrach.
-
-1. Install RPMForge repository
-(http://wiki.centos.org/AdditionalResources/Repositories/RPMForge)
-wget http://packages.sw.be/rpmforge-release/rpmforge-release-0.5.2-2.el5.rf.x86_64.rpm
-rpm --import http://apt.sw.be/RPM-GPG-KEY.dag.txt
-rpm -i rpmforge-release-0.5.2-2.el5.rf.*.rpm
-
-2. Install dependencies using yum
-
-These are needed for compiling GDAL and Mapserver:
-yum install proj-devel agg-devel geos-devel gd-devel giflib-devel libxml2-devel hdf5-devel
-
-And some essential packages for building (which you probably already
-have):
-yum install gcc gcc-c++
-
-And finally Web server
-yum install httpd
-
-2. Install GDAL (NOTE: Numpy must be installed before this. see section
-4)
-wget http://download.osgeo.org/gdal/gdal-1.9.2.tar.gz
-tar -zxvf gdal-1.9.2.tar.gz
-cd gdal-1.9.2
-./configure --with-libz=internal --with-curl=/usr/bin/curl-config
---with-png=internal --with-libtiff=internal --with-geotiff=internal
---with-jpeg=internal --with-gif=internal --with-geos=yes
---with-threads=yes --with-poppler=yes --with-python --with-xerces=yes
---with-expat=yes --without-libtool
-make
-make install
-
-3. Install Mapserver (you can try to install newer version too)
-wget http://download.osgeo.org/mapserver/mapserver-6.2.0.tar.gz
-tar -zxvf mapserver-6.2.0.tar.gz
-cd mapserver-6.2.0
-./configure --with-geos=/usr/bin/geos-config \
---with-gdal=/usr/local/bin/gdal-config \
---with-ogr=/usr/local/bin/gdal-config \
---with-agg --with-freetype \
---with-proj=/usr --with-wfs \
---with-gd=/usr
-make
-cd mapscript/python
-python setup.py install
-
-Note: make sure /usr/local/lib is at LD_LIBRARY_PATH
-Daniel's note: 
-"For some reason, I had to tweak the mapserver's 
-Makefile manually so that /usr/local/include,lib came before 
-/usr/include,lib..."
-
-4. Finally install other Python libs
-Here python-setuptools is used, which was installed like this:
-wget http://pypi.python.org/packages/2.6/s/setuptools/setuptools-0.6c11-py2.6.egg#md5=bfa92100bd772d5a213eedd356d64086
-sh setuptools-0.6c11-py2.6.egg
-
-and then...
-easy_install numpy
-easy_install h5py
-easy_install pyproj
-easy_install GDAL
-
-Installation of dependencies on Ubuntu (10.04-)
------------------------------------------------
-1. Install dependencies using packages from UbuntuGIS repository
-
-Add UbuntuGIS repository, see https://wiki.ubuntu.com/UbuntuGIS
+If you install the scripts in crontab, set the logging level to error or critical!
 
-Then run:
-sudo apt-get install python-mapscript python-h5py python-pyproj python-numpy python-gdal
+BALTRAD deployment
+------------------
+The script update_baltrad_wms.py polls a single directory where the BALTRAD HDF5 files are located, so you have to set up a constant data feed and clean-up for this directory yourself. The original file path of each HDF5 file is stored in the DB; if a file has already been processed the script doesn't process it again.
 
-Apache can be installed in Ubuntu with command:
-sudo apt-get install apache2
+After you have a constant data updater and cleaner in place you can add the cron job. For example, to run the DB updater and cleaner every 5 minutes add this line to your crontab:
+#*/5 * * * * /home/user/baltrad-wms/update_baltrad_wms.py
+
+The script handles both updating and cleaning of your DB.
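The skip-already-processed behaviour described above can be sketched like this (assumed logic, for illustration only; the real script keeps the processed paths in the DB):

```python
# Illustration (assumed logic) of "skip already processed files": paths that
# are already stored in the DB are not converted again on the next cron run.
def new_files(data_dir_listing, processed_paths):
    """Return HDF5 files that have not been processed yet."""
    return [p for p in data_dir_listing
            if p.endswith(".h5") and p not in processed_paths]

seen = {"/data/radar_201312131000.h5"}       # already in the DB
incoming = ["/data/radar_201312131000.h5",
            "/data/radar_201312131015.h5"]   # current directory contents
print(new_files(incoming, seen))  # ['/data/radar_201312131015.h5']
```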
+
+FMI Open data deployment
+------------------------
+Unlike in the BALTRAD case, the fmi_open.py script doesn't poll files; instead it reads the online WFS resource to check for new datasets. A list of datasets is at the beginning of the script; only the most common ones are listed there, and you can add more.
+
+To add the FMI Open updater to your crontab, add for example the following line to run the script every 5 minutes:
+#*/5 * * * * /home/user/baltrad-wms/fmi_open.py
+
+The script handles both updating and cleaning of your DB.
diff --git a/README b/README
index 6b19e39..42689fa 100644 (file)
--- a/README
+++ b/README
@@ -4,7 +4,9 @@ by Tuomas Peltonen, STUK <tuomas.peltonen@stuk.fi>
 
 Dependencies
 ------------
-* Mapserver with Python bindings (http://mapserver.org/)
+* Python version 2.6 or higher
+* SQLAlchemy (http://www.sqlalchemy.org/)
+* Mapserver version 6 or later with Python bindings (http://mapserver.org/)
 * h5py (http://code.google.com/p/h5py/)
 * pyproj (http://code.google.com/p/pyproj/)
 * numpy (http://numpy.scipy.org/)
@@ -13,6 +15,15 @@ Dependencies
 
 History
 -------
+13.12.2013
+* Added support for FMI Open data
+* Migrated metadata storage from file to DB
+* Improved logging
+* Improved colour scales
+* Added support for contour plots
+* Documentation updated
+* Many minor improvements
+
 30.1.2013
 * Added WSGI script for baltrad_wms
 
@@ -31,7 +42,3 @@ History
 6.10.2011: 
 * First version 0.1 released for testing
 
-TODO list
----------
-Defects:
-* fix scale (colors, numbers etc.)
index 3c75599..70a0d1a 100644 (file)
@@ -5,7 +5,7 @@
   <description>Baltrad radar data exported to KML format</description>
   <Folder>
     <name>REPLACE_NAME</name>
-    <description>This is metadata and will be replaced with description someday.</description>
+    <description>Radar image generated by BALTRAD WMS</description>
     <!--
     <GroundOverlay>
       <name>Data</name>
diff --git a/baltrad_wms.cfg b/baltrad_wms.cfg
deleted file mode 100644 (file)
index ac35506..0000000
+++ /dev/null
@@ -1,18 +0,0 @@
-[locations]
-baltrad_data_dir = /home/user/baltrad/baltrad_data_sync/datadir
-wms_data_dir = /tmp/wms_data
-mapfile = /home/user/baltrad/baltrad_wms/baltrad_wms.map
-datasets = /home/user/baltrad/baltrad_wms/baltrad_datasets.dat
-online_resource = http://localhost/cgi-bin/baltrad_wms.py
-tmpdir = /tmp
-
-# dataset names must match with mapfile and they must be unique!
-[dataset_1]
-name = baltrad_dbz
-hdf_dataset = dataset1/data1
-unit = dBZ
-
-#[dataset_2]
-#name = baltrad_mmh
-#hdf_dataset = dataset1/data2
-#unit = mm/h
diff --git a/baltrad_wms.cfg.template b/baltrad_wms.cfg.template
new file mode 100644 (file)
index 0000000..5ea10a7
--- /dev/null
@@ -0,0 +1,201 @@
+[locations]
+db_uri = sqlite:////home/user/baltrad-wms/radar_datasets.db
+baltrad_data_dir = /home/user/baltrad_data
+wms_data_dir =/home/user/baltrad_wms_data
+mapfile = /home/user/baltrad-wms/baltrad_wms.map
+online_resource = http://localhost:8080/
+tmpdir = /tmp
+
+[settings]
+# true | false
+enable_countour_maps = false
+fmi_api_key = insert-your-fmiapikey-here
+
+[logging]
+# critical, error, warn, info, or debug
+level = debug
+
+[Radar_dbzh_style]
+# SYNTAX: number = name,from,to,R,G,B
+1 = dBZ,201,299,204,204,204
+2 = very heavy,180,201,250,81,165
+3 = very heavy,168,180,131,10,70
+4 = very heavy,156,168,206,2,2
+5 = very heavy,144,156,232,86,22
+6 = very heavy,132,144,235,149,26
+7 = very heavy,124,132,223,196,10
+8 = heavy,112,124,241,243,90
+9 = heavy,100,112,64,152,87
+10 = moderate,88,100,88,199,151
+11 = moderate,80,88,108,235,243
+
+[Radar_mmh_style]
+1 = ,63,999,204,204,204
+2 = > 25 mm/h,25,63,131,10,70
+3 = 10-25 mm/h,10,25,206,2,2
+4 = 4-10 mm/h,4,10,232,86,22
+5 = 2.2-4 mm/h,2.16,4.0,235,149,26
+6 = 0.9-2.2 mm/h,0.86,2.16,223,196,10
+7 = 0.3-0.9 mm/h,0.34,0.86,241,243,90
+8 = 0.1-0.3 mm/h,0.14,0.34,64,152,87
+9 = < 0.1 mm/h,0.07,0.14,108,235,243
+
+[FMI_Radar_mmh_style]
+1 = ,6313,65535,204,204,204
+2 = > 25 mm/h,2513,6313,131,10,70
+3 = 10-25 mm/h,1000,2513,206,2,2
+4 = 4-10 mm/h,398,1000,232,86,22
+5 = 2.2-4 mm/h,216,398,235,149,26
+6 = 0.9-2.2 mm/h,86,216,223,196,10
+7 = 0.3-0.9 mm/h,34,86,241,243,90
+8 = 0.1-0.3 mm/h,14,34,64,152,87
+9 = < 0.1 mm/h,7,14,108,235,243
+
+[FMI_Radar_accumulated_style]
+1 = mm,65530,99999999,204,204,204
+2 = 200,20000,65530,138,138,139
+3 = 190,19000,20000,189,189,190
+4 = 180,18000,19000,219,219,219
+5 = 170,17000,18000,249,249,250
+6 = 160,16000,17000,222,223,249
+7 = 150,15000,16000,202,203,249
+8 = 140,14000,15000,181,183,249
+9 = 130,13000,14000,151,153,250
+10 = 120,12000,13000,103,107,250
+11 = 110,11000,12000,67,71,250
+12 = 100,10000,11000,3,9,251
+13 = 95,9500,10000,141,7,10
+14 = 90,9000,9500,166,7,10
+15 = 85,8500,9000,193,8,10
+16 = 80,8000,8500,219,12,14
+17 = 75,7500,8000,249,8,7
+18 = 70,7000,7500,228,6,23
+19 = 65,6500,7000,208,13,12
+20 = 60,6000,6500,224,9,99
+21 = 55,5500,6000,241,13,151
+22 = 50,5000,5500,245,22,247
+23 = 47.5,4750,5000,215,118,232
+24 = 45,4500,4750,190,141,214
+25 = 42.5,4250,4500,201,170,220
+26 = 40,4000,4250,178,148,207
+27 = 37.5,3750,4000,150,118,182
+28 = 35,3500,3750,122,87,156
+29 = 32.5,3250,3500,101,65,140
+30 = 30,3000,3250,80,43,119
+31 = 27.5,2750,3000,62,23,102
+32 = 25,2500,2750,43,3,85
+33 = 24,2400,2500,240,186,231
+34 = 23,2300,2400,211,155,201
+35 = 22,2200,2300,194,131,182
+36 = 21,2100,2200,175,106,163
+37 = 20,2000,2100,151,72,137
+38 = 19,1900,2000,133,47,188
+39 = 18,1800,1900,118,25,101
+40 = 17,1700,1800,94,3,81
+41 = 16,1600,1700,70,3,69
+42 = 15,1500,1600,54,2,61
+43 = 14.5,1450,1500,252,210,207
+44 = 14,1400,1450,253,185,180
+45 = 13.5,1350,1400,250,160,154
+46 = 13,1300,1350,236,139,130
+47 = 12.5,1250,1300,218,121,113
+48 = 12,1200,1250,197,100,92
+49 = 11.5,1150,1200,179,83,74
+50 = 11,1100,1150,155,59,50
+51 = 10.5,1050,1100,131,35,26
+52 = 10,1000,1050,107,11,1
+53 = 9.5,950,1000,203,64,2
+54 = 9,900,950,212,88,2
+55 = 8.5,850,900,222,106,1
+56 = 8,800,850,225,129,3
+57 = 7.5,750,800,229,155,7
+58 = 7,700,750,232,179,10
+59 = 6.5,650,700,235,203,13
+60 = 6,600,650,238,225,15
+61 = 5.5,550,600,237,247,59
+62 = 5,500,550,206,245,50
+63 = 4.5,450,500,14,93,135
+64 = 4,400,450,14,93,135
+65 = 3.5,350,400,35,118,163
+66 = 3,300,350,59,145,193
+67 = 2.5,250,300,93,168,213
+68 = 2,200,250,147,198,228
+69 = 1.5,150,200,181,217,237
+70 = 1,100,150,212,234,246
+71 = 0.9,90,100,161,211,148
+72 = 0.8,80,90,131,183,137
+73 = 0.7,70,80,88,149,112
+74 = 0.6,60,70,45,114,87
+75 = 0.5,50,60,37,4,4
+76 = 0.4,40,50,181,198,228
+77 = 0.3,30,40,131,145,201 
+78 = 0.2,20,30,92,103,182
+79 = 0.1,10,20,49,57,159
+80 = ,0,9.5,204,204,204
+
+# dataset names must be unique!
+# Dataset section names are dataset_1, dataset_2, etc. (no limitation)
+[dataset_1]
+# ascii characters, max 50 characters, no spaces!
+name = baltrad_dbz 
+title = BALTRAD Radar (dBZH)
+# options: hdf | geotiff
+dataset_type = hdf 
+# dataset location in hdf file, not used in geotiff datasets
+hdf_dataset = dataset1/data1 
+# units in legend
+unit = dBZ 
+# refer to some style section in this file
+style = Radar_dbzh_style
+# clean-up time in full hours (set -1 for no cleanup)
+# this will also prevent adding datasets older than this!
+cleanup_time = 120
+
+[dataset_2]
+name = baltrad_mmh
+title = BALTRAD Radar (mm/h)
+dataset_type = hdf
+hdf_dataset = dataset1/data2
+unit = mm/h
+style = Radar_mmh_style
+cleanup_time = 120
+
+[dataset_3]
+name = fmi_open_composite_dbz 
+title = FMI Open Radar (dBZH)
+dataset_type = geotiff
+unit = dBZ 
+style = Radar_dbzh_style
+cleanup_time = 48
+
+[dataset_4]
+name = fmi_open_composite_rr
+title = FMI Open Radar (mm/h)
+dataset_type = geotiff
+unit = mm/h
+style = FMI_Radar_mmh_style
+cleanup_time = 48
+
+[dataset_5]
+name = fmi_open_composite_rr1h
+title = FMI Open Radar accumulated rain 1 h
+dataset_type = geotiff
+unit = mm
+style = FMI_Radar_accumulated_style
+cleanup_time = 72
+
+[dataset_6]
+name = fmi_open_composite_rr12h
+title = FMI Open Radar accumulated rain 12 h
+dataset_type = geotiff
+unit = mm
+style = FMI_Radar_accumulated_style
+cleanup_time = 72
+
+[dataset_7]
+name = fmi_open_composite_rr24h
+title = FMI Open Radar accumulated rain 24 h
+dataset_type = geotiff
+unit = mm
+style = FMI_Radar_accumulated_style
+cleanup_time = 120
index 19e2ccb..d6eafa5 100644 (file)
@@ -1,9 +1,10 @@
 MAP
   NAME "baltrad_wms"
-  #EXTENT -2200000 -712631 3072800 3840000
   EXTENT -180 -90 180 90
+  UNITS dd
   IMAGETYPE PNG
-  #IMAGECOLOR 105 153 189
+  #DEBUG 5
+  #CONFIG "MS_ERRORFILE" "/tmp/ms_error.txt"
   IMAGECOLOR 255 255 255
   MAXSIZE 4096
   STATUS ON
@@ -20,6 +21,7 @@ MAP
   END
 
   # "pseudo" gif, really it is PNG
+  # for google earth only
   OUTPUTFORMAT
     NAME "gif"
     MIMETYPE "image/png"
@@ -33,7 +35,7 @@ MAP
   PROJECTION
     "init=epsg:4326"
   END
+
   WEB
     METADATA
       "ows_enable_request" "*"
@@ -45,171 +47,4 @@ MAP
       "wfs_maxfeatures" "1"
     END
   END
-
-  LAYER
-    NAME "baltrad_dbz"
-    TYPE RASTER
-    STATUS ON
-    CLASSITEM "[pixel]"
-    DUMP true
-    PROJECTION
-      "init=epsg:4326"
-    END
-    METADATA
-      "wms_title" "Radar (dBZ)"
-      "wms_timeitem" "TIFFTAG_DATETIME"
-    END
-    OPACITY 70
-    CLASS
-      NAME "dBZ"
-      EXPRESSION ([pixel] > 200)
-      STYLE
-        COLOR 211 211 211
-      END
-    END
-    CLASS
-      NAME ">120"
-      EXPRESSION ([pixel]>120 AND [pixel] <=200)
-      STYLE
-        COLOR 215 48 39
-      END
-    END
-    CLASS
-      NAME "105-120"
-      EXPRESSION ([pixel]>105 AND [pixel] <=120)
-      STYLE
-        COLOR 244 109 67
-      END
-    END
-    CLASS
-      NAME "90-105"
-      EXPRESSION ([pixel]>90 AND [pixel] <=105)
-      STYLE
-        COLOR 253 174 97
-      END
-    END
-    CLASS
-      NAME "75-90"
-      EXPRESSION ([pixel]>75 AND [pixel] <=90)
-      STYLE
-        COLOR 254 224 144
-      END
-    END
-    CLASS
-      NAME "60-75"
-      EXPRESSION ([pixel]>60 AND [pixel] <=75)
-      STYLE
-        COLOR 255 255 191
-      END
-    END
-    CLASS
-      NAME "45-60"
-      EXPRESSION ([pixel]>45 AND [pixel] <=60)
-      STYLE
-        COLOR 224 243 248
-      END
-    END
-
-    CLASS
-      NAME "30-45"
-      EXPRESSION ([pixel]>30 AND [pixel] <=45)
-      STYLE
-        COLOR 171 217 233
-      END
-    END
-    CLASS
-      NAME "15-30"
-      EXPRESSION ([pixel]>15 AND [pixel] <=30)
-      STYLE
-        COLOR 116 173 209
-      END
-    END
-    CLASS
-      NAME "0-15"
-      EXPRESSION ([pixel]>0 AND [pixel] <=15)
-      STYLE
-        COLOR 69 117 180
-      END
-    END
-  END
-  
-  LAYER
-    NAME "baltrad_mmh"
-    GROUP "baltrad"
-    TYPE RASTER
-    STATUS ON
-    CLASSITEM "[pixel]"
-    DUMP true
-    PROJECTION
-      "init=epsg:4326"
-    END
-    METADATA
-      "wms_title" "Radar (mm/h)"
-      "wms_timeitem" "TIFFTAG_DATETIME"
-    END
-    OPACITY 70
-    CLASS
-      NAME "mm/h"
-      EXPRESSION ([pixel] > 9e9)
-      STYLE
-        COLOR 255 255 255
-      END
-    END
-    CLASS
-      NAME ">7"
-      EXPRESSION ([pixel]>7 and [pixel] <= 9e9 )
-      STYLE
-        COLOR 255 0 0
-      END
-    END
-    CLASS
-      NAME "5-7"
-      EXPRESSION ([pixel]>5 AND [pixel] <=7)
-      STYLE
-        COLOR 255 51 102
-      END
-    END
-    CLASS
-      NAME "2-5"
-      EXPRESSION ([pixel]>2 AND [pixel] <=5)
-      STYLE
-        COLOR 255 153 51
-      END
-    END
-    CLASS
-      NAME "1-2"
-      EXPRESSION ([pixel]>1 AND [pixel] <=2)
-      STYLE
-        COLOR 255 204 0
-      END
-    END
-    CLASS
-      NAME "0.5-1"
-      EXPRESSION ([pixel]>0.5 AND [pixel] <=1)
-      STYLE
-        COLOR 255 255 51
-      END
-    END
-    CLASS
-      NAME "0.2-0.5"
-      EXPRESSION ([pixel]>0.2 AND [pixel] <=0.5)
-      STYLE
-        COLOR 153 255 102
-      END
-    END
-    CLASS
-      NAME "0.1-0.2"
-      EXPRESSION ([pixel]>0.1 AND [pixel] <=0.2)
-      STYLE
-        COLOR 0 204 153
-      END
-    END
-    CLASS
-      NAME "-0.1"
-      EXPRESSION ([pixel]>0.05 AND [pixel] <=0.1)
-      STYLE
-        COLOR 0 153 204
-      END
-    END
-  END
 END
index 1b2a956..a445257 100755 (executable)
 #!/usr/bin/env python
 
-config_path = "/home/user/baltrad/baltrad_wms/baltrad_wms.cfg"
-
-#
-# do not edit anything below
-#
-
-import mapscript
+# read config
 import ConfigParser
+from configurator import *
+settings = read_config()
 
-config = ConfigParser.ConfigParser()
-config.read( config_path )
+from db_setup import *
+import mapscript
 
-def read_config(tools=False):
-    # read config
-    datasets = ConfigParser.ConfigParser()
-    datasets.read( config.get("locations","datasets") )
-    mapfile_path = config.get("locations","mapfile")
-    sections = datasets.sections()
-    sections.sort()
-    if tools:
-        tmpdir = config.get("locations","tmpdir")
-        online_resource = config.get("locations","online_resource")
-        return sections, datasets, tmpdir, online_resource
-    else:
-        return sections, datasets, mapfile_path
+def get_query_layer(layer_name):
+    if "_contour" in layer_name:
+        layer_name = layer_name.replace("_contour","")
+    return layer_name
 
-def wms_request(mapfile_path,req,sections,datasets):
-    map_object = mapscript.mapObj( mapfile_path )
+def wms_request(req,settings):
+    map_object = mapscript.mapObj( settings["mapfile_path"] )
     request_type = req.getValueByName("REQUEST")
+    if request_type==None:
+        raise Exception ( "WMS parameter request is missing!" )
     time_value = req.getValueByName("TIME")
-    # only one layer allowed
+    opacity = req.getValueByName("OPACITY")
+    layers = {}
+    contour = settings["enable_contour_maps"]
     if req.getValueByName("LAYERS")!=None:
-        layer = map_object.getLayerByName(req.getValueByName("LAYERS"))
+        layers_list = req.getValueByName("LAYERS").split(",")
+    elif req.getValueByName("LAYER")!=None: #legend
+        layers_list = [req.getValueByName("LAYER")]
     else:
-        layer = map_object.getLayerByName(config.get("dataset_1","name"))
-        layer2 = map_object.getLayerByName(config.get("dataset_2","name"))
-
+        layers_list = None
+    # create layers
+    config_dataset_names,config_sections = get_sections_from_config()
+    for dataset_name in config_sections:
+        new_layer_name = config.get(dataset_name, "name")
+        new_layer_title = config.get(dataset_name, "title")
+        # do not write styles for layers that are not queried
+        if layers_list:
+            if (not new_layer_name in layers_list and\
+                not new_layer_name+"_contour" in layers_list):
+                continue
+        if contour:
+            layer_names = (new_layer_name,new_layer_name+"_contour")
+        else:
+            layer_names = (new_layer_name,)
+        for l_name in layer_names:
+            processing = []
+            layers[l_name] = mapscript.layerObj(map_object)
+            layers[l_name].name = l_name
+            if "_contour" in l_name:
+                layers[l_name].type = mapscript.MS_LAYER_LINE
+                layers[l_name].connectiontype = mapscript.MS_CONTOUR
+                new_layer_title += " contour"
+                layers[l_name].addProcessing( "CONTOUR_INTERVAL=0" )
+                layers[l_name].addProcessing( "CONTOUR_ITEM=pixel" )
+                layers[l_name].setGeomTransform( "smoothsia([shape], 5)" )
+            else:
+                layers[l_name].type = mapscript.MS_LAYER_RASTER
+                layers[l_name].classitem = "[pixel]"
+            layers[l_name].status = mapscript.MS_ON
+            if str(opacity).isdigit():
+                layers[l_name].opacity = int(opacity)
+            else:
+                layers[l_name].opacity = 70 # default opacity
+            layers[l_name].metadata.set("wms_title", new_layer_title)
+            layers[l_name].metadata.set("wms_timeitem", "TIFFTAG_DATETIME")
+            layers[l_name].template = "featureinfo.html"
+            # set style class
+            class_name_config = config.get(dataset_name, "style")
+            for class_values in config.items ( class_name_config ):
+                item = class_values[1].split (",")
+                c =  mapscript.classObj( layers[l_name] )
+                style = mapscript.styleObj(c)
+                c.setExpression( "([pixel] > %s AND [pixel] <= %s)" % (item[1],item[2]) )
+                if "_contour" in l_name:
+                    processing.append(item[1])
+                    style.width = 2
+                    style.color.setRGB( 0,0,0 )
+                else:
+                    c.name = class_values[0]
+                    c.title = item[0]
+                    colors = map(int,item[3:6])
+                    style.color.setRGB( *colors )
+            if "_contour" in l_name:
+                processing.reverse()
+                layers[l_name].addProcessing( "CONTOUR_LEVELS=%s" % ",".join(processing) )
     if "capabilities" in request_type.lower():
         # set online resource
-        map_object.web.metadata.set("wms_onlineresource", config.get("locations","online_resource") )
-        projdef = datasets.get(sections[-1],"projdef")
+        map_object.web.metadata.set("wms_onlineresource", \
+                config.get("locations","online_resource") )
         # write timestamps
-        sections.reverse() # newest first
-        layer.metadata.set("wms_timeextent", ",".join(sections))
-        layer.metadata.set("wms_timedefault", sections[0])
-        if layer2:
-            layer2.metadata.set("wms_timeextent", ",".join(sections))
-            layer2.metadata.set("wms_timedefault", sections[0])
-            layer2.setProjection( projdef )
-            layer2.data = datasets.get(sections[-1], layer2.name)
-        # just a dummy dataset
-        tiff_path = datasets.get(sections[-1], layer.name) # last one
+        for layer_name in layers.keys():
+            # contour variants are handled together with their base layer below
+            if "_contour" in layer_name:
+                continue
+            if contour:
+                layer_types = ("","_contour")
+            else:
+                layer_types = ("",)
+            for layer_type in layer_types:
+                radar_datasets = session.query(RadarDataset)\
+                        .filter(RadarDataset.name==layer_name)\
+                        .order_by(RadarDataset.timestamp.desc()).all()
+                radar_timestamps = []
+                for r in radar_datasets:
+                    radar_timestamps.append(r.timestamp.strftime("%Y-%m-%dT%H:%M:00Z"))
+                if len(radar_timestamps)==0:
+                    continue
+                layers[layer_name+layer_type].metadata.set("wms_timeextent", ",".join(radar_timestamps))
+                layers[layer_name+layer_type].metadata.set("wms_timedefault", radar_timestamps[0])
+                # setup projection definition
+                projdef = radar_datasets[0].projdef
+                if "epsg" not in projdef:
+                    projdef = "epsg:3785" # quite near default settings, affects only bounding boxes
+                layers[layer_name+layer_type].setProjection( projdef )
+                layers[layer_name+layer_type].data = radar_datasets[0].geotiff_path
+                bbox = radar_datasets[0].bbox_original
+                bbox = map(float, bbox.split(","))
+                layers[layer_name+layer_type].setExtent( *bbox )
     elif time_value not in (None,"-1",""):
-        tiff_path = datasets.get(time_value,layer.name)
-        projdef = datasets.get(time_value,"projdef")
+        # dataset is a combination of timestamp and layer name
+        time_object = datetime.strptime(time_value,"%Y-%m-%dT%H:%M:00Z")
+        for layer_name in layers_list:
+            radar_dataset = session.query(RadarDataset)\
+                    .filter(RadarDataset.name==get_query_layer(layer_name))\
+                    .filter(RadarDataset.timestamp==time_object).one()
+            layers[layer_name].data = radar_dataset.geotiff_path
+            layers[layer_name].setProjection( radar_dataset.projdef )
+            # lon/lat bbox
+            bbox =  map(float,radar_dataset.bbox_original.split(",") )
+            layers[layer_name].setExtent( *bbox )
     else:
-        tiff_path = datasets.get(sections[-1], layer.name) # last one
-        projdef = datasets.get(sections[-1],"projdef")
-    layer.data = tiff_path
-    layer.setProjection( projdef )
-    opacity = req.getValueByName("OPACITY")
-    if opacity:
-        if opacity.isdigit():
-            layer.opacity = int(opacity)
-    # getfeatureinfo request
-    if request_type=="GetFeatureInfo":
-        layer.template = "featureinfo.html"
+        for layer_name in (layers_list or []):
+            # get newest result if timestamp is missing
+            radar_dataset = session.query(RadarDataset)\
+                    .filter(RadarDataset.name==get_query_layer(layer_name))\
+                    .order_by(RadarDataset.timestamp.desc()).first()
+            layers[layer_name].data = radar_dataset.geotiff_path
+            layers[layer_name].setProjection( radar_dataset.projdef )
+            bbox = map(float, radar_dataset.bbox_original.split(","))
+            layers[layer_name].setExtent( *bbox )
+    session.close()
     return map_object
 
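`wms_request` falls back to an opacity of 70 when the OPACITY parameter is absent or non-numeric. The check can be sketched as a standalone helper (hypothetical name, for illustration only):

```python
def parse_opacity(raw, default=70):
    """Return an integer opacity percentage, falling back to a default.

    str() guards against None, and isdigit() rejects signs and
    decimals, mirroring the check used in wms_request.
    """
    if raw is not None and str(raw).isdigit():
        return int(raw)
    return default
```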
 if __name__ == '__main__': # CGI
-    sections, datasets, mapfile_path = read_config()
+    settings = read_config()
     req = mapscript.OWSRequest()
     req.loadParams()
-    #text = ""
-    #for i in range(req.NumParams):
-    #    text += " %s=%s " % (req.getName(i),req.getValue(i))
-    #raise NameError (text)
-    map_object = wms_request(mapfile_path,req,sections,datasets)
+    map_object = wms_request(req,settings)
     # dispatch
     map_object.OWSDispatch( req )
index 1e3e3ab..6a172cd 100755 (executable)
@@ -1,9 +1,4 @@
 #!/usr/bin/env python
-
-#
-# do not edit anything below
-#
-
 import mapscript
 from cgi import parse_qs, escape
 
@@ -17,11 +12,11 @@ def application(environ,start_response):
     # read config
     req = mapscript.OWSRequest()
     req.type = mapscript.MS_GET_REQUEST
-    sections, datasets, mapfile_path = read_config()
+    settings = read_config()
     parameters = parse_qs(environ.get('QUERY_STRING', ''))
     for key in parameters.keys():
         req.setParameter(key,parameters[key][0])
-    map_object = wms_request(mapfile_path,req,sections,datasets)
+    map_object = wms_request( req, settings )
     # output result
     mapscript.msIO_installStdoutToBuffer()
     map_success = map_object.OWSDispatch( req ) # output should be 0
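The WSGI wrapper copies only the first value of each query parameter into the OWSRequest. The same collapsing can be sketched with the stdlib (shown here with Python 3's `urllib.parse`; the wrapper above uses the Python 2 `cgi` module):

```python
from urllib.parse import parse_qs

def first_values(query_string):
    """Collapse a WMS query string to {param: first value}, as the
    WSGI wrapper does before handing parameters to mapscript."""
    parameters = parse_qs(query_string)
    return {key: values[0] for key, values in parameters.items()}
```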
index 816a5d0..6a9f8d8 100755 (executable)
@@ -1,10 +1,13 @@
 #!/usr/bin/env python
-from baltrad_wms import read_config
-try: # TODO: fix
-    from h5togeotiff import h5togeotiff
-    from update_baltrad_wms import read_datasets
-except ImportError:
-    pass
+# read config
+import ConfigParser
+from configurator import read_config,config
+from baltrad_wms import get_query_layer
+settings = read_config(tools=True)
+online_resource = settings["online_resource"]
+tmpdir = settings["tmpdir"]
+
+from db_setup import *
 
 import cgi
 import os
@@ -13,14 +16,22 @@ from urllib import urlopen
 from xml.etree import ElementTree
 import zipfile
 import tempfile
+from pyproj import Proj, transform
 
 kmz_image_width = 600
 kml_namespace = "http://www.opengis.net/kml/2.2"
 
 def download_geotiff():
     timestamp = pars["TIME"].value
+    time_object = datetime.strptime(timestamp,"%Y-%m-%dT%H:%M:00Z")
     layer_name = pars["LAYER"].value
-    tiff_path = datasets.get(timestamp,layer_name)
+    radar_dataset = session.query(RadarDataset)\
+            .filter(RadarDataset.name==get_query_layer(layer_name))\
+            .filter(RadarDataset.timestamp==time_object).one()
+    tiff_path = radar_dataset.geotiff_path
     filename = os.path.basename(tiff_path)
     content = open(tiff_path).read()
     return content, filename
@@ -28,19 +39,39 @@ def download_geotiff():
 def time_series(req):
     start_time = pars["START_TIME"].value
     end_time =  pars["END_TIME"].value
+    # read time values as objects
+    start = datetime.strptime(start_time,"%Y-%m-%dT%H:%M:00Z")
+    end = datetime.strptime(end_time,"%Y-%m-%dT%H:%M:00Z")
     layer_name = pars["LAYER"].value
-    for i in range (len(sections)):
-        if sections[i]==start_time: # Start found
-            start_index = i
-        if sections[i]==end_time: # end found
-            end_index = i
-            break
-    timestamps = sections[start_index:end_index+1]
+    radar_datasets = session.query(RadarDataset)\
+            .filter(RadarDataset.name==get_query_layer(layer_name))\
+            .filter(RadarDataset.timestamp>=start)\
+            .filter(RadarDataset.timestamp<=end)
+    timestamps = []
+    bboxes = []
+    for r in radar_datasets.all():
+        timestamps.append( r.timestamp.strftime("%Y-%m-%dT%H:%M:00Z") )
+        if "epsg" in r.projdef.lower():
+            radar_proj = Proj(init=r.projdef)
+        else:
+            radar_proj = Proj(str(r.projdef))
+        lonlat_proj = Proj(init="epsg:4326")
+        b = r.bbox_original.split(",")
+        lonmin, latmin = transform(radar_proj,
+                                   lonlat_proj,
+                                   float(b[0]),
+                                   float(b[1]))
+        lonmax, latmax = transform(radar_proj,
+                                   lonlat_proj,
+                                   float(b[2]),
+                                   float(b[3]))
+        # add some extra bounds due to different projection
+        bbox_lonlat = [lonmin-1, latmin-1, lonmax+1, latmax+1]
+        bbox_lonlat = map(str,bbox_lonlat)
+        bbox_lonlat = ",".join( bbox_lonlat )
+        bboxes.append( bbox_lonlat )
     if req=="kmz":
         # bboxes were computed above from the DB records
-        bboxes = []
-        for t in timestamps:
-            bboxes.append(datasets.get(t,"bbox"))
         # calculate image dimensions
         bbox_0 = map( float, bboxes[0].split(","))
         kmz_image_height = int ( kmz_image_width * (bbox_0[3]-bbox_0[1]) / (bbox_0[2]-bbox_0[0]) ) 
@@ -117,23 +148,9 @@ def time_series(req):
         content = kmz_output.getvalue()
         filename = "BALTRAD_DATA_from_%s_to_%s.kmz" % (timestamps[0],timestamps[-1])
         return content, filename
-    elif req=="accumulated_rain":
-        files = []
-        for t in timestamps:
-            files.append(datasets.get(t,layer_name))
-        # generate tiff
-        for d in read_datasets:
-            if d["name"]==layer_name:
-                hdf_dataset = d["hdf_dataset"]
-        # TODO: implement
-        tiff_path = "/tmp/testi.tif"
-        geotiff = h5togeotiff( files, tiff_path, hdf_dataset )
-        # create temporary mapfile
-        # request string?
-        content = ",".join(files)
-    return content, "debug,txt"
+    else:
+        # fall-through for unknown requests; return an empty debug payload
+        return "", "debug.txt"
 
-sections, datasets, tmpdir, online_resource = read_config(True)
 pars = cgi.FieldStorage()
 
 action = pars["ACTION"].value
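`time_series()` pads each reprojected lon/lat bounding box by one degree to absorb projection differences. The padding step alone can be sketched as (hypothetical helper; the real code reprojects with pyproj first):

```python
def pad_bbox(bbox, margin=1.0):
    """Pad a "lonmin,latmin,lonmax,latmax" bounding box on every side.

    The one-degree default mirrors the slack time_series() adds after
    reprojecting the stored bbox to lon/lat.
    """
    lonmin, latmin, lonmax, latmax = map(float, bbox.split(","))
    padded = [lonmin - margin, latmin - margin, lonmax + margin, latmax + margin]
    return ",".join(map(str, padded))
```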
diff --git a/cleaner.py b/cleaner.py
new file mode 100644 (file)
index 0000000..a2f1b30
--- /dev/null
@@ -0,0 +1,37 @@
+from datetime import datetime
+from datetime import timedelta
+import os
+from db_setup import *
+from configurator import set_logger
+# set logger
+logger = set_logger( "cleaner" )
+
+def read_expiration_time(expiration_time_hours):
+    "return expiration time utc timestamp"
+    if expiration_time_hours==-1:
+        expiration_time = None
+    else:
+        now = datetime.utcnow()
+        expiration_time = now - timedelta(hours=expiration_time_hours)
+    return expiration_time
+
+def clean_up(dataset_name,expiration_time_in_hours):
+    "clean up old datasets"
+    logger.debug( "Start clean-up procedure for dataset %s" % dataset_name )
+    # read cleanup expiration time from config
+    logger.debug ("Clean up results older than %i hours" % expiration_time_in_hours )
+    expiration_time = read_expiration_time(expiration_time_in_hours)
+    radar_datasets = session.query(RadarDataset)\
+            .filter(RadarDataset.name==dataset_name)\
+            .filter(RadarDataset.timestamp<expiration_time)
+    for item in radar_datasets.all():
+        # delete file if it exists
+        try:
+            os.remove(item.geotiff_path)
+        except OSError:
+            pass
+    logger.info ( "Removed %i results from dataset %s " % \
+            (radar_datasets.count(), dataset_name ) )
+    radar_datasets.delete()
+    session.commit()
+    logger.debug( "Clean-up procedure for dataset %s finished" % dataset_name )
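The expiration rule in cleaner.py (with -1 disabling cleanup) can be tried out in isolation; a standalone sketch of the same logic:

```python
from datetime import datetime, timedelta

def expiration_cutoff(hours, now=None):
    """Return the UTC cutoff before which datasets are considered
    expired, or None when cleanup is disabled (hours == -1)."""
    if hours == -1:
        return None
    if now is None:
        now = datetime.utcnow()
    return now - timedelta(hours=hours)
```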
index 107f8b8..2c1441a 100755 (executable)
@@ -1,21 +1,24 @@
 #!/usr/bin/env python
-from update_baltrad_wms import config_path, read_datasets
+from configurator import read_config,config
 import ConfigParser
 import os
+from db_setup import drop
 
 # read config
-config = ConfigParser.ConfigParser()
-config.read( config_path )
-datasets_file = config.get("locations","datasets")
+settings = read_config()
+datasets = []
+for section in config.sections():
+    if "dataset" in section:
+        datasets.append(config.get(section,"name"))
 
-
-for d in read_datasets():
-    wms_datadir = config.get("locations","wms_data_dir") + "/" + d["name"]
-    for filename in os.listdir(wms_datadir):
-        os.remove(os.path.join(wms_datadir,filename))
-    os.rmdir(wms_datadir)
+for d in datasets:
+    wms_datadir = config.get("locations","wms_data_dir") + "/" + d
+    try:
+        for filename in os.listdir(wms_datadir):
+            os.remove(os.path.join(wms_datadir,filename))
+        os.rmdir(wms_datadir)
+    except OSError:
+        pass
 print "Deleted wms files."
-
-textfile = file(datasets_file,'wt')
-textfile.close()
-print "Cleared file %s." % (datasets_file)
+drop()
+print "Cleared DB"
diff --git a/configurator.py b/configurator.py
new file mode 100644 (file)
index 0000000..29bf94c
--- /dev/null
@@ -0,0 +1,43 @@
+#!/usr/bin/env python
+# MODIFY THIS
+config_path = "/home/user/baltrad-wms/baltrad_wms.cfg"
+#
+
+
+import ConfigParser
+config = ConfigParser.ConfigParser()
+config.read( config_path )
+
+import logging
+import sys
+
+def read_config(tools=False):
+    # read config
+    settings = {}
+    settings["mapfile_path"] = config.get("locations","mapfile")
+    settings["db_uri"] = config.get("locations","db_uri")
+    # option name spelling ("countour") follows the config template
+    settings["enable_contour_maps"] = config.getboolean("settings",
+            "enable_countour_maps")
+    if tools:
+        settings["tmpdir"] = config.get("locations","tmpdir")
+        settings["online_resource"] = config.get("locations","online_resource")
+    return settings 
+
+def get_sections_from_config():
+    config_dataset_names = []
+    config_sections = []
+    for section in config.sections():
+        if "dataset" in section:
+            config_dataset_names.append(config.get(section,"name"))
+            config_sections.append(section)
+    return config_dataset_names,config_sections
+
+def set_logger(name):
+    logger = logging.getLogger(name)
+    level = getattr( logging, config.get("logging","level").upper() )
+    logger.setLevel ( level )
+    ch = logging.StreamHandler(sys.stdout)
+    ch.setLevel ( level )
+    frmt =  logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
+    ch.setFormatter ( frmt )
+    logger.addHandler ( ch )
+    return logger
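Parsing the boolean option with `eval(...)` can be avoided: ConfigParser parses booleans itself. A sketch in Python 3 syntax (`configparser` is the Python 3 name of the module; section and option names follow the config template, including its "countour" spelling):

```python
import configparser
import logging

sample = """
[settings]
enable_countour_maps = true
[logging]
level = debug
"""

config = configparser.ConfigParser()
config.read_string(sample)

# getboolean accepts true/false, yes/no, on/off, 1/0
enable_contours = config.getboolean("settings", "enable_countour_maps")

# map the configured level name to the logging module's constant
level = getattr(logging, config.get("logging", "level").upper())
```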
diff --git a/db_setup.py b/db_setup.py
new file mode 100644 (file)
index 0000000..e88f2c2
--- /dev/null
@@ -0,0 +1,62 @@
+#!/usr/bin/env python
+# read config
+from configurator import read_config
+settings = read_config()
+
+from sqlalchemy import *
+from sqlalchemy.orm import *
+from sqlalchemy.ext.declarative import declarative_base
+from datetime import datetime
+
+engine = create_engine( settings["db_uri"], echo=False)
+Session = sessionmaker(bind=engine)
+session = Session()
+
+metadata = MetaData(engine)
+Base = declarative_base(metadata=metadata)
+
+class RadarDataset(Base):
+    __tablename__ = "radar_dataset"
+    id = Column(Integer, primary_key=True)
+    name = Column(String(50), nullable=False)
+    title = Column(String(300), nullable=False)
+    timestamp = Column(DateTime,index=True)
+    geotiff_path = Column(String(500),nullable=False)
+    hdf_path = Column(String(500))
+    projdef = Column(String(150))
+    bbox_original = Column(String(150))
+    unit = Column(String(100),nullable=False)
+    dataset_type = Column(String(10))
+    style = Column(String(50))
+
+def drop():
+    "use with care"
+    try:
+        metadata.drop_all()
+    except:
+        pass
+
+def create():
+    metadata.create_all()
+
+
+if __name__ == '__main__':
+    # fresh start
+    #answer = raw_input("Erase all? (y/[n]) ")
+    #if answer=="y":
+    #    drop()
+    answer = raw_input("Create new database? (y/[n]) ")
+    if answer=="y":
+        create()
+
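The RadarDataset model backs queries like "newest dataset for a layer", as used by wms_request. The same query shape can be tried out with the stdlib `sqlite3` module (table trimmed to a few columns; the paths below are made up for illustration):

```python
import sqlite3

# In-memory stand-in for the radar_dataset table defined above
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE radar_dataset (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    timestamp TEXT,
    geotiff_path TEXT NOT NULL)""")
rows = [
    ("baltrad_dbz", "2013-12-13 10:00:00", "/data/baltrad_dbz/20131213T1000.tif"),
    ("baltrad_dbz", "2013-12-13 10:15:00", "/data/baltrad_dbz/20131213T1015.tif"),
]
conn.executemany(
    "INSERT INTO radar_dataset (name, timestamp, geotiff_path) VALUES (?,?,?)",
    rows)

# "newest first", as wms_request orders its query
newest = conn.execute(
    "SELECT geotiff_path FROM radar_dataset WHERE name=? "
    "ORDER BY timestamp DESC LIMIT 1", ("baltrad_dbz",)).fetchone()[0]
```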
index cb0e870..11861f6 100644 (file)
         <p id="shortdesc">
             Choose product:
             <select id="layer" onchange="javascript:update_layer_params()" >
-              <option value="baltrad_dbz">Rain (dBZ)</option>
-              <option value="baltrad_mmh">Rain (mm/h)</option>
+              <option value="baltrad_dbz">BALTRAD Reflectivity (dBZ)</option>
+              <option value="baltrad_mmh">BALTRAD Rain (mm/h)</option>
+              <!--
+              <option value="baltrad_dbz,baltrad_dbz_contour">Rain+contour (dBZ)</option>
+              <option value="baltrad_mmh,baltrad_dbz_contour">Rain+contour (mm/h)</option>
+              -->
+              <option value="fmi_open_composite_dbz">FMI Open composite (dBZ)</option>
+              <!-- <option value="fmi_open_composite_dbz,fmi_open_composite_dbz_contour">FMI Open composite + contour (dBZ)</option> -->
+              <option value="fmi_open_composite_rr">FMI Open composite (mm/h)</option>
+              <!-- <option value="fmi_open_composite_rr,fmi_open_composite_rr_contour">FMI Open composite + contour (mm/h)</option> -->
+              <option value="fmi_open_composite_rr1h">FMI Open composite (accumulated 1 h)</option>
+              <!-- <option value="fmi_open_composite_rr1h,fmi_open_composite_rr1h_contour">FMI Open composite + contour (accumulated 1 h)</option> -->
+              <option value="fmi_open_composite_rr12h">FMI Open composite (accumulated 12 h)</option>
+              <!-- <option value="fmi_open_composite_rr12h,fmi_open_composite_rr12h_contour">FMI Open composite + contour (accumulated 12 h)</option> -->
+              <option value="fmi_open_composite_rr24h">FMI Open composite (accumulated 24 h)</option>
+              <!-- <option value="fmi_open_composite_rr24h,fmi_open_composite_rr24h_contour">FMI Open composite + contour (accumulated 24 h)</option> -->
             </select>
             Timestamp: 
             <select id="timestamps" onchange="javascript:update_layer_params();">
-               <option id="">default</select>
+               <option id="" value="-1">default</option>
             </select>
-            <a href="#" onclick="javascript:go('down')"><img src="img/south-mini.png"></a> 
-            <a href="#" onclick="javascript:go('up')"><img src="img/north-mini.png"></a> 
+            <a href="#" onclick="javascript:go('down')">[&lt;--]</a> 
+            <a href="#" onclick="javascript:go('up')">[--&gt;]</a> 
         </p>
         <div>
           <div id="map" class="map"></div>
index 76a4a2f..5bed633 100644 (file)
@@ -1,10 +1,11 @@
-var wms_url = "http://localhost/cgi-bin/baltrad_wms.py";
-var wms_tools_url = "../cgi-bin/baltrad_wms_tools.py";
+var wms_url = "/baltrad/baltrad_wms.py";
+// var wms_url = "/baltrad_wsgi";
+var wms_tools_url = "/baltrad/baltrad_wms_tools.py";
 
 /* do not edit anything below this */
 var map;
 var layer;
-var layer_name = "baltrad_dbz";
+var layer_name;
 var first_update = true;
 var time_value; // current time
 
@@ -34,11 +35,13 @@ function init() {
         4
     );
 
+    layer_name = document.getElementsByTagName("select")[0].value;
+
     layer = new OpenLayers.Layer.WMS(
         "Radar",
         wms_url,
         {layers: layer_name, transparent: 'true', format: 'image/png', time: "-1"},
-        {isBaseLayer: false,singleTile: true} );
+        {isBaseLayer: false,singleTile: true, buffer: 0} );
     map.addLayer(layer);
 
     map.events.register('click', map, findLayerClick);
@@ -50,29 +53,28 @@ function init() {
 function update_meta () {
     OpenLayers.Request.GET({
         url: wms_url,
+        async: false,
         params: {
             SERVICE: "WMS",
             VERSION: "1.1.1",
             REQUEST: "GetCapabilities"
         },
         success: function(request) {
-            var doc = request.responseXML;
+            var doc = StringToXML(request.responseText);
             var layers = doc.getElementsByTagName("Layer")[0].getElementsByTagName("Layer");
             for (i=0;i<layers.length;i++) {
-                if (layers[i].getElementsByTagName("Name")[0].childNodes[0].nodeValue==layer_name) {
-                    if (first_update) {
-                        var time_values = getDataOfImmediateChild(layers[i], "Extent").split(",");
-                        var select = document.getElementsByTagName("select")[1];
-                        select.options.length = 0;
-                        for (k=0;k<time_values.length;k++)
-                            select.options.add(new Option(time_values[k],time_values[k]));
-                        first_update = false;
-                        var start_select = document.getElementsByTagName("select")[2];
-                        var end_select = document.getElementsByTagName("select")[3];
-                        start_select.innerHTML = select.innerHTML;
-                        start_select.selectedIndex = 5;
-                        end_select.innerHTML = select.innerHTML;
-                    }
+                if (layers[i].getElementsByTagName("Name")[0].childNodes[0].nodeValue.split(",")[0]==layer_name) {
+                    var time_values = getDataOfImmediateChild(layers[i], "Extent").split(",");
+                    var select = document.getElementsByTagName("select")[1];
+                    select.options.length = 0;
+                    for (k=0;k<time_values.length;k++)
+                        select.options.add(new Option(time_values[k],time_values[k]));
+                    first_update = false;
+                    var start_select = document.getElementsByTagName("select")[2];
+                    var end_select = document.getElementsByTagName("select")[3];
+                    start_select.innerHTML = select.innerHTML;
+                    start_select.selectedIndex = 5;
+                    end_select.innerHTML = select.innerHTML;
                     var legend_url = layers[i].getElementsByTagName("LegendURL")[0].getElementsByTagName("OnlineResource")[0].getAttributeNS('http://www.w3.org/1999/xlink', 'href');
                     document.getElementById("map_legend").src = legend_url;
                     time_value = document.getElementsByTagName("select")[1].value;
@@ -90,8 +92,8 @@ function update_layer_params() {
     var new_layer = document.getElementsByTagName("select")[0].value;
     time_value = document.getElementsByTagName("select")[1].value;
     if (new_layer!=layer_name) {
-        update_meta();
         layer_name = new_layer;
+        update_meta();
     }
     layer.mergeNewParams({time: time_value,layers:layer_name});
 }
@@ -133,7 +135,6 @@ function findLayerClick(event) {
 
 
 function setHTML(response) {
-    console.log(response)
     if (response.status==500)
         var text = ""
     else
@@ -167,6 +168,18 @@ function getDataOfImmediateChild(parentTag, subTagName)
      return val;
 }
 
+function StringToXML (text) {
+    if (window.ActiveXObject) {
+        var doc = new ActiveXObject('Microsoft.XMLDOM');
+        doc.async = 'false';
+        doc.loadXML(text);
+    } else {
+        var parser = new DOMParser();
+        var doc = parser.parseFromString(text,'text/xml');
+    }
+    return doc;
+}
+
 function export_to_geotiff () {
     window.location = wms_tools_url + "?ACTION=download_geotiff&TIME=" + time_value + "&LAYER=" + layer_name;
 }
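demo.js's `update_meta()` digs each layer's TIME extent out of the GetCapabilities response. Server-side, the equivalent extraction takes a few lines of ElementTree (the capabilities fragment below is a minimal, hypothetical sample, not real MapServer output):

```python
from xml.etree import ElementTree

capabilities = """<WMT_MS_Capabilities>
  <Layer>
    <Layer>
      <Name>baltrad_dbz</Name>
      <Extent name="time">2013-12-13T10:00:00Z,2013-12-13T10:15:00Z</Extent>
    </Layer>
  </Layer>
</WMT_MS_Capabilities>"""

def layer_times(xml_text, layer_name):
    """Return the TIME extent values advertised for one layer."""
    root = ElementTree.fromstring(xml_text)
    for layer in root.find("Layer").findall("Layer"):
        if layer.findtext("Name") == layer_name:
            return layer.findtext("Extent").split(",")
    return []
```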
diff --git a/fmi_open.py b/fmi_open.py
new file mode 100755 (executable)
index 0000000..d6cfdec
--- /dev/null
@@ -0,0 +1,136 @@
+#!/usr/bin/env python
+# script fetches FMI Open data and imports it to DB
+from db_setup import *
+from cleaner import *
+from configurator import *
+tiff_dir_base = config.get("locations","wms_data_dir")
+api_key = config.get( "settings","fmi_api_key")
+
+# set logger
+logger = set_logger( "fmi_open" )
+
+from datetime import datetime,timedelta
+import os
+import urllib2
+from xml.etree import ElementTree
+gml_namespace = "http://www.opengis.net/gml/3.2"
+
+# sections in config file must match with layer names
+wfs_layers = {
+    'fmi_open_composite_dbz': 
+    'http://data.fmi.fi/fmi-apikey/{key}/wfs?request=GetFeature&storedquery_id=fmi::radar::composite::dbz',
+    'fmi_open_composite_rr1h':
+    'http://data.fmi.fi/fmi-apikey/{key}/wfs?request=GetFeature&storedquery_id=fmi::radar::composite::rr1h',
+    'fmi_open_composite_rr12h':
+    'http://data.fmi.fi/fmi-apikey/{key}/wfs?request=GetFeature&storedquery_id=fmi::radar::composite::rr12h',
+    'fmi_open_composite_rr24h':
+    'http://data.fmi.fi/fmi-apikey/{key}/wfs?request=GetFeature&storedquery_id=fmi::radar::composite::rr24h',
+    'fmi_open_composite_rr':
+    'http://data.fmi.fi/fmi-apikey/{key}/wfs?request=GetFeature&storedquery_id=fmi::radar::composite::rr'
+}
+
+
+config_dataset_names,config_sections = get_sections_from_config()
+
+def update():
+    add_datasets = []
+    return_datasets = []
+    logger.debug( "Start updating of DB" )
+    for layer in wfs_layers:
+        if not layer in config_dataset_names:
+            continue
+        else:
+            logger.debug( "Found layer %s from config" % layer )
+            section = config_sections[config_dataset_names.index(layer)]
+        # check that tiff directory exists
+        tiff_dir = "%s/%s" % (tiff_dir_base,layer)
+        if not os.path.exists(tiff_dir):
+            logger.debug( "GeoTIFF directory does not exist. Create it." )
+            os.makedirs(tiff_dir)
+        # get WFS to get WMS urls
+        try:
+            wfs_url = wfs_layers[layer].replace("{key}",api_key) 
+            response = urllib2.urlopen( wfs_url )
+            logger.debug( "Data from url %s fetched" % wfs_url )
+        # ignore network related problems
+        except:
+            logger.error( "Network error occurred while fetching url %s" % wfs_url )
+            continue
+        logger.debug( "Parse WFS response for layer %s" % layer )
+        wfs_response = ElementTree.fromstring(response.read())
+        file_references = wfs_response.findall('.//{%s}fileReference' % gml_namespace)
+        for ref in file_references:
+            url = ref.text
+            query = urllib2.urlparse.urlparse(url).query
+            query = query.split("&")
+            for q in query:
+                if q[0:4].lower()=="time":
+                    time_value = q.split("=")[-1]
+                elif q[0:3].lower()=="srs":
+                    projdef = q.split("=")[-1]
+                elif q[0:4].lower()=="bbox":
+                    bbox = q.split("=")[-1]
+            timestamp = datetime.strptime(time_value,\
+                    "%Y-%m-%dT%H:%M:%SZ")
+            if ( timestamp<read_expiration_time(int( config.get(section,"cleanup_time") )) ):
+                # do not store old files
+                logger.debug ( "Skip expired dataset %s:%s" % (layer,str(timestamp)))
+                continue
+            # search if dataset already exists
+            radar_datasets = session.query(RadarDataset)\
+                    .filter(RadarDataset.timestamp==timestamp)\
+                    .filter(RadarDataset.name==layer)
+            # skip file fetching if it already exists
+            if radar_datasets.count()>0:
+                logger.debug ( "Dataset %s:%s already in database" % (layer,str(timestamp) ) )
+                continue
+            output_filename = tiff_dir + "/" + time_value.replace(":","")\
+                    .replace("-","") + ".tif"
+            # save file to disk
+            try:
+                response = urllib2.urlopen( url )
+                logger.debug( "Fetched data from url %s" % url )
+            # ignore network related problems
+            except urllib2.URLError:
+                logger.error( "Network error or invalid api-key occurred while fetching url %s" % url )
+                continue
+            f = open(output_filename,"wb")
+            f.write( response.read() )
+            f.close()
+            # import dataset to db
+            logger.info( "Add new dataset: %s:%s to DB." % (layer,str(timestamp)) )
+            add_datasets.append(
+                RadarDataset(
+                    name = layer,
+                    timestamp = timestamp,
+                    geotiff_path = output_filename,
+                    projdef = projdef,
+                    bbox_original = bbox,
+                    dataset_type = config.get(section,"dataset_type"),
+                    unit = config.get(section,"unit"),
+                    style = config.get(section,"style"),
+                    title = config.get(section,"title")
+                )
+            )
+            return_datasets.append({"name": layer, 
+                                    "timestamp": timestamp })
+    # finally add new datasets to DB
+    if (len(add_datasets)>0):
+        session.add_all(add_datasets)
+        session.commit()
+    logger.info ( "Added %i results." % len(add_datasets) )
+    logger.debug( "Updating of DB finished" )
+    session.close()
+    return return_datasets
+
+if __name__ == '__main__':
+    update()
+    # delete old
+    for layer in wfs_layers:
+        if layer not in config_dataset_names:
+            continue
+        section = config_sections[config_dataset_names.index(layer)]
+        clean_up(config.get(section,"name"),
+                 int( config.get(section,"cleanup_time") ))
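The update() function above only inserts a dataset when no row with the same layer name and timestamp already exists. The duplicate guard can be sketched with an in-memory set standing in for the RadarDataset table query (the set, `maybe_add`, and the sample layer names are illustrative, not part of the codebase):

```python
from datetime import datetime

# Hypothetical in-memory stand-in for the RadarDataset table;
# the real code filters the SQLAlchemy session by name and timestamp instead.
existing = set()
add_datasets = []

def maybe_add(layer, timestamp):
    """Skip (layer, timestamp) pairs already stored, mirroring the DB check."""
    key = (layer, timestamp)
    if key in existing:
        return False  # "already in database": continue with the next dataset
    existing.add(key)
    add_datasets.append(key)
    return True

assert maybe_add("dbz", datetime(2013, 12, 13, 10, 0)) is True
assert maybe_add("dbz", datetime(2013, 12, 13, 10, 0)) is False  # duplicate skipped
```

As in the diff, the new rows are collected in a list and committed in one batch at the end, so a partial run does not leave half the timestamps in the DB.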
index 2ec87cc..cbc4f09 100644 (file)
@@ -34,7 +34,14 @@ except ImportError: # Windows
     from gdalconst import GDT_Int16
     import gdalnumeric as gdal_array
 
-def h5togeotiff(hdf_files,geotiff_target,dataset_name ="dataset1/data1",data_type="float"):
+class H5ConversionSkip(Exception):
+    """Raised to skip conversion of a dataset that has already expired."""
+    def __init__(self, message):
+        self.message = message
+    def __str__(self):
+        return self.message
+
+def h5togeotiff(hdf_files,geotiff_target,dataset_name ="dataset1/data1",data_type="float",expiration_time=None):
     """
     Converts BALTRAD hdf5 file to Mapserver compliant GeoTiff file. 
     Reprojection of data is included.
@@ -44,6 +51,8 @@ def h5togeotiff(hdf_files,geotiff_target,dataset_name ="dataset1/data1",data_typ
     * geotiff_target: Target GeoTIFF file path
     * target_projection: EPSG code string or set to None for no projection
     * dataset_name: change this if other information is wanted 
+    * data_type: data type for the target file: "float" or "int"
+    * expiration_time: if given (datetime.datetime), skip conversion of datasets older than this time
     """
     if not isinstance(hdf_files, (list, tuple)):
         hdf_files = [hdf_files]
@@ -53,6 +62,15 @@ def h5togeotiff(hdf_files,geotiff_target,dataset_name ="dataset1/data1",data_typ
         f = h5py.File(hdf5_source,'r') # read only
         where = f["where"] # coordinate variables
         what = f["what"] # data
+
+        # read time from h5 file
+        date_string = what.attrs["date"][0:8]
+        time_string = what.attrs["time"][0:4] # ignore seconds
+        starttime = datetime.strptime(date_string+"T"+time_string, "%Y%m%dT%H%M")
+        if expiration_time:
+            if starttime<expiration_time:
+                raise H5ConversionSkip("Conversion of expired dataset (%s) skipped" % str(starttime))
+
         dataset = f[dataset_name.split("/")[0]] 
         data_1 = dataset[dataset_name.split("/")[1]]
         data = data_1["data"]
@@ -90,10 +108,6 @@ def h5togeotiff(hdf_files,geotiff_target,dataset_name ="dataset1/data1",data_typ
         y_axis = numpy.arange( ymax,ymin,(ymin-ymax)/y_size )  # reversed
         #x_help_axis = numpy.arange( xmax,xmin,(xmin-xmax)/y_size ) # reverse this also
 
-        # read time from h5 file
-        date_string = what.attrs["date"][0:8]
-        time_string = what.attrs["time"][0:4] # ignore seconds
-        starttime = datetime.strptime(date_string+"T"+time_string, "%Y%m%dT%H%M")
         missing_value = data_what.attrs["nodata"]
 
         if data_type=="int":
@@ -139,10 +153,11 @@ def h5togeotiff(hdf_files,geotiff_target,dataset_name ="dataset1/data1",data_typ
     del out
 
     f.close()
-
     return {"timestamp":starttime.strftime("%Y-%m-%dT%H:%MZ"),
             "projection": proj_text,
-            "bbox": "%f,%f,%f,%f" % (lon_min,lat_min,lon_max,lat_max)}
+            "bbox_lonlat": "%f,%f,%f,%f" % (lon_min,lat_min,lon_max,lat_max),
+            "bbox_original": "%f,%f,%f,%f" % (xmin,ymin,xmax,ymax)
+            }
 
 if __name__ == '__main__':
     # command line use
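h5togeotiff() now returns `bbox_original` alongside `bbox_lonlat`, both as comma-separated strings, plus a timestamp in `%Y-%m-%dT%H:%MZ` form. A sketch of how a caller can parse such a result dict back into native types (the concrete values below are invented for illustration):

```python
from datetime import datetime

# Hypothetical result in the shape h5togeotiff now returns (values invented).
geotiff = {
    "timestamp": "2013-12-13T10:00Z",
    "projection": "+proj=stere +lat_0=90 +lon_0=20",  # placeholder proj string
    "bbox_lonlat": "8.500000,53.500000,32.500000,69.500000",
    "bbox_original": "-500000.000000,-1200000.000000,500000.000000,-400000.000000",
}

# Callers such as update_baltrad_wms.py parse the timestamp back like this:
timestamp = datetime.strptime(geotiff["timestamp"], "%Y-%m-%dT%H:%MZ")
# bbox strings split cleanly into xmin, ymin, xmax, ymax floats:
xmin, ymin, xmax, ymax = (float(v) for v in geotiff["bbox_original"].split(","))

assert timestamp == datetime(2013, 12, 13, 10, 0)
assert xmax == 500000.0
```

Keeping both bboxes lets the WMS advertise lon/lat extents while storing the native projected extent with the GeoTIFF.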
index e52e568..ca13a5b 100755 (executable)
 #!/usr/bin/env python
-
-config_path = "/home/tvp/baltrad/baltrad_wms/baltrad_wms.cfg"
-
-#
-# do not edit anything below
-#
-
 import os
 import ConfigParser
-from datetime import datetime
+from datetime import datetime,timedelta
 
-from h5togeotiff import h5togeotiff
+from h5togeotiff import h5togeotiff,H5ConversionSkip
+from db_setup import *
+from cleaner import *
 
 # read config
-config = ConfigParser.ConfigParser()
-config.read( config_path )
-datasets = ConfigParser.ConfigParser()
-datasets.read( config.get("locations","datasets") )
-
+from configurator import *
 h5_dir = config.get("locations","baltrad_data_dir")
 tiff_dir_base = config.get("locations","wms_data_dir")
 
-def read_datasets():
-    hdf_datasets = []
-    for i in (1,2):
-        try:
-            hdf_datasets.append({"hdf_dataset" : config.get("dataset_%i" % i,"hdf_dataset"),
-                                 "name":  config.get("dataset_%i" % i,"name"),
-                                 "unit": config.get("dataset_%i" % i,"unit")})
-        except ConfigParser.NoSectionError:
-            pass
-    return hdf_datasets
+# set logger
+logger = set_logger( "update_baltrad_wms" )
+
+def read_hdf_datasets():
+    config_datasets = []
+    datasets = []
+    logger.debug( "Read HDF5 dataset definitions from config" )
+    for section in config.sections():
+        if "dataset" in section:
+            config_datasets.append(section)
+    for dset in config_datasets:
+        if config.get(dset,"dataset_type")!="hdf":
+            continue
+        dataset = {}
+        for key in ("name","unit","dataset_type","hdf_dataset","style","title", "cleanup_time"):
+            dataset[key] = config.get(dset,key).strip()
+        datasets.append(dataset)
+    logger.debug ( "Found %i HDF5 datasets in config" % len(datasets) )
+    return datasets
 
 def update():
-    for d in read_datasets():
+    # Only valid for BALTRAD HDF type datasets!
+    logger.debug( "Start updating of DB" )
+    return_datasets = []
+    for d in read_hdf_datasets():
+        # create dataset geotiff directory if it doesn't exist
         tiff_dir = "%s/%s" % (tiff_dir_base,d["name"])
         if not os.path.exists(tiff_dir):
+            logger.debug( "GeoTIFF directory does not exist. Creating it." )
             os.makedirs(tiff_dir)
         h5_files = os.listdir( h5_dir )
         tiff_files = os.listdir( tiff_dir )
         # search for new datasets
         for h5_file in h5_files:
+            if d["dataset_type"]!="hdf":
+                logger.warn( "Dataset %s is not of type hdf. Skip file %s" % (d["name"],h5_file) )
+                continue
+            # read cleanup time
+            exp_time = read_expiration_time(int(d["cleanup_time"]))
+            # search if hdf5 dataset already exists
+            radar_datasets = session.query(RadarDataset)\
+                    .filter(RadarDataset.hdf_path==h5_file)\
+                    .filter(RadarDataset.name==d["name"])
+            # continue with next file if result is already in DB
+            if radar_datasets.count()>0:
+                logger.debug ( "File %s already in database" % h5_file )
+                continue
             basename = os.path.splitext( h5_file )[0]
             tiff_path = os.path.join( tiff_dir, basename+".tif")
             if not os.path.isfile( tiff_path ):
+                if d["unit"]=="dBZ":
+                    data_type = "int"
+                else:
+                    data_type = "float"
                 try:
-                    if d["unit"]=="dBZ":
-                        data_type = "int"
-                    else:
-                        data_type = "float"
-                    geotiff = h5togeotiff( os.path.join( h5_dir, h5_file ), tiff_path, d["hdf_dataset"], data_type)
-                except KeyError:
+                    try:
+                        logger.debug ( "Convert file %s from HDF5 to GeoTIFF" % h5_file )
+                        geotiff = h5togeotiff( os.path.join( h5_dir, h5_file ), \
+                                tiff_path, d["hdf_dataset"], data_type,exp_time)
+                    except H5ConversionSkip as skipped:
+                        logger.info ( skipped.message )
+                        continue
+                except IOError:
+                    # ignore broken source files
+                    logger.error ( "Broken file detected: %s" % os.path.join( h5_dir, h5_file ) )
                     continue
-                dataset_name = geotiff["timestamp"]
-                try: # if already exists, add only new parameter
-                    datasets.set(dataset_name,d["name"],tiff_path)
-                except ConfigParser.NoSectionError: # create new
-                    datasets.add_section(dataset_name)
-                    datasets.set(dataset_name,"projdef",geotiff["projection"])
-                    datasets.set(dataset_name,d["name"],tiff_path)
-                    datasets.set(dataset_name,"bbox",geotiff["bbox"])
-                datasets.write(  open(config.get("locations","datasets"),"w") )
-
-        # deprecate old datasets
-        for tiff_file in tiff_files:
-            basename = os.path.splitext( tiff_file )[0]
-            h5_path = os.path.join( h5_dir, basename+".h5" )
-            if not os.path.isfile( h5_path ):
-                tiff_path = os.path.join( tiff_dir, tiff_file )
-                os.remove( tiff_path )
-                for timestamp in datasets.sections():
-                    if datasets.get(timestamp,d["name"])==tiff_path:
-                        datasets.remove_section(timestamp)
-                        break
-                datasets.write( open(config.get("locations","datasets"),"w") )
+                timestamp = datetime.strptime(geotiff["timestamp"],\
+                        "%Y-%m-%dT%H:%MZ")
+                logger.info( "Add new dataset: %s:%s to DB." % (d["name"],str(timestamp)) )
+                add_datasets = [
+                    RadarDataset(
+                        name = d["name"],
+                        timestamp = timestamp,
+                        geotiff_path = tiff_path,
+                        hdf_path = os.path.join( h5_dir, h5_file ),
+                        projdef = geotiff["projection"],
+                        bbox_original = geotiff["bbox_original"],
+                        dataset_type = d["dataset_type"],
+                        unit = d["unit"],
+                        style = d["style"],
+                        title = d["title"]
+                    )
+                ]
+                return_datasets.append({"name": d["name"], 
+                                        "timestamp": timestamp })
+                session.add_all(add_datasets)
+                session.commit()
+            else:
+                logger.debug ( "GeoTIFF for file %s already exists. Skip it" % h5_file )
+    logger.info ( "Added %i results." % len(return_datasets) )
+    logger.debug( "Updating of DB finished" )
+    session.close()
+    return return_datasets
 
 if __name__ == '__main__':
-    update()
-
+    dsets = update()
+    for d in read_hdf_datasets():
+        clean_up(d["name"],int(d["cleanup_time"]))
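The expiration flow spans cleaner.py (not shown in this diff) and H5ConversionSkip: update() computes an expiration time from each dataset's `cleanup_time` and h5togeotiff() raises when the file's start time predates it. A self-contained sketch, assuming `cleanup_time` is an age limit in hours (the unit is a guess, since cleaner.py's `read_expiration_time` is not in the diff):

```python
from datetime import datetime, timedelta

class H5ConversionSkip(Exception):
    """Raised to skip conversion of an expired dataset (mirrors h5togeotiff.py)."""

def read_expiration_time(cleanup_time_hours, now=None):
    # Assumed semantics: datasets older than cleanup_time_hours are expired.
    now = now or datetime.utcnow()
    return now - timedelta(hours=cleanup_time_hours)

def check_not_expired(starttime, expiration_time):
    # Same comparison h5togeotiff() performs before converting.
    if expiration_time and starttime < expiration_time:
        raise H5ConversionSkip(
            "Conversion of expired dataset (%s) skipped" % starttime)

# With a 48 h cleanup window, a day-old scan converts, an old one is skipped.
now = datetime(2013, 12, 13, 12, 0)
exp = read_expiration_time(48, now=now)
check_not_expired(datetime(2013, 12, 12, 12, 0), exp)  # fresh: no exception
try:
    check_not_expired(datetime(2013, 12, 1, 0, 0), exp)
except H5ConversionSkip as skipped:
    print(str(skipped))
```

Raising an exception (rather than returning a flag) lets the caller log the skip once and `continue`, without adding a second return path to the converter.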