Our cloud infrastructure provides users with innovative methods to process and distribute data efficiently, in a matter of seconds.
Thanks to the EO Finder tool, users can order and process data automatically via the API or manually through the GUI.
Depending on the user's experience and needs, four complementary processing modes are available to all registered users:
Processing on the Virtual Machine
Users run a local processing chain on a dedicated VM instance (or on a VM cluster orchestrated with a tool such as Kubernetes or Docker), interacting with the archive via local access interfaces (S3, NFS) or OGC interfaces (WCS/WMS/WMTS).
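From a VM, the simplest way to reach the archive is through the local access interfaces. A minimal sketch of NFS-style access is shown below; the directory layout (mission/date/product) and the mount point are hypothetical examples, so check the archive documentation on your VM for the actual structure.

```python
from pathlib import Path

def list_granules(archive_root: str, mission: str, date: str) -> list[str]:
    """List product directories for a given mission and date under an
    NFS-mounted EO archive.  The mission/date/product layout used here
    is an assumed example, not the guaranteed archive structure."""
    root = Path(archive_root) / mission / date
    if not root.is_dir():
        return []
    # Each product is stored as its own directory (e.g. a SAFE archive).
    return sorted(p.name for p in root.iterdir() if p.is_dir())
```

The same traversal works unchanged whether the archive is mounted over NFS or exposed through an S3 FUSE mount, which is why local access interfaces integrate well with existing processing chains.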
Processing in the shared code development environment
Jupyter - users run their code interactively, accessing the archive either through preinstalled libraries (EOlearn, GDAL) or directly.
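A typical interactive notebook computation is a band index over Sentinel-2 data. The sketch below assumes the red and near-infrared bands have already been loaded as NumPy arrays (e.g. via GDAL or EOlearn; the loading step is omitted):

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalised Difference Vegetation Index from Sentinel-2 bands.
    Inputs are reflectance arrays of equal shape; the band-loading code
    (GDAL/EOlearn) is intentionally left out of this sketch."""
    red = red.astype("float64")
    nir = nir.astype("float64")
    # Clip the denominator to avoid division by zero over no-data pixels.
    return (nir - red) / np.clip(nir + red, 1e-9, None)
```

Because the notebook runs next to the archive, arrays like these can be pulled band by band without downloading whole products.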
PGaaS (Product Generation as a Service) - users run tasks in a serverless environment, triggering subsequent processing steps via an API that operates on the archive using local access interfaces (S3, NFS) or OGC interfaces (WCS/WMS/WMTS). Our processing is container-based, so we can process up to several hundred thousand products for a customer in a very short time. Thanks to attractive pricing models, users save time while we handle the processing.
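Triggering the next step via an API usually means submitting a small job description. The sketch below assembles such a request; the field names (`processor`, `inputs`, `on_complete`) are illustrative assumptions, not the actual PGaaS schema, which should be taken from the API reference.

```python
import json

def build_job_request(processor: str, product_ids: list[str],
                      callback_url: str) -> str:
    """Assemble a JSON job description for a serverless processing API.
    All field names are hypothetical placeholders for the real schema."""
    return json.dumps({
        "processor": processor,          # e.g. a coherence or backscatter step
        "inputs": product_ids,           # products to operate on
        "on_complete": callback_url,     # the next step in the chain is
                                         # triggered through this callback
    }, sort_keys=True)
```

Chaining jobs through completion callbacks is what lets a serverless setup fan out over hundreds of thousands of products without a long-running coordinator.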
Processing of data using external applications
Users of EO data processing software, e.g. GIS applications, can access Earth observation data through OGC interfaces (WCS/WMS/WMTS) and use the data processing capabilities of Sentinel Hub to process data on the fly.
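Under the hood, a GIS client fetches imagery by issuing standard WMS GetMap requests. The sketch below builds such a request URL by hand; the base URL and layer name are placeholders that depend on the service instance you connect to.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url: str, layer: str,
                   bbox: tuple[float, float, float, float],
                   width: int = 512, height: int = 512) -> str:
    """Build a WMS 1.3.0 GetMap request URL -- the same request a GIS
    client issues internally.  `base_url` and `layer` are placeholders."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # min/max coords in CRS order
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{base_url}?{urlencode(params)}"
```

Because the rendering happens server-side, the client only ever downloads the finished tile, which is what makes on-the-fly processing practical over slow links.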
Currently, a number of processors are available to automatically generate Analysis-Ready Data from Sentinel-1 (terrain-corrected backscatter, interferometric coherence) and Sentinel-2 (surface reflectance with Sen2Cor). Additional processors, delivered by the user in the form of a Docker image, can be installed. Using a dockerized application allows serverless data processing with the user's own workflow rules.
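A user-supplied processor image is ultimately executed as a container run. The sketch below only composes the invocation a worker might use; the image name and the `--input`/`--output` argument convention are hypothetical, since the actual contract is defined by the platform's processor interface.

```python
def docker_run_command(image: str, input_uri: str, output_uri: str) -> list[str]:
    """Compose the `docker run` invocation a serverless worker might use
    to execute a user-supplied processor image.  The argument convention
    shown here is an assumption, not the platform's actual contract."""
    return [
        "docker", "run", "--rm",   # remove the container when the job ends
        image,                     # the user-delivered processor image
        "--input", input_uri,      # product(s) to process
        "--output", output_uri,    # where the ARD result is written
    ]
```

Packaging the workflow rules inside the image is what keeps the platform side generic: the worker needs no knowledge of the processor beyond this invocation.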
Sample automated processing can be found on CREODIAS S2scenes.
Try our services for free.
Receive up to 150 EUR to test your solution.