Project P9: Introducing big-data and crowdsourcing to seismic hazard assessment

Timescale: Oct. 2021 – Sept. 2024

Supervisors:

Prof. Fabrice Cotton, GFZ Potsdam & University of Potsdam

Dr. Matthias Ohrnberger, University of Potsdam

Dr. Niels Landwehr, University of Potsdam

Objectives and Methods

Classical seismic hazard assessments and ground-shaking models are based on spatially sparse databases of seismograms collected by academic networks. Once analysed and interpreted, these data define the rate of occurrence of seismic events and the associated ground-shaking. Most of these seismic hazard assessments are evaluated at the country level.

The urban population, however, continues to grow, and there is a need to develop region- and site-specific predictions of ground-shaking and seismic hazard evaluations. Such regionalisation and site-specific assessment are possible only if we are able to increase the density of ground-shaking records in urban areas and extract from them local amplification factors and ground-shaking properties.

New technologies not only make it possible to increase the density of academic seismological regional sensors; new "crowdsourcing" datasets are also emerging and growing exponentially. Low-cost environmental monitoring systems, smartphone apps for global earthquake eyewitnesses (e.g. Bossu et al., 2018, International Journal of Disaster Risk Reduction) and fibre-optic cables (e.g. Jousset et al., 2018, Nature Communications) provide new ways to evaluate the intensity of seismic ground-shaking. These datasets are highly heterogeneous, but they provide a unique opportunity to increase the density of data in urban centers and to develop hazard assessment at the regional and even city level.

The goal of the PhD will first be to develop new data-analysis algorithms that deal with the unstructured nature and heterogeneous reliability of these new datasets. High-quality data from "classical" academic networks (e.g. GFZ seismological networks) will first be used to test and evaluate the potential of these new datasets ("data filtering"). In a second step ("data fusion"), the data will be harmonized and integrated. We will finally develop ground-shaking models ("data analysis") which will benefit from the density and repeatability of these new, dense datasets to predict regional and local variations of ground-shaking using machine-learning methods. These developments will benefit from the experience of the project's PIs and their colleagues with non-ergodic ground-motion modeling, signal processing and machine-learning methods.
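As a purely illustrative sketch of the filter–fuse–analyse workflow described above (all function names, thresholds and per-source weights below are hypothetical, not the project's actual pipeline), the idea could look like this in Python:

```python
# Hypothetical sketch: validate crowdsourced readings against a co-located
# academic reference station, then fuse the surviving readings.
# Thresholds, weights and amplitude values are illustrative only.

def filter_records(crowd, reference, tol=0.5):
    """Keep crowdsourced log-amplitude readings within `tol` log10 units
    of the reference-network reading for the same event ("data filtering")."""
    return {
        sensor: value
        for sensor, value in crowd.items()
        if abs(value - reference) <= tol
    }

def fuse(records_by_source, weights):
    """Weighted mean across heterogeneous sources (e.g. smartphone apps,
    low-cost sensors, fibre-optic channels) for one site ("data fusion")."""
    num = sum(weights[src] * val for src, val in records_by_source.items())
    den = sum(weights[src] for src in records_by_source)
    return num / den

# Toy event: log10 peak-amplitude readings (arbitrary units)
reference = -1.20                                        # academic station
crowd = {"phone_a": -1.05, "phone_b": -2.90, "lowcost_1": -1.30}

kept = filter_records(crowd, reference)                  # "phone_b" is rejected
weights = {"phone_a": 0.3, "lowcost_1": 0.7}             # assumed reliabilities
site_estimate = fuse(kept, weights)
amplification = site_estimate - reference                # crude site-term proxy
print(site_estimate, amplification)
```

In a real application the reliability weights would themselves be learned from the comparison against the academic network, and the final step would feed the fused amplitudes into a ground-motion model rather than a simple difference.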

Expected results

  1. A better understanding of the potential of crowdsourcing datasets for ground-shaking modelling
  2. Innovative data filtering, heterogeneous data fusion and machine-learning methods (which could be applied to other research topics covered by the NatRisk graduate school)
  3. Regional and site-specific ground-shaking and hazard models

Dedicated Regional Cluster(s) and data availability

This project will benefit from data-mining methods and the training developed by the Helmholtz Einstein International Berlin Research School in Data Science (HEIBRiDS) and Geo.X (the Research Network for Geosciences in Berlin and Potsdam). We will collaborate with the European Mediterranean Seismological Centre (EMSC), one of the top global earthquake information centers, which has been empirically developing a multichannel rapid information system comprising websites, a Twitter quakebot, and a smartphone app for global earthquake eyewitnesses. We will also have access to the data generated by fibre-optic cables (GFZ project of Dr. Philippe Jousset) and to the low-cost sensors currently deployed by the spin-off company QuakeSaver, recently founded in Potsdam by Dr. Danijel Schorlemmer.

Links to former and current PhD-projects

This project will benefit from the PhD work of Sebastian Specht (NatRisk Change PhD project I3) and Henning Lilienkamp (HEIBRiDS graduate school), who have developed new methods of ground-shaking analysis adapted to dense seismological networks. The ongoing PhD work of Reza Dokht Dolatabadi Esfahani (NatRisk Change PhD project I7) is developing innovative data-size reduction strategies (retaining a comparable level of scientific information) which could also be used in this project.

Responsibilities: The PhD project "Introducing big-data and crowdsourcing to seismic hazard assessment" is based in the research teams "General Geophysics" of the University of Potsdam and "Seismic Hazard and Risk Dynamics" of the GFZ German Research Centre for Geosciences. The recruited candidate will be in charge of testing, analysing and exploring the new datasets, e.g. using methods (such as machine learning) recently developed by the team (R and Python code, Jupyter notebooks), but will also be expected to suggest and develop innovative processing strategies and new ideas for using these data to improve ground-motion and hazard models.

Requirements: We are seeking applications from highly motivated candidates with an excellent Master's degree in mathematics, geosciences, physics or a related discipline. Programming skills are mandatory. We expect a solid background in seismology, statistics and signal processing, and an interest in the quantitative assessment of hazards and in the application and development of new machine-learning methods. The PhD project will be carried out in an interdisciplinary research team. Fluency in English (speaking and writing) is mandatory.