Open positions

 

Multiple researcher positions open at Risks-X, SUSTech

The Institute of Risk Analysis, Prediction and Management (Risks-X) at the Southern University of Science and Technology (SUSTech) has multiple postdoc and researcher positions open at all ranks: Postdoc, Research Associate, Research Assistant Professor, Research Associate Professor, and Research Professor. We are looking for outstanding candidates with demonstrated research achievements in areas including, but not restricted to: quantitative finance, socio-economic data analysis, natural hazards, and earthquake modeling. We are looking for researchers for the list of projects attached to this advertisement, but candidates from related directions are welcome as well.

Risks-X was established by the Southern University of Science and Technology (Shenzhen, China) in close collaboration with the Swiss Federal Institute of Technology Zurich (ETH Zurich, Switzerland). The Institute is committed to building a revolutionary dynamic risk management platform, with real-time, dynamic monitoring of extreme risks in various system applications, simulation of future scenarios, and analysis and prediction of risk trends, in order to help society develop strong resilience and improve social responsibility.

Our Team


We are mavericks: we like to go off the beaten academic track and do not shy away from attacking big problems, and we work closely with industry experts to solve the real-world challenges facing human beings and society. The position offers the opportunity to work in a dynamic, multidisciplinary, and multicultural environment with teams researching diverse topics such as risks in finance, earthquakes, real estate, geohazards, energy, epidemics, catastrophic risks, social networks, and cyber security.

The institute is jointly led by Prof. Didier Sornette, member of the Swiss Academy of Engineering Sciences, holder of the Chair of Entrepreneurial Risks at ETH Zurich (and chair professor at SUSTech), and Prof. Xiaofei Chen, member of the Chinese Academy of Sciences, professor and head of the Department of Earth and Space Sciences at SUSTech.

The City and The University

Established in 2012, the Southern University of Science and Technology (SUSTech) is a public institution funded by the municipality of Shenzhen, a special economic zone city in China. Shenzhen is a major city in Southern China, situated immediately north of the Hong Kong Special Administrative Region. As one of China's major gateways to the world, Shenzhen has been the country's fastest-growing city over the past two decades. The city is the high-tech and manufacturing hub of southern China, home to the world's third-busiest container port and the fourth-busiest airport on the Chinese mainland. As a picturesque coastal city, Shenzhen is also a popular tourist destination and was named one of the world's 31 must-see tourist destinations in 2010 by The New York Times. Shenzhen ranked 66th on the 2017 Global City Competitiveness List released by the National Academy of Economic Strategy, the Chinese Academy of Social Sciences, and UN-Habitat. By the end of 2016, Shenzhen had around 20 million residents.

SUSTech is a pioneer in higher education reform in China. The mission of the University is to become a globally recognized research university which emphasizes academic excellence and promotes innovation, creativity and entrepreneurship. Set on five hundred acres of wooded landscape in the picturesque Nanshan (South Mountain) area, the campus offers an ideal environment for learning and research.

Requirements

  • A PhD degree and demonstrated research capabilities.
  • A background in science and engineering fields such as statistics, physics, quantitative finance, mathematics, computer science, or geophysics is highly preferred.
  • Strong desire and enthusiasm to work with a dynamic, growing and hard-working group.
  • A "yes, and" doer attitude rather than a "no, but" or "maybe" attitude.
  • Strong English communication skills.

Salary and conditions

  • Competitive salaries and fringe benefits, including pension, medical and other allowances, which are among the best in China. Salary and rank will be commensurate with qualifications and experience.
  • Opportunity for a research exchange visit at ETH Zurich for a period of 3-12 months.

The City of Shenzhen runs the Peacock Plan to support talented scholars, which provides a 1.6-3 million RMB after-tax cash subsidy. The university will help qualified candidates obtain the subsidy upon the approval of the municipal government.

Application

Please send your application to Prof. Ke WU (), with the following documents:
1) CV in English (please send Chinese version as well if available)
2) Cover letter which should explain your motivation and main achievements
3) Representative papers or other supportive documents if available

Project list


1. The Chinese financial crisis observatory

In 2008, in reaction to the widely spread belief that the great crisis was “bad luck” and could not have been predicted, we launched the Financial Crisis Observatory (FCO) at the Chair of Entrepreneurial Risks of Prof. D. Sornette at ETH Zurich, with the goal of testing two nested hypotheses: (H1) financial bubbles can be diagnosed in advance, before their bursts confirm their existence; (H2) given that a bubble is diagnosed in real time, its end can be probabilistically forecasted with skill significantly better than chance. These two hypotheses are built on a theory of financial bubbles that emphasizes the general mechanism of positive feedback, which leads in particular to specific signatures in the form of log-periodic power law singular (LPPLS) price behaviors. Since then, our group has published a large number of ex-ante real-time forecasts, with a quite remarkable track record. This suggests the possibility of developing operational early warning signals and actions that could mitigate bubbles and crashes.
We are now building a Chinese version of the financial crisis observatory to cover the major assets in China based on our existing systems and methodologies. The goal of this project is to develop a full picture of the financial bubble states in different Chinese assets, with input information from both structured data (price, fundamental data, etc.) and unstructured data (social networks and financial news). The Chinese FCO will also publish monthly reports in Chinese with in-depth analysis of Chinese and Global markets, based on more detailed data analysis, market news and macro policies. Currently we also have the following sub-projects under this framework:

  • Chinese stock market manipulation analysis and detection
  • Chinese equity analyst research reports analysis and evaluation
  • Bubble and crash analysis and prediction based on the Log-Periodic Power Law Singularity (LPPLS) model
  • Development of a user-friendly web interface presenting graphical information on the bubble signals, with interactive feedback
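The LPPLS signature underlying these bubble diagnostics can be illustrated in a few lines. The sketch below evaluates the standard LPPLS log-price formula on a synthetic bubble; the parameter values are illustrative assumptions, not calibrated outputs of the FCO:

```python
import numpy as np

def lppls_log_price(t, tc, m, omega, A, B, C, phi):
    """LPPLS log-price: ln p(t) = A + (tc-t)^m * (B + C*cos(omega*ln(tc-t) - phi)).

    Valid for t < tc, where tc is the critical time marking the end of the
    bubble; 0 < m < 1 and B < 0 give the super-exponential (faster-than-
    exponential) growth decorated by log-periodic oscillations.
    """
    dt = tc - t
    return A + dt**m * (B + C * np.cos(omega * np.log(dt) - phi))

# Synthetic bubble accelerating toward a critical time tc = 100 (illustrative values).
t = np.linspace(0.0, 99.0, 500)
logp = lppls_log_price(t, tc=100.0, m=0.5, omega=8.0, A=5.0, B=-0.8, C=0.05, phi=0.0)
```

In practice, calibrating the nonlinear parameters (tc, m, omega, phi) to observed prices is the fitting problem at the heart of these bubble diagnostics.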


2. Market impact and performance of arbitrageurs of financial bubbles in an agent-based model

Building on the previous project on the diagnosis and prediction of financial bubbles, what could be the market impact of such predictions? Could there be severe unintended consequences deriving from the general use of such diagnostic methods? The present project aims precisely to address these outstanding questions regarding the market impact of potential superior knowledge of the development of bubbles.
The starting point is a versatile agent-based model (or computational economic model) with two or more assets, populated with different types of investors: fundamentalists and chartists. Fundamentalists form expectations on the return and risk of the risky assets and maximize their expected utility, with constant relative risk aversion, with respect to their allocation between the risky asset and the risk-free asset. Chartists are subject to social imitation and follow momentum trading. Allowing for random time-varying herding propensity, the model has been able to reproduce several well-known stylized facts of financial markets, such as a fat-tailed distribution of returns and volatility clustering. Moreover, transient faster-than-exponential bubble growth with approximate log-periodic behavior is observed and can be rationalized theoretically. The model accounts well for the behavior of traders and for the price dynamics that developed during the dotcom bubble in 1995–2000. Momentum strategies are shown to be transiently profitable, consistent with these strategies enhancing herding behavior.
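A minimal sketch of such a fundamentalist/chartist price dynamic is shown below; the function name, parameter names, and values are illustrative assumptions, not the actual model specification:

```python
import numpy as np

def simulate_market(T=2000, p0=100.0, fundamental=100.0,
                    chi_f=0.05, chi_c=0.8, sigma=0.01, seed=0):
    """Minimal fundamentalist/chartist price-impact model (illustrative sketch).

    Fundamentalists push the log-price toward the log fundamental value
    (mean reversion); chartists extrapolate the last return (momentum);
    their aggregate demand plus noise moves the price each step.
    """
    rng = np.random.default_rng(seed)
    logp = np.empty(T)
    logp[0] = np.log(p0)
    last_ret = 0.0
    for t in range(1, T):
        d_fund = chi_f * (np.log(fundamental) - logp[t - 1])  # mean reversion
        d_chart = chi_c * last_ret                            # momentum
        ret = d_fund + d_chart + sigma * rng.standard_normal()
        logp[t] = logp[t - 1] + ret
        last_ret = ret
    return np.exp(logp)

prices = simulate_market()
```

Even this stripped-down version exhibits the central tension described above: mean-reverting pressure from fundamentalists competing with momentum amplification from chartists.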
The goal of this project is to extend considerably the agent-based model to study the bubble dynamics, market impact, and long-term growth of a financial market when either profit-seeking investors (“dragon-riders”), stability-oriented policy makers (“dragon-slayers”), economic growth-focused authorities (“dragon-groomers”), or a mixture of them employ the LPPLS methodology (and other bubble detection methods) to diagnose bubbles in real time. Agent-based simulations will extend the existing model of super-exponential financial bubbles with two assets (risky and risk-free), in which fundamentalist and chartist traders (noise traders) co-exist.
One goal is to explore extensively the set of possible strategies of the dragon-riders, dragon-slayers, and dragon-groomers, building on our expertise in agent-based models and our strong contacts with the hedge-fund industry in particular, in order to develop realistic synthetic markets. We propose to endogenise the dynamics of the coupling strength between noise-traders by including feedback loops, in which the imitative mood of agents evolves as a function of the market dynamics and the agents' interpretation of its generating mechanisms. We will also use machine learning and genetic algorithms to evolve sophisticated specifications of LPPLS traders’ strategies. The goal of this part of the research is to quantify the stabilizing versus destabilizing impact of dragon-riders, exploring this question in the large universe of strategies while taking into account the economic constraints that govern the evolution of the ecology of investors. We propose to build a large number of summary statistics of the resulting market dynamics to provide a full quantification of the different regimes.
Considering the dragon-slayers (embodying a policy maker aiming to stabilize financial markets), we want to explore the best strategies both in the sense of their short-term goal of suppressing or minimizing bubbles and crashes and in terms of the long-term consequences for market performance. Our ABM set-up can incorporate the feedback of policy decisions, for example on monetary policy via interest rates. From a different perspective, policy makers may want to project a positive investment environment to encourage the rebound of an anemic economy. The model will need to be extended to allow for credit creation, so that investors can borrow and leverage. We will incorporate a feedback channel to the performance of the fundamental value, reflecting the expected impact on innovation and real economic growth, which would be the target of the policy makers.
The agent-based model is also well suited to explore the existence of a trade-off between economic growth and financial stability objectives. In this way, we want to provide an original approach to inform decision makers about the pros and cons of their large-scale interventions. Both dragon-riders and dragon-slayers (or dragon-groomers) coexist in reality, and we aim to vigorously investigate the behaviors that result from putting all these players together in our artificial world. In particular, we want to study the economic performance, financial stability, and evolution of the ecology of investors that emerge from the selection process of the market dynamics. As the dynamics are inherently nonlinear with complex feedback loops, one should be prepared for unexpected behaviors.
We also need to investigate the interplay between the heterogeneity of time scales of different types of investors with the co-existence of many different strategy variants. One goal is to provide a robust answer to the question of the impact of an ecology of LPPLS traders and more generally of arbitrageurs of financial bubbles and market exuberance, in the presence of competing value investors and noise traders in stock markets exhibiting recurrent bubbles and crashes. Moreover, we want to study the dependence of the long-term macroeconomic growth rate on bubbles, by creating a feedback channel between financial markets and technological progress, in which above-average technological progress increases the dividend growth rate of the risky asset in the long run. This, in turn, will cause the stock’s value as seen by fundamentalists to rise, implying a higher stock price. We want to study the policy implications of regulating bubbles and crashes, which may involve a tradeoff between short-term stability and long-term growth.

3. Angel Investment Fund of Fund Risk Assessment and Decision Analysis (in collaboration with Shenzhen Angel Fund of Fund)
The PE & VC investment industry is prosperous in Shenzhen, with about 300 leading companies from Shenzhen having gone public in the past 30 years. To build Shenzhen into a global venture capital center, and to support innovation and start-ups from an early stage, the Shenzhen Government decided to set up an angel-stage FoF, the Shenzhen Angel FoF, in 2018, with an initial AUM of 5 billion Yuan. In order to better promote the venture capital market in Shenzhen and to better understand the dynamics and risks of angel investment, we are teaming up with Shenzhen Angel FoF to initiate this project with the following objectives:

  1. Develop risk assessment and decision-making models for angel-stage investment and risk management.
  2. On the basis of the existing government performance evaluation standards for angel FoFs, study performance evaluation and risk control models for the sub-funds.
  3. Study the effectiveness of government guidance funds, risk management mechanisms, and evaluation methods.
  4. Develop a Shenzhen Angel Investment Risk Index, an Angel Fund Risk Management Professional Ability Ranking, and so on, through the Angel Investor Alliance established by Shenzhen Angel FoF.


4. Earthquake Cascading Effects quantified by the Earthquake Hazard Adjacency Matrix using Graph Theory & Machine Learning for improved Societal Resilience

As humans increase their impact on the planet, the risk associated with natural hazards can be amplified by emerging chains-of-events. This is especially true for large to great earthquakes, which are particularly prone to trigger other natural events, critical infrastructure failures, and further socio-economic disruption, with domino effects potentially leading to social unrest, economic slowdown, and even disease outbreaks and financial crises. The consequences of such super-catastrophes are usually unexpected. Cascading effects have so far been analyzed on a case-by-case basis or at a generic, semi-quantitative level, and never at a systematic, global level. Available data are uneven and scattered, often limited to secondary consequences or to specific critical infrastructures. The complex nature of interacting and interconnected relationships between different events thus needs to be integrated into a holistic framework. In this project, for the first time, we will take the ambitious approach of exploring the space of possible interactions, based on a systematic survey of all the available empirical evidence. We will develop a comprehensive, global database of past earthquake interdependencies based on an adjacency matrix (the Earthquake Hazard Adjacency Matrix, or EQ-HAM) and analyze the main drivers of earthquake cascading disasters. We will first use data mining to extract all available information and encode it in the EQ-HAM. We will then investigate the statistical trends of earthquake risk amplification using graph theory and machine learning, by defining indices that reflect the severity and depth of interconnections as a function of various environmental and macroeconomic parameters. Finally, we will provide decision-makers with a predictive tool for direct applications of the EQ-HAM in China.
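The adjacency-matrix idea behind the EQ-HAM can be sketched as follows; the hazard categories and counts below are hypothetical, purely to illustrate the graph-theoretic analysis:

```python
import numpy as np

# Hypothetical mini EQ-HAM: rows trigger columns, entries are observed
# interaction counts between hazard/impact types (illustrative numbers only).
hazards = ["earthquake", "landslide", "tsunami", "infrastructure_failure"]
eq_ham = np.array([
    [0, 12, 5, 9],   # earthquake triggers ...
    [0,  0, 0, 3],   # landslide triggers ...
    [0,  0, 0, 6],   # tsunami triggers ...
    [0,  0, 0, 0],   # infrastructure failure triggers ...
])

out_degree = eq_ham.sum(axis=1)  # how strongly each hazard triggers others
in_degree = eq_ham.sum(axis=0)   # how often each hazard is triggered

# Two-step cascades: (A @ A)[i, j] counts weighted paths i -> k -> j,
# i.e. indirect domino effects mediated by an intermediate hazard.
two_step = eq_ham @ eq_ham
```

Indices of cascade severity and depth, of the kind mentioned above, can then be built from such degree statistics and powers of the matrix.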
The aim of the proposed project is to help better understand, assess, and predict earthquake cascading effects from the local to the global scale and to answer questions such as: Above which magnitude does an earthquake trigger catastrophic cascading effects? How is earthquake risk amplification evolving over time? What are the spatial variations around the world and in China? What are the key drivers of earthquake cascading effects? The plan is consistent with the Sendai Framework for Disaster Risk Reduction 2015-2030 towards improved societal resilience. This project will indeed allow better mitigation of earthquake risk amplification by focusing efforts on the critical event characteristics and environmental conditions that are demonstrated to promote such catastrophic chains-of-events.

5. Global Earthquake Forecast System for China

The project consists of the extension and application to China of a Global Earthquake Forecast System started at ETH Zurich a few years ago under the direction of Prof. D. Sornette, based on the wide range of data provided by different sensors on satellites and on the ground. In addition to a variety of international sources, the project aims to develop strong collaborations with various groups in China, in particular with the CEA and its Institute of Earthquake Forecasting. In a nutshell, the logic of the Earthquake Forecast System is based on
(i) the multi-phenomena nature of earthquake precursors,
(ii) a unifying theory in terms of stress activation of mobile electric charges,
(iii) multi-observations, multi-dimensional continuous monitoring,
(iv) multi-criteria multi-dimensional analyses and synthesis of precursors into a decision function providing earthquake alarms and likelihoods of target events,
(v) a decision making process towards operational activation and use by authorities, industry and citizens.
Forecasting earthquakes implies that there are time-varying processes that depend on the changing conditions deep in the Earth’s crust prior to major seismic activity. These processes may be linearly or non-linearly correlated. In seismology, research has traditionally centered on mechanical variables, including precursory ground deformation (revealing the build-up of stress deep below) and prior seismic events (past earthquakes may be related to, or even trigger, future earthquakes). Since the results have been less than convincing, there is a general consensus in the geoscience community that earthquake forecasting on time scales comparable to meteorological forecasts is still quite far in the future, if ever attainable.
The Global Earthquake Forecast System is based on innumerable reports of other types of precursory phenomena ranging from emission of electromagnetic waves from ultralow frequency (ULF) to visible (VIS) and near-infrared (NIR) light, electric field and magnetic field anomalies of various kinds (see below), all the way to unusual animal behavior, which has been reported again and again. Space and ground anomalies preceding and/or contemporaneous to earthquakes include:


Satellite Component
1. Thermal Infrared (TIR) anomalies
2. Total Electron Content (TEC) anomalies
3. Ionospheric tomography
4. Ionospheric electric field turbulences
5. Atmospheric Gravity Waves (AGW)
6. CO release from the ground
7. Ozone formation at ground level
8. VLF detection of air ionization
9. Mesospheric lightning
10. Lineaments in the VIS-NIR

Ground Station Component
1. Magnetic field variations
2. ULF emission from within the Earth crust
3. Tree potentials and ground potentials
4. Soil conductivity changes
5. Groundwater chemistry changes
6. Trace gas release from the ground
7. Radon emanation from the ground
8. Air ionization at the ground surface
9. Sub-ionospheric VLF/ELF propagation
10. Nightglow


These precursory signals are intermittent and do not seem to occur systematically before every major earthquake. Researchers have attempted to explain and exploit them individually, but never together. Likewise, reports of pre-earthquake signals from the above list are not widely accepted by the geoscience community at large because no one could explain their origins. In addition, the diversity of the signals makes them look disparate and unrelated, hampering progress. Based on decades of research investment, there is now a unifying theory of a solid-state mechanism capable of explaining the multitude of reported pre-earthquake phenomena. Analysis of satellite and ground station data recorded before past large earthquakes has provided clear evidence that precursory signals tend to become measurable days, sometimes weeks, before the disasters. Since we have a serious scientific hypothesis of how these signals are generated, we now have a strong rationale for a concerted initiative to continually monitor the Earth’s surface, both from satellites and from ground stations, with the goal of covering all relevant possible diagnostics. The appearance of different telltale signs will be consolidated in data centers, where data processing, analysis, and synthesis will be carried out.
A crucial novelty is to use the multi-phenomena, multi-dimensional and multi-scale inputs to obtain robust decision outputs of earthquake alarms and the likelihood functions of target earthquakes, using rigorous statistical and machine learning techniques designed to tackle sparse intermittent multi-dimensional data. A strong emphasis on continuous statistical testing of the relevance and confidence of the precursors will be developed to assess and continue to improve the performance of the forecasts.
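As a toy illustration of fusing multi-channel precursor inputs into a single decision output, the sketch below uses a weighted logistic combination; the channel weights and the functional form are assumptions for illustration, not the system's actual decision function:

```python
import numpy as np

def alarm_score(signals, weights):
    """Combine graded precursor indicators into a single alarm likelihood.

    A minimal logistic-combination sketch: each precursor channel contributes
    a weighted term, squashed to (0, 1) by the logistic function.
    """
    z = float(np.dot(weights, signals))
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical channels: [TIR anomaly, TEC anomaly, radon, ULF emission].
weights = np.array([1.2, 0.8, 0.5, 1.0])  # illustrative channel weights
signals = np.array([1.0, 0.0, 1.0, 1.0])  # which channels fired today

p = alarm_score(signals, weights)
raise_alarm = p > 0.5
```

A real system would fit such weights, and more expressive models, against catalogs of target events, with the continuous statistical testing described above guarding against overfitting to sparse, intermittent data.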
The Global Earthquake Forecast System is a revolutionary initiative, which is envisioned to transform the field of earthquake science by building a coherent edifice of signals for reliable earthquake forecasts. It will also be a cornerstone for the development of time-dependent preparatory measures of sensitive infrastructures and for the population, potentially saving billions of dollars and thousands of lives each year worldwide.

6. Leveraging Space Geodetic Data to enhance seismic hazard forecasting, warning and rapid response

Seismic hazards (e.g., damaging earthquakes, tsunamis) are extreme manifestations of the ongoing processes that shape and govern our dynamic Earth. Space geodetic data (GNSS: Global Navigation Satellite System; InSAR: Interferometric Synthetic Aperture Radar) can provide unique information for characterizing and monitoring inter-seismic strain accumulation, transient aseismic deformation, and evolving earthquake rupture, in turn supporting improved seismic hazard forecasting, warning, and rapid response. Risks-X has long leveraged geodetic data for a range of influential studies (e.g., Prof. Kejie Chen’s real-time GNSS-based tsunami early warning system at the Jet Propulsion Laboratory), and we continue to develop innovative observation and analysis methods that push the boundaries of geodesy as applied to seismic hazards research. Given the ongoing, rapid improvement in the availability, variety, and precision of geodetic measurements, finding ways to fully utilize this observational resource for seismic hazard reduction is timely and essential. The project will focus on, but is not limited to:
1. Earthquake Rupture Forecasting based on long-term geodetic observations
Earthquake rupture forecasts are the principal input for probabilistic seismic hazard assessment, which characterizes possible earthquake sources in terms of their magnitudes, recurrence rates, and the location of the causative fault ruptures. Traditionally, the dates and fault displacements of past earthquakes, determined from geologic data, have provided the fault slip rate information for earthquake rupture forecasts. Geodetic time-series reflect contemporary deformation rates, provide slip rate information on additional faults that lack geologic rate estimates, help quantify broadly distributed strain, and, in the future, may allow time-dependent forecasts to account for variation in slip rates throughout the earthquake cycle. In addition, geodetic data are unique in their sensitivity to creep rates and spatially variable fault interface coupling, including the down-dip limit of subduction zone locking, which are important factors in forecasting the potential size of future earthquakes and the resulting ground motions and tsunami run-ups.
2. Earthquake/Tsunami Early Warning based on real-time GNSS observations
Earthquake/tsunami early warning involves predicting shaking intensity (or tsunami inundation) at user locations, which depends upon accurate, real-time determination of earthquake source characteristics including location, magnitude, fault orientation, moment release, and slip distribution. Real-time GNSS instruments directly measure co-seismic displacement and, unlike seismic data, can provide magnitude estimates that do not saturate for very large earthquakes. For earthquakes with magnitudes exceeding ~M7.5, finite fault modeling algorithms that use GNSS data might improve upon alerts by allowing more detailed characterization of the distance between the earthquake rupture and user locations which, along with the estimated magnitude, influences the accuracy of predicted shaking intensity and tsunami inundation.
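Non-saturating GNSS magnitude estimates of the kind described above typically rest on peak-ground-displacement (PGD) scaling laws of the form log10(PGD) = A + B*M + C*M*log10(R). The sketch below inverts such a law for magnitude; the coefficients are illustrative placeholders, not fitted values:

```python
import math

def magnitude_from_pgd(pgd_cm, dist_km, A=-5.0, B=1.2, C=-0.2):
    """Invert log10(PGD) = A + B*M + C*M*log10(R) for magnitude M.

    pgd_cm: peak ground displacement (cm); dist_km: source-station distance (km).
    A, B, C are placeholder coefficients; real scaling relations use values
    fitted by regression to observed earthquakes.
    """
    denom = B + C * math.log10(dist_km)
    return (math.log10(pgd_cm) - A) / denom

# Roughly 25 cm of PGD observed 100 km from the source implies about M 8
# under these placeholder coefficients.
m_est = magnitude_from_pgd(10**1.4, 100.0)
```

Because peak displacement keeps growing with fault slip rather than saturating, such estimates remain informative even for the largest events.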
3. Aftershock Prediction Based on Continuous GNSS Aseismic Deformation Observation
Forecasting aftershocks immediately after the main shock helps decision-makers evaluate risks and take appropriate actions to mitigate the effects of possible cascading seismic activity. Traditionally, operational aftershock forecasts are based on a statistical evaluation of the seismicity rate in an ongoing aftershock sequence without considering information regarding the long-term probability of earthquakes. During aftershock sequences, geodetic data record aseismic deformation with moment release that typically exceeds the cumulative seismic moment. In addition, spontaneous and triggered transient fault slip has been inferred using geodetic data in a variety of settings. While it is not yet known whether periods of transient aseismic deformation consistently correlate with changes in earthquake likelihood, the spatially dense, broad geographic coverage of continuous GNSS networks and the high sensitivity of borehole strainmeters yield data that could be systematically monitored for anomalous behavior, which could illuminate the relation between transient deformation and seismicity and potentially improve aftershock forecasting tools.
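The statistical baseline for operational aftershock forecasts mentioned above is typically the modified Omori (Omori-Utsu) law n(t) = K/(t + c)^p. A minimal sketch, with illustrative parameter values in place of values fitted to an ongoing sequence:

```python
def omori_rate(t, K=100.0, c=0.1, p=1.1):
    """Aftershock rate (events/day) t days after the mainshock (Omori-Utsu law)."""
    return K / (t + c) ** p

def expected_count(t1, t2, K=100.0, c=0.1, p=1.1):
    """Closed-form integral of the rate over [t1, t2] days (valid for p != 1)."""
    return K / (1.0 - p) * ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p))

day1 = expected_count(0.0, 1.0)   # expected aftershocks in the first day
week1 = expected_count(0.0, 7.0)  # expected aftershocks in the first week
```

Geodetically observed transient aseismic deformation could then be tested as a covariate that modulates these purely statistical rates.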
To achieve the goal of using geodetic observations for seismic hazard forecasting, warning and response, we need to overcome substantial technological challenges in data collection and handling. Data need to be freely and rapidly available in formats accessible for users from different fields and with various levels of experience. In particular we require:
1. Low latency tools and methods for delivering data from large, spatially distributed sensor networks;
2. Low latency tools and methods for automatic identification, discrimination, and verification of critical signals that indicate either increased risk or event occurrence;
3. Sensor placement in remote but critical regions, including the seafloor and parts of the cryosphere;
4. Full integration of data streams from many different kinds of sensors, including ground-based GNSS, seismometers, and accelerometers, and space-based InSAR, gravity, and optical imagery.
 

7. Earthquake Hazard and Risk Assessment Based on Seismicity Analysis and Strong Ground Motion Prediction

Keywords: Seismicity; Active fault; Probabilistic seismic hazard analysis (PSHA); Strong ground motion prediction; Earthquake hazard map
Earthquakes can cause landslides and tsunamis, and bring about one fifth of the annual losses due to natural disasters, with an average death toll of over 25,000 people per year. To mitigate earthquake losses, it is necessary to evaluate earthquake hazards and risks, as they help decision makers develop risk reduction measures that can include emergency response plans, the enforcement of building design codes, and the development of insurance pools. The current project addresses this need by integrating the understanding of earthquake sources, active faulting, and ground shaking. This information is translated into a form that can be used to reduce the risk from earthquakes and to improve public safety. The project will focus on, but is not limited to:
(a) Seismic source characterization (Faults and Seismicity)
Source characterization describes the rate at which earthquakes of a given magnitude, and dimensions (length and width) occur at a given location. For each seismic source, the source characterization develops a suite of credible and relevant earthquake scenarios (magnitude, dimension, and location) and computes the rate at which each earthquake scenario occurs. The process involves: (1) geometrical models for active faults; (2) uncertainties in earthquake size, location and time of occurrence. The first step in the source characterization is to develop a model of the geometry of the sources, using the information from earthquake catalogues (historical and instrumental), active geological faults, geodetic estimates of crustal deformation, seismotectonic features and paleoseismicity. The second step includes models that describe the distribution of earthquake magnitudes, the distribution of rupture dimensions for each earthquake magnitude, the distribution of locations of the earthquakes for each rupture dimension, and the “Maximum Magnitude” for a given fault.
Related research fields: Paleoseismology; Active fault; Seismicity
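The magnitude-distribution step typically starts from the Gutenberg-Richter relation log10 N(>= m) = a - b*m. A minimal sketch with illustrative a and b values (in practice both are fitted per source zone):

```python
def gutenberg_richter_rate(m, a=4.0, b=1.0):
    """Annual rate of earthquakes with magnitude >= m, from log10 N = a - b*m.

    a sets the overall productivity of the source zone; b (the "b-value",
    typically near 1) sets the relative frequency of large versus small
    events. The values here are illustrative, not fitted.
    """
    return 10.0 ** (a - b * m)

n_ge5 = gutenberg_richter_rate(5.0)  # events/yr with M >= 5
n_ge7 = gutenberg_richter_rate(7.0)  # events/yr with M >= 7
```

Combined with a maximum-magnitude cutoff and the rupture-dimension distributions described above, such recurrence models feed directly into the PSHA earthquake scenarios.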
(b) Strong Ground Motion Prediction (Empirical and Theoretical Modeling)
Strong ground motion is the intense earthquake shaking that occurs close to a causative fault. Predicting strong ground motion from future earthquakes is among the most important research topics in earthquake engineering and hazard assessment. Surface strong motion can be affected by source effects related to the rupture process and the release of energy, path effects related to the propagation of energy inside the Earth, and the influence of the geotechnical characteristics of the shallow layers: the so-called site effects. Site effects are accounted for in risk mitigation through the evaluation of the seismic soil response. The process in this part involves: (1) empirical/numerical modeling of ground motion using virtual earthquakes; (2) estimating the non-linear site response.
Related research fields: Seismic rupture process; Seismic modeling; Site effect
Requirements: The whole project aims to develop new methodologies and techniques for earthquake hazard and risk assessment and to obtain high-resolution PSHA for several key urban agglomerations, such as the Guangdong-Hong Kong-Macao Greater Bay Area. Applicants should be familiar with one or more of the research topics mentioned above.
 

 
