
“Computation & Data” is an interdisciplinary seminar series at the HSU.

Seminar Series: Computation & Data

  • LOCATION

    Hybrid

  • DATE

    26 Jun, 2024

  • TIME

    16:00 CEST

  • MEMBER

    Public

About Event

The goal of the interdisciplinary seminar series Computation & Data at HSU is to bring together researchers and foster exchange on the development of algorithms, methods and software. The seminar series is typically scheduled for the last Wednesday every month, 16:00-18:00, with 1-2 presentations per hybrid session (digital and at HSU).

Feel free to subscribe to the seminar newsletter by sending an e-mail to info-hpc-bw@hsu-hh.de with the subject line "Subscription Seminar Computation & Data".


Tobias Bohne (UniBw M): Predicting and Analyzing Violent Conflict using Machine Learning Approaches

Crisis early warning systems are essential tools for conflict prevention and mitigation. Following the principle of "early warning, early action", armed conflict forecasts can assist policymakers in anticipating future conflict risks and dynamics and support them in the decision-making process. Germany's National Security Strategy explicitly emphasizes increasing the use of science-based approaches in crisis early warning, which entails enhancing existing and developing new quantitative methods rooted in peace and conflict studies. While classical statistical methods were commonly employed in the field in the past, machine learning methods ranging from tree-based approaches to deep learning now dominate the conflict forecast landscape. But how do quantitative conflict prediction models work? What kind of conflict is predicted? What data is fed into the systems, and what algorithms are used? What challenges are we facing when predicting violence, and how can we communicate and explain the models' output to policymakers?

This talk will provide an overview of the current state of conflict forecasting research, highlighting recent developments and exploring future research directions. In addition to providing a broad overview, this talk will delve into one of our subnational conflict fatalities modeling research projects, which focuses on uncertainty estimates and the spatial dimension of conflict. Instead of generating point estimates of armed conflict fatalities at the subnational level, we estimate probability distributions of the number of conflict fatalities using distributional tree-based models to explicitly model the uncertainty of the forecasts. To account for the zero-inflated nature of (subnational) conflict data, we employ a hurdle model approach by combining the outputs of a binary classifier on conflict occurrences trained on all data with the estimates of our distributional regressors fitted to non-zero observations.

As conflict dynamics are assumed to be shaped by unobserved local contexts, we also estimate hurdle models for contiguous regional clusters and combine them with our global models. Preliminary results indicate that global models consistently outperform naive benchmarks, while local models, though slightly less effective, still outperform benchmarks in most cases. This suggests the potential for enhancing local models by integrating more local-specific data.
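The two-stage hurdle combination described in the abstract can be sketched in a minimal form. This is a toy illustration only: the synthetic data, the empirical base rate standing in for the trained binary classifier, and the conditional mean standing in for the distributional tree-based regressors are all hypothetical placeholders, not the actual models from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical zero-inflated fatality counts: most regions see no
# conflict; affected regions draw positive counts.
n = 1000
occurs = rng.random(n) < 0.2
counts = np.where(occurs, rng.poisson(12, n) + 1, 0)

# Stage 1 (the "hurdle"): probability that any conflict occurs.
# Here just the empirical base rate, standing in for a classifier
# trained on all observations.
p_occurrence = (counts > 0).mean()

# Stage 2: a conditional model fitted only on non-zero observations,
# summarised here by the empirical mean of positive counts.
cond_mean = counts[counts > 0].mean()

# Combined point forecast of expected fatalities per region:
# E[Y] = P(Y > 0) * E[Y | Y > 0].
expected = p_occurrence * cond_mean
```

Fitting stage 2 only on the non-zero subset is what lets the conditional model describe conflict intensity without being swamped by the overwhelming mass of zeros.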


Nils Margenberg (HSU): Optimal Dirichlet boundary control by Fourier neural operators applied to nonlinear optics

We present an approach for solving optimal control problems in nonlinear optics using deep learning. To compute high-resolution approximations of the solution to the nonlinear wave model, we propose higher-order space-time finite element methods in combination with collocation techniques. This achieves higher regularity in time of the discrete solution. The simulation data is used to train a solution operator, which is represented by Fourier Neural Operators and can be used as the forward solver in the optimal control problem. The algorithm is tested on high-performance computing platforms to show that it is efficient and scalable. It is used to simulate generation of terahertz radiation in periodically poled lithium niobate. The neural network is used to optimize the optical input pulse and maximize the efficiency of nonlinear processes.

We use the periodic layering of the crystal to design the neural networks. The recursive application of the network onto itself yields an approximation to the full problem. Our results indicate that the proposed method can achieve a speedup by a factor of more than 360 compared to classical methods. A comparison of our results to experimental data shows the potential to revolutionize the way we approach optimization problems in nonlinear optics. Our approach still requires numerical simulations, which dominate the overall computational cost; we also show how to improve their performance.
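The recursive use of a one-period operator can be illustrated with a toy sketch. The `one_period_map` below is a hypothetical spectral placeholder, not the trained Fourier neural operator from the talk; it only shows the composition pattern of applying a learned single-period map once per poling period of the crystal.

```python
import numpy as np

def one_period_map(u):
    # Stand-in for a trained operator that advances the field across a
    # single poling period (here: a simple spectral low-pass filter).
    spec = np.fft.rfft(u)
    spec *= np.exp(-0.01 * np.arange(spec.size))  # damp high frequencies
    return np.fft.irfft(spec, n=u.size)

def propagate(u0, n_periods):
    # Recursive application: feed the operator's output back as its
    # input, once per crystal period, to approximate the full-length
    # solution from a single learned one-period map.
    u = u0
    for _ in range(n_periods):
        u = one_period_map(u)
    return u

u0 = np.sin(np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False))
u_out = propagate(u0, n_periods=10)
```

Training a network for one period and composing it recursively keeps the model small while still covering the full crystal length.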

Event Speakers


Tobias Bohne

Research Assistant

Nils Margenberg

Research Assistant
