Helmholtz AI consultants @ Karlsruhe Institute of Technology

Energy-focused AI consultants

Helmholtz energy researchers are searching for solutions to meet the energy needs of present and future generations. The Helmholtz AI consultant team @ KIT represents the research field 'Energy' and supports these groups in achieving their goals by providing knowledge of state-of-the-art artificial intelligence (AI) methods. The application areas for AI in energy research are as diverse as the field itself, ranging from energy system load forecasting through the discovery of new materials for storage technologies (e.g. batteries) to the automated control of industrial systems.

In order to develop suitable methods and systems for these applications, the consultant team, led by Markus Götz, harnesses its internal knowledge of modern AI approaches, with an emphasis on image analysis, time series, graph problems, uncertainty quantification, model search and large-scale parallel processing. With these tools, the team is able to tackle various AI challenges, including regression, classification, segmentation and interpolation. The Helmholtz AI local unit for energy @ KIT acknowledges the need for open and reproducible research and is therefore committed to open-source code development and open access to data.

Questions or ideas? consultant-helmholtz.ai@kit.edu

https://github.com/Helmholtz-AI-Energy/

 

Selected ongoing voucher projects

Societal impacts of AI-powered microgrids

  • Challenge: Identifying vulnerabilities in critical infrastructures is essential for taking steps to prevent their exploitation. With the digitization and localization of electrical grid infrastructure in the form of microgrids, and the rapidly growing use of AI in their decision logic (e.g. load scheduling), new possibilities emerge for city-scale attacks on central power supplies. This project investigates the direction microgrid AI is taking and aims to identify the likely vulnerabilities it introduces that need to be addressed.
  • Approach: We collaboratively explore the structural vulnerability of an AI-powered microgrid by regarding it as a sociotechnical system. For this purpose we combine sociological theory about reliability and safety in complex technical systems with knowledge about the technical components and design inside a microgrid, as well as expertise on AI functionality and vulnerabilities.
  • Collaborator: KIT-ITAS

Achilles - forecast-based mini-grid management system for rural areas

  • Challenge: Mini-grids are a key solution for providing energy to sites in rural areas of the world. These systems use solar power, coupled with energy storage systems, or fossil fuels to generate the required energy locally and distribute it to connected consumers. To be economically viable, these systems must have a high share of renewable energy, as refuelling and maintaining the diesel generator is costly due to the remote location of these networks and the associated transport costs. In current mini-grid systems, the decision of when to start the diesel generator is based on system parameters such as the state of charge (SOC) of the connected batteries. When a generator is running, it is normally kept at its optimum operating point, sometimes producing more energy than the grid currently consumes. This surplus energy is then used to charge the energy storage devices in the system, leading to high states of charge at the beginning of the day if the battery has run flat during the night. The energy generated by the photovoltaic system during the day can therefore no longer be stored. Although this system behaviour guarantees that there are no failures in the grid, it also significantly reduces the self-consumption of solar energy and increases maintenance and diesel costs.
  • Approach: The project aims to solve the above-mentioned problems through the transition to a forecast-based network operation management system. Within the project, artificial intelligence approaches are used to predict loads, solar power and maintenance intervals in order to optimise grid operation in a cost-effective way; a toy contrast between the two dispatch strategies is sketched after this list. A priority is placed on solutions for predictive maintenance. For this purpose, long-term forecasts of load and solar power are important, e.g. to determine the charging cycles of the energy storage device or to estimate the operating hours of the diesel generator. This allows, among other things, system failures and regular maintenance intervals to be predicted and bundled.
  • Collaborator: DLR-EST
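Purely for illustration, the following Python sketch contrasts a classical SOC-threshold dispatch rule with a forecast-based one. All names, thresholds and numbers are assumptions for the sketch, not the project's actual control logic.

    def soc_threshold_dispatch(soc, soc_min=0.3):
        """Classical rule: start the diesel generator when the battery is low."""
        return soc < soc_min

    def forecast_based_dispatch(soc, pv_forecast_kwh, load_forecast_kwh,
                                capacity_kwh, soc_min=0.1):
        """Start the generator only if the forecast energy balance would drive
        the battery below its safety margin before the next solar surplus."""
        projected_kwh = soc * capacity_kwh + pv_forecast_kwh - load_forecast_kwh
        return projected_kwh < soc_min * capacity_kwh

    # At dawn, a nearly flat battery plus a sunny forecast lets the generator
    # stay off, leaving storage headroom for the midday solar peak.
    print(soc_threshold_dispatch(soc=0.25))   # True: generator starts
    print(forecast_based_dispatch(soc=0.25, pv_forecast_kwh=40.0,
                                  load_forecast_kwh=25.0, capacity_kwh=100.0))  # False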

PerSeuS (Part II)

  • Challenge: In the field of photovoltaics, perovskites have emerged as a promising candidate for future commercial solar cells. However, the key challenge of upscaling state-of-the-art fabrication routines must be addressed to pave the way towards large-scale industrial production. Large-area devices can be fabricated by utilizing scalable printing and coating techniques such as blade coating. At the Lichttechnisches Institut (LTI) at the Karlsruhe Institute of Technology (KIT), an automatic layer deposition setup has been installed, where the coating process is recorded with a camera using four different edge-pass filters. This setup acts as a versatile characterization tool for the detection of spatial irregularities in the perovskite films and can be applied as an in-situ monitoring tool on large-area samples during the whole perovskite formation process. The aim of this project is to quantify the power conversion efficiency (PCE) of the applied layer based on the acquired temporally resolved images, in order to characterize the resulting photovoltaic cells created in a fabrication process; a toy version of this image-to-PCE regression is sketched after this list.
  • Collaborator: KIT-LTI
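The following is a minimal PyTorch sketch of the kind of model such a task suggests: a small 3D CNN mapping a temporally resolved four-channel image stack to a scalar PCE estimate. Architecture, shapes and channel counts are illustrative assumptions, not the project's actual network.

    import torch
    import torch.nn as nn

    class PCERegressor(nn.Module):
        def __init__(self, in_channels=4):    # four edge-pass filter channels
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv3d(in_channels, 16, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.AdaptiveAvgPool3d(1),      # pool over time and space
            )
            self.head = nn.Linear(16, 1)      # scalar PCE prediction

        def forward(self, x):                 # x: (batch, 4, T, H, W)
            return self.head(self.features(x).flatten(1)).squeeze(-1)

    frames = torch.randn(2, 4, 10, 64, 64)    # 2 samples, 10 time steps each
    print(PCERegressor()(frames).shape)       # torch.Size([2])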

Wahn - thermal bridge detection in district-scale roof data taken by drones

  • Challenge: Retrofitting concepts at the city district level are becoming increasingly important in research and practice. In Germany, for example, energy improvement district concepts are a common approach to structurally increase the energy quality of a whole city district. To identify effective measures for increasing the energy quality of a district, an initial analysis of the thermal quality of existing buildings is necessary. With the help of drones/UAVs it is possible to collect thermal images of buildings from many angles with relatively little effort and cost. Thermal bridges in particular can be easily identified in such images. A thermal bridge occurs in building components that conduct heat better, and thus transport heat to the outside more quickly, than adjacent components. Thermal bridges lead to high energy losses and the accumulation of moisture, which in the long term attacks the building fabric or leads to mould and decay.
  • Approach: In line with this voucher, we study how thermal images taken by drones can be used for a simple and automated analysis of the thermal quality of buildings at district scale. We are developing an algorithm to automatically detect thermal bridges on building roofs in thermal drone images using a neural network approach. For this we employ existing solutions from the field of semantic segmentation, classifying not just thermal bridges but also the roof component class they belong to (e.g. defects in roofing, skylights, dormers, etc.); a toy multi-channel model is sketched after this list. This segmentation uses information from both the RGB and thermal drone images, as well as height information derived from a 3D point cloud constructed from the entire set of 2D drone images.
  • Collaborator: KIT-IIP, University of Southern California (USC)
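A minimal PyTorch sketch of the multi-channel fusion idea: RGB, thermal and height channels are stacked into one five-channel input for per-pixel classification. The tiny encoder-decoder and the class list are assumptions for illustration only.

    import torch
    import torch.nn as nn

    NUM_CLASSES = 4  # e.g. background, thermal bridge, skylight, dormer (assumed)

    class FusionSegNet(nn.Module):
        def __init__(self, in_channels=5):    # 3 RGB + 1 thermal + 1 height
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            )
            self.decoder = nn.Sequential(
                nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
                nn.Conv2d(64, NUM_CLASSES, 1),  # per-pixel class logits
            )

        def forward(self, x):                 # x: (batch, 5, H, W)
            return self.decoder(self.encoder(x))

    x = torch.cat([torch.randn(1, 3, 128, 128),    # RGB
                   torch.randn(1, 1, 128, 128),    # thermal
                   torch.randn(1, 1, 128, 128)],   # height from the point cloud
                  dim=1)
    print(FusionSegNet()(x).shape)            # torch.Size([1, 4, 128, 128])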

Yhi - determining cleanliness of parabolic trough fields

  • Challenge: An established source of renewable energy capable of delivering dispatchable electricity is concentrated solar power (CSP). The power plants are directly exposed to harsh environmental conditions. One consequence is that the mirrors soil over time and need to be cleaned. Since cleaning is costly, consumes large amounts of water and degrades the mirrors, this maintenance work should be reduced to a minimum. Such a minimized cleaning strategy can only be implemented if the cleanliness level of each mirror is known. Currently, cleanliness measurements are taken with handheld devices. Their fractional cleanliness values are accurate, but the high time investment means they deliver only point measurements of single mirrors.
  • Approach: The aim of this voucher is to develop an advanced method that uses additional environmental and operational data, together with the ground-truth labels from the cleanliness measurements, to estimate the cleanliness level of all collectors in the field; the regression idea is sketched after this list.
  • Collaborator: DLR-SF
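A minimal scikit-learn sketch of the regression idea, assuming invented feature columns and synthetic data: a model trained on the sparse handheld measurements is used to estimate cleanliness for every collector in the field.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    # assumed columns: days since cleaning, mean wind speed, dust index, tilt
    X = rng.random((200, 4))
    y = 1.0 - 0.3 * X[:, 0] - 0.2 * X[:, 2] + 0.02 * rng.standard_normal(200)

    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    field = rng.random((1000, 4))          # one feature row per collector
    cleanliness = model.predict(field)     # field-wide cleanliness estimate
    print(cleanliness.min(), cleanliness.max())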

Selected completed voucher projects

Svarog - segmentation of ground-mounted photovoltaics systems

  • Challenge: Existing geodata on ground-mounted photovoltaic (PV) systems are insufficiently mapped. They mostly cover the operational area of the PV open-space plant instead of the pure module area. The area taken up by the panels therefore tends to be overestimated. The aim of this project is to enhance the mapping accuracy of the existing PV geodata.
  • Approach: We utilize existing AI methods for image segmentation to recognize open-space PV systems. In particular, we map all areas with a diameter of more than 25 m. Furthermore, the existing aerial dataset is enriched with further data captured by satellites. This also includes adding hyperspectral information that goes beyond the existing RGB data.
  • Collaborator: UFZ-Bioenergie

HyDE - hyperspectral denoising

  • Challenge: Remote sensing imaging systems are crucial tools for applications such as earth observation, lithological mapping, and change detection. Among them, optical imaging systems (particularly hyperspectral imaging) have received significant attention. However, imaging systems introduce different noise types and artifacts into the observed image. For instance, hyperspectral push-broom imaging adds a pattern noise called striping, which significantly affects subsequent processing steps. The aim of this voucher is the development of an image denoising toolbox enabling the recovery of the true unknown image from the degraded observation.
  • Approach: We develop an open-source, Python-based denoising toolbox for hyperspectral images. The algorithms are able to utilize single-node parallel computational resources such as multi-core CPUs and GPUs, and are mainly based on wavelet analysis approaches; a generic wavelet-denoising sketch follows this list.
  • Collaborator: HIF-Exploration
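As a generic illustration of the wavelet approach (not the actual HyDE algorithms, which are more sophisticated and GPU-accelerated), the following sketch soft-thresholds the detail coefficients of each band using the PyWavelets package; the threshold value is an illustrative assumption.

    import numpy as np
    import pywt

    def denoise_band(band, wavelet="db2", level=2, threshold=0.1):
        """Soft-threshold the detail coefficients of a single 2-D band."""
        coeffs = pywt.wavedec2(band, wavelet, level=level)
        approx, details = coeffs[0], coeffs[1:]
        details = [tuple(pywt.threshold(d, threshold, mode="soft") for d in lvl)
                   for lvl in details]
        out = pywt.waverec2([approx, *details], wavelet)
        return out[:band.shape[0], :band.shape[1]]  # crop padding artifacts

    cube = np.random.rand(64, 64, 31)  # toy hyperspectral cube: H x W x bands
    denoised = np.stack([denoise_band(cube[:, :, b])
                         for b in range(cube.shape[2])], axis=-1)
    print(denoised.shape)              # (64, 64, 31)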

BaumBauen (Part II)

  • Challenge: Particle decay reconstruction is an essential tool in high energy physics (HEP). The ability to correctly identify the decay process that took place allows researchers to make precision measurements of the physics governing particle interactions. The Belle II experiment, an electron-positron collider experiment based in Tsukuba, Japan, relies on the efficient reconstruction of a large number of different decay processes to carry out measurements. The current generic reconstruction tool, the Full Event Interpretation (FEI), was developed by the local Institut für Experimentelle Teilchenphysik (ETP) at the Karlsruhe Institute of Technology (KIT). The FEI has several design limitations, namely that it is not end-to-end trainable and that the decay processes it can reconstruct are hand-coded.
  • Approach: Work is currently underway to develop a modern deep learning approach that overcomes both of these limitations by developing and implementing a method of learnable tree reconstruction. The project is a collaboration between several institutes, namely: the Institut für Experimentelle Teilchenphysik (Karlsruhe Institute of Technology, DE), the High Energy and Detector Physics Group (University of Bonn, DE), and the Institut Pluridisciplinaire Hubert Curien (University of Strasbourg / Centre national de la recherche scientifique, FR).
  • Collaborator: KIT-ETP, the Belle II collaboration, University of Strasbourg, University of Bonn

BaumBauen (Part I)

  • Challenge: Particle decay reconstruction is an essential tool in high energy physics (HEP). The ability to correctly identify the decay process that took place allows researchers to make precision measurements of the physics governing particle interactions. The Belle II experiment, an electron-positron collider experiment based in Tsukuba, Japan, relies on the efficient reconstruction of a large number of different decay processes to carry out measurements. The current generic reconstruction tool, the Full Event Interpretation (FEI), was developed by the local Institut für Experimentelle Teilchenphysik (ETP) at the Karlsruhe Institute of Technology (KIT). The FEI has several design limitations, namely that it is not end-to-end trainable and that the decay processes it can reconstruct are hand-coded.
  • Approach: Work is currently underway to develop a modern deep learning approach that overcomes both of these limitations by developing and implementing a method of learnable tree reconstruction. The project is a collaboration between several institutes, namely: the Institut für Experimentelle Teilchenphysik (Karlsruhe Institute of Technology, DE), the High Energy and Detector Physics Group (University of Bonn, DE), and the Institut Pluridisciplinaire Hubert Curien (University of Strasbourg / Centre national de la recherche scientifique, FR).
  • Collaborator: KIT-ETP, the Belle II collaboration, University of Strasbourg, University of Bonn

Evolutionary optimization of neural architectures in remote sensing

  • Challenge: The aim of this project is to perform a neural architecture search (NAS) and hyperparameter optimization for a remote sensing classification problem on the BigEarthNet dataset. For this, we investigate the utilized optimizer, the number of training epochs, activation functions, filter sizes and counts, the network architecture (especially pre- and post-activation and batch normalization), different loss functions, as well as learning rates and their scheduling.
  • Approach: For the neural architecture search (NAS) we make use of high-performance computing implementations of asynchronous genetic optimization in the MPI-parallelized propulate package; the basic genetic loop is sketched after this list. In particular, we restrict ourselves to an internal search of the residual network space in order to achieve comparability with prior manual studies.
  • Collaborator: FZJ-JSC
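A bare-bones sketch of the genetic optimization idea; the propulate package runs this kind of loop asynchronously across MPI ranks. The search space and fitness function below are placeholders, not the actual BigEarthNet setup.

    import random

    SPACE = {
        "lr": [1e-4, 3e-4, 1e-3, 3e-3],
        "filters": [16, 32, 64],
        "activation": ["relu", "gelu", "elu"],
    }

    def fitness(ind):
        # Placeholder: the real study trains a residual network on BigEarthNet
        # and returns its validation loss here.
        return 100 * ind["lr"] + 64 / ind["filters"]

    def mutate(ind):
        key = random.choice(list(SPACE))
        return {**ind, key: random.choice(SPACE[key])}

    population = [{k: random.choice(v) for k, v in SPACE.items()} for _ in range(8)]
    for _ in range(20):                      # generations
        population.sort(key=fitness)         # lower loss = fitter
        parent = random.choice(population[:4])
        population[-1] = mutate(parent)      # replace the worst individual
    population.sort(key=fitness)
    print(population[0])                     # best hyperparameter set found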

KryoSense

  • Challenge: We investigate a thermal flow meter, targeted at applications in low-energy environments and cryogenics, that is capable of self-calibration. This means that the physically measured flow value is corrected by an estimate of the errors of the measurement devices.
  • Approach: In particular, the sensor's characteristic map is reconstructed on the basis of Gaussian process regression (GPR). This Bayesian approach allows us not only to obtain a good estimate for the correction but also a means of quantifying the uncertainty of that estimate; a minimal GPR sketch follows this list. We conduct this voucher in conjunction with our colleagues from the Institute of Technical Physics (ITEP) at KIT.
  • Collaborator: KIT-TTK
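A minimal scikit-learn sketch of the GPR idea on synthetic data: the characteristic map is reconstructed from sparse calibration points, and the posterior standard deviation quantifies the uncertainty of the correction. Kernel choice and data are illustrative assumptions only.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    X = np.linspace(0, 10, 15).reshape(-1, 1)               # sparse calibration points
    y = np.sin(X).ravel() + 0.05 * rng.standard_normal(15)  # noisy sensor response

    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
    gpr = GaussianProcessRegressor(kernel=kernel).fit(X, y)

    grid = np.linspace(0, 10, 200).reshape(-1, 1)
    mean, std = gpr.predict(grid, return_std=True)          # estimate + uncertainty
    print(mean.shape, std.max())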

PerSeuS (Part I)

  • Challenge: In the field of photovoltaics, perovskites have emerged as a promising candidate for future commercial solar cells. However, the key challenge of upscaling state-of-the-art fabrication routines must be addressed to pave the way towards large-scale industrial production. Large-area devices can be fabricated by utilizing scalable printing and coating techniques such as blade coating. At the Lichttechnisches Institut (LTI) at the Karlsruhe Institute of Technology (KIT), an automatic layer deposition setup has been installed, where the coating process is recorded with a camera using four different edge-pass filters. This setup acts as a versatile characterization tool for the detection of spatial irregularities in the perovskite films and can be applied as an in-situ monitoring tool on large-area samples during the whole perovskite formation process. The aim of this project is to quantify the power conversion efficiency (PCE) of the applied layer based on the acquired temporally resolved images, in order to characterize the resulting photovoltaic cells created in a fabrication process.
  • Collaborator: KIT-LTI

P-/S-wave Tagging

  • Challenge: Accurate detection of earthquake signals generated within the Earth is a fundamental and challenging task in seismology. With respect to geothermal power plants, this also means detecting time spans for safe operation and ensured electricity supply. Traditionally, the optimal method of identifying seismic phases involves a trained analyst manually inspecting seismograms and determining individual phase (P-/S-wave) arrival times. For modern large-scale datasets, traditional manual picking methods are rendered unfeasible because of the required investment of time and resources.
  • Approach: An automated sequence tagging process based on convolutional neural networks, developed in prior works, can overcome this limitation; the basic tagging setup is sketched after this list. Its prediction accuracy rivals that of humans, but would benefit from further improvement. As part of this project, we perform macro and micro neural architecture search (NAS) to identify even better-suited neural networks for the P-/S-wave tagging task. Our colleagues from the Geophysical Institute at KIT and the University of Liverpool started this voucher and we are working closely with them on this research project.
  • Collaborator: KIT-GPI, University of Liverpool
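A hedged PyTorch sketch of CNN-based sequence tagging on three-component seismograms: per-time-step classification into noise, P-wave and S-wave. The architecture is purely illustrative, not one of the networks found by the NAS.

    import torch
    import torch.nn as nn

    class PhaseTagger(nn.Module):
        def __init__(self, in_channels=3, num_classes=3):  # 3-component seismogram
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv1d(in_channels, 32, kernel_size=7, padding=3), nn.ReLU(),
                nn.Conv1d(32, 32, kernel_size=7, padding=3), nn.ReLU(),
                nn.Conv1d(32, num_classes, kernel_size=1),  # per-step class logits
            )

        def forward(self, x):                 # x: (batch, channels, time)
            return self.net(x)

    waveform = torch.randn(4, 3, 6000)        # 4 traces, 60 s at 100 Hz
    print(PhaseTagger()(waveform).shape)      # torch.Size([4, 3, 6000])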

Ongoing projects

ASSAS: Artificial intelligence for the Simulation of Severe Accidents

After the Fukushima-Daiichi nuclear accident, great international effort has been put into better managing severe accidents (SA) and mitigating their consequences. In line with this, the ASSAS project aims at developing a proof-of-concept SA simulator for nuclear power plants, with a focus on simplified generic Western-type pressurized light water reactors (PWR). The most significant scientific challenge of the project concerns the necessary improvement of severe accident codes' robustness and speed. The solution approach is twofold: on the one hand, the benefits of precise solutions are studied by utilizing efficient programming techniques, such as parallelization. On the other hand, the simulator will be augmented with AI-based surrogate models that are approximative but deliver rapid predictions; the ensemble idea behind such uncertainty-aware surrogates is sketched below. In both cases, the models' uncertainties are closely studied.
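A minimal sketch of the uncertainty-aware surrogate idea, assuming synthetic data and a small ensemble of scikit-learn regressors: the ensemble mean provides the rapid prediction, the ensemble spread an uncertainty estimate. This illustrates the general technique, not the actual ASSAS models.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.random((500, 6))                 # toy reactor state features
    y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2   # toy target quantity

    # differently seeded ensemble members as a crude epistemic-uncertainty proxy
    ensemble = [MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                             random_state=seed).fit(X, y) for seed in range(5)]
    preds = np.stack([m.predict(X[:10]) for m in ensemble])
    print(preds.mean(axis=0))                # rapid surrogate prediction
    print(preds.std(axis=0))                 # ensemble spread = uncertainty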

ProFiLe: Better prediction of protein structure and function with AI (DLR + KIT)

Life is orchestrated via an interplay of many biomolecules. Any understanding of biomolecular function relies on detailed knowledge of their three-dimensional structure, whose determination is experimentally often very challenging. An orthogonal theoretical approach is the set of structure prediction techniques. ProFiLe aims to predict structures by using untapped information in the exponentially growing genomic databases via deep learning methods on high-performance computers. With this data-driven approach we want to (i) accurately infer pairs of residues in spatial contact within biomolecules to (ii) guide structure prediction by (iii) novel tensor algebraic methods in (iv) tailored open-source software.

ProFiLe will employ transformer networks, whose attention mechanism may learn the complex processes leading to protein folding; the connection between attention and residue pairs is sketched below. Yet, the multitude of interactions and the quantity of amino acid sequences require neural networks of considerable size, rendering the training computationally intensive. The software HeAT (Helmholtz Analytics Toolkit) provides a high-performance framework for distributed machine learning. With HeAT, the desired network architectures can be implemented memory-efficiently and parallelized. Handling the attention matrices in the protein structure prediction network requires new tensor algebraic techniques; within ProFiLe, these will be designed and developed for HeAT, enabling algorithmically efficient processing. In contrast to the recently presented ground-breaking AlphaFold program by Google DeepMind, our results will be openly available to the scientific community. The proposed approach promises even better computational properties on current high-performance computing systems.
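An illustrative PyTorch sketch of the core mechanism: a single self-attention head over residue embeddings yields an L x L matrix of pairwise scores, a natural carrier for residue-residue contact information. All dimensions are placeholders; the project's distributed implementation builds on HeAT.

    import math
    import torch
    import torch.nn as nn

    d_model, L = 64, 128                 # embedding size, sequence length
    embed = torch.randn(L, d_model)      # one embedding per residue

    wq = nn.Linear(d_model, d_model)     # query projection
    wk = nn.Linear(d_model, d_model)     # key projection
    q, k = wq(embed), wk(embed)
    attention = torch.softmax(q @ k.T / math.sqrt(d_model), dim=-1)
    print(attention.shape)               # torch.Size([128, 128]): pairwise scores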

AI-powered microgrids: vulnerabilities to terrorism (KIT)

The identification of vulnerabilities in critical infrastructures is essential in order to prevent their exploitation by terrorist actors. Digitization and localization of electrical grids give rise to new technical applications, promising a more efficient and resilient energy management. One such technology is microgrids, and the use of artificial intelligence (AI) in their decision logic is rapidly growing. New technological innovations and assemblages tend to bring about unintended consequences and add to the uncertainty in managing complex technical systems. Increased complexity may imply new vulnerabilities, especially in view of the interconnected nature of critical infrastructures and their latent potential for cascade effects. The technical design and socially embedded operation of AI-powered microgrids may render them vulnerable to malicious attacks.
Consequently, the focus of this core-funded project, which aims to foster networking across different units of KIT, is a three-step vulnerability assessment. It first evaluates what advantages are expected from microgrids with regard to energy supply and its resilience; to get a grasp on the technical design and the necessary social organization around their operation, the project will develop a description of microgrids as sociotechnical entities. It then reviews the state of AI in microgrids, the developmental trajectories that can be identified, and the potential vulnerabilities that AI introduces into the system. Finally, the project aims to answer how social actors maintain a resilient state of operation for AI-powered microgrids and which actors would likely be affected by feasible terrorist attacks on them. In the end, the project shall propose a technically and sociologically sound model for assessing the vulnerability of AI-powered microgrids, intended to be applicable to other sociotechnical systems in critical infrastructures as well.

Software

HyDE

HyDE (Hyperspectral Denoising) is a Python toolbox that provides denoising algorithms for hyperspectral image data. In particular, we provide:

  • A wide variety of hyperspectral denoising algorithms
  • GPU acceleration for all algorithms
  • An intuitive pythonic API design
  • PyTorch compatibility

https://github.com/Helmholtz-AI-Energy/HyDe

HeAT

HeAT is a flexible and seamless open-source software for high-performance data analytics and machine learning. It provides highly optimized algorithms and data structures for tensor computations using CPUs, GPUs and distributed cluster systems on top of MPI. The goal of HeAT is to fill the gap between data analytics and machine learning libraries with a strong focus on single-node performance on the one hand, and traditional high-performance computing (HPC) on the other. HeAT's generic Python-first programming interface integrates seamlessly with the existing data science ecosystem and makes writing scalable scientific and data science applications as effortless as using numpy; a brief usage sketch follows below.

https://github.com/helmholtz-analytics/heat/
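A brief usage sketch: the same script runs unchanged on a laptop or, launched with mpirun across several nodes, with the array transparently distributed over the ranks.

    import heat as ht

    x = ht.arange(1_000_000, split=0, dtype=ht.float32)  # split across MPI ranks
    y = ht.sin(x) * 2                                    # element-wise, node-local
    print(y.sum())                                       # global reduction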

DASO

Distributed Asynchronous and Selective Optimization (DASO) enables you to leverage multi-GPU compute node architectures, e.g. high-performance cluster systems, to accelerate neural network training while maintaining accuracy. For this, DASO uses a hierarchical and asynchronous communication scheme comprised of node-local and global networks, while adjusting the global synchronization rate during the learning process. The result: a reduction in training time of up to 34% on classical and state-of-the-art networks compared to current optimized data-parallel training methods. The hierarchical idea is sketched below.

https://arxiv.org/abs/2104.05588
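A loose, hypothetical sketch of the hierarchical scheme using raw torch.distributed primitives (assuming an already initialized process group and a prepared node-local group); the actual DASO implementation ships with HeAT, is asynchronous, and adapts the global synchronization rate during training rather than fixing it.

    import torch.distributed as dist

    def hierarchical_average(grads, node_group, step, global_sync_every=4):
        """Average gradients within the node every step; synchronize globally
        only every few steps (fixed here for simplicity; DASO adapts this rate)."""
        for g in grads:
            # cheap node-local all-reduce over NVLink/PCIe
            dist.all_reduce(g, op=dist.ReduceOp.SUM, group=node_group)
            g /= dist.get_world_size(node_group)
        if step % global_sync_every == 0:
            for g in grads:
                # infrequent global all-reduce over the cluster interconnect
                dist.all_reduce(g, op=dist.ReduceOp.SUM)
                g /= dist.get_world_size()
        return grads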

Featured publications

  • Funk, Y., Götz, M. and Anzt, H., 2022. Prediction of Optimal Solvers for Sparse Linear Systems Using Deep Learning. In Proceedings of the 2022 SIAM Conference on Parallel Processing for Scientific Computing (pp. 14-24). Society for Industrial and Applied Mathematics. DOI: https://doi.org/10.1137/1.9781611977141.2
  • Coquelin, D., Debus, C., Götz, M., von der Lehr, F., Kahn, J., Siggel, M. and Streit, A., 2022. Accelerating neural network training with distributed asynchronous and selective optimization (DASO). Journal of Big Data, 9(1), pp.1-18. DOI: https://doi.org/10.1186/s40537-021-00556-1
  • Weiel, M., Götz, M., Klein, A., Coquelin, D., Floca, R. and Schug, A., 2021. Dynamic particle swarm optimization of biomolecular simulation parameters with flexible objective functions. Nature Machine Intelligence, 3(8), pp.727-734. DOI: https://doi.org/10.1038/s42256-021-00366-3
  • Götz, M., Debus, C., Coquelin, D., Krajsek, K., Comito, C., Knechtges, P., Hagemeier, B., Tarnawa, M., Hanselmann, S., Siggel, M., Basermann, A. and Streit, A., 2020. HeAT - a Distributed and GPU-accelerated Tensor Framework for Data Analytics. In Proceedings of the 2020 IEEE International Conference on Big Data (Big Data). IEEE. DOI: https://doi.org/10.1109/BigData50022.2020.9378050