
Projects

Adaptive Models & Big Data

Complex Systems & Engineering

Advanced Instrumentation & Big Data

Abbreviations:

  • Bernoulli Institute for Mathematics, Computer Science and Artificial Intelligence (BI)
  • Engineering and Technology Institute Groningen (ENTEG)
  • Kapteyn Astronomical Institute (Kapteyn)
  • Zernike Institute for Advanced Materials (ZIAM)
  • Netherlands Institute for Space Research (SRON)
  • Netherlands Institute for Radio Astronomy (ASTRON).


Adaptive Models & Big Data



Project: Automatic identification of structures in biomedical mega-images

Nanotomy is an innovation in electron microscopy (EM) at the Giepmans Lab, UMCG, that allows the study of tissues, cells, organelles and macromolecules in a Google Earth-like fashion. Each dataset consists of a large gigapixel high-resolution image (more than 15 GB in size) that provides information at the nanometer scale. A new microscope that addresses the current rate-limiting data-acquisition step is to be installed at the UMCG in Q3 2020. Since the new microscope will acquire data 100x faster, the next main challenge is the interpretation of the data. This can be done partially by labeling or staining, but automatic tools for the detection of organelles and subcellular structures are essential when dealing with the big data at hand.

We aim to apply pattern recognition and computer vision techniques for the automatic identification of structures in biomedical mega-images. The focus will be on developing adaptive algorithms that achieve high accuracy and run efficiently on large and noisy data.
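As a simple illustration of how such gigapixel images can be processed without loading them into memory at once, the sketch below tiles a memory-mapped image and applies a placeholder detector per tile; this is our own minimal example, not the project's pipeline, and the file name, image shape, tile size and detector are hypothetical.

    # Minimal sketch: tile-wise processing of a gigapixel EM image.
    # Assumptions: the image is stored as a raw uint8 array of known shape
    # ("nanotomy.dat", HEIGHT and WIDTH are hypothetical); detect_structures()
    # stands in for whatever organelle detector the project develops.
    import numpy as np

    HEIGHT, WIDTH = 200_000, 150_000   # ~30 gigapixels, hypothetical
    TILE = 4096

    def detect_structures(tile: np.ndarray) -> int:
        """Placeholder detector: counts pixels above a fixed intensity."""
        return int((tile > 200).sum())

    def process_mega_image(path: str) -> int:
        image = np.memmap(path, dtype=np.uint8, mode="r", shape=(HEIGHT, WIDTH))
        total = 0
        for r in range(0, HEIGHT, TILE):
            for c in range(0, WIDTH, TILE):
                total += detect_structures(np.asarray(image[r:r + TILE, c:c + TILE]))
        return total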

Keywords: Nanotomy, High-resolution, Automatic tools, Pattern Recognition, Computer Vision.

Fields of expertise: Pattern Recognition, Computer Vision, Mathematics, Artificial Intelligence.

Supervisory team:


PhD student: Anusha Aswad


Project: A Visual Analytics Approach to Big Data

This project focuses on a Visual Analytics approach to big data, that is, combining automated data analysis techniques with interactive visual interfaces to pose, refine, and confirm hypotheses about complex phenomena represented by such data.

We will focus on high-dimensional time-dependent data, that is, large sets of observations, each with many measurement values, that represent the evolution of a phenomenon over time. Challenges here are the large numbers of observations (millions or more), dimensions (hundreds or more), and time steps (thousands or more). Finding efficient and effective ways to discover and display complex patterns in this very high-dimensional data space is the key challenge in modern (visual) data exploration.

To solve this, we will develop methods for data-size and data-dimensionality reduction; automatic and user-assisted discovery of meaningful patterns hidden in the data; and intuitive visual depiction of such patterns, their evolution, and their inter-relationships. For this, we will develop new techniques for pattern mining, dimensionality reduction, scalable information visualization, uncertainty visualization, relational visualization, and interactive data querying. We will achieve scalability for big data by using CPU and GPU parallelization, and multiscale data-representation and visualization techniques.
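As a minimal illustration of data-dimensionality reduction for visual exploration, the sketch below projects synthetic high-dimensional, time-dependent observations to two dimensions with PCA (via an SVD); PCA here merely stands in for the projection techniques the project will develop, and the data sizes are hypothetical.

    # Minimal sketch of data-dimensionality reduction for visual exploration:
    # project high-dimensional observations to 2D with PCA so they can be
    # drawn on screen. PCA is a stand-in for whatever projection the project
    # develops; the synthetic data sizes are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n_obs, n_dims, n_steps = 10_000, 300, 5           # hypothetical sizes
    data = rng.normal(size=(n_steps, n_obs, n_dims))   # time x observations x dimensions

    def project_2d(X: np.ndarray) -> np.ndarray:
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:2].T                           # 2D coordinates for plotting

    frames = [project_2d(data[t]) for t in range(n_steps)]  # one scatter plot per time step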

We will apply our interactive visual analytics pipeline to two use cases: 1) large simulations or observation catalogs in an astronomical pilot project to address questions on galaxy evolution; 2) prediction of neurodegenerative diseases from multi-centre clinical brain data.

This PhD project follows up on a recently completed project on e-Visualisation of Big Data funded by the Dutch eScience Center (NLeSC).

Keywords: Visual analytics, high-dimensional data visualization, interactive pattern discovery, scalable information visualization, astronomical data, medical data.

Fields of expertise involved: Multidimensional data analysis, Data and pattern mining, Interactive visual analytics and information visualization, Disease prediction from medical data, Galaxy evolution.

Supervisory team:

PhD student: Youngjoo Kim (Korea)

Institutes involved:
Bernoulli Institute
Kapteyn

Potential partners: Philips

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 754315.


Project: Clinical Big Data for multifactorial diseases: from molecular profiles to precision medicine


Multifactorial diseases with complex traits, such as various types of cancer and chronic obstructive pulmonary disease (COPD), are among the leading causes of death in Western society and pose primary challenges for current health systems. The complex molecular mechanisms of such diseases are poorly understood, and currently available diagnostic options are often inadequate for finding effective treatment for a large proportion of patients. Personalized treatment of patients, or the identification of patient subgroups in which treatment is effective, using precision medicine approaches is pivotal to improving health and patient care.

The goal of the project is to develop a machine learning method for large multi-omics and clinical data sets to address different clinical questions in projects studying complex diseases such as cancer and COPD.

Researchers: Prof. Dr. Peter Horvatovich (Groningen Research Institute of Pharmacy - GRIP); Dr. Marco Grzegorczyk (JBI), Victor Guryev (UMCG, ERIBA), Dr. Corry-Anke Brandsma (UMCG), Prof. dr. Kathrin Thedieck (UMCG, Medical Faculty of Oldenburg University), Prof. dr. Rainer Bischoff (GRIP), Dr. Bart Verheij (AI), Dr. György B. Halmos (UMCG), Prof. dr. Wim Timens (UMCG), Prof. dr. Dirkje Postma (UMCG), Maarten van de Berge (UMCG), Prof. dr. Gerald Koppelman (UMCG), Prof. dr. Eelko Hak (GRIP).

PhD student:
Victor Arturo Bernal


Project: Computational design of soft robots

Miniaturized untethered soft robots find extensive use in biomedical and microfluidic applications due to their inherent compliance and bio-compatibility. Soft robotic micro-swimmers are fabricated using state-of-the-art additive manufacturing techniques and exhibit precise position control and untethered manoeuvring. Although experiments have been quite successful, there has been very little progress in the computational modeling of these soft robots and their intricate swimming dynamics, due to the coupled non-linear interactions between the external magnetic field, the soft elastica and the surrounding viscous incompressible fluid medium. Standard Arbitrary Lagrangian-Eulerian (ALE) methods fail when there are large structural displacements involving mesh distortion and degeneration. Extended ALE methods, in which variational mesh optimization and data-driven algorithms are used, are being developed to address this issue. The primary objective is to develop a robust computational framework using new ALE methods with non-degenerative meshes and efficient algorithms to overcome extensive mesh distortion, resulting in higher accuracy and predictability for modeling the coupled non-linear dynamics of magnetically actuated soft robots swimming in a viscous incompressible fluid.
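As a toy illustration of one basic ingredient of mesh optimization, the sketch below applies Laplacian smoothing to a distorted interior node of a tiny mesh, pulling it toward the average of its neighbours; this is a simplified stand-in, not the extended ALE method of the project, and the mesh is hypothetical.

    # Toy illustration of Laplacian smoothing: move each free node toward the
    # average of its neighbours to reduce element distortion. Not the project's
    # extended ALE method; the small fan mesh below is hypothetical.
    import numpy as np

    # nodes: four fixed boundary corners and one distorted interior node
    nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.9, 0.9]])
    fixed = np.array([True, True, True, True, False])
    edges = [(4, 0), (4, 1), (4, 2), (4, 3), (0, 1), (1, 2), (2, 3), (3, 0)]

    for _ in range(20):                      # a few smoothing sweeps
        for i in range(len(nodes)):
            if fixed[i]:
                continue
            neigh = [b for a, b in edges if a == i] + [a for a, b in edges if b == i]
            nodes[i] = nodes[neigh].mean(axis=0)

    print(nodes[4])   # the interior node relaxes toward the centroid (0.5, 0.5)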

Keywords: Soft robots, Swimming dynamics, ALE methods, Variational mesh optimization, Data-driven algorithms

Fields of Expertise: Non-linear dynamics, Finite element modeling, Fluid-solid interaction, Numerical techniques, ALE methods

Supervisory team:


PhD student: Ratnadeep Pramanik


Project: Low-complexity, parallel and distributed algorithms to detect and classify objects in large infrared, hyper-spectral and 3D sensor images

This project (JBI, Kapteyn) will develop low-complexity, parallel and distributed algorithms that robustly detect and classify objects in large infrared, hyper-spectral and 3D sensor images. The methodology will be based on computer vision and machine learning techniques, such as trainable morphological image processing filters, which can be implemented efficiently using sequential and parallel implementations based on the max-tree or related alpha-tree data structure. We intend to explore two ways of combining these morphological scale-space data structures with machine learning: 1) using machine learning to classify nodes in the trees based on so-called vector-attribute filtering, and 2) feeding key-point features detected by analysis of these scale spaces to deep convolutional networks, after automatic rescaling and rotation to a standard scale and orientation. This would allow scale-invariant analysis of huge images using deep learning, without excessive compute power requirements. The methods will be applied to several use cases: 1) the detection of buildings at many different resolutions from remote-sensing data, as needed in the Global Human Settlement Layer project, which aims to improve understanding of urbanisation and to assist urban planning; 2) support for disaster relief; and 3) the detection and analysis of objects in large astronomical surveys. This subproject will profit from existing collaborations of JBI with the Joint Research Centre (JRC) in Ispra, Italy. Potentially, a fourth use case on large electron micrographs obtained from the UMCG could be included.
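As a simplified stand-in for the morphological attribute filtering mentioned above, the sketch below labels connected components of a thresholded image and keeps only those whose area exceeds a minimum; a real max-tree or alpha-tree operates over all grey levels at once, and the threshold and minimum area used here are hypothetical.

    # Simplified stand-in for attribute filtering: label connected components
    # of a thresholded image and keep only those whose area attribute exceeds
    # a minimum. A real max-tree/alpha-tree works over all grey levels at once;
    # the threshold and min_area values are hypothetical.
    import numpy as np
    from scipy import ndimage

    def area_filter(img: np.ndarray, threshold: float, min_area: int) -> np.ndarray:
        mask = img > threshold
        labels, n = ndimage.label(mask)
        areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
        keep = np.zeros(n + 1, dtype=bool)
        keep[1:] = areas >= min_area
        return np.where(keep[labels], img, 0)

    # tiny usage example: the small blob is removed, the large one is kept
    img = np.zeros((64, 64)); img[10:40, 10:40] = 255; img[50:52, 50:52] = 255
    print(np.count_nonzero(area_filter(img, threshold=128, min_area=50)))  # 900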

Keywords: Scale spaces, connected filters, machine learning, remote sensing, astronomical surveys

Supervisory team:


PhD student: Jiwoo You (Korea)

Institutes involved:
Bernoulli Institute
Kapteyn

Potential partners:
  • ASTRON
  • SRON

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 754315.


Project: Machine Learning for the Detection of "Filament-like" Astronomical Structures

We propose studying filamentary stellar streams present in the Milky Way, with the aim of uncovering evolutionary properties of our galaxy through its past interactions with satellite galaxies. Such interactions result in the formation of the streams we wish to study. The outcomes of this project can also serve as a tool for galaxy mass measurements, dark matter studies, and exploration of the cosmic web. We plan to extend and enhance the tools developed by the SUNDIAL network, which have shown promising results for similar applications. These tools include strategies from Evolutionary Computation, particularly a biologically inspired ant colony algorithm for the detection of dense structures within point clouds; Structure-Aware Filtering for reduction and pre-processing of the data while preserving topological structure; and N-body simulations for studying the temporal evolution of the structures at hand. We will demonstrate the strategies’ performance by comparing computational predictions to synthetic and observational datasets provided by the newest satellite data sources. Our contribution will improve on the developed tools by taking advantage of newly released datasets, and will provide quantitative measures of topological features of stellar streams extracted from simulations and observational data.
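As a toy illustration of detecting dense structures in a point cloud, the sketch below uses DBSCAN as a simple stand-in for the ant colony algorithm mentioned above; the synthetic "stream" and the clustering parameters are hypothetical.

    # Toy illustration of dense-structure detection in a point cloud, using
    # DBSCAN as a simple stand-in for the ant colony algorithm described above.
    # The synthetic ring-like "stream" and the eps/min_samples values are hypothetical.
    import numpy as np
    from sklearn.cluster import DBSCAN

    rng = np.random.default_rng(1)
    t = rng.uniform(0, 2 * np.pi, 500)
    stream = np.column_stack([np.cos(t), np.sin(t)]) + rng.normal(0, 0.02, (500, 2))
    background = rng.uniform(-2, 2, (2000, 2))
    points = np.vstack([stream, background])

    labels = DBSCAN(eps=0.08, min_samples=10).fit_predict(points)
    print("dense structures found:", len(set(labels)) - (1 if -1 in labels else 0))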

Keywords: Machine Learning, Structure Detection, Stellar Streams, Milky Way.

Fields of Expertise: Data Science, Machine Learning, Astronomy.

Supervisory team:


PhD student: Petra Awad


Project: SMART-AGENTS (Swarm collaborative Multi-Agent cybeR physical sysTems with shAred sensinG modalitiEs, 5G commuNication and micro-elecTromechanical Sensor arrays)

The demand for mobile agents that perform various tasks in industrial environments has grown tremendously in recent years. However, changing environments, security considerations and robustness against failure remain major challenges that autonomous agents have to face when operating alongside other mobile agents. Currently, such problems remain largely unsolved. Collaborative multi-platform Cyber-Physical Systems (CPSs), in which different agents flexibly contribute their respective equipment and capabilities to form a symbiotic network that solves multiple objectives simultaneously, are therefore highly desirable. Our proposed SMART-AGENTS platform will enable flexibility and modularity, providing multi-objective solutions demonstrated in two industrial domains: logistics (cycle-counting in warehouses) and agriculture (pest and disease identification in greenhouses).

Keywords: Machine Learning, Artificial Intelligence, Cyberphysical Systems, Control Systems, Cooperative Control, MEMS Sensors, biomimetic sensors, Smart Industry.

Fields of expertise: Computer Science, Machine Learning, Artificial Intelligence, Control Theory, Dynamical Systems, micro-electromechanical systems (MEMS) sensors.

Supervisory team:


PhD student: Matteo Marcantoni


Project: The Value of Data

Data is a driver of value creation. For instance, the value of many tech companies is based on data-guided marketing algorithms. Businesses optimize the gains of their processes using data, e.g. in fraud detection. For a good understanding of the value of data, three theoretical domains are relevant: probability theory, expected utility theory and logic. Probability theory provides the foundations for descriptive statistics, expected utility theory models subjective values, and logic represents complex qualitative relations. In this project, argumentation-based formal methods and algorithms will be developed that connect probability theory, expected utility theory and logic. The project will extend techniques developed for evidential reasoning about facts to practical reasoning about actions.
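As a tiny worked example of the link between probability and expected utility, the sketch below selects the action with the highest expected utility under a probability distribution over outcomes; the actions, probabilities and utilities are hypothetical.

    # Tiny worked example of the probability/expected-utility connection:
    # pick the action whose expected utility is highest. The actions,
    # probabilities and utilities below are hypothetical.
    def expected_utility(probs: dict, utils: dict) -> float:
        return sum(probs[o] * utils[o] for o in probs)

    actions = {
        "flag_transaction":   ({"fraud": 0.1, "legit": 0.9}, {"fraud": 100, "legit": -5}),
        "ignore_transaction": ({"fraud": 0.1, "legit": 0.9}, {"fraud": -200, "legit": 0}),
    }
    best = max(actions, key=lambda a: expected_utility(*actions[a]))
    print(best)  # flag_transaction: 0.1*100 + 0.9*(-5) = 5.5, versus -20 for ignoring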

Researchers: Dr. Bart Verheij (BI), Prof. dr. Rineke Verbrugge (BI)


Project: Topological Analysis Methods for Big Data

This project is concerned with topological analysis methods for big data, and the insights these provide into the emergence of complexity in dynamical systems. The PhD student who selects this project will develop and use topological data analysis methods for the analysis of big data originating in cosmological and astronomical studies. The work builds on recent developments showing that state-of-the-art topological data analysis (TDA) methods, such as persistent homology, uncover new understanding and insights into the multiscale topological description of the Megaparsec weblike cosmic matter distribution. Betti numbers and topological persistence turn out to be powerful means of describing the rich connectivity structure of the cosmic web and its multiscale arrangement of matter and galaxies. This has shown that topological data analysis methods provide new means of understanding the shape of data and uncovering hidden patterns and relations.

The PhD student will focus on using TDA to detect structure in data coming from large-scale cosmological simulations and large cosmological redshift surveys, and will use these to look into the homological properties of the observed cosmic web and study how these properties change in time. Since these properties are intimately related to the singularity structure of the mass distribution represented in the data, the project should also enable the study of the connection with the dynamical evolution of the probed system. By means of sophisticated new reconstruction methods producing maps of the evolving mass distribution in the local Universe, the statistical and topological data analysis of real observational data may thus lead to a unique means of studying the phase-space dynamics and singularity structure of the cosmic web. This will yield unique insights into the dynamics of the cosmic web and culminate in a dynamical characterization of the observed distribution of galaxies in upcoming large galaxy surveys such as Euclid and SKA.

The potential for additional applications is large, for example astronomical object classification in data coming from astronomical instruments. In this context, the PhD student will interact with other COFUND PhD students who need advanced methods for understanding the shape of their data.
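As a minimal illustration of persistent homology on a point cloud, the sketch below computes persistence diagrams for points sampled from a noisy circle, which should show one long-lived 1-dimensional feature (a loop); it assumes the ripser Python package, which is an assumption of this example rather than the project's chosen software.

    # Minimal persistent-homology sketch for a point cloud, assuming the
    # "ripser" Python package is available (an assumption; the project may
    # use other TDA software). Points sampled from a noisy circle should
    # produce one long-lived 1-dimensional feature (a loop).
    import numpy as np
    from ripser import ripser

    rng = np.random.default_rng(0)
    theta = rng.uniform(0, 2 * np.pi, 200)
    points = np.column_stack([np.cos(theta), np.sin(theta)]) + rng.normal(0, 0.05, (200, 2))

    diagrams = ripser(points, maxdim=1)["dgms"]   # H0 and H1 persistence diagrams
    births, deaths = diagrams[1][:, 0], diagrams[1][:, 1]
    print("most persistent loop lives for", (deaths - births).max())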

Keywords: Topological data analysis, Cosmology, Astronomy, Big Data

Fields of expertise involved: Algebraic Topology, Computational Geometry, Cosmology, Astronomy

Supervisory team:


PhD student: Georg Wilding (Austria)

Institutes involved:
Bernoulli institute
Kapteyn

Partners: INRIA

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 754315.


Project: Uncovering the information processing underlying the interactions between brain areas

While considerable progress has been made on understanding how the brain works, most of this is focused on the functions of individual brain areas in isolation. The next frontier is to understand how these brain areas work together in the service of cognition. The main aim of the project is therefore to understand how brain areas communicate in the service of information processing.

Researchers: Dr. Marieke van Vugt (BI); Prof. dr. ir. Ming Cao (ENTEG); Prof. dr. Niels Taatgen (BI); Dr. Jelmer Borst (BI)

PhD student:
Oscar Portoles Marin, MSc


Complex Systems & Engineering




Project: Coevolutionary Dynamic Networks

In the real world, a wide variety of coupled dynamic systems arises, ranging from physical to economic and social ones. These structures are closely correlated, as the actions of one system may alter the performance of another. For this reason, network-based models are often employed in modern dynamical systems research to describe such phenomena. A network is then interpreted as a set of dynamical systems, represented as nodes, connected through links that describe the way the nodes interact with each other and that provide the network with particular topological properties and evolution rules.

Previous research has mostly been conducted under one of two specific assumptions: either networks with static links and evolving nodes, or networks with static nodes and evolving links. Very few existing studies consider networks with simultaneously evolving nodes and links, despite the fact that in reality both elements are constantly transforming. A coevolutionary network is a structure in which the behavior of the nodes and of their connections is co-dependent and co-evolving.

This project will focus on coevolutionary dynamic networks, first studying networks whose nodes and links evolve on the same time scale, and later emphasizing multiple time-scale coevolutionary networks, predominantly considering opinion and epidemiological dynamics.
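As a toy example of such coevolution, the sketch below lets node states follow consensus-like dynamics over weighted links while the link weights adapt toward the similarity of the nodes they connect; it is purely illustrative, and all parameters are hypothetical.

    # Toy coevolutionary network: node states follow consensus-like dynamics
    # over weighted links, while link weights adapt toward the similarity of
    # the nodes they connect. Purely illustrative; all parameters hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n, steps, dt = 20, 500, 0.05
    x = rng.normal(size=n)                  # node states (e.g., opinions)
    w = rng.uniform(0.0, 1.0, size=(n, n))  # link weights
    w = (w + w.T) / 2
    np.fill_diagonal(w, 0.0)

    for _ in range(steps):
        diff = x[None, :] - x[:, None]            # x_j - x_i
        x = x + dt * (w * diff).sum(axis=1)       # node dynamics over current links
        w = w + dt * (1.0 / (1.0 + diff**2) - w)  # links drift toward node similarity
        np.fill_diagonal(w, 0.0)

    print("state spread:", x.max() - x.min())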

Keywords: Complex systems, Dynamic networks, Coevolution, Multiple time-scales, Epidemics.

Fields of expertise involved: Complex systems, Coevolutionary Networks, Multiple time-scale dynamics, Game Theory, Epidemiological dynamics.

Supervisory team:

PhD student: Luis Venegas Pineda

Project: Complex Dynamical networks: From Data to Connectivity Structure

Dynamical networks are pervasive in today’s world, ranging from social and economic networks to biological systems and man-made infrastructures. The connection structure plays a crucial role in determining the overall behavior of these networks. For instance, the topology of social networks affects the spread of information and disease, and the topology of the power grid affects the robustness and stability of power transmission. Knowing the connection structure is fundamental in order to predict how these networks might evolve and to anticipate or counteract critical transitions. In this research project, we aim to develop methods and algorithms to infer the connectivity structure of complex dynamical networks from sparsely collected data.
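As a simple sketch of one basic way to infer connectivity from time series, the example below fits a linear network model x(k+1) = A x(k) by least squares and thresholds the entries of the estimate; this is not necessarily the method the project will develop, and the true network and threshold are hypothetical.

    # Simple sketch of topology inference from time series: fit a linear
    # network model x(k+1) = A x(k) by least squares and threshold |A_ij| to
    # estimate which links exist. One basic approach only; A_true and the
    # threshold are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n, T = 6, 200
    A_true = np.eye(n) * 0.5
    A_true[0, 1] = A_true[2, 3] = A_true[4, 5] = 0.3   # hypothetical links

    X = np.zeros((n, T)); X[:, 0] = rng.normal(size=n)
    for k in range(T - 1):
        X[:, k + 1] = A_true @ X[:, k] + 0.01 * rng.normal(size=n)

    # least squares: A_hat minimises || X[:,1:] - A X[:,:-1] ||
    A_hat = X[:, 1:] @ np.linalg.pinv(X[:, :-1])
    print((np.abs(A_hat) > 0.1).astype(int))   # estimated link pattern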

Researchers: Dr. Pietro Tesi (ENTEG); Prof. Dr. Kanat Camlibel (BI).

PhD student:
Henk van Waarde, BSc


Project: Cyber security with applications to both smart energy systems and smart industry

Owing to advances in computing and communication technologies, recent years have witnessed a growing interest in Cyber-Physical Systems (CPSs), i.e., systems where physical processes are monitored and controlled via embedded computers and networks. The concept of CPSs is extremely appealing for smart energy systems and smart industry, but it raises many theoretical and practical challenges. In particular, CPSs have drawn attention to networked control in the presence of cyber-attacks. Unlike attacks on general-purpose computing systems, which limit their impact to the cyber realm, attacks on CPSs can impact the physical world as well. The ESRs who select this project line will develop novel monitoring and control systems that are resilient against Bias Injection (BI) and Denial-of-Service (DoS) attacks. Two specific case studies will be considered: (i) BI attacks on distributed consensus networks for environmental monitoring, and (ii) DoS attacks on distributed control algorithms for optimal frequency regulation in smart grids.
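As a toy illustration of the effect of DoS attacks on such networks, the sketch below simulates a consensus network in which communication is periodically blocked, so the nodes simply hold their values during attack intervals; it is our own simplified example, not the project's design, and the graph, attack schedule and step size are hypothetical.

    # Toy illustration of a consensus network under a Denial-of-Service attack:
    # during attack intervals no information is exchanged, so nodes hold their
    # values; between attacks the network keeps converging. The graph, attack
    # schedule and step size are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n, steps, eps = 10, 300, 0.1
    A = (rng.random((n, n)) < 0.3).astype(float)
    A = np.maximum(A, A.T); np.fill_diagonal(A, 0)   # random undirected graph
    L = np.diag(A.sum(axis=1)) - A                   # graph Laplacian

    x = rng.normal(size=n)
    for k in range(steps):
        dos_active = (k % 50) < 20        # hypothetical periodic DoS attack
        if not dos_active:
            x = x - eps * (L @ x)         # standard consensus update
    print("remaining disagreement:", np.ptp(x))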

Keywords: Cyberphysical Systems, Networked Control Systems, Hybrid Systems, Power Networks, Smart Industries.

Fields of expertise: Control Theory, Dynamical Systems, Cybersecurity, Computer Science, Network Science.

Supervisory team:

PhD student: Alessandro Luppi (Italy)

Institutes involved:
ENTEG

Potential partners: MathSys

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 754315.


Project: Distributed control methods under communication constraints with applications to sensor networks in a smart industry setting

In modern industry, there is an increasing interest in control systems employing multiple sensors and actuators, possibly geographically distributed. Communication is an important component of these networked control systems. Understanding the interactions between control and communication components is especially important to develop systems that possess scalability features. Existing solutions for distributed control have little to no scalability in terms of both convergence time and accuracy, owing to limited bandwidth and quantization issues. The ESRs who select this project line will develop novel distributed control methods, with emphasis on event-based and self-triggered communication protocols. A specific case study will be considered in terms of distributed “average” and “max-min” consensus networks, which are prototypical networks for distributed sensing and actuation.
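As a toy sketch of event-based communication, the example below lets a node rebroadcast its state only when it has drifted more than a threshold from its last broadcast value, while neighbours compute with the broadcast values; it is illustrative only, and the graph, threshold and step size are hypothetical.

    # Toy sketch of event-based communication in a consensus network: a node
    # broadcasts its state only when it has drifted more than a threshold from
    # its last broadcast value, and neighbours use the broadcast values.
    # Illustrative only; graph, threshold and step size are hypothetical.
    import numpy as np

    rng = np.random.default_rng(1)
    n, steps, eps, threshold = 8, 400, 0.05, 0.05
    A = np.ones((n, n)) - np.eye(n)        # complete graph, hypothetical
    x = rng.normal(size=n)
    x_hat = x.copy()                       # last broadcast values
    events = 0

    for _ in range(steps):
        trigger = np.abs(x - x_hat) > threshold
        x_hat = np.where(trigger, x, x_hat)          # broadcast only on events
        events += int(trigger.sum())
        x = x + eps * (A * (x_hat[None, :] - x_hat[:, None])).sum(axis=1)

    print("broadcasts:", events, "of", steps * n, "possible; spread:", np.ptp(x))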

Keywords: Sensor Networks, Nonlinear consensus, Hybrid Systems, Self-triggered Algorithms, Coordination Control.

Fields of expertise: Control Theory, Dynamical Systems, Signal Processing, Computer Science, Network Science.

Supervisory team:

PhD student: Monica Rotulo (Italy)

Institutes involved:
ENTEG
Bernoulli Institute

Potential partners: MathSys

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 754315.


Project: Home robotics

Robots must be able to handle unforeseen circumstances. Neither knowledge representation nor machine learning approaches on their own allow for sufficiently robust handling of unforeseen circumstances. As a result, new hybrid technology must be developed that combines knowledge technology, for the manual representation of behavior-guiding scenarios for new and exceptional circumstances, with data technology, to evaluate and adapt these scenarios. In the robot architecture developed in the project, a hypothesis-testing cycle will be modeled using argumentation-based techniques designed for the combination of logic-based scenario representations and probability-based data analysis. The architecture will be tested in the annual international RoboCup@Home competition.

Keywords: Artificial intelligence, argumentation, robotics, scenario modeling, human-machine interaction.

Fields of expertise involved: Artificial intelligence, argumentation theory, robotics, human-machine interaction.

Supervisory team:

PhD student: Hamed Ayoobi (Iran)

Institutes involved:
Bernoulli Institute
ENTEG

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 754315.


Project: Interactive Adaptive Multi-agent Imitation Learning

As modern engineering systems increasingly comprise large numbers of interacting components, dealing with complex real-world multi-agent problems has become very important. Multi-agent reinforcement learning and imitation learning are among the most powerful approaches to solving such problems. However, each of these approaches suffers from undeniable drawbacks. For example, imitation learning-based approaches need a large number of near-optimal demonstrations to perform appropriately.

In this project, we propose a novel multi-agent imitation learning algorithm that combines the underlying ideas of imitation learning, generative deep learning, and control theory. Our goal is to deal with safety-critical, complex multi-agent problems more effectively than present approaches. Furthermore, the developed solution strategy will have better sample guarantees and will be able to tackle sparse-reward environments.

Keywords: Multiagent networks, Imitation learning, Deep generative networks, Control theory

Fields of expertise involved: Artificial Intelligence, Complex Systems, Control systems

Supervisory team:


PhD student: Alireza Zolanvari


Project: Tailor-made model reduction methods for integrated energy systems

The rapid increase of renewable and sustainable energy production leads to daunting challenges in the design, analysis, and control of modern energy systems. The large number and decentralized nature of renewable energy sources call for a change of the current paradigm, in which energy production is largely centralized in a small number of power plants.

Instead, future energy systems will constitute a network of a large number of heterogeneous dynamical systems, corresponding to a variety of agents such as large power plants, wind farms, solar collectors, and industrial and household consumers.

One major challenge in the design of these highly integrated energy systems is that the complexity of the resulting mathematical models is immense, such that analysis, simulation, and controller design become intractable. Therefore, satisfactory methods for the approximation of such complex models by lower-order, less complex models are needed. In order for these low-complexity models to be reliably employed as substitutes for the original model, they should not only accurately preserve the behavior of the original model, but also preserve its structural properties, in particular the underlying physical network structure. Consequently, this project will investigate network-structure-preserving model reduction methods as well as a priori error bounds for such methods.
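As a minimal sketch of the general idea of structure-preserving reduction for network systems, the example below groups the nodes of a small consensus network into clusters and projects the dynamics with the cluster indicator matrix, so that the reduced model is again a (smaller) network; this illustrates only the general idea, not the methods this project will develop, and the graph and partition are hypothetical.

    # Minimal sketch of clustering-based, structure-preserving model reduction
    # for a network system dx/dt = -L x: group nodes into clusters and project
    # the dynamics with the cluster indicator matrix P, so the reduced model is
    # again a (smaller) network. The graph and the partition are hypothetical.
    import numpy as np

    A = np.array([[0, 1, 1, 0, 0, 0],
                  [1, 0, 1, 0, 0, 0],
                  [1, 1, 0, 1, 0, 0],
                  [0, 0, 1, 0, 1, 1],
                  [0, 0, 0, 1, 0, 1],
                  [0, 0, 0, 1, 1, 0]], dtype=float)
    L = np.diag(A.sum(axis=1)) - A

    clusters = [0, 0, 0, 1, 1, 1]                        # hypothetical partition
    P = np.zeros((6, 2)); P[np.arange(6), clusters] = 1  # cluster indicator matrix

    # Galerkin projection: x ~ P z, dz/dt = -inv(P' P) P' L P z
    L_red = np.linalg.inv(P.T @ P) @ (P.T @ L @ P)
    print(L_red)   # a 2x2 Laplacian-like matrix: its rows sum to zero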

Keywords: Networks of dynamical systems,  structure preserving model reduction, smart energy systems, model reduction for networks, graph simplification

Fields of expertise involved: Systems and control engineering, Feedback control theory, Mathematical modelling, Model reduction, Graph theory.

Supervisory team:

PhD student: Azka Burohman (Indonesia)

Institutes involved:
Bernoulli Institute
ENTEG

Partner: Ocean Grazer

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 754315.


Advanced Instrumentation & Big Data



Project: Improved modeling of astronomical radio sources with compressive sensing techniques

The 21cm line emitted by neutral hydrogen is one of the most promising probes of the intergalactic medium, and it is particularly important for understanding the thermal and ionization history during the Epoch of Reionization. Observing the distribution and evolution in redshift of neutral hydrogen is of fundamental importance to understanding the formation and evolution of the first structures. The LOw-Frequency ARray (LOFAR) runs one of the largest programs in the world to achieve this through radio images. However, its large data sets (approximately 5 petabytes) require optimization of the imaging algorithms. Furthermore, the Galactic and extragalactic foreground emission is up to 5 orders of magnitude brighter than the 21cm emission and has to be subtracted with an accurate sky model. This project will focus on the improvement of imaging algorithms for LOFAR data with compressive sensing techniques, which are ideal for reconstructing the signal from sparse data (such as those collected by radio interferometers) and for generating a physical sky model. These techniques have already been successfully applied in other fields, such as medical imaging and facial recognition, which will also benefit from this project. Improving imaging techniques is fundamental to obtaining high-fidelity results from the infant Universe.
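As a minimal numerical sketch of the compressive sensing idea behind such imaging algorithms, the example below recovers a sparse signal from fewer measurements than unknowns using iterative soft thresholding (ISTA); it is generic, not LOFAR's actual imaging pipeline, and the problem sizes are hypothetical.

    # Minimal compressive-sensing sketch: recover a sparse signal x from
    # measurements y = A x with fewer measurements than unknowns, using
    # iterative soft thresholding (ISTA). Generic, not LOFAR's pipeline;
    # the problem sizes and regularisation are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n, m, k = 200, 80, 5                      # unknowns, measurements, sparsity
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = 5 * rng.normal(size=k)
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    y = A @ x_true

    lam, step = 0.05, 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(500):
        z = x - step * (A.T @ (A @ x - y))                        # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)  # soft threshold

    print("relative recovery error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))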

Keywords: Cosmology, Early Universe, Radio Interferometry, Compressive Sensing Techniques, Sky Modeling.

Fields of expertise involved: Epoch of Reionization, Radio Astronomy, Imaging Algorithms, Signal Processing

Supervisory team:

PhD student: Emilio Ceccotti


Project: Methods for automated and robust analysis of astronomical data

This project (Kapteyn, JBI and ASTRON) will focus on methods for automated and robust analysis of astronomical data taken with the Low Frequency Array (LOFAR). The Early Stage Researcher will develop sophisticated calibration methods that connect astronomy and complex data systems, as part of a project on Advanced Instrumentation and Big Data. They will use novel computing science methods to analyse data from the LOFAR radio interferometer, built and operated by ASTRON. LOFAR is the world’s largest connected radio telescope and is optimised to operate at low radio frequencies with arc-second angular resolution. The student will develop new techniques for the automated and robust analysis of the data taken with the International LOFAR Telescope, which includes signals from the stations both within the Netherlands and throughout Europe. This data stream will be used to investigate the low-energy radiation from supermassive black holes at the highest possible angular resolution, to investigate particle acceleration and energetics in the Universe’s largest natural particle accelerators. This will be done by combining the resolving power of LOFAR and the added magnification provided by gravitational lensing to study black hole physics at redshifts where the active phase of such objects is expected to peak and to have the largest influence on the build-up of the stellar host galaxy. The computing science work will involve developing automated methods for identifying such objects in the LOFAR all-sky survey data. The novel analysis tools developed in this project line will be made available to the community.

Key Words: Galaxy formation; Gravitational lensing; Black Holes

Fields of expertise involved: Radio Astronomy; Interferometry; Advanced calibration; Machine Learning

Supervisory team:

PhD student: Samira Rezaei (Iran)

Institutes involved:
Bernoulli Institute
Kapteyn

Potential partners: Netherlands Institute for Radio Astronomy (ASTRON)

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 754315.


Project: Modeling and control of hysteretic deformable mirrors for high-contrast imaging systems

In this project, we will develop model-based nonlinear control algorithms for the control of a novel hysteretic deformable mirror (HDM). To enable wavefront control in a high-contrast coronagraph instrument for future space telescopes, with the ultimate goal of finding and characterizing Earth-like exoplanets, we have developed a novel deformable mirror (DM) concept based on a new hysteretic piezoelectric material and a new distributed polarization method. It allows for a high-density, low-power and scalable DM system, which is crucial for space applications. The demonstrator is currently being built and tested by a multidisciplinary team from ENTEG (Engineering and Technology institute Groningen), ZIAM (Zernike Institute for Advanced Materials), SRON (Netherlands Institute for Space Research) and the Kapteyn Astronomical Institute.

As an instrumental element of this novel DM system, the smart control system will be developed by the PhD student/Early Stage Researcher within the DSSC. In particular, the PhD student will (i) model and characterize the realized HDM demonstrator, including its hysteresis and dynamical behavior; (ii) perform system identification and parameter estimation based on real-time dynamical data; (iii) develop a distributed control method for achieving the desired shape of the DM; and (iv) develop a combination of data-driven and model-based distributed control algorithms for continuous and iterative improvement of the control system.
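As a tiny sketch of the kind of input-output behaviour that has to be modelled and compensated, the example below implements a classical play (backlash) operator, one of the simplest hysteresis models; it is not the project's HDM model, and the play radius is hypothetical.

    # Tiny sketch of a classical "play" (backlash) operator, one of the
    # simplest hysteresis models: the output lags the input by at most the
    # play radius r, so cyclic inputs trace a hysteresis loop. This is not
    # the project's HDM model; the radius value is hypothetical.
    import numpy as np

    def play_operator(u: np.ndarray, r: float, y0: float = 0.0) -> np.ndarray:
        """Rate-independent play operator with radius r."""
        y = np.empty_like(u)
        prev = y0
        for k, uk in enumerate(u):
            prev = min(max(prev, uk - r), uk + r)   # clamp output to [u-r, u+r]
            y[k] = prev
        return y

    t = np.linspace(0, 4 * np.pi, 1000)
    u = np.sin(t)                        # cyclic, normalised input
    y = play_operator(u, r=0.2)          # plotting y against u shows a hysteresis loop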

Keywords: Control algorithm; Distributed nonlinear control method; Advanced instrumentation; Mechatronic systems; Systems identification; Model-based control algorithm; Data-driven control algorithm.

Field of expertise: Systems and Control; Applied Mathematics; Applied Physics; Astronomy; Mechanical Engineering; Electrical & Electronics Engineering; Mechatronics; Advanced Instrumentation; Opto-mechatronics; Precision Engineering.

Supervisory team:

PhD student: Marco Augusto Vasquez Beltran (Mexico)

Institutes involved:
ENTEG
Kapteyn
ZIAM
Netherlands Institute for Space Research

Partner: SRON

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 754315.


Project: Scalable algorithms to process massive datasets from radio astronomy

One of the holy grails in cosmology is the discovery of neutral hydrogen in the infant Universe (its first billion years), which can tell astronomers and cosmologists how the first stars, galaxies and black holes formed, how dark matter evolved and whether gravity indeed follows Einstein’s general relativity. The low-frequency part of the Square Kilometer Array (SKA), the world’s largest radio telescope currently under construction in Australia and South Africa, can not only discover the emission of this hydrogen, but can even make images of its distribution and how it evolves in time (redshift). The data volume of the SKA, however, is huge (1000 TB per day) and will require novel scalable algorithms in order to process the total amount of data accumulated in such a project (10^6 TB, or 1 exabyte).

This PhD project will focus on how subtle effects in the processing (i.e. error correction and imaging) of these data can be uncovered using machine learning techniques, such as neural networks and pattern recognition. Such an approach is becoming necessary in order to explore this very high-dimensional dataset quickly, beyond any human ability, and to discover effects and relations in the data that will enable finding the extremely faint signals of neutral hydrogen from the infant Universe.

Keywords: Cosmology, Radio Telescopes, Machine Learning, Infant Universe.

Fields of expertise involved: Cosmic Dawn & Reionization, Radio astronomy, Machine Learning, Signal Processing.

Supervisory team:

PhD student: Hyoyin Gan (Korea)

Institutes involved:
Kapteyn
Bernoulli Institute

Partner: ASTRON

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 754315.


Project: Searching for extremely rare objects in the Universe

The potential for detecting rare objects of high scientific importance, such as strong gravitational lenses, in large astronomical datasets will increase dramatically within the next ten years. Detecting rare objects that resemble other, more common objects in large datasets can, however, lead to a very high rate of false positives. Hence, a key challenge is to suppress the false-positive rate while not suppressing the true-positive rate.

To overcome this problem, this research project focuses on developing novel algorithms to create realistic synthetic image data of rare astronomical objects of interest, and on developing deep-learning-based image classifiers such as Convolutional Neural Networks (CNNs), used in combination with other techniques such as Stacked Denoising Autoencoders (SDAE), Generative Adversarial Networks (GANs) and colour-scale filter augmentation, for efficiently detecting rare objects such as strong gravitational lenses in large datasets such as the ESO-VST Kilo-Degree Survey (KiDS). These algorithms will also be scalable to future missions such as ESA’s Euclid satellite mission.
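As a small numerical sketch of the stated challenge, the example below shows how, with rare positives, the decision threshold on a classifier's scores trades the false-positive rate against the true-positive rate; the scores are synthetic and hypothetical, not output of the project's classifiers.

    # Small numeric sketch: with rare positives, the decision threshold on a
    # classifier's scores trades the false-positive rate against the
    # true-positive rate. The scores below are synthetic and hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n_neg, n_pos = 100_000, 100                  # rare positives
    scores = np.concatenate([rng.normal(0.2, 0.1, n_neg), rng.normal(0.7, 0.15, n_pos)])
    labels = np.concatenate([np.zeros(n_neg), np.ones(n_pos)])

    for thr in (0.4, 0.5, 0.6):
        pred = scores > thr
        tpr = (pred & (labels == 1)).sum() / n_pos
        fpr = (pred & (labels == 0)).sum() / n_neg
        print(f"thr={thr:.1f}  TPR={tpr:.2f}  FPR={fpr:.5f}  false positives={int(fpr * n_neg)}")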

Keywords: Gravitational Lensing, GAN, CNN.

Fields of Expertise: Data Science, Astronomy, Deep Learning.

Supervisory team:

PhD student: Bharath Nagam


Project: The Low Surface Brightness Universe: Multidimensional Faint-Object Detection

Astronomy is experiencing rapid growth in data size and complexity. Future surveys such as LSST, Euclid, KiDS, DESI and SKA will increase the number of available objects and deliver wide, deep images of the sky, including galaxies that are too faint to be seen today. It is not possible to manually inspect all images produced by these surveys, making computer science, machine learning algorithms, and advanced image analysis of vital importance. Such deep-imaging surveys are ideal for applying machine learning models, as they will be wider and deeper than any survey conducted before. In this new data-rich era, astronomy and computer science can benefit greatly from each other. Their synergy will lead to tools that allow us to use the information on the Low Surface Brightness Universe hidden in the new surveys and to start uncovering a completely new parameter space.

The principal goal of this project is to improve MTObjects, a max-tree-based method for the extraction of faint extended sources, so that it can handle multi-band optical data, and to extend the tool to multidimensional datasets so that it works efficiently on 3D optical data.

Keywords: Machine learning, Astronomical imaging, Object detection


Fields of expertise involved: Machine learning, Astronomy, Image processing

Supervisory team:

PhD student: Mohammed Faezi


Project: [VF]ast Data

In many fields of science, data sets are not just big: the data rate is also huge, and the sizes of individual data items, or cliques of data that need to be processed as a whole, become very large indeed. Coping with big data sets is quite a challenge on its own, but the problem is compounded if the individual data items are too large to process on a single node, or if the sensor data rate is so high that it is impossible to store all of it. Our proposal aims to develop fast, multi-scale algorithms for graph-based data processing suitable for efficient distributed-memory parallel processing.

Applications include the vast image data cubes from the Square Kilometer Array (SKA) radio telescope and the fast data streams from the PANDA (Anti-Proton Annihilation) Collaboration.

Researchers: Dr. Michael Wilkinson (BI); Prof. Dr. Alexander Lazovik (BI); Dr. Johan Messchendorp (KVI); Prof. Dr. Leon Koopmans (Kapteyn); Prof. Dr. Scott Trager (Kapteyn).

PhD student:
Simon Gazagnes, MSc

Partner: IlionX
