About
My research aims to uncover and design mathematical modelling methodologies for policy decisions with respect to the energy transition in the Netherlands. In this pursuit, I try to carefully delineate where such methods are applicable, without pushing the ideology of an expansive, invasive, and messianic ‘computational universalism’ [1]. As every decision is a medium of power and of stratification, developing a framework that truly respects epistemic pluralism has to start at the root.
As the root of my work, I consider cyberinfrastructures, which set “the invisible rules that determine the spaces of our everyday life”, as the architect and writer Keller Easterling puts it [2]. This is also emphasised by the EU Crossover Project on ‘ICT solutions for governance and policy modelling’, which was part of the Seventh Framework Programme. You can read more about my contributions to internet governance here.
Embedded within these cyberinfrastructures are digital platforms on which my everyday research practices depend in many ways (such as finding and gathering data to evaluate and validate mathematical models, identifying relevant literature, and publishing results). Platforms can be identified as a key driver of a significant shift in our epistemic infrastructure that is currently underway (Celia Lury 2021, [3]). Relatedly, my past work on digital platforms for political participation investigates how they reshape urban issues.
My current work focuses on accounting for tacit knowledges beyond data-driven logics. To this end, I am developing new methodologies:
- Starting with the creation of conceptual models in the form of causal loop diagrams (CLDs) that store qualitative expert knowledge gathered through group model building (GMB).
- Turning the conceptual model, represented by the CLD, into a computational model, represented by a system dynamics model (SDM).
- The created SDM can then be further refined by replacing links with more complex computational methods, such as agent-based models (ABMs). These will be developed in the context of the energy transition in the Netherlands through the Future of Energy network and various governmental and industrial partners. As part of this work, I am a participant in the Open Energy Modelling Initiative, which promotes transparency and openness in energy system modelling.
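To give a flavour of the CLD-to-SDM step, the sketch below Euler-integrates a single stock-and-flow structure with one reinforcing and one balancing loop. The stocks, rates, and numbers are hypothetical illustrations, not taken from the actual energy-transition models:

```python
# Minimal stock-and-flow sketch of a system dynamics model (SDM).
# Hypothetical example: household adoption of a new energy technology,
# with a reinforcing word-of-mouth loop and a balancing dropout loop.

def simulate_adoption(years=30, dt=0.25, adoption_rate=0.12, dropout_rate=0.02):
    """Euler-integrate two stocks: potential adopters and adopters."""
    potential, adopters = 0.95, 0.05  # fractions of households (placeholder values)
    history = []
    for _ in range(int(years / dt)):
        total = potential + adopters
        inflow = adoption_rate * adopters * potential / total  # word of mouth
        outflow = dropout_rate * adopters                      # dropout
        potential += (outflow - inflow) * dt
        adopters += (inflow - outflow) * dt
        history.append(adopters)
    return history

trajectory = simulate_adoption()
print(f"adopter share after 30 years: {trajectory[-1]:.2f}")
```

In a real SDM, each link of this kind would be calibrated against the qualitative structure elicited in the GMB sessions; the refinement step described above would then replace a link like the word-of-mouth term with an agent-based sub-model.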
Previously, I was a postdoc at the Institute for Biodiversity and Ecosystem Dynamics and the Institute for Advanced Study, where I developed modelling methodologies for analysing complex systems and helped to set up the POLDER Center for policy decision support.
I obtained my PhD under the supervision of Baojiu Li and Richard Massey at the Institute for Computational Cosmology and the Institute for Data Science with the support of the STFC/CDT scholarship.
My PhD research covered a wide range of topics, including strong gravitational lensing, the halo–galaxy connection, second-order CMB perturbations, and modified gravity. The underlying motivation is to understand the enigmatic late-time accelerated expansion of the universe, which we infer from multiple independent observations. To test new theories and potential observational probes, I designed, developed, ran, and analysed cosmological simulations that consisted of over 1 billion particles and spanned 13 billion years of cosmic evolution. These simulations were completed on supercomputers with more than 10,000 cores.
When the WHO declared the COVID-19 outbreak a pandemic, I joined the Royal Society’s “Rapid Assistance in Modelling the Pandemic” (RAMP) initiative and worked with two Durham-based initiatives to model its spread. One of the outcomes was the open-source framework JUNE. During this effort, my attention was increasingly drawn to the social, ethical, political, and philosophical implications of my work and how they are reflected in the current state of technology regulation.
In that pursuit, I organised and moderated a series of conversations at the Durham Research Methods Centre on machine learning and on how to foster interdisciplinary research between the humanities, the social sciences, and the natural sciences (blog posts on these conversations can be found in my essay section). Furthermore, I am working with the In-SIGHT project (formerly the Datactive project) to investigate how norms, such as human rights, are inscribed, resisted, and subverted in Internet infrastructure through its transnational governance.
The programming languages listed below are ordered by my level of experience with them:
- Python:
- with speed-ups via Numba, or via C and C++ through ctypes and Cython
- parallelisation through threading and multiprocessing
- machine learning libraries such as scikit-learn, PyTorch, and TensorFlow
- unit testing
- …
- Julia
- FORTRAN
- Bash
- Matlab
- Haskell
- C/C++
- Scala
For HPC queuing, I am most familiar with the SLURM and LSF batch systems.
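For illustration, a minimal SLURM batch script of the kind submitted with `sbatch` (the job name, partition, resource numbers, module, and executable are all placeholders, not a real configuration of mine):

```shell
#!/bin/bash
#SBATCH --job-name=cosmo-sim       # placeholder job name
#SBATCH --nodes=4                  # placeholder resource request
#SBATCH --ntasks-per-node=28
#SBATCH --time=24:00:00            # wall-clock limit (hh:mm:ss)
#SBATCH --partition=compute        # placeholder partition name

module load openmpi                # assumes an environment-modules setup
srun ./simulation input.params     # placeholder executable and input file
```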
For data management, I am experienced with the MySQL database system and the GraphQL query language.
For software development I rely on (a probably incomplete list of tools):
- distributed version control through git
- test coverage through, e.g., Travis CI and Codecov
- unifying code style through, e.g., flake8
For documentation of code, reading lists, notes, etc.:
- Zotero: for notes, publications, books, etc.
- Sphinx: for Python Software
- Pandoc: to convert between different document file types
- LaTeX, Word, Google Docs, Markdown: writing in natural language
References
[1] Golumbia, David (2009). The Cultural Logic of Computation. Harvard University Press.
[2] Easterling, Keller (2014). Extrastatecraft: The Power of Infrastructure Space. Verso.
[3] Lury, Celia (2021). Problem Spaces: How and Why Methodology Matters. Polity Press.