My primary work these days concerns memory management and applications of programming languages to problems in cybersecurity. I am particularly interested in memory safety, concurrency safety (which includes data-race freedom), verified compilation, programming models for concurrency, and low-latency garbage collection. Type systems play a key role in several of these research endeavours.
I collaborate with:
Funding: Swedish Research Council
Duration: 2025–2028+1
The goal of this research is to strengthen data-race freedom and memory safety in software development. We focus on untyped programming languages such as Python and JavaScript, which are used across a wide range of domains.
The increased use of these languages forces them to evolve to embrace concurrency while staying safe. This is challenging for the languages' implementations and introduces new classes of possible bugs into programs written in them. Our approach is to make a technique from the world of statically typed languages available in the untyped world: ownership. Static ownership annotations have successfully been used in the past to avoid data races and ensure memory safety, using types to propagate ownership information and rule out violating programs at compile-time. However, this entirely static approach excludes the class of untyped programming languages, for which there is no compile-time. Worse, these programs often use reflective mechanisms to construct themselves incrementally during execution. Our main challenges are: dealing with such dynamism without losing a strong notion of ownership or destroying the advantages of this class of programming languages; tracking and enforcing ownership efficiently at run-time; and providing alternatives to the static means of supporting programming with ownership so that programmers can take advantage of our results. We evaluate our work through integration into the Python programming language.
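To give a flavour of what run-time ownership enforcement can look like, here is a minimal, hypothetical Python sketch; it is not the project's actual design, and the `Owned`/`transfer_to` names are made up for illustration. The idea: an object is owned by the thread that creates it, every access is checked against the current owner, and ownership must be explicitly transferred before another thread may touch the data.

```python
import threading

class OwnershipError(RuntimeError):
    """Raised when an object is accessed by a thread that does not own it."""

class Owned:
    """Hypothetical sketch: track a single owning thread and check it on every access."""

    def __init__(self, value):
        self._value = value
        self._owner = threading.get_ident()  # the creating thread owns the object

    def _check_owner(self):
        if threading.get_ident() != self._owner:
            raise OwnershipError("access from a thread that does not own this object")

    def get(self):
        self._check_owner()
        return self._value

    def set(self, value):
        self._check_owner()
        self._value = value

    def transfer_to(self, thread_id):
        """Hand over ownership; only the current owner may do this."""
        self._check_owner()
        self._owner = thread_id

counter = Owned(0)
counter.set(counter.get() + 1)          # fine: accessed by its owning thread

def racy_increment():
    counter.set(counter.get() + 1)      # raises OwnershipError instead of racing

t = threading.Thread(target=racy_increment)
t.start()
t.join()                                # the worker thread fails with OwnershipError
```

A real design would have to make such checks essentially free, deal with the transfer of whole object graphs, and survive reflective program construction; the sketch only illustrates the kind of run-time discipline involved.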
Funding: Swedish Research Council
Duration: 2024–2027+1
Memory safety is an important property of a programming language or runtime environment that prevents programs from accessing or modifying memory in unintended ways, such as unauthorised or invalid pointer access, use-after-free errors, double-free errors, and buffer overflows.
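To make one of these bug classes concrete, the snippet below (purely illustrative, not part of the project) uses Python's ctypes escape hatch to perform an unchecked out-of-bounds read, i.e. a buffer over-read that the language would otherwise prevent:

```python
import ctypes

buf = (ctypes.c_ubyte * 4)(1, 2, 3, 4)                  # a four-byte buffer
ptr = ctypes.cast(buf, ctypes.POINTER(ctypes.c_ubyte))  # raw, unchecked pointer view

print(ptr[2])    # in bounds: prints 3
print(ptr[100])  # out of bounds: no error is raised; whatever byte happens to sit
                 # at that address is read (or the process may crash)
```

Ordinary Python code cannot make this mistake, but low-level code written in unsafe languages, or behind such escape hatches, can.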
Despite most popular languages being memory safe, memory safety issues still account for most of the known vulnerabilities. The use of memory-unsafe languages is primarily driven by performance needs, particularly at the low levels of the software stack. The tension between safety and performance is critical, as bugs at the lower levels can be exploited to attack the software above them. The implementation of runtime techniques for memory safety, such as garbage collection, complicates reasoning about and controlling performance. Recent advances in compile-time memory management allow programming languages to deliver memory safety without unpredictable runtime behaviour. However, two obstacles to widespread adoption remain: expressivity and usability. As a result, developers resort to unsafe code to circumvent these limitations and achieve the desired functionality. The goal of this research is to advance the state of the art in memory safety and, in particular, to improve the security of programs by eliminating the need for unsafe code. The proposed approach has the potential to benefit a wide range of applications in cybersecurity, where memory-related vulnerabilities are a significant concern.
Funding: Swedish Research Council
Duration: 2020–2024+1
Managed languages, such as Java, JavaScript and Haskell, enable better software by abstracting the hardware and removing entire classes of bugs from software development, for example through the use of automatic memory management.
Relieving programmers from dealing with low-level, non-functional considerations is key to enabling a wider class of programmers, as seen for example in the recent surge of data scientists programming in the managed languages Python and R. The price of this added abstraction is additional overhead and an inability to optimise, which hurts the performance of managed languages. This greatly inflates hardware requirements, which prevents the use of managed languages in performance-sensitive or resource-constrained domains. Dropping to a low-level language, entirely or in part, leads to issues with portability, safety and security, and software quality, and inflates development costs. The purpose of this project is to allow the wider use of managed languages by removing the inefficiencies without lowering the abstraction. By combining new hardware developments in software-controlled metadata and custom micro-accelerators with new memory management techniques, both for reclaiming unused memory and for improving the placement of data in memory, we will deliver not only improved managed languages, but also new insights into how to balance the characteristics of hardware (efficient but inflexible) and software (less efficient but malleable) in execution environments for managed languages.
Funding: Oracle unrestricted gifts
Duration: 2019–2023 (project still going)
The JVM ReCo project is a collaborative effort between Uppsala University’s Programming Language Lab (UPLANG) and Oracle, focusing on enhancing the Java Virtual Machine (JVM). This project offers master’s thesis opportunities for computer science and IT civil engineering students in the greater Stockholm area. The collaboration aims to bridge the gap between academia and industry by providing students with hands-on experience at Oracle’s Stockholm offices, where the Java VM is developed. Under the supervision of Professor Tobias Wrigstad, students work on various aspects of JVM performance, memory management, and garbage collection.
The project has produced several significant publications, including studies on energy consumption of garbage collectors, improvements in program locality, and modern garbage collection techniques like ZGC. These contributions not only advance the academic understanding of JVM internals but also have practical implications for the development of more efficient and effective garbage collection algorithms. The collaboration has been supported by the Swedish Foundation for Strategic Research, the Swedish Research Council, and corporate donations from Oracle, highlighting the project’s importance and impact in the field of programming languages and runtime systems.
Funding: Swedish Research Council
Duration: 2015–2018+1
Parallel computers are pervasive, and future computers of all scales will provide their power through parallelism. Thus, all applications will need to be parallel to exploit the available computing resources.
Funding: Swedish Research Council
Duration: 2013–2016+1
Object-oriented programming builds on aliasing and mutable state. It is well known that this combination causes problems for programmers, tool developers and formalists alike. Creating and managing aliases and the resulting object structures are among the most frequent operations in object-oriented programs.
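A tiny Python illustration of the underlying issue: two references (aliases) to one mutable object mean that an update made through one alias is silently observed through the other, which is what makes aliased, mutable object graphs hard to reason about.

```python
xs = [1, 2, 3]
ys = xs            # ys is an alias of xs, not a copy
ys.append(4)       # mutate the shared object through one alias ...
print(xs)          # [1, 2, 3, 4] -- ... and the change is visible through the other
```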
Funding: European Union, 7th Framework Programme
Duration: 2013–2016
Manufacturers shifted from multicore to manycore chips, featuring up to a million processors. However, software struggled to exploit this power without exposing programmers to complex concurrency details. Traditional multithreading in object-oriented languages proved inefficient for manycore chips, threatening the object-oriented model and existing programming knowledge.
The project aimed to enable scalable application development for manycore chips while retaining the object-oriented paradigm. It proposed a new approach to integrating parallelism and concurrency into programming languages, emphasising concurrent computation by default. This breakthrough promised to significantly impact software development for future manycore chips.
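One well-known way to make “concurrent computation by default” concrete is the active-object pattern, in which every method call becomes an asynchronous message and immediately returns a future. The Python sketch below illustrates the idea only; it is not the project's programming model, and the `Active`/`send` names are invented for the example.

```python
from concurrent.futures import Future, ThreadPoolExecutor

class Active:
    """Sketch of an active object: each instance owns a single-threaded executor,
    calls are submitted to it as messages, and the caller receives a future."""

    def __init__(self):
        self._mailbox = ThreadPoolExecutor(max_workers=1)

    def send(self, method, *args) -> Future:
        # Run the method on this object's own thread, never on the caller's thread.
        return self._mailbox.submit(method, self, *args)

class Counter(Active):
    def __init__(self):
        super().__init__()
        self.value = 0

    def add(self, n):
        self.value += n
        return self.value

c = Counter()
futures = [c.send(Counter.add, 1) for _ in range(10)]  # ten asynchronous "messages"
print([f.result() for f in futures])                   # [1, 2, ..., 10]: per-object serialisation, no locks
```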
Funding: Swedish Foundation for Strategic Research
Duration: 2019–2022
Managed languages like Java and JavaScript execute in a virtual machine (VM), which abstracts the underlying architecture and operating system. VMs typically handle things like allocation, security, linking and loading, and garbage collection (GC). GC is a key activity in modern programming languages that automates one of the most tedious and error-prone programming activities, namely resource management of the computer’s memory, with benefits ranging from productivity to security.
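As a small reminder of what GC buys the programmer, the snippet below (plain CPython, unrelated to this project's collectors) builds a reference cycle that reference counting alone cannot reclaim; the cycle collector finds and frees it without any programmer involvement.

```python
import gc

cycle = []
cycle.append(cycle)   # the list now references itself: a reference cycle
del cycle             # the last external reference is gone, but the refcount never drops to zero

unreachable = gc.collect()                                # run the cycle collector explicitly
print(f"collected {unreachable} unreachable object(s)")   # reports at least 1
```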
The goal of this project is to explore new techniques for garbage collection and memory management. Recently, the applicant has developed two new techniques for improving cache performance in GC-based languages, with promising results demonstrated on standard benchmarks in the academic setting.
Funding: Swedish Foundation for Strategic Research
Duration: 2013–2014
An increasing number of companies use a family of programming languages called “scripting languages” for ease and speed of development. Scripting languages are often an enabler during the start-up phase, but as software grows and matures, they hurt performance, power efficiency and quality of service.
The goal of this research is to develop tools and techniques for facilitating the transition of “scripts into programs”, which requires furthering the current understanding of how scripting languages are used in practice.
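One illustrative example of how a script hardens into a program, given here as a generic technique rather than this project's specific approach, is the gradual addition of type annotations that a static checker such as mypy can then verify:

```python
# Before: a quick script -- nothing constrains what 'records' contains.
def total(records):
    return sum(r["amount"] for r in records)

# After: the same function with type annotations, so a static checker (e.g. mypy)
# can reject call sites that pass the wrong shape of data.
from typing import TypedDict

class Record(TypedDict):
    amount: float

def total_typed(records: list[Record]) -> float:
    return sum(r["amount"] for r in records)

print(total_typed([{"amount": 1.5}, {"amount": 2.5}]))   # 4.0
```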