
NWO-institutenorganisatie, CWI - Centrum Wiskunde & Informatica, Algorithms and Complexity (A&C)

8 Projects, page 1 of 2
  • Funder: Netherlands Organisation for Scientific Research (NWO) Project Code: VI.Veni.222.331

    Short random circuits (tiny quantum computer programs built by rolling dice) are an essential testing tool for near-term quantum computers. They also model interesting physics, such as quantum chaos. However, their properties are not well understood mathematically. In this project I aim to rigorously understand such circuits, with an eye to applying what I learn to building new quantum computers and to understanding physics. I will do this by combining time-tested tools from mathematical physics with techniques from the field of quantum information theory.

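    As a concrete illustration of such circuits, here is a minimal Python sketch (my own, not the project's code; it assumes a brickwork layout of Haar-random two-qubit gates, each sampled via the QR decomposition of a complex Gaussian matrix):

        import numpy as np

        rng = np.random.default_rng(0)

        def haar_unitary(dim, rng):
            """Sample a Haar-random unitary: QR decomposition of a Ginibre matrix."""
            z = (rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))) / np.sqrt(2)
            q, r = np.linalg.qr(z)
            phases = np.diagonal(r) / np.abs(np.diagonal(r))
            return q * phases  # fix column phases so the law is exactly Haar

        def apply_two_qubit_gate(state, gate, i, n):
            """Apply a 4x4 gate to qubits (i, i+1) of an n-qubit state vector."""
            psi = state.reshape([2] * n)
            psi = np.moveaxis(psi, [i, i + 1], [0, 1]).reshape(4, -1)
            psi = (gate @ psi).reshape([2, 2] + [2] * (n - 2))
            return np.moveaxis(psi, [0, 1], [i, i + 1]).reshape(-1)

        n_qubits, depth = 6, 4
        state = np.zeros(2 ** n_qubits, dtype=complex)
        state[0] = 1.0  # start in |0...0>

        for layer in range(depth):  # brickwork: even pairs, then odd pairs
            for i in range(layer % 2, n_qubits - 1, 2):
                state = apply_two_qubit_gate(state, haar_unitary(4, rng), i, n_qubits)

        probs = np.abs(state) ** 2  # output distribution of the random circuit
        print(probs.sum())          # sanity check: should print ~1.0
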
  • Funder: Netherlands Organisation for Scientific Research (NWO) Project Code: 647.003.001

    More energy system integration is needed, yet individual systems must remain autonomous to cope with the exploding complexity of multiple buildings and their interaction with the electricity grid. The use of Big Data in combination with deep learning techniques offers new opportunities to better predict energy consumption and the decentralized production of renewable energy (for example, from local weather data that takes into account local phenomena such as urban heat islands). Combining this with cooperative multi-agent systems provides decentralized control and monitoring autonomy, further reducing the complexity of energy system integration.

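    As a rough sketch of the prediction task described above (entirely synthetic data; the weather features and model choice are illustrative assumptions, not the project's), a small neural network can be trained to predict a building's load from local weather:

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n = 2000
        temp = rng.normal(15, 8, n)    # outdoor temperature (deg C)
        sun = rng.uniform(0, 1, n)     # solar irradiance (normalized)
        hour = rng.integers(0, 24, n)  # hour of day

        # Synthetic load: heating demand below ~18 C plus an evening occupancy peak.
        load = (1.5 * np.maximum(18 - temp, 0) - 5 * sun
                + 10 * np.exp(-(hour - 19) ** 2 / 8) + rng.normal(0, 1, n))

        X = np.column_stack([temp, sun, hour])
        X_tr, X_te, y_tr, y_te = train_test_split(X, load, random_state=0)

        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(32, 32),
                                           max_iter=2000, random_state=0))
        model.fit(X_tr, y_tr)
        print("test R^2:", round(model.score(X_te, y_te), 3))
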
  • Funder: Netherlands Organisation for Scientific Research (NWO) Project Code: 639.073.904

    We aim to develop a general theory for statistical inference under misspecification, to be used when all models under consideration are wrong (that is, misspecified), yet some are useful. In practice, wrong-yet-useful models are employed all the time: we often pretend that nonlinear relationships are linear; or that dependent variables are independent; or that measurement error or noise is normally distributed, even though it isn't; and so on. Besides such misspecification of the data-generating machinery, we also target sampling plan misspecification, which arises if our assumptions about how the data are gathered or sampled are incorrect. A key novel insight is that these two types of misspecification, while usually viewed as intrinsically different, can be given a unified treatment by employing safe distributions. These are probability distributions accompanied by a specification of which aspects of a domain they can predict well. Using safe distributions, we plan to develop statistical methods that perform near-optimally whether the model under consideration is entirely correct, almost correct (as in nonparametric settings), or entirely incorrect, without knowing in advance which of these situations pertains. In practice, such methods would unify and generalize both Bayesian and worst-case approaches to statistical learning, and in many cases considerably outperform them both, allowing us to do more with less data. Such a unification is a holy grail of the field of statistical learning, with applications in classification and regression problems such as automated object or character recognition, time series prediction, and so on. But the same concept of safe distributions also leads to improved, robustified versions of null hypothesis testing, the standard statistical method for inference in, for example, the medical sciences and experimental psychology; and, relatedly, to new insights on the use of statistics in court cases, shedding light on controversial issues such as whether it sometimes makes sense to ignore part of the data.

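    A toy Python illustration of a wrong-yet-useful model (my own example, not taken from the project): ordinary least squares assumes Gaussian noise, yet it still recovers the regression coefficients when the noise is in fact heavy-tailed, while the Gaussian model's predictive intervals become miscalibrated:

        import numpy as np

        rng = np.random.default_rng(2)
        n = 5000
        x = rng.uniform(-3, 3, n)
        noise = rng.standard_t(df=2, size=n)  # heavy-tailed, not Gaussian
        y = 1.0 + 2.0 * x + noise

        # The misspecified (Gaussian-noise) model still recovers the coefficients...
        X = np.column_stack([np.ones(n), x])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        print("estimated (intercept, slope):", np.round(beta, 2))  # close to (1, 2)

        # ...but its nominal 95% predictive interval is miscalibrated:
        resid = y - X @ beta
        coverage = np.mean(np.abs(resid) <= 1.96 * resid.std())
        print("actual coverage of the 'Gaussian 95%' interval:", round(coverage, 3))
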
  • Funder: Netherlands Organisation for Scientific Research (NWO) Project Code: NGF.1623.23.005

    Rolling the quantum dice: better quantum computers through randomness

    Quantum computers are an exciting future technology, promising advances in materials science, medicine and many other areas. However, building them is difficult, because quantum systems are fragile and complex. In this project we will harness the power of randomness (the rolling of dice), combined with advanced mathematical techniques, to design more efficient ways to learn about the noise in quantum computers, so that we can have better quantum computers, faster.

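    One standard way randomness is used to learn about noise, in the spirit of the project above, is randomized benchmarking: survival probability under random gate sequences decays exponentially with depth, and fitting the decay yields an average error rate. The sketch below uses simulated data and is my illustration, not the project's protocol:

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(3)

        def decay(m, A, p, B):
            return A * p ** m + B  # survival probability after m random gates

        true_A, true_p, true_B = 0.45, 0.985, 0.5  # p encodes the per-gate noise
        depths = np.arange(1, 200, 10)
        # Simulated survival frequencies with shot noise (1000 shots per depth).
        data = rng.binomial(1000, decay(depths, true_A, true_p, true_B)) / 1000

        (A, p, B), _ = curve_fit(decay, depths, data, p0=[0.5, 0.99, 0.5])
        avg_error = (1 - p) / 2  # average error rate for a single qubit
        print(f"fitted p = {p:.4f}, average gate error ~ {avg_error:.4f}")
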
  • Funder: Netherlands Organisation for Scientific Research (NWO) Project Code: 617.001.651

    Bayesian inference can be inconsistent when models are wrong yet useful. We recently showed this for an important practical setting, exhibiting dismal behavior of Bayesian model averaging (with standard priors) in standard linear regression under seemingly mild misspecification. The example extends to Bayesian ridge regression and the Bayesian Lasso. We can remedy the problem by equipping Bayesian inference with a learning rate that regulates the relative importance of data and prior. With a small enough learning rate, Bayes continues to perform well under misspecification, but the problem is how to learn the learning rate. We recently proposed the Safe Bayesian algorithm, which provably learns the optimal rate for bounded loss functions and has excellent practical performance. The optimal target learning rate turns out to be determined by the stochastic mixability condition, a measure of the easiness of the learning problem at hand. This proposal aims to further develop mixability and establish convergence results for SafeBayes, with rates determined by the underlying mixability level, for unbounded loss functions and fat-tailed distributions, as well as to develop fast implementations of SafeBayes and related algorithms. Stochastic mixability generalizes (a) mixability, exp-concavity and strong convexity, which determine learning speed in sequential prediction under nonstochastic, adversarial settings, and (b) the Tsybakov and Bernstein conditions in statistical learning, which determine optimal convergence rates in, e.g., classification problems. Unlike these conditions, stochastic mixability is one-sided, making it weaker for unbounded losses. Perhaps surprisingly, stochastic mixability is also the right tool to study Bayesian inference under misspecification with a variable learning rate.

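    The learning rate mentioned above enters as an exponent on the likelihood. Here is a toy grid sketch of such a generalized ("tempered") Bayesian update (my illustration; the actual SafeBayes algorithm additionally learns eta from the data):

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(4)
        data = rng.normal(0.5, 1.0, size=50)   # observations; unit-variance model

        thetas = np.linspace(-3, 3, 601)       # grid over the unknown mean
        dtheta = thetas[1] - thetas[0]
        log_prior = norm.logpdf(thetas, 0, 1)  # standard normal prior

        def tempered_posterior(eta):
            loglik = norm.logpdf(data[:, None], thetas[None, :], 1.0).sum(axis=0)
            log_post = log_prior + eta * loglik  # likelihood tempered by eta
            post = np.exp(log_post - log_post.max())
            return post / (post.sum() * dtheta)

        for eta in (1.0, 0.5, 0.1):
            post = tempered_posterior(eta)
            mean = (thetas * post).sum() * dtheta
            print(f"eta = {eta}: posterior mean = {mean:.3f}")  # shrinks toward the prior as eta falls
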