Research activities (August 2018)

General note: descriptions of research have a tendency to age quickly. For an up-to-date view it is better to look at my Google Scholar page or my papers on arXiv. Earlier projects, no longer active by the cutoff date of August 2018, are also better found there.

(Classical) nonequilibrium statistical physics

Classical nonequilibrium statistical physics has seen a major transformation in the last two decades with the discovery of exact results known as fluctuation relations. The fundamental equality underlying these relations is that the log-ratio of path probabilities in a forward and a time-reversed process equals the entropy production (the entropy increase in the environment). By the Clausius equality, entropy production is in turn proportional to the heat exchanged between a system and a bath. I have worked on optimizing the entropy production functional (papers from around 2012), on the overdamped limit of the same functional (slightly later), and more recently on strong-coupling issues where the system dynamics is not described by a Markov process. Main publications in this area:
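The fundamental equality can be illustrated concretely. For a Markov chain, the entropy production along a path is the log-ratio of forward and time-reversed path probabilities, and summing over all paths gives the integral fluctuation theorem ⟨exp(-ΔS)⟩ = 1 exactly. A minimal sketch (with a made-up three-state transition matrix, not from any of the papers) checks this by exhaustive enumeration:

```python
import itertools
import math

import numpy as np

# Made-up three-state discrete-time Markov chain (illustrative values only).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.1, 0.5]])

# Stationary distribution: left eigenvector of P with eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
pi /= pi.sum()

T = 4  # number of jumps
total = 0.0
for path in itertools.product(range(3), repeat=T + 1):
    # Forward path probability, starting from the stationary distribution.
    p_fwd = pi[path[0]] * math.prod(P[a, b] for a, b in zip(path, path[1:]))
    # Time-reversed path probability (same chain run on the reversed path).
    rev = path[::-1]
    p_rev = pi[rev[0]] * math.prod(P[a, b] for a, b in zip(rev, rev[1:]))
    dS = math.log(p_fwd / p_rev)      # entropy production of this path
    total += p_fwd * math.exp(-dS)    # contribution to <exp(-DeltaS)>

print(round(total, 10))  # -> 1.0 (integral fluctuation theorem)
```

The identity holds because exp(-ΔS) is exactly the ratio of reversed to forward path weight, so the average telescopes to the normalization of the reversed path measure.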

Open quantum systems

Motivated by fluctuation relations I have worked on computing the energy change of a bath of quantum harmonic oscillators interacting with a general quantum system. These calculations can be done by extending the Feynman-Vernon method, invented to compute the change of the density matrix of an open quantum system using path integrals. The energy exchange with the bath is then described by new functionals, similar to but not identical to the Feynman-Vernon kernels. This research was started together with Ralf Eichhorn, with whom I computed the expected energy change in the environment. As was found later, structurally similar results also hold for the full generating function of the energy change. A conceptual point was clarified in the paper On Work and Heat in Time-Dependent Strong Coupling (this paper, listed above, is however mostly about the classical situation). As a spin-off I also wrote a paper on errors generated in a quantum computational device interacting with a bosonic bath. Applications to heat in environments interacting with a system of qubits are (at the time of writing) work in progress. Main publications in this area:
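For orientation, the standard Feynman-Vernon influence functional for a system coordinate coupled linearly to a harmonic bath has, schematically (prefactors and sign conventions vary between references; this is only a sketch, with $J(\omega)$ the bath spectral density and $\beta$ the inverse temperature), the double-time-integral form

$$
\mathcal{F}[x,x'] \;=\; \exp\!\left(-\frac{1}{\hbar}\int_0^t \! ds \int_0^s \! du\; \big[x(s)-x'(s)\big]\Big\{ k_R(s-u)\,\big[x(u)-x'(u)\big] \;+\; i\,k_I(s-u)\,\big[x(u)+x'(u)\big] \Big\}\right),
$$

$$
k_R(\tau) = \int_0^\infty \! d\omega\, J(\omega)\,\coth\!\Big(\tfrac{\beta\hbar\omega}{2}\Big)\cos(\omega\tau), \qquad
k_I(\tau) = -\int_0^\infty \! d\omega\, J(\omega)\,\sin(\omega\tau).
$$

The energy-exchange functionals mentioned above are of a structurally similar double-time-integral type, but with kernels different from $k_R$ and $k_I$.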

Various issues motivated by dynamics of disordered systems

Spin glass theory concerns the Gibbs-Boltzmann (equilibrium) measure, but there are also interesting issues around the "dynamics of spin glasses". Such processes have applications (maybe the most interesting ones) outside physics. For instance, a local search to optimize some combination of constraints can be considered a dynamics on spins representing the discrete variables. For simulated annealing the analogy is quite direct; indeed this famous and very basic algorithm was directly motivated by physics. For other interesting algorithms, such as the ones investigated in the first listed paper below, the analogy is instead with physical processes driven out of equilibrium. The papers I like best in this direction were attempts to generalize the cavity method (or Belief Propagation) to dynamics. They hence apply (when they apply) to processes on locally tree-like graphs. This work was done with my postdoc Hamed Mahmoudi (in Helsinki), my PhD student Gino Del Ferraro (at KTH), and most recently in collaboration with Roberto Mulet and others from the University of Havana (Cuba).
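The equilibrium cavity recursion that these works generalize to dynamics can be sketched on a toy example (couplings and fields below are made-up illustrative values). For a pairwise Ising model, Belief Propagation passes cavity fields along edges and is exact on a tree, which a brute-force enumeration confirms:

```python
import itertools

import numpy as np

# Small tree: edges 0-1, 1-2, 1-3, with illustrative couplings and fields.
edges = [(0, 1), (1, 2), (1, 3)]
N = 4
J = {e: 0.5 for e in edges}
h = np.array([0.1, -0.2, 0.3, 0.0])

neighbors = {i: [] for i in range(N)}
for a, b in edges:
    neighbors[a].append(b)
    neighbors[b].append(a)

def Jij(i, j):
    return J[(i, j)] if (i, j) in J else J[(j, i)]

# Cavity messages m[(i, j)]: effective field that site i exerts on neighbor j.
m = {(i, j): 0.0 for i in range(N) for j in neighbors[i]}
for _ in range(50):  # iterate the cavity recursion to convergence
    for (i, j) in list(m):
        cavity = h[i] + sum(m[(k, i)] for k in neighbors[i] if k != j)
        m[(i, j)] = np.arctanh(np.tanh(Jij(i, j)) * np.tanh(cavity))

# BP magnetizations: tanh of the full local field at each site.
bp_mag = np.tanh(h + np.array(
    [sum(m[(k, i)] for k in neighbors[i]) for i in range(N)]))

# Exact magnetizations by brute-force enumeration, for comparison.
Z, mag = 0.0, np.zeros(N)
for s in itertools.product([-1, 1], repeat=N):
    s = np.array(s, dtype=float)
    E = -sum(Jij(a, b) * s[a] * s[b] for a, b in edges) - h @ s
    w = np.exp(-E)
    Z += w
    mag += w * s
mag /= Z

# On a tree the two agree to machine precision.
print(float(np.max(np.abs(bp_mag - mag))))
```

The dynamical generalizations replace these static cavity fields by cavity distributions over trajectories, which is what makes the tree-like (locally loop-free) structure essential.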

Direct coupling analysis (DCA)

The last ten years have seen a breakthrough in our ability to infer interactions from biological sequence data. Direct Coupling Analysis (DCA) has allowed, for the first time, reliable inference of spatially proximate residues in protein structures from a sufficiently large set of sequences of homologous proteins sharing the same structure. Combining this contact prediction with computational methods developed over many decades, ab initio protein structure prediction has finally become a reality for many proteins for which no structure is known (but for which predictions can be tested and assessed afterwards). DCA in a nutshell means two things. The first step is to take n samples, each of them of L symbols, and learn a generative model of the Ising or Potts type. For the protein problem L is the length of the protein, and n is the number of homologous (similar) proteins. In practice the data table often has to be padded, since some subsequences appear only in a subset of the samples. The second step is to represent the data by a small subset of the inferred parameters. In the protein problem these retained parameters are leveraged to predict which pairs of residues are spatially close. For me, the work on DCA (which we first called "inverse Ising problems") can be seen as a spin-off from the work on spin glasses, which gradually took on a life of its own, and then led to other projects in computational inference and data science (not referenced here). My group contributed the "pseudolikelihood maximization" (PLM) approach to DCA. Both the method and the software (plmDCA) have become standards. I was also the first to use DCA to identify epistatic contributions to fitness from whole-genome data (in bacteria, in collaboration with J. Corander, now at the University of Oslo). Main publications in this area:
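The PLM idea can be sketched for the Ising case (a toy example with made-up couplings and synthetic samples; the actual plmDCA handles Potts variables with gap symbols, regularization, and sequence reweighting). Each spin's conditional distribution given the others is a logistic model in the couplings, and each row of the coupling matrix is fitted independently by maximizing the conditional log-likelihood:

```python
import itertools

import numpy as np

rng = np.random.default_rng(0)

# Toy Ising model with a few made-up couplings (zero field, for simplicity).
N = 5
J_true = np.zeros((N, N))
for (a, b), v in {(0, 1): 0.8, (1, 2): -0.6, (3, 4): 0.7}.items():
    J_true[a, b] = J_true[b, a] = v

# Draw exact samples from the Boltzmann distribution (N is small enough
# to enumerate all 2^N configurations).
confs = np.array(list(itertools.product([-1, 1], repeat=N)))
w = np.exp(0.5 * np.einsum('ki,ij,kj->k', confs, J_true, confs))
data = confs[rng.choice(len(confs), size=20000, p=w / w.sum())]

# PLM: maximize, for each spin i, the conditional log-likelihood
#   log P(s_i | s_rest) = s_i * H_i - log(2 cosh H_i),  H_i = sum_j J_ij s_j,
# by gradient ascent; rows of J are estimated independently.
J = np.zeros((N, N))
for _ in range(1000):
    fields = data @ J.T                          # local fields H_i per sample
    grad = (data * (1 - np.tanh(data * fields))).T @ data / len(data)
    np.fill_diagonal(grad, 0.0)                  # no self-couplings
    J += 0.1 * grad

J_sym = 0.5 * (J + J.T)  # symmetrize the two independent row estimates
print(J_sym[0, 1])       # close to the true coupling 0.8
```

The second DCA step then ranks pairs by the magnitude of the inferred couplings (for Potts models typically via a norm of each coupling block) and predicts the top-ranked pairs as spatial contacts.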
