
An opportunity for Fellows to meet and socialise whilst discovering and sharing science. Hosted by Professor John Pethica and Dame Jean Thomas.

Research Weekends at Chicheley Hall are part of the Fellows' Social Programme at the Society. These relaxed weekends allow Fellows and their guests to socialise, whilst also joining informal discussions and talks on a range of scientific topics.

Join hosts Professor John Pethica and Dame Jean Thomas for this weekend on 9 - 10 September 2017. Each day will feature a series of relaxed talks, lively discussion and opportunities to enjoy the surroundings of Chicheley Hall.

To cover some of the accommodation and catering costs we ask for a contribution of £85 from Fellows and Foreign Members. Guests of Fellows are very welcome, and we ask for a contribution from guests of £110.

To register for this Fellows' Research Weekend please complete the following survey. For further information please contact the Scientific Programmes team.



Memory: simple concept but multiple systems in the brain


Memory is a critical process of the mind – enabling us to travel in time and defining our own individuality.  It is often described as the “glue” that holds families and friends together.

The “folk psychology” sense of how it works is that we store information initially in short-term memory and then some of this information gets selectively transferred to long-term memory; remembering is then a matter of reactivating these long-term traces – having a memory of some past event.  There is a grain of truth in this informal sense of the process, but it is wrong.

First of all, there are multiple forms of memory. Of these, short-term working memory – which we use to keep track of information during conversation or to do mental arithmetic – is a relatively newly evolved form of memory which operates in parallel with long-term memory rather than as an input to it. Second, it now seems as though all attended information enters long-term memory from the outset, but most of it is lost quite quickly – in a matter of hours. This period of time does, however, create a window of opportunity for some information to be selectively stabilised or “consolidated”, to use the current neuroscience jargon. Memory consolidation itself is further subdivided into two inter-dependent processes: one operates primarily at the cellular level and likely involves the synthesis of “plasticity proteins” that stabilise the synaptic connections between neurons, while the other is more of a network process that links disparate brain regions and enables information to be added to existing networks of knowledge. On top of these complexities, there is the further distinction between “declarative” memory (which the philosopher Gilbert Ryle referred to as “knowing that”) and “procedural” memory (“knowing how”), mediated by very different neural networks in the brain and with quite distinct functions in cognition and behaviour.

At the more neurobiological level, there is a fascinating mix of anatomy (where in the brain?), physiology (what patterns of cellular activity trigger memory?), and molecular biology (what genes are transcribed? what proteins are involved? what signal transduction mechanisms?) that, at different levels of understanding, need to be brought into the picture. Only then will we have a mature understanding of the biology of memory linking the overarching psychological concept to the molecules of the mind.  



Professor Richard Morris CBE FRS

University of Edinburgh, UK

The shape of data


A year ago the Alan Turing Institute opened for business. Researchers from a variety of disciplines and career stages filled the newly equipped rooms on the second floor of the British Library. It has been an exciting journey in which the Turing has defined its space in the ever-growing data science landscape.

This presentation will give a glimpse of what is happening at the Turing and explain how topology, a branch of pure mathematics, can give new insights into data.
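The talk's actual examples are not specified here, but one core idea from topological data analysis can be sketched in a few lines: zero-dimensional persistent homology, which tracks the scale at which connected components of a point cloud merge as balls around each point grow. The function name and sample data below are illustrative, not from the talk.

```python
import math
from itertools import combinations

def zero_dim_persistence(points):
    """0-dimensional persistent homology of a point cloud:
    record the distance scale at which connected components
    merge (equivalent to single-linkage clustering)."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        # Union-find with path compression.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # All pairwise distances in increasing order; each edge that
    # joins two distinct components "kills" one component.
    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in combinations(range(n), 2)
    )
    deaths = []  # scales at which components disappear
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)
    return deaths  # n-1 values; a large gap reveals cluster structure

# Two well-separated clusters: four small merge scales, one large one.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(zero_dim_persistence(pts))
```

The large jump in the returned merge scales is the "topological signal": it shows, without choosing a threshold in advance, that the data has two components that persist over a wide range of scales.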


The Large Hadron Collider Project: Interwoven complexity of a scientific project for fundamental discoveries


The Large Hadron Collider (LHC) at CERN is designed to probe Nature as it was moments after the Big Bang, tackling questions about the origin, evolution and composition of our universe. The discovery of the Higgs boson in July 2012 by the ATLAS and CMS experiments at the LHC, though completing the particle content of the standard model (SM) of particle physics – a theory that describes our visible universe in exquisite detail – leaves many questions unanswered.

The LHC Project, comprising the accelerator and the experiments, involves highly complex instruments that draw on a wide range of engineering disciplines, many pushed to their limits. This talk will tease out these varied aspects of the Project – civil, mechanical, cryogenic, electronic/electrical and software engineering; computing and data handling; instrumentation; and data analysis – that make it one of the most complex scientific endeavours ever undertaken. All had to work in efficient unison to enable the discovery of the Higgs boson.


Halving premature death


Death in old age is inevitable, but death before old age is not. Except where HIV or political disturbances predominated, mortality rates have been decreasing for decades, helped by sanitation, health care, and social changes. If disease control keeps progressing, then within the next few decades—except where disasters or new epidemics supervene—under-50 mortality and under-70 mortality should fall to less than half of the 2010 global risks of, respectively, 15% and 36%.
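The halving targets follow directly from the 2010 figures quoted above; a trivial arithmetic check (the variable names are mine, not from the talk):

```python
# 2010 global risks of death before ages 50 and 70, as quoted: 15% and 36%.
risks_2010 = {"under_50": 0.15, "under_70": 0.36}

# "Halving premature death" means bringing each risk below half its 2010 value.
targets = {age_band: risk / 2 for age_band, risk in risks_2010.items()}
print(targets)  # under-50 below 7.5%, under-70 below 18%
```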

Non-communicable diseases (NCDs), such as cancer, stroke, heart disease and emphysema, cause about a quarter of all deaths before age 50 and four-fifths of those at ages 50 to 69. Although the global population is rising, under-70 NCD death rates are falling by about 15% per decade. The most important external factor is smoking, which in 2010 was causing about a quarter of all cancer deaths in the European Union and 30% of all cancer deaths in the US. Options for reducing this and other major causes of cancer will be reviewed.


Making sense of tissue complexity by single cell transcriptomics


The Human Cell Atlas (HCA) is an international initiative to create comprehensive reference maps of all human cells as a basis for understanding health and disease. Given the accessibility of skin, its well-defined spatial organisation, and the close clinical-scientific collaborations that already exist amongst skin researchers, this tissue represents an ideal project for the HCA. I will discuss some of the logistical and computational challenges of creating a skin atlas, as well as the potential benefits for discovery science.


Be careful what you ask the genie for: practical approaches to distilling meaning


Causality, correlation, generalisation, interpretation, stationarity, phase changes and the curse of dimensionality can all lead you astray in complex systems. These issues pose practical problems in areas such as big data, machine learning and complex-system analysis, and they affect disciplines as diverse as genomics and financial stability, and policy decisions from law to healthcare and regulation.
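Of the pitfalls listed, the curse of dimensionality is the easiest to demonstrate numerically: as dimension grows, distances between random points concentrate, so a query point's nearest and farthest neighbours become almost equidistant and distance-based reasoning loses its power. A minimal sketch, with sample sizes, seed and function name all arbitrary choices of mine:

```python
import random

def distance_concentration(dim, n_points=200, seed=0):
    """Relative contrast (max - min) / min of Euclidean distances
    from a random query point to random points in [0, 1]^dim.
    Shrinking contrast as dim grows is the curse of dimensionality."""
    rng = random.Random(seed)
    query = [rng.random() for _ in range(dim)]
    dists = []
    for _ in range(n_points):
        p = [rng.random() for _ in range(dim)]
        dists.append(sum((a - b) ** 2 for a, b in zip(query, p)) ** 0.5)
    return (max(dists) - min(dists)) / min(dists)

for dim in (2, 10, 100, 1000):
    print(dim, round(distance_concentration(dim), 3))
```

In two dimensions the nearest neighbour is many times closer than the farthest; by a thousand dimensions the two differ by only a few percent, which is why naive nearest-neighbour methods degrade badly on high-dimensional data.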