Recent research from Mike Lockwood and team has evaluated the current decline in solar activity (which began in 1985) by comparing it with the Sun’s past behaviour, deduced from cosmogenic isotopes found in ice sheets and tree trunks. The current decline appears faster than any equivalent fall in the past 1,000 years, giving an estimated probability of about 20% of a return to “grand solar minimum” conditions within 40 years.
The last such grand minimum was the Maunder minimum, a period between about 1650 and 1700 AD. During these decades, hardly any sunspots were seen on the Sun. Cosmogenic isotope production peaked because the lower level of solar activity weakened the magnetic shield in interplanetary space, allowing more galactic cosmic-ray particles to reach Earth’s atmosphere, where they generate these isotopes.
So what do we think the Sun is like in these grand minima? All the indications are that the total solar irradiance, which powers the troposphere and our weather and climate, would be smaller by about 0.1%. This is a relatively small change in radiative forcing and, indeed, climate models predict it will have only a very minor effect on global mean temperatures. The ultraviolet light from the Sun would also be lower, and by a larger factor (a few per cent), which should have detectable effects on the distribution of winds and temperatures in the stratosphere. Lastly, there would be considerable changes in the near-Earth space environment. Recent work predicts a slower, less dense solar wind carrying a weaker magnetic field. These changes may well not be fully benign as far as “space weather” is concerned, as higher fluxes of damaging cosmic-ray particles and solar mass ejection events may have the potential to do more damage. One sure effect would be a cooler, less dense uppermost atmosphere (the thermosphere), where satellites orbit; this would mean, for example, a slower rate of loss of space debris and junk by orbital decay.
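To put the 0.1% irradiance figure in context, a back-of-envelope sketch of the corresponding change in globally averaged radiative forcing can be worked through. The standard values assumed below (total solar irradiance of about 1361 W/m² and a planetary albedo of about 0.3) are not stated in the text and are used only for illustration:

```python
# Back-of-envelope forcing change from a 0.1% drop in total solar irradiance.
# Assumed standard values (not from the article):
S0 = 1361.0      # total solar irradiance at 1 AU, W/m^2 (assumed)
ALBEDO = 0.3     # planetary (Bond) albedo (assumed)

# Globally averaged absorbed solar flux: incoming sunlight spread over the
# whole sphere (factor 1/4), minus the fraction reflected back to space.
absorbed = S0 / 4 * (1 - ALBEDO)

# A 0.1% reduction in S0 scales the absorbed flux by the same fraction.
delta_forcing = 0.001 * absorbed

print(f"Absorbed solar flux: {absorbed:.1f} W/m^2")
print(f"Forcing change from a 0.1% TSI drop: {delta_forcing:.2f} W/m^2")
```

Under these assumptions the forcing change comes out at roughly a quarter of a watt per square metre, which illustrates why climate models predict only a minor effect on global mean temperature.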
One effect that interests us a lot is a consequence of the drop in solar ultraviolet. This is a matter of ongoing scientific debate, but there are reasons (including whole-atmosphere model simulations) to suspect that the reduced stratospheric heating may allow more jet-stream blocking events in winter in the eastern Atlantic, leading to colder European winters (but warmer ones elsewhere, for example, Greenland). Indeed, the term “little ice age” is often used interchangeably with “Maunder minimum”, and some misleading and spurious arguments in the press and on the internet rest on this choice of words. There is evidence for a prolonged period of somewhat lower global mean temperatures beginning around 1400 to 1500 (estimates vary) and ending sometime between 1700 and 1800. This appears to have been a global phenomenon and has been termed the ‘Little Ice Age’; note, however, that it began long before the start of the Maunder minimum and continued long after it ended. Our research strongly suggests that the higher fraction of cold winters in Europe during the Maunder minimum was a regional rather than global phenomenon.
Calling this period a ‘mini Ice Age’ is misleading, implying as it does unremitting cold throughout the Maunder minimum. This is far from the case, even for Europe, let alone the whole world. For example, consider the Central England Temperature (CET) record, which extends back to 1659. If we take the winter averages (December, January and February), we find the coldest winter on record was 1683/4 – right in the middle of the Maunder minimum. However, just two years later, and still right in the middle of the Maunder minimum, we have the sixth-mildest winter in the whole 353-year CET record! What’s more, there’s no evidence that summers in the Maunder minimum were any colder than usual. To refer to this period as an ‘ice age’ is therefore simply incorrect.
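The winter-averaging convention used above (a winter labelled 1683/4 averages December of 1683 with January and February of 1684) can be sketched as follows. The monthly temperatures here are invented placeholders, not real CET values:

```python
# Sketch of the DJF winter-average convention used for records like CET:
# the winter "1683/4" is the mean of Dec 1683, Jan 1684 and Feb 1684.
# Monthly means below are hypothetical, for illustration only.
monthly_mean = {
    (1683, 12): -1.2,   # degrees C (invented values)
    (1684, 1):  -3.0,
    (1684, 2):  -1.5,
    (1685, 12):  5.0,
    (1686, 1):   4.4,
    (1686, 2):   4.8,
}

def winter_average(dec_year, data):
    """Mean of December(dec_year), January(dec_year+1), February(dec_year+1)."""
    months = [(dec_year, 12), (dec_year + 1, 1), (dec_year + 1, 2)]
    return sum(data[m] for m in months) / 3

print(f"Winter 1683/4: {winter_average(1683, monthly_mean):.2f} C")
print(f"Winter 1685/6: {winter_average(1685, monthly_mean):.2f} C")
```

Because each winter straddles a calendar-year boundary, rankings such as “coldest winter” or “sixth mildest” are computed over these December-to-February means rather than over calendar years.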