Why has Obama failed to knock ‘47’ bells out of Romney?

Dr Jonathan Bell from the Department of History wonders why the race to the White House is still too close to call…

‘There are 47 percent of the people who will vote for the president no matter what,’ commented Mitt Romney in a video clip recorded at a fundraiser in September.

This was one of the most controversial moments of the current US presidential campaign so far and, despite the fact that Romney recently disavowed the statement and impressed in the first TV debate, it should have done irreparable damage to the challenger’s campaign.

With Romney’s further comments that ‘forty-seven percent of Americans pay no income tax,’ and that his role ‘is not to worry about those people. I’ll never convince them they should take personal responsibility and care for their lives,’ it seems staggering that, with under two weeks left until the election, the race to the White House is still too close to call.

So what’s Obama done wrong?

Romney’s attempt in the video to lay out in sharp relief the fundamental gulf between his political worldview and that of President Obama reveals much about the highly charged debate raging in the United States over the future direction of the country. The fact that in making such a statement Romney did not deliver a self-inflicted knockout blow demonstrates how powerful notions of individualism and freedom from state control remain, despite the economic cataclysm of 2008.

The 2012 election is a referendum on how far government can be trusted to nurture a fairer society, a debate that has been raging for at least a hundred years but which has now reached fever pitch.

In response to a similarly devastating economic depression in the 1930s, President Roosevelt set up the modern American social welfare system with the Social Security Act, and won a second term by a landslide despite failing to end the Great Depression. That welfare system expanded gradually through the 1950s and 1960s, notably with the creation of Medicare and Medicaid, and Republican Ronald Reagan was unable to curtail government programmes as much as he wanted.

Yet America’s relationship with Big Government remained uneasy and tentative, and notions of the ‘undeserving poor’ were powerful weapons in the Republican arsenal when the party took control of Congress in the 1990s. The economic crisis of recent years has only heightened conservative attacks on government, with attempts to spend stimulus money or to make the state an arbiter of health care provision cast as reckless, extravagant schemes that risk wrecking the US economy on the shoals of fiscal ruin.

But if FDR was able to win a second term despite opponents’ cries of socialism and in spite of the lingering misery of economic crisis, why is Obama having such difficulty painting Romney as uncaring and aloof, happy to cut taxes on the rich while millions of Americans suffer unemployment and job insecurity?

The problem lies in the very success of the New Deal mission over the last eighty years. Most Americans benefit from government aid through a bewildering array of programmes: tax breaks, old-age pensions, health care for the elderly and for a majority of mothers and children, and farm subsidies, to name a few.

But the majority don’t consider themselves part of Romney’s 47 percent. They don’t see their own tax breaks or health care as a handout, but they are susceptible to appeals that Democrats want to extend the social safety net to the undeserving.

Obama has failed to get across the message that Romney was wrong not because the 47 percent figure was too high, but because it was too low: corporate America also depends on government, the middle class depend on government, and the poor in many ways get the least from government.

The persistence of the myth, one Obama himself dare not quite explode, that all but undeserving shirkers gain economic security purely through their own efforts is making it easier to attack Obama than it should be, and could end with him passing on the Oval Office keys to Governor Romney.

Dr Jonathan Bell is Head of the Department of History and is interested in the political history of the United States since the Great Depression, in particular the relationship between political ideas and social change. He works mainly on the politics of the post-1945 era, looking at ways in which the theory and practice of liberalism and conservatism changed in the three decades since 1929.

The good in ‘bad science’

Dr David Stack is a Reader in History who believes strongly in the need to promote interdisciplinary understanding and public engagement with history. His research interests include the inter-relationship of ideas (especially ‘scientific’ and medical ideas) and politics in the history of Britain and beyond, and Victorian autobiographies and their authors.

David will be presenting at ‘Cultivating common ground: biology and the humanities’, an AHRC-funded workshop to be held at the University of Reading on 18 July 2012. For more information follow the link: http://reading.ac.uk/cultivating-common-ground/

For many years I was reluctant to admit to strangers that I was a historian. This made getting a haircut a frequent source of torment: “Not working today?” the barber would ask. “No,” I’d reply in the forlorn hope of closing down the conversation. “What do you do then?” [Pause] “Erm …”. At this point, if feeling particularly adventurous, I’d ‘borrow’ an occupation from a friend or relation. The tangled web that ensued, however, was never, as Walter Scott warned, worth the trouble. (I still have nightmares about the time I found myself in a shop basement staring at an electricity meter and intoning, with as much authority as I could muster: “Yes, that’s working perfectly.”)

An honest answer, however, often brought its own difficulties, especially if someone asked what the book I was working on was about. For a few years the simplest answer to this question was ‘phrenology’. Most people understood the term – “Bumps on the head, isn’t it?” – but few could fathom why anyone would spend their time researching it. “So, is there any truth in it?” I was asked on more than one occasion (and not just by barbers, who might be expected to have had a better idea than most).

Implicit in the question, I’ve always felt, was a dual assumption: that the world (and the past) is neatly divided into the ‘true’ and ‘untrue’, and that the latter is useless and thus not worth bothering with. If one holds to this view, that ‘truth’ and ‘utility’ are the only guides to what is worth studying, historians really should be embarrassed about avowing their vocation, and not only to barbers but more especially to university administrators and government ministers (in the unlikely event that they deign to ask). For one of the key tasks of historians, and historians of science more than most, is precisely the study of what is ‘wrong’ and ‘useless’. Just after the Second World War the medical historian Walter Pagel (1898–1983) published a short manifesto for this activity. He called it ‘the vindication of rubbish’.

Pagel’s approach was the antithesis of what we might call popular histories of science. In these accounts, often written by scientists themselves, knowledge marches ever onwards in a straight line, with each generation (to mix our metaphors) standing on the shoulders of earlier giants in its progress towards a more truthful understanding. Oddities, such as phrenology, take their place in such accounts, but only as party-pieces to be laughed at, confirming our own superior understanding. Pagel, by contrast, coined his phrase while exploring the relationship between alchemy and naturalism in the Renaissance. Rather than dismiss such ‘non-progressive’ elements in the history of science, Pagel argued that an understanding of alchemy and magic was essential to understanding Renaissance medicine and chemistry.

The justification for studying phrenology was slightly different. What interested me (and earlier historians) was the manner in which phrenology’s model of the brain – especially the notion of cognitive localisation – mapped onto broader social and economic changes in Victorian society, and how this related to its success and popularity. In part, what we argued was that phrenology is a good example of how the success of a science is determined less by its inherent (or ahistorical) ‘truth’ and more by its explanatory power in a definite set of social relations. Phrenology’s theory of the brain, and thus of human nature, was popular because it was compatible with the new industrial capitalist economy and the values of free market economics. A study of phrenology therefore both aids our understanding of Victorian Britain and provides a historical perspective through which to view contemporary claims about the modularity of the mind, in neuroethics and brain imaging, which have been labelled the ‘new phrenology’.

Underlying my work, and that of other historians of science, are two assumptions that, I suspect, practising scientists will find uncomfortable. The first is that the distinction between ‘science’ and ‘pseudo-science’, associated with the philosopher Karl Popper, is unhelpful and invalid. Alchemy, phrenology, magic, and a host of other oddities all deserve to be taken seriously in the history of science. The second is that the ‘realist’ assumption that we should unquestioningly accept today’s science as ‘objective’ and ‘true’ is unsustainable in the face of historical evidence. There are so many cases of theories that were empirically successful in their own day but are now believed false – one historian (Larry Laudan) listed 30 from a range of different disciplines and eras – that there seem to be good grounds for assuming the historical contingency of any scientific ‘truth’. The atomic theory of matter, after all, could go the same way as phlogiston theory (phlogiston being the non-existent substance thought to be released during combustion, before the discovery of oxygen).

These, I concede, are difficult and contentious topics that cut to the heart of both the self-image of science and the work of humanities scholars. They are not, perhaps, suitable topics for the barber’s chair but they will, I hope, form part of our discussions at the Cultivating Common Ground workshop. And if biologists prove no more receptive to my ideas than barbers, I am well practised in steering the conversation around to where I’m planning to spend my holidays.

http://www.reading.ac.uk/history/hist-home.aspx

Roman fragments and digital modelling shed light on urban spectacle

Dr Matthew Nicholls from the Department of Classics at the University of Reading is interested in the political and social history of the Romans and the way that the built environments of Rome and cities around the empire expressed their values and priorities.

Matthew has an interest in computer modelling as a way of exploring ancient structures and bringing them to life and has developed an ambitious recreation of the city of Rome in the age of Constantine.

The first Roman emperor Augustus (ruled 31 BC – AD 14) began the imperial tradition of offering the people of Rome the ‘bread and circuses’ – cheap food and entertainment on a lavish scale – that came to characterise life in the city.

Augustus, as one ancient commentator tells us, “surpassed all his predecessors in the number, variety and splendour of his games”, often using new or restored buildings within the city as a setting and boasting (not without justification) that he had “found Rome a city of brick, and left it a city of marble”, a tangible element of his vision for cultural, political and moral renewal of the Roman state.

[Image: digital model of the Theatre of Marcellus]

After more than a decade in power Augustus used this renewed city as the setting for a particularly splendid set of games, which combined high literary culture and deliberately archaising religious ceremonial with popular gestures, including the provision of plays and chariot races. These games were intended to celebrate a renewed epoch of Roman prosperity and divine favour, a saeculum, and are therefore known in English (despite their religious function) as the Secular Games of 17 BC.

We know about these games from a variety of sources – ancient authors mention them, the special hymn that the poet Horace wrote at the emperor’s behest survives, and several of the buildings or areas used for the events are known to archaeologists. We also have an inscription, found in fragments that had been reset into a medieval wall, which lays out the daily programme for the games as devised by the priestly board in charge, with details of the venue, date and time for each event.

Combining the text of this inscription with what we know about Augustus’ building programme in Rome lets us see how he used his regenerated city as a backdrop for these games, reinforcing the message of renewal and prosperity with a splendid vista of new and restored buildings. In particular, the daily programme of supplementary entertainments, put on as crowd-pleasers for a week after the main religious sacrifices had been concluded, led crowds of spectators in their tens of thousands from open ground on the edge of town past a series of buildings – temples, porticoes, theatres, squares, and parkland – that had received special attention in the previous decade, showing off a thoroughly regenerated quarter of the city.

The modern Olympic Games, with their simultaneous functions as national showcases, catalysts for urban regeneration, and cultural spectaculars, offer a good comparison.

Moreover, the detailed timetable set out in the inscription allows us to investigate how the Games’ planners took particular advantage of the summer sun. The Games were held in early June, with events starting (as was common in the ancient world) shortly after sunrise. By the time spectators arrived at the morning’s final venue, Augustus’ own new Theatre of Marcellus, it would have been around a quarter past eight in the morning on a modern clock. The Theatre of Marcellus is, uniquely for theatre buildings in Rome, orientated so that the stage points north-east; the other venues for the Games point west. With help from a UROP undergraduate research assistant, Ed Howkins, I used the digital model of ancient Rome that I have been building as a research and teaching tool to investigate the lighting conditions in the Theatre of Marcellus at the relevant date and time.

It turns out that the stage would have been perfectly illuminated for exactly the period in which the plays were held there, with the sun shining on the actors rather than into the faces of the spectators. It seems that Augustus’ theatre architects had created a venue intended especially for morning performances, and that his games planners responded to this by ensuring that each day’s entertainment culminated in the splendid new theatre at just the point when it was most perfectly illuminated – a subtle reminder that the reign of Augustus and the continued prosperity of Rome were, as the prayers for the Games implied, cosmically ordained and favoured by heaven.
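The article does not describe how the model handles sunlight, but the underlying sun-angle question can be reproduced independently. The sketch below is a minimal, illustrative Python calculation, not the project’s actual code: it uses standard low-precision solar-position formulas to estimate where the sun stands over Rome (taken here as roughly 41.9°N, 12.5°E) in early June at about 08:15 local solar time. The coordinates, the modern calendar date and the exact time are my own assumptions for illustration; the sun’s daily path has shifted only slightly over two millennia, so a modern date is a reasonable stand-in.

```python
import math
from datetime import datetime, timezone

def solar_position(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation and azimuth (degrees) for a UTC datetime,
    using standard low-precision formulas (accurate to within about a degree,
    which is plenty for a rough sight-line check)."""
    doy = when_utc.timetuple().tm_yday
    hour = when_utc.hour + when_utc.minute / 60 + when_utc.second / 3600

    # Fractional year in radians
    gamma = 2 * math.pi / 365 * (doy - 1 + (hour - 12) / 24)

    # Equation of time (minutes) and solar declination (radians)
    eqtime = 229.18 * (0.000075 + 0.001868 * math.cos(gamma)
                       - 0.032077 * math.sin(gamma)
                       - 0.014615 * math.cos(2 * gamma)
                       - 0.040849 * math.sin(2 * gamma))
    decl = (0.006918 - 0.399912 * math.cos(gamma) + 0.070257 * math.sin(gamma)
            - 0.006758 * math.cos(2 * gamma) + 0.000907 * math.sin(2 * gamma)
            - 0.002697 * math.cos(3 * gamma) + 0.00148 * math.sin(3 * gamma))

    # True solar time (minutes) and hour angle (radians)
    true_solar_minutes = hour * 60 + eqtime + 4 * lon_deg
    hour_angle = math.radians(true_solar_minutes / 4 - 180)

    lat = math.radians(lat_deg)
    cos_zenith = (math.sin(lat) * math.sin(decl)
                  + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    elevation = 90 - math.degrees(math.acos(max(-1.0, min(1.0, cos_zenith))))

    # Azimuth measured clockwise from true north
    azimuth = (math.degrees(math.atan2(
        math.sin(hour_angle),
        math.cos(hour_angle) * math.sin(lat) - math.tan(decl) * math.cos(lat)))
        + 180) % 360
    return elevation, azimuth

# Rome lies at roughly 41.9 N, 12.5 E, so 08:15 local solar time is about 07:25 UTC.
# The year is a modern placeholder; only the date and time of day matter here.
elevation, azimuth = solar_position(41.9, 12.5,
                                    datetime(2023, 6, 3, 7, 25, tzinfo=timezone.utc))
print(f"Sun elevation {elevation:.0f} deg, azimuth {azimuth:.0f} deg from north")
# Prints an elevation of about 40 degrees and an azimuth just south of due east:
# a moderately low morning sun from the east, which can then be cast onto the
# 3D geometry of the theatre to see whether it falls on the stage or the seating.
```

A simple calculation like this gives only the direction of the sunlight; working out what it actually illuminates – stage, scaenae frons or spectators – is where a full 3D reconstruction of the building and its surroundings, of the kind described above, becomes essential.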

http://www.reading.ac.uk/classics/