‘Public engagement’ and ‘research communities’ – these are the new buzzwords from the Arts and Humanities Research Council, one of the largest funding bodies for historical research in the UK. Their message is that the gulf between the ivory tower of academic research in higher education institutions and the enthusiastic public communities interested in historical research must be reduced. It’s an idea that has been at the forefront of university scholarship within the humanities for some time now, and it’s unsurprising considering it’s the public who fund historical investigation. Having provided the opportunity to conduct research, it seems natural that the same public would like us to deliver its results into their hands. This in itself is not an unreasonable request, though it is one that has led us to the general assumption that the only good history is ‘usable history’.
If there was one thing that the Making Big Data Human conference made clear, it was that ‘Big Data’, and indeed digital methodologies in general, provide some very exciting opportunities to advance historical research. From the ambitious and wide-ranging National Archives’ Traces Through Time project, which aims to create a generic method for tracing historical individuals across enormous datasets, through to the more specific but equally exciting Casebooks Project, the conference participants were treated to a feast of ideas about how historical methods are adapting to the changing nature of data in a digital age.
But what exactly is ‘big data’, and what did the Doing History in Public team have in mind when we decided to explore how we could make it ‘human’? The basic definition of ‘big data’ is ‘extremely large data sets that may be analysed computationally’. For historians this might, as Jane Winters demonstrated in her keynote lecture, be a case of using the archived web as an historical source, or of exploring parliamentary proceedings from three different countries over a period of more than 200 years.