This week’s readings introduced the concepts of data friction and infrastructural globalisation, both coined by Paul Edwards (2010). He uses the study of weather and climate to contextualise these two complex theories, which complement the study of publishing.
Data friction is defined as the effort required to transform pre-existing data into new and usable sources of information. In the readings, Edwards (2010) takes previously recorded atmospheric information and contextualises it according to time and place. Through this he can attempt to publish a history of global weather patterns. The old data is thus repurposed into something potentially useful to climatologists, shedding light on a different aspect of knowledge.
There is “friction” in consolidating the two forms of data. Interestingly, this theory observes the epistemological (theory of knowledge) aspects of publishing rather than the ontological (the way things are). In order to evolve our inherited knowledge, we seek to combine and readapt it to enter new paradigms of thinking. This alludes to the idea that new information is constructed by rearranging the information we currently possess (Burr 2003), to oversimplify constructionism.
Beyond historical climate data, this theory can be applied to publishing. The data that we as journalists collect from interviews, research, and even filming is edited and rearranged in order to create new information for the public. A single sound bite gives us information about one particular aspect, but when it is rearranged amongst several others, one can create a news story or even a remixed song. I understand that data friction arises when two things don’t make sense together at first sight, but if resolved correctly, you open yourself to new knowledge.
[Figure: News applies both data friction and infrastructural globalisation]
Furthermore, Edwards (2010) discusses the idea of infrastructural globalisation. It is defined as a mechanism for obtaining data globally, which in turn helps to develop global methods of thinking (Edwards 2010). When I encountered the term “global thinking”, I immediately thought of the influence of the internet in bringing about “shared and synthesised” knowledge, which could help make sense of certain aspects of research (Castree 2010). With the internet, our ability to connect with an array of information and be exposed to different perspectives and cultures reshapes our current linear way of knowing. With networked information we develop new knowledge at a global scale.
News, for example, used to focus on stories based on their proximity, but today’s celebrity, political and sports news have become more focused on international stories. Because of the internet (John Oliver’s Last Week Tonight, for instance), I now have the ability to expand my knowledge about the state of student debt in America. This is one of the consequences of globalisation: the expansion and restructuring of local knowledge into global knowledge. With this medium we can therefore come to an agreement on what should be universally known. However, a problem could be that Western countries still hold a monopoly on shaping information. Are we really expanding our knowledge, or are we just adhering to cultural imperialism?
I believe that the emergence of diverse
news companies overcoming data friction is truly globalising knowledge.
References
Burr, V. (2003). Social Constructionism. London: Routledge.

Castree, N. (2010). How We Make Knowledge About Climate Change. American Scientist. [online] Available at: http://www.americanscientist.org/bookshelf/pub/how-we-make-knowledge-about-climate-change [Accessed 20 Oct. 2014].

Edwards, P. (2010). ‘Introduction’ in A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. Cambridge, MA: MIT Press: xiii-xvii.