X = Y

From Mondothèque

 
{{NOTOC}}

<span class="name">[[author::Dick Reckard]]</span>

{{Revision}}
  
==0. Innovation of the same==
  
<div class="book">{{RT|morphing_sameness}}<section begin=morphing_sameness />
The PR imagery produced by and around the Mundaneum (disambiguation: the institution in Mons) often suggests, through a series of 'samenesses', an essential continuity between Otlet's endeavour and Internet-related products and services, in particular Google's. <section end=morphing_sameness />A good example is a scene from the video "From industrial heartland to the Internet age", published by the Mundaneum in 2014, where the drawers of the Mundaneum (disambiguation: Otlet's Utopia) morph into the servers of one of Google's data centres.</div>
 
  
This approach is not limited to images: a recurring discourse that shapes some of the exhibitions taking place in the Mundaneum maintains that the dream of the Belgian utopian has been kept alive in the development of internetworked communications, and currently finds its spiritual successor in the products and services of Google.
Even though there are many connections and similarities between the two endeavours, one has to acknowledge that Otlet was an internationalist, a socialist, and a utopian, that his projects were not profit-oriented, and most importantly, that he was living in the temporal and cultural context of modernism at the beginning of the 20<sup>th</sup> century.
The constructed identities and continuities detach Otlet and the Mundaneum from a specific historical frame, ignoring the different scientific, social and political milieus involved. These narratives thus exclude the discordant or disturbing elements that inevitably emerge when considering such a complex figure in its entirety.
  
This is not surprising, given the parties involved in the discourse: these types of instrumental identities and differences suit the rhetorical tone of Silicon Valley.
Newly launched IT products, for example, are often described as groundbreaking, innovative and ''different from anything seen before''. In other situations, the same products can be advertised as ''exactly the same'' as something else that already exists<ref>A good account of such phenomenon is described by David Golumbia. http://www.uncomputing.org/?p=221</ref>. While novelty and difference surprise and amaze, sameness reassures and comforts. For example, Google Glass was marketed as revolutionary and innovative, but when it was attacked for its blatant privacy issues, some defended it as just a camera and a phone joined together. The sameness-difference duo fulfils a clear function: on the one hand, it suggests that technological advancements may dramatically alter the way we live, and that we should be ready to give up our old-fashioned ideas about life and culture for the sake of innovation. On the other hand, it proposes that we should not worry about change, since society has always evolved through disruptions, undoubtedly for the better. For each questionable groundbreaking new invention, there is a previous one with the same ideal, potentially with just as many critics... Great minds think alike, after all.
This sort of a-historical attitude pervades techno-capitalist milieus, creating a cartoonesque view of the past, punctuated by great men and great inventions, a sort of technological variant of Carlyle's [https://en.wikipedia.org/wiki/Great_Man_theory ''Great Man Theory'']. In this view, the Internet becomes the invention of a few father/genius figures, rather than the result of a long and complex interaction of diverging efforts and interests of academics, entrepreneurs and national governments. This instrumental reading of the past is largely consistent with the theoretical ground on which the ''Californian Ideology''<ref>As described in the classic text looking at the ideological ground of Silicon Valley culture. http://www.hrc.wmin.ac.uk/theory-californianideology-main.html</ref> is based, in which the conception of history is pervaded by various strains of technological determinism (from Marshall McLuhan to Alvin Toffler<ref>For an account of Toffler's determinism, see http://www.ukm.my/ijit/IJIT%20Vol%201%202012/7wan%20fariza.pdf .</ref>) and capitalist individualism (in generic neoliberal terms, up to the fervent objectivism of Ayn Rand).
  
The appropriation of Paul Otlet's figure as Google's grandfather is a historical simplification, but the samenesses in this tale are not without foundation. Many concepts and ideals of documentation theories have reappeared in cybernetics and information theory, and are therefore present in the narrative of many IT corporations, as in Mountain View's case. With the intention of restoring some historical complexity, it might be more interesting to play the ''exactly the same'' game ourselves, rather than try to dispel the advertised continuum of the ''Google on paper''. By choosing to focus on other types of analogies in the story, we can perhaps contribute a narrative that is more respectful of the complexity of the past, and more telling about the problems of the present.
  
What follows are three such ''comparisons'', which focus on three aspects of continuity between the documentation theories and archival experiments Otlet was involved in, and the cybernetic theories and practices of which Google's capitalist enterprise is an exponent.
The first one looks at the conditions of workers in information infrastructures, who are fundamental for these systems to work, but are often forgotten or displaced.
Next comes an account of the elements of distribution and control that appear both in the idea of a ''Reseau Mundaneum'' and in the contemporary functioning of data centres, and of the resulting interaction with other types of infrastructures.
Finally, there is a brief analysis of the two approaches to the 'organization of the world's knowledge', which examines their regimes of truth and the issues that come with them.
Hopefully these three short pieces can provide some additional ingredients for adulterating the sterile recipe of the Google-Otlet sameness.
  
==a. Do androids dream of mechanical turks?==
 
  
{{RT|laboratorium}}In a drawing titled ''Laboratorium Mundaneum'', Paul Otlet depicted his project as a massive factory, processing books and other documents into end products, rolled out by a UDC locomotive. <section begin=laboratorium />In fact, just like a factory, the Mundaneum was dependent on the bureaucratic and logistic modes of organization of labour developed for industrial production. Looking at this and other written and drawn sketches, one might ask: who made up the workforce of these factories? <section end=laboratorium />
  
In his ''Traité de Documentation'', Otlet describes extensively the thinking machines and the tasks of intellectual work into which the ''Fordist chain'' of documentation is broken down. In the subsection dedicated to the people who would undertake the work, though, the only role described at length is that of the ''Bibliothécaire''. In a long chapter that explains what education the librarian should follow, which characteristics are required, and so on, he briefly mentions the existence of “Bibliotecaire-adjoints, rédacteurs, copistes, gens de service”<ref>Otlet, Paul. Traité de documentation: le livre sur le livre, théorie et pratique. Editiones Mundaneum, 1934: 393-394.</ref>. There seems to be no further description or depiction of the staff that would write, distribute and search the millions of index cards in order to keep the archive running, an impossible task for the Bibliothécaire alone.
  
{{RT|mondocrew}}A photograph from around 1930, taken in the ''Palais Mondial'', where we see Paul Otlet together with the rest of the équipe, gives us a better answer. In this beautiful group picture we notice that the workforce that kept the archival machine running was made up of women, but we do not know much about them.<section begin=mondocrew /> As in telephone switching systems or early software development<ref>http://gender.stanford.edu/news/2011/researcher-reveals-how-%E2%80%9Ccomputer-geeks%E2%80%9D-replaced-%E2%80%9Ccomputergirls%E2%80%9D</ref>, gender stereotypes and discrimination led to the appointment of female workers for repetitive tasks that required specific knowledge and precision.<section end=mondocrew />
  
{{RT|scanplant}}<section begin=scanplant />According to the ideal image described in the ''Traité'', all the tasks of collecting, translating and distributing should be completely automatic, seemingly without the need for human intervention. However, the Mundaneum hired dozens of women to perform these tasks. This human-run version of the system was not considered worth mentioning, as if it were a temporary in-between phase to be overcome as soon as possible, something staining the project with its vulgarity.<section end=scanplant />
  
Notwithstanding the incredible advancement of information technologies and the automation of innumerable tasks in collecting, processing and distributing information, we can observe the same pattern today. All the automatic repetitive tasks that ''technology'' should be able to do for us still rely, one way or another, on human labour. And unlike the industrial worker, who obtained recognition through political movements and struggles, the role of many cognitive workers is still hidden or under-represented.
Computational linguistics, neural networks, optical character recognition: all these amazing machinic operations are still based on humans performing huge amounts of repetitive intellectual tasks from which software can ''learn'', or which software cannot do with the same ''efficiency''. Automation didn't really free us from labour; it just shifted the where, when and who of labour.<ref>This process has been named “heteromation”; for a more thorough analysis see: Ekbia, Hamid, and Bonnie Nardi. “Heteromation and Its (dis)contents: The Invisible Division of Labor between Humans and Machines.” First Monday 19, no. 6 (May 23, 2014). http://firstmonday.org/ojs/index.php/fm/article/view/5331</ref> Mechanical turks, content verifiers, annotators of all kinds... the software we use requires a multitude of tasks which are invisible to us, but are still accomplished by humans. Who are they? When possible, work is outsourced to foreign English-speaking countries with lower wages, like India. In the western world it follows the usual pattern: female, lower-income, ethnic minorities.
{{RT|visible_hands}}An interesting case of heteromated labour are the so-called Scanops<ref>The name Scanops was first introduced by artist Andrew Norman Wilson, who found out about this category of workers during his artistic residency at Google in Mountain View. See http://www.andrewnormanwilson.com/WorkersGoogleplex.html .</ref>, a set of Google workers with a different type of badge, who are isolated in a section of the Mountain View complex, secluded from the rest of the workers through strict access permissions and fixed time schedules. Their work consists of scanning the pages of printed books for the Google Books database, a task that is still more convenient to do by hand (especially in the case of rare or fragile books). The workers are mostly women and ethnic minorities, and there is no mention of them on the Google Books website or elsewhere; in fact, the whole scanning process is kept secret. Even though the secrecy surrounding this type of labour can be justified by the need to protect trade secrets, it again conceals the human element in machine work. This is even more obvious when compared to other types of human workers on the project, such as designers and programmers, who are celebrated for their creativity and ingenuity.
<section begin=visible_hands />However, here and there, evidence of the workforce shows up in the results of their labour. Photos of Google Books employees' hands sometimes mistakenly end up in the digital version of a book online<ref>As collected by Krissy Wilson on http://theartofgooglebooks.tumblr.com .</ref>.<section end=visible_hands />
Whether the tendency to hide the human presence is due to the unfulfilled wish for total automation, to avoid the bad publicity of low wages and precarious work, or to keep an aura of mystery around machines, remains unclear, both in the case of Google Books and the ''Palais Mondial''. {{RT|raw_work}}<section begin=raw_work />Still, it is reassuring to know that the products hold traces of the work, that even with the progressive removal of human signs in automated processes, the workers' presence never disappears completely. This presence is proof of the materiality of information production, and becomes a sign of the economies and paradigms of efficiency and profitability that are involved.<section end=raw_work />
  
==b. The (data) centre and the periphery==
  
In 2013, while Prime Minister Di Rupo was celebrating the beginning of the second phase of constructing the Saint Ghislain data centre, a few hundred kilometres away a very similar situation started to unfold. In the municipality of Eemsmond, in the Dutch province of Groningen, the local Groningen Sea Ports and NOM development were rumoured to have plans with [[LA MÉGA-ENTREPRISE|another]] company, code-named ''Saturn'', to build a data centre in the small port of Eemshaven.
{{RT|harm_loves_google}}<section begin=harm_loves_google />A few months later, when it was revealed that Google was behind ''Saturn'', Harm Post, director of Groningen Sea Ports, commented: "Ten years ago Eemshaven became the laughing stock of ports and industrial development in the Netherlands, a planning failure of the previous century. And now Google is building a very large data centre here, which is 'pure advertisement' for Eemshaven and the data port."<ref>http://www.rtvnoord.nl/nieuws/139016/Keerpunt-in-de-geschiedenis-van-de-Eemshaven .</ref> Further details on tax cuts were not disclosed and once finished, the data centre will provide at most 150 jobs in the region.<section end=harm_loves_google />
  
Yet another territory fortunate enough to be chosen by Google, just like Mons. But what are the selection criteria? For one thing, data centres need to interact with existing infrastructures and flows of various types. Technically speaking, there are three prerequisites: being near a substantial source of electrical power (the finished installation will consume twice as much as the whole city of Groningen); being near a source of clean water, for the massive cooling demands; and being near Internet infrastructure that can assure adequate connectivity. There is also a whole set of non-technical elements, which we can sum up as the social, economic and political ''climate'', and which proved favourable both in Mons and in Eemshaven.
  
The push behind constructing new sites in new locations, rather than expanding existing ones, is partly due to the rapid growth in importance of ''Software as a Service'', so-called cloud computing, which is the rental of computational power from a central provider.
With the rise of the ''SaaS'' paradigm the geographical and topological placement of data centres becomes of strategic importance to achieve lower latencies and more stable service. For this reason, Google has in the last 10 years been pursuing a policy of end-to-end connection between its facilities and user interfaces. This includes buying leftover fibre networks<ref>http://www.cnet.com/news/google-wants-dark-fiber/ .</ref>, entering the business of underwater sea cables<ref>http://spectrum.ieee.org/tech-talk/telecom/internet/google-new-brazil-us-internet-cable .</ref> and building new data centres, including the ones in Mons and Eemshaven.
  
The spread of data centres around the world, along the main network cables across continents, represents a new phase in the diagram of the Internet. This should not be confused with the idea of decentralization that was a cornerstone value in the early stages of interconnected networks.<ref>See Baran, Paul. “On Distributed Communications.” RAND Research Memorandum RM-3420, 1964. http://www.rand.org/pubs/research_memoranda/RM3420.html .</ref> During the rapid development of the Internet and the Web, the new tenets of immediacy, unlimited storage and exponential growth led to the centralization of content in increasingly large server farms. Paradoxically, it is now the growing centralization of all kinds of operations in specific buildings that is fostering their distribution.
The tension between centralization and distribution, and the dependence on neighbouring infrastructures such as the electrical grid, are not exclusive features of contemporary data storage and networking models. Again, similarities emerge from the history of the Mundaneum, illustrating how closely these issues relate to the logistic organization of production first implemented during the industrial revolution, and theorized within modernism.
  
Centralization was seen by Otlet as the most efficient way to organize content, especially in view of international exchange<ref>Pierce, Thomas. ''Mettre des pierres autour des idées''. Paul Otlet, de Cité Mondiale en de modernistische stedenbouw in de jaren 1930. PhD dissertation, KULeuven, 2007: 34.</ref>, which already caused problems of space back then: the Mundaneum archive counted 16 million entries at its peak, occupying around 150 rooms. The cumbersome footprint, and the growing difficulty of finding stable locations for it, contributed to the conviction that the project should be included in the plans of new modernist cities. At the beginning of the 1930s, when the Mundaneum started to lose the support of the Belgian government, Otlet thought of a new site for it as part of a proposed ''Cité Mondiale'', which he tried in [[Smart cities, cities of knowledge|different locations with different approaches]].
 
  
Among various attempts, he participated in the competition for the development of the Left Bank in Antwerp, for which the most famous modernist urbanists of the time were invited to plan the development from scratch. At the time, the Left Bank was completely vacant. Otlet lobbied for the insertion of a Mundaneum in the plans, stressing how it would create hundreds of jobs for the region. He also flattered Flemish pride by insisting on how the people of Antwerp were harder working than those of Brussels, and how they would finally obtain their deserved recognition when their city was elevated to ''World City'' status.<ref>Ibid: 94-95.</ref> He partly succeeded in his propaganda: aside from his own proposal, developed in collaboration with Le Corbusier, many other participants included Otlet's Mundaneum as a key facility in their plans. {{RT|all_systems}}<section begin=all_systems />In these proposals, Otlet's archival infrastructure was shown in interaction with existing city flows such as industrial docks, factories, the railway and the newly constructed stock market.<ref>Ibid: 113-117.</ref> The modernist utopia of a planned living environment implied that methods similar to those employed for managing the flows of coal and electricity could be used for the organization of culture and knowledge.<section end=all_systems />
  
The ''Traité de Documentation'', published in 1934, includes an extended reflection on a ''Universal Network of Documentation'' that would coordinate the transfer of knowledge between different documentation centres, such as libraries or the Mundaneum<ref>Otlet, Paul. Traité de documentation: le livre sur le livre, théorie et pratique. Editiones Mundaneum, 1934.</ref>. In fact, the existing Mundaneum would simply be the first node of a wide network bound to expand to the rest of the world, the ''Reseau Mundaneum''. The nodes of this network are explicitly described in relation to "post, railways and the press, those three essential organs of modern life which function unremittingly in order to unite men, cities and nations."<ref>Otlet, Paul. Les Communications MUNDANEUM, Documentatio Universalis, doc nr. 8438</ref> In the same period, in letter exchanges with Patrick Geddes and Otto Neurath, commenting on the potential of heliographies as a way to distribute knowledge, the three imagined the ''White Link'', a network to distribute copies throughout a series of Mundaneum nodes<ref>Van Acker, Wouter. “Internationalist Utopias of Visual Education: The Graphic and Scenographic Transformation of the Universal Encyclopaedia in the Work of Paul Otlet, Patrick Geddes, and Otto Neurath.” Perspectives on Science 19, no. 1 (January 19, 2011): 68-69.</ref>. As a result, the same piece of information would be serially produced and logistically distributed, in a sort of ''moving Mundaneum'' facilitated by the railway system<ref>Ibid: 66.</ref>. No wonder that future Mundaneums were foreseen to be built next to a train station.
  
In Otlet's plans for a ''Reseau Mundaneum'' we can already detect some of the key transformations that reappear in today's data centre scenario. First of all, a drive for centralization, with the accumulation of materials that led to the monumental plans of World Cities. In parallel, the push for international exchange, resulting in a vision of a distribution network. Thirdly, the placement of the hypothetical network nodes along strategic intersections of industrial and logistic infrastructure.
  
While the plan for Antwerp was in the end rejected in favour of more traditional housing development, 80 years later the legacy of the relation between existing infrastructural flows and the logistics of documentation storage is highlighted by the data port plans in Eemshaven. Since private companies are the privileged actors in these types of projects, the circulation of information increasingly responds to the same tenets that regulate the trade of coal or electricity. The very different welcome that traditional politics reserves for Google data centres is a symptom of a new dimension of power in which information infrastructure plays a vital role. The celebrations and tax cuts that politicians lavish on these projects cannot be explained by 150 jobs or ''economic incentives'' for a depressed region alone. They also indicate how party politics is increasingly confined to the periphery of other forms of power, and therefore struggles to assure itself a strategic position.
 
  
==c. 025.45UDC; 161.225.22; 004.659GOO:004.021PAG.==
  
The Universal Decimal Classification<ref>The Decimal part in the name means that any record can be further subdivided by tenths, virtually infinitely, according to an evolving scheme of depth and specialization. For example, 1 is “Philosophy”, 16 is “Logic”, 161 is “Fundamentals of Logic”, 161.2 is “Statements”, 161.22 is “Type of Statements”, 161.225 is “Real and ideal judgements”, 161.225.2 is “Ideal Judgements” and 161.225.22 is “Statements on equality, similarity and dissimilarity”.</ref> system, developed by Paul Otlet and Henri Lafontaine on the basis of the Dewey Decimal Classification system, is still considered one of their most important realizations, as well as a cornerstone of Otlet's overall vision. Its adoption, revision and use until today demonstrate a thoughtful and successful approach to the classification of knowledge.
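The decimal subdivision described in the footnote can be sketched in a few lines of code. The following is a toy illustration only, not an actual UDC tool: the label table is limited to the example subdivision of class 1, "Philosophy", and the dot-grouping rule is simplified. It shows how each additional digit of a notation narrows the class, so that a single code unfolds into a chain of increasingly specific concepts.

```python
# Toy illustration of UDC-style decimal notation (hypothetical, not an
# official UDC implementation). Labels follow the example subdivision
# of class 1, "Philosophy".

UDC_LABELS = {
    "1": "Philosophy",
    "16": "Logic",
    "161": "Fundamentals of Logic",
    "161.2": "Statements",
    "161.22": "Type of Statements",
    "161.225": "Real and ideal judgements",
    "161.225.2": "Ideal Judgements",
    "161.225.22": "Statements on equality, similarity and dissimilarity",
}

def expand(code: str) -> list[str]:
    """Return the chain of ever-more-specific labels for a UDC-style code."""
    digits = code.replace(".", "")
    chain = []
    for i in range(1, len(digits) + 1):
        prefix = digits[:i]
        # Re-insert the dots: after the first three digits, UDC notation
        # groups digits in threes separated by points.
        if i > 3:
            dotted = ".".join(prefix[j:j + 3] for j in range(0, len(prefix), 3))
        else:
            dotted = prefix
        if dotted in UDC_LABELS:
            chain.append(f"{dotted} {UDC_LABELS[dotted]}")
    return chain

for line in expand("161.225.22"):
    print(line)
```

Run on the code from the footnote, `expand("161.225.22")` walks down from "1 Philosophy" to "161.225.22 Statements on equality, similarity and dissimilarity", making visible the hierarchy that the notation compresses into a single number.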
 
  
The UDC differs from Dewey and other bibliographic systems in that it has the potential to exceed the function of ordering alone. The complex notation system could classify phrases and thoughts in the same way as it would classify a book, going well beyond the sole function of classification and becoming a real language. One could in fact express whole sentences and statements in UDC format<ref>“The UDC and FID: A Historical Perspective.” The Library Quarterly 37, no. 3 (July 1, 1967): 268-270.</ref>. The fundamental idea behind it<ref>Described in French by the word ''dépouillement''.</ref> was that books and documentation could be broken down into their constitutive sentences and boiled down to a set of universal concepts, regulated by the decimal system. This would make it possible to express objective truths in a numerical language, fostering international exchange beyond translation and easing the work of science by regulating knowledge with numbers. We have to understand the idea in the context of the time in which it was originally conceived, a time shaped by positivism and the belief in the unhindered potential of science to obtain objective universal knowledge. Today, especially when we take into account the arbitrariness of the decimal structure, it sounds doubtful, if not preposterous.
  
Mechanical turks, content verifiers, annotators of all kinds... Which, unless it is possible to outsource to foreign english speaking countries with lower wages, like India, in the western world follow this pattern: female, lower income, ethnical minorities.
However, the linguistic-numeric element of the UDC, which makes it possible to express fundamental meanings through numbers, plays a key role in the oeuvre of Paul Otlet. In his work we learn that numerical knowledge would be the first step towards a science of combining basic sentences to produce new meaning in a systematic way. When we look at ''Monde'', Otlet's second major publication, from 1935, the continuous reference to algebraic formulas that describe how the world is composed suggests that we could at some point “solve” these equations and modify the world accordingly.<ref>Otlet, Paul. Monde, essai d’universalisme: connaissance du monde, sentiment du monde, action organisée et plan du monde. Editiones Mundaneum, 1935: XXI-XXII.</ref>
Like all the other IT giants, Google has its own share of invisible workers. One interesting case is that of the ScanOps, as artist Andrew Norman Wilson called the workforce he discovered during his artist residency at Google's Mountain View offices. http://www.andrewnormanwilson.com/WorkersGoogleplex.html
Complementary to the ''Traité de Documentation'', which described the systematic classification of knowledge, ''Monde'' laid the basis for the transformation of this knowledge into new meaning.
One set of Google workers, carrying a different type of badge, is isolated in one part of the complex, secluded from the rest of the workforce through restricted facility access and separate time schedules. Their task consists of scanning the pages of printed books to be added to the Google Books database, a job that in some cases (delicate or rare books, for example) is still more convenient to do by hand. Mostly women and mostly from ethnic minorities, these workers are kept apart from the company's glorified employees, the programmers, revealing a will to hide what the machine can't do 'yet'.
 
Traces of this workforce can still be found in the results of its work, though, as seen in another project by Andrew Norman Wilson. http://www.andrewnormanwilson.com/ScanOps.html
 
  
Otlet wasn't the first to envision an ''algebra of thought''. It has been a recurring ''topos'' in modern philosophy, under the influence of scientific positivism and in parallel with the development of mathematics and physics. Even though one could trace it back to Ramon Llull and even earlier forms of combinatorics, the first to consistently undertake this scientific and philosophical challenge was Gottfried Leibniz. The German philosopher and mathematician, a precursor of the field of symbolic logic that developed later in the 20th century, researched a method to reduce statements to minimum terms of meaning. He investigated a language which “... will be the greatest instrument of reason,” for “when there are disputes among persons, we can simply say: Let us calculate, without further ado, and see who is right”.<ref>Leibniz, Gottfried Wilhelm, The Art of Discovery 1685, Wiener: 51.</ref>
His inquiry was divided into two phases. The first, analytic one, the ''characteristica universalis'', was a universal conceptual language for expressing meanings, of which we only know that it worked with prime numbers. The second, synthetic one, the ''calculus ratiocinator'', was the algebra that would allow operations between meanings, of which there is even less evidence.
The idea of a calculus was clearly related to the infinitesimal calculus, a fundamental development that Leibniz conceived in the field of mathematics, and which Newton concurrently developed and popularized. Even though not much remains of Leibniz's work on his ''algebra of thought'', it was continued by mathematicians and logicians in the 20th century. Most famously, and curiously enough around the same time that Otlet published the ''Traité'' and ''Monde'', the logician Kurt Gödel used the same idea of a translation into prime numbers to demonstrate his incompleteness theorems.<ref>https://en.wikipedia.org/wiki/G%C3%B6del_numbering</ref> The fact that the ''characteristica universalis'' only made sense in the fields of logic and mathematics is due to the fundamental problem presented by a mathematical approach to truth beyond logical truth. While this problem was not yet evident at the time, it would emerge in the duality of language and categorization, as it did later with Otlet's UDC.
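The core of Gödel's prime-number trick can be sketched in a few lines: a sequence of symbol codes becomes a single integer by using the codes as exponents of successive primes, and the uniqueness of prime factorization guarantees that the sequence can be recovered. This is only a schematic sketch of the numbering idea; Gödel's actual coding of formulas and proofs is considerably more involved.

```python
def primes(n):
    """First n primes, by trial division (fine for a sketch)."""
    found, candidate = [], 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_number(codes):
    """Encode a sequence of positive symbol codes as
    2**c1 * 3**c2 * 5**c3 * ... (unique by prime factorization)."""
    n = 1
    for p, c in zip(primes(len(codes)), codes):
        n *= p ** c
    return n

def decode(n):
    """Recover the code sequence by dividing out successive primes."""
    codes, i = [], 0
    while n > 1:
        p = primes(i + 1)[-1]  # the (i+1)-th prime
        c = 0
        while n % p == 0:
            n, c = n // p, c + 1
        codes.append(c)
        i += 1
    return codes

g = godel_number([3, 1, 4])   # 2**3 * 3**1 * 5**4 = 15000
assert decode(g) == [3, 1, 4]
```

The whole point is that statements *about* sequences of symbols become statements about ordinary integers, which is exactly the bridge between language and arithmetic that both Leibniz and Gödel exploited.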
  
It is somehow reassuring that, when looked at with the right eyes, the final product still carries traces of the workforce involved; even with the progressive removal of human signs from technical infrastructure, this role can never be hidden completely.
The relation between the organizational and linguistic aspects of knowledge is also one of the open issues at the core of web search, which is, at first sight, less interested in objective truths. At the beginning of the Web, around the mid '90s, two main approaches to searching for information online emerged: the web directory and web crawling. Some of the first search engines, like Lycos or Yahoo!, started with a combination of the two. The web directory consisted of the human classification of websites into categories, done by an “editor”; crawling consisted of the automatic accumulation of material by following links, with various rudimentary techniques to assess the content of a website. With the exponential growth of web content, web directories were soon dropped in favour of the more efficient automatic crawling, which in turn generated so many results that quality became of key importance: quality in the sense of the assessment of a webpage's content in relation to keywords, as well as the sorting of results according to their relevance.
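The two early approaches can be contrasted in a few lines of code: a directory is a hand-made mapping from categories to sites, while a crawler mechanically follows links and accumulates pages. This is a minimal sketch; real crawlers add politeness rules, HTML parsing and content assessment, and `fetch_links` here is a hypothetical stand-in for an actual HTTP fetch plus link extraction.

```python
from collections import deque

# A web directory: humans file sites under categories.
DIRECTORY = {
    "Science": ["example.org/physics", "example.org/math"],
    "Arts": ["example.org/painting"],
}

def directory_search(category):
    """Human classification: look the category up, nothing more."""
    return DIRECTORY.get(category, [])

def crawl(seed, fetch_links, max_pages=100):
    """Machine accumulation: breadth-first traversal of the link graph.
    `fetch_links(url)` is assumed to return the outgoing links of a page."""
    seen, queue = {seed}, deque([seed])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        for link in fetch_links(url):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

# Usage against a fake three-page web:
FAKE_WEB = {"a": ["b", "c"], "b": ["c"], "c": []}
print(crawl("a", lambda u: FAKE_WEB.get(u, [])))
```

The contrast makes the scaling problem obvious: the directory grows only as fast as its editors, while the crawler grows as fast as the web itself, which is why crawling won and why ranking the flood of results became the decisive problem.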
In the very same way, there are many traces of human work in the Mundaneum archive: from the interior pictures of the Palais Mondial (unlike the Google Books workplace, of which no images exist yet), to the elements one can find on the index cards themselves.
 
http://www.mondotheque.be/wiki/index.php/File:Archives_MundaneumDSC04638.jpg The archive is full of traces of unnamed workers.
 
[...]
 
  
Google's hegemony in the field has mainly been obtained by translating the relevance of a webpage into a numeric quantity according to a formula, the infamous PageRank algorithm. This value is calculated from the relational importance of the webpage where the word is placed, based on how many other websites link to that page. The classification part is long gone, and linguistic meaning is now also structured by automated functions. What is left is reading the formation of the network in numerical form, capturing the human opinions represented by hyperlinks: which word links to which webpage, and which webpage is generally more important.
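The published 1998 formulation of PageRank can be sketched in a few lines: each page's score is a damped sum of the scores of the pages linking to it, divided by their outgoing link counts, iterated until it stabilizes. This is a minimal sketch of that original formula only; Google's production system has long since grown far beyond it.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Returns a dict of PageRank scores. Simplified: dangling links to
    unknown pages are ignored rather than redistributed."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a baseline (1 - d) / n ...
        new = {p: (1.0 - damping) / n for p in pages}
        # ... plus a damped share of the rank of every page linking to it.
        for p, outgoing in links.items():
            if outgoing:
                share = damping * rank[p] / len(outgoing)
                for q in outgoing:
                    if q in new:
                        new[q] += share
        rank = new
    return rank

# A tiny web: everybody links to 'hub', so 'hub' ends up on top.
web = {"a": ["hub"], "b": ["hub"], "hub": ["a"]}
scores = pagerank(web)
assert scores["hub"] == max(scores.values())
```

Note how nothing in the computation refers to what the pages *say*: the score is extracted purely from the shape of the link graph, which is exactly the point made above.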
In the same way that UDC systematized documents via a notation format, the systematization of relational importance in numerical format brings functionality and efficiency.
In this case the translation is value-based rather than linguistic, quantifying network attention independently of meaning. The interaction with Google's other infamous algorithm, AdSense, adds an economic value to the PageRank position.
The influence and profit that derive from how high a search result is placed mean that the relevance of a word-website relation in Google's search results translates into actual relevance in reality.
  
==b. Centralization - distribution - infrastructure==
Even though both Otlet and Google claim to be tackling the task of ''organizing knowledge'', we could posit that, from an epistemological point of view, the approaches underlying their respective projects are opposite. The UDC is an example of an analytic approach, which acquires new knowledge by breaking existing knowledge down into its components, based on objective truths. Its propositions could be exemplified by the sentences “Logic is a subdivision of Philosophy” or “PageRank is an algorithm, part of the Google search engine”. PageRank, on the contrary, is a purely synthetic approach, which starts from the form of the network, in principle devoid of intrinsic meaning or truth, and creates a model of the network's relational truths. Its propositions could be exemplified by “Wikipedia is of the utmost relevance” or “The University of the District of Columbia is the most relevant meaning of the word 'UDC'”.
In 2013, in the days when Prime Minister Di Rupo was celebrating the beginning of the second phase of construction of the data centre, a few hundred kilometres away something very similar to what had happened a few years before in Mons was unrolling. In the small port town of Eemshaven, in the Dutch province of Groningen, Groningen Sea Ports and NOM development were in secret talks with another temporarily named firm[crystal comp], "Saturn", to deploy a data centre in the infrastructural wonder of Eemshaven.
 
When, many months later, the party was revealed to be Google, Harm Post, director of Groningen Sea Ports, commented:
 
"Just ten years ago Eemshaven was the laughing stock of the ports, a much-derided case of industrial development in the Netherlands, the planning failure of the last century. And now Google is building a very large data centre here, which is 'pure advertisement' for Eemshaven and the data port." Again, further details on taxes were not disclosed, and the final number of jobs is estimated to be around 150.
 
  
We (and Google) can read the model of reality created by the PageRank algorithm (and all the other algorithms that have been added over the years<ref>A fascinating list of all the algorithmic components of Google search is at https://moz.com/google-algorithm-change .</ref>) in two different ways. It can be considered a device that 'just works' and does not pretend to be true but gives results which are useful in reality, a view we can call ''pragmatic''; or we can see this model as a growing and improving construction that aims to coincide with reality, a view we can call ''utopian''. It is no coincidence that these two views fit the two stereotypical faces of Google: the idealistic Silicon Valley visionary one, and the cynical corporate capitalist one.
  
Another region had the fantastic luck of being chosen by Google, just like Mons; but why there, exactly? Data centres necessarily need to interact with existing infrastructures and flows of various types. There are three main necessities: being near a massive source of electrical power (preferably green, with long-term thinking in mind); being near a source of clean water, for the massive cooling needs; and being near Internet infrastructure that can assure proper connectivity. There is also a whole set of non-technical elements, which we can describe as the social, economic and political "climate", that proved favourable in both Mons and Eemshaven.
From our perspective, it is of relative importance which of the two sides we believe in. The key issue remains that such a structure has become so influential that it produces its own effects on reality, and that its algorithmic truths are more and more considered objective truths. While the utility and importance of a search engine like Google are beyond question, it is necessary to remain alert about such concentrations of power, especially when they are controlled by a single corporation, which, beyond mottoes and utopias, has by definition the sole duty of making profits and obeying its shareholders.
This fundamental importance of infrastructures can be traced back to the logistic organization of life first implemented with the industrial revolution and theorized within modernism. Again, suggestions of something 'exactly the same' come from the history of Paul Otlet.
 
  
 
[[category:publication]]
After the closedown of the Mundaneum in 1934, Otlet got involved in the competition for the development of the Left Bank in Antwerp, attempting to build a new Mundaneum as part of a 'Cité Mondiale'. The most famous modernist urbanists of the time were invited to plan, from scratch, the development of the left side of the river, at the time completely unbuilt. Otlet lobbied for the insertion of a Mundaneum in the projects, stressing how it would create hundreds of jobs for the region. He also flattered Flemish pride by stressing how Antwerp's inhabitants, often more hard-working than the Bruxellois, would finally obtain their deserved recognition, elevating their city to World City status. He partly succeeded in his propaganda: apart from his own proposal, developed in collaboration with Le Corbusier, many other participants included Otlet's Mundaneum as a key facility in their plans. In these proposals for new development, Otlet's archival infrastructure was shown in interaction with the existing flows already running through the city, such as the industrial docks, factories, the railway and the newly constructed stock market. The modernist utopia of the planned living environment already implied the organization of culture and knowledge with the same approach that was used for coal or electricity.
 
 
 
 
 
While the plan for Antwerp was in the end rejected in favour of more traditional housing development, 80 years later the same relation between existing infrastructural flows and the logistics of documentation storage is highlighted by the grand realization in Eemshaven. The continuous construction of new data centres results from an ideological propaganda that equates computation with immediacy, unlimited storage and exponential growth; but it also finds its roots in the logistic organization proper to modernism. The key ideas of centralization, distribution and control were already present in Otlet. Centralization, which was seen as the most efficient way to organize content, also in view of international connection, already generated problems of space back then: the Mundaneum database at its peak reached 16 million entries, occupying around 150 rooms. This was a big incentive for the insertion of the archive into newly planned city structures that would better facilitate the centralization and distribution of content. In his exchanges with Patrick Geddes[Van Acker], the two imagined 'the White Link', a network to distribute heliographic copies throughout a network of Mundaneums. Thanks to this, the same piece of information would be serially produced and logistically distributed. The technical importance for the Mundaneums of being next to a railway station, as well as to a radio station, then becomes clear when one looks at the sketches for the Mondotheque.
 
 
 
 
 
This key interrelation between centralization, distribution and infrastructure closely resembles the situation of today's data centres.
 
In the last ten years, Google has been pursuing a policy of end-to-end connection between its data centres and its end-user interfaces.
 
From 2005 on, it started buying thousands of miles of unused fibre networks in the United States, assuring autonomous links between its data centres and becoming one of the main proprietors of physical networks in the world. More recently, it also entered the business of undersea cables, financing three different cables between the US and Japan, and between the US and South East Asia.
 
This control over infrastructure, combined with the strategic distribution of its computing power in new data centres, or, as its engineering department calls them, 'warehouse-scale computers', is fundamental for Mountain View's future plans: the cloud services that Google has started to deploy, with which it plans to threaten Amazon's monopoly in the field. Having its own cabling and carefully chosen geographic placement of servers makes a lot of difference in the cloud-computing field, allowing lower latencies and a more stable service.
 
 
 
 
 
 
==c. UDC and Google Algorithm 'sameness'==
 

Latest revision as of 15:29, 2 August 2016

Dick Reckard


0. Innovation of the same

Screenshot from 2014-10-31 16-21-52.png
The PR imagery produced by and around the Mundaneum (disambiguation: the institution in Mons) often suggests, through a series of 'samenesses', an essential continuity between Otlet's endeavour and Internet-related products and services, in particular Google's. A good example is a scene from the video "From industrial heartland to the Internet age", published by the Mundaneum, 2014, where the drawers of the Mundaneum (disambiguation: Otlet's Utopia) morph into the servers of one of Google's data centres.

This approach is not limited to images: a recurring discourse that shapes some of the exhibitions taking place in the Mundaneum maintains that the dream of the Belgian utopian has been kept alive in the development of internetworked communications, and currently finds its spiritual successor in the products and services of Google. Even though there are many connections and similarities between the two endeavours, one has to acknowledge that Otlet was an internationalist, a socialist, a utopian, that his projects were not profit-oriented, and most importantly, that he was living in the temporal and cultural context of modernism at the beginning of the 20th century. The constructed identities and continuities that detach Otlet and the Mundaneum from their specific historical frame ignore the different scientific, social and political milieus involved. This means that these narratives exclude the discordant or disturbing elements that inevitably appear when considering such a complex figure in its entirety.

This is not surprising, considering the parties involved in this discourse: these types of instrumental identities and differences suit the rhetorical tone of Silicon Valley. Newly launched IT products, for example, are often described as groundbreaking, innovative and different from anything seen before. In other situations, those same products could be advertised as exactly the same as something else that already exists[1]. While novelty and difference surprise and amaze, sameness reassures and comforts. For example, Google Glass was marketed as revolutionary and innovative, but when it was attacked for its blatant privacy issues, some defended it as just a camera and a phone joined together. The sameness-difference duo fulfils a clear function: on the one hand, it suggests that technological advancements might alter the way we live dramatically, and that we should be ready to give up our old-fashioned ideas about life and culture for the sake of innovation. On the other hand, it proposes that we should not be worried about change, and that society has always evolved through disruptions, undoubtedly for the better. For each questionable groundbreaking new invention, there is a previous one with the same ideal, potentially with just as many critics... Great minds think alike, after all. This sort of ahistorical attitude pervades techno-capitalist milieus, creating a cartoonish view of the past, punctuated by great men and great inventions, a sort of technological variant of Carlyle's Great Man Theory. In this view, the Internet becomes the invention of a few father/genius figures, rather than the result of a long and complex interaction of diverging efforts and interests of academics, entrepreneurs and national governments. 
This instrumental reading of the past is largely consistent with the theoretical ground on which the Californian Ideology[2] is based, in which the conception of history is pervaded by various strains of technological determinism (from Marshall McLuhan to Alvin Toffler[3]) and capitalist individualism (in generic neoliberal terms, up to the fervent objectivism of Ayn Rand).

The appropriation of Paul Otlet's figure as Google's grandfather is a historical simplification, but the samenesses in this tale are not without foundation. Many concepts and ideals of documentation theories have reappeared in cybernetics and information theory, and are therefore present in the narrative of many IT corporations, as in Mountain View's case. With the intention of restoring some historical complexity, it might be more interesting to play the 'exactly the same' game ourselves, rather than try to dispel the advertised continuum of the Google on paper. By choosing to focus on other types of analogies in the story, we can maybe contribute to a narrative that is more respectful of the complexity of the past, and more telling about the problems of the present.

What follows are three such comparisons, which focus on three aspects of continuity between the documentation theories and archival experiments Otlet was involved in, and the cybernetic theories and practices that Google's capitalist enterprise is an exponent of. The first takes a look at the conditions of the workers in information infrastructures, who are fundamental for these systems to work but are often forgotten or displaced. Next comes an account of the elements of distribution and control that appear both in the idea of a Reseau Mundaneum and in the contemporary functioning of data centres, and the resulting interaction with other types of infrastructures. Finally, there is a brief analysis of the two approaches to the 'organization of the world's knowledge', which examines their regimes of truth and the issues that come with them. Hopefully these three short pieces can provide some additional ingredients for adulterating the sterile recipe of the Google-Otlet sameness.

a. Do androids dream of mechanical turks?

Laboratorium mundaneum.jpg
In a drawing titled Laboratorium Mundaneum, Paul Otlet depicted his project as a massive factory, processing books and other documents into end products, rolled out by a UDC locomotive. In fact, just like a factory, Mundaneum was dependent on the bureaucratic and logistic modes of organization of labour developed for industrial production. Looking at it and at other written and drawn sketches, one might ask: who made up the workforce of these factories?

In his Traité de Documentation, Otlet describes extensively the thinking machines and the tasks of intellectual work into which the Fordist chain of documentation is broken down. In the subsection dedicated to the people who would undertake the work, though, the only role described at length is that of the Bibliothécaire. In a long chapter that explains what education the librarian should follow, which characteristics are required, and so on, he briefly mentions the existence of “bibliothécaires-adjoints, rédacteurs, copistes, gens de service”[4]. There seems to be no further description or depiction of the staff that would write, distribute and search the millions of index cards in order to keep the archive running, an impossible task for the Bibliothécaire alone.

A photograph from around 1930, taken in the Palais Mondial, where we see Paul Otlet together with the rest of the équipe, gives us a better answer. In this beautiful group picture, we notice that the workforce that kept the archival machine running was made up of women, but we do not know much about them. As in telephone switching systems or early software development[5], gender stereotypes and discrimination led to the appointment of female workers for repetitive tasks that required specific knowledge and precision.
RC: You need at least an inventory to start with. The items have to be numbered, otherwise it is rather difficult to trace all the work. Sometimes there is a small restoration phase, because we have dusty documents, and when you scan them it shows. Sometimes we have to flatten the documents, newspapers for example, because they are folded in the boxes. It already takes a little while before you can digitize them. Then we scan the document, which is the easiest part. You put it on the scanner and you press a button, almost.

SVP: We scan in a totally different way. At Google it is about mass production. We choose smaller projects ourselves. We have a fixed team, two people who scan and process images full-time, but that is not how you start on a project of 250,000 books. We do scan-on-demand, or select complete collections. When we had all of our 2,750,000 index cards scanned by an external company a few years ago, I felt sorry for the girls who operated the feed scanner all day long. Hopelessly boring.

According to the ideal image described in the "Traité", all the tasks of collecting, translating and distributing should be completely automatic, seemingly without the need for human intervention. However, the Mundaneum hired dozens of women to perform these tasks. This human-run version of the system was not considered worth mentioning, as if it were a temporary in-between phase that should be overcome as soon as possible, something that was staining the project with its vulgarity.

Notwithstanding the incredible advancement of information technologies and the automation of innumerable tasks in collecting, processing and distributing information, we can observe the same pattern today. All the automatic repetitive tasks that technology should be able to do for us are still, one way or another, relying on human labour. And unlike the industrial worker who obtained recognition through political movements and struggles, the role of many cognitive workers is still hidden or under-represented. Computational linguistics, neural networks, optical character recognition: all these amazing machinic operations are still based on humans performing huge amounts of repetitive intellectual tasks from which software can learn, or which software can't do with the same efficiency. Automation didn't really free us from labour, it just shifted the where, when and who of labour[6]. Mechanical turks, content verifiers, annotators of all kinds... The software we use requires a multitude of tasks which are invisible to us, but are still accomplished by humans. Who are they? When possible, work is outsourced to foreign English-speaking countries with lower wages, like India. In the western world it follows the usual pattern: female, lower income, ethnic minorities.

An interesting case of heteromated labour is that of the so-called ScanOps[7], a set of Google workers who have a different type of badge and are isolated in a section of the Mountain View complex, secluded from the rest of the workers through strict access permissions and fixed time schedules. Their work consists of scanning the pages of printed books for the Google Books database, a task that is still more convenient to do by hand (especially in the case of rare or fragile books). The workers are mostly women and ethnic minorities, and there is no mention of them on the Google Books website or elsewhere; in fact the whole scanning process is kept secret. Even though the secrecy that surrounds this type of labour can be justified by the need to protect trade secrets, it again conceals the human element in machine work. This is even more obvious when compared to other types of human workers in the project, such as designers and programmers, who are celebrated for their creativity and ingenuity.

However, here and there, evidence of the workforce shows up in the results of its labour. Photos of Google Books employees' hands sometimes mistakenly end up in the digital version of a book online[8].

Whether the tendency to hide the human presence is due to the unfulfilled wish for total automation, to avoid the bad publicity of low wages and precarious work, or to keep an aura of mystery around machines, remains unclear, both in the case of Google Books and the Palais Mondial.
The computer scientists' view of textual content as "unstructured", be it in a webpage or in the OCR-scanned pages of a book, reflects a negligence of the processes and labour of writing, editing, design, layout, typesetting, and eventually publishing, collecting and cataloguing[9].

Still, it is reassuring to know that the products hold traces of the work, that even with the progressive removal of human signs in automated processes, the workers' presence never disappears completely. This presence is proof of the materiality of information production, and becomes a sign of the economies and paradigms of efficiency and profitability that are involved.

b. The (data) centre and the periphery

In 2013, while Prime Minister Di Rupo was celebrating the beginning of the second phase of constructing the Saint Ghislain data centre, a few hundred kilometres away a very similar situation started to unfold. In the municipality of Eemsmond, in the Dutch province of Groningen, the local Groningen Sea Ports and NOM development were rumoured to have plans with another code-named company, Saturn, to build a data centre in the small port of Eemshaven.

A few months later, when it was revealed that Google was behind Saturn, Harm Post, director of Groningen Sea Ports, commented: "Ten years ago Eemshaven became the laughing stock of ports and industrial development in the Netherlands, a planning failure of the previous century. And now Google is building a very large data centre here, which is 'pure advertisement' for Eemshaven and the data port."[10] Further details on tax cuts were not disclosed and once finished, the data centre will provide at most 150 jobs in the region.

Yet another territory fortunately chosen by Google, just like Mons, but what are the selection criteria? For one thing, data centres need to interact with existing infrastructures and flows of various types. Technically speaking, there are three prerequisites: being near a substantial source of electrical power (the finished installation will consume twice as much as the whole city of Groningen); being near a source of clean water, for the massive cooling demands; and being near Internet infrastructure that can assure adequate connectivity. There is also a whole set of non-technical elements, which we can sum up as the social, economic and political climate, which proved favourable both in Mons and Eemshaven.

The push behind constructing new sites in new locations, rather than expanding existing ones, is partly due to the rapid growth in importance of Software as a Service, so-called cloud computing, which is the rental of computational power from a central provider. With the rise of the SaaS paradigm, the geographical and topological placement of data centres becomes of strategic importance in achieving lower latencies and a more stable service. For this reason, Google has in the last 10 years been pursuing a policy of end-to-end connection between its facilities and user interfaces. This includes buying leftover fibre networks[11], entering the business of underwater sea cables[12] and building new data centres, including the ones in Mons and Eemshaven.

The spread of data centres around the world, along the main network cables across continents, represents a new phase in the diagram of the Internet. This should not be confused with the idea of decentralization that was a cornerstone value in the early stages of interconnected networks.[13] During the rapid development of the Internet and the Web, the new tenets of immediacy, unlimited storage and exponential growth led to the centralization of content in increasingly large server farms. Paradoxically, it is now the growing centralization of all kinds of operations in specific buildings that is fostering their distribution. The tension between centralization and distribution, and the dependence on neighbouring infrastructures such as the electrical grid, is not an exclusive feature of contemporary data storage and networking models. Again, similarities emerge from the history of the Mundaneum, illustrating how these issues relate closely to the logistic organization of production first implemented during the industrial revolution, and theorized within modernism.

Centralization was seen by Otlet as the most efficient way to organize content, especially in view of international exchange[14], which already caused problems related to space back then: the Mundaneum archive counted 16 million entries at its peak, occupying around 150 rooms. The cumbersome footprint, and the growing difficulty of finding stable locations for it, contributed to the conviction that the project should be included in the plans of new modernist cities. In the beginning of the 1930s, when the Mundaneum started to lose the support of the Belgian government, Otlet thought of a new site for it as part of a proposed Cité Mondiale, which he pursued in different locations with different approaches.

Among these attempts, he participated in the competition for the development of the Left Bank in Antwerp. The most famous modernist urbanists of the time were invited to plan the development from scratch; at the time, the Left Bank was completely vacant. Otlet lobbied for the insertion of a Mundaneum in the plans, stressing how it would create hundreds of jobs for the region. He also flattered Flemish pride by insisting that the people of Antwerp were harder working than those of Brussels, and that they would finally obtain their deserved recognition once their city was elevated to World City status.[15] His propaganda partly succeeded: aside from his own proposal, developed in collaboration with Le Corbusier, many other participants included Otlet's Mundaneum as a key facility in their plans.

In a sense, data centres are similar to the capitalist factory system; but instead of a linear process from the input of raw materials to the output of goods for mass consumption, they take in mass data in order to facilitate and expand the endless cycle of commodification: an Ouroboros-like machine. Just as the factory system enables the production of more goods at lower cost through automation and the control of labour to maximize profit, data centres have been developed to process large quantities of bits and bytes as fast and as cheaply as possible through automation and centralization.

In these proposals, Otlet's archival infrastructure was shown in interaction with the existing city flows such as industrial docks, factories, the railway and the newly constructed stock market.[16] The modernist utopia of a planned living environment implied that methods similar to those employed for managing the flows of coal and electricity could be used for the organization of culture and knowledge.

The Traité de Documentation, published in 1934, includes an extended reflection on a Universal Network of Documentation that would coordinate the transfer of knowledge between different documentation centres such as libraries or the Mundaneum[17]. In fact, the existing Mundaneum would simply be the first node of a wide network bound to expand to the rest of the world: the Reseau Mundaneum. The nodes of this network are explicitly described in relation to "post, railways and the press, those three essential organs of modern life which function unremittingly in order to unite men, cities and nations."[18] In the same period, in an exchange of letters with Patrick Geddes and Otto Neurath about the potential of heliographies as a way to distribute knowledge, the three imagined the White Link, a network to distribute copies throughout a series of Mundaneum nodes[19]. The same piece of information would thus be serially produced and logistically distributed: a sort of 'moving Mundaneum', facilitated by the railway system[20]. No wonder that future Mundaneums were meant to be built next to train stations.

In Otlet's plans for a Reseau Mundaneum we can already detect some of the key transformations that reappear in today's data centre scenario: first, a drive for centralization, with the accumulation of materials that led to the monumental plans of World Cities; second, a push for international exchange, resulting in the vision of a distribution network; third, the placement of the hypothetical network nodes at strategic intersections of industrial and logistic infrastructure.

While the plan for Antwerp was in the end rejected in favour of a more traditional housing development, eighty years later the legacy of this relation between existing infrastructural flows and the logistics of documentation storage is highlighted by the data port plans in Eemshaven. Since private companies are the privileged actors in these types of projects, the circulation of information increasingly responds to the same tenets that regulate the trade of coal or electricity. The very different welcome that traditional politics reserves for Google data centres is a symptom of a new dimension of power, in which information infrastructure plays a vital role. The celebrations and tax cuts that politicians lavish on these projects cannot be explained by 150 jobs or economic incentives for a depressed region alone. They also indicate how party politics is increasingly confined to the periphery of other forms of power, and therefore struggles to secure itself a strategic position.

===c. 025.45UDC; 161.225.22; 004.659GOO:004.021PAG.===

The Universal Decimal Classification[21] system, developed by Paul Otlet and Henri La Fontaine on the basis of the Dewey Decimal Classification, is still considered one of their most important realizations, as well as a cornerstone of Otlet's overall vision. Its adoption, revision and use up to the present day demonstrate a thoughtful and successful approach to the classification of knowledge.

The UDC differs from Dewey and other bibliographic systems in that it has the potential to exceed the function of ordering alone. Its complex notation system could classify phrases and thoughts in the same way as it classifies a book, going well beyond classification proper and becoming a real language: one could in fact express whole sentences and statements in UDC format[22]. The fundamental idea behind it[23] was that books and documentation could be broken down into their constitutive sentences and boiled down to a set of universal concepts, regulated by the decimal system. This would make it possible to express objective truths in a numerical language, fostering international exchange beyond translation and easing the work of science by regulating knowledge with numbers. We have to understand the idea in the time it was conceived, a time shaped by positivism and by belief in the unhindered potential of science to obtain objective, universal knowledge. Today, especially when we take into account the arbitrariness of the decimal structure, it sounds doubtful, if not preposterous.
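The hierarchical mechanics of the decimal notation can be sketched in a few lines: every prefix of a UDC number is itself a broader class. The sketch below is only an illustration, using the labels of the "Logic" branch given in note 21; the function names are invented for this example.

```python
# Toy illustration of UDC-style decimal hierarchy: every digit prefix of a
# notation denotes a broader class. Labels taken from note 21 (the branch
# leading to 161.225.22, "Statements on equality...").
UDC_LABELS = {
    "1": "Philosophy",
    "16": "Logic",
    "161": "Fundamentals of Logic",
    "161.2": "Statements",
    "161.22": "Type of Statements",
    "161.225": "Real and ideal judgements",
    "161.225.2": "Ideal Judgements",
    "161.225.22": "Statements on equality, similarity and dissimilarity",
}

def broader_classes(notation: str) -> list[str]:
    """Return every ancestor class of a UDC notation, general to specific."""
    digits = notation.replace(".", "")
    prefixes = []
    for i in range(1, len(digits) + 1):
        prefix = digits[:i]
        # re-insert a dot after every third digit, as UDC notation does
        dotted = ".".join(prefix[j:j + 3] for j in range(0, len(prefix), 3))
        prefixes.append(dotted)
    return prefixes

for code in broader_classes("161.225.22"):
    print(code, "-", UDC_LABELS.get(code, "(unnamed subdivision)"))
```

The point of the sketch is that the notation itself encodes the entire chain of subdivisions, so a record can "virtually infinitely" be refined by appending digits without breaking the classes above it.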

However, the linguistic-numeric element of UDC, which makes it possible to express fundamental meanings through numbers, plays a key role in the oeuvre of Paul Otlet. In his work we learn that numerical knowledge would be the first step towards a science of combining basic sentences to produce new meaning in a systematic way. When we look at Monde, Otlet's second major publication, from 1935, the continuous reference to algebraic formulas that describe how the world is composed suggests that we could at some point “solve” these equations and modify the world accordingly.[24] Complementary to the Traité de Documentation, which described the systematic classification of knowledge, Monde set the basis for the transformation of this knowledge into new meaning.

Otlet wasn't the first to envision an algebra of thought. It has been a recurring topos in modern philosophy, under the influence of scientific positivism and in step with the development of mathematics and physics. Even though one could trace the idea back to Ramon Llull and even earlier forms of combinatorics, the first to consistently undertake this scientific and philosophical challenge was Gottfried Leibniz. The German philosopher and mathematician, a precursor of the field of symbolic logic that developed later, in the 20th century, researched a method to reduce statements to minimal terms of meaning. He sought a language which “... will be the greatest instrument of reason,” for “when there are disputes among persons, we can simply say: Let us calculate, without further ado, and see who is right”.[25] His inquiry was divided into two parts. The first, analytic part, the characteristica universalis, was a universal conceptual language to express meanings, of which we know only that it worked with prime numbers. The second, synthetic part, the calculus ratiocinator, was the algebra that would allow operations between those meanings, of which there is even less evidence. The idea of a calculus was clearly related to the infinitesimal calculus, the fundamental mathematical development that Leibniz conceived and that Newton concurrently developed and popularized. Even though little remains of Leibniz's work on his algebra of thought, it was taken up by mathematicians and logicians in the 20th century. Most famously, and curiously enough around the same time Otlet published the Traité and Monde, the logician Kurt Gödel used the same idea of a translation into prime numbers to demonstrate his incompleteness theorem.[26] That the characteristica universalis made sense only within logic and mathematics points to the fundamental problem of any mathematical approach to truth beyond logical truth. While this problem was not yet evident at the time, it would emerge in the duality of language and categorization, as it later did with Otlet's UDC.
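The prime-number trick that connects Leibniz's characteristica universalis to Gödel can be made concrete with a toy Gödel numbering: a sequence of symbol codes is turned into a single integer by using the n-th prime as the base for the n-th position, and unique factorization guarantees the sequence can be recovered. This is only an illustrative sketch (codes are assumed to be positive integers), not Gödel's actual encoding of formal syntax.

```python
# Toy Gödel numbering: encode a sequence of positive symbol codes
# (s1, s2, s3, ...) as 2**s1 * 3**s2 * 5**s3 * ...
# Unique prime factorization makes the encoding reversible.

def primes(n):
    """First n primes, by trial division (fine for toy inputs)."""
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_encode(symbols):
    number = 1
    for p, s in zip(primes(len(symbols)), symbols):
        number *= p ** s
    return number

def godel_decode(number):
    symbols = []
    candidate = 2
    while number > 1:
        exponent = 0
        while number % candidate == 0:
            number //= candidate
            exponent += 1
        if exponent:  # each prime's exponent is the next symbol code
            symbols.append(exponent)
        candidate += 1
    return symbols
```

For example, the sequence [1, 2, 3] becomes 2¹ · 3² · 5³ = 2250, and factoring 2250 yields [1, 2, 3] again: statements become numbers, and arithmetic on numbers becomes a way to talk about statements.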

The relation between the organizational and the linguistic aspects of knowledge is also one of the open issues at the core of web search, which is, at first sight, less interested in objective truths. At the beginning of the Web, around the mid-1990s, two main approaches to searching for information online emerged: the web directory and web crawling. Some of the first search engines, like Lycos or Yahoo!, started with a combination of the two. The web directory consisted of the human classification of websites into categories, done by an “editor”; crawling consisted of the automatic accumulation of material by following links, with various rudimentary techniques to assess the content of a website. With the exponential growth of web content, directories were soon dropped in favour of the more efficient automatic crawling, which in turn generated so many results that quality became of key importance: quality in the sense both of assessing the content of a webpage in relation to keywords and of sorting results according to their relevance.
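The contrast between the two approaches can be sketched over an in-memory miniature "web"; all page names, categories and functions below are invented for the illustration, and the crawler is of course far more rudimentary than any real one.

```python
# A miniature "web": page -> (text, outgoing links).
WEB = {
    "home": ("welcome portal", ["library", "news"]),
    "library": ("books and documentation", ["catalogue"]),
    "news": ("daily news", ["home"]),
    "catalogue": ("documentation index", []),
}

# 1. Web directory: a human "editor" files pages under categories by hand.
DIRECTORY = {
    "reference": ["library", "catalogue"],
    "media": ["news"],
}

def directory_search(category):
    return DIRECTORY.get(category, [])

# 2. Crawling: follow links from a seed page, indexing every word found.
def crawl(seed):
    index, queue, seen = {}, [seed], set()
    while queue:
        page = queue.pop(0)
        if page in seen:
            continue
        seen.add(page)
        text, links = WEB[page]
        for word in text.split():
            index.setdefault(word, set()).add(page)
        queue.extend(links)
    return index

index = crawl("home")
print(sorted(index["documentation"]))  # pages containing the keyword
```

The directory scales with editorial labour, the crawler with machine time; the crawler's weakness is exactly the one named above: it returns every page containing a keyword, with no notion of which result matters most.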

Google's hegemony in the field has mainly been obtained by translating the relevance of a webpage into a numeric quantity according to a formula, the infamous PageRank algorithm. This value is calculated from the relational importance of the webpage where a word appears, based on how many other websites link to that page. The classification part is long gone, and linguistic meaning is likewise handled by automated functions. What is left is reading the formation of the network in numerical form: capturing the human opinions represented by hyperlinks, i.e. which word links to which webpage, and which webpage is generally more important. In the same way that UDC systematized documents via a notation format, the systematization of relational importance in numerical format brings functionality and efficiency. In this case, though, the translation is value-based rather than linguistic, quantifying the attention of the network independently of meaning. The interaction with the other infamous Google algorithm, AdSense, adds an economic value to the PageRank position. Given the influence and profit that derive from how high a search result is placed, the relevance of a word-website relation in Google's search results translates into an actual relevance in reality.
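The core of the published PageRank formula can be sketched in a few lines of power iteration: each page's rank is repeatedly redistributed along its outgoing links, damped by a constant (0.85 in the original paper). The four-page graph and the function below are illustrative assumptions, not Google's production system, which has long since layered many other signals on top[27].

```python
# Minimal power-iteration sketch of the original PageRank formula.
def pagerank(links, damping=0.85, iterations=50):
    """links: page -> list of pages it links to. Returns page -> rank."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank over all pages
                for p in pages:
                    new[p] += damping * ranks[page] / n
            else:  # a page "votes" equally for every page it links to
                for target in outgoing:
                    new[target] += damping * ranks[page] / len(outgoing)
        ranks = new
    return ranks

# Hypothetical link graph: three pages link to "c", so it outranks the rest.
GRAPH = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
ranks = pagerank(GRAPH)
print(max(ranks, key=ranks.get))  # → c
```

Note what the iteration consumes: only the shape of the link graph, never the words on the pages. This is the "purely synthetic" character of the approach discussed below.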

Even though both Otlet and Google claim to tackle the task of organizing knowledge, we could posit that, from an epistemological point of view, the approaches underlying their respective projects are opposite. UDC is an example of an analytic approach, which acquires new knowledge by breaking existing knowledge down into its components, based on objective truths. Its propositions could be exemplified by the sentences “Logic is a subdivision of Philosophy” or “PageRank is an algorithm, part of the Google search engine”. PageRank, on the contrary, is a purely synthetic approach, which starts from the form of the network, in principle devoid of intrinsic meaning or truth, and creates a model of the network's relational truths. Its propositions could be exemplified by “Wikipedia is of the utmost relevance” or “The University of the District of Columbia is the most relevant meaning of the word 'UDC'”.

We (and Google) can read the model of reality created by the PageRank algorithm (and by all the algorithms that have been added to it over the years[27]) in two different ways. It can be considered a device that 'just works', does not pretend to be true, and simply gives results that are useful in practice: a view we can call pragmatic. Or we can see the model as a growing and improving construction that aims to coincide with reality: a view we can call utopian. It is no coincidence that these two views fit the two stereotypical faces of Google: the idealistic Silicon Valley visionary and the cynical corporate capitalist.

From our perspective, it is of relative importance which of the two sides we believe in. The key issue remains that such a structure has become so influential that it produces its own effects on reality, and that its algorithmic truths are more and more taken for objective truths. While the utility and importance of a search engine like Google are beyond question, it is necessary to remain alert to such concentrations of power, especially when they are controlled by a single corporation which, beyond its mottoes and utopias, has by definition the sole duty of making profits and obeying its shareholders.
  1. A good account of this phenomenon is given by David Golumbia. http://www.uncomputing.org/?p=221
  2. As described in the classic text looking at the ideological ground of Silicon Valley culture. http://www.hrc.wmin.ac.uk/theory-californianideology-main.html
  3. For an account of Toffler's determinism, see http://www.ukm.my/ijit/IJIT%20Vol%201%202012/7wan%20fariza.pdf .
  4. Otlet, Paul. Traité de documentation: le livre sur le livre, théorie et pratique. Editiones Mundaneum, 1934: 393-394.
  5. http://gender.stanford.edu/news/2011/researcher-reveals-how-%E2%80%9Ccomputer-geeks%E2%80%9D-replaced-%E2%80%9Ccomputergirls%E2%80%9D
  6. This process has been named “heteromation”, for a more thorough analysis see: Ekbia, Hamid, and Bonnie Nardi. “Heteromation and Its (dis)contents: The Invisible Division of Labor between Humans and Machines.” First Monday 19, no. 6 (May 23, 2014). http://firstmonday.org/ojs/index.php/fm/article/view/5331
  7. The name scanops was first introduced by artist Andrew Norman Wilson when he found out about this category of workers during his artistic residency at Google in Mountain View. See http://www.andrewnormanwilson.com/WorkersGoogleplex.html .
  8. As collected by Krissy Wilson on her http://theartofgooglebooks.tumblr.com .
  9. http://informationobservatory.info/2015/10/27/google-books-fair-use-or-anti-democratic-preemption/#more-279
  10. http://www.rtvnoord.nl/nieuws/139016/Keerpunt-in-de-geschiedenis-van-de-Eemshaven .
  11. http://www.cnet.com/news/google-wants-dark-fiber/ .
  12. http://spectrum.ieee.org/tech-talk/telecom/internet/google-new-brazil-us-internet-cable .
  13. See Baran, Paul. “On Distributed Communications.” Product Page, 1964. http://www.rand.org/pubs/research_memoranda/RM3420.html .
  14. Pierce, Thomas. Mettre des pierres autour des idées. Paul Otlet, de Cité Mondiale en de modernistische stedenbouw in de jaren 1930. PhD dissertation, KULeuven, 2007: 34.
  15. Ibid: 94-95.
  16. Ibid: 113-117.
  17. Otlet, Paul. Traité de documentation: le livre sur le livre, théorie et pratique. Editiones Mundaneum, 1934.
  18. Otlet, Paul. Les Communications MUNDANEUM, Documentatio Universalis, doc nr. 8438
  19. Van Acker, Wouter. “Internationalist Utopias of Visual Education: The Graphic and Scenographic Transformation of the Universal Encyclopaedia in the Work of Paul Otlet, Patrick Geddes, and Otto Neurath.” Perspectives on Science 19, no. 1 (January 19, 2011): 68-69.
  20. Ibid: 66.
  21. The Decimal part in the name means that any records can be further subdivided by tenths, virtually infinitely, according to an evolving scheme of depth and specialization. For example, 1 is “Philosophy”, 16 is “Logic”, 161 is “Fundamentals of Logic”, 161.2 is “Statements”, 161.22 is “Type of Statements”, 161.225 is “Real and ideal judgements”, 161.225.2 is “Ideal Judgements” and 161.225.22 is “Statements on equality, similarity and dissimilarity”.
  22. “The UDC and FID: A Historical Perspective.” The Library Quarterly 37, no. 3 (July 1, 1967): 268-270.
  23. Described in French by the word dépouillement.
  24. Otlet, Paul. Monde, essai d’universalisme: connaissance du monde, sentiment du monde, action organisée et plan du monde. Editiones Mundaneum, 1935: XXI-XXII.
  25. Leibniz, Gottfried Wilhelm, The Art of Discovery 1685, Wiener: 51.
  26. https://en.wikipedia.org/wiki/G%C3%B6del_numbering
  27. A fascinating list of all the algorithmic components of Google search is at https://moz.com/google-algorithm-change .