X = Y

==0. Innovation of the same==
  
 
<div class="book"><onlyinclude>

[[File:Screenshot from 2014-10-31 16-21-52.png|center|From industrial heartland to the Internet age (screen-capture). Video published by The Mundaneum, 2014]]
This stance is not limited to images: a recurring discourse that shapes some of the exhibitions taking place in the Mundaneum maintains that the dream of the Belgian utopian has been kept alive in the development of internetworked communications, and currently finds its spiritual successor in the products and services of Google.

Even though there are many connections and similarities between the two endeavours, one cannot dismiss as negligible details the fact that Otlet was an internationalist, a socialist and a utopian, that his projects were not profit oriented, and, most importantly, that he was living in the temporal and cultural context of modernism at the beginning of the 20th century.

The constructed identities and continuities detach Otlet and the Mundaneum from their specific historical frame, ignoring the different scientific, social and political milieus involved. Such narratives exclude the discordant or disturbing elements that inevitably appear when one considers such a complex figure in its entirety.

This is not surprising, given the parties involved in the discourse: this type of instrumental identities and differences fits quite well with the rhetorical tone of Silicon Valley.

For example, it is common for newly launched IT products to be described as groundbreaking, innovative and 'different from anything seen before'. In other situations, instead, there is the complementary habit of stressing that a product is 'exactly the same' as something else that already existed<ref>A good account of such phenomenon is described by David Golumbia. http://www.uncomputing.org/?p=221</ref>. While novelty and difference surprise and amaze, sameness is there instead to reassure and comfort. For example, Google Glass was marketed as revolutionary and innovative, but when it was attacked for its blatant privacy issues, some defended it as just a camera and a phone joined together. The sameness-difference couple fulfills a clear function: on the one hand, it suggests that technological advancements might dramatically alter the way we live, and that we have to be ready to give up our old-fashioned ideas about life and culture when innovation comes. On the other hand, it suggests that we should not be worried about these changes, and that society has always evolved through such disruptions, undoubtedly for the better. For each groundbreaking new invention that is questioned, there is a previous invention that was aiming for the same ideal, potentially with just as many detractors... Great minds think alike, after all.

This sort of a-historical attitude pervades techno-capitalist milieus, drawing a cartoonesque view of the past, punctuated by great men and great inventions, a sort of technological variant of Carlyle's [https://en.wikipedia.org/wiki/Great_Man_theory ''Great Man Theory'']. In this view, the Internet becomes the invention of a few father/genius figures, rather than the result of a long and complex interaction of diverging efforts and interests of academics, entrepreneurs and national governments. This instrumental reading of the past is consistent with much of the theoretical ground on which the ''Californian Ideology''<ref>As described in the classic text looking at the ideological ground of Silicon Valley culture. http://www.hrc.wmin.ac.uk/theory-californianideology-main.html</ref> stands: a conception of history pervaded by various strains of technological determinism (from Marshall McLuhan to Alvin Toffler<ref>For an account of Toffler's determinism, see http://www.ukm.my/ijit/IJIT%20Vol%201%202012/7wan%20fariza.pdf .</ref>) and capitalist individualism (in generic neoliberal terms, up to the fervent objectivism of Ayn Rand).

The appropriation of Paul Otlet's figure as Google's grandfather is this kind of historical simplification, and the samenesses this tale is made of are not without foundation. Many concepts and ideals of documentation theories have reappeared in cybernetics and information theory, and are therefore present as well in the narrative of many IT corporations, as in Mountain View's case. With the intention of restoring a historical dimension to the matter, it might be more interesting to play the "exactly the same" game ourselves, rather than trying to dispel the advertised continuum of the 'Google of paper'. By choosing to focus on other types of analogies in the story, we can maybe contribute a narrative that is more respectful of the complexity of the past, and more telling about the problems of the present.
 
Following are three such 'comparisons', which focus on three aspects of continuity between the documentation theories and archival experiments Otlet was involved in, and the cybernetic theories and practices that Google's capitalist enterprise is an exponent of.

First is a look at the conditions of workers in information infrastructures, who are fundamental for these systems to work but often forgotten or displaced.
 
Then follows an account of the elements of distribution and control that appear both in the idea of a 'Reseau Mundaneum' and in the contemporary functioning of data centers, and of the resulting interaction of these with other types of infrastructures.

Finally, there is a brief analysis of the two approaches to the 'organization of the world's knowledge', examining their regimes of truth and the issues that come with them.

Hopefully these three short pieces can provide some additional ingredients for adulterating the sterile recipe of the Google-Otlet sameness.
 
 
  
 
==a. Do androids dream of mechanical turks?==

In a drawing titled “Laboratorium Mundaneum”, Paul Otlet depicted his project as a massive factory, processing books and other documents into end products, rolled out by a UDC locomotive. In fact, just like a factory, the Mundaneum was dependent on the bureaucratic and logistic modes of organization of labour developed for industrial production. Looking at this and other written and drawn sketches, one may wonder: who was making up the workforce of these factories?
  
In his Traité de Documentation, Otlet describes extensively the thinking machines and the tasks of intellectual work into which the ''Fordist chain'' of documentation is broken down. In the subsection dedicated to the personnel that would operate these systems, though, the only role described at length is that of the ''Bibliotécaire''. In the lengthy chapter detailing what training such a person should follow, what characteristics are necessary for the role, and so on, a brief mention is made of the existence of “Bibliotecaire-adjoints, rédacteurs, copistes, gens de service”<ref>Otlet, Paul. Traité de documentation: le livre sur le livre, théorie et pratique. Editiones Mundaneum, 1934: 393-394.</ref>. There seems to be no further description or depiction of the personnel that would write, distribute and search the millions of index cards needed to keep the archive running, an impossible task for the Bibliotécaire alone.
  
A good answer to this question comes instead from a photograph from around 1930, taken in the Palais Mondial, where we can see Paul Otlet together with the rest of the équipe. In this beautiful group picture, we see that the workforce that kept the archival machine running was made up of women, about whom we have barely any information. As with telephone switching or early software development<ref>http://gender.stanford.edu/news/2011/researcher-reveals-how-%E2%80%9Ccomputer-geeks%E2%80%9D-replaced-%E2%80%9Ccomputergirls%E2%80%9D</ref>, gender stereotypes and discrimination assigned female workers to repetitive tasks that required specific knowledge and precision.
 
  
In the ideal image described in the Traité, all the tasks of collection, translation and distribution would be completely technical, seemingly with no human intervention necessary. In the meantime, though, the Mundaneum hired dozens of women to do those tasks. The existing human-run version of the system was not considered a reference, as if it were some temporary in-between step to be overcome as soon as possible, something that stained the project with its vulgarity.
 
 
Notwithstanding the incredible advancement of information technologies and the automation of innumerable tasks in the collection, processing and distribution of information, this same pattern is still very present nowadays. All the automatic repetitive tasks that 'technology' does for us are still based on human labour in one way or another. And, differently from the industrial worker, who obtained recognition through political movements and struggles, the role of many cognitive workers is still hidden or under-represented.

Computational linguistics, neural networks, optical character recognition: all the most amazing machinic performances are still based on humans performing huge amounts of repetitive intellectual tasks that the software can 'learn from', or that the software cannot do with the same efficiency. Automation didn't really free us from labour, it just shifted where, when and whose labour has to happen, a process that has been named “heteromation”<ref>Ekbia, Hamid, and Bonnie Nardi. “Heteromation and Its (dis)contents: The Invisible Division of Labor between Humans and Machines.” First Monday 19, no. 6 (May 23, 2014). http://firstmonday.org/ojs/index.php/fm/article/view/5331.</ref>. Mechanical turks, content verifiers, annotators of all kinds... there is a multitude of tasks that have to happen for the software we use, invisible to us but accomplished by humans. Who are they? When possible, work is outsourced to English-speaking countries with lower wages, like India. In the Western world, instead, it follows the usual pattern: female, lower-income, ethnic minorities.

An interesting case of heteromated labour are the so-called Scanops<ref>The name Scanops was first introduced by artist Andrew Norman Wilson when he found out about this category of workers during his artistic residency at Google in Mountain View. See http://www.andrewnormanwilson.com/WorkersGoogleplex.html .</ref>, a set of Google workers with a different type of badge, isolated in one section of the Mountain View complex and secluded from the rest of the workers through strict access permissions and fixed time schedules. The task of these workers consists of scanning the pages of printed books to be added to the Google Books database, a work that in some cases is still more convenient to do by hand (for rare or fragile books, for example). Mostly women, mostly from ethnic minorities, these workers are mentioned nowhere in Google Books or elsewhere; in fact, the whole process of scanning is kept completely secret. Even though the secrecy around this kind of labour is usually justified by the need to protect trade secrets, it nonetheless continues the attitude of hiding the human part in the machine's work. This is made even more obvious by the contrast with the celebration of other types of workers, in positions deemed creative and ingenious, such as designers and programmers.

Even though there is a tendency to hide the human labour that is necessary for certain automation to take place, some evidence of the workforce's existence remains in the results of its labour. In the case of Google Books employees, for example, it is possible to encounter the photos of their hands that mistakenly ended up in the digital versions of the scanned books online<ref>As collected by Krissy Wilson on her http://theartofgooglebooks.tumblr.com .</ref>.

Whether the tendency to hide the human role is due to the unfulfilled wish for total automation, to the need to avoid the bad publicity of low wages and precarious work, or to the will to keep an aura of mystery around machines, is still unclear for Google Books, as it was for the Palais Mondial. It is reassuring, though, to know that the products still keep some traces of the work, and that even with the progressive removal of human signs from automated processes, the workers' presence never disappears completely. This presence remains the proof of the materiality of information production, and becomes a sign of the economies and paradigms of efficiency and profitability that are involved.
  
 
 
  
 
==b. centralization - distribution - infrastructure==

In 2013, while prime minister Di Rupo was celebrating the beginning of the second phase of construction of the Saint Ghislain datacenter, a few hundred kilometers away a very similar situation was starting to unroll. In the municipality of Eemsmond, in the Dutch province of Groningen, the local Groningen Sea Ports and the development agency NOM were in secret deals with another temporarily-named firm, "Saturn", to deploy a datacenter in the small port of Eemshaven, now an infrastructural wonder.

When, some months later, the party was revealed to be Google, Harm Post, director of Groningen Sea Ports, commented: "Just ten years ago Eemshaven was the standing joke of the ports, a case of industrial development in the Netherlands to be looked down upon, the planning failure of the last century. And now Google is building a very large data center here, which is 'pure advertisement' for Eemshaven and the data port."<ref>http://www.rtvnoord.nl/nieuws/139016/Keerpunt-in-de-geschiedenis-van-de-Eemshaven .</ref> Again, further details on the tax cuts included in the deal were not disclosed and, once finished, the datacenter will provide 150 jobs in the region.

Another territory had the luck of being chosen by Google, just like Mons, but what are the criteria behind such a selection? For one, datacenters necessarily need to interact with existing infrastructures and flows of various types. Technically speaking, there are three prerequisites: being near a substantial source of electrical power (the datacenter will consume twice as much electricity as the whole city of Groningen); being near a source of clean water, for the massive cooling demands; and being near Internet infrastructure that can assure adequate connectivity. There is then a whole other set of non-technical elements, which we can sum up as the social, economic and political "climate", that proved favourable both in Mons and in Eemshaven.

The push behind the construction of datacenters in new locations, rather than the enlargement of existing ones, is partly due to the rapid growth in importance of “Software as a Service”, so-called cloud computing, which means the rental of computational power from a central provider.

With the rise of the SaaS paradigm, the geographical and topological placement of a datacenter becomes of strategic importance in order to achieve lower latencies and more stable service. For this reason, Google has in the last ten years been pursuing a policy of end-to-end connection between its datacenters and the user interfaces. This included buying leftover fiber networks<ref>http://www.cnet.com/news/google-wants-dark-fiber/ .</ref>, entering the business of underwater sea cables<ref>http://spectrum.ieee.org/tech-talk/telecom/internet/google-new-brazil-us-internet-cable .</ref> and building new datacenters, including the ones in Mons and Eemshaven.

The spread of datacenters around the world, along the main network cables crossing the continents, represents a new phase in the diagram of the Internet. It should not be confused with the idea of decentralization that was a cornerstone value in the early stages of interconnected networks.<ref>See Baran, Paul. “On Distributed Communications.” Product Page, 1964. http://www.rand.org/pubs/research_memoranda/RM3420.html .</ref> During the rapid development of the Internet and the Web, the new tenets of immediacy, unlimited storage and exponential growth led to the centralization of content in increasingly large server farms. Paradoxically, it is now the growing centralization of all kinds of operations in datacenters that is fostering their geographical distribution.

The tension between centralization and distribution, and the dependence on neighbouring infrastructures such as the electrical grid, is not an exclusive feature of contemporary data storage and networking models. Again, suggestions of something quite similar emerge from the history of the Mundaneum, and illustrate how closely these issues relate to the logistic organization of production first implemented during the industrial revolution, and theorized within modernism.

Centralization was seen by Otlet as the most efficient way to organize content, especially in view of international exchange<ref>Pierce, Thomas. ‘Mettre des pierres autour des idées’. Paul Otlet, de Cité Mondiale en de modernistische stedenbouw in de jaren 1930. PhD dissertation, KULeuven, 2007: 34.</ref>. This already generated space problems back then: the Mundaneum archive counted 16 million entries at its peak, occupying around 150 rooms. The cumbersome footprint, and the growing difficulty of finding stable locations for it, contributed to the conviction that the project should be included in the plans of new modernist cities. In the beginning of the 1930s, with the Mundaneum starting to lose the support of the Belgian government, Otlet tried to find a new site for it as part of a proposed 'Cité Mondiale', which he pursued in different locations with different approaches.

Among the various attempts, he participated in the competition for the development of the Left Bank in Antwerp. The most famous modernist urbanists of the time were invited to plan, from scratch, the development of the left side of the river, at the time completely unbuilt. Otlet lobbied for the insertion of a Mundaneum in the projects, stressing how it would create hundreds of jobs for the region. He also flattered Flemish pride by stressing how Antwerp's inhabitants, often harder working than those of Brussels, would finally obtain their deserved recognition, elevating their city to “World City” status.<ref>Ibid: 94-95.</ref> He partly succeeded in his propaganda, given that apart from his own proposal, developed in collaboration with Le Corbusier, many other participants included Otlet's Mundaneum as a key facility in their plans. In these development proposals, Otlet's archival infrastructure was shown in interaction with the flows already running through the city, like the industrial docks, factories, the railway and the newly constructed stock market.<ref>Ibid: 113-117.</ref> The modernist utopia of the planned living environment already implied the organization of culture and knowledge by methods similar to the ones used for coal or electricity.

In the Traité de Documentation, published in 1934, there is a long speculation on the “Universal Network of Documentation”, which would be responsible for the transfer of knowledge between different documentation centres such as libraries or the Mundaneum<ref>Otlet, Paul. Traité de documentation: le livre sur le livre, théorie et pratique. Editiones Mundaneum, 1934.</ref>. In fact, the existing Mundaneum would just be the first node of a wide network bound to expand to the rest of the world, the “Reseau Mundaneum”. The nodes of this network are explicitly described in relation to ‘post, railways and the press, those three essential organs of modern life which function unremittingly in order to unite men, cities and nations.’<ref>Otlet, Paul. Les Communications MUNDANEUM, Documentatio Universalis, doc nr. 8438</ref> In the same period, in letter exchanges with Patrick Geddes and Otto von Neurath, commenting on the potential of heliographies as a way to distribute knowledge, the three imagined the 'White Link', a network to distribute copies throughout a network of Mundaneums<ref>Van Acker, Wouter. “Internationalist Utopias of Visual Education: The Graphic and Scenographic Transformation of the Universal Encyclopaedia in the Work of Paul Otlet, Patrick Geddes, and Otto Neurath.” Perspectives on Science 19, no. 1 (January 19, 2011): 68-69.</ref>. Thanks to this, the same piece of information would be serially produced and logistically distributed, in a sort of “moving Mundaneum”, facilitated by the railway system<ref>Ibid: 66.</ref>. No wonder, then, that it was a main characteristic of future Mundaneums to be built next to a train station.

Through Otlet's plans for a “Reseau Mundaneum” we can already see some of the key transformations that reappear in today's evolving datacenter scenario. A drive for centralization in the first place, with the accumulation of materials that led to the monumental plans of the World Cities. In parallel, the push for international exchange brought with it the vision of a distribution network. Thirdly, the resulting placement of the hypothetical nodes of such a network along strategic intersections of industrial and logistic infrastructure.

While the plan for Antwerp was in the end rejected in favour of more traditional housing development, 80 years later the legacy of the relation between existing infrastructural flows and the logistics of documentation storage is highlighted by the data-ports plan in Eemshaven. Since private companies are the privileged actors in these types of projects, the circulation of information increasingly responds to the same tenets that regulate the trade of coal or electricity. The very different welcome that traditional politics reserves for Google datacenters is a symptom of a new dimension of power in which information infrastructure plays a role. The celebrations and tax cuts that politicians lavish on these projects cannot be explained by the 150 jobs or the 'economic incentives' for a depressed region alone. They also indicate how party politics lives in awe of being peripheral to other forms of power, and wants to benefit from a strategic positioning as well.
  
  
==c. 025.45UDC; 161.225.22; 004.659GOO:004.021PAG.==

The Universal Decimal Classification<ref>The Decimal part in the name means that any record can be further subdivided by tenths, virtually infinitely, according to an evolving scheme of depth and specialization. For example, 1 is “Philosophy”, 16 is “Logic”, 161 is “Fundamentals of Logic”, 161.2 is “Statements”, 161.22 is “Type of Statements”, 161.225 is “Real and ideal judgements”, 161.225.2 is “Ideal Judgements” and 161.225.22 is “Statements on equality, similarity and dissimilarity”.</ref> system, developed by Paul Otlet and Henri Lafontaine on the basis of the Dewey Decimal Classification system, is still considered one of their most important realizations, as well as a cornerstone of Otlet's overall vision. Its adoption, revision and use up to the present day demonstrate a thoughtful and successful approach to the challenge of classifying knowledge.
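
The mechanics of the notation are simple enough to simulate. Since every appended digit narrows the class it follows, stripping digits from the end of a code recovers the chain of increasingly general classes it belongs to. The sketch below unfolds the code 161.225.22 from the section title, using only the handful of subdivisions quoted in the note above; a real UDC schedule counts many thousands of entries, plus auxiliary signs for relations between concepts (like the colon in the section title itself).

<syntaxhighlight lang="python">
# A toy excerpt of the UDC schedule: just the branch cited above.
UDC_SCHEDULE = {
    "1": "Philosophy",
    "16": "Logic",
    "161": "Fundamentals of Logic",
    "161.2": "Statements",
    "161.22": "Type of Statements",
    "161.225": "Real and ideal judgements",
    "161.225.2": "Ideal Judgements",
    "161.225.22": "Statements on equality, similarity and dissimilarity",
}

def expand(notation):
    """List a UDC code together with all its broader classes, root first."""
    digits = notation.replace(".", "")
    chain = []
    while digits:
        # UDC notation groups digits in threes, separated by dots.
        dotted = ".".join(digits[i:i + 3] for i in range(0, len(digits), 3))
        if dotted in UDC_SCHEDULE:
            chain.append((dotted, UDC_SCHEDULE[dotted]))
        digits = digits[:-1]          # drop one digit: one step up the tree
    return list(reversed(chain))

for code, label in expand("161.225.22"):
    print(code, "=", label)
</syntaxhighlight>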
The UDC, though, differently from Dewey and other bibliographic systems, had the potential to exceed the function of ordering alone. Its complex notation system could classify phrases and thoughts in the same way as it would classify a book, going well beyond the sole function of classification and becoming a real language. One could in fact express whole sentences and statements in UDC format<ref>“The UDC and FID: A Historical Perspective.” The Library Quarterly 37, no. 3 (July 1, 1967): 268-270.</ref>. The fundamental idea, described in French by the word “dépouillement”, was that books and documentation could be broken into their constitutive sentences and boiled down to a set of universal concepts, regulated by the decimal system. This would make it possible to express objective truths in a numerical language, fostering international exchange beyond translation and making the work of science easier by regulating knowledge with numbers. One has to set this idea in its time, shaped by positivism and by the belief in the unhindered potential of science to obtain objective universal knowledge. Especially taking into account the arbitrariness of the decimal structure, the idea sounds today doubtful, if not preposterous.

This linguistico-numeric element of the UDC, which enables the expression of fundamental meanings through numbers, plays a key role, though, in the oeuvre of Paul Otlet. Taking into account his overall path, one is brought to think that numerical knowledge would be the first step towards a science of combining these basic sentences to produce new meaning in a systematic way. When one looks at Monde, Otlet's second publication, from 1935, the continuous reference to multiple algebraic formulas that describe how the world is composed suggests that one could at some point “solve” such equations, and modify the world accordingly.<ref>Otlet, Paul. Monde, essai d’universalisme: connaissance du monde, sentiment du monde, action organisée et plan du monde. Editiones Mundaneum, 1935: XXI-XXII.</ref>

As a complementary part to the Traité de Documentation, which described the systematic classification of knowledge, Monde set the basis for the transformation of this knowledge into new meaning.
 
  
Otlet wasn't the first to envision an 'algebra of thought'. It has been a recurring topos of modern philosophy, under the influence of scientific positivism and in concurrence with the development of mathematics and physics. Even though one could trace it back to Ramon Llull and even earlier forms of combinatorics, the first to consistently undertake this scientific and philosophical challenge was Gottfried Leibniz. The German philosopher and mathematician, a precursor of the field of symbolic logic that was developed later in the 20th century, researched a method by which statements could be reduced to minimum terms of meaning. He famously searched for a language which “... will be the greatest instrument of reason,” for “when there are disputes among persons, we can simply say: Let us calculate, without further ado, and see who is right”.<ref>Leibniz, Gottfried Wilhelm, The Art of Discovery 1685, Wiener: 51.</ref>

His inquiry, too, was divided into two phases. The first, analytic, the characteristica universalis, was a universal conceptual language for expressing meanings, of which we only know that it worked with prime numbers. The second, synthetic, the calculus ratiocinator, was the algebra that would allow operations between the meanings, of which there is even less evidence.

The idea of a calculus was clearly related to the infinitesimal calculus, a fundamental development that Leibniz conceived in the field of mathematics, and that Newton concurrently developed and popularized. Even though not much remains of Leibniz's work on this 'algebra of thought', the task was later taken on by mathematicians and logicians in the 20th century. Most famously, and curiously enough in the same years in which Otlet was publishing the Traité and Monde, the logician Kurt Gödel used the same idea of a translation into prime numbers to demonstrate his incompleteness theorem.<ref>https://en.wikipedia.org/wiki/G%C3%B6del_numbering</ref> The fact that the characteristica universalis only made sense in the fields of logic and mathematics is due to the fundamental problem presented by a mathematical approach to truth beyond logical truth. While such a problem was not yet evident at the time, it would emerge in the duality of language and categorization, as it did later with Otlet's UDC.
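
The prime-number translation that both Leibniz's characteristica and Gödel's proof relied on can be shown with a toy example. The sketch below is not Gödel's actual arithmetization of logical syntax, just a minimal illustration of its central trick: a sequence of symbol codes is packed into a single integer by raising the n-th prime to the n-th code, and because factorization into primes is unique, the sequence can always be recovered intact.

<syntaxhighlight lang="python">
def first_primes(n):
    """Return the first n primes by trial division (fine for a toy)."""
    primes, candidate = [], 2
    while len(primes) < n:
        if all(candidate % p for p in primes):
            primes.append(candidate)
        candidate += 1
    return primes

def encode(codes):
    """Gödel-style encoding: the i-th prime raised to the i-th symbol code."""
    number = 1
    for p, c in zip(first_primes(len(codes)), codes):
        number *= p ** c
    return number

def decode(number):
    """Recover the codes by dividing out each prime (codes must be >= 1)."""
    codes, factor = [], 2
    while number > 1:
        exponent = 0
        while number % factor == 0:
            number //= factor
            exponent += 1
        if exponent:
            codes.append(exponent)
        factor += 1
    return codes

message = [3, 1, 4, 1]        # a toy 'statement' as a sequence of symbol codes
n = encode(message)           # 2**3 * 3**1 * 5**4 * 7**1 = 105000
assert decode(n) == message
print(n, decode(n))
</syntaxhighlight>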
The relation between the organizational and linguistic aspects of knowledge is also one of the open issues at the core of the field of web search, which at first sight seems less interested in objective truths. At the beginning of the Web, around the mid-90s, two main approaches to searching for information online emerged: the web directory and web crawling. Some of the first search engines, like Lycos or Yahoo!, started with a combination of the two. The web directory consisted of the human classification of websites into categories, done by an “editor”; crawling consisted of the automatic accumulation of material by following links, with various rudimentary techniques to assess the content of a website. With the exponential growth of web content, directories were soon dropped in favour of the more efficient automatic crawling, which in turn came to generate so many results that quality became of key importance: quality in the sense both of the assessment of a webpage's content in relation to keywords, and of the sorting of results according to their relevance.
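
The difference between the two approaches can be caricatured in a few lines of code. In the toy model below, the 'web' is just an in-memory dictionary and all the addresses are invented: the directory answers a query from hand-made categories, while the crawler accumulates pages by following links outward from a seed, as the early spiders did.

<syntaxhighlight lang="python">
from collections import deque

# A hand-made directory: editors file websites under topics.
DIRECTORY = {
    "astronomy": ["telescopes.example", "stars.example"],
    "news": ["gazette.example"],
}

# A toy web: each page maps to the links found on it.
WEB = {
    "portal.example": ["telescopes.example", "gazette.example"],
    "telescopes.example": ["stars.example"],
    "stars.example": ["portal.example"],
    "gazette.example": [],
}

def directory_search(topic):
    """The directory approach: look the topic up in human-made categories."""
    return DIRECTORY.get(topic, [])

def crawl(seed):
    """The crawling approach: accumulate every page reachable from a seed."""
    seen, queue = set(), deque([seed])
    while queue:
        page = queue.popleft()
        if page in seen:
            continue
        seen.add(page)
        queue.extend(WEB.get(page, []))   # follow the links breadth-first
    return seen

print(directory_search("astronomy"))
print(sorted(crawl("portal.example")))
</syntaxhighlight>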
Google's hegemony in the field has mainly been obtained by translating the relevance of a webpage into a numeric quantity according to a formula, the infamous PageRank algorithm. This value is calculated from the relational importance of the webpage in which a word is placed, based on how many other websites link to that page. The classification part is long gone, and linguistic meaning is also structured along automated functions. What is left is reading the formation of the network in numerical form, capturing the human opinions diffused in hyperlinks: both about which word links to which webpage, and about which webpage is in general more important.
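
The core of the mechanism fits in a short sketch. The code below implements the original published formulation of PageRank, in its common normalized variant, on an invented four-page graph; the production algorithm has long since absorbed many additional signals. Each page divides its score evenly among the pages it links to, while the damping factor models a surfer who occasionally jumps to a random page.

<syntaxhighlight lang="python">
# Each page maps to the pages it links to; the graph is invented.
LINKS = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Power iteration of PR(p) = (1 - d) / N + d * sum PR(q) / out(q)."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)     # rank is split evenly
            for target in outlinks:                # among the linked pages
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# The most linked-to page, "c", ends up with the highest score.
for page, score in sorted(pagerank(LINKS).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
</syntaxhighlight>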
In the same way as the UDC systematized documents via a notation format, the systematization of relational importance in numerical format brings functionality and efficiency.

In this case, though, the translation is value-based rather than linguistic, quantifying network attention independently from meaning. The interaction with the other infamous Google algorithm, AdSense, means that an economic value is intertwined with the PageRank position.

The influence and profit deriving from how high a search result is placed mean that the relevance of a word-website relation in Google's search results translates into an actual relevance in reality.

We could posit that even though both Otlet and Google claim for themselves the task of “organizing knowledge”, the approaches at the foundation of the respective projects sit at opposite corners from an epistemological point of view. The UDC is an example of an analytic approach, which acquires new knowledge by breaking existing knowledge down into its components, based on objective truths. Its propositions could be exemplified by the sentences “Logic is a subdivision of Philosophy”, or “PageRank is an algorithm, part of the Google search engine”. PageRank is instead a purely synthetic one, which starts from the sole form of the network, devoid in principle of intrinsic meaning or truth, and builds a model of the network's relational truths. Its propositions could be exemplified by “Wikipedia is of utmost relevance”, or “The University of the District of Columbia is the most relevant meaning of the word 'UDC'”.

We (and Google) can read the model of reality created by the PageRank algorithm (and by all the other algorithms that were added over the years<ref>A fascinating list of all the algorithmic components of Google search is at https://moz.com/google-algorithm-change .</ref>) in two different ways. It can be considered a device that 'just works' and does not pretend to be true, but that can give results which are useful in reality, a view we can call “pragmatic”; or we can instead see this model as a growing and improving construction that aims in the end to coincide with reality, a view we can call “utopian”. It is no coincidence that these two views fit neatly the two stereotypical faces of Google: the idealistic Silicon Valley visionary one, and the cynical corporate capitalist one.
  
It is of relative importance which of the two sides we believe in our perspective. The key issue remains that such a structure has become so influential that it produces now its own effects on reality, that its algorithmical relative truths are increasingly taken as objective truths due to the power structure they circulate in.
+
For our perspective, it is of relative importance which of the two sides we believe in. The key issue remains that such a structure has become so influential that it produces now its own effects on reality, that its algorithmical truths are more and more considered as objective truths. While the utility and importance of a search engine like Google are out of the question, it is necessary to be alert about such concentrations of power. Especially if they are controlled solely by a corporation, which, beyond mottos and utopias, has by definition the sole duty of making profits and obeying its stakeholders.
While the usefulness and influence of a search engine like Google are out of the question, it is necessary to be alert about such concentrations of power. Especially if they are controlled solely by a corporation, which, beyond mottos and utopias, has by definition the sole goal of making profits and obeying its stakeholders.
 

Revision as of 14:52, 2 February 2016


0. Innovation of the same

The PR imagery produced by and around the Mundaneum (disambiguation: the institution in Mons) often hints, through a series of 'samenesses', at a fundamental continuity between Otlet's endeavor and Internet/Google's products. A good example is the image below, where the drawers of the Mundaneum (disambiguation: Otlet's utopia) morph into the servers of one of Google's datacenters.
Dick Reckard


INCOMPLETE DRAFT


From industrial heartland to the Internet age (screen-capture). Video published by The Mundaneum, 2014

This stance is not limited to images: a recurring discourse that shapes some of the exhibitions taking place in the Mundaneum maintains that the dream of the Belgian utopian has been kept alive in the development of internetworked communications, and currently finds its spiritual successor in the products and services of Google. Even though there are many connections and similarities between the two endeavors, one cannot ignore as a negligible detail the fact that Otlet was an internationalist, a socialist and a utopian, that his projects were not profit oriented, and, most importantly, that he was living in the temporal and cultural context of modernism at the beginning of the 20th century. The constructed identities and continuities detach Otlet and the Mundaneum from their specific historical frame, ignoring the different scientific, social and political milieus involved. Such narratives thus exclude the discordant or disturbing elements that inevitably surface when one considers such a complex figure in his entirety.

This is not surprising, given the parties involved in the discourse: this type of instrumental identities and differences fits quite well the rhetorical tone of Silicon Valley. For example, it is common for newly launched IT products to be described as groundbreaking, innovative and 'different from anything seen before'. In other situations, instead, there is the complementary habit of stressing that a product is 'exactly the same' as something else that already existed[1]. While novelty and difference serve to surprise and inspire wonder, sameness is there instead to reassure and comfort. For example, Google Glass was marketed as revolutionary and innovative, but when it was attacked for its blatant privacy issues, some defended it as just a camera and a phone joined together. The sameness-difference couple fulfills a clear function: on the one hand, it suggests that technological advancements might dramatically alter the way we live, and that we have to be ready to give up our old-fashioned ideas about life and culture when innovation comes. On the other hand, it suggests that we should not worry about these changes, and that society has always evolved through such disruptions, undoubtedly for the better. For each groundbreaking new invention that is questioned, there is a previous invention that aimed for the same ideal, potentially with just as many detractors... Great minds think alike, after all.

This sort of ahistorical attitude pervades techno-capitalist milieus, drawing a cartoonish view of the past, punctuated by great men and great inventions, a sort of technological variant of Carlyle's Great Man Theory. In this view, the Internet becomes the invention of a few father/genius figures, rather than the result of a long and complex interaction of the diverging efforts and interests of academics, entrepreneurs and national governments. This instrumental reading of the past is consistent with much of the theoretical ground on which the Californian Ideology[2] stands: a conception of history pervaded by various strains of technological determinism (from Marshall McLuhan to Alvin Toffler[3]) and capitalist individualism (in generic neoliberal terms, up to the fervent objectivism of Ayn Rand).

The appropriation of Paul Otlet's figure as Google's grandfather is this kind of historical simplification, and the samenesses this tale is made of are not without foundation. Many concepts and ideals of documentation theories reappeared in cybernetics and information theory, and are therefore present as well in the narrative of many IT corporations, as in Mountain View's case. With the intention of re-establishing a historical dimension to the matter, it might be more interesting to play the "exactly the same" game ourselves, rather than trying to dispel the advertised continuum of the 'Google of paper'. By choosing to focus on other types of analogies in the story, we can perhaps contribute a narrative that is more respectful of the complexity of the past, and more telling about the problems of the present.

Following are three such 'comparisons', which focus on three aspects of continuity between the documentation theories and archival experiments Otlet was involved in, and the cybernetic theories and practices of which Google's capitalist enterprise is an exponent. First is a look at the conditions of workers in information infrastructures, fundamental for these systems to work but often forgotten or displaced. Then follows an account of the elements of distribution and control that appear both in the idea of a 'Réseau Mundaneum' and in the contemporary functioning of datacenters, and the resulting interaction of these with other types of infrastructure. Finally, there is a brief analysis of the two approaches to the 'organization of the world's knowledge', examining their regimes of truth and the issues that come with them. Hopefully these three short pieces can provide some additional ingredients to adulterate the sterile recipe of the Google-Otlet sameness.

a. Do androids dream of mechanical turks?

In a drawing titled "Laboratorium Mundaneum", Paul Otlet depicted his project as a massive factory, processing books and other documents into end products, rolled out by a UDC locomotive. In fact, just like a factory, the Mundaneum was dependent on the bureaucratic and logistic modes of labour organization developed for industrial production. Looking at this and other written and drawn sketches, one can wonder: who made up the workforce of such factories?

In his Traité de Documentation, Otlet describes extensively the thinking machines and the tasks of intellectual work into which the Fordist chain of documentation is broken down. In the subsection dedicated to the personnel that would operate these systems, though, the only role described at length is that of the Bibliothécaire. The lengthy chapter sets out what training such a person should follow and what characteristics are necessary for the role, and so on; only a brief mention is made of the existence of "bibliothécaires-adjoints, rédacteurs, copistes, gens de service"[4]. There seems to be no further description or depiction of the personnel that would write, distribute and search the millions of index cards needed to keep the archive running, an impossible task for the Bibliothécaire alone.

A good answer to this question comes instead from a photograph from around 1930, taken in the Palais Mondial, where we can see Paul Otlet together with the rest of the équipe. In this beautiful group picture, we see that the workforce that kept the archival machine running was made up of women, about whom we have barely any information. As with telephone switching or early software development[5], gender stereotypes and discrimination assigned female workers to repetitive tasks that required specific knowledge and precision.

In the ideal image described in the Traité, all the tasks of collection, translation and distribution would be completely technical, seemingly without the necessity of any human intervention. In the meantime, though, the Mundaneum hired tens of women to carry out those tasks. The existing human-run version of the system was not considered a reference, as if it were some temporary in-between step to be overcome as soon as possible, something staining the project with its vulgarity.

Notwithstanding the incredible advancement of information technologies and the automation of innumerable tasks in the collection, processing and distribution of information, this same pattern is very much present nowadays as well. All the automatic repetitive tasks that 'technology' does for us are still based on human labour in one way or another. And, differently from the industrial worker, who obtained recognition through political movements and struggles, the role of many cognitive workers is still hidden or under-represented. Computational linguistics, neural networks, optical character recognition: all the most amazing machinic performances are still based on humans performing huge amounts of repetitive intellectual tasks that the software can 'learn from', or that the software cannot do with the same efficiency. Automation didn't really free us from labour; it just shifted where, when and whose labour has to happen, a process that has been named "heteromation"[6]. Mechanical turks, content verifiers, annotators of all kinds... There is a multitude of tasks that have to happen for the software we use, invisible to us but accomplished by humans. Who are they? When possible, work is outsourced to foreign English-speaking countries with lower wages, like India. In the Western world, instead, it follows the usual pattern: female, lower income, ethnic minorities.

An interesting case of heteromated labour are the so-called ScanOps[7], a set of Google workers with a different type of badge, isolated in one section of the Mountain View complex and secluded from the rest of the workers by strict access permissions and fixed time schedules. The task of these workers consists of scanning the pages of printed books to be added to the Google Books database, a job that in some cases (rare or fragile books, for example) is still more convenient to do by hand. Mostly female and mostly from ethnic minorities, these workers are mentioned neither in Google Books nor elsewhere; in fact, the whole process of scanning is kept completely secret. Even though the secrecy around this kind of labour is usually justified by the need to protect trade secrets, it nonetheless continues the attitude of hiding the human part in the machine's work. This is all the more obvious in contrast with the celebration of other types of human workers, in the positions deemed creative and ingenious, such as designers and programmers.

Even though there is a tendency to hide the human labour necessary for certain automation to take place, some evidence of the workforce's existence remains in the result of its labour. In the case of Google Books employees, for example, it is possible to encounter the photos of their hands that mistakenly ended up in the digital version of the scanned book online[8]. Whether the tendency to hide the human role is due to the unfulfilled wish for total automation, to the desire to avoid the bad publicity of low wages and precarious work, or to the wish to keep an aura of mystery around machines, is as unclear for Google Books as it was for the Palais Mondial. It is reassuring, though, to know that the products still keep some traces of the work, and that even with the progressive removal of human signs from automated processes, the workers' presence never disappears completely. This presence remains proof of the materiality of information production, and becomes a sign of the economies and paradigms of efficiency and profitability involved.

b. Centralization - distribution - infrastructure

In 2013, while Prime Minister Di Rupo was celebrating the beginning of the second phase of construction of the Saint-Ghislain datacenter, a few hundred kilometers away a very similar situation was starting to unfold. In the municipality of Eemsmond, in the Dutch province of Groningen, the local Groningen Sea Ports and the development agency NOM were in secret deals with another temporarily named firm, "Saturn" [crystal comp], to deploy a datacenter in the small port of Eemshaven, now an infrastructural wonder. When, some months later, the party was revealed to be Google, Harm Post, director of Groningen Sea Ports, commented: "Just ten years ago Eemshaven was the standing joke of the ports, a case of industrial development in the Netherlands to look down upon, the planning failure of the last century. And now Google is building a very large data center here, which is 'pure advertisement' for Eemshaven and the data port."[9] Again, further details of the tax cuts in the deal were not disclosed and, once finished, the datacenter will provide 150 jobs in the region.

Another territory had the luck to be chosen by Google, just like Mons, but what are the criteria behind such a selection? For one, datacenters necessarily need to interact with existing infrastructures and flows of various types. Technically speaking, there are three prerequisites: being near a substantial source of electrical power (the datacenter will consume twice as much as the whole city of Groningen); being near a source of clean water, for its massive cooling demands; and being near Internet infrastructure that can assure adequate connectivity. There is then a whole other set of non-technical elements, which we can sum up as the social, economic and political "climate", that proved favorable both in Mons and in Eemshaven.

The push behind the construction of datacenters in new locations, rather than the enlargement of existing ones, is partly due to the rapidly growing importance of "Software as a Service", so-called cloud computing, which means the rental of computational power from a central provider. With the rise of the SaaS paradigm, the geographical and topological placement of a datacenter becomes of strategic importance in achieving lower latencies and a more stable service. For this reason, Google has for the last ten years been pursuing a policy of end-to-end connection between its datacenters and the user interfaces. That included buying leftover fiber networks[10], entering the business of underwater sea cables[11] and building new datacenters, including the ones in Mons and Eemshaven.

The spread of datacenters around the world, along the main network cables crossing the continents, represents a new phase in the diagram of the Internet. It should not be confused with the idea of decentralization that was a cornerstone value in the early stages of interconnected networks.[12] During the rapid development of the Internet and the Web, the new tenets of immediacy, unlimited storage and exponential growth led to the centralization of content in increasingly large server farms. Paradoxically, it is now the growing centralization of all kinds of operations in datacenters that is fostering their distribution. The tension between centralization and distribution, and the dependence on neighbouring infrastructures such as the electrical grid, is not an exclusive feature of contemporary data storage and networking models. Again, suggestions of something quite similar emerge from the history of the Mundaneum, illustrating how closely these issues relate to the logistic organization of production first implemented during the industrial revolution, and theorized within modernism.

Centralization was seen by Otlet as the most efficient way to organize content, especially in view of international exchange[13]. This already generated space problems back then: the Mundaneum archive counted 16 million entries at its peak, occupying around 150 rooms. The cumbersome footprint, and the growing difficulty of finding stable locations for it, contributed to the conviction that the project should be included in the plans of new modernist cities. In the beginning of the 1930s, with the Mundaneum starting to lose support from the Belgian government, Otlet tried to find a new site for it as part of a proposed 'Cité Mondiale', which he tried in different locations [dennis] with different approaches.

Among the various attempts, he participated in the competition for the development of the Left Bank in Antwerp. The most famous modernist urbanists of the time were invited to plan, from scratch, the development of the left side of the river, at the time completely unbuilt. Otlet lobbied for the insertion of a Mundaneum in the projects, stressing how it would create hundreds of jobs for the region. He also flattered Flemish pride by stressing how Antwerp's inhabitants, often harder working than those of Brussels, would finally obtain their deserved recognition, elevating their city to "World City" status.[14] He partly succeeded in his propaganda, seeing that apart from his own proposal, developed in collaboration with Le Corbusier, many other participants included Otlet's Mundaneum as a key facility in their plans. In these development proposals, Otlet's archival infrastructure was shown in interaction with the flows already running through the city, like the industrial docks, factories, the railway and the newly constructed stock market.[15] The modernist utopia of the planned living environment already implied the organization of culture and knowledge by methods similar to the ones used for coal or electricity.

In the Traité de Documentation, published in 1934, there is a long speculation on the "Universal Network of Documentation", which would be responsible for the transfer of knowledge between different documentation centres such as libraries or the Mundaneum[16]. In fact, the existing Mundaneum would be just the first node of a wide network bound to expand to the rest of the world, the "Réseau Mundaneum". The nodes of this network are explicitly described in relation to 'post, railways and the press, those three essential organs of modern life which function unremittingly in order to unite men, cities and nations.'[17] In the same period, in letter exchanges with Patrick Geddes and Otto Neurath commenting on the potential of heliographies as a way to distribute knowledge, the three imagined the 'White Link', a network to distribute copies throughout a network of Mundaneums[18]. Thanks to this, the same piece of information would be serially produced and logistically distributed, a sort of "moving Mundaneum" facilitated by the railway system[19]. No wonder, then, that a main characteristic of the future Mundaneums was to be built next to a train station.

Through Otlet's plans for a "Réseau Mundaneum" we can already see some of the key transformations that reappear in today's evolving datacenter scenario. A drive for centralization in the first place, with the accumulation of materials that led to the monumental plans of World Cities. Parallel to this, the push for international exchange, which brought a vision of a distribution network. Thirdly, the resulting placement of the hypothetical nodes of such a network along strategic intersections of industrial and logistic infrastructure.

While the plan for Antwerp was in the end rejected in favour of more traditional housing development, 80 years later the legacy of the relation between existing infrastructural flows and the logistics of documentation storage is highlighted by the data-ports plan in Eemshaven. Since private companies are the privileged actors in these types of projects, the circulation of information increasingly responds to the same tenets that regulate the trade of coal or electricity. The very different welcome that traditional politics reserves for Google datacenters is a symptom of a new dimension of power in which information infrastructure plays a role. The celebrations and tax cuts that politicians lavish on these projects cannot be explained by the 150 jobs or the 'economic incentives' for a depressed region alone. They also indicate how party politics lives in awe of being peripheral to other forms of power, and wants to benefit from strategic positioning as well.


c. 025.45UDC; 161.225.22; 004.659GOO:004.021PAG.

The Universal Decimal Classification[20] system, developed by Paul Otlet and Henri La Fontaine on the basis of the Dewey Decimal Classification system, is still considered one of the most important realizations of the two men, as well as a cornerstone of Otlet's overall vision. Its adoption, revision and use until the present day demonstrate a thoughtful and successful approach to the challenge of classifying knowledge.

The UDC, differently from Dewey and other bibliographic systems, had the potential to exceed the function of ordering alone. The complex notation system could classify phrases and thoughts in the same way as it would classify a book, going well beyond the sole function of classification and becoming a real language. One could in fact express whole sentences and statements in UDC format[21]. The fundamental idea, described in French by the word "dépouillement", was that books and documentation could be broken into their constituent sentences and boiled down to a set of universal concepts, regulated by the decimal system. This would allow objective truths to be expressed in a numerical language, fostering international exchange beyond translation and making the work of science easier by regulating knowledge with numbers. One has to set this idea in its time, shaped by positivism and the belief in the unhindered potential of science to obtain objective universal knowledge. Especially taking into account the arbitrariness of the decimal structure, this today sounds doubtful, if not preposterous.
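
To make the mechanics of the notation concrete, here is a minimal sketch in Python of the decimal subdivision principle, using the class labels from note [20]; the dictionary and function are ours, purely illustrative:

    # A UDC-style decimal code encodes its own hierarchy: every extra digit
    # narrows the class, so a code's ancestors are simply its prefixes.
    # Labels below are taken from the example in note [20].
    UDC_CLASSES = {
        "1": "Philosophy",
        "16": "Logic",
        "161": "Fundamentals of Logic",
        "161.2": "Statements",
        "161.22": "Type of Statements",
        "161.225": "Real and ideal judgements",
        "161.225.2": "Ideal Judgements",
        "161.225.22": "Statements on equality, similarity and dissimilarity",
    }

    def ancestors(code):
        """List a code's classes from broadest to narrowest, by taking prefixes."""
        digits = code.replace(".", "")
        chain = []
        for length in range(1, len(digits) + 1):
            prefix = digits[:length]
            # UDC notation re-inserts a dot after every third digit
            dotted = ".".join(prefix[i:i + 3] for i in range(0, len(prefix), 3))
            if dotted in UDC_CLASSES:
                chain.append((dotted, UDC_CLASSES[dotted]))
        return chain

    for code, label in ancestors("161.225.22"):
        print(code, "-", label)
    # prints the chain from "1 - Philosophy" down to
    # "161.225.22 - Statements on equality, similarity and dissimilarity"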

This linguistico-numerical element of UDC, enabling fundamental meanings to be expressed by numbers, nevertheless plays a key role in the oeuvre of Paul Otlet. Taking into account his overall path, one is led to think that numerical knowledge would be the first step towards a science of combining these basic sentences to produce new meaning in a systematic way. When one looks at Monde, Otlet's second publication from 1935, the continuous reference to multiple algebraic formulas that describe how the world is composed suggests that one could at some point "solve" such equations, and modify the world accordingly.[22] As a complementary part to the Traité de Documentation, which described the systematic classification of knowledge, Monde set the basis for the transformation of this knowledge into new meaning.

Otlet wasn't the first to envision an 'algebra of thought'. It has been a recurring topos of modern philosophy, under the influence of scientific positivism and in concurrence with the development of mathematics and physics. Even though one could trace it back to Ramon Llull and even earlier forms of combinatorics, the first to consistently undertake this scientific and philosophical challenge was Gottfried Leibniz. The German philosopher and mathematician, a precursor of the field of symbolic logic, which developed later in the 20th century, researched a method by which statements could be reduced to minimum terms of meaning. He famously searched for a language which "... will be the greatest instrument of reason," for "when there are disputes among persons, we can simply say: Let us calculate, without further ado, and see who is right".[23]

His inquiry was divided into two phases, too. The first, analytic, the characteristica universalis, was a universal conceptual language to express meanings, of which we only know that it worked with prime numbers. The second, synthetic, the calculus ratiocinator, was the algebra that would allow operations between the meanings, of which there is even less evidence. The idea of this calculus was clearly related to the infinitesimal calculus, a fundamental development that Leibniz conceived in the field of mathematics, and that Newton concurrently developed and popularized. Even though not much remains of Leibniz's work on this 'algebra of thought', the task was later taken on by mathematicians and logicians in the 20th century. Most famously, and curiously enough in the same years in which Otlet was publishing the Traité and Monde, the logician Kurt Gödel used the same idea of a translation into prime numbers to demonstrate his incompleteness theorem.[24] The fact that the characteristica universalis only made sense in the fields of logic and mathematics is due to the fundamental problem presented by a mathematical approach to truth beyond logical truth. While this problem was not yet evident at the time, it would emerge in the duality of language and categorization, as it did later with Otlet's UDC.
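
The prime-number device both Leibniz and Gödel relied on can be given a rough, toy illustration (a didactic sketch, not Gödel's actual construction; the symbol codes below are invented): a sequence of symbol codes is packed into a single integer by using them as exponents of successive primes, and recovered again by factorization.

    # Toy Godel-style numbering: the k-th symbol's code becomes the exponent
    # of the k-th prime, so one integer encodes a whole statement and
    # factorization decodes it. The symbol codes are arbitrary stand-ins.
    def primes(n):
        """Return the first n primes by trial division (fine for a toy)."""
        found = []
        candidate = 2
        while len(found) < n:
            if all(candidate % p for p in found):
                found.append(candidate)
            candidate += 1
        return found

    def encode(symbol_codes):
        number = 1
        for p, code in zip(primes(len(symbol_codes)), symbol_codes):
            number *= p ** code
        return number

    def decode(number):
        codes = []
        for p in primes(32):  # enough primes for a short statement
            if number == 1:
                break
            exponent = 0
            while number % p == 0:
                number //= p
                exponent += 1
            codes.append(exponent)
        return codes

    # e.g. a three-symbol statement with (invented) codes 6, 5, 6:
    n = encode([6, 5, 6])   # 2**6 * 3**5 * 5**6 == 243000000
    print(n, decode(n))     # 243000000 [6, 5, 6]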

The relation between the organizational and linguistic aspects of knowledge is also one of the open issues at the core of the field of web search, at first sight less interested in objective truths. At the beginning of the Web, around the mid-90s, two main approaches to online search emerged: the web directory and web crawling. Some of the first search engines, like Lycos or Yahoo!, started with a combination of the two. The web directory consisted in the human classification of websites into categories, done by an “editor”; crawling, in the automatic accumulation of material by following links, with various rudimentary techniques to assess the content of a website. With the exponential growth of web content, web directories were soon dropped in favour of the more efficient automatic crawling, which in turn generated so many results that quality became of key importance: quality in the sense both of the assessment of a webpage's content in relation to keywords, and of the sorting of results according to their relevance.
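
The crawling approach can be sketched in a few lines: fetch a page, harvest its links, and repeat breadth-first. The snippet below is a deliberately naive sketch (real crawlers add politeness rules such as robots.txt, deduplication and content analysis), and the start URL is only a placeholder:

    # A bare-bones breadth-first crawler: download a page, collect its
    # <a href> links, queue them, repeat until a page budget is exhausted.
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(start_url, max_pages=10):
        seen, queue = set(), deque([start_url])
        while queue and len(seen) < max_pages:
            url = queue.popleft()
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
            except OSError:
                continue  # unreachable page: skip it
            parser = LinkParser()
            parser.feed(html)
            for link in parser.links:
                queue.append(urljoin(url, link))  # resolve relative links
        return seen

    print(crawl("https://example.com"))  # placeholder start URL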

Google's hegemony in the field has mainly been obtained by translating the relevance of a webpage into a numeric quantity according to a formula, the infamous PageRank algorithm. This value is calculated from the relational importance of the webpage where the word is placed, based on how many other websites link to that page. The classification part is long gone, and linguistic meaning is also structured along automated functions. What is left is reading the formation of the network in number form, capturing the human opinions represented by hyperlinks: both which word links to which webpage, and which webpage is in general more important. In the same way as UDC systematized documents via a notation format, the systematization of relational importance in numerical format brings functionality and efficiency. In this case, though, rather than linguistic, the translation is value-based, quantifying network attention independently of meaning. The interaction with the other infamous Google algorithm, AdSense, means that an economic value is intertwined with the PageRank position. The influence and profit deriving from how high a search result is placed mean that the relevance of a word-website relation in Google search results translates into an actual relevance in reality.
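
The core of the mechanism can be sketched with the textbook power-iteration formulation (a minimal illustration, far simpler than Google's production system; the three-page link graph is invented):

    # Minimal PageRank by power iteration: a page's score is the probability
    # that a "random surfer" ends up there, following links with probability
    # DAMPING and jumping to a random page otherwise.
    DAMPING = 0.85

    def pagerank(links, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - DAMPING) / len(pages) for p in pages}
            for page, outgoing in links.items():
                targets = outgoing or pages  # dangling page: spread evenly
                share = DAMPING * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
            rank = new_rank
        return rank

    # An invented three-site web, just to see the redistribution at work:
    graph = {
        "otlet.example": ["mundaneum.example"],
        "mundaneum.example": ["otlet.example", "google.example"],
        "google.example": ["mundaneum.example"],
    }
    for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
        print(f"{page}: {score:.3f}")

Pages accumulate score from the pages that link to them, weighted by the linkers' own scores: this is the 'relational importance' described above.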

We could posit that even though both Otlet and Google claim for themselves the task of “organizing knowledge”, the approaches at the foundation of the respective projects stand at opposite corners from an epistemological point of view. UDC is an example of an analytic approach, which acquires new knowledge by breaking existing knowledge down into its components, based on objective truths. Its propositions could be exemplified by the sentences “Logic is a subdivision of Philosophy”, or “PageRank is an algorithm, part of the Google search engine”. PageRank, instead, is a purely synthetic approach, which starts from the sole form of the network, devoid in principle of intrinsic meaning or truth, and builds a model of the network's relational truths. Its propositions could be exemplified by “Wikipedia is of utmost relevance”, or “The University of the District of Columbia is the most relevant meaning of the word 'UDC'”.

We (and Google) can read the model of reality created by the PageRank algorithm (and all the other algorithms that have been added over the years[25]) in two different ways. It can be considered a device that 'just works' and does not pretend to be true, but can give results which are useful in reality, a view we can call “pragmatic”; or we can instead see this model as a growing and improving construction that aims in the end to coincide with reality, a view we can call “utopian”. It is no coincidence that these two views fit neatly the two stereotypical faces of Google: the idealistic Silicon Valley visionary one, and the cynical corporate capitalist one.

From our perspective, it is of relative importance which of the two sides we believe in. The key issue remains that such a structure has become so influential that it now produces its own effects on reality, and that its algorithmic truths are more and more considered objective truths. While the utility and importance of a search engine like Google are out of the question, it is necessary to be alert to such concentrations of power. Especially if they are controlled solely by a corporation, which, beyond mottos and utopias, has by definition the sole duty of making profits and obeying its shareholders.
  1. A good account of this phenomenon is given by David Golumbia. http://www.uncomputing.org/?p=221
  2. As described in the classic text looking at the ideological ground of Silicon Valley culture. http://www.hrc.wmin.ac.uk/theory-californianideology-main.html
  3. For an account of Toffler's determinism, see http://www.ukm.my/ijit/IJIT%20Vol%201%202012/7wan%20fariza.pdf .
  4. Otlet, Paul. Traité de documentation: le livre sur le livre, théorie et pratique. Editiones Mundaneum, 1934: 393-394.
  5. http://gender.stanford.edu/news/2011/researcher-reveals-how-%E2%80%9Ccomputer-geeks%E2%80%9D-replaced-%E2%80%9Ccomputergirls%E2%80%9D
  6. Ekbia, Hamid, and Bonnie Nardi. “Heteromation and Its (dis)contents: The Invisible Division of Labor between Humans and Machines.” First Monday 19, no. 6 (May 23, 2014). http://firstmonday.org/ojs/index.php/fm/article/view/5331.
  7. The name ScanOps was first introduced by artist Andrew Norman Wilson, who found out about this category of workers during his artistic residency at Google in Mountain View. See http://www.andrewnormanwilson.com/WorkersGoogleplex.html .
  8. As collected by Krissy Wilson on her http://theartofgooglebooks.tumblr.com .
  9. http://www.rtvnoord.nl/nieuws/139016/Keerpunt-in-de-geschiedenis-van-de-Eemshaven .
  10. http://www.cnet.com/news/google-wants-dark-fiber/ .
  11. http://spectrum.ieee.org/tech-talk/telecom/internet/google-new-brazil-us-internet-cable .
  12. See Baran, Paul. “On Distributed Communications.” Product Page, 1964. http://www.rand.org/pubs/research_memoranda/RM3420.html .
  13. Pierce, Thomas. ‘Mettre des pierres autour des idées’. Paul Otlet, de Cité Mondiale en de modernistische stedenbouw in de jaren 1930. PhD dissertation, KULeuven, 2007: 34.
  14. Ibid: 94-95.
  15. Ibid: 113-117.
  16. Otlet, Paul. Traité de documentation: le livre sur le livre, théorie et pratique. Editiones Mundaneum, 1934.
  17. Otlet, Paul. Les Communications MUNDANEUM, Documentatio Universalis, doc nr. 8438
  18. Van Acker, Wouter. “Internationalist Utopias of Visual Education: The Graphic and Scenographic Transformation of the Universal Encyclopaedia in the Work of Paul Otlet, Patrick Geddes, and Otto Neurath.” Perspectives on Science 19, no. 1 (January 19, 2011): 68-69.
  19. Ibid: 66.
  20. The Decimal part of the name means that any record can be further subdivided into tenths, virtually infinitely, according to an evolving scheme of depth and specialization. For example, 1 is “Philosophy”, 16 is “Logic”, 161 is “Fundamentals of Logic”, 161.2 is “Statements”, 161.22 is “Type of Statements”, 161.225 is “Real and ideal judgements”, 161.225.2 is “Ideal Judgements” and 161.225.22 is “Statements on equality, similarity and dissimilarity”.
  21. “The UDC and FID: A Historical Perspective.” The Library Quarterly 37, no. 3 (July 1, 1967): 268-270.
  22. Otlet, Paul. Monde, essai d’universalisme: connaissance du monde, sentiment du monde, action organisée et plan du monde. Editiones Mundaneum, 1935: XXI-XXII.
  23. Leibniz, Gottfried Wilhelm, The Art of Discovery 1685, Wiener: 51.
  24. https://en.wikipedia.org/wiki/G%C3%B6del_numbering
  25. A fascinating list of all the algorithmic components of Google search is at https://moz.com/google-algorithm-change .