»The low-income fringe of US society, which grew significantly during the crisis years, includes people who rely on state-run food support. These benefits still bear the name ›food stamps‹, a term that refers to the post-World War II period, although a private company actually runs a digital system called ›Electronic Benefits Transfer‹, in short EBT, that handles the financial transactions. […] Last Saturday, around 9 a.m. [Oct. 12, 2013], the electronic EBT cards began to malfunction in some northern federal states. The food chains’ shopping peak time had not yet begun. One and a half hours later the EBT payment system failed on the West coast and shortly after in the East as well, down to Florida, at a time when many clients were beginning their shopping. […] Within a few days the situation escalated. The computer problems appeared to be of a serious nature and the EBT system stayed offline. Occasionally reports of organized supermarket plundering appeared on the Internet – it got dicey in gunmen’s country« (Kurz 2013:37, transl. F.H.). This is how journalist Constanze Kurz describes an infrastructural breakdown in the newspaper Frankfurter Allgemeine Zeitung, providing an impressive example of infrastructure becoming visible only during breakdown.
The aim of the following text is to develop a notion of how the visibility of database systems – understood as basic infrastructure in post-Fordist societies – can be raised. Infrastructure studies, a relatively new theoretical field, provides the theoretical framework, which is enriched with methods from media studies, media history and media art. First I will discuss infrastructure in general and the extent to which database systems can be addressed as infrastructure. Then we will look into the various dimensions of database infrastructure, such as time, space, membership, organizational structures and practices. This shall lead to a practical approach for making database systems more visible, by paying attention to recurring aspects of user interfaces that can help to identify underlying database infrastructures. My intention with this text is to develop a theoretical base for further practical, artistic explorations.
In their essay Towards Information Infrastructure Studies: Ways of Knowing in a Networked Environment (2010), Geoffrey Bowker, Karen Baker, Florence Millerand and David Ribes introduce a field of study which aims to observe and interpret everyday infrastructures, especially the computerized, networked ones that they call cyber-infrastructures (Bowker et al. 2010:97). There they define infrastructure as »vast sets of collective equipment necessary to human activities«. This could be stone, concrete, pipes or wires, but it includes protocols, standards and knowledge as well. The latter actually enable the common usage of infrastructure by its diverse users and maintainers, that is developers, administrators, managers and end users.
From this definition and from our own experience we can deduce that infrastructure is characterized by its distribution in space and time. At the same time, »Infrastructure typically exists in the background, it is invisible, and it is frequently taken for granted« (ibid.). And it is exactly because of this embeddedness that its relative invisibility makes sense. This, however, can also render the economies and social dynamics of infrastructure invisible, and it affects the maintainers, who experience low visibility and often enough receive low wages. Sociologists Susan L. Star and Karen Ruhleder furthermore observe a series of paired properties in infrastructure: »It is both engine and barrier for change; both customizable and rigid; both inside and outside organizational practices. It is product and process« (Star/Ruhleder 1996:114).
In their study Splintering Urbanism, urbanism researchers Stephen Graham and Simon Marvin describe the spreading of infrastructures as follows: »Infrastructure networks dramatically, but highly unevenly, ›warp‹ and refashion spaces and times of all aspects of interaction – social, economic, cultural, physical, ecological. Infrastructure networks are thus involved in sustaining what we might call ›sociotechnical geometries of power‹ in very real – but often very complex – ways (see Massey, 1993). They tend to embody ›congealed social interests‹ (Bijker, 1993)« (Graham/Marvin 2001:17). Referring to Bijker, they further develop the notion of the »congealed« or »frozen« when discussing infrastructure as capital: »As capital that is literally ›sunk‹ and embedded within and between the fabrics of cities, they represent long-term accumulations of finance, technology, know-how, organizational and geopolitical power« (ibid.:12).
The metaphor of frozenness in conjunction with infrastructure is also used by Geoffrey Bowker and Susan Star. They speak of software as frozen organizational discourse and compare it to machines, in which the processual and energetic parts of human work become frozen in technological form. »Modern information technologies […] embed and inscribe work in ways that are important for policy-makers, but which are often difficult to see. Where they are used to make decisions, or to represent decision-making processes, such technologies also act to embed those decisions. That is, the arguments, decisions, uncertainties and processual nature of decision-making are hidden away inside a piece of technology or in a complex representation. Thus values, opinions and rhetoric are frozen into codes, electronic thresholds and computer applications. Extending Marx, then, we can say that in many ways, software is frozen organizational discourse« (Bowker/Star 1994:187).
We can already see that infrastructure exceeds concrete and pipes, but it gets even more complex, because infrastructure also changes over time, as do its users and its usage. Infrastructure carries the history of its emergence within it: discussions between engineers, decisions about user interfaces (from the valve to computer interfaces), legal fights and power struggles are part of the standardization processes. Once finished, a fixed standard creates stability for a certain period of time, until economic, technological or social changes give reason to change the standard again. The process of standardization furthermore implies economic advantages for individual actors whenever they own the patents on a standard’s key technologies.
The dimension of time further implies that infrastructure is designed to function over very long periods: streets, rail tracks and drainage systems were developed to operate over prolonged time spans. This is partly achieved through the interchangeability of modular parts – just recall the individual elements of a railroad system. Long-term users in particular expect it to function reliably. They get used to it, a process that is reinforced by the fact that infrastructural practices are often built on top of already existing infrastructures. Computer networks build on electricity networks (and, in a second step, also reciprocally) and so on. This correlates with the already discussed invisibility of infrastructure – as long as it works, it stays out of sight. Conversely, infrastructure gains visibility upon its breakdown, be it a water main break or the outage of Twitter on June 21, 2012, caused by a »cascading bug in one of our infrastructure components« (Handelsblatt, 2012-6-21).
Furthermore, organizational structures become established that allow for financial and legal forward planning. We may look at the International Telecommunication Union/Radiocommunication Sector as an example: it coordinates the limited space in geostationary orbit between the diverse governmental and non-governmental actors.
In terms of databases, the Conference on Data Systems Languages (CODASYL), which was convened by the US Department of Defense, was of infrastructural importance. It not only established the programming language COBOL but also created the basis for a standardization of database technologies. A sub-group, the Data Base Task Group, delivered its first report, COBOL Extensions to Handle Data Bases, in 1968. Subsequent reports became known as the CODASYL network model. The breakthrough, however, was achieved by the relational model, developed by Edgar F. Codd at IBM in 1970. It differed significantly from the CODASYL proposal, and it took until the 1980s for it to become the quasi-standard of today’s database technology.
CODASYL is also a relevant example of a government-funded organization. Infrastructure studies is not solely concerned with the histories of the development of infrastructural products and services but also with the history of the organizations that create their context. It is they who define, categorize, organize, discuss and create standards, and I assume it would be easy to fill a theatre stage with the ongoing drama of power struggles, lobbying, friendship and competition surrounding this process. »We can not do the history of software without doing the history of their surrounding organizations«, writes the technology historian Geoffrey Bowker (Bowker 2010:102).
CODASYL members in 1969
- James P. Fry – The MITRE Corporation
- Mary K. Hawes – Information Systems Leasing Corp.
- William C. McGee – IBM International Business Machines
- Tax A. Metaxides – Bell Telephone Laboratories
- William Olle – RCA Radio Corporation of America
- Jonas Rabin – Western Electric
- Martin J. Rich – ESSO Mathematics and Systems, Inc.
- Richard F. Schubert – Goodrich Chemical Company
- Edgar H. Sibley – University of Michigan
- Aria E. Weinert – Naval Command System Support Activity
- Alfred H. Vorhaus – SDC System Development Corporation
- John W. Young – The National Cash Register Company
Ill.: Members of CODASYL in 1969 (CODASYL 1969a).
Membership is another facet of infrastructural organizations. »Strangers and outsiders encounter infrastructure as a target object to be learned about. New participants acquire a naturalized familiarity with its objects as they become members« (Star/Ruhleder 1996:116). To be naturalized in this context implies that, to its members, a certain infrastructure appears as naturally functioning: »It has always been working like this«, the qualified user may say – while »newbies« still struggle with details and unanswered questions. So over time, one eventually becomes a member, which would indicate a functioning infrastructure; or, if unsuccessful, potential members are pushed away.
Whether a structure becomes infrastructure depends on developers and users. In their study Steps Toward an Ecology of Infrastructure: Design and Access for Large Information Spaces, Susan L. Star and Karen Ruhleder looked at a community of biologists who were already users of a scientific community system or were about to enter it. The Worm Community System enabled networked interactions between US scientists and their work on genetic sequencing. While the software package, globally speaking, was well developed and documented, individual users on the local level met resistance when trying to actually implement the software. Local IT departments, for instance, were not willing to run the UNIX-based system, since they were used to Windows servers. Tension also arose between the software developers, who globally opted for UNIX, and the users, who preferred Apple Macintosh computers over UNIX. »On one level it is a discussion about operating systems, on another it is representative of two world views and sets of values with respect to the relationship between technology and work, and the relationship between the tool and user« (Star/Ruhleder 1996:130). Star and Ruhleder describe the tension between a flexible, familiar usage that is adapted to local needs and the global necessity for standards and continuity. This tension grows with the global distribution of an infrastructure. They conclude: »An infrastructure occurs when the tension between local and global is solved« (Star/Ruhleder 1996:117).
Another impressive example of this tension between local and global is the Leipzig University Library. Over the last 15 years its systems have changed from book handling by hand to computerized handling involving a complex set of database software systems. This comprises not only the library’s catalogues but also media within the library’s reach through inter-library loans or, more generally, through electronic licenses from large distributors or aggregators such as Ebsco, ExLibris or ProQuest (some of which are ordered almost in real time following users’ demands). It includes data, e.g. titles or availability, that is kept locally at the library, which consists of several branches spread over the city. It incorporates data that is collected and consolidated with associated libraries through the Südwestdeutscher Bibliotheksverbund (South-West German Library Network). It comprises data from the German National Library and other national libraries. Recently, funded by the European Union and the federal state of Saxony, the library has been moving its search infrastructure from a relational database approach, the OPAC, to a search-engine-like setting, employing and co-developing open source solutions like Solr and VuFind. Another emerging approach is the use and provision of Open Data access through APIs, where the library not only receives data from other services but also publicly provides its own data, e.g. title metadata. This listing can only roughly illustrate the tension between local needs and the global interaction of these interdependent systems, but it is obvious that huge changes have taken place in this library, where two decades ago insular local electronic databases were installed, while today a technological landscape distributes data locally and globally (Hunger 2014).
It can make a difference whether infrastructure is state-owned or privately run, and the form of ownership often differs among the various organizational structures that may be involved in an infrastructure. In meta-structures or meta-organizations it can thus happen that state actors interact with private actors with often differing scopes. The decision whether infrastructure should be a state or communal task is also subject to change over time. Over the last two decades a phenomenon could be observed where, after the initial privatization of communal infrastructure, e.g. hospitals, water supply companies or tram lines, led to a series of unexpected consequences, the privatization was reversed and the infrastructure bought back into public ownership by the municipality.
Today’s Internet emerged from a mixture of university and military structures. After state-subsidized organizations, financed through taxpayer money, provided concentrated initial funding, private actors were able to take over the basic technology and create a market by fostering the technological innovation towards mass consumer products (see National Research Council 1999). Besides the Internet, the Global Positioning System, communication satellite systems in general, and the personal computer show this pattern of a succession from governmental to private ownership. Along with that we can observe mixed private-public infrastructure organization structures, as in the Global Positioning System. This system of 24 satellites is managed and maintained by the US military. This includes ground infrastructure, transceiver and receiver stations, a control room which, for instance, adjusts the satellites’ orbits, and resources for rocket launches, since the satellites have to be renewed every 15 to 20 years. This is achieved partly through NASA and partly through private businesses such as Ariane Corporation. The satellites’ construction is in the hands of the private defense industry. The end user is served by private providers of navigation systems and geodata services. Geodata, in turn, is produced and provided partly by private actors and partly by public organizations, such as the land registry offices. At many points of these infrastructures, public and private databases play an integral role in data handling and processing (Hunger 2012).
The establishment of state-run infrastructures ultimately depends on tax income and the ability to incur debts for large-scale projects. In that sense it is politicians who have the task of establishing justification contexts for public infrastructure investment. Once the state-run establishment of an infrastructure has succeeded, it can be observed that it stays in public ownership for at least a 30-year period. Private companies often offer services and products that extend and diversify this infrastructure towards a mass market; they finance their actions through the stock market and have to be able to present a direct gain (in the sense of the intention to make profit) to their creditors. Both forms of ownership appear to complement and depend on each other.
Today databases sit at the heart of nearly every software application. While hardware advanced the development of electronic computers during the early phase (1940s–1970s), today software plays an important part in innovation. The universal computer – in the hardware sense – was strongly pushed by John von Neumann and his colleagues from 1945 on and has since become the predominant model for electronic computers. This model describes a computer consisting of the following units: processing unit, control unit, memory (for software and for data), and input and output units. The goal of this configuration was to make software independent of data (Ceruzzi 1998:21f). With regard to databases we can observe a similar movement: the current paradigm, the relational database, developed and promoted by Edgar F. Codd, aimed to make the data independent of the underlying hardware structure (Codd 1970). My preliminary assumption is that Codd’s relational model is the complementary part to von Neumann’s idea of universality.
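Codd’s idea of data independence can be sketched in a few lines, for instance with Python’s built-in sqlite3 module; the table and its contents are hypothetical illustrations, not taken from any of the systems discussed here:

```python
import sqlite3

# A minimal sketch of Codd's data independence: queries address relations
# (tables) by name and attribute, never the physical storage layout.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE satellite (id INTEGER PRIMARY KEY, name TEXT, launched INTEGER)"
)
conn.executemany(
    "INSERT INTO satellite (name, launched) VALUES (?, ?)",
    [("GPS IIR-1", 1997), ("GPS IIF-1", 2010)],
)

# The query states *what* data is wanted, not *how* it is stored or fetched;
# indexes, pages and file layout remain the engine's internal business.
rows = conn.execute(
    "SELECT name FROM satellite WHERE launched >= 2000"
).fetchall()
print(rows)  # [('GPS IIF-1',)]
```

That the same query would run unchanged if the engine reorganized its files or indexes is precisely the independence from the underlying hardware structure mentioned above.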
To conclude (with Star/Ruhleder 1996:116), infrastructure appears in multiple dimensions:
- transparency/invisibility
- reach or scope (beyond a single event or place)
- learned through membership
- conventions derived from practice
- embodiment of standards
- based on existing infrastructure
- visibility upon breakdown
To which I would like to add the dimensions:
- public/private ownership
- universality of concept
Databases only emerge as infrastructure through their embeddedness in already existing infrastructures (computer networks, software environments). Databases as infrastructure are subject to a complex set of relations, which usually remains invisible.
Ghost Hunters: Strategies for creating visibility
It has already become obvious how deeply the vocabulary we use is shaped by the invisibility of infrastructure: infrastructure »exists in the background«, is »literally sunk«, or »transparent«. We can already assume that the issue of in/visibility has a political dimension as well. Database infrastructure resembles an invisible ghost, or a ghost in a shell, hidden in a cover of chips and circuit boards.
The visualization of infrastructure during its breakdown is well known from popular culture, in many films and TV series: New York is flooded and frozen in The Day After Tomorrow (2004), a large part of its infrastructure destroyed, while only the New York Public Library remains as a survivors’ hideaway. In Falling Skies (2011), aliens nearly destroy humankind, which abandons the central urban areas and rebuilds a new infrastructure advancing from the periphery. The science-fiction series Revolution (2012) depicts an archaic world following the global breakdown of electrical power networks.
The breakdown of infrastructure, the fears connected to this breakdown and the dependence on infrastructures in general appear in these fantasies in the form of an imaginary, subconscious fear, a phantasm in the Lacanian sense. Similar fantasies play a role in the hacker movie genre, where a dysfunctional computer system completely paralyses a broad infrastructure, as for instance in Wargames (1983) and Sneakers (1992). The computer game Watch Dogs (2014) may be the most obvious example: the game’s hero, Aiden Pearce, uses a mobile phone with an extra function that allows him to interact with infrastructure via a centralized computer system, so he can switch traffic lights on and off, manipulate surveillance cameras in public spaces, or close gates and bars at will.
These pop-cultural works celebrate catastrophic scenarios and manage to address subconscious fears of a breakdown of civilization. They visualize our dependence on infrastructure, yet mostly implicitly, that is, without explicitly mentioning the term »infrastructure«. The name of the genre is »disaster film«, not »infrastructure film«. However, if we aim to make infrastructure visible, it is Bowker et al. who propose investigating infrastructure breakdowns as a means of doing so.
Another method is the use of infographics, which aim to reduce complex problems to clearly structured graphics – but in the case of infrastructure this often means that designers tend to address the easily comprehensible aspects of infrastructure, that is, »bricks and mortar« or users, while dealing insufficiently with temporal or legal aspects.
Shannon Mattern, professor of media studies, proposes practices beyond mapping and visualization through photography and video: »We should consider, if there are perhaps other modes of ›accounting for‹ infrastructural units, their couplings and chasms, and the ›relations that take shape through and around them‹ that don’t necessarily translate infrastructures into visual – graph – form. Could we possibly play an infrastructure? Listen to it? And of course visit it?« (Mattern 2012). In her essay Infrastructural Tourism (2012), Mattern discusses several media art projects that employed performative strategies and enabled infrastructure users to develop a basic »infrastructural literacy«. Yet she comments critically on these projects: »There seems to be an implicit idea in many situationist-style interventions that participants learn to look at their environment in new ways. But ways this ›knowledge‹ becomes expressed after the tour, dérive, or whatever, remain vague. … The ›after‹ seems largely unexplored« (ibid.). She argues that further steps would be needed to sustainably anchor the participants’ experiences, and proposes, for example, creating a topical research library that supports deeper insight into structures and political decisions. Mattern locates further potential for an actual application of knowledge generated in the field of infrastructure studies in »introducing more critically aware design practices, or reversing protocols and regulations« (ibid.).
Another method of visualization was described as infrastructural inversion by Geoffrey Bowker in 1994. Lisa Parks, professor of film and media studies, employed this method for her studies of satellite infrastructure. Satellites in orbit remain invisible to immediate sight, a problem that shows similarity to the invisibility of databases. During the workshop Satellites/Borders/Footprints with Lisa Parks, which I organized at the Hartware MedienKunstVerein Dortmund in 2010, I learned this method of making »invisible« infrastructure visible. While it is difficult to observe »distant« parts of the infrastructure, such as command centers and their workers, the rocket industry, ground stations, or regulatory bodies, there is still one obvious part that can be immediately viewed with the naked eye: the satellite dish. Yet Lisa Parks notes that any approach to infrastructure, in this case the satellite footprint – the strip on earth where a certain satellite is receivable – necessarily has to stay fragmentary. »Rather than assume the footprint could be described in its totality, footprint analysis engages only with portions of it in order to provide a sense of the complexity and impossibility of an entire picture. In other words, rather than set out to describe and document all parts of the system that make a footprint possible, the analysis focuses upon a selection of localized sites or issues as suggestive parts of a broader system that is imperceptible in its entirety« (Parks 2009).
I think the method of infrastructural inversion, which Lisa Parks practically developed in the context of anthropological field research, referring to Bowker (1994) and Fiske (1996), can be fruitfully employed for databases as well. We can paraphrase Lisa Parks’s considerations directly with the change of just one word: rather than assume the database infrastructure could be described in its totality, database analysis engages only with portions of it in order to provide a sense of the complexity and impossibility of an entire picture.
To summarize, the following methods for fostering database visibility can be employed:
- Visibility during breakdown (in fictional, documentary or journalistic approaches)
- Performative, artistic strategies
- Infrastructural inversion/field studies
When trying to apply the method of infrastructural inversion to database infrastructure, we would first identify those parts of database systems with lower visibility:
- Geographically dispersed data centers, which house database servers and database software
- Organizational structures and the individuals who program and distribute database software
- The structures and raw data of databases, which tend to be hidden from public perception through the limitation of access rights and through layers of software »protecting« the database
In a second step we can turn towards aspects of database infrastructure with higher visibility. Visually, they appear in three different modes: 1) data input, 2) query mode, 3) result display. While data input and query mode often use form fields where users can enter data, the result display uses list views of varying complexity to present data.
To finally deduce the existence of databases from the perspective of visual output, we can look out for the following functions and visual signs:
- Form view for data input, search queries and updating data
- List view or presence of structured data
- Usage of abstract organizing principles (e.g. ID numbers)
- Login (allows for gradual access levels and individualization)
- Change logs that record changes in data (e.g. Wikipedia records different article versions and the editors’ user names)
- External data access through a standardized interface, the API
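As a playful illustration of how such visual signs might support the partial reverse engineering discussed below, the following sketch scans a page’s HTML for some of the listed indicators. The patterns and the sample page are rough assumptions for illustration, not a reliable detection method:

```python
import re

# Heuristic patterns for the surface signs listed above (form fields,
# ID-like parameters, login prompts, API hints). Purely illustrative.
SIGNS = {
    "form view": re.compile(r"<(form|input|select|textarea)\b", re.I),
    "list view": re.compile(r"<(table|ul|ol)\b", re.I),
    "ID numbers": re.compile(r"[?&](id|item_id|user_id)=\d+", re.I),
    "login": re.compile(r"\b(login|log in|sign in|password)\b", re.I),
    "API": re.compile(r"/api/|\bjson\b", re.I),
}

def database_signs(html: str) -> list[str]:
    """Return which of the visual/structural signs appear in the HTML."""
    return [name for name, pattern in SIGNS.items() if pattern.search(html)]

# A hypothetical page fragment: a search form plus an ID-based item link.
page = '<form action="/search"><input name="q"></form><a href="/item?id=42">Item 42</a>'
print(database_signs(page))  # ['form view', 'ID numbers']
```

The more of these signs accumulate on a page, the stronger the suspicion that a database is operating in the background.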
These visuals, in the form of user interfaces, generate higher visibility, and they also allow for a partial reverse engineering of the infrastructure that is hidden in the background (based on the idea that the user should not need to deal with the infrastructure’s details).
The discussion presented here is mostly theoretical so far, but I hope that it may be inspirational for some, and I look forward to further developing and adapting it in practice.
Bowker, Geoffrey C. et al. “Toward information infrastructure studies: Ways of knowing in a networked environment.” International handbook of internet research. Ed. Jeremy Hunsinger, Lisbeth Klastrup, and Matthew Allen. Dordrecht: Springer Netherlands, 2010. p 97–117.
Bowker, Geoffrey C. Science on the run: Information management and industrial geophysics at Schlumberger, 1920–1940. Cambridge, Mass.: MIT Press, 1994.
Ceruzzi, Paul E. A history of modern computing. Cambridge, Mass.: MIT Press, 2003.
CODASYL Data Base Task Group. “A survey of generalized data base management systems.” New York, 1969a.
CODASYL Data Base Task Group. “Data base task group report to the CODASYL programming language committee.” New York: Association for Computing Machinery, 1969b.
Codd, Edgar F. “A relational model of data for large shared data banks.” Communications of the ACM 13.6, 1970. p 377–387.
Fry, James P., and Edgar H. Sibley. “Evolution of data-base management systems.” ACM Computing Surveys 8.1, 1976. p 7–42.
Graham, Steve, and Simon Marvin. Splintering urbanism: Networked infrastructures, technological mobilities and the urban condition. London: Routledge, 2001.
Haigh, Thomas, and Charles W. Bachman. “Charles W. Bachman interview: September 25-26, 2004; Tucson, Arizona.” ACM Oral History Interviews. ACM, 2006. 2.
Hunger, Francis. “The Global Positioning System – emergence and early development.” 2012. Unpublished manuscript.
Hunger, Francis. “The University of Leipzig library database infrastructure. Interview with Leander Seige.” 2014. Unpublished manuscript.
Kurz, Constanze. “Bei Softwarefehlern droht die Hungersnot.” Frankfurter Allgemeine Zeitung 18 Oct. 2013. p 37.
Mattern, Shannon Christine. “Infrastructural tourism.” Words in space. 20 July 2012. Web. Last access 18 Nov. 2012. http://www.wordsinspace.net/wordpress/2012/07/20/infrastructural-tourism
McGee, W. C. “Data base technology.” IBM Journal of Research and Development 25.5, 1981. p 505–519.
National Research Council. Funding a revolution – government support for computing research. Washington DC: The National Academies Press, 1999.
Parks, Lisa. “Signals and oil: Satellite footprints and post-communist territories in Central Asia.” European Journal of Cultural Studies 12.2, 2009. p 137–156.
Parks, Lisa. “Spotting the satellite dish. Populist approaches to infrastructure.” Satellite/Border/Footprint. Ed. HMKV. Dortmund, 2010.
Star, Susan Leigh, and Karen Ruhleder. “Steps toward an ecology of infrastructure: design and access for large information spaces.” Information systems research 7.1, 1996. p 111–134.
 »Collective« here addresses the form of usage, not of ownership.
Only a few highly qualified workers receive better payment. In this context the meaning of »highly qualified« is limited to the qualifications demanded by the labor market. It is up to further research to look into the relation between maintenance work and female reproductive work, which both share the aspects of low payment and low visibility. These might be positioned against the higher public visibility of the new creation as such (that is, the process of establishing infrastructure), which can be associated with the pairing of the male subject and the public sphere.
For databases these struggles can be observed, for example, in the discussions over the hierarchical model, the network model and the relational model within and around the CODASYL committee.
 Cf. primary sources: CODASYL 1969a, CODASYL 1969b, Codd 1970; secondary sources: Fry/Sibley 1976, McGee 1981, Haigh 2004.
Note: to be »naturalized« also refers to the process of becoming a US citizen (through migration).
In these cases it is possible to observe the ramifications of neoliberal ideas in relation to infrastructure. Although it is beyond the scope of this article to discuss this in closer detail, the whole discussion of public versus private ownership of life-supporting infrastructure shows that infrastructure is directly subject to political processes.
 See further the research by Lisa Parks (2009, 2010). In future I hope to be able to show a similar model in regards to databases.
This pattern seems to have changed recently with the emergence of companies like Google. It would need further investigation to clarify this. At least for the product Google Earth it can be shown that the precursor company Keyhole Inc., which was founded in 2001 and acquired by Google in 2004, received part of its funding through state resources: the CIA’s venture capital arm In-Q-Tel channeled taxpayers’ money from the National Geospatial-Intelligence Agency into Keyhole Inc. (http://en.wikipedia.org/wiki/Keyhole,_Inc)
 I remain intentionally vague, since further research is needed. Also this argument necessarily ignores more recent technological developments in non-relational databases.
Ghost in the Shell is a manga by Masamune Shirow, where the ›ghost‹ refers to the human mind and the ›shell‹ to the cyborg body which surrounds the ghost.
 E.g. the performances Electrical Walks by Christina Kubisch.
 Bowker 1994, here cited after Parks 2009
Like Lisa Parks with the satellite dish, I concentrate on the visual appearance of database infrastructure and omit the many occasions where databases are involved but do not become immediately visible to the end user, e.g. in logistics, the production of goods, services, etc.
 Cf. Bucher 2013.