GIS, Virtualization, and Environmental Uncertainty – Part I

Changes in international business over the past few decades have brought greater internationalization and integration. The term globalization captures these changes, most visibly in the increased cross-border movement of goods, services, capital, technology, and people. Based on the dimensions of global integration and local responsiveness, four forms of organization are used to manage international business: global, international, multidomestic, and transnational.


This post focuses on Global Corporations (GCs). GCs prefer to market a standardized product worldwide for economic reasons while concentrating production, marketing, and research and development activities in a few favorable locations. The issues around expanding a business to a global level relate to the external environment of the organization as well as to its internal environment.

The management of external environmental uncertainty is critical to the success of global corporations. The major sources of uncertainty in the external environment are the number of different forces that firms have to manage, the degree to which the external environment is changing, the resources available in the environment, and the business continuity management of Global Information Systems (GIS).

GIS drive the information society and enable knowledge workers to connect and communicate in ways that drastically change their work. Four main factors generally influence the decisions each organization makes in designing and pursuing its GIS: (a) interoperability, (b) total cost of ownership, (c) security, and (d) transparency and the public right to information.



Recent earthquakes in Haiti and Chile, the violent European windstorm Xynthia, and hurricane Katrina in New Orleans (U.S.) are reminders to business and IT managers that preparedness to protect critical information systems and data against natural and man-made disasters, swift response, and quick recovery are necessary to assure business continuity.

Business continuity planning is about having plans and procedures in place to recover key business processes after a disaster. Participants in a recent business continuity management survey perceived failure of computer hardware or software and data loss as the highest risks of business disruption, with 21% of executives stating that natural disasters such as storms, floods, and earthquakes were of particular concern. Disasters are not restricted to fire, flood, and other causes of property damage; they can equally result from more mundane problems such as labor strikes and hardware or software malfunctions.


Business impact analysis, backup and restoration, redundancy, offsite data centers, and virtualization are various approaches to ensure business continuity.

As companies exploit the growing possibilities of international business, technology leaders must build consensus for an organizational structure that enables the expansion of information systems. The purpose of the posts in this thread is to evaluate that expansion from the perspective of a Chief Information Officer and to discuss the issues within the expansion scenario as they relate to environmental uncertainty, business continuity, and virtualization.

Global Corporations and Global Information Systems


An Information Systems (IS) organization refers to the combination of technologies, processes, people, and promotion mechanisms that improve the performance and effectiveness of the organization. IS affects nearly all aspects of human endeavor and assists in the management and operations of various types of organizations.


Since the 1960s, managing and operating IS to improve organizational performance and effectiveness has been a field of practice. First known as business data processing and later as management information systems, the field is currently referred to as information technology (IT). Ongoing innovations in IS and growing worldwide competition add difficulties and uncertainties to corporate environments. Global information systems attract attention from both practitioners and scholars because they are a critical enabler of competitive advantage for international businesses.

The operational priorities of GCs require innovative capabilities and create new requirements for the IS function of GCs. Prior research categorized the requirements of GCs into four areas: (a) decreasing the cost structure, (b) increasing innovation, (c) leveraging information assets, and (d) becoming more agile.

Information systems are fundamental to effective global operations because they enable coalitions and provide a coordination mechanism for geographically dispersed activities. Information systems are disruptive phenomena for global corporations because of their capacity to change the competitive landscape and to enable new organizational structures, products, processes, and ways of working.


The nature and function of GIS should concur with the operational shifts of GCs highlighted above. The strategic use of global information systems (GIS) depends on the ability of corporate managers to appreciate the business value of IT and use it as a competitive tool. GIS organizations must provide resources to lead and support IT-enabled business transformation initiatives by simplifying global operations, automating the streamlined processes, and relocating some business processes to lower-cost locations. The increased focus on innovation in the business, for example, requires GIS organizations to increase the productivity and effectiveness of their research and development capabilities. The focus on agility and innovation creates new demands on GIS organizations to provide rapid solutions and information management frameworks essential to intelligent and informed business decision making.


The idea of virtualization is to partition a physical computer into several logical zones. Each of these partitions can run a different operating system and function as if it were a completely separate machine. Virtual machine technology, or virtualization, refers to a framework or methodology for dividing the resources of a computer into multiple execution environments by applying one or more concepts or technologies such as hardware and software partitioning, time-sharing, partial or complete machine simulation, emulation, quality of service, and many others.

The idea behind virtualization is an extension of what is found in a modern operating system (OS). A program running on a UNIX machine, for example, has its own virtual address space. From the program’s perspective, it has a large chunk of RAM (4 GB on a 32-bit machine) to use, and the operating system is responsible for multiplexing this memory among programs. The large, contiguous space does not exist in the real machine: some of it is scattered around real memory while the rest might be stored on a hard disk.
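The mapping the OS maintains between a program’s contiguous virtual pages and the scattered physical frames (or disk) can be sketched as follows. This is a toy illustration, not real OS code; every class name, frame number, and address in it is invented:

```python
PAGE_SIZE = 4096  # bytes per page, as on a typical 32-bit system

class ToyPageTable:
    """Toy model: contiguous virtual pages backed by scattered frames."""

    def __init__(self, num_pages, free_frames):
        self.table = {}
        for page in range(num_pages):
            if free_frames:
                # Backed by a real, non-contiguous physical frame.
                self.table[page] = ("ram", free_frames.pop())
            else:
                # No frame left: the page lives "on a hard disk".
                self.table[page] = ("disk", page)

    def translate(self, virtual_addr):
        """Translate a virtual address to a physical one, or fault."""
        page, offset = divmod(virtual_addr, PAGE_SIZE)
        location, frame = self.table[page]
        if location == "disk":
            raise LookupError(f"page fault: page {page} is swapped out")
        return frame * PAGE_SIZE + offset

# Four contiguous virtual pages, only three scattered frames available.
pt = ToyPageTable(num_pages=4, free_frames=[7, 2, 9])
print(pt.translate(0))      # lands somewhere inside frame 9
print(pt.translate(4101))   # virtual page 1, offset 5, inside frame 2
```

From the program’s point of view, addresses 0 and 4101 sit in one contiguous space; only the toy page table knows they resolve to unrelated physical frames, and that page 3 would fault to disk.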

Memory is not the only resource virtualized by a modern OS. The CPU is usually allocated to different processes using some form of pre-emption: when a process has used its fair share of the CPU, it is interrupted and another is allowed to take its place. From the process’s perspective, it has a CPU of its own (or more than one, as with dual-core or quad-core machines).
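This pre-emptive sharing can be sketched with a toy round-robin scheduler; the process names, burst times, and quantum below are invented for illustration:

```python
from collections import deque

def round_robin(burst_times, quantum):
    """Return the order of (pid, slice) pairs in which the CPU is granted."""
    ready = deque(burst_times.items())
    timeline = []
    while ready:
        pid, remaining = ready.popleft()
        run = min(quantum, remaining)             # run until done or pre-empted
        timeline.append((pid, run))
        if remaining > run:
            ready.append((pid, remaining - run))  # pre-empted: back of the queue
    return timeline

# Three processes needing 5, 2, and 4 time units, with a quantum of 2.
schedule = round_robin({"A": 5, "B": 2, "C": 4}, quantum=2)
print(schedule)
```

Each process is interrupted after at most one quantum and requeued, yet each one, seen in isolation, simply runs to completion, which is exactly the illusion of "a CPU of its own".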

Virtualization is not a new technology. In the 1960s, IBM developed a handful of virtual machine systems including the CP-40, CP-67, and VM/370. In all of these systems, a virtual machine monitor (VMM) ran between the application and hardware layers. Through this VMM, multiple virtual operating systems could be created, used, and shut down without interfering with other virtual machines on the same VMM. This research placed IBM at the forefront of the virtualization race and is acknowledged, along with the research assistance from MIT, as the foundation of modern virtualization.



Virtual machines are implemented in various forms: mainframe, open source, paravirtualization, and custom approaches designed over the years. Complexity in chip technology and approaches to solving the x86 limitations of virtualization have led to three variants of virtual machines: (a) software virtual machines, (b) hardware virtual machines, and (c) virtual OS/containers.



Software virtual machines manage interactions between the host operating system and a guest operating system (Microsoft Virtual Server 2005). In the case of hardware virtual machines, the virtualization technology sits directly on the host hardware (bare metal) and uses hypervisors, modified code, or APIs to facilitate faster transactions with hardware devices (VMware ESX). EMC’s VMware technology is the market leader in x86 virtualization. The VMware solution is more costly, but it provides a robust management console and full-virtualization support for an array of guest operating systems including Solaris, Linux, Windows, and DOS. In the case of virtual OS/containers, the host operating system is partitioned into containers or zones (Solaris Zones, BSD Jail).
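The three variants and the example products named above can be collected into a small lookup structure, for instance for an inventory or capacity-planning script; the layout and wording of the entries are illustrative only:

```python
# Summary of the three virtualization variants discussed in the text.
VARIANTS = {
    "software virtual machine": {
        "layer": "runs on top of a host operating system",
        "example": "Microsoft Virtual Server 2005",
    },
    "hardware virtual machine": {
        "layer": "hypervisor directly on bare-metal host hardware",
        "example": "VMware ESX",
    },
    "virtual OS/containers": {
        "layer": "host OS partitioned into containers or zones",
        "example": "Solaris Zones, BSD Jail",
    },
}

for name, props in VARIANTS.items():
    print(f"{name}: {props['layer']} (e.g. {props['example']})")
```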

There are several vendors in the virtualization market, and each product comes with features that make it suitable for different scenarios. Some virtualization technologies are (a) Microsoft Virtual Server or Hyper-V, and (b) EMC’s VMware suite (VMware Workstation, VMware Server, VMware ESX, and vSphere). Whereas the VMware suite is adaptable to most operating systems including Novell and UNIX, Microsoft Virtual Server is largely limited to Microsoft operating systems.


The large number of centralized services and the processing power concentrated in data centers at GC headquarters make virtualization especially attractive. Virtualization reduces the number of servers, the costs of maintenance and server management, and the costs of power consumption and cooling.
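The server-count and power savings behind that claim come down to simple arithmetic. The sketch below estimates both; all utilization and wattage figures are invented inputs, not benchmarks:

```python
import math

def consolidation_estimate(n_servers, avg_utilization,
                           target_utilization, watts_per_server):
    """Estimate hosts needed after consolidation and the power saved."""
    # Total demand, expressed in fully utilized server equivalents.
    demand = n_servers * avg_utilization
    # Physical hosts needed if each runs safely at the target utilization.
    hosts = math.ceil(demand / target_utilization)
    power_saved_w = (n_servers - hosts) * watts_per_server
    return hosts, power_saved_w

# 40 servers idling at 10% load, consolidated onto hosts run at 60% load.
hosts, saved = consolidation_estimate(40, 0.10, 0.60, watts_per_server=400)
print(hosts, saved)   # 7 hosts instead of 40, saving 13200 W
```

Even this back-of-the-envelope model shows why lightly loaded data centers are prime consolidation candidates: the fewer the hosts, the smaller the maintenance, power, and cooling bills.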


Business continuity and disaster recovery planning is the other main reason why GCs are virtualizing their services. Business continuity planning is the elaboration of plans and procedures to recover key business processes following a disaster. The plans and procedures for a business continuity planning process encompass (a) business impact analysis, (b) a backup and restoration strategy, (c) redundancy, (d) offsite data centers, and (e) virtualization.
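The first of these components, business impact analysis, can be reduced at its core to a ranking exercise: which processes must come back first after a disaster? The sketch below orders processes by hourly impact and tolerated outage; every process name and figure is invented for illustration:

```python
# Hypothetical processes with revenue at risk per hour of downtime and
# the maximum outage each can tolerate before serious harm.
processes = [
    {"name": "order entry", "revenue_per_hour": 12000, "max_outage_h": 2},
    {"name": "payroll",     "revenue_per_hour": 1000,  "max_outage_h": 48},
    {"name": "email",       "revenue_per_hour": 3000,  "max_outage_h": 8},
]

# Recover high-impact, low-tolerance processes first.
recovery_order = sorted(
    processes,
    key=lambda p: (-p["revenue_per_hour"], p["max_outage_h"]))

for rank, p in enumerate(recovery_order, start=1):
    print(rank, p["name"])
```

The resulting order then drives the rest of the plan: the backup strategy, redundancy, and offsite or virtualized capacity are sized for the top of the list first.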


GIS and Management of Environmental Uncertainty


Environmental uncertainty is a central issue for the deployment of global information systems. Uncertainty refers to events the organization cannot forecast. The major sources of uncertainty in the environment are usually (a) complexity, or the number of different forces an organization has to manage; (b) dynamism, or the degree to which the environment is changing; and (c) richness, or the amount of resources available in the environment. Accurate perception of the uncertainty emanating from the environment is critical to organizational performance, organizational structure, firm strategy, and business continuity and disaster recovery planning.


Natural disasters can produce both horrifying and stunning tales of human tragedy and triumph. But after the initial dust has settled, an aftershock materializes as businesses struggle to resume their operations. The Gartner Group noted that 43% of companies hit by a major loss of computer records were immediately put out of business, and another 51% permanently closed their doors within two years, leaving a mere 6% survival rate.

Information systems are fundamental to effective global operations because they enable coalitions and provide a coordination mechanism for geographically dispersed activities. From the business continuity and disaster recovery perspectives, the strategic use of GIS depends on proactive business continuity planning by IT executives. Business Continuity Management (BCM) programs ensure that organizations adopt best practices through industry certification standards such as the British standard BS 25999-2:2007. This standard specifies requirements for establishing, implementing, operating, monitoring, reviewing, exercising, maintaining, and improving a documented BCM system within the context of managing an organization’s overall business risks.



Server consolidation, storage, remote access, security, and green initiatives are among the various challenges companies face when expanding IS to a global level. Organizations primarily deploy virtualization to improve server and system utilization rates, increase server reliability and uptime, and enhance business continuity. I believe that successful virtualization of GIS depends on the approaches adopted and on the ability to measure the performance of the virtualized environment.

The purpose of the next post (Part II) will be to explore approaches to virtualization of GIS and identify performance measurement indicators for virtualized global information systems.



Foucault and the Critique of Modernity

This post is not a biographical work on Michel Foucault, but a quick sketch of his life and the environment in which he was educated helps to better understand the philosophical aspects of his works (his academic formation being in psychology and history): (a) research and analysis of philosophy’s traditional critical project in a new historical manner, and (b) critical engagement with the thought of traditional philosophers.

Michel Foucault (1926-1984) is one of the French figures usually associated with the radical postmodern philosophies. Despite his bourgeois origins, he sympathized early with vulnerable groups such as artists, homosexuals, and prisoners. Like many young thinkers of his generation, Foucault was largely influenced by (a) the French tradition of history and philosophy of science represented by Georges Canguilhem, (b) French avant-garde literature, particularly the writings of Georges Bataille and Maurice Blanchot, and (c) the philosophical milieu and its methods of writing history based on archaeology and genealogy techniques.

The purpose of this post is to explore and reflect upon Foucault’s critique of modernity. First, an analysis of his historiographical approaches (archaeology and genealogy) is provided. Second, Foucault’s postmodern perspectives on the nature of modern power and his argument that the modern subject is a construct of domination are explained. Third, the political implications of Foucault’s genealogical method are analyzed, as well as his work on technologies of the self. Finally, by taking the examples of institutions and technologies, this post provides some indications of the conservative aspects of Foucault’s work.

Postmodernism and the Critique of Modernity


Modernism was the cultural revolution of the 20th century, as important as Romanticism was for the 19th century and the Enlightenment for the 18th. The word modern has its origins in the early medieval modernus, meaning that which is present, of our time, and by extension new, novel, or contemporary. From about 1900 to the 1960s, modernism reigned as a succession of varied movements and styles that reacted against historicism and recognized individual perception and experience as the cornerstone of the creative act.


Postmodernism arrived in the mid-1960s and reached its apex in the early 1980s. Postmodernism is an intellectual current that rejects the Enlightenment project of modernity. This involves, among other things, a radical critique (and often an uncritical rejection) of objectivity, of the a priori subject as a source of meaning, authenticity, and authority, of the importance of truth and abstract reason, of the teleological approach to history, of the universalizing grand narratives that aspire to completeness, and of the distinction between high and low culture. For postmodernists, science is nothing more than a narration, a myth, or a social construction.

Analysis of Foucault’s Historiographical Approaches


Archaeology and genealogy are the two approaches Foucault applied in his critique of historical reason. To understand these historiographical models, one should trace the evolution of philosophy from its beginnings with Socrates (and his project of questioning knowledge) to Kant (for whom philosophy is the critique of knowledge), through Descartes (rationalism), Locke (empiricism), and Hume (induction). For instance, Hume thought that expectations are built up from recurrent experiences, assuming that the world in the future will resemble the past, without this amounting to knowledge of anything. For Kant, reality is the sum of what can be experienced; he added that the mind has a set of rules for how experience must be constructed, and concluded that these rules must always apply to experience.

Foucault rejected this prescriptive definition of knowledge, which establishes a set of conditions that, if met, would equate knowledge with truth, making it certain and definitive. Instead, he created a set of rules that can account for how men, at a specific time and place and in particular domains, produce knowledge and separate this knowledge from error, opinion, and belief. Foucault did not merely accept the scandal of existing knowledge (men at different times and places have known differently); he made this scandal the focal point of his analysis, seeking to identify, using archaeology, the historical conditions of possibility of knowledge.


The history of knowledge, he argued, can be written only on the basis of what was contemporaneous with it, and certainly not in terms of reciprocal influence, but in terms of conditions and a prioris established in time. Drawing on Nietzsche’s genealogy, he described his conception of history as genealogy, delegitimizing the present and demonstrating the foreignness of the past. Foucault rejected any form of global theorizing, avoided totalizing forms of analysis, and was critical of systematicity. He showed that ideas usually taken to be permanent truths about human nature and society change in the course of history.

Knowledge, Power, and Foucault’s Genealogies



Foucault’s theory of power is opposed to classical approaches based on a juridico-political conception of power (Hobbes, Machiavelli) or on class opposition and domination (Marx). Foucault’s works explored the shifting patterns of power within a society and the ways in which power relates to the self. This led to different appearances of power such as disciplinary power, bio-power, governmental power, and repressive power. Discipline and Punish, following Madness and Civilization and The Birth of the Clinic, was the next stage in Foucault’s massive project of tracing the genealogy of control institutions (asylums, teaching hospitals, and prisons) and of the human sciences symbiotically linked with them (psychiatry, clinical medicine, criminology, and so on).


The main concern of Foucault throughout his publications was the relationship between knowledge and power and the articulation of each on the other. Nietzsche thought that a will to power motivates human behavior and that traditional values had lost their power over society. For Foucault, following Nietzsche, knowledge ceases to be a liberation and becomes a mode of surveillance, regulation, and discipline. Foucault opposed the humanist position that once we gain power, we cease to know (because it makes us blind) and that only those who are in no way implicated in tyranny can attain the truth. For Foucault, forms of knowledge such as psychiatry and criminology are directly related to the exercise of power. He added that power itself creates new objects of knowledge and accumulates new bodies of information.


Technologies of the Self and Political Implications of Foucault’s Genealogical Method


The third major shift in Foucault’s work is the focus on technologies of the self, ethics, and freedom in the 1980s. Technologies of the self are practices by which subjects constitute themselves within and through systems of power, systems that often seem to be either natural or imposed from above. Foucault theorized that the body is a subject of technologies of power. These technologies are established through discourses of “expertise” such as medicine, law, and science. Through these discourses, or truth games, individuals develop knowledge about themselves, while bodies become the site of domination through technologies of power, practices of discipline, and normalization.


Foucault’s work on modern power and government inspired other works (for example, on the neoliberalism of the New Right) that explore politics and political institutions. Like Foucault’s genealogies, most of these works embody hostility to the humanist notions of the subject and truth. This hostility sets up various themes which can be seen as constitutive of a Foucauldian approach to the study of political institutions. These themes can be found in Foucault’s work on power and government; they can be divided into those arising from a critique of traditional structuralism, a critique of the subject, and a rejection of truth.


Foucault’s genealogies provided examples for a political science that would take seriously the anti-foundationalist view that we have neither pure experiences nor pure reason. Such a view overlaps considerably with Foucault’s concern to decentralize structures, analyze the ways in which individuals are constructed by their social context, and renounce appeals to the natural or the immanent.


Comments on Foucauldian Perspectives

Having outlined some of Foucault’s arguments against technologies and institutions, the first criticism of his work is that he refused to see any advantage in modernity in domains such as medicine. Unlike Habermas, who thought that science is unproblematic when it operates according to the rules of right, Foucault focused on repressive forms of rationalization and never delineated the progressive aspects of modernity. For him, all aspects of modernity are disciplinary, which is quite difficult to accept. Foucault’s analysis did not focus so much on the question of right but rather on the mechanisms through which power effects are produced. Instead of fixing the legitimacy of science or asking what the proper domain of certain knowledge is, Foucault examined the role of certain knowledge in the production of effects of power.

The second criticism of Foucault’s work is that he disregarded the fact that domination has its basis in the relations of production and exploitation and in the organization of the state. In line with Poulantzas’s criticisms, one can note that Foucault neglected to study the modern form of the state and its derivation from the capitalist mode of production. He did not see that all social phenomena always occur in relation to the state and class division. He exaggerated the importance of disciplinary techniques in the modern state and thus neglected the continued importance of violence, legal coercion, and law in general. Unlike Poulantzas, who saw some virtues in law (reproducing consent) and in the state (constituting social relations and winning mass support), Foucault emphasized only the repressive, prohibitive side of law and located the positive, productive side solely in (state) power.


Foucault’s work can be summarized in three major shifts: from the archaeological focus on systems of knowledge in the 1960s, to the genealogical focus on modalities of power in the 1970s, to the focus on technologies of the self, ethics, and freedom in the 1980s. Foucault contributed to many fields in the humanities and social sciences. As a member of the postmodernist movement and in line with its deconstruction paradigm, he tried to show the problematic and suspicious aspects of rationality, knowledge, subjectivity, and the production of social norms. He thought that the quest for power invaded social and personal life and pervaded schools, hospitals, prisons, and the social sciences. Foucault saw a link between power, truth, and knowledge, and he argued that liberal-humanist values became entangled with, and supportive of, technologies of domination. He criticized both macro theorists, who see power only in terms of class or the state, and micro theorists, who analyze institutions and face-to-face interaction while ignoring power altogether.