How do you define operational excellence? What factors are involved in achieving operational excellence? Who (within an organization) is responsible for operational excellence, and why is this important?
Please answer in 300 words or one full page without extra spacing.
Please refer to the attached textbooks.
Information Systems for Business and Beyond
David T. Bourgeois, Ph.D.
Information Systems for Business and Beyond © 2014 David T. Bourgeois, is licensed under a Creative Commons Attribution (CC BY) license made possible by funding from The Saylor Foundation’s Open Textbook Challenge in order to be incorporated into Saylor.org’s collection of open courses available at http://www.saylor.org. Full license terms may be viewed at: http://creativecommons.org/licenses/by/3.0/legalcode
Contents

Introduction
Part 1: What Is an Information System?
  Chapter 1: What Is an Information System? – David T. Bourgeois
  Chapter 2: Hardware – David T. Bourgeois
  Chapter 3: Software – David T. Bourgeois
  Chapter 4: Data and Databases – David T. Bourgeois
  Chapter 5: Networking and Communication – David T. Bourgeois
  Chapter 6: Information Systems Security – David T. Bourgeois
Part 2: Information Systems for Strategic Advantage
  Chapter 7: Does IT Matter? – David T. Bourgeois
  Chapter 8: Business Processes – David T. Bourgeois
  Chapter 9: The People in Information Systems – David T. Bourgeois
  Chapter 10: Information Systems Development – David T. Bourgeois
Part 3: Information Systems Beyond the Organization
  Chapter 11: Globalization and the Digital Divide – David T. Bourgeois
  Chapter 12: The Ethical and Legal Implications of Information Systems – David T. Bourgeois
  Chapter 13: Future Trends in Information Systems – David T. Bourgeois
Answers to Study Questions
Bibliography
Introduction
Welcome to Information Systems for Business and Beyond. In this book, you will be introduced to the
concept of information systems, their use in business, and the larger impact they are having on our world.
Audience
This book is written as an introductory text, meant for those with little or no experience with computers
or information systems. While sometimes the descriptions can get a little bit technical, every effort has
been made to convey the information essential to understanding a topic while not getting bogged down in
detailed terminology or esoteric discussions.
Chapter Outline
The text is organized around thirteen chapters divided into three major parts, as follows:
• Part 1: What Is an Information System?
Chapter 1: What Is an Information System? – This chapter provides an overview of
information systems, including the history of how we got where we are today.
Chapter 2: Hardware – We discuss information systems hardware and how it works. You
will look at different computer parts and learn how they interact.
Chapter 3: Software – Without software, hardware is useless. In this chapter, we discuss
software and the role it plays in an organization.
Chapter 4: Data and Databases – This chapter explores how organizations use
information systems to turn data into information that can then be used for competitive
advantage. Special attention is paid to the role of databases.
Chapter 5: Networking and Communication – Today’s computers are expected to also be
communication devices. In this chapter we review the history of networking, how the
Internet works, and the use of networks in organizations today.
Chapter 6: Information Systems Security – We discuss the information security triad of
confidentiality, integrity, and availability. We will review different security technologies,
and the chapter concludes with a primer on personal information security.
• Part 2: Information Systems for Strategic Advantage
Chapter 7: Does IT Matter? – This chapter examines the impact that information systems
have on an organization. Can IT give a company a competitive advantage? We will
discuss seminal works by Brynjolfsson, Carr, and Porter as they relate to IT and
competitive advantage.
Chapter 8: Business Processes – Business processes are the essence of what a business
does, and information systems play an important role in making them work. This chapter
will discuss business process management, business process reengineering, and ERP
systems.
Chapter 9: The People in Information Systems – This chapter will provide an overview of
the different types of people involved in information systems. This includes people who
create information systems, those who operate and administer information systems, those
who manage information systems, and those who use information systems.
Chapter 10: Information Systems Development – How are information systems created?
This chapter will review the concept of programming, look at different methods of
software development, review website and mobile application development, discuss end-
user computing, and look at the “build vs. buy” decision that many companies face.
• Part 3: Information Systems Beyond the Organization
Chapter 11: Globalization and the Digital Divide – The rapid rise of the Internet has
made it easier than ever to do business worldwide. This chapter will look at the impact
that the Internet is having on the globalization of business and the issues that firms must
face because of it. It will also cover the concept of the digital divide and some of the steps
being taken to alleviate it.
Chapter 12: The Ethical and Legal Implications of Information Systems – The rapid
changes in information and communication technology in the past few decades have
brought a broad array of new capabilities and powers to governments, organizations, and
individuals alike. This chapter will discuss the effects that these new capabilities have had
and the legal and regulatory changes that have been put in place in response.
Chapter 13: Future Trends in Information Systems – This final chapter will present an
overview of some of the new technologies that are on the horizon. From wearable
technology to 3-D printing, this chapter will provide a look forward to what the next few
years will bring.
For the Student
Each chapter in this text begins with a list of the relevant learning objectives and ends with a chapter
summary. Following the summary is a list of study questions that highlight key topics in the chapter. In
order to get the best learning experience, you would be wise to begin by reading both the learning objectives
and the summary and then reviewing the questions at the end of the chapter.
For the Instructor
Learning objectives can be found at the beginning of each chapter. Of course, all chapters are recommended
for use in an introductory information systems course. However, for courses on a shorter calendar or
courses using additional textbooks, a review of the learning objectives will help determine which chapters
can be omitted.
At the end of each chapter, there is a set of study questions and exercises. The study questions can be assigned to help focus students’ reading on the learning objectives. The exercises are meant to be a more in-depth, experiential way for students to learn chapter topics. It is recommended that you review any exercise before assigning it, adding any details needed (such as length and due date) for students to complete the assignment.
Part 1: What Is an Information System?
Chapter 1: What Is an Information System?
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• define what an information system is by identifying its major components;
• describe the basic history of information systems; and
• describe the basic argument behind the article “Does IT Matter?” by Nicholas Carr.
Introduction
If you are reading this, you are most likely taking a course in information systems, but do you even know
what the course is going to cover? When you tell your friends or your family that you are taking a course
in information systems, can you explain what it is about? For the past several years, I have taught an
Introduction to Information Systems course. The first day of class I ask my students to tell me what they
think an information system is. I generally get answers such as “computers,” “databases,” or “Excel.”
These are good answers, but definitely incomplete ones. The study of information systems goes far beyond
understanding some technologies. Let’s begin our study by defining information systems.
Defining Information Systems
Almost all programs in business require students to take a course in something called information systems.
But what exactly does that term mean? Let’s take a look at some of the more popular definitions, first from
Wikipedia and then from a couple of textbooks:
• “Information systems (IS) is the study of complementary networks of hardware and software that
people and organizations use to collect, filter, process, create, and distribute data.”1
• “Information systems are combinations of hardware, software, and telecommunications networks
that people build and use to collect, create, and distribute useful data, typically in organizational
settings.”2
• “Information systems are interrelated components working together to collect, process, store, and
disseminate information to support decision making, coordination, control, analysis, and
visualization in an organization.”3
1. Wikipedia entry on “Information Systems,” as displayed on August 19, 2012. Wikipedia: The Free Encyclopedia. San Francisco:
Wikimedia Foundation. http://en.wikipedia.org/wiki/Information_systems_(discipline).
2. Excerpted from Information Systems Today – Managing in the Digital World, fourth edition. Prentice-Hall, 2010.
3. Excerpted from Management Information Systems, twelfth edition, Prentice-Hall, 2012.
As you can see, these definitions focus on two different ways of describing information systems:
the components that make up an information system and the role that those components play in an
organization. Let’s take a look at each of these.
The Components of Information Systems
As I stated earlier, I spend the first day of my information systems class discussing exactly what the
term means. Many students understand that an information system has something to do with databases
or spreadsheets. Others mention computers and e-commerce. And they are all right, at least in part:
information systems are made up of different components that work together to provide value to an
organization.
The first way I describe information systems to students is to tell them that they are made up of five components: hardware,
software, data, people, and process. The first three, fitting under the category technology, are generally what most students think of
when asked to define information systems. But the last two, people and process, are really what separate the idea of information
systems from more technical fields, such as computer science. In order to fully understand information systems, students must
understand how all of these components work together to bring value to an organization.
Technology
Technology can be thought of as the application of scientific knowledge for practical purposes. From the
invention of the wheel to the harnessing of electricity for artificial lighting, technology is a part of our lives
in so many ways that we tend to take it for granted. As discussed before, the first three components of
information systems – hardware, software, and data – all fall under the category of technology. Each of
these will get its own chapter and a much lengthier discussion, but we will take a moment here to introduce
them so we can get a full understanding of what an information system is.
Hardware
Information systems hardware is the part of an information system you can touch – the physical components
of the technology. Computers, keyboards, disk drives, iPads, and flash drives are all examples of
information systems hardware. We will spend some time going over these components and how they all
work together in chapter 2.
Software
Software is a set of instructions that tells the hardware what to do. Software is not
tangible – it cannot be touched. When programmers create software programs,
what they are really doing is simply typing out lists of instructions that tell the
hardware what to do. There are several categories of software, with the two main
categories being operating-system software, which makes the hardware usable, and
application software, which does something useful. Examples of operating systems
include Microsoft Windows on a personal computer and Google’s Android on a
mobile phone. Examples of application software are Microsoft Excel and Angry Birds. Software will be
explored more thoroughly in chapter 3.
Data
The third component is data. You can think of data as a collection of facts. For example, your street address,
the city you live in, and your phone number are all pieces of data. Like software, data is also intangible. By
themselves, pieces of data are not really very useful. But aggregated, indexed, and organized together into
a database, data can become a powerful tool for businesses. In fact, all of the definitions presented at the
beginning of this chapter focused on how information systems manage data. Organizations collect all kinds
of data and use it to make decisions. These decisions can then be analyzed as to their effectiveness and the
organization can be improved. Chapter 4 will focus on data and databases, and their uses in organizations.
Networking Communication: A Fourth Technology Piece?
Besides the components of hardware, software, and data, which have long been considered the core
technology of information systems, it has been suggested that one other component should be added:
communication. An information system can exist without the ability to communicate – the first personal
computers were stand-alone machines that did not access the Internet. However, in today’s hyper-connected
world, it is an extremely rare computer that does not connect to another device or to a network. Technically,
the networking communication component is made up of hardware and software, but it is such a core
feature of today’s information systems that it has become its own category. We will be covering networking
in chapter 5.
People
When thinking about information systems, it is easy to get focused
on the technology components and forget that we must look
beyond these tools to fully understand how they integrate into an
organization. A focus on the people involved in information
systems is the next step. From the front-line help-desk workers, to
systems analysts, to programmers, all the way up to the chief
information officer (CIO), the people involved with information
systems are an essential element that must not be overlooked. The
people component will be covered in chapter 9.
Process
The last component of information systems is process. A process is a series of steps undertaken to
achieve a desired outcome or goal. Information systems are becoming more and more integrated with
organizational processes, bringing more productivity and better control to those processes. But simply
automating activities using technology is not enough – businesses looking to effectively utilize information
systems do more. Using technology to manage and improve processes, both within a company and externally with suppliers and
customers, is the ultimate goal. Technology buzzwords such as “business process reengineering,” “business process management,”
and “enterprise resource planning” all have to do with the continued improvement of these business procedures and the integration
of technology with them. Businesses hoping to gain an advantage over their competitors are highly focused on this component of
information systems. We will discuss processes in chapter 8.
IBM 704 Mainframe (Copyright: Lawrence Livermore National Laboratory)
The Role of Information Systems
Now that we have explored the different components of information systems, we need to turn our attention
to the role that information systems play in an organization. So far we have looked at what the components
of an information system are, but what do these components actually do for an organization? From our
definitions above, we see that these components collect, store, organize, and distribute data throughout the
organization. In fact, we might say that one of the roles of information systems is to take data and turn it
into information, and then transform that into organizational knowledge. As technology has developed, this
role has evolved into the backbone of the organization. To get a full appreciation of the role information
systems play, we will review how they have changed over the years.
The Mainframe Era
From the late 1950s through the 1960s, computers were
seen as a way to more efficiently do calculations. These
first business computers were room-sized monsters, with
several refrigerator-sized machines linked together. The
primary work of these devices was to organize and store
large volumes of information that were tedious to manage
by hand. Only large businesses, universities, and
government agencies could afford them, and they took a
crew of specialized personnel and specialized facilities to
maintain. These devices served dozens to hundreds of
users at a time through a process called time-sharing.
Typical functions included scientific calculations and
accounting, under the broader umbrella of “data processing.”
In the late 1960s, Material Requirements Planning (MRP) systems
were introduced. This software, running on a mainframe computer, gave
companies the ability to manage the manufacturing process, making it
more efficient. From tracking inventory to creating bills of materials to
scheduling production, the MRP systems (and later the MRP II systems)
gave more businesses a reason to want to integrate computing into their
processes. IBM became the dominant mainframe company. Nicknamed
“Big Blue,” the company became synonymous with business computing. Continued improvement in
software and the availability of cheaper hardware eventually brought mainframe computers (and their little
sibling, the minicomputer) into most large businesses.
The PC Revolution
In 1975, the first microcomputer was announced on the cover of Popular Electronics: the Altair 8800.
Its immediate popularity sparked the imagination of entrepreneurs everywhere, and there were quickly
dozens of companies making these “personal computers.” Though at first just a niche product for computer
hobbyists, improvements in usability and the availability of practical software led to growing sales. The
most prominent of these early personal computer makers was a little company known as Apple Computer,
headed by Steve Jobs and Steve Wozniak, with the hugely successful “Apple II.” Not wanting to be left
out of the revolution, in 1981 IBM (teaming with a little company called Microsoft for their operating-
system software) hurriedly released their own version of the personal computer, simply called the “PC.”
Businesses, which had used IBM mainframes for years to run their operations, finally had the permission
they needed to bring personal computers into their companies, and the IBM PC took off. Time magazine
went on to name the computer its 1982 “Machine of the Year.”
Because of the IBM PC’s open architecture, it was easy for other companies to copy, or “clone” it.
During the 1980s, many new computer companies sprang up, offering less expensive versions of the PC.
This drove prices down and spurred innovation. Microsoft developed its Windows operating system and
made the PC even easier to use. Common uses for the PC during this period included word processing,
spreadsheets, and databases. These early PCs were not connected to any sort of network; for the most part
they stood alone as islands of innovation within the larger organization.
Client-Server
In the mid-1980s, businesses began to see the need to connect their computers together as a way to
collaborate and share resources. This networking architecture was referred to as “client-server” because
users would log in to the local area network (LAN) from their PC (the “client”) by connecting to a powerful
computer called a “server,” which would then grant them rights to different resources on the network (such
as shared file areas and a printer). Software companies began developing applications that allowed multiple
users to access the same data at the same time. This evolved into software applications for communicating,
with the first real popular use of electronic mail appearing at this time.
This networking and data sharing all stayed within the confines of each business,
for the most part. While there was sharing of electronic data between companies,
this was a very specialized function. Computers were now seen as tools to
collaborate internally, within an organization. In fact, these networks of computers
were becoming so powerful that they were replacing many of the functions
previously performed by the larger mainframe computers at a fraction of the cost.
It was during this era that the first Enterprise Resource Planning (ERP) systems were developed and run on
the client-server architecture. An ERP system is a software application with a centralized database that can
be used to run a company’s entire business. With separate modules for accounting, finance, inventory,
human resources, and many, many more, ERP systems, with Germany’s SAP leading the way, represented
the state of the art in information systems integration. We will discuss ERP systems as part of the chapter on
process (chapter 8).
The World Wide Web and E-Commerce
First invented in 1969, the Internet was confined to use by universities, government agencies, and
researchers for many years. Its rather arcane commands and user applications made it unsuitable for
mainstream use in business. One exception to this was the ability to expand electronic mail outside the
confines of a single organization. While the first e-mail messages on the Internet were sent in the early
1970s, companies who wanted to expand their LAN-based e-mail started hooking up to the Internet in the
1980s. Companies began connecting their internal networks to the Internet in order to allow communication
between their employees and employees at other companies. It was with these early Internet connections
that the computer truly began to evolve from a computational device to a communications device.
In 1989, Tim Berners-Lee developed a simpler way for researchers to share information over the
network at CERN laboratories, a concept he called the World Wide Web.4 This invention became the
launching point of the growth of the Internet as a way for businesses to share information about themselves.
As web browsers and Internet connections became the norm, companies rushed to grab domain names and
create websites.
In 1991, the National Science Foundation, which governed how the
Internet was used, lifted restrictions on its commercial use. The year 1994
saw the establishment of both eBay and Amazon.com, two true pioneers in
the use of the new digital marketplace. A mad rush of investment in
Internet-based businesses led to the dot-com boom through the late 1990s,
and then the dot-com bust in 2000. While much can be learned from the speculation and crazy economic
theories espoused during that bubble, one important outcome for businesses was that thousands of miles of
Internet connections were laid around the world during that time. The world became truly “wired” heading
into the new millennium, ushering in the era of globalization, which we will discuss in chapter 11.
As it became more expected for companies to be connected to the Internet, the digital world also
became a more dangerous place. Computer viruses and worms, once slowly propagated through the sharing
of computer disks, could now grow with tremendous speed via the Internet. Software written for a
disconnected world found it very difficult to defend against these sorts of threats. A whole new industry of
computer and Internet security arose. We will study information security in chapter 6.
Web 2.0
As the world recovered from the dot-com bust, the use of technology in business continued to evolve at
a frantic pace. Websites became interactive; instead of just visiting a site to find out about a business and
purchase its products, customers wanted to be able to customize their experience and interact with the
business. This new type of interactive website, where you did not have to know how to create a web page or
do any programming in order to put information online, became known as web 2.0. Web 2.0 is exemplified
by blogging, social networking, and interactive comments being available on many websites. This new
web-2.0 world, in which online interaction became expected, had a big impact on many businesses and
even whole industries. Some industries, such as bookstores, found themselves relegated to a niche status.
Others, such as video rental chains and travel agencies, simply began going out of business as they were
replaced by online technologies. This process of technology replacing a middleman in a transaction is called
disintermediation.
As the world became more connected, new questions arose. Should access to the Internet be
considered a right? Can I copy a song that I downloaded from the Internet? How can I keep information
that I have put on a website private? What information is acceptable to collect from children? Technology
moved so fast that policymakers did not have enough time to enact appropriate laws, making for a Wild
West–type atmosphere. Ethical issues surrounding information systems will be covered in chapter 12.
The Post-PC World
After thirty years as the primary computing device used in most businesses, sales of the PC are now
beginning to decline as sales of tablets and smartphones are taking off. Just as the mainframe before it, the
PC will continue to play a key role in business, but will no longer be the primary way that people interact
and do business. The limited storage and processing power of these devices is being offset by a move to
“cloud” computing, which allows for storage, sharing, and backup of information on a massive scale. This
will require new rounds of thinking and innovation on the part of businesses as technology continues to advance.

4. CERN’s “The Birth of the Web.” http://public.web.cern.ch/public/en/about/web-en.html
The Eras of Business Computing

Era | Hardware | Operating System | Applications
Mainframe (1970s) | Terminals connected to mainframe computer | Time-sharing (TSO) on MVS | Custom-written MRP software
PC (mid-1980s) | IBM PC or compatible; sometimes connected to mainframe computer via expansion card | MS-DOS | WordPerfect, Lotus 1-2-3
Client-Server (late 80s to early 90s) | IBM PC “clone” on a Novell Network | Windows for Workgroups | Microsoft Word, Microsoft Excel
World Wide Web (mid-90s to early 2000s) | IBM PC “clone” connected to company intranet | Windows XP | Microsoft Office, Internet Explorer
Web 2.0 (mid-2000s to present) | Laptop connected to company Wi-Fi | Windows 7 | Microsoft Office, Firefox
Post-PC (today and beyond) | Apple iPad | iOS | Mobile-friendly websites, mobile apps
Can Information Systems Bring Competitive Advantage?
It has always been the assumption that the implementation of information systems will, in and of itself,
bring a business competitive advantage. After all, if installing one computer to manage inventory can make
a company more efficient, won’t installing several computers to handle even more of the business continue
to improve it?
In 2003, Nicholas Carr wrote an article in the Harvard Business Review that questioned this
assumption. The article, entitled “IT Doesn’t Matter,” raised the idea that information technology has
become just a commodity. Instead of viewing technology as an investment that will make a company stand
out, it should be seen as something like electricity: It should be managed to reduce costs, ensure that it is
always running, and be as risk-free as possible.
As you might imagine, this article was both hailed and scorned. Can IT bring a competitive advantage?
It sure did for Walmart (see sidebar). We will discuss this topic further in chapter 7.
Sidebar: Walmart Uses Information Systems to Become the World’s Leading
Retailer
Walmart is the world’s largest retailer, earning $15.2
billion on sales of $443.9 billion in the fiscal year that
ended on January 31, 2012. Walmart currently serves over
200 million customers every week, worldwide.5 Walmart’s
rise to prominence is due in no small part to their use of
information systems.
One of the keys to this success was the
implementation of Retail Link, a supply-chain
management system. This system, unique when initially
implemented in the mid-1980s, allowed Walmart’s
suppliers to directly access the inventory levels and sales information of their products at any of Walmart’s
more than ten thousand stores. Using Retail Link, suppliers can analyze how well their products are selling
at one or more Walmart stores, with a range of reporting options. Further, Walmart requires the suppliers to
use Retail Link to manage their own inventory levels. If a supplier feels that their products are selling out
too quickly, they can use Retail Link to petition Walmart to raise the levels of inventory for their products.
This has essentially allowed Walmart to “hire” thousands of product managers, all of whom have a vested
interest in the products they are managing. This revolutionary approach to managing inventory has allowed
Walmart to continue to drive prices down and respond to market forces quickly.
Today, Walmart continues to innovate with information technology. Using its tremendous market
presence, any technology that Walmart requires its suppliers to implement immediately becomes a business
standard.

5. Walmart 2012 Annual Report.
Summary
In this chapter, you have been introduced to the concept of information systems. We have reviewed several
definitions, with a focus on the components of information systems: technology, people, and process. We
have reviewed how the business use of information systems has evolved over the years, from the use of
large mainframe computers for number crunching, through the introduction of the PC and networks, all
the way to the era of mobile computing. During each of these phases, new innovations in software and
technology allowed businesses to integrate technology more deeply.
We are now at a point where every company is using information systems and asking the question:
Does it bring a competitive advantage? In the end, that is really what this book is about. Every
businessperson should understand what an information system is and how it can be used to bring a
competitive advantage. And that is the task we have before us.
Study Questions
1. What are the five components that make up an information system?
2. What are three examples of information system hardware?
3. Microsoft Windows is an example of which component of information systems?
4. What is application software?
5. What roles do people play in information systems?
6. What is the definition of a process?
7. What was invented first, the personal computer or the Internet (ARPANET)?
8. In what year were restrictions on commercial use of the Internet first lifted? When were eBay
and Amazon founded?
9. What does it mean to say we are in a “post-PC world”?
10. What is Carr’s main argument about information technology?
Exercises
1. Suppose that you had to explain to a member of your family or one of your closest friends the
concept of an information system. How would you define it? Write a one-paragraph description in
your own words that you feel would best describe an information system to your friends or
family.
2. Of the five primary components of an information system (hardware, software, data, people,
process), which do you think is the most important to the success of a business organization?
Write a one-paragraph answer to this question that includes an example from your personal
experience to support your answer.
3. We all interact with various information systems every day: at the grocery store, at work, at
school, even in our cars (at least some of us). Make a list of the different information systems you
interact with every day. See if you can identify the technologies, people, and processes involved
in making these systems work.
4. Do you agree that we are in a post-PC stage in the evolution of information systems? Some
people argue that we will always need the personal computer, but that it will not be the primary
device used for manipulating information. Others think that a whole new era of mobile and
biological computing is coming. Do some original research and make your prediction about what
business computing will look like in the next generation.
5. The Walmart case study introduced you to how that company used information systems to
become the world’s leading retailer. Walmart has continued to innovate and is still looked to as a
leader in the use of technology. Do some original research and write a one-page report detailing a
new technology that Walmart has recently implemented or is pioneering.
Chapter 2: Hardware
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• describe information systems hardware;
• identify the primary components of a computer and the functions they perform; and
• explain the effect of the commoditization of the personal computer.
Introduction
As we learned in the first chapter, an information system is made up of five components: hardware,
software, data, people, and process. The physical parts of computing devices – those that you can actually
touch – are referred to as hardware. In this chapter, we will take a look at this component of information
systems, learn a little bit about how it works, and discuss some of the current trends surrounding it.
As stated above, computer hardware encompasses digital devices that you can physically touch. This
includes devices such as the following:
• desktop computers
• laptop computers
• mobile phones
• tablet computers
• e-readers
• storage devices, such as flash drives
• input devices, such as keyboards, mice, and scanners
• output devices such as printers and speakers.
Besides these more traditional computer hardware devices, many items that were once not considered
digital devices are now becoming computerized themselves. Digital technologies are now being integrated
into many everyday objects, so the days of a device being labeled categorically as computer hardware may
be ending. Examples of these types of digital devices include automobiles, refrigerators, and even soft-
drink dispensers. In this chapter, we will also explore digital devices, beginning with defining what we
mean by the term itself.
Digital Devices
A digital device processes electronic signals that represent either a one (“on”) or a zero (“off”). The on
state is represented by the presence of an electronic signal; the off state is represented by the absence of an
electronic signal. Each one or zero is referred to as a bit (a contraction of binary digit); a group of eight bits
is a byte. The first personal computers could process 8 bits of data at once; modern PCs can now process
64 bits of data at a time, which is where the term 64-bit processor comes from.
Sidebar: Understanding Binary
As you know, the system of numbering we are most familiar with is base-ten numbering. In base-ten
numbering, each column in the number represents a power of ten, with the far-right column representing
10^0 (ones), the next column from the right representing 10^1 (tens), then 10^2 (hundreds), then 10^3
(thousands), etc. For example, the number 1010 in decimal represents: (1 x 1000) + (0 x 100) + (1 x 10) +
(0 x 1).
Computers use the base-two numbering system, also known as binary. In this system, each column in
the number represents a power of two, with the far-right column representing 2^0 (ones), the next column
from the right representing 2^1 (twos), then 2^2 (fours), then 2^3 (eights), etc. For example, the number
1010 in binary represents (1 x 8) + (0 x 4) + (1 x 2) + (0 x 1). In base ten, this evaluates to 10.
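The same column-by-column expansion can be written as a short program. The following Python snippet is purely illustrative (the textbook itself contains no code); it converts the binary string "1010" to its decimal value:

    # Convert a binary string to a decimal number by walking the columns
    # from left to right, doubling the running total at each step.
    binary = "1010"
    value = 0
    for digit in binary:
        value = value * 2 + int(digit)   # shift one column left, then add the new bit
    print(value)   # prints 10, matching the hand calculation above

Python’s built-in int("1010", 2) performs the same conversion in a single call.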
As the capacities of digital devices grew, new terms were developed to identify the capacities of processors,
memory, and disk storage space. Prefixes were applied to the word byte to represent different orders of
magnitude. Since these are digital specifications, the prefixes were originally meant to represent multiples
of 1024 (which is 2^10), but have more recently been rounded to mean multiples of 1000.
A Listing of Binary Prefixes

Prefix | Represents | Example
kilo | one thousand | kilobyte = one thousand bytes
mega | one million | megabyte = one million bytes
giga | one billion | gigabyte = one billion bytes
tera | one trillion | terabyte = one trillion bytes
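To see how much the 1024-versus-1000 interpretation matters, here is a small illustrative Python calculation (the comparison itself is not from the text, only the two conventions it describes):

    # Compare a decimal gigabyte (powers of 1000) with a binary gigabyte (powers of 1024).
    decimal_gb = 1000 ** 3   # 1,000,000,000 bytes
    binary_gb = 1024 ** 3    # 1,073,741,824 bytes
    print(binary_gb - decimal_gb)            # 73,741,824 bytes of difference
    print(round(binary_gb / decimal_gb, 3))  # about 1.074, i.e., roughly 7% larger

This difference is why a hard disk advertised using decimal gigabytes typically reports a smaller capacity once an operating system measures it in powers of 1024.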
Tour of a PC
All personal computers consist of the same basic components: a CPU, memory, circuit board, storage, and
input/output devices. It also turns out that almost every digital device uses the same set of components, so
examining the personal computer will give us insight into the structure of a variety of digital devices. So
let’s take a “tour” of a personal computer and see what makes it function.
Processing Data: The CPU
As stated above, most computing devices have a similar architecture. The core of this architecture is the
central processing unit, or CPU. The CPU can be thought of as the “brains” of the device. The CPU carries
out the commands sent to it by the software and returns results to be acted upon.
The earliest CPUs were large circuit boards with limited functionality. Today, a CPU is generally on
one chip and can perform a large variety of functions. There are two primary manufacturers of CPUs for
personal computers: Intel and Advanced Micro Devices (AMD).
The speed (or “clock speed”) of a CPU is measured in hertz. A hertz is defined as one cycle per second.
Using the binary prefixes mentioned above, we can see that a kilohertz (abbreviated kHz) is one thousand
cycles per second, a megahertz (MHz) is one million cycles per second, and a gigahertz (GHz) is one billion
cycles per second. The CPU’s processing power is increasing at an amazing rate (see the sidebar about
Moore’s Law). Besides a faster clock speed, many CPU chips now contain multiple processors per chip.
These chips, known as dual-core (two processors) or quad-core (four processors), increase the processing
power of a computer by providing the capability of multiple CPUs.
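As a rough, back-of-the-envelope illustration (the clock speed and core count below are assumptions for the example, not figures from the text), the raw cycle budget of a multi-core CPU scales with both its clock speed and its number of cores:

    # Estimate total cycles per second for a hypothetical 3.0 GHz quad-core CPU.
    # Real performance also depends on caches, instruction sets, and software.
    clock_hz = 3.0e9   # 3.0 GHz = 3 billion cycles per second per core
    cores = 4          # a quad-core chip
    print(clock_hz * cores)   # prints 12000000000.0, i.e., 12 billion cycles per second in total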
Sidebar: Moore’s Law
We all know that computers get faster every year. Many times, we are not sure if we want to buy today’s
model of smartphone, tablet, or PC because next week it won’t be the most advanced any more. Gordon
Moore, one of the founders of Intel, recognized this phenomenon in 1965, noting that microprocessor
transistor counts had been doubling every year.1 His insight eventually evolved into Moore’s Law, which
states that the number of transistors on a chip will double every two years. This has been generalized into
the concept that computing power will double every two years for the same price point. Another way of
looking at this is to think that the price for the same computing power will be cut in half every two years.
Though many have predicted its demise, Moore’s Law has held true for over forty years (see figure below).
1. Moore, Gordon E. (1965). “Cramming more components onto integrated circuits” (PDF). Electronics Magazine. p. 4. Retrieved
2012-10-18.
A graphical representation of Moore’s Law (CC-BY-SA: Wgsimon)
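The arithmetic behind Moore’s Law is simple compounding. This short, illustrative Python loop (the starting transistor count is an arbitrary assumption) doubles a value every two years for two decades:

    # Double a transistor count every two years, the pattern Moore's Law describes.
    transistors = 1_000_000   # assumed starting point, year 0
    for year in range(0, 21, 2):
        print(year, transistors)
        transistors *= 2      # one doubling per two-year step
    # By year 20 (ten doublings) the printed count has grown by a factor of 1024.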
There will be a point, someday, where we reach the limits of Moore’s Law, where we cannot continue to
shrink circuits any further. But engineers will continue to seek ways to increase performance.
Motherboard
The motherboard is the main circuit board on the
computer. The CPU, memory, and storage components,
among other things, all connect into the motherboard.
Motherboards come in different shapes and sizes,
depending upon how compact or expandable the computer
is designed to be. Most modern motherboards have many
integrated components, such as video and sound
processing, which used to require separate components.
The motherboard provides much of the bus of the
computer (the term bus refers to the electrical connection
between different computer components). The bus is an
important determiner of the computer’s speed: the
combination of how fast the bus can transfer data and the
number of data bits that can be moved at one time
determine the speed.
Random-Access Memory
When a computer starts up, it begins to load information from the hard disk into its working memory.
This working memory, called random-access memory (RAM), can transfer data much faster than the hard
disk. Any program that you are running on the computer is loaded into RAM for processing. In order for
a computer to work effectively, some minimal amount of RAM must be installed. In most cases, adding
more RAM will allow the computer to run faster. Another characteristic of RAM is that it is “volatile.”
This means that it can store data as long as it is receiving power; when the computer is turned off, any data
stored in RAM is lost.
RAM is generally installed in a personal computer through
the use of a dual-inline memory module (DIMM). The
type of DIMM accepted into a computer is dependent upon
the motherboard. As described by Moore’s Law, the
amount of memory and speeds of DIMMs have increased
dramatically over the years.
Hard Disk
While the RAM is used as working memory, the computer also needs a place to
store data for the longer term. Most of today’s personal computers use a hard disk
for long-term data storage. A hard disk is where data is stored when the computer
is turned off and where it is retrieved from when the computer is turned on. Why is
it called a hard disk? A hard disk consists of a stack of disks inside a hard metal
case. A floppy disk (discussed below) was a removable disk that, in some cases at
least, was flexible, or “floppy.”
Hard disk enclosure
Solid-State Drives
A relatively new component becoming more common in some personal computers is the solid-state drive
(SSD). The SSD performs the same function as a hard disk: long-term storage. Instead of spinning disks,
the SSD uses flash memory, which is much faster.
Solid-state drives are currently quite a bit more expensive than hard disks. However, the use of flash
memory instead of disks makes them much lighter and faster than hard disks. SSDs are primarily utilized
in portable computers, making them lighter and more efficient. Some computers combine the two storage
technologies, using the SSD for the most accessed data (such as the operating system) while using the hard
disk for data that is accessed less frequently. As with any technology, Moore’s Law is driving up capacity
and speed and lowering prices of solid-state drives, which will allow them to proliferate in the years to
come.
Removable Media
Besides fixed storage components, removable storage media are also used in most personal computers.
Removable media allows you to take your data with you. And just as with all other digital technologies,
these media have gotten smaller and more powerful as the years have gone by. Early computers used floppy
disks, which could be inserted into a disk drive in the computer. Data was stored on a magnetic disk inside
an enclosure. These disks ranged from 8″ in the earliest days down to 3 1/2″.
Floppy-disk evolution (8″ to 5 1/4″ to 3 1/2″) (Public
Domain)
Around the turn of the century, a new portable storage technology was being developed: the USB flash
drive (more about the USB port later in the chapter). This device attaches to the universal serial bus (USB)
connector, which became standard on all personal computers beginning in the late 1990s. As with all other
storage media, flash drive storage capacity has skyrocketed over the years, from initial capacities of eight
megabytes to current capacities of 64 gigabytes and still growing.
Network Connection
When personal computers were first developed, they were stand-alone units, which meant that data was
brought into the computer or removed from the computer via removable media, such as the floppy disk.
Beginning in the mid-1980s, however, organizations began to see the value in connecting computers
together via a digital network. Because of this, personal computers needed the ability to connect to these
networks. Initially, this was done by adding an expansion card to the computer that enabled the network
connection, but by the mid-1990s, a network port was standard on most personal computers. As wireless
technologies began to dominate in the early 2000s, many personal computers also began including wireless
networking capabilities. Digital communication technologies will be discussed further in chapter 5.
Input and Output
In order for a personal computer to be useful, it must have channels for
receiving input from the user and channels for delivering output to the
user. These input and output devices connect to the computer via
various connection ports, which generally are part of the motherboard
and are accessible outside the computer case. In early personal
computers, specific ports were designed for each type of output
device. The configuration of these ports has evolved over the years,
becoming more and more standardized over time. Today, almost all
devices plug into a computer through the use of a USB port. This port type, first introduced in 1996, has
increased in its capabilities, both in its data transfer rate and power supplied.
Bluetooth
Besides USB, some input and output devices connect to the computer via a wireless-technology standard
called Bluetooth. Bluetooth was first invented in the 1990s and exchanges data over short distances using
radio waves. Bluetooth generally has a range of 100 to 150 feet. For devices to communicate via Bluetooth,
both the personal computer and the connecting device must have a Bluetooth communication chip installed.
Input Devices
All personal computers need components that allow the user to input data. Early computers used simply a
keyboard to allow the user to enter data or select an item from a menu to run a program. With the advent of
the graphical user interface, the mouse became a standard component of a computer. These two components
are still the primary input devices to a personal computer, though variations of each have been introduced
with varying levels of success over the years. For example, many new devices now use a touch screen as
the primary way of entering data.
Besides the keyboard and mouse, additional input devices are becoming more common. Scanners
allow users to input documents into a computer, either as images or as text. Microphones can be used to
record audio or give voice commands. Webcams and other types of video cameras can be used to record
video or participate in a video chat session.
Output Devices
Output devices are essential as well. The most obvious output device is a display, visually representing the
state of the computer. In some cases, a personal computer can support multiple displays or be connected to
larger-format displays such as a projector or large-screen television. Besides displays, other output devices
include speakers for audio output and printers for printed output.
Sidebar: What Hardware Components Contribute to the Speed of My Computer?
The speed of a computer is determined by many elements, some related to hardware and some related to
software. In hardware, speed is improved by giving the electrons shorter distances to traverse to complete
a circuit. Since the first CPU was created in the early 1970s, engineers have constantly worked to figure
out how to shrink these circuits and put more and more circuits onto the same chip. And this work has
paid off – the speed of computing devices has been continuously improving ever since.
The hardware components that contribute to the speed of a personal computer are the CPU, the
motherboard, RAM, and the hard disk. In most cases, these items can be replaced with newer, faster
components. In the case of RAM, simply adding more RAM can also speed up the computer. The table
below shows how each of these contributes to the speed of a computer. Besides upgrading hardware, there
are many changes that can be made to the software of a computer to make it faster.
Component | Speed measured by | Units | Description
CPU | Clock speed | GHz | The time it takes to complete a circuit.
Motherboard | Bus speed | MHz | How much data can move across the bus simultaneously.
RAM | Data transfer rate | MB/s | The time it takes for data to be transferred from memory to system.
Hard disk | Access time | ms | The time it takes before the disk can transfer data.
Hard disk | Data transfer rate | Mbit/s | The time it takes for data to be transferred from disk to system.
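To tie these units to something concrete, the following illustrative Python calculation (using assumed, typical throughput figures rather than values from the table) compares how long it takes to move the same amount of data from a hard disk and from RAM:

    # Time to transfer a 2,000 MB (2 GB) file at assumed throughput rates.
    file_size_mb = 2000
    hard_disk_mb_per_s = 150     # assumed hard disk throughput
    ram_mb_per_s = 10000         # assumed memory throughput
    print(file_size_mb / hard_disk_mb_per_s)   # about 13.3 seconds from disk
    print(file_size_mb / ram_mb_per_s)         # 0.2 seconds from RAM

This gap is one reason a computer loads programs from the hard disk into RAM before running them, as described in the RAM section above.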
Other Computing Devices
A personal computer is designed to be a general-purpose device. That is, it can be used to solve many
different types of problems. As the technologies of the personal computer have become more
commonplace, many of the components have been integrated into other devices that previously were purely
mechanical. We have also seen an evolution in what defines a computer. Ever since the invention of the
personal computer, users have clamored for a way to carry them around. Here we will examine several
types of devices that represent the latest trends in personal computing.
A modern laptop
Portable Computers
In 1983, Compaq Computer Corporation developed the
first commercially successful portable personal computer.
By today’s standards, the Compaq PC was not very
portable: weighing in at 28 pounds, this computer was
portable only in the most literal sense – it could be carried
around. But this was no laptop; the computer was designed
like a suitcase, to be lugged around and laid on its side to be
used. Besides portability, the Compaq was successful
because it was fully compatible with the software being run
by the IBM PC, which was the standard for business.
In the years that followed, portable computing
continued to improve, giving us laptop and notebook
computers. The “luggable” computer has given way to a
much lighter clamshell computer that weighs from 4 to 6
pounds and runs on batteries. In fact, the most recent
advances in technology give us a new class of laptop that is
quickly becoming the standard: these laptops are extremely
light and portable and use less power than their larger
counterparts. The MacBook Air is a good example of this:
it weighs less than three pounds and is only 0.68 inches thick!
Finally, as more and more organizations and individuals are moving much of their computing to the
Internet, laptops are being developed that use “the cloud” for all of their data and application storage. These
laptops are also extremely light because they have no need of a hard disk at all! A good example of this
type of laptop (sometimes called a netbook) is Samsung’s Chromebook.
Smartphones
The first modern-day mobile phone was invented in 1973. Resembling a brick and weighing in at two
pounds, it was priced out of reach for most consumers at nearly four thousand dollars. Since then, mobile
phones have become smaller and less expensive; today mobile phones are a modern convenience available
to all levels of society. As mobile phones evolved, they became more like small computers. These
smartphones have many of the same characteristics as a personal computer, such as an operating system
and memory. The first smartphone was the IBM Simon, introduced in 1994.
In January of 2007, Apple introduced the iPhone. Its ease of use and intuitive interface made it an
immediate success and solidified the future of smartphones. Running on an operating system called iOS,
the iPhone was really a small computer with a touch-screen interface. In 2008, the first Android phone was
released, with similar functionality.
Tablet Computers
A tablet computer is one that uses a touch screen as its primary input and is small enough and light enough
to be carried around easily. They generally have no keyboard and are self-contained inside a rectangular
case. The first tablet computers appeared in the early 2000s and used an attached pen as a writing device
for input. These tablets ranged in size from small personal digital assistants (PDAs), which were handheld,
to full-sized, 14-inch devices. Most early tablets used a version of an existing computer operating system,
such as Windows or Linux.
These early tablet devices were, for the most part, commercial failures. In January, 2010, Apple
introduced the iPad, which ushered in a new era of tablet computing. Instead of a pen, the iPad used
the finger as the primary input device. Instead of using the operating system of their desktop and laptop
computers, Apple chose to use iOS, the operating system of the iPhone. Because the iPad had a user
interface that was the same as the iPhone, consumers felt comfortable and sales took off. The iPad has set
the standard for tablet computing. After the success of the iPad, computer manufacturers began to develop
new tablets that utilized operating systems that were designed for mobile devices, such as Android.
The Rise of Mobile Computing
Mobile computing is having a huge impact on the business world today. The use of smartphones and tablet
computers is rising at double-digit rates each year. The Gartner Group, in a report issued in April, 2013,
estimates that over 1.7 billion mobile phones will ship worldwide in 2013, compared to just over 340 million
personal computers. Over half of these mobile phones are smartphones.2 Almost 200 million tablet computers
are predicted to ship in 2013. According to the report, PC shipments will continue to decline as phone and
tablet shipments continue to increase.3
Integrated Computing
Along with advances in computers themselves, computing technology is being integrated into many
everyday products. From automobiles to refrigerators to airplanes, computing technology is enhancing what
these devices can do and is adding capabilities that would have been considered science fiction just a few
years ago. Here are two of the latest ways that computing technologies are being integrated into everyday
products:
• The Smart House
• The Self-Driving Car
The Commoditization of the Personal Computer
Over the past thirty years, as the personal computer has gone from technical marvel to part of our everyday
lives, it has also become a commodity. The PC has become a commodity in the sense that there is very little
differentiation between computers, and the primary factor that controls their sale is their price. Hundreds of
manufacturers all over the world now create parts for personal computers. Dozens of companies buy these
parts and assemble the computers. As commodities, there are essentially no differences between computers
made by these different companies. Profit margins for personal computers are razor-thin, leading hardware
developers to find the lowest-cost manufacturing.
There is one brand of computer for which this is not the case – Apple. Because Apple does not make
computers that run on the same open standards as other manufacturers, they can make a unique product that
no one can easily copy. By creating what many consider to be a superior product, Apple can charge more
2. Smartphone shipments to surpass feature phones this year. CNet, June 4, 2013. http://news.cnet.com/8301-1035_3-57587583-94/
smartphone-shipments-to-surpass-feature-phones-this-year/
3. Gartner Press Release. April 4, 2013. http://www.gartner.com/newsroom/id/2408515
for their computers than other manufacturers. Just as with the iPad and iPhone, Apple has chosen a strategy
of differentiation, which, at least at this time, seems to be paying off.
The Problem of Electronic Waste
Personal computers have been around for over thirty-five years.
Millions of them have been used and discarded. Mobile phones are
now available in even the remotest parts of the world and, after a few
years of use, they are discarded. Where does this electronic debris end
up?
Often, it gets routed to any country that will accept it. Many
times, it ends up in dumps in developing nations. These dumps are
beginning to be seen as health hazards for those living near them.
Though many manufacturers have made strides in using materials that
can be recycled, electronic waste is a problem with which we must all
deal.
Summary
Information systems hardware consists of the components of digital
technology that you can touch. In this chapter, we reviewed the
components that make up a personal computer, with the understanding
that the configuration of a personal computer is very similar to that of any type of digital computing device.
A personal computer is made up of many components, most importantly the CPU, motherboard, RAM, hard
disk, removable media, and input/output devices. We also reviewed some variations on the personal
computer, such as the tablet computer and the smartphone. In accordance with Moore’s Law, these
technologies have improved quickly over the years, making today’s computing devices much more
powerful than devices just a few years ago. Finally, we discussed two of the consequences of this evolution:
the commoditization of the personal computer and the problem of electronic waste.
Study Questions
1. Write your own description of what the term information systems hardware means.
2. What is the impact of Moore’s Law on the various hardware components described in this
chapter?
3. Write a summary of one of the items linked to in the “Integrated Computing” section.
4. Explain why the personal computer is now considered a commodity.
5. The CPU can also be thought of as the _____________ of the computer.
6. List the following in increasing order (slowest to fastest): megahertz, kilohertz, gigahertz.
7. What is the bus of a computer?
8. Name two differences between RAM and a hard disk.
9. What are the advantages of solid-state drives over hard disks?
10. How heavy was the first commercially successful portable computer?
Exercises
1. Review the sidebar on the binary number system. How would you represent the number 16 in
binary? How about the number 100? Besides decimal and binary, other number bases are used in
computing and programming. One of the most used bases is hexadecimal, which is base-16. In
base-16, the numerals 0 through 9 are supplemented with the letters A (10) through F (15). How
would you represent the decimal number 100 in hexadecimal?
2. Review the timeline of computers at the Old Computers website. Pick one computer from the
listing and write a brief summary. Include the specifications for CPU, memory, and screen size.
Now find the specifications of a computer being offered for sale today and compare. Did Moore’s
Law hold true?
3. The Homebrew Computer Club was one of the original clubs for enthusiasts of the first
personal computer, the Altair 8800. Read some of their newsletters and then discuss some of the
issues surrounding this early personal computer.
4. If you could build your own personal computer, what components would you purchase? Put
together a list of the components you would use to create it, including a computer case,
motherboard, CPU, hard disk, RAM, and DVD drive. How can you be sure they are all
compatible with each other? How much would it cost? How does this compare to a similar
computer purchased from a vendor such as Dell or HP?
5. Review the Wikipedia entry on electronic waste. Now find at least two more scholarly articles
on this topic. Prepare a slideshow that summarizes the issue and then recommend a possible
solution based on your research.
6. As with any technology text, there have been advances in technologies since publication. What
technology that has been developed recently would you add to this chapter?
7. What is the current state of solid-state drives vs. hard disks? Do original research online where
you can compare price on solid-state drives and hard disks. Be sure you note the differences in
price, capacity, and speed.
Chapter 3: Software
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• define the term software;
• describe the two primary categories of software;
• describe the role ERP software plays in an organization;
• describe cloud computing and its advantages and disadvantages for use in an organization; and
• define the term open-source and identify its primary characteristics.
Introduction
The second component of an information system is software. Simply put: Software is the set of instructions
that tell the hardware what to do. Software is created through the process of programming (we will cover the
creation of software in more detail in chapter 10). Without software, the hardware would not be functional.
Types of Software
Software can be broadly divided into two categories: operating systems and
application software. Operating systems manage the hardware and create
the interface between the hardware and the user. Application software is
the category of programs that do something useful for the user.
Operating Systems
The operating system provides several essential functions, including:
1. managing the hardware resources of the computer;
2. providing the user-interface components;
3. providing a platform for software developers to write
applications.
All computing devices run an operating system. For personal computers, the most popular operating
systems are Microsoft’s Windows, Apple’s OS X, and different versions of Linux. Smartphones and tablets
run operating systems as well, such as Apple’s iOS, Google’s Android, Microsoft’s Windows Mobile, and
Blackberry.
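As a small illustration of the operating system acting as a platform for software developers, the sketch below (not from the text) uses Python's standard platform and os modules to ask the operating system which system it is and how many processors it manages. The same few lines run unchanged on Windows, OS X, or Linux, because each operating system answers these calls in its own way.

    import os
    import platform

    # Ask the operating system to describe itself and the hardware it manages.
    print("Operating system:", platform.system(), platform.release())
    print("Processor:", platform.processor() or "unknown")
    print("Logical CPUs:", os.cpu_count())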
Early personal-computer operating systems were simple by today’s standards; they did not provide
multitasking and required the user to type commands to initiate an action. The amount of memory that early
operating systems could handle was limited as well, making large programs impractical to run. The most
popular of the early operating systems was IBM’s Disk Operating System, or DOS, which was actually
developed for them by Microsoft.
In 1984, Apple introduced the Macintosh computer, featuring an operating system with a graphical
user interface. Though not the first graphical operating system, it was the first one to find commercial
success. In 1985, Microsoft released the first version of Windows. This version of Windows was not an
operating system, but instead was an application that ran on top of the DOS operating system, providing
a graphical environment. It was quite limited and had little commercial success. It was not until the
1990 release of Windows 3.0 that Microsoft found success with a graphical user interface. Because of
the hold of IBM and IBM-compatible personal computers on business, it was not until Windows 3.0 was
released that business users began using a graphical user interface, ushering us into the graphical-computing
era. Since 1990, both Apple and Microsoft have released many new versions of their operating systems,
with each release adding the ability to process more data at once and access more memory. Features such
as multitasking, virtual memory, and voice input have become standard features of both operating systems.
A third personal-computer operating system family that is gaining in popularity is Linux
(pronounced “linn-ex”). Linux is a version of the Unix operating system that runs on the
personal computer. Unix is an operating system used primarily by scientists and
engineers on larger minicomputers. These are very expensive computers, and software
developer Linus Torvalds wanted to find a way to make Unix run on less expensive
personal computers. Linux was the result. Linux has many variations and now powers a
large percentage of web servers in the world. It is also an example of open-source
software, a topic we will cover later in this chapter.
Sidebar: Mac vs. Windows
Are you a Mac? Are you a PC? Ever since its introduction in 1984, users of the Apple Macintosh have been
quite biased about their preference for the Macintosh operating system (now called OS X) over Microsoft’s.
When Microsoft introduced Windows, Apple sued Microsoft, claiming that they copied the “look and feel”
of the Macintosh operating system. In the end, Microsoft successfully defended themselves.
Over the past few years, Microsoft and Apple have traded barbs with each other, each claiming to
have a better operating system and software. While Microsoft has always had the larger market share (see
sidebar), Apple has been the favorite of artists, musicians, and the technology elite. Apple also provides a
lot of computers to elementary schools, thus gaining a following among the younger generation.
Sidebar: Why Is Microsoft Software So Dominant in the Business World?
If you’ve worked in the world of business, you may have noticed that almost all of the computers run a
version of Microsoft’s Windows operating system. Why is this? On almost all college campuses, you see a
preponderance of Apple Macintosh laptops. In elementary schools, Apple reigns as well. Why has this not
extended into the business world?
As we learned in chapter 1, almost all businesses used IBM mainframe computers back in the 1960s
and 1970s. These same businesses shied away from personal computers until IBM released the PC in 1981.
When executives had to make a decision about purchasing personal computers for their employees, they
would choose the safe route and purchase IBM. The saying then was: “No one ever got fired for buying
IBM.” So over the next decade, companies bought IBM personal computers (or those compatible with
them), which ran an operating system called DOS. DOS was created by Microsoft, so when Microsoft
released Windows as the next iteration of DOS, companies took the safe route and started purchasing
Windows.
Microsoft soon found itself with the dominant personal-computer operating system for businesses.
As the networked personal computer began to replace the mainframe computer as the primary way of
computing inside businesses, it became essential for Microsoft to give businesses the ability to administer
and secure their networks. Microsoft developed business-level server products to go along with their
personal computer products, thereby providing a complete business solution. And so now, the saying goes:
“No one ever got fired for buying Microsoft.”
Application Software
The second major category of software is application software. Application software is, essentially,
software that allows the user to accomplish some goal or purpose. For example, if you have to write a paper,
you might use the application-software program Microsoft Word. If you want to listen to music, you might
use iTunes. To surf the web, you might use Internet Explorer or Firefox. Even a computer game could be
considered application software.
The “Killer” App
When a new type of digital device is invented, there is generally a
small group of technology enthusiasts who will purchase it just for
the joy of figuring out how it works. However, for most of us, until
a device can actually do something useful we are not going to
spend our hard-earned money on it. A “killer” application is one
that becomes so essential that large numbers of people will buy a
device just to run that application. For the personal computer, the
killer application was the spreadsheet. In 1979, VisiCalc, the first
personal-computer spreadsheet package, was introduced. It was an
immediate hit and drove sales of the Apple II. It also solidified the
value of the personal computer beyond the relatively small circle
of technology geeks. When the IBM PC was released, another
spreadsheet program, Lotus 1-2-3, was the killer app for business
users.
Productivity Software
Along with the spreadsheet, several other software applications have become standard tools for the
workplace. These applications, called productivity software, allow office employees to complete their daily
work. Many times, these applications come packaged together, such as in Microsoft’s Office suite. Here is
a list of these applications and their basic functions:
• Word processing: This class of software provides for the creation of written documents. Functions
include the ability to type and edit text, format fonts and paragraphs, and add, move, and delete
text throughout the document. Most modern word-processing programs also have the ability to
add tables, images, and various layout and formatting features to the document. Word processors
save their documents as electronic files in a variety of formats. By far, the most popular word-
processing package is Microsoft Word, which saves its files in the DOCX format. This format can
be read/written by many other word-processor packages.
• Spreadsheet: This class of software provides a way to do numeric calculations and analysis. The
working area is divided into rows and columns, where users can enter numbers, text, or formulas.
It is the formulas that make a spreadsheet powerful, allowing the user to develop complex
calculations that can change based on the numbers entered (see the short sketch after this list). Most spreadsheets also include the
ability to create charts based on the data entered. The most popular spreadsheet package is
Microsoft Excel, which saves its files in the XLSX format. Just as with word processors, many
other spreadsheet packages can read and write to this file format.
• Presentation: This class of software provides for the creation of slideshow presentations.
Harkening back to the days of overhead projectors and transparencies, presentation software
allows its users to create a set of slides that can be printed or projected on a screen. Users can add
text, images, and other media elements to the slides. Microsoft’s PowerPoint is the most popular
software right now, saving its files in PPTX format.
• Some office suites include other types of software. For example, Microsoft Office includes
Outlook, its e-mail package, and OneNote, an information-gathering collaboration tool. The
professional version of Office also includes Microsoft Access, a database package. (Databases are
covered more in chapter 4.)
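Here is the sketch referred to in the spreadsheet item above: a deliberately simplified, hypothetical illustration of what a spreadsheet formula does. The total is defined once as a formula over a column of numbers, and recalculating it after the numbers change produces an updated result, just as a cell formula would.

    # A toy "spreadsheet column" with a formula that recalculates on demand.
    sales = [1200.00, 950.50, 1100.25]      # values typed into a column

    def total(column):
        """Formula cell, roughly equivalent to =SUM(column)."""
        return sum(column)

    print("Total sales:", total(sales))      # 3250.75

    sales.append(800.00)                     # the user enters another number
    print("Updated total:", total(sales))    # the formula recalculates: 4050.75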
Microsoft popularized the idea of the office-software productivity bundle with their release of Microsoft
Office. This package continues to dominate the market and most businesses expect employees to know
how to use this software. However, many competitors to Microsoft Office do exist and are compatible
with the file formats used by Microsoft (see table below). Recently, Microsoft has begun to offer a web
version of their Office suite. Similar to Google Drive, this suite allows users to edit and share documents
online utilizing cloud-computing technology. Cloud computing will be discussed later in this chapter.
Comparison of office application software suites
Utility Software and Programming Software
Two subcategories of application software worth mentioning are utility software and programming
software. Utility software includes software that allows you to fix or modify your computer in some way.
Examples include antivirus software and disk defragmentation software. These types of software packages
were invented to fill shortcomings in operating systems. Many times, a subsequent release of an operating
system will include these utility functions as part of the operating system itself.
Programming software is software whose purpose is to make more software. Most of these programs
provide programmers with an environment in which they can write the code, test it, and convert it into the
format that can then be run on a computer.
Sidebar: “PowerPointed” to Death
As presentation software, specifically Microsoft PowerPoint, has gained acceptance as the primary method
to formally present information in a business setting, the art of giving an engaging presentation is becoming
rare. Many presenters now just read the bullet points in the presentation and immediately bore those in
attendance, who can already read it for themselves.
The real problem is not with PowerPoint as much as it is with the person creating and presenting. Author
and thinker Seth Godin put it this way: “PowerPoint could be the most powerful tool on your computer.
But it’s not. It’s actually a dismal failure. Almost every PowerPoint presentation sucks rotten eggs.”1 The
software used to help you communicate should not duplicate the presentation you want to give, but instead
1. From Why are your PowerPoints so bad? available for download at http://www.sethgodin.com/freeprize/reallybad-1 .
it should support it. I highly recommend the book Presentation Zen by Garr Reynolds to anyone who
wants to improve their presentation skills.
Software developers are becoming aware of this problem as well. New digital presentation technologies
are being developed, with the hopes of becoming “the next PowerPoint.” One innovative new presentation
application is Prezi. Prezi is a presentation tool that uses a single canvas for the presentation, allowing
presenters to place text, images, and other media on the canvas, and then navigate between these objects
as they present. Just as with PowerPoint, Prezi should be used to supplement the presentation. And we
must always remember that sometimes the best presentations are made with no digital tools.
Sidebar: I Own This Software, Right? Well . . .
When you purchase software and install it on your computer, are you the owner of that software?
Technically, you are not! When you install software, you are actually just being given a license to use it.
When you first install a software package, you are asked to agree to the terms of service or the license
agreement. In that agreement, you will find that your rights to use the software are limited. For example, in
the terms of the Microsoft Office Excel 2010 software license, you will find the following statement: “This
software is licensed, not sold. This agreement only gives you some rights to use the features included in the
software edition you licensed.”
For the most part, these restrictions are what you would expect: you cannot make illegal copies of the
software and you may not use it to do anything illegal. However, there are other, more unexpected terms in
these software agreements. For example, many software agreements ask you to agree to a limit on liability.
Again, from Microsoft: “Limitation on and exclusion of damages. You can recover from Microsoft and its
suppliers only direct damages up to the amount you paid for the software. You cannot recover any other
damages, including consequential, lost profits, special, indirect or incidental damages.” What this means is
that if a problem with the software causes harm to your business, you cannot hold Microsoft or the supplier
responsible for damages.
Applications for the Enterprise
As the personal computer proliferated inside organizations, control over the information generated by
the organization began splintering. Say the customer service department creates a customer database to
keep track of calls and problem reports, and the sales department also creates a database to keep track of
customer information. Which one should be used as the master list of customers? As another example,
someone in sales might create a spreadsheet to calculate sales revenue, while someone in finance creates
a different one that meets the needs of their department. However, it is likely that the two spreadsheets
will come up with different totals for revenue. Which one is correct? And who is managing all of this
information?
Enterprise Resource Planning
In the 1990s, the need to bring the organization’s information back under centralized control became more
apparent. The enterprise resource planning (ERP) system (sometimes just called enterprise software) was
developed to bring together an entire organization in one software application. Simply put, an ERP system
is a software application utilizing a central database that is implemented throughout the entire organization.
Let’s take a closer look at this definition:
• “A software application”: An ERP is a software application that is used by many of an
organization’s employees.
• “utilizing a central database”: All users of the ERP edit and save their information from the same data
source. What this means practically is that there is only one customer database, there is only one
calculation for revenue, etc.
• “that is implemented throughout the entire organization”: ERP systems include functionality that
covers all of the essential components of a business. Further, an organization can purchase
modules for its ERP system that match specific needs, such as manufacturing or planning.
ERP systems were originally marketed to large corporations.
However, as more and more large companies began installing
them, ERP vendors began targeting mid-sized and even smaller
businesses. Some of the more well-known ERP systems include
those from SAP, Oracle, and Microsoft.
In order to effectively implement an ERP system in an
organization, the organization must be ready to make a full
commitment. All aspects of the organization are affected as old
systems are replaced by the ERP system. In general, implementing an ERP system can take two to three
years and several million dollars. In most cases, the cost of the software is not the most expensive part of
the implementation: it is the cost of the consultants!
So why implement an ERP system? If done properly, an ERP system can bring an organization a good
return on their investment. By consolidating information systems across the enterprise and using the
software to enforce best practices, most organizations see an overall improvement after implementing an
ERP. Business processes as a form of competitive advantage will be covered in chapter 8.
Sidebar: Y2K and ERP
The initial wave of software-application development began in the 1960s, when applications were
developed for mainframe computers. In those days, computing was expensive, so applications were
designed to take as little space as possible. One shortcut that many programmers took was in the storage
of dates, specifically the year. Instead of allocating four digits to hold the year, many programs allocated
two digits, making the assumption that the first two digits were “19”. For example, to calculate how old
someone was, the application would take the last two digits of the current year (for 1995, for example, that
would be “95”) and then subtract the two digits stored for the birthday year (“65” for 1965). 95 minus 65
gives an age of 30, which is correct.
However, as the year 2000 approached, many of these “legacy” applications were still being used, and
businesses were very concerned that any software applications they were using that needed to calculate
dates would fail. To update our age-calculation example, the application would take the last two digits of
the current year (for 2012, that would be “12”) and then subtract the two digits stored for the birthday
year (“65” for 1965). 12 minus 65 gives an age of -53, which would cause an error. In order to solve this
problem, applications would have to be updated to use four digits for years instead of two. Solving this
would be a massive undertaking, as every line of code and every database would have to be examined.
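The sketch below is a reconstruction of the kind of logic described above (not actual legacy code). It shows why the two-digit shortcut works in 1995 but breaks in 2012, and how storing four-digit years fixes it.

    def age_two_digit(current_year_2d, birth_year_2d):
        # Legacy-style calculation: both years stored as two digits.
        return current_year_2d - birth_year_2d

    print(age_two_digit(95, 65))   # 1995 - 1965 -> 30, correct
    print(age_two_digit(12, 65))   # 2012 - 1965 should be 47, but returns -53

    def age_four_digit(current_year, birth_year):
        # Y2K-compliant version: years stored with four digits.
        return current_year - birth_year

    print(age_four_digit(2012, 1965))   # 47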
This is where companies gained additional incentive to implement an ERP system. For many
organizations that were considering upgrading to ERP systems in the late 1990s, this problem, known as
Y2K (year 2000), gave them the extra push they needed to get their ERP installed before the year 2000.
ERP vendors guaranteed that their systems had been designed to be Y2K compliant – which simply meant
that they stored dates using four digits instead of two. This led to a massive increase in ERP installations in
the years leading up to 2000, making the ERP a standard software application for businesses.
Customer Relationship Management
A customer relationship management (CRM) system is a software application designed to manage an
organization’s customers. In today’s environment, it is important to develop relationships with your
customers, and the use of a well-designed CRM can allow a business to personalize its relationship with
each of its customers. Some ERP software systems include CRM modules. An example of a well-known
CRM package is Salesforce.
Supply Chain Management
Many organizations must deal with the complex task of managing their supply chains. At its simplest,
a supply chain is the linkage between an organization’s suppliers, its manufacturing facilities, and the
distributors of its products. Each link in the chain has a multiplying effect on the complexity of the process:
if there are two suppliers, one manufacturing facility, and two distributors, for example, then there are 2 x
1 x 2 = 4 links to handle. However, if you add two more suppliers, another manufacturing facility, and two
more distributors, then you have 4 x 2 x 4 = 32 links to manage.
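Using the simple multiplication described above, the number of links grows quickly as partners are added. The short sketch below just reproduces the two examples from the text.

    def supply_chain_links(suppliers, facilities, distributors):
        # Each supplier-facility-distributor path is one link to manage.
        return suppliers * facilities * distributors

    print(supply_chain_links(2, 1, 2))   # 4 links
    print(supply_chain_links(4, 2, 4))   # 32 links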
A supply chain management (SCM) system manages the interconnection between these links, as well
as the inventory of the products in their various stages of development. A full definition of a supply chain
management system is provided by the Association for Operations Management: “The design, planning,
execution, control, and monitoring of supply chain activities with the objective of creating net value,
building a competitive infrastructure, leveraging worldwide logistics, synchronizing supply with demand,
and measuring performance globally.”2 Most ERP systems include a supply chain management module.
Mobile Applications
Just as with the personal computer, mobile devices such as tablet computers and smartphones also have
operating systems and application software. In fact, these mobile devices are in many ways just smaller
versions of personal computers. A mobile app is a software application programmed to run specifically on
a mobile device.
2. http://www.apics.org/dictionary/dictionary-information?ID=3984
As we saw in chapter 2, smartphones and tablets are becoming a dominant form of computing, with
many more smartphones being sold than personal computers. This means that organizations will have to
get smart about developing software on mobile devices in order to stay relevant.
These days, most mobile devices run on one of two operating systems: Android or iOS. Android is
an open-source operating system purchased and supported by Google; iOS is Apple’s mobile operating
system. In the fourth quarter of 2012, Android was installed on 70.1% of all smartphones shipped,
followed by 21.0% for iOS. Other mobile operating systems of note are Blackberry (3.2%) and Windows
(2.6%). 3
As organizations consider making their digital presence compatible with mobile devices, they will
have to decide whether to build a mobile app. A mobile app is an expensive proposition, and it will only
run on one type of mobile device at a time. For example, if an organization creates an iPhone app, those
with Android phones cannot run the application. Each app takes several thousand dollars to create, so this
is not a trivial decision for many companies.
One option many companies have is to create a website that is mobile-friendly. A mobile website
works on all mobile devices and costs about the same as creating an app. We will discuss the question of
whether to build a mobile app more thoroughly in Chapter 10.
Cloud Computing
Historically, for software to run on a computer, an individual copy of the software had to be installed on
the computer, either from a disk or, more recently, after being downloaded from the Internet. The concept
of “cloud” computing changes this, however.
To understand cloud computing, we first have to understand what the cloud is. “The cloud” refers to
applications, services, and data storage on the Internet. These service providers rely on giant server farms
and massive storage devices that are connected via Internet protocols. Cloud computing is the use of these
services by individuals and organizations.
You probably already use cloud computing in some forms. For example, if you access your e-mail
via your web browser, you are using a form of cloud computing. If you use Google Drive’s applications,
you are using cloud computing. While these are free versions of cloud computing, there is big business in
providing applications and data storage over the web. Salesforce (see above) is a good example of cloud
computing – their entire suite of CRM applications are offered via the cloud. Cloud computing is not limited
to web applications: it can also be used for services such as phone or video streaming.
Advantages of Cloud Computing
• No software to install or upgrades to maintain.
• Available from any computer that has access to the Internet.
• Can scale to a large number of users easily.
• New applications can be up and running very quickly.
• Services can be leased for a limited time on an as-needed basis.
• Your information is not lost if your hard disk crashes or your laptop is stolen.
• You are not limited by the available memory or disk space on your computer.
3. Taken from IDC Worldwide Mobile Phone Tracker, February 14, 2013. Full report available at http://www.idc.com/
getdoc.jsp?containerId=prUS23946013
Disadvantages of Cloud Computing
• Your information is stored on someone else’s computer – how safe is it?
• You must have Internet access to use it. If you do not have access, you’re out of luck.
• You are relying on a third-party to provide these services.
Cloud computing has the ability to really impact how organizations manage technology. For example, why
is an IT department needed to purchase, configure, and manage personal computers and software when all
that is really needed is an Internet connection?
Using a Private Cloud
Many organizations are understandably nervous about giving up control of their data and some of their
applications by using cloud computing. But they also see the value in reducing the need for installing
software and adding disk storage to local computers. A solution to this problem lies in the concept of a
private cloud. While there are various models of a private cloud, the basic idea is for the cloud service
provider to section off web server space for a specific organization. The organization has full control over
that server space while still gaining some of the benefits of cloud computing.
Virtualization
One technology that is utilized extensively as part of cloud computing is “virtualization.” Virtualization
is the process of using software to simulate a computer or some other device. For example, using
virtualization, a single computer can perform the functions of several computers. Companies such as EMC
provide virtualization software that allows cloud service providers to provision web servers to their clients
quickly and efficiently. Organizations are also implementing virtualization in order to reduce the number
of servers needed to provide the necessary services. For more detail on how virtualization works, see this
informational page from VMWare.
Software Creation
How is software created? If software is the set of instructions that tells the hardware what to do, how are
these instructions written? If a computer reads everything as ones and zeroes, do we have to learn how to
write software that way?
Modern software applications are written using a programming language. A programming language
consists of a set of commands and syntax that can be organized logically to execute specific functions.
This language generally consists of a set of readable words combined with symbols. Using this language,
a programmer writes a program (called the source code) that can then be compiled into machine-readable
form, the ones and zeroes necessary to be executed by the CPU. Examples of well-known programming
languages today include Java, PHP, and various flavors of C (Visual C, C++, C#). Languages such as
HTML and Javascript are used to develop web pages. Most of the time, programming is done inside a
programming environment; when you purchase a copy of Visual Studio from Microsoft, it provides you
with an editor, compiler, and help for many of Microsoft’s programming languages.
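As a minimal example of what source code looks like, the short program below is written in Python, just one of many possible languages (the text's own examples include Java, PHP, and C). A person can read and edit these lines, and the language's tools translate them into the machine instructions the CPU actually executes.

    # A complete, very small program: source code a programmer writes
    # and a language tool translates for the CPU to run.

    def circle_area(radius):
        """Return the area of a circle with the given radius."""
        pi = 3.14159
        return pi * radius * radius

    print("Area of a circle with radius 2:", circle_area(2))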
Software programming was originally an individual process, with each programmer working on an
entire program, or several programmers each working on a portion of a larger program. However, newer
methods of software development include a more collaborative approach, with teams of programmers
working on code together. We will cover information-systems development more fully in chapter 10.
Open-Source Software
When the personal computer was first released, it did not serve any practical need. Early computers were
difficult to program and required great attention to detail. However, many personal-computer enthusiasts
immediately banded together to build applications and solve problems. These computer enthusiasts were
happy to share any programs they built and solutions to problems they found; this collaboration enabled
them to more quickly innovate and fix problems.
As software began to become a business, however, this idea of sharing everything fell out of favor, at
least with some. When a software program takes hundreds of man-hours to develop, it is understandable
that the programmers do not want to just give it away. This led to a new business model of restrictive
software licensing, which required payment for software, a model that is still dominant today. This model
is sometimes referred to as closed source, as the source code is not made available to others.
There are many, however, who feel that software should not be restricted. Just as with those early
hobbyists in the 1970s, they feel that innovation and progress can be made much more rapidly if we share
what we learn. In the 1990s, with Internet access connecting more and more people together, the open-
source movement gained steam.
Open-source software is software that makes the source code available for anyone to copy and use. For
most of us, having access to the source code of a program does us little good, as we are not programmers
and won’t be able to do much with it. The good news is that open-source software is also available in
a compiled format that we can simply download and install. The open-source movement has led to the
development of some of the most-used software in the world, including the Firefox browser, the Linux
operating system, and the Apache web server. Many also think open-source software is superior to closed-
source software. Because the source code is freely available, many programmers have contributed to open-
source software projects, adding features and fixing bugs.
Many businesses are wary of open-source software precisely because the code is available for anyone
to see. They feel that this increases the risk of an attack. Others counter that this openness actually decreases
the risk because the code is exposed to thousands of programmers who can incorporate code changes to
quickly patch vulnerabilities.
There are many arguments on both sides of the aisle for the benefits of the two models. Some benefits
of the open-source model are:
• The software is available for free.
• The software source-code is available; it can be examined and reviewed before it is installed.
• The large community of programmers who work on open-source projects leads to quick bug-
fixing and feature additions.
Some benefits of the closed-source model are:
• By providing financial incentive for software development, some of the brightest minds have
chosen software development as a career.
• Technical support from the company that developed the software.
Today there are thousands of open-source software applications available for download. For example, as
we discussed previously in this chapter, you can get the productivity suite from Open Office. One good
place to search for open-source software is sourceforge.net, where thousands of software applications are
available for free download.
Summary
Software gives the instructions that tell the hardware what to do. There are two basic categories of software:
operating systems and applications. Operating systems provide access to the computer hardware and make
system resources available. Application software is designed to meet a specific goal. Productivity software
is a subset of application software that provides basic business functionality to a personal computer: word
processing, spreadsheets, and presentations. An ERP system is a software application with a centralized
database that is implemented across the entire organization. Cloud computing is a method of software
delivery that runs on any computer that has a web browser and access to the Internet. Software is developed
through a process called programming, in which a programmer uses a programming language to put
together the logic needed to create the program. While most software is developed using a closed-source
model, the open-source movement is gaining more support today.
Study Questions
1. Come up with your own definition of software. Explain the key terms in your definition.
2. What are the functions of the operating system?
3. Which of the following are operating systems and which are applications: Microsoft Excel,
Google Chrome, iTunes, Windows, Android, Angry Birds.
4. What is your favorite software application? What tasks does it help you accomplish?
5. What is a “killer” app? What was the killer app for the PC?
6. How would you categorize the software that runs on mobile devices? Break down these apps
into at least three basic categories and give an example of each.
7. Explain what an ERP system does.
8. What is open-source software? How does it differ from closed-source software? Give an
example of each.
9. What does a software license grant?
10. How did the Y2K (year 2000) problem affect the sales of ERP systems?
Exercises
1. Go online and find a case study about the implementation of an ERP system. Was it successful?
How long did it take? Does the case study tell you how much money the organization spent?
2. What ERP system does your university or place of employment use? Find out which one they
use and see how it compares to other ERP systems.
3. If you were running a small business with limited funds for information technology, would you
consider using cloud computing? Find some web-based resources that support your decision.
4. Download and install Open Office. Use it to create a document or spreadsheet. How does it
compare to Microsoft Office? Does the fact that you got it for free make it feel less valuable?
5. Go to sourceforge.net and review their most downloaded software applications. Report back on
the variety of applications you find. Then pick one that interests you and report back on what it
does, the kind of technical support offered, and the user reviews.
6. Review this article on the security risks of open-source software (http://www.zdnet.com/six-open-source-security-myths-debunked-and-eight-real-challenges-to-consider-7000014225/). Write a short analysis giving
your opinion on the different risks discussed.
7. What are three examples of programming languages? What makes each of these languages
useful to programmers?
Chapter 4: Data and Databases
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• describe the differences between data, information, and knowledge;
• define the term database and identify the steps to creating one;
• describe the role of a database management system;
• describe the characteristics of a data warehouse; and
• define data mining and describe its role in an organization.
Introduction
You have already been introduced to the first two components of information systems: hardware and
software. However, those two components by themselves do not make a computer useful. Imagine if you
turned on a computer, started the word processor, but could not save a document. Imagine if you opened a
music player but there was no music to play. Imagine opening a web browser but there were no web pages.
Without data, hardware and software are not very useful! Data is the third component of an information
system.
Data, Information, and Knowledge
Data are the raw bits and pieces of information with no context. If I told you, “15, 23, 14, 85,” you would
not have learned anything. But I would have given you data.
Data can be quantitative or qualitative. Quantitative data is numeric, the result of a measurement,
count, or some other mathematical calculation. Qualitative data is descriptive. “Ruby Red,” the color of a 2013
Ford Focus, is an example of qualitative data. A number can be qualitative too: if I tell you my favorite number is 5, that
is qualitative data because it is descriptive, not the result of a measurement or mathematical calculation.
By itself, data is not that useful. To be useful, it needs to be given
context. Returning to the example above, if I told you that “15, 23, 14, and 85”
are the numbers of students that had registered for upcoming classes, that would be
information. By adding the context – that the numbers represent the count of students
registering for specific classes – I have converted data into information.
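A tiny sketch of this idea: the bare list of numbers is data, and attaching class names as context turns it into information. The course names used here are invented purely for illustration.

    # Data: numbers with no context.
    data = [15, 23, 14, 85]

    # Information: the same numbers, now associated with upcoming classes.
    registrations = {"ENG 101": 15, "HIS 220": 23, "BIO 110": 14, "BUS 206": 85}

    for course, count in registrations.items():
        print(course, "has", count, "students registered")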
Once we have put our data into context, aggregated and analyzed it, we
can use it to make decisions for our organization. We can say that this
consumption of information produces knowledge. This knowledge can be
used to make decisions, set policies, and even spark innovation.
The final step up the information ladder is the step from knowledge
(knowing a lot about a topic) to wisdom. We can say that someone has
wisdom when they can combine their knowledge and experience to produce
a deeper understanding of a topic. It often takes many years to develop
wisdom on a particular topic, and requires patience.
Examples of Data
Almost all software programs require data to do anything useful. For example, if you are editing a document
in a word processor such as Microsoft Word, the document you are working on is the data. The word-
processing software can manipulate the data: create a new document, duplicate a document, or modify a
document. Some other examples of data are: an MP3 music file, a video file, a spreadsheet, a web page,
and an e-book. In some cases, such as with an e-book, you may only have the ability to read the data.
Databases
The goal of many information systems is to transform data into information in order to generate knowledge
that can be used for decision making. In order to do this, the system must be able to take data, put the data
into context, and provide tools for aggregation and analysis. A database is designed for just such a purpose.
A database is an organized collection of related information. It is an organized collection, because
in a database, all data is described and associated with other data. All information in a database should
be related as well; separate databases should be created to manage unrelated information. For example, a
database that contains information about students should not also hold information about company stock
prices. Databases are not always digital – a filing cabinet, for instance, might be considered a form of
database. For the purposes of this text, we will only consider digital databases.
Relational Databases
Databases can be organized in many different ways, and thus take many forms. The most popular form of
database today is the relational database. Popular examples of relational databases are Microsoft Access,
MySQL, and Oracle. A relational database is one in which data is organized into one or more tables. Each
table has a set of fields, which define the nature of the data stored in the table. A record is one instance
of a set of fields in a table. To visualize this, think of the records as the rows of the table and the fields
as the columns of the table. In the example below, we have a table of student information, with each row
representing a student and each column representing one piece of information about the student.
Rows and columns in a table
In a relational database, all the tables are related by one or more fields, so that it is possible to connect
all the tables in the database through the field(s) they have in common. For each table, one of the fields
is identified as a primary key. This key is the unique identifier for each record in the table. To help you
understand these terms further, let’s walk through the process of designing a database.
Designing a Database
Suppose a university wants to create an information system to track participation in student clubs. After
interviewing several people, the design team learns that the goal of implementing the system is to give
better insight into how the university funds clubs. This will be accomplished by tracking how many
members each club has and how active the clubs are. From this, the team decides that the system must keep
track of the clubs, their members, and their events. Using this information, the design team determines that
the following tables need to be created:
• Clubs: this will track the club name, the club president, and a short description of the club.
• Students: student name, e-mail, and year of birth.
• Memberships: this table will correlate students with clubs, allowing us to have any given student
join multiple clubs.
• Events: this table will track when the clubs meet and how many students showed up.
Now that the design team has determined which tables to create, they need to define the specific information
that each table will hold. This requires identifying the fields that will be in each table. For example, Club
Name would be one of the fields in the Clubs table. First Name and Last Name would be fields in the
Students table. Finally, since this will be a relational database, every table should have a field in common
with at least one other table (in other words: they should have a relationship with each other).
In order to properly create this relationship, a primary key must be selected for each table. This key is a
unique identifier for each record in the table. For example, in the Students table, it might be possible to use
students’ last name as a way to uniquely identify them. However, it is more than likely that some students
will share a last name (like Rodriguez, Smith, or Lee), so a different field should be selected. A student’s
e-mail address might be a good choice for a primary key, since e-mail addresses are unique. However, a
primary key cannot change, so this would mean that if students changed their e-mail address we would have
to remove them from the database and then re-insert them – not an attractive proposition. Our solution is to
create a value for each student — a user ID — that will act as a primary key. We will also do this for each
of the student clubs. This solution is quite common and is the reason you have so many user IDs!
You can see the final database design in the figure below:
Student Clubs database diagram
With this design, not only do we have a way to organize all of the information we need to meet the
requirements, but we have also successfully related all the tables together. Here’s what the database tables
might look like with some sample data. Note that the Memberships table has the sole purpose of allowing
us to relate multiple students to multiple clubs.
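One way this design could be expressed is sketched below using Python's built-in sqlite3 module. The table names follow the design above, but the exact fields are simplified assumptions for illustration. Each table gets a generated ID as its primary key, and the Memberships table holds only the two IDs that relate a student to a club.

    import sqlite3

    conn = sqlite3.connect(":memory:")   # throwaway database, for illustration only
    conn.executescript("""
    CREATE TABLE Students (
        student_id INTEGER PRIMARY KEY,
        first_name TEXT, last_name TEXT, email TEXT, birth_year INTEGER
    );
    CREATE TABLE Clubs (
        club_id INTEGER PRIMARY KEY,
        club_name TEXT, president TEXT, description TEXT
    );
    CREATE TABLE Memberships (
        student_id INTEGER REFERENCES Students(student_id),
        club_id INTEGER REFERENCES Clubs(club_id)
    );
    """)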
Normalization
When designing a database, one important concept to understand is normalization. In simple terms, to
normalize a database means to design it in a way that: 1) reduces duplication of data between tables and 2)
gives the table as much flexibility as possible.
In the Student Clubs database design, the design team worked to achieve these objectives. For
example, to track memberships, a simple solution might have been to create a Members field in the Clubs
table and then just list the names of all of the members there. However, this design would mean that if a
student joined two clubs, then his or her information would have to be entered a second time. Instead, the
designers solved this problem by using two tables: Students and Memberships.
In this design, when a student joins their first club, we first must add the student to the Students table,
where their first name, last name, e-mail address, and birth year are entered. This addition to the Students
table will generate a student ID. Now we will add a new entry to denote that the student is a member
of a specific club. This is accomplished by adding a record with the student ID and the club ID in the
Memberships table. If this student joins a second club, we do not have to duplicate the entry of the student’s
name, e-mail, and birth year; instead, we only need to make another entry in the Memberships table of the
second club’s ID and the student’s ID.
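A brief sketch of that flow, reusing the illustrative field names from the earlier example (the student shown here is made up): the student's details are stored exactly once, and each club joined adds only a small row to Memberships.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Students (StudentID INTEGER PRIMARY KEY, FirstName TEXT, LastName TEXT,
                       Email TEXT, YearOfBirth INTEGER);
CREATE TABLE Memberships (StudentID INTEGER, ClubID INTEGER,
                          PRIMARY KEY (StudentID, ClubID));
""")

# Add the student once; the database generates the student ID.
cur = conn.execute(
    "INSERT INTO Students (FirstName, LastName, Email, YearOfBirth) VALUES (?, ?, ?, ?)",
    ("Maria", "Lopez", "maria@example.edu", 1992))
student_id = cur.lastrowid

# Joining two clubs adds two small rows -- no duplicated name, e-mail, or birth year.
conn.execute("INSERT INTO Memberships VALUES (?, ?)", (student_id, 1))  # club ID 1
conn.execute("INSERT INTO Memberships VALUES (?, ?)", (student_id, 2))  # club ID 2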
The design of the Student Clubs database also makes it simple to change the design without major
modifications to the existing structure. For example, if the design team were asked to add functionality
to the system to track faculty advisors to the clubs, we could easily accomplish this by adding a Faculty
Advisors table (similar to the Students table) and then adding a new field to the Clubs table to hold the
Faculty Advisor ID.
Data Types
When defining the fields in a database table, we must give each field a data type. For example, the field
Birth Year is a year, so it will be a number, while First Name will be text. Most modern databases allow for
several different data types to be stored. Some of the more common data types are listed here:
• Text: for storing non-numeric data that is brief, generally under 256 characters. The database
designer can identify the maximum length of the text.
• Number: for storing numbers. There are usually a few different number types that can be selected,
depending on how large the largest number will be.
• Yes/No: a special form of the number data type that is (usually) one byte long, with a 0 for “No”
or “False” and a 1 for “Yes” or “True”.
• Date/Time: a special form of the number data type that can be interpreted as a date, a time, or both.
• Currency: a special form of the number data type that formats all values with a currency indicator
and two decimal places.
• Paragraph Text: this data type allows for text longer than 256 characters.
• Object: this data type allows for the storage of data that cannot be entered via keyboard, such as
an image or a music file.
There are two important reasons that we must properly define the data type of a field. First, a data type
tells the database what functions can be performed with the data. For example, if we wish to perform
mathematical functions with one of the fields, we must be sure to tell the database that the field is a number
data type. So if we have, say, a field storing birth year, we can subtract the number stored in that field from
the current year to get age.
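A minimal sketch of that idea, assuming an illustrative Students table with a numeric YearOfBirth field (the sample row is made up):

import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Students (FirstName TEXT, YearOfBirth INTEGER)")
conn.execute("INSERT INTO Students VALUES ('Sam', 1992)")   # made-up sample row

# Because YearOfBirth is stored as a number, the database can do arithmetic on it.
current_year = date.today().year
for name, age in conn.execute(
        "SELECT FirstName, ? - YearOfBirth FROM Students", (current_year,)):
    print(name, "is about", age, "years old")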
The second important reason to define data type is so that the proper amount of storage space is
allocated for our data. For example, if the First Name field is defined as a text(50) data type, this means
fifty characters are allocated for each first name we want to store. However, even if the first name is only
five characters long, fifty characters (bytes) will be allocated. While this may not seem like a big deal, if
our table ends up holding 50,000 names, we are allocating 50 * 50,000 = 2,500,000 bytes for storage of
these values. It may be prudent to reduce the size of the field so we do not waste storage space.
Sidebar: The Difference between a Database and a Spreadsheet
When databases are first introduced to students, many quickly decide that a database is pretty much the same as a spreadsheet. After all, a spreadsheet stores data in an organized fashion, using rows and columns, and looks very similar to a database table. This misunderstanding extends beyond the
classroom: spreadsheets are used as a substitute for databases in all types of situations every day, all over
the world.
To be fair, for simple uses, a spreadsheet can substitute for a database quite well. If a simple listing
of rows and columns (a single table) is all that is needed, then creating a database is probably overkill. In
our Student Clubs example, if we only needed to track a listing of clubs, the number of members, and the
contact information for the president, we could get away with a single spreadsheet. However, the need to
include a listing of events and the names of members would be problematic if tracked with a spreadsheet.
When several types of data must be mixed together, or when the relationships between these types of
data are complex, then a spreadsheet is not the best solution. A database allows data from several entities
(such as students, clubs, memberships, and events) to all be related together into one whole. While a
spreadsheet does allow you to define what kinds of values can be entered into its cells, a database provides
more intuitive and powerful ways to define the types of data that go into each field, reducing possible errors
and allowing for easier analysis.
Though not good for replacing databases, spreadsheets can be ideal tools for analyzing the data stored
in a database. A spreadsheet package can be connected to a specific table or query in a database and used
to create charts or perform analysis on that data.
Structured Query Language
Once you have a database designed and loaded with data, how will you do something useful with it? The
primary way to work with a relational database is to use Structured Query Language, SQL (pronounced
“sequel,” or simply stated as S-Q-L). Almost all applications that work with databases (such as database
management systems, discussed below) make use of SQL as a way to analyze and manipulate relational
data. As its name implies, SQL is a language that can be used to work with a relational database. From a
simple request for data to a complex update operation, SQL is a mainstay of programmers and database
administrators. To give you a taste of what SQL might look like, here are a couple of examples using our
Student Clubs database.
• The following query will retrieve a list of the first and last names of the club presidents:
SELECT "First Name", "Last Name" FROM Students, Clubs WHERE Students.ID = Clubs.President
• The following query will create a list of the number of students in each club, listing the club name
and then the number of members:
SELECT Clubs."Club Name", COUNT(Memberships."Student ID") FROM Clubs LEFT JOIN Memberships ON Clubs."Club ID" = Memberships."Club ID" GROUP BY Clubs."Club Name"
An in-depth description of how SQL works is beyond the scope of this introductory text, but these examples
should give you an idea of the power of using SQL to manipulate relational data. Many database packages,
such as Microsoft Access, allow you to visually create the query you want to construct and then generate
the SQL query for you.
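Application programs issue SQL in much the same way. Here is a minimal sketch in Python, using the built-in sqlite3 module and made-up club and membership data, that runs a query like the second example above:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Clubs (ClubID INTEGER PRIMARY KEY, ClubName TEXT);
CREATE TABLE Memberships (StudentID INTEGER, ClubID INTEGER);
INSERT INTO Clubs VALUES (1, 'Chess Club'), (2, 'Hiking Club');
INSERT INTO Memberships VALUES (7, 1), (8, 1), (9, 2);
""")

# The application sends SQL text and gets rows back to display or analyze.
query = """
SELECT Clubs.ClubName, COUNT(Memberships.StudentID)
  FROM Clubs LEFT JOIN Memberships ON Clubs.ClubID = Memberships.ClubID
 GROUP BY Clubs.ClubName
"""
for club, member_count in conn.execute(query):
    print(club, member_count)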
Other Types of Databases
The relational database model is the most used database model today. However, many other database
models exist that provide different strengths than the relational model. The hierarchical database model,
popular in the 1960s and 1970s, connected data together in a hierarchy, allowing for a parent/child
relationship between data. The document-centric model allowed for a more unstructured data storage by
placing data into “documents” that could then be manipulated.
Perhaps the most interesting new development is the concept of NoSQL (from the phrase “not only
SQL”). NoSQL arose from the need to solve the problem of large-scale databases spread over several
servers or even across the world. For a relational database to work properly, it is important that only one
person be able to manipulate a piece of data at a time, a concept known as record-locking. But with today’s
large-scale databases (think Google and Amazon), this is just not possible. A NoSQL database can work
with data in a looser way, allowing for a more unstructured environment, communicating changes to the
data over time to all the servers that are part of the database.
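As a rough illustration of the difference, a document-oriented (NoSQL-style) record keeps related data together in one flexible structure instead of spreading it across rigid tables. The field names and values below are made up:

import json

club_document = {
    "club_name": "Chess Club",
    "president": {"first_name": "Maria", "last_name": "Lopez"},
    "members": ["Maria Lopez", "Sam Jones"],
    "events": [{"date": "2013-01-10", "attendance": 12}],
}
print(json.dumps(club_document, indent=2))   # one self-contained "document"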
Database Management Systems
Screen shot of the Open Office database management system
To the computer, a database looks like one or more
files. In order for the data in the database to be
read, changed, added, or removed, a software
program must access it. Many software
applications have this ability: iTunes can read its
database to give you a listing of its songs (and play
the songs); your mobile-phone software can
interact with your list of contacts. But what about
applications to create or manage a database? What
software can you use to create a database, change a
database’s structure, or simply do analysis? That is
the purpose of a category of software applications
called database management systems (DBMS).
DBMS packages generally provide an
interface to view and change the design of the
database, create queries, and develop reports. Most
of these packages are designed to work with a
specific type of database, but generally are compatible with a wide range of databases.
For example, Apache OpenOffice.org Base (see screen shot) can be used to create, modify, and
analyze databases in open-database (ODB) format. Microsoft’s Access DBMS is used to work with
databases in its own Microsoft Access Database format. Both Access and Base have the ability to read and
write to other database formats as well.
Microsoft Access and Open Office Base are examples of personal database-management systems.
These systems are primarily used to develop and analyze single-user databases. These databases are not
meant to be shared across a network or the Internet, but are instead installed on a particular device and work
with a single user at a time.
Enterprise Databases
A database that can only be used by a single user at a time is not going to meet the needs of most
organizations. As computers have become networked and are now joined worldwide via the Internet, a class
of database has emerged that can be accessed by two, ten, or even a million people. These databases are
sometimes installed on a single computer to be accessed by a group of people at a single location. Other
times, they are installed over several servers worldwide, meant to be accessed by millions. These relational
enterprise database packages are built and supported by companies such as Oracle, Microsoft, and IBM.
The open-source MySQL is also an enterprise database.
As stated earlier, the relational database model does not scale well. The term scale here refers to
a database getting larger and larger, being distributed on a larger number of computers connected via a
network. Some companies are looking to provide large-scale database solutions by moving away from
the relational model to other, more flexible models. For example, Google now offers the App Engine
Datastore, which is based on NoSQL. Developers can use the App Engine Datastore to develop applications
that access data from anywhere in the world. Amazon.com offers several database services for enterprise
use, including Amazon RDS, which is a relational database service, and Amazon DynamoDB, a NoSQL
enterprise solution.
Big Data
A new buzzword that has been capturing the attention of businesses lately is big data. The term refers to
such massively large data sets that conventional database tools do not have the processing power to analyze
them. For example, Walmart must process over one million customer transactions every hour. Storing and
analyzing that much data is beyond the power of traditional database-management tools. Understanding the
best tools and techniques to manage and analyze these large data sets is a problem that governments and
businesses alike are trying to solve.
Sidebar: What Is Metadata?
The term metadata can be understood as “data about data.” For example, when looking at one of the values
of Year of Birth in the Students table, the data itself may be “1992”. The metadata about that value would
be the field name Year of Birth, the time it was last updated, and the data type (integer). Another example
of metadata could be for an MP3 music file, like the one shown in the image below; information such as
the length of the song, the artist, the album, the file size, and even the album cover art, are classified as
metadata. When a database is being designed, a “data dictionary” is created to hold the metadata, defining
the fields and structure of the database.
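Most database systems will report this metadata on request. As a small sketch, SQLite (used here through Python's sqlite3 module, with an illustrative table) can list each field's name, data type, and whether it is the primary key:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Students (StudentID INTEGER PRIMARY KEY, YearOfBirth INTEGER)")

# Ask the database for data about its own data: field names, types, and keys.
for cid, name, data_type, notnull, default, is_pk in conn.execute("PRAGMA table_info(Students)"):
    print(name, data_type, "(primary key)" if is_pk else "")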
Data Warehouse
As organizations have begun to utilize databases as the centerpiece of their operations, the need to fully
understand and leverage the data they are collecting has become more and more apparent. However,
directly analyzing the data that is needed for day-to-day operations is not a good idea; we do not want to tax
the operations of the company more than we need to. Further, organizations also want to analyze data in a
historical sense: How does the data we have today compare with the same set of data this time last month,
or last year? From these needs arose the concept of the data warehouse.
The concept of the data warehouse is simple: extract data from one or more of the organization’s
databases and load it into the data warehouse (which is itself another database) for storage and analysis.
However, the execution of this concept is not that simple. A data warehouse should be designed so that it
meets the following criteria:
• It uses non-operational data. This means that the data warehouse is using a copy of data from the
active databases that the company uses in its day-to-day operations, so the data warehouse must
pull data from the existing databases on a regular, scheduled basis.
• The data is time-variant. This means that whenever data is loaded into the data warehouse, it
receives a time stamp, which allows for comparisons between different time periods.
• The data is standardized. Because the data in a data warehouse usually comes from several
different sources, it is possible that the data does not use the same definitions or units. For
example, our Events table in our Student Clubs database lists the event dates using the mm/dd/
yyyy format (e.g., 01/10/2013). A table in another database might use the format yy/mm/dd (e.g.,
13/01/10) for dates. In order for the data warehouse to match up dates, a standard date format
would have to be agreed upon and all data loaded into the data warehouse would have to be
converted to use this standard format. This process is called extraction-transformation-load
(ETL).
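The transformation step for the date example above might look something like this minimal Python sketch (the two source formats are the ones mentioned in the text; everything else is assumed for illustration):

from datetime import datetime

def to_standard(date_text, source_format):
    # Transform step of ETL: convert a source-specific date string to one standard format.
    return datetime.strptime(date_text, source_format).date().isoformat()

print(to_standard("01/10/2013", "%m/%d/%Y"))  # from the Student Clubs Events table
print(to_standard("13/01/10", "%y/%m/%d"))    # from the other database's format
# Both print '2013-01-10', so the warehouse can now match the dates up.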
There are two primary schools of thought when designing a data warehouse: bottom-up and top-down. The
bottom-up approach starts by creating small data warehouses, called data marts, to solve specific business
problems. As these data marts are created, they can be combined into a larger data warehouse. The top-
down approach suggests that we should start by creating an enterprise-wide data warehouse and then, as
specific business needs are identified, create smaller data marts from the data warehouse.
Data warehouse process (top-down)
Benefits of Data Warehouses
Organizations find data warehouses quite beneficial for a number of reasons:
• The process of developing a data warehouse forces an organization to better understand the data
that it is currently collecting and, equally important, what data is not being collected.
• A data warehouse provides a centralized view of all data being collected across the enterprise and
provides a means for determining data that is inconsistent.
• Once all data is identified as consistent, an organization can generate one version of the truth.
This is important when the company wants to report consistent statistics about itself, such as
revenue or number of employees.
• By having a data warehouse, snapshots of data can be taken over time. This creates a historical
record of data, which allows for an analysis of trends.
• A data warehouse provides tools to combine data, which can provide new information and
analysis.
Data Mining
Data mining is the process of analyzing data to find previously unknown trends, patterns, and associations
in order to make decisions. Generally, data mining is accomplished through automated means against
extremely large data sets, such as a data warehouse. Some examples of data mining include:
• An analysis of sales from a large grocery chain might determine that milk is purchased more
frequently the day after it rains in cities with a population of less than 50,000.
• A bank may find that loan applicants whose bank accounts show particular deposit and
withdrawal patterns are not good credit risks.
• A baseball team may find that collegiate baseball players with specific statistics in hitting,
pitching, and fielding make for more successful major league players.
In some cases, a data-mining project is begun with a hypothetical result in mind. For example, a grocery
chain may already have some idea that buying patterns change after it rains and want to get a deeper
understanding of exactly what is happening. In other cases, there are no presuppositions and a data-mining
program is run against large data sets in order to find patterns and associations.
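At a toy scale, the grocery example might start with a count like the one below; real data mining runs similar, but far more sophisticated, calculations across millions of records. All of the data here is made up:

from collections import Counter

# Made-up transaction records: (city_population, rained_yesterday, items_bought)
transactions = [
    (40_000, True,  {"milk", "bread"}),
    (45_000, True,  {"milk"}),
    (45_000, False, {"eggs", "milk"}),
    (200_000, True, {"bread"}),
]

# In small cities, how often is milk bought after rain versus after a dry day?
counts = Counter(
    "rained" if rained else "no rain"
    for pop, rained, items in transactions
    if pop < 50_000 and "milk" in items
)
print(counts)   # Counter({'rained': 2, 'no rain': 1})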
Privacy Concerns
The increasing power of data mining has caused concerns for many, especially in the area of privacy. In
today’s digital world, it is becoming easier than ever to take data from disparate sources and combine them
to do new forms of analysis. In fact, a whole industry has sprung up around this technology: data brokers.
These firms combine publicly accessible data with information obtained from the government and other
sources to create vast warehouses of data about people and companies that they can then sell. This subject
will be covered in much more detail in chapter 12 – the chapter on the ethical concerns of information
systems.
Business Intelligence and Business Analytics
With tools such as data warehousing and data mining at their disposal, businesses are learning how to
use information to their advantage. The term business intelligence is used to describe the process that
organizations use to take data they are collecting and analyze it in the hopes of obtaining a competitive
advantage. Besides using data from their internal databases, firms often purchase information from data
brokers to get a big-picture understanding of their industries. Business analytics is the term used to describe
the use of internal company data to improve business processes and practices.
Knowledge Management
We end the chapter with a discussion on the concept of knowledge management (KM). All companies
accumulate knowledge over the course of their existence. Some of this knowledge is written down or saved,
but not in an organized fashion. Much of this knowledge is not written down; instead, it is stored inside the
heads of its employees. Knowledge management is the process of formalizing the capture, indexing, and
storing of the company’s knowledge in order to benefit from the experiences and insights that the company
has captured during its existence.
Summary
In this chapter, we learned about the role that data and databases play in the context of information systems.
Data is made up of small facts and information without context. If you give data context, then you have
information. Knowledge is gained when information is consumed and used for decision making. A database
is an organized collection of related information. Relational databases are the most widely used type of
database, where data is structured into tables and all tables must be related to each other through unique
identifiers. A database management system (DBMS) is a software application that is used to create and
manage databases, and can take the form of a personal DBMS, used by one person, or an enterprise DBMS
that can be used by multiple users. A data warehouse is a special form of database that takes data from
other databases in an enterprise and organizes it for analysis. Data mining is the process of looking for
patterns and relationships in large data sets. Many businesses use databases, data warehouses, and data-
mining techniques in order to produce business intelligence and gain a competitive advantage.
Study Questions
1. What is the difference between data, information, and knowledge?
2. Explain in your own words how the data component relates to the hardware and software
components of information systems.
3. What is the difference between quantitative data and qualitative data? In what situations could
the number 42 be considered qualitative data?
4. What are the characteristics of a relational database?
5. When would using a personal DBMS make sense?
6. What is the difference between a spreadsheet and a database? List three differences between
them.
7. Describe what the term normalization means.
8. Why is it important to define the data type of a field when designing a relational database?
9. Name a database you interact with frequently. What would some of the field names be?
10. What is metadata?
11. Name three advantages of using a data warehouse.
12. What is data mining?
Exercises
1. Review the design of the Student Clubs database earlier in this chapter. Using the list of data types given above, what data type would you assign to each of the fields in each of the tables? What lengths would you assign to the text fields?
2. Download Apache OpenOffice.org (http://www.openoffice.org/download/) and use its database tool to open the “Student Clubs.odb” file, available at http://www.saylor.org/site/wp-content/uploads/2014/02/Student-Clubs.odb. Take some time to learn how to modify the database structure, and then see if you can add the items required to support the tracking of faculty advisors, as described at the end of the Normalization section in the chapter. The Getting Started with Base documentation is available at http://wiki.openoffice.org/w/images/3/3c/0108GS33-GettingStartedWithBase.
3. Using Microsoft Access, download the database file of comprehensive baseball statistics from SeanLahman.com (http://www.seanlahman.com/baseball-archive/statistics/). (If you don’t have Microsoft Access, you can download an abridged version of the file, compatible with Apache Open Office, at http://www.saylor.org/site/wp-content/uploads/2014/02/lahman.odb.) Review the structure of the tables included in the database. Come up with three different data-mining experiments you would like to try, and explain which fields in which tables would have to be analyzed.
4. Do some original research and find two examples of data mining. Summarize each example
and then write about what the two examples have in common.
5. Conduct some independent research on the process of business intelligence. Using at least two
scholarly or practitioner sources, write a two-page paper giving examples of how business
intelligence is being used.
6. Conduct some independent research on the latest technologies being used for knowledge
management. Using at least two scholarly or practitioner sources, write a two-page paper giving
examples of software applications or new technologies being used in this field.
Chapter 5: Networking and Communication
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• understand the history and development of networking technologies;
• define the key terms associated with networking technologies;
• understand the importance of broadband technologies; and
• describe organizational networking.
Introduction
In the early days of computing, computers were seen as devices for making calculations, storing data,
and automating business processes. However, as the devices evolved, it became apparent that many of
the functions of telecommunications could be integrated into the computer. During the 1980s, many
organizations began combining their once-separate telecommunications and information-systems
departments into an information technology, or IT, department. This ability for computers to communicate
with one another and, maybe more importantly, to facilitate communication between individuals and
groups, has been an important factor in the growth of computing over the past several decades.
Computer networking really began in the 1960s with the birth of the Internet, as we’ll see below.
However, while the Internet and web were evolving, corporate networking was also taking shape in
the form of local area networks and client-server computing. In the 1990s, when the Internet came
of age, Internet technologies began to pervade all areas of the organization. Now, with the Internet a
global phenomenon, it would be unthinkable to have a computer that did not include communications
capabilities. This chapter will review the different technologies that have been put in place to enable this
communications revolution.
A Brief History of the Internet
In the Beginning: ARPANET
The story of the Internet, and networking in general, can be traced back to the late 1950s. The US was in
the depths of the Cold War with the USSR, and each nation closely watched the other to determine which
would gain a military or intelligence advantage. In 1957, the Soviets surprised the US with the launch of
Sputnik, propelling us into the space age. In response to Sputnik, the US Government created the Advanced
Research Projects Agency (ARPA), whose initial role was to ensure that the US was not surprised again.
It was from ARPA, now called DARPA (Defense Advanced Research Projects Agency), that the Internet
first sprang.
ARPA was the center of computing research in the 1960s, but there was just one problem: many of the
computers could not talk to each other. In 1968, ARPA sent out a request for proposals for a communication
technology that would allow different computers located around the country to be integrated together
into one network. Twelve companies responded to the request, and a company named Bolt, Beranek, and
Newman (BBN) won the contract. They began work right away and were able to complete the job just one
year later: in September 1969, the ARPANET was turned on. The first four nodes were at UCLA, the Stanford Research Institute (SRI), UC Santa Barbara, and the University of Utah.
The Internet and the World Wide Web
Over the next decade, the ARPANET grew and gained popularity. During this time, other networks also
came into existence. Different organizations were connected to different networks. This led to a problem:
the networks could not talk to each other. Each network used its own proprietary language, or protocol (see
sidebar for the definition of protocol), to send information back and forth. This problem was solved
by the invention of transmission control protocol/Internet protocol (TCP/IP). TCP/IP was designed to
allow networks running on different protocols to have an intermediary protocol that would allow them
to communicate. So as long as your network supported TCP/IP, you could communicate with all of the
other networks running TCP/IP. TCP/IP quickly became the standard protocol and allowed networks to
communicate with each other. It is from this breakthrough that we first got the term Internet, which simply
means “an interconnected network of networks.”
Sidebar: An Internet Vocabulary Lesson
Networking communication is full of some very technical concepts based on some simple principles. Learn
the terms below and you’ll be able to hold your own in a conversation about the Internet.
• Packet: The fundamental unit of data transmitted over the Internet. When a device intends to send
a message to another device (for example, your PC sends a request to YouTube to open a video),
it breaks the message down into smaller pieces, called packets. Each packet has the sender’s
address, the destination address, a sequence number, and a piece of the overall message to be sent.
• Hub: A simple network device that connects other devices to the network and sends packets to all
the devices connected to it.
• Bridge: A network device that connects two networks together and only allows packets through
that are needed.
• Switch: A network device that connects multiple devices together and filters packets based on
their destination within the connected devices.
• Router: A device that receives and analyzes packets and then routes them towards their
destination. In some cases, a router will send a packet to another router; in other cases, it will send
it directly to its destination.
• IP Address: Every device that communicates on the Internet, whether it be a personal computer, a
tablet, a smartphone, or anything else, is assigned a unique identifying number called an IP
(Internet Protocol) address. Historically, the IP-address standard used has been IPv4 (version 4),
which has the format of four numbers between 0 and 255 separated by a period. For example, the
domain Saylor.org has the IP address of 107.23.196.166. The IPv4 standard has a limit
of 4,294,967,296 possible addresses. As the use of the Internet has proliferated, the number of IP
addresses needed has grown to the point where the use of IPv4 addresses will be exhausted. This
has led to the new IPv6 standard, which is currently being phased in. The IPv6 standard is
formatted as eight groups of four hexadecimal digits, such as
2001:0db8:85a3:0042:1000:8a2e:0370:7334. The IPv6 standard has a limit of 3.4×10^38 possible addresses. For more detail about the new IPv6 standard, see the Wikipedia article on IPv6 (https://en.wikipedia.org/wiki/IPv6).
• Domain name: If you had to try to remember the IP address of every web server you wanted to
access, the Internet would not be nearly as easy to use. A domain name is a human-friendly name
for a device on the Internet. These names generally consist of a descriptive text followed by the
top-level domain (TLD). For example, Wikipedia’s domain name is wikipedia.org; wikipedia describes the organization and .org is the top-level domain. In this case, the .org TLD is designed for nonprofit organizations. Other well-known TLDs include .com, .net, and .gov. For a complete list and description of domain names, see the Wikipedia article on domain names (http://en.wikipedia.org/wiki/Domain_name).
• DNS: DNS stands for “domain name system,” which acts as the directory on the Internet. When a
request to access a device with a domain name is given, a DNS server is queried. It returns the IP
address of the device requested, allowing for proper routing.
• Packet-switching: When a packet is sent from one device out over the Internet, it does not follow
a straight path to its destination. Instead, it is passed from one router to another across the Internet
until it reaches its destination. In fact, sometimes two packets from the same message will take different routes! Sometimes, packets will arrive at their destination out of order. When this happens, the receiving device restores them to their proper order. For more details on packet-switching, see the interactive explanation at http://www.pbs.org/opb/nerds2.0.1/geek_glossary/packet_switching_flash.html.
• Protocol: In computer networking, a protocol is the set of rules that allow two (or more) devices
to exchange information back and forth across the network.
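To tie a few of these terms together, the short Python sketch below asks DNS to resolve a domain name to an IP address and then checks which IP version an address uses. The domain chosen is just an example, and the result will depend on your own network's DNS server:

import socket
import ipaddress

# DNS lookup: translate a human-friendly domain name into an IP address.
ip = socket.gethostbyname("wikipedia.org")   # requires a working Internet connection
print("wikipedia.org resolves to", ip)

# The ipaddress module can tell IPv4 and IPv6 addresses apart.
print(ipaddress.ip_address(ip).version)                                          # usually 4
print(ipaddress.ip_address("2001:0db8:85a3:0042:1000:8a2e:0370:7334").version)   # 6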
Worldwide Internet use over a 24-hour period (Public Domain. Courtesy of the Internet Census 2012 project.)
As we moved into the 1980s, computers were added to the Internet at an increasing rate. These computers
were primarily from government, academic, and research organizations. Much to the surprise of the
engineers, the early popularity of the Internet was driven by the use of electronic mail (see sidebar below).
Using the Internet in these early days was not easy. In order to access information on another server,
you had to know how to type in the commands necessary to access it, as well as know the name of that
device. That all changed in 1990, when Tim Berners-Lee introduced his World Wide Web project, which
provided an easy way to navigate the Internet through the use of linked text (hypertext). The World Wide
Web gained even more steam with the release of the Mosaic browser in 1993, which allowed graphics and
text to be combined together as a way to present information and navigate the Internet. The Mosaic browser
took off in popularity and was soon superseded by Netscape Navigator, the first commercial web browser,
in 1994. The Internet and the World Wide Web were now poised for growth. The chart below shows the
growth in users from the early days until now.
Growth of Internet usage, 1995–2012. Data taken from InternetWorldStats.com.
The Dot-Com Bubble
In the 1980s and early 1990s, the Internet was being managed by the National Science Foundation (NSF).
The NSF had restricted commercial ventures on the Internet, which meant that no one could buy or sell
anything online. In 1991, the NSF transferred its role to three other organizations, thus getting the US
government out of direct control over the Internet and essentially opening up commerce online.
This new commercialization of the Internet led to what is now known as the dot-com bubble. A frenzy
of investment in new dot-com companies took place in the late 1990s, running up the stock market to
new highs on a daily basis. This investment bubble was driven by the fact that investors knew that online
commerce would change everything. Unfortunately, many of these new companies had poor business
models and ended up with little to show for all of the funds that were invested in them. In 2000 and 2001,
the bubble burst and many of these new companies went out of business. Many companies also survived,
including the still-thriving Amazon (started in 1994) and eBay (1995). After the dot-com bubble burst, a
new reality became clear: in order to succeed online, e-business companies would need to develop real
business models and show that they could survive financially using this new technology.
Web 2.0
In the first few years of the World Wide Web, creating and putting up a website required a specific set
of knowledge: you had to know how to set up a server on the World Wide Web, how to get a domain
name, how to write web pages in HTML, and how to troubleshoot various technical issues as they came up.
Someone who did these jobs for a website became known as a webmaster.
As the web gained in popularity, it became more and more apparent that those who did not have the
skills to be a webmaster still wanted to create online content and have their own piece of the web. This need
was met with new technologies that provided a website framework for those who wanted to put content
online. Blogger and Wikipedia are examples of these early Web 2.0 applications, which allowed anyone
with something to say a place to go and say it, without the need for understanding HTML or web-server
technology.
Starting in the early 2000s, Web 2.0 applications began a second bubble of optimism and investment.
It seemed that everyone wanted their own blog or photo-sharing site. Here are some of the companies
that came of age during this time: MySpace (2003), Photobucket (2003), Flickr (2004), Facebook (2004),
WordPress (2005), Tumblr (2006), and Twitter (2006). The ultimate indication that Web 2.0 had taken hold
was when Time magazine named “You” its “Person of the Year” in 2006.
Sidebar: E-mail Is the “Killer” App for the Internet
When the personal computer was created, it was a great little toy for technology hobbyists and armchair
programmers. As soon as the spreadsheet was invented, however, businesses took notice, and the rest is
history. The spreadsheet was the killer app for the personal computer: people bought PCs just so they could
run spreadsheets.
The Internet was originally designed as a way for scientists and researchers to share information
and computing power among themselves. However, as soon as electronic mail was invented, it began
driving demand for the Internet. This wasn’t what the developers had in mind, but it turned out that people
connecting to people was the killer app for the Internet.
We are seeing this again today with social networks, specifically Facebook. Many who weren’t
convinced to have an online presence now feel left out without a Facebook account. The connections made
between people using Web 2.0 applications like Facebook on their personal computers or smartphones are driving growth yet again.
Sidebar: The Internet and the World Wide Web Are Not the Same Thing
Many times, the terms “Internet” and “World Wide Web,” or even just “the web,” are used interchangeably.
But really, they are not the same thing at all! The Internet is an interconnected network of networks. Many
services run across the Internet: electronic mail, voice and video, file transfers, and, yes, the World Wide
Web.
The World Wide Web is simply one piece of the Internet. It is made up of web servers that have HTML
pages that are being viewed on devices with web browsers. It is really that simple.
The Growth of Broadband
In the early days of the Internet, most access was done via a modem over an analog telephone line. A
modem (short for “modulator-demodulator”) was connected to the incoming phone line and a computer in
order to connect you to a network. Speeds were measured in bits-per-second (bps), with speeds growing
from 1200 bps to 56,000 bps over the years. Connection to the Internet via these modems is called dial-
up access. Dial-up was very inconvenient because it tied up the phone line. As the web became more and
more interactive, dial-up also hindered usage, as users wanted to transfer more and more data. As a point of reference, downloading a typical 3.5 MB song would take roughly six and a half hours at 1,200 bps and about 16 minutes at 28,800 bps.
A broadband connection is defined as one that has speeds of at least 256,000 bps, though most
connections today are much faster, measured in millions of bits per second (megabits or mbps) or even
billions (gigabits). For the home user, a broadband connection is usually accomplished via the cable
television lines or phone lines (DSL). Both cable and DSL have similar prices and speeds, though each
individual may find that one is better than the other for their specific area. Speeds for cable and DSL can
vary during different times of the day or week, depending upon how much data traffic is being used. In
more remote areas, where cable and phone companies do not provide access, home Internet connections
can be made via satellite. The average home broadband speed is anywhere between 3 mbps and 30 mbps. At 10 mbps, downloading that same 3.5 MB song would take about three seconds. For businesses that require
more bandwidth and reliability, telecommunications companies can provide other options, such as T1 and
T3 lines.
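The transfer times above come from a simple calculation: the number of bits to send divided by the connection speed in bits per second. A minimal sketch, assuming a 3.5 MB song and ignoring protocol overhead:

def download_seconds(size_megabytes, speed_bps):
    # Rough transfer time: bits to send divided by bits per second.
    bits = size_megabytes * 8_000_000   # treat 1 MB as 8,000,000 bits for simplicity
    return bits / speed_bps

song_mb = 3.5   # assumed size of a typical song
for speed in (1_200, 28_800, 10_000_000):
    print(f"{speed:>10} bps: {download_seconds(song_mb, speed):,.0f} seconds")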
Growth of broadband use (Source: Pew Internet and American Life Project
Surveys)
Broadband access is important because it impacts how the Internet is used. When a community has access
to broadband, it allows them to interact more online and increases the usage of digital tools overall. Access
to broadband is now considered a basic human right by the United Nations, as declared in their 2011
statement:
“Broadband technologies are fundamentally transforming the way we live,” the Broadband
Commission for Digital Development, set up last year by the UN Educational Scientific and Cultural
Organization (UNESCO) and the UN International Telecommunications Union (ITU), said in issuing
“The Broadband Challenge” at a leadership summit in Geneva.
“It is vital that no one be excluded from the new global knowledge societies we are building. We
believe that communication is not just a human need – it is a right.”1
1. “UN sets goal of bringing broadband to half developing world’s people by 2015.”, UN News Center website, http://www.un.org/apps/news/
story.asp?Cr=broadband&NewsID=40191#.Ut7JOmTTk1J
Wireless Networking
Today we are used to being able to access the Internet wherever we go. Our smartphones can access the
Internet; Starbucks provides wireless “hotspots” for our laptops or iPads. These wireless technologies have
made Internet access more convenient and have made devices such as tablets and laptops much more
functional. Let’s examine a few of these wireless technologies.
Wi-Fi
Wi-Fi is a technology that takes an Internet signal and converts it into radio waves. These radio waves can
be picked up within a radius of approximately 65 feet by devices with a wireless adapter. Several Wi-Fi
specifications have been developed over the years, starting with 802.11b (1999), followed by the 802.11g
specification in 2003 and 802.11n in 2009. Each new specification improved the speed and range of Wi-
Fi, allowing for more uses. One of the primary places where Wi-Fi is being used is in the home. Home
users are purchasing Wi-Fi routers, connecting them to their broadband connections, and then connecting
multiple devices via Wi-Fi.
Mobile Network
As the cellphone has evolved into the smartphone, the desire for Internet access on these devices has led
to data networks being included as part of the mobile phone network. While Internet connections were
technically available earlier, it was really with the release of the 3G networks in 2001 (2002 in the US) that
smartphones and other cellular devices could access data from the Internet. This new capability drove the
market for new and more powerful smartphones, such as the iPhone, introduced in 2007. In 2011, wireless
carriers began offering 4G data speeds, giving the cellular networks the same speeds that customers were
used to getting via their home connection.
Sidebar: Why Doesn’t My Cellphone Work When I Travel Abroad?
As mobile phone technologies have evolved, providers in different countries have chosen different
communication standards for their mobile phone networks. In the US, both of the two competing standards
exist: GSM (used by AT&T and T-Mobile) and CDMA (used by the other major carriers). Each standard
has its pros and cons, but the bottom line is that phones using one standard cannot easily switch to the other.
In the US, this is not a big deal because mobile networks exist to support both standards. But when you
travel to other countries, you will find that most of them use GSM networks, with the one big exception
being Japan, which has standardized on CDMA. It is possible for a mobile phone using one type of network
to switch to the other type of network by switching out the SIM card, which controls your access to the
mobile network. However, this will not work in all cases. If you are traveling abroad, it is always best to
consult with your mobile provider to determine the best way to access a mobile network.
Bluetooth
While Bluetooth is not generally used to connect a device to the Internet, it
is an important wireless technology that has enabled many functionalities that
are used every day. When created in 1994 by Ericsson, it was intended to
replace wired connections between devices. Today, it is the standard method for
connecting nearby devices wirelessly. Bluetooth has a range of approximately
300 feet and consumes very little power, making it an excellent choice for a
variety of purposes. Some applications of Bluetooth include: connecting a
printer to a personal computer, connecting a mobile phone and headset,
connecting a wireless keyboard and mouse to a computer, and connecting a
remote for a presentation made on a personal computer.
VoIP
A typical VoIP communication. Image courtesy of
BroadVoice.
A growing class of data being transferred over the Internet is voice data. A protocol called voice over IP,
or VoIP, enables sounds to be converted to a digital format for transmission over the Internet and then re-
created at the other end. By using many existing technologies and software, voice communication over the
Internet is now available to anyone with a browser (think Skype, Google Hangouts). Beyond this, many
companies are now offering VoIP-based telephone service for business and home use.
Organizational Networking
Scope of business networks
LAN and WAN
While the Internet was evolving and creating a way for
organizations to connect to each other and the world,
another revolution was taking place inside organizations.
The proliferation of personal computers inside
organizations led to the need to share resources such as
printers, scanners, and data. Organizations solved this
problem through the creation of local area networks
(LANs), which allowed computers to connect to each other
and to peripherals. These same networks also allowed
personal computers to hook up to legacy mainframe
computers.
An LAN is (by definition) a local network, usually
operating in the same building or on the same campus.
When an organization needed to provide a network over a
wider area (with locations in different cities or states, for
example), they would build a wide area network (WAN).
Client-Server
The personal computer originally was used as a stand-alone computing device. A program was installed
on the computer and then used to do word processing or number crunching. However, with the advent
of networking and local area networks, computers could work together to solve problems. Higher-end
computers were installed as servers, and users on the local network could run applications and share
information among departments and organizations. This is called client-server computing.
Intranet
Just as organizations set up web sites to provide global access to information about their business, they also
set up internal web pages to provide information about the organization to the employees. This internal
set of web pages is called an intranet. Web pages on the intranet are not accessible to those outside the
company; in fact, those pages would come up as “not found” if an employee tried to access them from
outside the company’s network.
Extranet
Sometimes an organization wants to be able to collaborate with its customers or suppliers while at the same
time maintaining the security of being inside its own network. In cases like this a company may want to
create an extranet, which is a part of the company’s network that can be made available securely to those
outside of the company. Extranets can be used to allow customers to log in and check the status of their
orders, or for suppliers to check their customers’ inventory levels.
Sometimes, an organization will need to allow someone who is not located physically within its
internal network to gain access. This access can be provided by a virtual private network (VPN). VPNs will
be discussed further in the chapter 6 (on information security).
Sidebar: Microsoft’s SharePoint Powers the Intranet
As organizations begin to see the power of collaboration between their employees, they often look for
solutions that will allow them to leverage their intranet to enable more collaboration. Since most companies
use Microsoft products for much of their computing, it is only natural that they have looked to Microsoft to
provide a solution. This solution is Microsoft’s SharePoint.
SharePoint provides a communication and collaboration platform that integrates seamlessly with
Microsoft’s Office suite of applications. Using SharePoint, employees can share a document and edit it
together – no more e-mailing that Word document to everyone for review. Projects and documents can be
managed collaboratively across the organization. Corporate documents are indexed and made available for
search. No more asking around for that procedures document – now you just search for it in SharePoint. For
organizations looking to add a social networking component to their intranet, Microsoft offers Yammer,
which can be used by itself or integrated into SharePoint.
Cloud Computing
We covered cloud computing in chapter 3, but it should also be mentioned here. The universal availability
of the Internet combined with increases in processing power and data-storage capacity have made cloud
computing a viable option for many companies. Using cloud computing, companies or individuals can
contract to store data on storage devices somewhere on the Internet. Applications can be “rented” as needed,
giving a company the ability to quickly deploy new applications. You can read about cloud computing in
more detail in chapter 3.
Sidebar: Metcalfe’s Law
Just as Moore’s Law describes how computing power is increasing over time, Metcalfe’s Law describes the
power of networking. Specifically, Metcalfe’s Law states that the value of a telecommunications network is
proportional to the square of the number of connected users of the system. Think about it this way: If none
of your friends were on Facebook, would you spend much time there? If no one else at your school or place
of work had e-mail, would it be very useful to you? Metcalfe’s Law tries to quantify this value.
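A quick sketch of how fast that value grows (the proportionality constant is arbitrary and chosen only for illustration):

def network_value(n_users, k=1.0):
    # Metcalfe's Law: value is proportional to the square of the number of connected users.
    return k * n_users ** 2

for n in (10, 100, 1_000):
    print(f"{n:>5} users -> relative value {network_value(n):,.0f}")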
Summary
The networking revolution has completely changed how the computer is used. Today, no one would
imagine using a computer that was not connected to one or more networks. The development of the
Internet and World Wide Web, combined with wireless access, has made information available at our
fingertips. The Web 2.0 revolution has made us all authors of web content. As networking technology
has matured, the use of Internet technologies has become a standard for every type of organization. The
use of intranets and extranets has allowed organizations to deploy functionality to employees and business
partners alike, increasing efficiencies and improving communications. Cloud computing has truly made
information available everywhere and has serious implications for the role of the IT department.
Study Questions
1. What were the first four locations hooked up to the Internet (ARPANET)?
2. What does the term packet mean?
3. Which came first, the Internet or the World Wide Web?
4. What was revolutionary about Web 2.0?
5. What was the so-called killer app for the Internet?
6. What makes a connection a broadband connection?
7. What does the term VoIP mean?
8. What is an LAN?
9. What is the difference between an intranet and an extranet?
10. What is Metcalfe’s Law?
Exercises
1. What is the IP address of your computer? How did you find out? What is the IP address of
google.com? How did you find out? Did you get IPv4 or IPv6 addresses?
2. What is the difference between the Internet and the World Wide Web? Create at least three
statements that identify the differences between the two.
3. Who are the broadband providers in your area? What are the prices and speeds offered?
4. Pretend you are planning a trip to three foreign countries in the next month. Consult your
wireless carrier to determine if your mobile phone would work properly in those countries. What
would the costs be? What alternatives do you have if it would not work?
The security triad
Chapter 6: Information Systems Security
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• identify the information security triad;
• identify and understand the high-level concepts surrounding information security tools; and
• secure yourself digitally.
Introduction
As computers and other digital devices have become essential to business and commerce, they have also
increasingly become a target for attacks. In order for a company or an individual to use a computing device
with confidence, they must first be assured that the device is not compromised in any way and that all
communications will be secure. In this chapter, we will review the fundamental concepts of information
systems security and discuss some of the measures that can be taken to mitigate security threats. We will
begin with an overview focusing on how organizations can stay secure. Several different measures that a
company can take to improve security will be discussed. We will then follow up by reviewing security
precautions that individuals can take in order to secure their personal computing environment.
The Information Security Triad:
Confidentiality, Integrity, Availability (CIA)
Confidentiality
When protecting information, we want to be able to restrict
access to those who are allowed to see it; everyone else
should be disallowed from learning anything about its
contents. This is the essence of confidentiality. For
example, federal law requires that universities restrict
access to private student information. The university must
be sure that only those who are authorized have access to
view the grade records.
Integrity
Integrity is the assurance that the information being accessed has not been altered and truly represents
what is intended. Just as a person with integrity means what he or she says and can be trusted to
consistently represent the truth, information integrity means information truly represents its intended
meaning. Information can lose its integrity through malicious intent, such as when someone who is not
authorized makes a change to intentionally misrepresent something. An example of this would be when a
hacker is hired to go into the university’s system and change a grade.
Integrity can also be lost unintentionally, such as when a computer power surge corrupts a file or
someone authorized to make a change accidentally deletes a file or enters incorrect information.
Availability
Information availability is the third part of the CIA triad. Availability means that information can be
accessed and modified by anyone authorized to do so in an appropriate timeframe. Depending on the
type of information, appropriate timeframe can mean different things. For example, a stock trader needs
information to be available immediately, while a sales person may be happy to get sales numbers for the
day in a report the next morning. Companies such as Amazon.com will require their servers to be available
twenty-four hours a day, seven days a week. Other companies may not suffer if their web servers are down
for a few minutes once in a while.
Tools for Information Security
In order to ensure the confidentiality, integrity, and availability of information, organizations can choose
from a variety of tools. Each of these tools can be utilized as part of an overall information-security policy,
which will be discussed in the next section.
Authentication
The most common way to identify someone is through their physical appearance, but how do we identify
someone sitting behind a computer screen or at the ATM? Tools for authentication are used to ensure that
the person accessing the information is, indeed, who they present themselves to be.
Authentication can be accomplished by identifying someone through one or more of three factors:
something they know, something they have, or something they are. For example, the most common
form of authentication today is the user ID and password. In this case, the authentication is done by
confirming something that the user knows (their ID and password). But this form of authentication is easy to
compromise (see sidebar) and stronger forms of authentication are sometimes needed. Identifying someone
only by something they have, such as a key or a card, can also be problematic. When that identifying token
is lost or stolen, the identity can be easily stolen. The final factor, something you are, is much harder to
compromise. This factor identifies a user through the use of a physical characteristic, such as an eye-scan
or fingerprint. Identifying someone through their physical characteristics is called biometrics.
A more secure way to authenticate a user is to do multi-factor authentication. By combining two or
more of the factors listed above, it becomes much more difficult for someone to misrepresent themselves.
An example of this would be the use of an RSA SecurID token. The RSA device is something you have,
and will generate a new access code every sixty seconds. To log in to an information resource using the
RSA device, you combine something you know, a four-digit PIN, with the code generated by the device.
The only way to properly authenticate is by both knowing the code and having the RSA device.
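To make the idea concrete, the sketch below checks a generic time-based one-time password (TOTP, the open standard used by many phone authenticator apps) together with a PIN: something you have plus something you know. It is a simplified illustration, not RSA's proprietary SecurID algorithm, and the secret, PIN, and 30-second interval are placeholder values.

    import base64, hashlib, hmac, struct, time

    def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
        """Time-based one-time password (RFC 6238): HMAC the current 30-second
        time step with a shared secret and reduce it to a short numeric code."""
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // interval
        digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = digest[-1] & 0x0F
        code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
        return str(code).zfill(digits)

    def authenticate(pin_entered: str, pin_on_file: str, code_entered: str, secret_b32: str) -> bool:
        """Two factors: the PIN (something you know) and the current token code (something you have)."""
        return pin_entered == pin_on_file and hmac.compare_digest(code_entered, totp(secret_b32))

    # Illustrative call; "JBSWY3DPEHPK3PXP" stands in for the secret issued with the token or app.
    # authenticate("1234", "1234", code_shown_on_token, "JBSWY3DPEHPK3PXP")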
Access Control
Once a user has been authenticated, the next step is to ensure that they can only access the information
resources that are appropriate. This is done through the use of access control. Access control determines
which users are authorized to read, modify, add, and/or delete information. Several different access control
models exist. Here we will discuss two: the access control list (ACL) and role-based access control
(RBAC).
For each information resource that an organization wishes to manage, a list of users who have the
ability to take specific actions can be created. This is an access control list, or ACL. For each user, specific
capabilities are assigned, such as read, write, delete, or add. Only users with those capabilities are allowed
to perform those functions. If a user is not on the list, they have no ability to even know that the information
resource exists.
ACLs are simple to understand and maintain. However, they have several drawbacks. The primary
drawback is that each information resource is managed separately, so if a security administrator wanted to
add a user to, or remove a user from, a large set of information resources, it would be quite difficult. And as the number of users and resources increases, ACLs become harder to maintain. This has led to an improved method of
access control, called role-based access control, or RBAC. With RBAC, instead of giving specific users
access rights to an information resource, users are assigned to roles and then those roles are assigned the
access. This allows the administrators to manage users and roles separately, simplifying administration and,
by extension, improving security.
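The difference between the two models can be pictured with two small data structures. Everything in the sketch below is invented for illustration, but it shows why RBAC simplifies administration: granting a new registrar access means adding one entry to the user-to-role table rather than editing every resource's list.

    # Access control list: every resource keeps its own list of users and capabilities.
    acl = {
        "grade_db": {"alice": {"read", "write"}, "bob": {"read"}},
        "payroll":  {"carol": {"read", "write", "delete"}},
    }

    def acl_allows(user: str, resource: str, action: str) -> bool:
        return action in acl.get(resource, {}).get(user, set())

    # Role-based access control: users map to roles, and roles map to permissions.
    user_roles = {"alice": {"registrar"}, "bob": {"instructor"}, "carol": {"payroll_admin"}}
    role_perms = {
        "registrar":     {("grade_db", "read"), ("grade_db", "write")},
        "instructor":    {("grade_db", "read")},
        "payroll_admin": {("payroll", "read"), ("payroll", "write"), ("payroll", "delete")},
    }

    def rbac_allows(user: str, resource: str, action: str) -> bool:
        return any((resource, action) in role_perms.get(role, set())
                   for role in user_roles.get(user, set()))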
Figure: Comparison of ACL and RBAC
Encryption
Many times, an organization needs to transmit information over the Internet or transfer it on external media
such as a CD or flash drive. In these cases, even with proper authentication and access control, it is possible
for an unauthorized person to get access to the data. Encryption is a process of encoding data upon its
transmission or storage so that only authorized individuals can read it. This encoding is accomplished by
a computer program, which encodes the plain text that needs to be transmitted; then the recipient receives
the cipher text and decodes it (decryption). In order for this to work, the sender and receiver need to agree
on the method of encoding so that both parties can communicate properly. Both parties share the encryption
key, enabling them to encode and decode each other’s messages. This is called symmetric key encryption.
This type of encryption is problematic because the key is available in two different places.
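As an illustration, the third-party Python package cryptography (not part of the standard library) offers symmetric encryption through its Fernet recipe. The message below is invented; the point to notice is that the very same key must exist at both ends, which is the weakness just described.

    from cryptography.fernet import Fernet   # pip install cryptography

    key = Fernet.generate_key()               # this one key must be shared by sender and receiver
    f = Fernet(key)
    cipher_text = f.encrypt(b"Final grades for BUS 206")   # what actually travels or is stored
    plain_text = f.decrypt(cipher_text)                    # anyone holding the key can recover the message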
An alternative to symmetric key encryption is public key encryption. In public key encryption, two
keys are used: a public key and a private key. To send an encrypted message, you obtain the public key,
encode the message, and send it. The recipient then uses the private key to decode it. The public key can be
given to anyone who wishes to send the recipient a message. Each user simply needs one private key and
one public key in order to secure messages. The private key is necessary in order to decrypt something sent
with the public key.
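The same package can be used to sketch public key encryption with RSA. The key size and padding below are conventional choices rather than requirements, and the message is again invented; what matters is that encryption needs only the freely shared public key, while decryption needs the private key.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()      # published to anyone who wants to send us a message

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    cipher_text = public_key.encrypt(b"Final grades for BUS 206", oaep)
    plain_text = private_key.decrypt(cipher_text, oaep)    # only the private-key holder can do this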
Figure: Public key encryption
Sidebar: Password Security
So why is using just a simple user ID/password not considered a secure method of authentication? It turns
out that this single-factor authentication is extremely easy to compromise. Good password policies must
be put in place in order to ensure that passwords cannot be compromised. Below are some of the more
common policies that organizations should put in place.
• Require complex passwords. One reason passwords are compromised is that they can be easily guessed. A recent study found that the top three passwords people used in 2012 were password, 123456 and 12345678.1 A password should not be simple, or a word that can be found in a dictionary. One of the first things a hacker will do is try to crack a password by testing every term in the dictionary! Instead, a good password policy is one that requires a minimum of eight characters, with at least one upper-case letter, one special character, and one number (a minimal check for such a policy is sketched after this list).
1. “Born to be breached” by Sean Gallagher, Nov 3, 2012. Ars Technica. Retrieved from http://arstechnica.com/information-technology/2012/11/born-to-be-breached-the-worst-passwords-are-still-the-most-common/ on May 15, 2013.
• Change passwords regularly. It is essential that users change their passwords on a regular basis. Users should change
their passwords every sixty to ninety days, ensuring that any passwords that might have been stolen or guessed will not
be able to be used against the company.
• Train employees not to give away passwords. One of the primary methods that is used to steal passwords is to simply
figure them out by asking the users or administrators. Pretexting occurs when an attacker calls a helpdesk or security
administrator and pretends to be a particular authorized user having trouble logging in. Then, by providing some
personal information about the authorized user, the attacker convinces the security person to reset the password and tell
him what it is. Another way that employees may be tricked into giving away passwords is through e-mail phishing.
Phishing occurs when a user receives an e-mail that looks as if it is from a trusted source, such as their bank, or their
employer. In the e-mail, the user is asked to click a link and log in to a website that mimics the genuine website and
enter their ID and password, which are then captured by the attacker.
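Checks like the complexity rule in the first policy above are easy to automate at the moment a password is chosen. The sketch below mirrors only the eight-character/upper-case/number/special-character rule; a production system would typically also reject dictionary words and passwords known from past breaches.

    import string

    def meets_policy(password: str) -> bool:
        """True if the password has at least eight characters and contains an
        upper-case letter, a digit, and a special character."""
        return (len(password) >= 8
                and any(c.isupper() for c in password)
                and any(c.isdigit() for c in password)
                and any(c in string.punctuation for c in password))

    print(meets_policy("password"))      # False -- one of 2012's most common passwords
    print(meets_policy("Bus206!grade"))  # True  (still should not be reused across accounts)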
Backups
Another essential tool for information security is a comprehensive backup plan for the entire organization.
Not only should the data on the corporate servers be backed up, but individual computers used throughout
the organization should also be backed up. A good backup plan should consist of several components.
• A full understanding of the organizational information resources. What information does the organization actually have?
Where is it stored? Some data may be stored on the organization’s servers, other data on users’ hard drives, some in the
cloud, and some on third-party sites. An organization should make a full inventory of all of the information that needs to
be backed up and determine the best way to back it up.
• Regular backups of all data. The frequency of backups should be based on how important the data is to the company,
combined with the ability of the company to replace any data that is lost. Critical data should be backed up daily, while
less critical data could be backed up weekly.
• Offsite storage of backup data sets. If all of the backup data is being stored in the same facility as the original copies of
the data, then a single event, such as an earthquake, fire, or tornado, would take out both the original data and the
backup! It is essential that part of the backup plan is to store the data in an offsite location.
• Test of data restoration. On a regular basis, the backups should be put to the test by having some
of the data restored. This will ensure that the process is working and will give the organization
confidence in the backup plan.
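The plan components above lend themselves to automation. The sketch below makes a dated full copy of a folder to an offsite target and then restores it to a scratch area as a test; the paths are hypothetical placeholders, and a real plan would add scheduling, incremental copies, and alerting when a backup fails.

    import datetime, pathlib, shutil

    SOURCE  = pathlib.Path("/srv/company-data")       # hypothetical data to protect
    OFFSITE = pathlib.Path("/mnt/offsite-backups")    # hypothetical offsite mount or replicated share

    def nightly_backup() -> pathlib.Path:
        """Copy everything under SOURCE into a dated folder at the offsite location."""
        target = OFFSITE / f"company-data-{datetime.date.today().isoformat()}"
        shutil.copytree(SOURCE, target)
        return target

    def test_restore(backup: pathlib.Path, scratch: pathlib.Path = pathlib.Path("/tmp/restore-test")) -> None:
        """Prove the backup is usable by restoring it somewhere harmless."""
        shutil.copytree(backup, scratch)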
Besides these considerations, organizations should also examine their operations to determine what effect
downtime would have on their business. If their information technology were to be unavailable for any
sustained period of time, how would it impact the business?
Additional concepts related to backup include the following:
• Uninterruptible Power Supply (UPS). A UPS is a device that provides battery backup to critical components of the system,
allowing them to stay online longer and/or allowing the IT staff to shut them down using proper procedures in order to
prevent the data loss that might occur from a power failure.
• Alternate, or “hot” sites. Some organizations choose to have an alternate site where an exact replica of their critical data
is always kept up to date. When the primary site goes down, the alternate site is immediately brought online so that little
or no downtime is experienced.
As information has become a strategic asset, a whole industry has sprung up around the technologies
necessary for implementing a proper backup strategy. A company can contract with a service provider to
back up all of their data or they can purchase large amounts of online storage space and do it themselves.
Technologies such as storage area networks and archival systems are now used by most large businesses.
Firewalls
Another method that an organization should use to
increase security on its network is a firewall. A firewall
can exist as hardware or software (or both). A hardware
firewall is a device that is connected to the network and
filters the packets based on a set of rules. A software
firewall runs on the operating system and intercepts
packets as they arrive at a computer. A firewall protects all
company servers and computers by stopping packets from
outside the organization’s network that do not meet a strict
set of criteria. A firewall may also be configured to restrict
the flow of packets leaving the organization. This may be
done to eliminate the possibility of employees watching
YouTube videos or using Facebook from a company
computer.
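Conceptually, a firewall walks each packet through an ordered rule set and applies the first rule that matches. The toy filter below captures only that idea; real firewalls match on addresses, protocols, and connection state, and are configured in their own rule languages (for example, an appliance's console or iptables) rather than in Python.

    # Hypothetical rule set: allow inbound web and mail traffic, deny everything else.
    RULES = [
        {"action": "allow", "direction": "inbound", "dest_port": 443},   # HTTPS to the web server
        {"action": "allow", "direction": "inbound", "dest_port": 25},    # incoming mail
        {"action": "deny",  "direction": "inbound", "dest_port": None},  # default deny (None matches any port)
    ]

    def filter_packet(packet: dict) -> str:
        for rule in RULES:
            if (rule["direction"] == packet["direction"]
                    and rule["dest_port"] in (None, packet["dest_port"])):
                return rule["action"]
        return "deny"   # nothing matched: fail closed

    print(filter_packet({"direction": "inbound", "dest_port": 443}))  # allow
    print(filter_packet({"direction": "inbound", "dest_port": 23}))   # deny -- telnet is blocked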
Some organizations may choose to implement
multiple firewalls as part of their network security
configuration, creating one or more sections of their
network that are partially secured. This segment of the
network is referred to as a DMZ, borrowing the term demilitarized zone from the military, and it is where an
organization may place resources that need broader access but still need to be secured.
Intrusion Detection Systems
Another device that can be placed on the network for security purposes is an intrusion detection system, or
IDS. An IDS does not add any additional security; instead, it provides the functionality to identify if the
network is being attacked. An IDS can be configured to watch for specific types of activities and then alert
security personnel if that activity occurs. An IDS also can log various types of traffic on the network for
analysis later. An IDS is an essential part of any good security setup.
Figure: Network configuration with firewalls, IDS, and a DMZ
Sidebar: Virtual Private Networks
Using firewalls and other security technologies, organizations can effectively protect many of their
information resources by making them invisible to the outside world. But what if an employee working
from home requires access to some of these resources? What if a consultant is hired who needs to do work
on the internal corporate network from a remote location? In these cases, a virtual private network (VPN)
is called for.
A VPN allows a user who is outside of a corporate network to take a detour around the firewall and
access the internal network from the outside. Through a combination of software and security measures, this
lets an organization allow limited access to its networks while at the same time ensuring overall security.
Physical Security
An organization can implement the best authentication scheme in the world, develop the best access control,
and install firewalls and intrusion prevention, but its security cannot be complete without implementation of
physical security. Physical security is the protection of the actual hardware and networking components that
store and transmit information resources. To implement physical security, an organization must identify all
of the vulnerable resources and take measures to ensure that these resources cannot be physically tampered
with or stolen. These measures include the following.
• Locked doors: It may seem obvious, but all the security in the world is useless if an intruder can simply walk in and
physically remove a computing device. High-value information assets should be secured in a location with limited
access.
• Physical intrusion detection: High-value information assets should be monitored through the use of security cameras and
other means to detect unauthorized access to the physical locations where they exist.
• Secured equipment: Devices should be locked down to prevent them from being stolen. One employee’s hard drive
could contain all of your customer information, so it is essential that it be secured.
• Environmental monitoring: An organization’s servers and other high-value equipment should always be kept in a room
that is monitored for temperature, humidity, and airflow. The risk of a server failure rises when these factors go out of a
specified range.
• Employee training: One of the most common ways thieves steal corporate information is to steal
employee laptops while employees are traveling. Employees should be trained to secure their
equipment whenever they are away from the office.
Security Policies
Besides the technical controls listed above, organizations also need to implement security policies as a
form of administrative control. In fact, these policies should really be a starting point in developing an
overall security plan. A good information-security policy lays out the guidelines for employee use of the
information resources of the company and provides the company recourse in the case that an employee
violates a policy.
According to the SANS Institute, a good policy is “a formal, brief, and high-level statement or plan
that embraces an organization’s general beliefs, goals, objectives, and acceptable procedures for a specified
subject area.” Policies require compliance; failure to comply with a policy will result in disciplinary action.
A policy does not lay out the specific technical details, instead it focuses on the desired results. A security
policy should be based on the guiding principles of confidentiality, integrity, and availability.2
A good example of a security policy that many will be familiar with is a web use policy. A web use
policy lays out the responsibilities of company employees as they use company resources to access the
Internet. A good example of a web use policy is included in Harvard University’s “Computer Rules and
Responsibilities” policy, which can be found at http://www.fas-it.fas.harvard.edu/services/student/policies/rules_and_responsibilities.
A security policy should also address any governmental or industry regulations that apply to the
organization. For example, if the organization is a university, it must be aware of the Family Educational
Rights and Privacy Act (FERPA), which restricts who has access to student information. Health care
organizations are obligated to follow several regulations, such as the Health Insurance Portability and
Accountability Act (HIPAA).
A good resource for learning more about security policies is the SANS Institute’s Information Security Policy Page at http://www.sans.org/security-resources/policies/.
Sidebar: Mobile Security
As the use of mobile devices such as smartphones and tablets proliferates, organizations must be ready
to address the unique security concerns that these devices bring. One of the first questions an
organization must consider is whether to allow mobile devices in the workplace at all. Many employees
already have these devices, so the question becomes: Should we allow employees to bring their own devices
and use them as part of their employment activities? Or should we provide the devices to our employees?
Creating a BYOD (“Bring Your Own Device”) policy allows employees to integrate themselves more
fully into their job and can bring higher employee satisfaction and productivity. In many cases, it may be
virtually impossible to prevent employees from having their own smartphones or iPads in the workplace. If
the organization provides the devices to its employees, it gains more control over use of the devices, but it
also exposes itself to the possibility of an administrative (and costly) mess.
Mobile devices can pose many unique security challenges to an organization. Probably one of the
biggest concerns is theft of intellectual property. For an employee with malicious intent, it would be a
very simple process to connect a mobile device either to a computer via the USB port, or wirelessly to the
corporate network, and download confidential data. It would also be easy to secretly take a high-quality
picture using a built-in camera.
When an employee does have permission to access and save company data on his or her device, a
different security threat emerges: that device now becomes a target for thieves. Theft of mobile devices (in
this case, including laptops) is one of the primary methods that data thieves use.
So what can be done to secure mobile devices? It will start with a good policy regarding their use.
According to a 2013 SANS study, organizations should consider developing a mobile device policy that
addresses the following issues: use of the camera, use of voice recording, application purchases, encryption
at rest, Wi-Fi autoconnect settings, Bluetooth settings, VPN use, password settings, lost or stolen device reporting, and backup.3
2. SANS Institute. “A Short Primer for Developing Security Policies.” Accessed from http://www.sans.org/security-resources/policies/Policy_Primer on May 31, 2013.
Figure: Poster from Stop. Think. Connect. (Copyright: Stop. Think. Connect., http://stopthinkconnect.org/resources)
Besides policies, there are several different tools that an organization can use to mitigate some of these
risks. For example, if a device is stolen or lost, geolocation software can help the organization find it. In
some cases, it may even make sense to install remote data-removal software, which will remove data from
a device if it becomes a security risk.
Usability
When looking to secure information resources, organizations must balance the need for security with users’
need to effectively access and use these resources. If a system’s security measures make it difficult to use,
then users will find ways around the security, which may make the system more vulnerable than it would
have been without the security measures! Take, for example, password policies. If the organization requires
an extremely long password with several special characters, an employee may resort to writing it down and
putting it in a drawer since it will be impossible to memorize.
Personal Information Security
We will end this chapter with a discussion of what measures each of
us, as individual users, can take to secure our computing
technologies. There is no way to have 100% security, but there are
several simple steps we, as individuals, can take to make ourselves
more secure.
• Keep your software up to date. Whenever a software vendor determines
that a security flaw has been found in their software, they will release an
update to the software that you can download to fix the problem. Turn on
automatic updating on your computer to automate this process.
• Install antivirus software and keep it up to date. There are many good
antivirus software packages on the market today, including free ones.
• Be smart about your connections. You should be aware of your
surroundings. When connecting to a Wi-Fi network in a public place, be
aware that you could be at risk of being spied on by others sharing that
network. It is advisable not to access your financial or personal data while
attached to a Wi-Fi hotspot. You should also be aware that connecting
USB flash drives to your device could also put you at risk. Do not attach
an unfamiliar flash drive to your device unless you can scan it first with your security software.
• Back up your data. Just as organizations need to back up their data, individuals need to as well. And the same rules
apply: do it regularly and keep a copy of it in another location. One simple solution for this is to set up an account with
an online backup service, such as Mozy or Carbonite, to automate your backups.
3. Taken from SANS Institute’s Mobile Device Checklist. You can review the full checklist at www.sans.org/score/checklists/mobile-device-checklist.xls.
• Secure your accounts with two-factor authentication. Most e-mail and social media providers now have a two-factor
authentication option. The way this works is simple: when you log in to your account from an unfamiliar computer for
the first time, it sends you a text message with a code that you must enter to confirm that you are really you. This means
that no one else can log in to your accounts without knowing your password and having your mobile phone with them.
• Make your passwords long, strong, and unique. For your personal passwords, you should follow the same rules that are
recommended for organizations. Your passwords should be long (eight or more characters) and contain at least two of
the following: upper-case letters, numbers, and special characters. You also should use different passwords for different
accounts, so that if someone steals your password for one account, they still are locked out of your other accounts.
• Be suspicious of strange links and attachments. When you receive an e-mail, tweet, or Facebook post, be suspicious of
any links or attachments included there. Do not click on the link directly if you are at all suspicious. Instead, if you want
to access the website, find it yourself and navigate to it directly.
You can find more about these steps and many other ways to be secure with your computing by going to Stop. Think. Connect. (http://stopthinkconnect.org/). This website is part of a campaign that was launched in October of 2010 by the
STOP. THINK. CONNECT. Messaging Convention in partnership with the U.S. government, including the
White House.
Summary
As computing and networking resources have become more and more an integral part of business, they have
also become a target of criminals. Organizations must be vigilant with the way they protect their resources.
The same holds true for us personally: as digital devices become more and more intertwined with our lives,
it becomes crucial for us to understand how to protect ourselves.
Study Questions
1. Briefly define each of the three members of the information security triad.
2. What does the term authentication mean?
3. What is multi-factor authentication?
4. What is role-based access control?
5. What is the purpose of encryption?
6. What are two good examples of a complex password?
7. What is pretexting?
8. What are the components of a good backup plan?
9. What is a firewall?
10. What does the term physical security mean?
Exercises
1. Describe one method of multi-factor authentication that you have experienced and discuss the
pros and cons of using multi-factor authentication.
2. What are some of the latest advances in encryption technologies? Conduct some independent
research on encryption using scholarly or practitioner resources, then write a two- to three-page
paper that describes at least two new advances in encryption technology.
3. What is the password policy at your place of employment or study? Do you have to change
passwords every so often? What are the minimum requirements for a password?
4. When was the last time you backed up your data? What method did you use? In one to two
pages, describe a method for backing up your data. Ask your instructor if you can get extra credit
for backing up your data.
5. Find the information security policy at your place of employment or study. Is it a good policy?
Does it meet the standards outlined in the chapter?
6. How are you doing on keeping your own information secure? Review the steps listed in the
chapter and comment on how well you are doing.
Part 2: Information Systems for Strategic Advantage
Chapter 7: Does IT Matter?
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• define the productivity paradox and explain the current thinking on this topic;
• evaluate Carr’s argument in “Does IT Matter?”;
• describe the components of competitive advantage; and
• describe information systems that can provide businesses with competitive advantage.
Introduction
For over fifty years, computing technology has been a part of business. Organizations have spent trillions
of dollars on information technologies. But has all this investment in IT made a difference? Have we seen
increases in productivity? Are companies that invest in IT more competitive? In this chapter, we will look
at the value IT can bring to an organization and try to answer these questions. We will begin by highlighting
two important works from the past two decades.
The Productivity Paradox
In 1991, Erik Brynjolfsson wrote an article, published in the Communications of the ACM, entitled “The
Productivity Paradox of Information Technology: Review and Assessment.” By reviewing studies about the
impact of IT investment on productivity, Brynjolfsson was able to conclude that the addition of information
technology to business had not improved productivity at all – the “productivity paradox.”1 He does not draw any specific conclusions from this finding, and provides the following analysis in the article:
Although it is too early to conclude that IT’s productivity contribution has been subpar, a paradox
remains in our inability to unequivocally document any contribution after so much effort. The
various explanations that have been proposed can be grouped into four categories:
1) Mismeasurement of outputs and inputs,
2) Lags due to learning and adjustment,
3) Redistribution and dissipation of profits,
4) Mismanagement of information and technology.
1. Brynjolfsson, Erik. “The Productivity Paradox of Information Technology: Review and Assessment.” Copyright © 1993, 1994 Erik Brynjolfsson, All Rights Reserved. Center for Coordination Science, MIT Sloan School of Management, Cambridge, Massachusetts. Previous version: December 1991; this version: September 1992. Published in Communications of the ACM, December 1993; and Japan Management Research, June 1994 (in Japanese).
In 1998, Brynjolfsson and Lorin Hitt published a follow-up paper entitled “Beyond the Productivity Paradox.”2 In this paper, the
authors utilized new data that had been collected and found that IT did, indeed, provide a positive result for businesses. Further,
they found that sometimes the true advantages in using technology were not directly relatable to higher productivity, but to “softer”
measures, such as the impact on organizational structure. They also found that the impact of information technology can vary widely
between companies.
IT Doesn’t Matter
Just as a consensus was forming about the value of IT, the Internet stock market bubble burst. Just two
years later, in 2003, Nicholas Carr, a former executive editor of the Harvard Business Review, published his article “IT Doesn’t Matter” in that same journal. In the article Carr asserts that as information technology has become more ubiquitous,
it has also become less of a differentiator. In other words: because information technology is so readily
available and the software used so easily copied, businesses cannot hope to implement these tools to
provide any sort of competitive advantage. Carr goes on to suggest that since IT is essentially a commodity,
it should be managed like one: low cost, low risk. Using the analogy of electricity, Carr describes how a
firm should never be the first to try a new technology, thereby letting others take the risks. IT management
should see themselves as a utility within the company and work to keep costs down. For IT, providing the
best service with minimal downtime is the goal.
As you can imagine, this article caused quite an uproar, especially from IT companies. Many articles
were written in defense of IT; many others in support of Carr. Carr released a book based on the article in
2004, entitled Does IT Matter? A video of Carr being interviewed about his book on CNET can be found at http://news.cnet.com/1606-2_3-30103.html.
Probably the best thing to come out of the article and subsequent book was that it opened up discussion
on the place of IT in a business strategy, and exactly what role IT could play in competitive advantage. It is
that question that we want to address in the rest of this chapter.
Competitive Advantage
What does it mean when a company has a competitive advantage? What are the factors that play into
it? While there are entire courses and many different opinions on this topic, let’s go with one of the
most accepted definitions, developed by Michael Porter in his book Competitive Advantage: Creating and
Sustaining Superior Performance. A company is said to have a competitive advantage over its rivals when
it is able to sustain profits that exceed the average for the industry. According to Porter, there are two primary
methods for obtaining competitive advantage: cost advantage and differentiation advantage. So the question
becomes: how can information technology be a factor in one or both of these methods? In the sections
below we will explore this question using two of Porter’s analysis tools: the value chain and the five forces
model. We will also use Porter’s analysis in his 2001 article “Strategy and the Internet,” which examines
the impact of the Internet on business strategy and competitive advantage, to shed further light on the role
of information technology in competitive advantage.
2. Brynjolfsson, Erik and Lorin Hitt. “Beyond the Productivity Paradox”, Communications of the ACM, August 1998, Vol. 41(8): pp. 49–55. Copyright © 1998 by
Association for Computing Machinery, Inc. (ACM).
Figure: Porter’s value chain
The Value Chain
In his book, Porter describes exactly how a company can
create value (and therefore profit). Value is built through
the value chain: a series of activities undertaken by the
company to produce a product or service. Each step in the
value chain contributes to the overall value of a product or
service. While the value chain may not be a perfect model
for every type of company, it does provide a way to
analyze just how a company is producing value. The value
chain is made up of two sets of activities: primary activities and support activities. We will briefly examine
these activities and discuss how information technology can play a role in creating value by contributing to
cost advantage or differentiation advantage, or both.
The primary activities are the functions that directly impact the creation of a product or service. The
goal of the primary activities is to add more value than they cost. The primary activities are:
• Inbound logistics: These are the functions performed to bring in raw materials and
other needed inputs. Information technology can be used here to make these processes more
efficient, such as with supply-chain management systems, which allow the suppliers to manage
their own inventory.
• Operations: Any part of a business that is involved in converting the raw materials into the final
products or services is part of operations. From manufacturing to business process management
(covered in chapter 8), information technology can be used to provide more efficient processes
and increase innovation through flows of information.
• Outbound logistics: These are the functions required to get the product out to the customer. As
with inbound logistics, IT can be used here to improve processes, such as allowing for real-time
inventory checks. IT can also be a delivery mechanism itself.
• Sales/Marketing: The functions that will entice buyers to purchase the products are part of sales
and marketing. Information technology is used in almost all aspects of this activity. From online
advertising to online surveys, IT can be used to innovate product design and reach customers like
never before. The company website can be a sales channel itself.
• Service: The functions a business performs after the product has been purchased to maintain and
enhance the product’s value are part of the service activity. Service can be enhanced via
technology as well, including support services through websites and knowledge bases.
The support activities are the functions in an organization that support, and cut across, all of the primary
activities. The support activities are:
• Firm infrastructure: This includes organizational functions such as finance, accounting, and
quality control, all of which depend on information technology; the use of ERP systems (to be
covered in chapter 8) is a good example of the impact that IT can have on these functions.
Figure: Porter’s five forces
• Human resource management: This activity consists of recruiting, hiring, and other services
needed to attract and retain employees. Using the Internet, HR departments can increase their
reach when looking for candidates. There is also the possibility of allowing employees to use
technology for a more flexible work environment.
• Technology development: Here we have the technological advances and innovations that support
the primary activities. These advances are then integrated across the firm or within one of the
primary activities to add value. Information technology would fall specifically under this activity.
• Procurement: The activities involved in acquiring the raw materials used in the creation of
products and services are called procurement. Business-to-business e-commerce can be used to
improve the acquisition of materials.
This analysis of the value chain provides some insight into how information technology can lead to
competitive advantage. Let’s now look at another tool that Porter developed – the “five forces” model.
Porter’s Five Forces
Porter developed the “five forces” model as a framework
for industry analysis. This model can be used to help
understand just how competitive an industry is and to
analyze its strengths and weaknesses. The model consists
of five elements, each of which plays a role in determining
the average profitability of an industry. In 2001, Porter
wrote an article entitled “Strategy and the Internet,” in
which he takes this model and looks at how the Internet
impacts the profitability of an industry. Below is a quick
summary of each of the five forces and the impact of the
Internet.
• Threat of substitute products or services: How easily can
a product or service be replaced with something else? The more types of products or services there are that can meet a
particular need, the less profitability there will be in an industry. For example, the advent of the mobile phone has
replaced the need for pagers. The Internet has made people more aware of substitute products, driving down industry
profits in those industries being substituted.
• Bargaining power of suppliers: When a company has several suppliers to choose from, it can demand a lower price.
When a sole supplier exists, then the company is at the mercy of the supplier. For example, if only one company makes
the controller chip for a car engine, that company can control the price, at least to some extent. The Internet has given
companies access to more suppliers, driving down prices. On the other hand, suppliers now also have the ability to sell
directly to customers.
• Bargaining power of customers: A company that is the sole provider of a unique product has the ability to control
pricing. But the Internet has given customers many more options to choose from.
• Barriers to entry: The easier it is to enter an industry, the tougher it will be to make a profit in that industry. The
Internet has an overall effect of making it easier to enter industries. It is also very easy to copy technology, so new
innovations will not last that long.
• Rivalry among existing competitors: The more competitors there are in an industry, the bigger a factor price becomes.
The advent of the Internet has increased competition by widening the geographic market and lowering the costs of doing
business. For example, a manufacturer in Southern California may now have to compete against a manufacturer in the
South, where wages are lower.
Porter’s five forces are used to analyze an industry to determine the average profitability of a company
within that industry. Adding in Porter’s analysis of the Internet, we can see that the Internet (and by
extension, information technology in general) has the effect of lowering overall profitability.3 While the
Internet has certainly produced many companies that are big winners, the overall winners have been the
consumers, who have been given an ever-increasing market of products and services and lower prices.
Using Information Systems for Competitive Advantage
Now that we have an understanding of competitive advantage and some of the ways that IT may be used
to help organizations gain it, we will turn our attention to some specific examples. A strategic information
system is an information system that is designed specifically to implement an organizational strategy meant
to provide a competitive advantage. These sorts of systems began popping up in the 1980s, as noted in a
paper by Charles Wiseman entitled “Creating Competitive Weapons From Information Systems.”4
Specifically, a strategic information system is one that attempts to do one or more of the following:
• deliver a product or a service at a lower cost;
• deliver a product or service that is differentiated;
• help an organization focus on a specific market segment;
• enable innovation.
Following are some examples of information systems that fall into this category.
Business Process Management Systems
In their book, IT Doesn’t Matter – Business Processes Do, Howard Smith and Peter Fingar argue that it is
the integration of information systems with business processes that leads to competitive advantage. They
then go on to state that Carr’s article is dangerous because it gave CEOs and IT managers the green light
to start cutting their technology budgets, putting their companies in peril. They go on to state that true
competitive advantage can be found with information systems that support business processes. In chapter 8
we will focus on the use of business processes for competitive advantage.
3. Porter, Michael. “Strategy and the Internet,” Harvard Business Review, Vol. 79, No. 3, March 2001. http://hbswk.hbs.edu/item/2165.html
4. Wiseman, C., & MacMillan, I. C. (1984). Creating competitive weapons from information systems. Journal of Business Strategy, 5(2), 42.
Electronic Data Interchange
One of the ways that information systems have participated in competitive advantage is through integrating
the supply chain electronically. This is primarily done through a process called electronic data interchange,
or EDI. EDI can be thought of as the computer-to-computer exchange of business documents in a standard
electronic format between business partners. By integrating suppliers and distributors via EDI, a company
can vastly reduce the resources required to manage the relevant information. Instead of ordering supplies manually, an employee simply enters the order into the computer, and it is transmitted to the supplier automatically the next time the order process runs.
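To make the computer-to-computer exchange concrete, here is a minimal sketch of a purchase order being sent from a buyer's system to a supplier's system. The endpoint URL, field names, and JSON structure are illustrative stand-ins; a real EDI exchange would use an agreed standard format such as ANSI X12 or EDIFACT and the partners' actual connection details.

    import json
    from urllib import request

    purchase_order = {
        "po_number": "PO-1001",                      # all values here are invented
        "buyer": "Acme Manufacturing",
        "supplier": "Widget Supply Co.",
        "lines": [{"sku": "WID-42", "qty": 500, "unit_price": 1.25}],
    }

    def send_order(order: dict, endpoint: str = "https://edi.example.com/orders") -> int:
        """Transmit the order document to the supplier's system and return its response code."""
        data = json.dumps(order).encode("utf-8")
        req = request.Request(endpoint, data=data, headers={"Content-Type": "application/json"})
        with request.urlopen(req) as resp:           # the supplier's system acknowledges receipt
            return resp.status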
Figure: EDI example
Collaborative Systems
As organizations began to implement networking technologies, information systems emerged that allowed
employees to begin collaborating in different ways. These systems allowed users to brainstorm ideas
together without the necessity of physical, face-to-face meetings. Utilizing tools such as discussion boards,
document sharing, and video, these systems made it possible for ideas to be shared in new ways and the
thought processes behind these ideas to be documented.
Broadly speaking, any software that allows multiple users to interact on a document or topic could be
considered collaborative. Electronic mail, a shared Word document, social networks, and discussion boards
would fall into this broad definition. However, many software tools have been created that are designed
specifically for collaborative purposes. These tools offer a broad spectrum of collaborative functions. Here
is just a short list of some collaborative tools available for businesses today:
• Google Drive. Google Drive offers a suite of office applications (such as a word processor,
spreadsheet, drawing, presentation) that can be shared between individuals. Multiple users can
edit the documents at the same time and threaded comments are available.
• Microsoft SharePoint. SharePoint integrates with Microsoft Office and allows for collaboration
using tools most office workers are familiar with. SharePoint was covered in more detail in
chapter 5.
• Cisco WebEx. WebEx is a business communications platform that combines video and audio
communications and allows participants to interact with each other’s computer desktops. WebEx
also provides a shared whiteboard and the capability for text-based chat during sessions, along with many other features. Mobile editions of WebEx allow for full participation
using smartphones and tablets.
• Atlassian Confluence. Confluence provides an all-in-one project-management application that
allows users to collaborate on documents and communicate progress. The mobile edition of
Confluence allows the project members to stay connected throughout the project.
• IBM Lotus Notes/Domino. One of the first true “groupware” collaboration tools, Lotus Notes
(and its web-based cousin, Domino) provides a full suite of collaboration software, including
integrated e-mail.
Decision Support Systems
A decision support system (DSS) is an information system built to help an organization make a specific
decision or set of decisions. DSSs can exist at different levels of decision-making with the organization,
from the CEO to the first-level managers. These systems are designed to take inputs regarding a known (or
partially-known) decision-making process and provide the information necessary to make a decision. DSSs
generally assist a management-level person in the decision-making process, though some can be designed
to automate decision-making.
An organization has a wide variety of decisions to make, ranging from highly structured decisions
to unstructured decisions. A structured decision is usually one that is made quite often, and one in which
the decision is based directly on the inputs. With structured decisions, once you know the necessary
information you also know the decision that needs to be made. For example, inventory reorder levels can be
structured decisions: once our inventory of widgets gets below a specific threshold, automatically reorder
ten more. Structured decisions are good candidates for automation, but we don’t necessarily build decision-
support systems for them.
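A structured decision like the reorder rule just mentioned is trivial to automate, which is exactly why it rarely needs a full DSS. A minimal sketch, with an invented threshold and quantity:

    REORDER_POINT = 20   # hypothetical threshold
    REORDER_QTY   = 10

    def inventory_decision(widgets_on_hand: int) -> str:
        """A structured decision: the input fully determines the outcome."""
        if widgets_on_hand < REORDER_POINT:
            return f"reorder {REORDER_QTY} widgets"
        return "no action"

    print(inventory_decision(12))   # reorder 10 widgets
    print(inventory_decision(35))   # no action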
An unstructured decision involves a lot of unknowns. Many times, unstructured decisions are
decisions being made for the first time. An information system can support these types of decisions
by providing the decision-maker(s) with information-gathering tools and collaborative capabilities. An
example of an unstructured decision might be dealing with a labor issue or setting policy for a new
technology.
Decision support systems work best when the decision-maker(s) are making semi-structured decisions.
A semi-structured decision is one in which most of the factors needed for making the decision are known
but human experience and other outside factors may still play a role. A good example of a semi-structured
decision would be diagnosing a medical condition (see sidebar).
As with collaborative systems, DSSs can come in many different formats. A nicely designed
spreadsheet that allows for input of specific variables and then calculates required outputs could be
considered a DSS. Another DSS might be one that assists in determining which products a company should
develop. Input into the system could include market research on the product, competitor information, and
product development costs. The system would then analyze these inputs based on the specific rules and
concepts programmed into it. Finally, the system would report its results, with recommendations and/or key
indicators to be used in making a decision. A DSS can be looked at as a tool for competitive advantage in
that it can give an organization a mechanism to make wise decisions about products and innovations.
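In the spirit of the spreadsheet-style DSS just described, the toy example below scores hypothetical product candidates from a few weighted inputs. The products, criteria, and weights are all invented; a real DSS would present such a ranking as input to a human decision-maker, alongside the market research and cost data behind it, rather than as a final answer.

    # Positive weights favor a candidate; negative weights penalize it.
    WEIGHTS = {"market_demand": 0.5, "competitor_strength": -0.2, "dev_cost_millions": -0.3}

    candidates = {
        "Widget Pro":  {"market_demand": 8, "competitor_strength": 6, "dev_cost_millions": 4},
        "Widget Lite": {"market_demand": 5, "competitor_strength": 2, "dev_cost_millions": 1},
    }

    def score(inputs: dict) -> float:
        return sum(WEIGHTS[name] * value for name, value in inputs.items())

    for name, inputs in sorted(candidates.items(), key=lambda item: score(item[1]), reverse=True):
        print(f"{name}: {score(inputs):.1f}")    # ranked recommendation, highest score first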
Sidebar: Isabel – A Health Care DSS
As discussed in the text, DSSs are best applied to semi-structured decisions, in which most of the needed
inputs are known but human experience and environmental factors also play a role. A good example that is
in use today is Isabel, a health care DSS. The creators of Isabel explain how it works:
Isabel uses the information routinely captured during your workup, whether free text or structured
data, and instantaneously provides a diagnosis checklist for review. The checklist contains a list
of possible diagnoses with critical “Don’t Miss Diagnoses” flagged. When integrated into your
EMR system Isabel can provide “one click” seamless diagnosis support with no additional data
entry. 5
Investing in IT for Competitive Advantage
In 2008, Brynjolfsson and McAfee published a study in the Harvard Business Review on the role of IT in
competitive advantage, entitled “Investing in the IT That Makes a Competitive Difference.” Their study
confirmed that IT can play a role in competitive advantage, if deployed wisely. In their study, they draw
three conclusions6:
• First, the data show that IT has sharpened differences among companies instead of
reducing them. This reflects the fact that while companies have always varied widely in
their ability to select, adopt, and exploit innovations, technology has accelerated and
amplified these differences.
• Second, good management matters: Highly qualified vendors, consultants, and IT
departments might be necessary for the successful implementation of enterprise
technologies themselves, but the real value comes from the process innovations that can
now be delivered on those platforms. Fostering the right innovations and propagating
them widely are both executive responsibilities – ones that can’t be delegated.
• Finally, the competitive shakeup brought on by IT is not nearly complete, even in the IT-
intensive US economy. We expect to see these altered competitive dynamics in other
countries, as well, as their IT investments grow.
5. Taken from http://www.isabelhealthcare.com/home/ourmission. Accessed July 15, 2013.
6. McAfee, Andrew, and Erik Brynjolfsson. “Investing in the IT That Makes a Competitive Difference.” Harvard Business Review, July-August 2008.
Information systems can be used for competitive advantage, but they must be used strategically.
Organizations must understand how they want to differentiate themselves and then use all the elements of
information systems (hardware, software, data, people, and process) to accomplish that differentiation.
Summary
Information systems are integrated into all components of business today, but can they bring competitive
advantage? Over the years, there have been many answers to this question. Early research could not draw
any connections between IT and profitability, but later research has shown that the impact can be positive.
IT is not a panacea; just purchasing and installing the latest technology will not, by itself, make a company
more successful. Instead, the combination of the right technologies and good management, together, will
give a company the best chance of a positive result.
Study Questions
1. What is the productivity paradox?
2. Summarize Carr’s argument in “Does IT Matter.”
3. How is the 2008 study by Brynjolfsson and McAfee different from previous studies? How is it the same?
4. What does it mean for a business to have a competitive advantage?
5. What are the primary activities and support activities of the value chain?
6. What has been the overall impact of the Internet on industry profitability? Who has been the
true winner?
7. How does EDI work?
8. Give an example of a semi-structured decision and explain what inputs would be necessary to
provide assistance in making the decision.
9. What does a collaborative information system do?
10. How can IT play a role in competitive advantage, according to the 2008 article by
Brynjolfsson and McAfee?
Exercises
1. Do some independent research on Nicholas Carr (the author of “IT Doesn’t Matter”) and
explain his current position on the ability of IT to provide competitive advantage.
2. Review the WebEx website. What features of WebEx would contribute to good collaboration?
What makes WebEx a better collaboration tool than something like Skype or Google Hangouts?
3. Think of a semi-structured decision that you make in your daily life and build your own DSS
using a spreadsheet that would help you make that decision.
Chapter 8: Business Processes
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• define the term business process;
• identify the different systems needed to support business processes in an organization;
• explain the value of an enterprise resource planning (ERP) system;
• explain how business process management and business process reengineering work; and
• understand how information technology combined with business processes can bring an
organization competitive advantage.
Introduction
The fourth component of information systems is process. But what is a process and how does it tie into
information systems? And in what ways do processes have a role in business? This chapter will look to
answer those questions and also describe how business processes can be used for strategic advantage.
What Is a Business Process?
We have all heard the term process before, but what exactly does it mean? A process is a series of tasks
that are completed in order to accomplish a goal. A business process, therefore, is a process that is focused
on achieving a goal for a business. If you have worked in a business setting, you have participated in a
business process. Anything from a simple process for making a sandwich at Subway to building a space
shuttle utilizes one or more business processes.
Processes are something that businesses go through every day in order to accomplish their mission.
The better their processes, the more effective the business. Some businesses see their processes as a strategy
for achieving competitive advantage. A process that achieves its goal in a unique way can set a company
apart. A process that eliminates costs can allow a company to lower its prices (or retain more profit).
Documenting a Process
Every day, each of us will conduct many processes without even thinking about them: getting ready for
work, using an ATM, reading our e-mail, etc. But as processes grow more complex, they need to be
documented. For businesses, it is essential to do this, because it allows them to ensure control over how
activities are undertaken in their organization. It also allows for standardization: McDonald’s has the same
process for building a Big Mac in all of its restaurants.
The simplest way to document a process is to create a list. The list shows each step in the
process; each step can be checked off upon completion. For example, a simple process, such as how to
create an account on eBay, might look like this:
1. Go to ebay.com.
2. Click on “register.”
3. Enter your contact information in the “Tell us about you” box.
4. Choose your user ID and password.
5. Agree to User Agreement and Privacy Policy by clicking on “Submit.”
For processes that are not so straightforward, documenting the process as a checklist may not be sufficient.
For example, here is the process for determining if an article for a term needs to be added to Wikipedia:
1. Search Wikipedia to determine if the term already exists.
2. If the term is found, then an article already exists, so you must think of another term. Return to step 1.
3. If the term is not found, then look to see if there is a related term.
4. If there is a related term, then create a redirect.
5. If there is not a related term, then create a new article.
This procedure is relatively simple – in fact, it has the same number of steps as the previous example – but
because it has some decision points, it is more difficult to follow as a simple checklist. In these cases, it may
make more sense to use a diagram to document the process (a short code sketch of the same decision logic
follows the diagram):
Process diagram for determining if a new term should be added to Wikipedia. (Public Domain)
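The same decision logic can also be expressed directly in code. The short Python sketch below is purely illustrative: the decide_action function and the search helper it relies on are made up for this example (they are not part of any real Wikipedia tool), but the branching mirrors steps 2 through 5 above and shows why a flat checklist struggles once decisions enter the picture.

def decide_action(term, search):
    # 'search' is a stand-in helper that reports whether an exact or
    # related article already exists for the term: "exact", "related", or "none".
    result = search(term)
    if result == "exact":
        return "choose another term"       # step 2: an article already exists
    if result == "related":
        return "create a redirect"         # step 4: point the term at the related article
    return "create a new article"          # step 5: nothing similar exists

# Example usage with a stand-in search function.
fake_search = lambda term: {"Django": "exact", "Djangos": "related"}.get(term, "none")
print(decide_action("Django", fake_search))            # choose another term
print(decide_action("Djangos", fake_search))           # create a redirect
print(decide_action("Quokka habitats", fake_search))   # create a new article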
Managing Business Process Documentation
As organizations begin to document their processes, it becomes an administrative task to keep track of
them. As processes change and improve, it is important to know which version of each process is the most
recent. It is also important to manage the documentation so that it can be easily updated! The requirement to manage process
documentation has been one of the driving forces behind the creation of the document management system.
A document management system stores and tracks documents and supports the following functions:
• Versions and timestamps. The document management system will keep multiple versions of documents. The most recent
version of a document is easy to identify and will be served up by default.
• Approvals and workflows. When a process needs to be changed, the system will manage both access to the documents
for editing and the routing of the document for approvals.
• Communication. When a process changes, those who implement the process need to be made aware of the changes. A
document management system will notify the appropriate people when a change to a document is approved.
Of course, document management systems are not only used for managing business process documentation. Many other types of
documents are managed in these systems, such as legal documents or design documents.
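To make these functions a little more concrete, here is a minimal sketch in Python of how version tracking might work. The ManagedDocument class and its fields are hypothetical and drastically simplified; a real document management system would also handle storage, permissions, approval routing, and notifications.

from datetime import datetime

class ManagedDocument:
    # A toy versioned document: every save adds a timestamped version.
    def __init__(self, name):
        self.name = name
        self.versions = []  # each entry: {"when": ..., "text": ..., "approved": ...}

    def save(self, text):
        # Every save is kept, so older versions remain available.
        self.versions.append({"when": datetime.now(), "text": text, "approved": False})

    def approve_latest(self):
        # A stand-in for an approval workflow step.
        self.versions[-1]["approved"] = True

    def current(self):
        # Serve the most recent approved version by default.
        approved = [v for v in self.versions if v["approved"]]
        return approved[-1]["text"] if approved else None

doc = ManagedDocument("Returns process")
doc.save("Accept all returns within 30 days.")
doc.approve_latest()
doc.save("Accept all returns within 14 days.")  # new draft, not yet approved
print(doc.current())  # still the 30-day version until the draft is approved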
ERP Systems
An enterprise resource planning (ERP) system is a software application with a centralized database that can
be used to run an entire company. Let’s take a closer look at the definition of each of these components:
• A software application: The system is a software application,
which means that it has been developed with specific logic
and rules behind it. It has to be installed and configured to
work specifically for an individual organization.
• With a centralized database: All data in an ERP system is
stored in a single, central database. This centralization is key
to the success of an ERP – data entered in one part of the
company can be immediately available to other parts of the
company.
• That can be used to run an entire company: An ERP can be
used to manage an entire organization’s operations. If they so
wish, companies can purchase modules for an ERP that
represent different functions within the organization, such as
finance, manufacturing, and sales. Some companies choose
to purchase many modules; others choose only a subset of the
modules.
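The value of the single, central database is easier to see with a small sketch. The following Python example, using the built-in sqlite3 module, imagines two hypothetical modules, sales and finance, reading and writing the same table; it is only an illustration of the principle, not of how a real ERP product is built.

import sqlite3

# One shared database stands in for the ERP's central data store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")

def sales_enter_order(customer, amount):
    # The "sales module" records a new order.
    conn.execute("INSERT INTO orders (customer, amount) VALUES (?, ?)", (customer, amount))
    conn.commit()

def finance_total_revenue():
    # The "finance module" immediately sees the same data.
    return conn.execute("SELECT COALESCE(SUM(amount), 0) FROM orders").fetchone()[0]

sales_enter_order("Acme Corp", 1200.00)
sales_enter_order("Globex", 850.50)
print(finance_total_revenue())  # 2050.5; no data transfer between modules is needed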
An ERP system not only centralizes an organization’s data, but the processes it enforces are the processes
the organization adopts. When an ERP vendor designs a module, it has to implement the rules for the
associated business processes. A selling point of an ERP system is that it has best practices built right into
it. In other words, when an organization implements an ERP, it also gets improved best practices as part of
the deal!
For many organizations, the implementation of an ERP system is an excellent opportunity to improve
their business practices and upgrade their software at the same time. But for others, an ERP brings them a
challenge: Is the process embedded in the ERP really better than the process they are currently utilizing?
And if they implement this ERP, and it happens to be the same one that all of their competitors have, will
they simply become more like them, making it much more difficult to differentiate themselves?
This has been one of the criticisms of ERP systems: that they commoditize
business processes, driving all businesses to use the same processes and thereby
lose their uniqueness. The good news is that ERP systems also have the capability
to be configured with custom processes. For organizations that want to continue
using their own processes or even design new ones, ERP systems offer ways to
support this through the use of customizations.
But there is a drawback to customizing an ERP system: organizations have to
maintain the changes themselves. Whenever an update to the ERP system comes
out, any organization that has created a custom process will be required to add that
change to their ERP. This will require someone to maintain a listing of these
changes and will also require retesting the system every time an upgrade is made. Organizations will have to
wrestle with this decision: When should they go ahead and accept the best-practice processes built into the
ERP system and when should they spend the resources to develop their own processes? It makes the most
sense to only customize those processes that are critical to the competitive advantage of the company.
Some of the best-known ERP vendors are SAP, Microsoft, and Oracle.
Business Process Management
Organizations that are serious about improving their business processes will also create structures to
manage those processes. Business process management (BPM) can be thought of as an intentional effort
to plan, document, implement, and distribute an organization’s business processes with the support of
information technology.
BPM is more than just automating some simple steps. While automation can make a business more
efficient, it cannot be used to provide a competitive advantage. BPM, on the other hand, can be an integral
part of creating that advantage.
Not all of an organization’s processes should be managed this way. An organization should look
for processes that are essential to the functioning of the business and those that may be used to bring
a competitive advantage. The best processes to look at are those that include employees from multiple
departments, those that require decision-making that cannot be easily automated, and processes that change
based on circumstances.
To make this clear, let’s take a look at an example.
Suppose a large clothing retailer is looking to gain a competitive advantage through superior customer
service. As part of this, they create a task force to develop a state-of-the-art returns policy that allows
customers to return any article of clothing, no questions asked. The organization also decides that, in
order to protect the competitive advantage that this returns policy will bring, they will develop their own
customization to their ERP system to implement this returns policy. As they prepare to roll out the system,
they invest in training for all of their customer-service employees, showing them how to use the new system
and specifically how to process returns. Once the updated returns process is implemented, the organization
will be able to measure several key indicators about returns that will allow them to adjust the policy as
needed. For example, if they find that many women are returning their high-end dresses after wearing them
once, they could implement a change to the process that limits – to, say, fourteen days – the time after the
original purchase that an item can be returned. As changes to the returns policy are made, the changes are
rolled out via internal communications, and updates to the returns processing on the system are made. In our
example, the system would no longer allow a dress to be returned after fourteen days without an approved
reason.
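As an illustration only, the heart of such a customization might look like the short Python sketch below. The fourteen-day window and the approved-reason override come straight from the example above; the function name, its parameters, and the idea of a simple boolean check are hypothetical simplifications of what would really be a much larger piece of ERP configuration.

from datetime import date, timedelta

RETURN_WINDOW = timedelta(days=14)

def can_return(purchase_date, today, has_approved_reason=False):
    # Items may be returned within fourteen days of purchase;
    # after that, an approved reason is required.
    if today - purchase_date <= RETURN_WINDOW:
        return True
    return has_approved_reason

print(can_return(date(2014, 3, 1), date(2014, 3, 10)))   # True: within fourteen days
print(can_return(date(2014, 3, 1), date(2014, 4, 1)))    # False: outside the window, no approved reason
print(can_return(date(2014, 3, 1), date(2014, 4, 1), has_approved_reason=True))  # True: override applies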
If done properly, business process management will provide several key benefits to an organization,
which can be used to contribute to competitive advantage. These benefits include:
• Empowering employees. When a business process is designed correctly and supported with information technology,
employees will be able to implement it on their own authority. In our returns-policy example, an employee would be
able to accept returns made within fourteen days of purchase or use the system to determine which returns would be
allowed after fourteen days.
• Built-in reporting. By building measurement into the programming, the organization can keep up to date on key metrics
regarding their processes. In our example, these can be used to improve the returns process and also, ideally, to reduce
returns.
• Enforcing best practices. As an organization implements processes supported by information systems, it can work to
implement the best practices for that class of business process. In our example, the organization may want to require that
all customers returning a product without a receipt show a legal ID. This requirement can be built into the system so that
the return will not be processed unless a valid ID number is entered.
• Enforcing consistency. By creating a process and enforcing it with information technology, it is possible to create a
consistency across the entire organization. In our example, all stores in the retail chain can enforce the same returns
policy. And if the returns policy changes, the change can be instantly enforced across the entire chain.
Business Process Reengineering
As organizations look to manage their processes to gain a competitive advantage, they also need to
understand that their existing ways of doing things may not be the most effective or efficient. A process
developed in the 1950s is not going to be better just because it is now supported by technology.
In 1990, Michael Hammer published an article in the Harvard Business Review entitled
“Reengineering Work: Don’t Automate, Obliterate.” This article put forward the thought that simply
automating a bad process does not make it better. Instead, companies should “blow up” their existing
processes and develop new processes that take advantage of the new technologies and concepts. He states
in the introduction to the article:1
Many of our job designs, work flows, control mechanisms, and organizational structures came of
age in a different competitive environment and before the advent of the computer. They are geared
towards greater efficiency and control. Yet the watchwords of the new decade are innovation and
speed, service, and quality.
It is time to stop paving the cow paths. Instead of embedding outdated processes in silicon
and software, we should obliterate them and start over. We should “reengineer” our businesses:
use the power of modern information technology to radically redesign our business processes in
order to achieve dramatic improvements in their performance.
1. Hammer, Michael. “Reengineering work: don’t automate, obliterate.” Harvard Business Review 68.4 (1990): 104–112.
Business process reengineering is not just taking an existing process and automating it. BPR is fully
understanding the goals of a process and then dramatically redesigning it from the ground up to achieve
dramatic improvements in productivity and quality. But this is easier said than done. Most of us think in
terms of how to do small, local improvements to a process; complete redesign requires thinking on a larger
scale. Hammer provides some guidelines for how to go about doing business process reengineering:
• Organize around outcomes, not tasks. This simply means to design the process so that, if possible, one person performs
all the steps. Instead of repeating one step in the process over and over, the person stays involved in the process from
start to finish.
• Have those who use the outcomes of the process perform the process. Using information technology, many simple tasks
are now automated, so we can empower the person who needs the outcome of the process to perform it. The example
Hammer gives here is purchasing: instead of having every department in the company use a purchasing department to
order supplies, have the supplies ordered directly by those who need the supplies using an information system.
• Subsume information-processing work into the real work that produces the information. When one part of the company
creates information (like sales information, or payment information), it should be processed by that same department.
There is no need for one part of the company to process information created in another part of the company.
• Treat geographically dispersed resources as though they were centralized. With the communications technologies in
place today, it becomes easier than ever to not worry about physical location. A multinational organization does not need
separate support departments (such as IT, purchasing, etc.) for each location anymore.
• Link parallel activities instead of integrating their results. Departments that work in parallel should be sharing data and
communicating with each other during their activities instead of waiting until each group is done and then comparing
notes.
• Put the decision points where the work is performed, and build controls into the process. The people who do the work
should have decision-making authority and the process itself should have built-in controls using information
technology.
• Capture information once, at the source. Requiring information to be entered more than once causes delays and errors.
With information technology, an organization can capture it once and then make it available whenever needed.
These principles may seem like common sense today, but in 1990 they took the business world by storm.
Hammer gives example after example of how organizations improved their business processes by many
orders of magnitude without adding any new employees, simply by changing how they did things (see
sidebar).
Unfortunately, business process reengineering got a bad name in many organizations. This was
because it was used as an excuse for cost cutting that really had nothing to do with BPR. For example, many
companies simply used it as an excuse for laying off part of their workforce. Today, however, many of the
principles of BPR have been integrated into businesses and are considered part of good business-process
management.
Sidebar: Reengineering the College Bookstore
The process of purchasing the correct textbooks in a timely manner for college classes has always been
problematic. And now, with online bookstores such as Amazon competing directly with the college
bookstore for students’ purchases, the college bookstore is under pressure to justify its existence.
But college bookstores have one big advantage over their competitors: they have access to students’
data. In other words, once a student has registered for classes, the bookstore knows exactly what books
that student will need for the upcoming term. To leverage this advantage and to make use of new
technologies, the bookstore wants to implement a new process that will make purchasing books through the
bookstore advantageous to students. Though they may not be able to compete on price, they can provide
other advantages, such as reducing the time it takes to find the books and the ability to guarantee that the
book is the correct one for the class. In order to do this, the bookstore will need to undertake a process
redesign.
The goal of the process redesign is simple: capture a higher percentage of students as customers of the
bookstore. After diagramming the existing process and meeting with student focus groups, the bookstore
comes up with a new process. In the new process, the bookstore utilizes information technology to reduce
the amount of work the students need to do in order to get their books. In this new process, the bookstore
sends the students an e-mail with a list of all the books required for their upcoming classes. By clicking a
link in this e-mail, the students can log into the bookstore, confirm their books, and purchase the books.
The bookstore will then deliver the books to the students.
College bookstore process redesign
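One way to picture the information-technology side of this redesign is a small routine that turns registration data into the personalized e-mail. The Python sketch below is hypothetical (the data structures and the booklist_email function are invented for this example), but it shows how the student's registration data drives the new process.

# Hypothetical data: which courses each student registered for,
# and which books each course requires.
registrations = {"jsmith@example.edu": ["BUS206", "MATH101"]}
booklists = {
    "BUS206": ["Information Systems for Business and Beyond"],
    "MATH101": ["College Algebra, 3rd ed."],
}

def booklist_email(student_email):
    # Build the text of the e-mail listing every required book.
    books = []
    for course in registrations.get(student_email, []):
        books.extend(booklists.get(course, []))
    lines = ["Books required for your upcoming classes:"]
    lines += [" - " + title for title in books]
    lines.append("Click the link in this e-mail to confirm and purchase them.")
    return "\n".join(lines)

print(booklist_email("jsmith@example.edu"))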
ISO Certification
Many organizations now claim that they are using best practices when it comes to
business processes. In order to set themselves apart and prove to their customers
(and potential customers) that they are indeed doing this, these organizations are
seeking out an ISO 9000 certification. ISO refers to the International Organization for Standardization
(website: http://www.iso.org/iso/home.html). This body defines quality standards that
organizations can implement to show that they are, indeed, managing business
processes in an effective way. The ISO 9000 certification is focused on quality
management.
In order to receive ISO certification, an organization must be audited and found to meet specific
criteria. In its most simple form, the auditors perform the following review:
• Tell me what you do (describe the business process).
• Show me where it says that (reference the process documentation).
• Prove that this is what happened (exhibit evidence in documented records).
Over the years, this certification has evolved and many branches of the certification now exist. ISO
certification is one way to separate an organization from others. You can find out more about the ISO 9000
standard at http://www.iso.org/iso/home/standards/management-standards/iso_9000.htm.
Summary
The advent of information technologies has had a huge impact on how organizations design, implement,
and support business processes. From document management systems to ERP systems, information systems
are tied into organizational processes. Using business process management, organizations can empower
employees and leverage their processes for competitive advantage. Using business process reengineering,
organizations can vastly improve their effectiveness and the quality of their products and services.
Integrating information technology with business processes is one way that information systems can bring
an organization lasting competitive advantage.
Study Questions
1. What does the term business process mean?
2. What are three examples of business process from a job you have had or an organization you
have observed?
3. What is the value in documenting a business process?
4. What is an ERP system? How does an ERP system enforce best practices for an organization?
5. What is one of the criticisms of ERP systems?
6. What is business process reengineering? How is it different from incrementally improving a
process?
7. Why did BPR get a bad name?
8. List the guidelines for redesigning a business process.
9. What is business process management? What role does it play in allowing a company to
differentiate itself?
10. What does ISO certification signify?
Exercises
1. Think of a business process that you have had to perform in the past. How would you document
this process? Would a diagram make more sense than a checklist? Document the process both as a
checklist and as a diagram.
2. Review the return policies at your favorite retailer, then answer this question: What information
systems do you think would need to be in place to support their return policy?
3. If you were implementing an ERP system, in which cases would you be more inclined to
modify the ERP to match your business processes? What are the drawbacks of doing this?
4. Which ERP is the best? Do some original research and compare three leading ERP systems to
each other. Write a two- to three-page paper that compares their features.
Chapter 9: The People in Information Systems
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• describe each of the different roles that people play in the design, development, and use of
information systems;
• understand the different career paths available to those who work with information systems;
• explain the importance of where the information-systems function is placed in an organization;
and
• describe the different types of users of information systems.
Introduction
In the opening chapters of this text, we focused on the technology behind information systems: hardware,
software, data, and networking. In the last chapter, we discussed business processes and the key role they
can play in the success of a business. In this chapter, we will be discussing the last component of an
information system: people.
People are involved in information systems in just about every way you can think of: people imagine
information systems, people develop information systems, people support information systems, and,
perhaps most importantly, people use information systems.
The Creators of Information Systems
The first group of people we are going to look at play a role in designing, developing, and building
information systems. These people are generally very technical and have a background in programming and
mathematics. Just about everyone who works in the creation of information systems has a minimum of a
bachelor’s degree in computer science or information systems, though that is not necessarily a requirement.
We will be looking at the process of creating information systems in more detail in chapter 10.
Systems Analyst
The role of the systems analyst is to straddle the divide between identifying business needs and imagining a
new or redesigned computer-based system to fulfill those needs. This individual will work with a person,
team, or department with business requirements and identify the specific details of a system that needs to
be built. Generally, this will require the analyst to have a good understanding of the business itself and the
business processes involved, as well as the ability to document them well. The analyst will identify the different
stakeholders in the system and work to involve the appropriate individuals in the process.
Once the requirements are determined, the analyst will begin the process of translating these
requirements into an information-systems design. A good analyst will understand what different
technological solutions will work and provide several different alternatives to the requester, based on the
company’s budgetary constraints, technology constraints, and culture. Once the solution is selected, the
analyst will create a detailed document describing the new system. This new document will require that the
analyst understand how to speak in the technical language of systems developers.
A systems analyst generally is not the one who does the actual development of the information system.
The design document created by the systems analyst provides the detail needed to create the system and
is handed off to a programmer (or team of programmers) to do the actual creation of the system. In some
cases, however, a systems analyst may go ahead and create the system that he or she designed. This person
is sometimes referred to as a programmer-analyst.
In other cases, the system may be assembled from off-the-shelf components by a person called a
systems integrator. This is a specific type of systems analyst who understands how to get different software
packages to work with each other.
To become a systems analyst, you should have a background both in the business and in systems
design. Many analysts first worked as programmers and/or had experience in the business before becoming
systems analysts.
Programmer
Programmers spend their time writing computer code in a programming language. In the case of systems
development, programmers generally attempt to fulfill the design specifications given to them by a systems
analyst. Many different styles of programming exist: a programmer may work alone for long stretches of
time or may work in a team with other programmers. A programmer needs to be able to understand complex
processes and also the intricacies of one or more programming languages. Generally, a programmer is very
proficient in mathematics, as mathematical concepts underlie most programming code.
Computer Engineer
Computer engineers design the computing devices that we use every day. There are many types of computer
engineers, who work on a variety of different types of devices and systems. Some of the more prominent
engineering jobs are as follows:
• Hardware engineer. A hardware engineer designs hardware components, such as microprocessors.
Many times, a hardware engineer is at the cutting edge of computing technology, creating
something brand new. Other times, the hardware engineer’s job is to engineer an existing
component to work faster or use less power. Many times, a hardware engineer’s job is to write
code to create a program that will be implemented directly on a computer chip.
• Software engineer. Software engineers do not actually design devices; instead, they create new
programming languages and operating systems, working at the lowest levels of the hardware to
develop new kinds of software to run on the hardware.
• Systems engineer. A systems engineer takes the components designed by other engineers and
makes them all work together. For example, to build a computer, the motherboard, processor,
memory, and hard disk all have to work together. A systems engineer has experience with many
different types of hardware and software and knows how to integrate them to create
new functionality.
• Network engineer. A network engineer’s job is to understand the networking requirements of an
organization and then design a communications system to meet those needs, using the networking
hardware and software available.
There are many different types of computer engineers, and often the job descriptions overlap. While many
may call themselves engineers based on a company job title, there is also a professional designation of
“professional engineer,” which has specific requirements behind it. In the US, each state has its own set of
requirements for the use of this title, as do different countries around the world. Most often, it involves a
professional licensing exam.
Information-Systems Operations and Administration
Another group of information-systems professionals are involved in the day-to-day operations and
administration of IT. These people must keep the systems running and up-to-date so that the rest of the
organization can make the most effective use of these resources.
Computer Operator
A computer operator is the person who keeps the large computers running. This person’s job is to oversee
the mainframe computers and data centers in organizations. Some of their duties include keeping the
operating systems up to date, ensuring available memory and disk storage, and overseeing the physical
environment of the computer. Since mainframe computers increasingly have been replaced with servers,
storage management systems, and other platforms, computer operators’ jobs have grown broader and
include working with these specialized systems.
Database Administrator
A database administrator (DBA) is the person who manages the databases for an organization. This person
creates and maintains databases that are used as part of applications or the data warehouse. The DBA
also consults with systems analysts and programmers on projects that require access to or the creation of
databases.
Help-Desk/Support Analyst
Most mid-size to large organizations have their own information-technology help desk. The help desk is
the first line of support for computer users in the company. Computer users who are having problems or
need information can contact the help desk for assistance. Many times, a help-desk worker is a junior-level
employee who does not necessarily know how to answer all of the questions that come his or her way. In
these cases, help-desk analysts work with senior-level support analysts or have a computer knowledgebase
at their disposal to help them investigate the problem at hand. The help desk is a great place to break into
working in IT because it exposes you to all of the different technologies within the company. A successful
help-desk analyst should have good people and communications skills, as well as at least junior-level IT
skills.
Trainer
A computer trainer conducts classes to teach people specific computer skills. For example, if a new ERP
system is being installed in an organization, one part of the implementation process is to teach all of the
users how to use the new system. A trainer may work for a software company and be contracted to come
in to conduct classes when needed; a trainer may work for a company that offers regular training sessions;
or a trainer may be employed full time for an organization to handle all of their computer instruction needs.
To be successful as a trainer, you need to be able to communicate technical concepts well and also have a
lot of patience!
Managing Information Systems
The management of information-systems functions is critical to the success of information systems within
the organization. Here are some of the jobs associated with the management of information systems.
CIO
The CIO, or chief information officer, is the head of the information-systems function. This person aligns
the plans and operations of the information systems with the strategic goals of the organization. This
includes tasks such as budgeting, strategic planning, and personnel decisions for the information-systems
function. The CIO must also be the face of the IT department within the organization. This involves
working with senior leaders in all parts of the organization to ensure good communication and planning.
Interestingly, the CIO position does not necessarily require a lot of technical expertise. While helpful,
it is more important for this person to have good management skills and understand the business. Many
organizations do not have someone with the title of CIO; instead, the head of the information-systems
function is called vice president of information systems or director of information systems.
Functional Manager
As an information-systems organization becomes larger, many of the different functions are grouped
together and led by a manager. These functional managers report to the CIO and manage the employees
specific to their function. For example, in a large organization, there is a group of systems analysts who
report to a manager of the systems-analysis function. For more insight into how this might look, see the discussion later
in the chapter of how information systems are organized.
ERP Management
Organizations using an ERP require one or more individuals to manage these systems. These people make
sure that the ERP system is completely up to date, work to implement any changes to the ERP that are
needed, and consult with various user departments on needed reports or data extracts.
Project Managers
Information-systems projects are notorious for going over budget and being delivered late. In many cases,
a failed IT project can spell doom for a company. A project manager is responsible for keeping projects on
time and on budget. This person works with the stakeholders of the project to keep the team organized and
communicates the status of the project to management. A project manager does not have authority over the
project team; instead, the project manager coordinates schedules and resources in order to maximize the
project outcomes. A project manager must be a good communicator and an extremely organized person.
A project manager should also have good people skills. Many organizations require each of their project
managers to become certified as a project management professional (PMP).
Information-Security Officer
An information-security officer is in charge of setting information-security policies for an organization, and
then overseeing the implementation of those policies. This person may have one or more people reporting
to them as part of the information-security team. As information has become a critical asset, this position
has become highly valued. The information-security officer must ensure that the organization’s information
remains secure from both internal and external threats.
Emerging Roles
As technology evolves, many new roles are becoming more common as other roles fade. For example, as
we enter the age of “big data,” we are seeing the need for more data analysts and business-intelligence
specialists. Many companies are now hiring social-media experts and mobile-technology specialists. The
increased use of cloud computing and virtual-machine technologies also is breeding demand for expertise
in those areas.
Career Paths in Information Systems
These job descriptions do not represent all possible jobs within an
information-systems organization. Larger organizations will have more
specialized roles; smaller organizations may combine some of these roles.
Many of these roles may exist outside of a traditional information-systems
organization, as we will discuss below.
Working with information systems can be a rewarding career choice.
Whether you want to be involved in very technical jobs (programmer,
database administrator), or you want to be involved in working with people
(systems analyst, trainer), there are many different career paths available.
Many times, those in technical jobs who want career advancement
find themselves in a dilemma: do they want to continue doing technical
work, where sometimes their advancement options are limited, or do they
want to become a manager of other employees and put themselves on a
management career track? In many cases, those proficient in technical
skills are not gifted with managerial skills. Some organizations, especially
those that highly value their technically skilled employees, will create a
technical track that exists in parallel to the management track so that they
can retain employees who are contributing to the organization with their
technical skills.
Sidebar: Are Certifications Worth Pursuing?
As technology is becoming more and more important to businesses, hiring employees with technical skills
is becoming critical. But how can an organization ensure that the person they are hiring has the necessary
skills? These days, many organizations are including technical certifications as a prerequisite for getting
hired.
Certifications are designations, given by a certifying body, indicating that someone has a specific level of
knowledge in a specific technology. This certifying body is often the vendor of the product itself, though
independent certifying organizations, such as CompTIA, also exist. Many of these organizations offer
certification tracks, allowing a beginning certificate as a prerequisite to getting more advanced certificates.
To get a certificate, you generally attend one or more training classes and then take one or more certification
exams. Passing the exams with a certain score will qualify you for a certificate. In most cases, these classes
and certificates are not free and, in fact, can run into the thousands of dollars. Some examples of the
certifications in highest demand include Microsoft (software certifications), Cisco (networking), and SANS
(security).
For many working in IT (or thinking about an IT career), determining whether to pursue one or more of
these certifications is an important question. For many jobs, such as those involving networking or security,
a certificate will be required by the employer as a way to determine which potential employees have a
basic level of skill. For those who are already in an IT career, a more advanced certificate may lead to a
promotion. There are other cases, however, when experience with a certain technology will negate the need
for certification. For those wondering about the importance of certification, the best solution is to talk to
potential employers and those already working in the field to determine the best choice.
Organizing the Information-Systems Function
In the early years of computing, the information-systems function (generally called data processing) was
placed in the finance or accounting department of the organization. As computing became more important,
a separate information-systems function was formed, but it still was generally placed under the CFO and
considered to be an administrative function of the company. In the 1980s and 1990s, when companies began
networking internally and then linking up to the Internet, the information-systems function was combined
with the telecommunications functions and designated the information technology (IT) department. As the
role of information technology continued to increase, its place in the organization also moved up the ladder.
In many organizations today, the head of IT (the CIO) reports directly to the CEO.
Where in the Organization Should IS Be?
Before the advent of the personal computer, the information-systems function was centralized within
organizations in order to maximize control over computing resources. When the PC began proliferating,
many departments within organizations saw it as a chance to gain some computing resources for
themselves. Some departments created an internal information-systems group, complete with systems
analysts, programmers, and even database administrators. These departmental-IS groups were dedicated
to the information needs of their own departments, providing quicker turnaround and higher levels of
service than a centralized IT department. However, having several IS groups within an organization led to
a lot of inefficiencies: there were now several people performing the same jobs in different departments.
This decentralization also led to company data being stored in several places all over the company. In
some organizations, a “matrix” reporting structure has developed, in which IT personnel are placed within
a department and report to both the department management and the functional management within IS. The advantages of
dedicated IS personnel for each department are weighed against the need for more control over the strategic information resources
of the company.
For many companies, these questions are resolved by the implementation of the ERP system (see
discussion of ERP in chapter 8). Because an ERP system consolidates most corporate data back into a single
database, the implementation of an ERP system requires organizations to find “islands” of data so that they
can integrate them back into the corporate system. The ERP allows organizations to regain control of their
information and influences organizational decisions throughout the company.
Outsourcing
Many times, an organization needs a specific skill for a limited period of time. Instead of training an
existing employee or hiring someone new, it may make more sense to outsource the job. Outsourcing can be
used in many different situations within the information-systems function, such as the design and creation
of a new website or the upgrade of an ERP system. Some organizations see outsourcing as a cost-cutting
move, contracting out a whole group or department.
New Models of Organizations
The integration of information technology has influenced the structure of organizations. The increased
ability to communicate and share information has led to a “flattening” of the organizational structure due to
the removal of one or more layers of management.
Another organizational change enabled by information systems is the network-based organizational
structure. In a network-based organizational structure, groups of employees can work somewhat
independently to accomplish a project. In a networked organization, people with the right skills are brought
together for a project and then released to work on other projects when that project is over. These groups
are somewhat informal and allow for all members of the group to maximize their effectiveness.
Information-Systems Users – Types of Users
Besides the people who work to create, administer, and manage information systems, there is one more
extremely important group of people: the users of information systems. This group represents a very large
percentage of the people involved. If the user is not able to successfully learn and use an information
system, the system is doomed to failure.
One tool that can be used to understand how users will
adopt a new technology comes from a 1962 study by
Everett Rogers. In his book, Diffusion of Innovations,1 Rogers
studied how farmers adopted new technologies, and he noticed that the
adoption rate started slowly and then dramatically increased once
adoption hit a certain point. He identified five specific types of
technology adopters:
• Innovators. Innovators are the first individuals to
adopt a new technology. Innovators are willing to
take risks, are the youngest in age, have the
highest social class, have great financial liquidity,
are very social, and have the closest contact with
scientific sources and interaction with other
innovators. Risk tolerance has them adopting technologies that may ultimately fail. Financial
resources help absorb these failures (Rogers 1962 5th ed, p. 282).
• Early adopters. The early adopters are those who adopt innovation after a technology has been
introduced and proven. These individuals have the highest degree of opinion leadership among
the other adopter categories, which means that they can influence the opinions of the largest
majority. They are typically younger in age, have higher social status, more financial liquidity,
more advanced education, and are more socially aware than later adopters. These people are more
discreet in adoption choices than innovators, and realize that judicious choice of adoption will help
them maintain a central communication position (Rogers 1962 5th ed, p. 283).
• Early majority. Individuals in this category adopt an innovation after a varying degree of time.
This time of adoption is significantly longer than that of the innovators and early adopters. This group
tends to be slower in the adoption process, has above average social status, has contact with early
adopters, and seldom holds positions of opinion leadership in a system (Rogers 1962 5th ed, p. 283).
• Late majority. The late majority will adopt an innovation after the average member of the society.
These individuals approach an innovation with a high degree of skepticism, have below average
social status, very little financial liquidity, are in contact with others in the late majority and the
early majority, and show very little opinion leadership.
• Laggards. Individuals in this category are the last to adopt an innovation. Unlike those in the
previous categories, individuals in this category show no opinion leadership. These individuals
typically have an aversion to change-agents and tend to be advanced in age. Laggards typically
tend to be focused on “traditions,” are likely to have the lowest social status and the lowest
financial liquidity, be the oldest of all adopters, and be in contact with only family and close
friends.
1. Rogers, E. M. (1962). Diffusion of innovations. New York: Free Press
These five types of users can be translated into information-technology adopters as well, and provide
additional insight into how to implement new information systems within an organization. For example,
when rolling out a new system, IT may want to identify the innovators and early adopters within the
organization and work with them first, then leverage their adoption to drive the rest of the implementation.
Summary
In this chapter, we have reviewed the many different categories of individuals who make up the people
component of information systems. The world of information technology is changing so fast that new roles
are being created all the time, and roles that existed for decades are being phased out. That said, this chapter
should have given you a good idea of the importance of the people component of information systems.
Study Questions
1. Describe the role of a systems analyst.
2. What are some of the different roles for a computer engineer?
3. What are the duties of a computer operator?
4. What does the CIO do?
5. Describe the job of a project manager.
6. Explain the point of having two different career paths in information systems.
7. What are the advantages and disadvantages of centralizing the IT function?
8. What impact has information technology had on the way companies are organized?
9. What are the five types of information-systems users?
10. Why would an organization outsource?
Exercises
1. Which IT job would you like to have? Do some original research and write a two-page paper
describing the duties of the job you are interested in.
2. Spend a few minutes on Dice or Monster to find IT jobs in your area. What IT jobs are
currently available? Write up a two-page paper describing three jobs, their starting salary (if
listed), and the skills and education needed for the job.
3. How is the IT function organized in your school or place of employment? Create an
organization chart showing how the IT organization fits into your overall organization. Comment
on how centralized or decentralized the IT function is.
4. What type of IT user are you? Take a look at the five types of technology adopters and then
write a one-page summary of where you think you fit in this model.
Chapter 10: Information Systems Development
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• explain the overall process of developing a new software application;
• explain the differences between software development methodologies;
• understand the different types of programming languages used to develop software;
• understand some of the issues surrounding the development of websites and mobile applications;
and
• identify the four primary implementation policies.
Introduction
When someone has an idea for a new function to be performed by a computer, how does that idea become
reality? If a company wants to implement a new business process and needs new hardware or software to
support it, how do they go about making it happen? In this chapter, we will discuss the different methods
of taking those ideas and bringing them to reality, a process known as information systems development.
Programming
As we learned in chapter 2, software is created via programming. Programming is the process of creating
a set of logical instructions for a digital device to follow using a programming language. The process of
programming is sometimes called “coding” because the syntax of a programming language is not in a form
that everyone can understand – it is in “code.”
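To make “a set of logical instructions” concrete, here is a very small example written in Python (any programming language would do). It is only meant to show what a short, single-purpose program looks like; the exchange rate is an assumed value for the example.

# A short, single-purpose program: convert a list of prices from US dollars to euros.
USD_TO_EUR = 0.74  # assumed exchange rate, for illustration only

prices_usd = [19.99, 5.49, 102.00]
for price in prices_usd:
    price_eur = round(price * USD_TO_EUR, 2)
    print(price, "USD is about", price_eur, "EUR")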
The process of developing good software is usually not as simple as sitting down and writing some
code. True, sometimes a programmer can quickly write a short program to solve a need. But most of the
time, the creation of software is a resource-intensive process that involves several different groups of people
in an organization. In the following sections, we are going to review several different methodologies for
software development.
Systems-Development Life Cycle
The first development methodology we are going to review is the systems-development life cycle (SDLC).
This methodology was first developed in the 1960s to manage the large software projects associated with
corporate systems running on mainframes. It is a very structured and risk-averse methodology designed to
manage large projects that include multiple programmers and systems that will have a large impact on
the organization.
SDLC waterfall
Various definitions of the SDLC methodology exist, but
most contain the following phases.
1. Preliminary Analysis. In this phase, a review is
done of the request. Is creating a solution
possible? What alternatives exist? What is
currently being done about it? Is this project a
good fit for our organization? A key part of this
step is a feasibility analysis, which includes an
analysis of the technical feasibility (is it possible
to create this?), the economic feasibility (can we
afford to do this?), and the legal feasibility (are
we allowed to do this?). This step is important in determining if the project should even get
started.
2. System Analysis. In this phase, one or more system analysts work with different stakeholder
groups to determine the specific requirements for the new system. No programming is done in this
step. Instead, procedures are documented, key players are interviewed, and data requirements are
developed in order to get an overall picture of exactly what the system is supposed to do. The
result of this phase is a system-requirements document.
3. System Design. In this phase, a designer takes the system-requirements document created in the
previous phase and develops the specific technical details required for the system. It is in this
phase that the business requirements are translated into specific technical requirements. The
design for the user interface, database, data inputs and outputs, and reporting are developed here.
The result of this phase is a system-design document. This document will have everything a
programmer will need to actually create the system.
4. Programming. The code finally gets written in the programming phase. Using the system-
design document as a guide, a programmer (or team of programmers) develops the program. The
result of this phase is an initial working program that meets the requirements laid out in the
system-analysis phase and the design developed in the system-design phase.
5. Testing. In the testing phase, the software program developed in the previous phase is put
through a series of structured tests. The first is a unit test, which tests individual parts of the code
for errors or bugs. Next is a system test, where the different components of the system are tested
to ensure that they work together properly. Finally, the user-acceptance test allows those who will
be using the software to test the system to ensure that it meets their standards. Any bugs, errors, or
problems found during testing are addressed and then tested again.
Figure: The RAD methodology (public domain)
6. Implementation. Once the new system is developed and tested, it has to be implemented in the
organization. This phase includes training the users, providing documentation, and conversion
from any previous system to the new system. Implementation can take many forms, depending on
the type of system, the number and type of users, and how urgent it is that the system become
operational. These different forms of implementation are covered later in the chapter.
7. Maintenance. This final phase takes place once the implementation phase is complete. In this
phase, the system has a structured support process in place: reported bugs are fixed and requests
for new features are evaluated and implemented; system updates and backups are performed on a
regular basis.
The SDLC methodology is sometimes referred to as the waterfall methodology to represent how each step
is a separate part of the process; only when one step is completed can another step begin. After each step,
an organization must decide whether to move to the next step or not. This methodology has been criticized
for being quite rigid. For example, changes to the requirements are not allowed once the process has begun.
No software is available until after the programming phase.
Again, SDLC was developed for large, structured projects. Projects using SDLC can sometimes
take months or years to complete. Because of its inflexibility and the availability of new programming
techniques and tools, many other software-development methodologies have been developed. Many of
these retain some of the underlying concepts of SDLC but are not as rigid.
Rapid Application Development
Rapid application development (RAD) is a software-
development (or systems-development) methodology that
focuses on quickly building a working model of the
software, getting feedback from users, and then using that
feedback to update the working model. After several
iterations of development, a final version is developed and
implemented.
The RAD methodology consists of four phases:
1. Requirements Planning. This phase is similar to the
preliminary-analysis, system-analysis, and design phases
of the SDLC. In this phase, the overall requirements for
the system are defined, a team is identified, and feasibility
is determined.
2. User Design. In this phase, representatives of the users work with the system analysts,
designers, and programmers to interactively create the design of the system. One technique for
working with all of these various stakeholders is the so-called JAD session. JAD is an acronym
for joint application development. A JAD session gets all of the stakeholders together to have a
structured discussion about the design of the system. Application developers also sit in on this
meeting and observe, trying to understand the essence of the requirements.
3. Construction. In the construction phase, the application developers, working with the users,
build the next version of the system. This is an interactive process, and changes can be made as
developers are working on the program. This step is executed in parallel with the User Design
step in an iterative fashion, until an acceptable version of the product is developed.
4. Cutover. In this step, which is similar to the implementation step of the SDLC, the system goes
live. All steps required to move from the previous state to the use of the new system are
completed here.
As you can see, the RAD methodology is much more compressed than SDLC. Many of the SDLC steps
are combined and the focus is on user participation and iteration. This methodology is much better suited
for smaller projects than SDLC and has the added advantage of giving users the ability to provide feedback
throughout the process. SDLC requires more documentation and attention to detail and is well suited
to large, resource-intensive projects. RAD makes more sense for smaller projects that are less resource-
intensive and need to be developed quickly.
Agile Methodologies
Agile methodologies are a group of methodologies that utilize incremental changes with a focus on quality
and attention to detail. Each increment is released in a specified period of time (called a time box), creating
a regular release schedule with very specific objectives. While considered a separate methodology from
RAD, the agile methodologies share some of the same principles: iterative development, user interaction, and the ability to change.
The agile methodologies are based on the “Agile Manifesto” (http://agilemanifesto.org), first released in 2001.
The characteristics of agile methods include:
• small cross-functional teams that include development-team members and users;
• daily status meetings to discuss the current state of the project;
• short time-frame increments (from days to one or two weeks) for each change to be completed;
and
• at the end of each iteration, a working project is completed to demonstrate to the stakeholders.
The goal of the agile methodologies is to provide the flexibility of an iterative approach while ensuring a
quality product.
Lean Methodology
One last methodology we will discuss is a relatively new
concept taken from the business bestseller The Lean
Startup, by Eric Ries. In this methodology, the focus is on
taking an initial idea and developing a minimum viable
product (MVP). The MVP is a working software
application with just enough functionality to demonstrate
the idea behind the project. Once the MVP is developed, it
is given to potential users for review. Feedback on the
MVP is generated in two forms: (1) direct observation and
discussion with the users, and (2) usage statistics gathered
from the software itself. Using these two forms of
feedback, the team determines whether they should
continue in the same direction or rethink the core idea
behind the project, change the functions, and create a new
MVP. This change in strategy is called a pivot. Several
iterations of the MVP are developed, with new functions
added each time based on the feedback, until a final
product is completed.
The biggest difference between the lean methodology
and the other methodologies is that the full set of requirements for the system is not known when the
project is launched. As each iteration of the project is released, the statistics and feedback gathered are used
to determine the requirements. The lean methodology works best in an entrepreneurial environment where a
company is interested in determining if their idea for a software application is worth developing.
Sidebar: The Quality Triangle
When developing software, or any sort of product or
service, there exists a tension between the developers and
the different stakeholder groups, such as management,
users, and investors. This tension relates to how quickly
the software can be developed (time), how much money
will be spent (cost), and how well it will be built (quality).
The quality triangle is a simple concept. It states that for
any product or service being developed, you can only
address two of the following: time, cost, and quality.
So what does it mean that you can only address two of
the three? It means that you cannot complete a low-cost,
high-quality project in a small amount of time. However, if
you are willing or able to spend a lot of money, then a
project can be completed quickly with high-quality results
(through hiring more good programmers). If a project’s
completion date is not a priority, then it can be completed at a lower cost with higher-quality results. Of
course, these are just generalizations, and different projects may not fit this model perfectly. But overall,
this model helps us understand the tradeoffs that we must make when we are developing new products and
services.
Programming Languages
As I noted earlier, software developers create software using one of several programming languages. A
programming language is an artificial language that provides a way for a programmer to create structured
code to communicate logic in a format that can be executed by the computer hardware. Over the past few
decades, many different types of programming languages have evolved to meet many different needs. One
way to characterize programming languages is by their “generation.”
Generations of Programming Languages
Early languages were specific to the type of hardware that had to be programmed; each type of computer
hardware had a different low-level programming language (in fact, even today there are differences at
the lower level, though they are now obscured by higher-level programming languages). In these early
languages, very specific instructions had to be entered line by line – a tedious process.
First-generation languages are called machine code. In machine code, programming is done by directly
setting actual ones and zeroes (the bits) in the program using binary code. Here is an example program that
adds 1234 and 4321 using machine language:
10111001 00000000
11010010 10100001
00000100 00000000
10001001 00000000
00001110 10001011
00000000 00011110
00000000 00011110
00000000 00000010
10111001 00000000
11100001 00000011
00010000 11000011
10001001 10100011
00001110 00000100
00000010 00000000
Assembly language is the second-generation language. Assembly language gives English-like phrases to
the machine-code instructions, making it easier to program. An assembly-language program must be run
through an assembler, which converts it into machine code. Here is an example program that adds 1234 and
4321 using assembly language:
MOV CX,1234    ; load the first number (1234) into register CX
MOV DS:[0],CX  ; store it in memory at offset 0
MOV CX,4321    ; load the second number (4321) into CX
MOV DS:[2],CX  ; store it in memory at offset 2
MOV AX,DS:[0]  ; load the first number into AX
MOV BX,DS:[2]  ; load the second number into BX
ADD AX,BX      ; add them; the sum is left in AX
MOV DS:[4],AX  ; store the result at offset 4
Third-generation languages are not specific to the type of hardware on which they run and are much more
like spoken languages. Most third-generation languages must be compiled, a process that converts them
into machine code. Well-known third-generation languages include BASIC, C, Pascal, and Java. Here is an
example using BASIC:
A=1234
B=4321
C=A+B
END
Fourth-generation languages are a class of programming tools that enable fast application development
using intuitive interfaces and environments. Many times, a fourth-generation language has a very specific
purpose, such as database interaction or report-writing. These tools can be used by those with very little
formal training in programming and allow for the quick development of applications and/or functionality.
Examples of fourth-generation languages include Clipper, FOCUS, FoxPro, SQL, and SPSS.
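As a rough illustration (not drawn from the text), a single statement in SQL, one of the fourth-generation languages listed above, can retrieve an entire filtered and sorted set of records. The table and column names below are invented for the example:
SELECT name, hire_date
FROM employees
WHERE department = 'Marketing'
ORDER BY hire_date;
Producing the same result in a third-generation language would typically require explicit loops, comparisons, and sorting code.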
Why would anyone want to program in a lower-level language when it requires so much more work?
The answer is similar to why some people prefer to drive stick-shift automobiles instead of those with automatic
transmissions: control and efficiency. Lower-level languages, such as assembly language, are much more efficient and
execute much more quickly. You also have finer control over the hardware. Sometimes a combination
of higher- and lower-level languages is used to get the best of both worlds: the programmer will
create the overall structure and interface using a higher-level language but will use lower-level languages
for the parts of the program that are executed many times or require more precision.
Figure: The programming language spectrum
Compiled vs. Interpreted
Besides being classified by its generation, a programming language can also be classified by whether it is
compiled or interpreted. As we have learned, a computer language is written in a human-readable form. In
a compiled language, the program code is translated into a machine-readable form called an executable that
can be run on the hardware. Some well-known compiled languages include C, C++, and COBOL.
An interpreted language is one that requires a runtime program to be installed in order to execute.
This runtime program then interprets the program code line by line and runs it. Interpreted languages
are generally easier to work with but also are slower and require more system resources. Examples of
popular interpreted languages include BASIC, PHP, Perl, and Python. The web languages HTML and
JavaScript would also be considered interpreted because they require a browser in order to run.
The Java programming language is an interesting exception to this classification, as it is actually
a hybrid of the two. A program written in Java is partially compiled to create a program that can be
understood by the Java Virtual Machine (JVM). Each type of operating system has its own JVM, which
must be installed; this is what allows Java programs to run on many different types of operating systems.
Procedural vs. Object-Oriented
A procedural programming language is designed to allow a programmer to define a specific starting point
for the program and then execute sequentially. All early programming languages worked this way. As user
interfaces became more interactive and graphical, it made sense for programming languages to evolve to
allow the user to define the flow of the program. The object-oriented programming language is set up so that
the programmer defines “objects” that can take certain actions based on input from the user. In other words,
a procedural program focuses on the sequence of activities to be performed; an object-oriented program
focuses on the different items being manipulated.
For example, in a human-resources system, an “EMPLOYEE” object would be needed. If the program
needed to retrieve or set data regarding an employee, it would first create an employee object in the
program and then set or retrieve the values needed. Every object has properties, which are descriptive
fields associated with the object. In the example below, an employee object has the properties “Name”,
“Employee number”, “Birthdate” and “Date of hire”. An object also has “methods”, which can take actions
related to the object. In the example, there are two methods. The first is “ComputePay()”, which will return
the current amount owed the employee. The second is “ListEmployees()”, which will retrieve a list of
employees who report to this employee.
Object: EMPLOYEE
Name
Employee number
Birthdate
Date of hire
ComputePay()
ListEmployees()
Figure: An example of an object
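As a minimal sketch (not taken from the text) of how this object might be expressed in code, the Java class below mirrors the figure: the property and method names come from the figure, while the hourly rate, the 40-hour pay calculation, and the list of direct reports are hypothetical details added only to make the example compile and run.
import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

// A sketch of the EMPLOYEE object described in the figure above.
public class Employee {
    // Properties: descriptive fields associated with the object.
    private String name;
    private int employeeNumber;
    private LocalDate birthdate;
    private LocalDate dateOfHire;
    private double hourlyRate;                                  // hypothetical, used by computePay()
    private List<Employee> directReports = new ArrayList<>();   // hypothetical, used by listEmployees()

    public Employee(String name, int employeeNumber, LocalDate birthdate,
                    LocalDate dateOfHire, double hourlyRate) {
        this.name = name;
        this.employeeNumber = employeeNumber;
        this.birthdate = birthdate;
        this.dateOfHire = dateOfHire;
        this.hourlyRate = hourlyRate;
    }

    // Method: returns the current amount owed the employee (hypothetical 40-hour week).
    public double computePay() {
        return hourlyRate * 40;
    }

    // Method: retrieves the list of employees who report to this employee.
    public List<Employee> listEmployees() {
        return directReports;
    }
}
A program would first create an employee object, for example new Employee("Maria Lopez", 1001, LocalDate.of(1990, 5, 14), LocalDate.of(2015, 8, 3), 25.0), and then call its methods to set or retrieve the values it needs.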
Sidebar: What is COBOL?
If you have been around business programming very long, you may have heard about the COBOL
programming language. COBOL is a procedural, compiled language that at one time was the primary
programming language for business applications. Invented in 1959 for use on large mainframe computers,
COBOL is an abbreviation of common business-oriented language. With the advent of more efficient
programming languages, COBOL is now rarely seen outside of old, legacy applications.
Programming Tools
To write a program, a programmer needs little more than a text editor and a good idea. However, to be
productive, he or she must be able to check the syntax of the code, and, in some cases, compile the code. To
be more efficient at programming, additional tools, such as an integrated development environment (IDE)
or computer-aided software-engineering (CASE) tools, can be used.
Integrated Development Environment
For most programming languages, an IDE can be used. An IDE provides a variety of tools for the
programmer, and usually includes:
• an editor for writing the program that will color-code or highlight keywords from the
programming language;
• a help system that gives detailed documentation regarding the programming language;
• a compiler/interpreter, which will allow the programmer to run the program;
• a debugging tool, which will provide the programmer details about the execution of the program
in order to resolve problems in the code; and
• a check-in/check-out mechanism, which allows for a team of programmers to work together on a
project and not write over each other’s code changes.
Probably the most popular IDE software package right now is Microsoft’s Visual Studio. Visual Studio is
the IDE for all of Microsoft’s programming languages, including Visual Basic, Visual C++, and Visual C#.
CASE Tools
While an IDE provides several tools to assist the programmer in writing the program, the code still must
be written. Computer-aided software-engineering (CASE) tools allow a designer to develop software with
little or no programming. Instead, the CASE tool writes the code for the designer. CASE tools come in
many varieties, but their goal is to generate quality code based on input created by the designer.
Sidebar: Building a Website
In the early days of the World Wide Web, the creation of a website required knowing how to use hypertext
markup language (HTML). Today, most websites are built with a variety of tools, but the final product
that is transmitted to a browser is still HTML. HTML, at its simplest, is a text language that allows you
to define the different components of a web page. These definitions are handled through the use of HTML
tags, which consist of text between angle brackets. For example, an HTML tag can tell the browser to show a
word in italics, to link to another web page, or to insert an image. In the example below, some text is being
defined as a heading while other text is being emphasized.
Figure: Simple HTML markup and its browser output
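The original example is an image and does not reproduce here; a minimal sketch along the lines described, with a heading and an emphasized phrase (the wording is invented), might look like this:
<html>
  <body>
    <h1>This Text Is a Heading</h1>
    <p>This is ordinary text, but <em>this text is emphasized</em>.</p>
  </body>
</html>
When a browser renders this markup, the first line appears as a large heading and the phrase inside the em tags is typically shown in italics.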
While HTML is used to define the components of a web page, cascading style sheets (CSS) are used to
define the styles of the components on a page. The use of CSS allows the style of a website to be set and
stay consistent throughout. For example, if the designer wanted all first-level headings (h1) to be blue and
centered, he or she could set the “h1” style to match. The following example shows how this might look.
Figure: HTML with CSS and its browser output
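The original example is also an image; a minimal sketch of the blue, centered h1 rule described above (again with invented wording) might look like this:
<html>
  <head>
    <style>
      h1 {
        color: blue;
        text-align: center;
      }
    </style>
  </head>
  <body>
    <h1>This Heading Is Blue and Centered</h1>
    <p>Paragraph text is not affected by the h1 rule.</p>
  </body>
</html>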
The combination of HTML and CSS can be used to create a wide variety of formats and designs and has
been widely adopted by the web-design community. The standards for HTML are set by a governing body
called the World Wide Web Consortium. The current version of HTML is HTML 5, which includes new
standards for video, audio, and drawing.
When developers create a website, they do not write it out manually in a text editor. Instead, they use
web design tools that generate the HTML and CSS for them. Tools such as Adobe Dreamweaver allow the
designer to create a web page that includes images and interactive elements without writing a single line of
code. However, professional web designers still need to learn HTML and CSS in order to have full control
over the web pages they are developing.
Build vs. Buy
When an organization decides that a new software program needs to be developed, it must determine if
it makes more sense to build it in-house or to purchase it from an outside company. This is the “build vs.
buy” decision.
There are many advantages to purchasing software from an outside company. First, it is generally less
expensive to purchase a software package than to build it. Second, when a software package is purchased, it
is available much more quickly than if the package is built in-house. Software applications can take months
or years to build; a purchased package can be up and running within a month. A purchased package has
already been tested and many of the bugs have already been worked out. It is the role of a systems integrator
to make various purchased systems and the existing systems at the organization work together.
There are also disadvantages to purchasing software. First, the same software you are using can be
used by your competitors. If a company is trying to differentiate itself based on a business process that is in
that purchased software, it will have a hard time doing so if its competitors use the same software. Another
disadvantage to purchasing software is the process of customization. If you purchase a software package
from a vendor and then customize it, you will have to manage those customizations every time the vendor
provides an upgrade. This can become an administrative headache, to say the least!
Even if an organization decides to buy software, it still makes sense to go through many of the
same analyses that it would perform if it were building the software itself. This is an important decision
that could have a long-term strategic impact on the organization.
Web Services
As we saw in chapter 3, the move to cloud computing has allowed software to be looked at as a service. One
option companies have these days is to license functions provided by other companies instead of writing the
code themselves. These are called web services, and they can greatly simplify the addition of functionality
to a website.
For example, suppose a company wishes to provide a map showing the location of someone who
has called their support line. By utilizing Google Maps API web services, they can build a Google Map
right into their application. Or a shoe company could make it easier for its retailers to sell shoes online by
providing a shoe-size web service that the retailers could embed right into their website.
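As a rough sketch of what consuming such a web service might look like (the service URL, its parameters, and its response format are all hypothetical, and error handling is omitted), a small Java program could request a shoe-size conversion over HTTP and print the result:
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ShoeSizeLookup {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint; a real vendor would publish its own URL and parameters.
        URI endpoint = URI.create("https://api.example-shoes.com/v1/size?us=9&convertTo=eu");

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(endpoint).GET().build();

        // The response body (for example, a small JSON document) is returned as a string.
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
The retailer's own website would make a similar call and then display the returned value, so the shoe-size logic lives entirely in the vendor's service rather than in code the retailer has to write and maintain.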
Web services can blur the lines between “build vs. buy.” Companies can choose to build a software
application themselves but then purchase functionality from vendors to supplement their system.
End-User Computing
In many organizations, application development is not limited to the programmers and analysts in the
information-technology department. Especially in larger organizations, other departments develop their
own department-specific applications. The people who build these are not necessarily trained in
programming or application development, but they tend to be adept with computers. A person, for example,
who is skilled in a particular software package, such as a spreadsheet or database package, may be called
upon to build smaller applications for use by his or her own department. This phenomenon is referred to as
end-user development, or end-user computing.
End-user computing can have many advantages for an organization. First, it brings the development
of applications closer to those who will use them. Because IT departments are sometimes quite backlogged, it also provides a means
to have software created more quickly. Many organizations encourage end-user computing to reduce the strain on the IT department.
End-user computing does have its disadvantages as well. If departments within an organization are
developing their own applications, the organization may end up with several applications that perform
similar functions, which is inefficient, since it is a duplication of effort. Sometimes, these different versions
of the same application end up providing different results, bringing confusion when departments interact.
These applications are often developed by someone with little or no formal training in programming. In
these cases, the software developed can have problems that then have to be resolved by the IT department.
End-user computing can be beneficial to an organization, but it should be managed. The IT department
should set guidelines and provide tools for the departments who want to create their own solutions.
Communication between departments will go a long way towards successful use of end-user computing.
Sidebar: Building a Mobile App
In many ways, building an application for a mobile device is exactly the same as building an application for
a traditional computer. Understanding the requirements for the application, designing the interface, working
with users – all of these steps still need to be carried out.
So what’s different about building an application for a mobile device? In some ways, mobile
applications are more limited. An application running on a mobile device must be designed to be functional
on a smaller screen. Mobile applications should be designed to use fingers as the primary pointing device.
Mobile devices generally have less available memory, storage space, and processing power.
Mobile applications also have many advantages over applications built for traditional computers.
Mobile applications have access to the functionality of the mobile device, which usually includes features
such as geolocation data, messaging, the camera, and even a gyroscope.
One of the most important questions regarding development for mobile devices is this: Do we want to
develop an app at all? A mobile app is an expensive proposition, and it will only run on one type of mobile
device at a time. For example, if you create an iPhone app, users with Android phones are out of luck. Each
app takes several thousand dollars to create, so this may not be the best use of your funds.
Many organizations are moving away from developing a specific app for a mobile device and are
instead making their websites more functional on mobile devices. Using a web-design approach called
responsive design, a website can be made highly functional no matter what type of device is browsing it.
With a responsive website, images resize themselves based on the size of the device’s screen, and text flows
and sizes itself properly for optimal viewing. You can find out more about responsive design at http://mashable.com/2012/12/11/responsive-web-design/.
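As a small illustration of the idea (the class names and the 600-pixel breakpoint below are invented for the example), a responsive stylesheet typically combines fluid images with media queries that adapt the layout to the width of the screen:
/* Images shrink to fit the width of whatever container they are in. */
img {
  max-width: 100%;
  height: auto;
}

/* Two columns on wide screens... */
.content {
  width: 66%;
  float: left;
}
.sidebar {
  width: 33%;
  float: right;
}

/* ...collapsing to a single column on screens narrower than 600 pixels. */
@media (max-width: 600px) {
  .content,
  .sidebar {
    width: 100%;
    float: none;
  }
}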
Implementation Methodologies
Once a new system is developed (or purchased), the organization must determine the best method for
implementing it. Convincing a group of people to learn and use a new system can be a very difficult
process. Using new software, and the business processes it gives rise to, can have far-reaching effects
within the organization.
There are several different methodologies an organization can adopt to implement a new system. Four
of the most popular are listed below.
• Direct cutover. In the direct-cutover implementation methodology, the organization selects a
particular date that the old system is not going to be used anymore. On that date, the users begin
using the new system and the old system is unavailable. The advantages of this
methodology are that it is the fastest and least expensive to implement. However, this method is the riskiest
as well. If the new system has an operational problem or if the users are not properly prepared, it
could prove disastrous for the organization.
• Pilot implementation. In this methodology, a subset of the organization (called a pilot group)
starts using the new system before the rest of the organization. This has a smaller impact on the
company and allows the support team to focus on a smaller group of individuals.
• Parallel operation. With parallel operation, the old and new systems are used simultaneously for a
limited period of time. This method is the least risky because the old system is still being used
while the new system is essentially being tested. However, this is by far the most expensive
methodology since work is duplicated and support is needed for both systems in full.
• Phased implementation. In phased implementation, different functions of the new application are
used as functions from the old system are turned off. This approach allows an organization to
slowly move from one system to another.
Which of these implementation methodologies to use depends on the complexity and importance of the old
and new systems.
Change Management
As new systems are brought online and old systems are phased out, it becomes important to manage the
way change is implemented in the organization. Change should never be introduced in a vacuum. The
organization should be sure to communicate proposed changes before they happen and plan to minimize
the impact of the change that will occur after implementation. Change management is a critical component
of IT oversight.
Maintenance
Once a new system has been introduced, it enters the maintenance phase. In this phase, the system is in
production and is being used by the organization. While the system is no longer actively being developed,
changes need to be made when bugs are found or new features are requested. During the maintenance
phase, IT management must ensure that the system continues to stay aligned with business priorities and
continues to run well.
Summary
Software development is about so much more than programming. Developing new software applications
requires several steps, from the formal SDLC process to more informal processes such as agile
programming or lean methodologies. Programming languages have evolved from very low-level machine-
specific languages to higher-level languages that allow a programmer to write software for a wide variety
of machines. Most programmers work with software development tools that provide them with integrated
components to make the software development process more efficient. For some organizations, building
their own software applications does not make the most sense; instead, they choose to purchase software
built by a third party to save development costs and speed implementation. In end-user computing, software
development happens outside the information technology department. When implementing new software
applications, there are several different types of implementation methodologies that must be considered.
Study Questions
1. What are the steps in the SDLC methodology?
2. What is RAD software development?
3. What makes the lean methodology unique?
4. What are three differences between second-generation and third-generation languages?
5. Why would an organization consider building its own software application if it is cheaper to
buy one?
6. What is responsive design?
7. What is the relationship between HTML and CSS in website design?
8. What is the difference between the pilot implementation methodology and the parallel
implementation methodology?
9. What is change management?
10. What are the four different implementation methodologies?
Exercises
1. Which software-development methodology would be best if an organization needed to develop
a software tool for a small group of users in the marketing department? Why? Which
implementation methodology should they use? Why?
2. Doing your own research, find three programming languages and categorize them in these
areas: generation, compiled vs. interpreted, procedural vs. object-oriented.
3. Some argue that HTML is not a programming language. Doing your own research, find three
arguments for why it is not a programming language and three arguments for why it is.
4. Read more about responsive design using the link given in the text. Provide the links to three
websites that use responsive design and explain how they demonstrate responsive-design
behavior.
Part 3: Information Systems Beyond the Organization
Chapter 11: Globalization and the Digital Divide
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• explain the concept of globalization;
• describe the role of information technology in globalization;
• identify the issues experienced by firms as they face a global economy; and
• define the digital divide and explain Nielsen’s three stages of the digital divide.
Introduction
The Internet has wired the world. Today it is just as simple to communicate with someone on the other side of the
world as it is to talk to someone next door. In this chapter, we will look at the implications of globalization and the impact it is
having on the world.
What Is Globalization?
Globalization is the term used to refer to the integration of goods, services, and culture among the
nations of the world. Globalization is not necessarily a new phenomenon; in many ways, we have been
experiencing globalization since the days of European colonization. Further advances in telecommunication
and transportation technologies accelerated globalization. The advent of the worldwide Internet has
made all nations next-door neighbors.
The Internet is truly a worldwide phenomenon. As of 2012, the Internet was being used in over 150
countries by a staggering 2.4 billion people worldwide, and growing.1 From its initial beginnings in the
United States in the 1970s to the development of the World Wide Web in the 1990s to the social networks
and e-commerce of today, the Internet has continued to increase the integration between countries, making
globalization a fact of life for citizens all over the world.
1. http://internetworldstats.com/
Figure: Worldwide Internet usage (source: Internet World Stats)
The Network Society
In 1996, social-sciences researcher Manuel Castells published The Rise of the Network Society, in which
he identified new ways in which economic activity was being organized around the networks that the new
telecommunication technologies have provided. This new, global economic activity was different from the
past, because “it is an economy with the capacity to work as a unit in real time on a planetary scale.”2 We
are now into this network society, where we are all connected on a global scale.
The World Is Flat
In 2005, Thomas Friedman’s seminal book, The World Is Flat, was published. In this book, Friedman
unpacks the impacts that the personal computer, the Internet, and communication software have had on
business, specifically the impact they have had on globalization. He begins the book by defining the three
eras of globalization3:
• “Globalization 1.0” occurred from 1492 until about 1800. In this era, globalization was centered
around countries. It was about how much horsepower, wind power, and steam power a country
had and how creatively it was deployed. The world shrank from size “large” to size “medium.”
• “Globalization 2.0” occurred from about 1800 until 2000, interrupted only by the two World
Wars. In this era, the dynamic force driving change was multinational companies. The world
shrank from size “medium” to size “small.”
• “Globalization 3.0” is our current era, beginning in the year 2000. The convergence of the
personal computer, fiber-optic Internet connections, and software has created a “flat-world
platform” that allows small groups and even individuals to go global. The world has shrunk from
size “small” to size “tiny.”
2. Manuel Castells. 2000. The Rise of the Network Society (2nd ed.). Blackwell Publishers, Inc., Cambridge, MA, USA.
3. Friedman, T. L. (2005). The world is flat: A brief history of the twenty-first century. New York: Farrar, Straus and Giroux.
According to Friedman, this third era of globalization was brought about, in many respects, by information
technology. Some of the specific technologies he lists include:
• The graphical user interface for the personal computer popularized in the late 1980s. Before the
graphical user interface, using a computer was relatively difficult. By making the personal
computer something that anyone could use, it became commonplace very quickly. Friedman
points out that this digital storage of content made people much more productive and, as the
Internet evolved, made it simpler to communicate content worldwide.
• The build-out of the Internet infrastructure during the dot-com boom during the
late-1990s. During the late 1990s, telecommunications companies laid thousands of miles of
fiber-optic cable all over the world, turning network communications into a commodity. At the
same time, the Internet protocols, such as SMTP (e-mail), HTML (web pages), and TCP/IP
(network communications) became standards that were available for free and used by everyone.
• The introduction of software to automate and integrate business processes. As the Internet
continued to grow and become the dominant form of communication, it became essential to build
on the standards developed earlier so that the websites and applications running on the Internet
would work well together. Friedman calls this “workflow software,” by which he means software
that allows people to work together more easily, and allows different software packages and
databases to integrate with each other more easily. Examples include payment-processing systems
and shipping calculators.
These three technologies came together in the late 1990s to create a “platform for global collaboration.”
Once these technologies were in place, they continued to evolve. Friedman also points out a couple more
technologies that have contributed to the flat-world platform – the open-source movement (see chapter 10)
and the advent of mobile technologies.
The World Is Flat was published in 2005. Since then, we have seen even more growth in information
technologies that have contributed to global collaborations. We will discuss current and future trends in
chapter 13.
The Global Firm
The new era of globalization allows any business to become international. By accessing this new platform
of technologies, Castells’s vision of working as a unit in real time on a planetary scale can be a reality.
Some of the advantages of this include the following:
• The ability to locate expertise and labor around the world. Instead of drawing employees from
their local area, organizations can now hire people from the global labor pool. This also allows
organizations to pay a lower labor cost for the same work based on the prevailing wage in
different countries.
• The ability to operate 24 hours a day. With employees in different time zones all around the
world, an organization can literally operate around the clock, handing off work on projects from
one part of the world to another. Businesses can also keep their digital storefront (their website)
open all the time.
• A larger market for their products. Once a product is being sold online, it is available for purchase
from a worldwide consumer base. Even if a company’s products do not appeal beyond its own
country’s borders, being online has also made the product more visible to consumers within that
country.
In order to fully take advantage of these new capabilities, companies need to understand that there are also
challenges in dealing with employees and customers from different cultures. Some of these challenges include:
• Infrastructure differences. Each country has its own infrastructure, much of which is not of the
same quality as the US infrastructure (average speed 4.60 Mbps). For every South Korea (16 Mbps
average speed) there is an Egypt (0.83 Mbps) or an India (0.82 Mbps). A business cannot depend
on every country it deals with having the same Internet speeds. See the sidebar called “How Does
My Internet Speed Compare?”
• Labor laws and regulations. Different countries (even different states in the United States) have
different laws and regulations. A company that wants to hire employees from other countries must
understand the different regulations and concerns.
• Legal restrictions. Many countries have restrictions on what can be sold or how a product can be
advertised. It is important for a business to understand what is allowed. For example, in Germany,
it is illegal to sell anything Nazi related; in China, it is illegal to put anything sexually suggestive
online.
• Language, customs, and preferences. Every country has its own (or several) unique culture(s),
which a business must consider when trying to market a product there. Additionally, different
countries have different preferences. For example, in some parts of the world, people prefer to eat
their french fries with mayonnaise instead of ketchup; in other parts of the world, specific hand
gestures (such as the thumbs-up) are offensive.
• International shipping. Shipping products between countries in a timely manner can be
challenging. Inconsistent address formats, dishonest customs agents, and prohibitive shipping
costs are all factors that must be considered when trying to deliver products internationally.
Because of these challenges, many businesses choose not to expand globally, either for labor or for
customers. Whether a business has its own website or relies on a third party, such as Amazon or eBay, the
question of whether or not to globalize must be carefully considered.
Sidebar: How Does My Internet Speed Compare?
How does your Internet speed compare with others in your state, country, or around the world? The chart
below shows how Internet speeds compare in different countries. You can find the full list of countries by going to this
article (http://royal.pingdom.com/2010/11/12/real-connection-speeds-for-internet-users-across-the-world/). You can also compare
the evolution of Internet speeds among countries by using this tool (http://www.akamai.com/stateoftheinternet/).
Figure: Internet speeds by country
So how does your own Internet speed compare? There are many online tools you can use to determine the
speed at which you are connected. One of the most trusted sites is speedtest.net, where you can test both
your download speeds and upload speeds.
The Digital Divide
As the Internet continues to make inroads across the world, it is also creating a separation between those
who have access to this global network and those who do not. This separation is called the “digital divide”
and is of great concern. An article in Crossroads puts it this way4:
Adopted by the ACM Council in 1992, the ACM Code of Ethics and Professional Conduct
focuses on issues involving the Digital Divide that could prevent certain categories of people
— those from low-income households, senior citizens, single-parent children, the undereducated,
4. Kibum Kim. 2005. Challenges in HCI: digital divide. Crossroads 12, 2 (December 2005), 2-2. DOI=10.1145/1144375.1144377
http://doi.acm.org/10.1145/1144375.1144377
minorities, and residents of rural areas — from receiving adequate access to the wide variety of
resources offered by computer technology. This Code of Ethics positions the use of computers as a
fundamental ethical consideration: “In a fair society, all individuals would have equal opportunity
to participate in, or benefit from, the use of computer resources regardless of race, sex, religion,
age, disability, national origin, or other similar factors.” This article summarizes the digital divide
in its various forms, and analyzes reasons for the growing inequality in people’s access to Internet
services. It also describes how society can bridge the digital divide: the serious social gap between
information “haves” and “have-nots.”
The digital divide can occur between countries, regions, or even neighborhoods. In many US cities, there
are pockets with little or no Internet access, while just a few miles away high-speed broadband is common.
Solutions to the digital divide have had mixed success over the years. Many times, just providing
Internet access and/or computing devices is not enough to bring true Internet access to a country, region, or
neighborhood.
One Laptop per Child
One attempt to repair the digital divide was the One Laptop per Child effort. As stated on the organization’s
website, “The mission of One Laptop per Child (OLPC) is to empower the children of developing countries
to learn by providing one connected laptop to every school-age child. In order to accomplish our goal, we
need people who believe in what we’re doing and want to help make education for the world’s children a
priority, not a privilege.”5 Announced to great fanfare in 2005 by Nicholas Negroponte, the OLPC project
seemed destined for success.
The centerpiece of the project was the laptop itself: an
inexpensive computer designed to withstand a lot of
punishment. It utilized a revolutionary “mesh” network,
allowing the laptops to act as repeaters, extending a Wi-Fi
network far beyond their normal range. They also used
minimal power, making them practical for remote areas
with limited access to the electrical grid.
Unfortunately, the OLPC project failed to live up to
expectations, running into many of the problems related
to globalization discussed above: different cultures, corruption, and
competition. In an article that examined the success and failures of
OLPC, the authors state, “Expecting a laptop to cause such a
revolutionary change showed a degree of naivete, even for an
organization with the best of intentions and the smartest of people.”6
Today, OLPC is evolving their methods and their technology, trying to
deliver an OLPC tablet computer.
5. http://laptop.org/en/vision/mission/
6. Kenneth L. Kraemer, Jason Dedrick, and Prakul Sharma. “One Laptop Per Child: Vision vs. Reality.” Communications of the ACM, Vol. 52, No. 6, pp. 66-73.
A New Understanding of the Digital Divide
In 2006, web-usability consultant Jakob Nielsen wrote an article that got to the heart of our understanding
of this problem. In his article, he breaks the digital divide up into three stages: the economic divide, the
usability divide, and the empowerment divide7.
What is usually called the digital divide is, in Nielsen’s terms, the economic divide: the idea that some people can
afford to have a computer and Internet access while others cannot. Because of Moore’s Law (see chapter
2), the price of hardware has continued to drop and, at this point, we can now access digital technologies, such as
smartphones, for very little. This fact, Nielsen asserts, means that for all intents and purposes, the economic divide is a moot point
and we should not focus our resources on solving it.
The usability divide is concerned with the fact that “technology remains so complicated that many
people couldn’t use a computer even if they got one for free.” And even for those who can use a computer,
accessing all the benefits of having one is beyond their understanding. Included in this group are those with
low literacy and seniors. According to Nielsen, we know how to help these users, but we are not doing it
because there is little profit in doing so.
The empowerment divide is the most difficult to solve. It is concerned with how we use technology
to empower ourselves. Very few users truly understand the power that digital technologies can give them. In his article,
Nielsen explains that his (and others’) research has shown that very few users contribute content to the Internet, use advanced search,
or can even distinguish paid search ads from organic search results. Many people will limit what they can do online by accepting
the basic, default settings of their computer and not work to understand how they can truly be empowered.
Understanding the digital divide using these three stages provides a more nuanced view of how we can
work to alleviate it. While efforts such as One Laptop per Child are an excellent start, more work needs to
be done to address the second and third stages of the digital divide for a more holistic solution.
Sidebar: Using Gaming to Bridge the Digital Divide
Paul Kim, the Assistant Dean and Chief Technology Officer of the Stanford Graduate School of Education,
designed a project to address the digital divide for children in developing countries. 8 In their project, the
researchers wanted to understand whether children can adopt and teach themselves mobile learning technology
without help from teachers or other adults, and what processes and factors are involved in this phenomenon.
The researchers developed a mobile device called TeacherMate, which contained a game designed to help
children learn math. The unique part of this research was that the researchers interacted directly with the
children; they did not channel the mobile devices through the teachers or the schools. Another important
factor to consider: in order to understand the context of the children’s educational environment, the researchers began the
project by working with parents and local nonprofits six months before their visit. While the results of this research are too detailed
to go into here, it can be said that the researchers found that children can, indeed, adopt and teach themselves mobile learning
technologies.
What makes this research so interesting when thinking about the digital divide is that the researchers
found that, in order to be effective, they had to customize their technology and tailor their implementation
to the specific group they were trying to reach. One of their conclusions stated the following:
7. http://www.nngroup.com/articles/digital-divide-the-three-stages/
8. Kim, P., Buckner, E., Makany, T., & Kim, H. (2011). A comparative analysis of a game-based mobile learning model in low-
socioeconomic communities of India. International Journal of Educational Development. doi:10.1016/j.ijedudev.2011.05.008.
Considering the rapid advancement of technology today, mobile learning options for future
projects will only increase. Consequently, researchers must continue to investigate their impact;
we believe there is a specific need for more in-depth studies on ICT [information and
communication technology] design variations to meet different challenges of different localities.
To read more about Dr. Kim’s project, locate the paper referenced in this sidebar.
Summary
Information technology has driven change on a global scale. As documented by Castells and Friedman,
technology has given us the ability to integrate with people all over the world using digital tools. These tools
have allowed businesses to broaden their labor pools, their markets, and even their operating hours. But
they have also brought many new complications for businesses, which now must understand regulations,
preferences, and cultures from many different nations. This new globalization has also exacerbated the
digital divide. Nielsen has suggested that the digital divide consists of three stages (economic, usability,
and empowerment), of which the economic stage is virtually solved.
Study Questions
1. What does the term globalization mean?
2. How does Friedman define the three eras of globalization?
3. Which technologies have had the biggest effect on globalization?
4. What are some of the advantages brought about by globalization?
5. What are the challenges of globalization?
6. What does the term digital divide mean?
7. What are Jakob Nielsen’s three stages of the digital divide?
8. What was one of the key points of The Rise of the Network Society?
9. Which country has the highest average Internet speed? How does your country compare?
10. What is the OLPC project? Has it been successful?
Exercises
1. Compare the concept of Friedman’s “Globalization 3.0” with Nielsen’s empowerment stage of
the digital divide.
2. Do some original research to determine some of the regulations that a US company may have
to consider before doing business in one of the following countries: China, Germany, Saudi
Arabia, Turkey.
3. Go to speedtest.net to determine your Internet speed. Compare your speed at home to the
Internet speed at two other locations, such as your school, place of employment, or local coffee
shop. Write up a one-page summary that compares these locations.
4. Give one example of the digital divide and describe what you would do to address it.
5. How did the research conducted by Paul Kim address the three levels of the digital divide?
Chapter 12: The Ethical and Legal Implications of Information Systems
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• describe what the term information systems ethics means;
• explain what a code of ethics is and describe the advantages and disadvantages;
• define the term intellectual property and explain the protections provided by copyright, patent,
and trademark; and
• describe the challenges that information technology brings to individual privacy.
Introduction
Information systems have had an impact far beyond the world of business. New technologies create new
situations that we have never dealt with before. How do we handle the new capabilities that these devices
empower us with? What new laws are going to be needed to protect us from ourselves? This chapter will
kick off with a discussion of the impact of information systems on how we behave (ethics). This will be
followed with the new legal structures being put in place, with a focus on intellectual property and privacy.
Information Systems Ethics
The term ethics is defined as “a set of moral principles” or “the principles of conduct governing an
individual or a group.”1 Since the dawn of civilization, the study of ethics and their impact has fascinated
mankind. But what do ethics have to do with information systems?
The introduction of new technology can have a profound effect on human behavior. New technologies
give us capabilities that we did not have before, which in turn create environments and situations that have
not been specifically addressed in ethical terms. Those who master new technologies gain new power;
those who cannot or do not master them may lose power. In 1913, Henry Ford implemented the first
moving assembly line to create his Model T cars. While this was a great step forward technologically
(and economically), the assembly line reduced the value of human beings in the production process. The
development of the atomic bomb concentrated unimaginable power in the hands of one government, which
then had to wrestle with the decision to use it. Today’s digital technologies have created new categories of
ethical dilemmas.
For example, the ability to anonymously make perfect copies of digital music has tempted many music
fans to download copyrighted music for their own use without making payment to the music’s owner. Many
of those who would never have walked into a music store and stolen a CD find themselves with dozens of
illegally downloaded albums.
1. http://www.merriam-webster.com/dictionary/ethics
Digital technologies have given us the ability to aggregate information from multiple sources to create profiles
of people. What would have taken weeks of work in the past can now be done in seconds, allowing private organizations
and governments to know more about individuals than at any time in history. This information has value, but also chips away at the
privacy of consumers and citizens.
Code of Ethics
One method for navigating new ethical waters is a code of ethics. A code of ethics is a document that
outlines a set of acceptable behaviors for a professional or social group; generally, it is agreed to by
all members of the group. The document details different actions that are considered appropriate and
inappropriate.
A good example of a code of ethics is the Code of Ethics and Professional Conduct of the Association
for Computing Machinery,2 an organization of computing professionals that includes academics,
researchers, and practitioners. Here is a quote from the preamble:
Commitment to ethical professional conduct is expected of every member (voting members,
associate members, and student members) of the Association for Computing Machinery (ACM).
This Code, consisting of 24 imperatives formulated as statements of personal responsibility,
identifies the elements of such a commitment. It contains many, but not all, issues professionals
are likely to face. Section 1 outlines fundamental ethical considerations, while Section 2 addresses
additional, more specific considerations of professional conduct. Statements in Section 3 pertain
more specifically to individuals who have a leadership role, whether in the workplace or in a
volunteer capacity such as with organizations like ACM. Principles involving compliance with
this Code are given in Section 4.
In the ACM’s code, you will find many straightforward ethical instructions, such as the admonition to be
honest and trustworthy. But because this is also an organization of professionals that focuses on computing,
there are more specific admonitions that relate directly to information technology:
• No one should enter or use another’s computer system, software, or data files without permission.
One must always have appropriate approval before using system resources, including
communication ports, file space, other system peripherals, and computer time.
• Designing or implementing systems that deliberately or inadvertently demean individuals or
groups is ethically unacceptable.
• Organizational leaders are responsible for ensuring that computer systems enhance, not degrade,
the quality of working life. When implementing a computer system, organizations must consider
the personal and professional development, physical safety, and human dignity of all workers.
Appropriate human-computer ergonomic standards should be considered in system design and in
the workplace.
One of the major advantages of creating a code of ethics is that it clarifies the acceptable standards of
behavior for a professional group. The varied backgrounds and experiences of the members of a group
2. ACM Code of Ethics and Professional Conduct Adopted by ACM Council 10/16/92.
lead to a variety of ideas regarding what is acceptable behavior. While many of the guidelines may seem obvious, stating them explicitly gives everyone in the group a clear and consistent standard to follow.
Having a code of ethics can also have some drawbacks. First of all, a code of ethics does not have legal
authority; in other words, breaking a code of ethics is not a crime in itself. So what happens if someone
violates one of the guidelines? Many codes of ethics include a section that describes how such situations
will be handled. In many cases, repeated violations of the code result in expulsion from the group.
In the case of ACM: “Adherence of professionals to a code of ethics is largely a voluntary matter.
However, if a member does not follow this code by engaging in gross misconduct, membership in
ACM may be terminated.” Expulsion from ACM may not have much of an impact on many individuals,
since membership in ACM is usually not a requirement for employment. However, expulsion from other
organizations, such as a state bar organization or medical board, could carry a huge impact.
Another possible disadvantage of a code of ethics is that there is always a chance that important
issues will arise that are not specifically addressed in the code. Technology is quickly changing, and a code
of ethics might not be updated often enough to keep up with all of the changes. A good code of ethics,
however, is written in a broad enough fashion that it can address the ethical issues of potential changes to
technology while the organization behind the code makes revisions.
Finally, a code of ethics can also be a disadvantage in that it may not entirely reflect the ethics
or morals of every member of the group. Organizations with a diverse membership may have internal
conflicts as to what is acceptable behavior. For example, there may be a difference of opinion on the
consumption of alcoholic beverages at company events. In such cases, the organization must make a choice
about the importance of addressing a specific behavior in the code.
Sidebar: Acceptable Use Policies
Many organizations that provide technology services to a group of constituents or the public require
agreement to an acceptable use policy (AUP) before those services can be accessed. Similar to a code
of ethics, this policy outlines what is allowed and what is not allowed while someone is using the
organization’s services. An everyday example of this is the terms of service that must be agreed to before
using the public Wi-Fi at Starbucks, McDonald's, or even a university. An example is the acceptable use policy from Virginia Tech (http://www.vt.edu/about/acceptable-use.html).
Just as with a code of ethics, these acceptable use policies specify what is allowed and what is not
allowed. Again, while some of the items listed are obvious to most, others are not so obvious:
• “Borrowing” someone else’s login ID and password is prohibited.
• Using the provided access for commercial purposes, such as hosting your own business website,
is not allowed.
• Sending out unsolicited email to a large group of people is prohibited.
Also as with codes of ethics, violations of these policies have various consequences. In most cases, such
as with Wi-Fi, violating the acceptable use policy will mean that you will lose your access to the resource.
While losing access to Wi-Fi at Starbucks may not have a lasting impact, a university student getting
banned from the university’s Wi-Fi (or possibly all network resources) could have a large impact.
Intellectual Property
One of the domains that have been deeply impacted by digital technologies is the domain of intellectual
property. Digital technologies have driven a rise in new intellectual property claims and made it much more
difficult to defend intellectual property.
Intellectual property is defined as “property (as an idea, invention, or process) that derives from the
work of the mind or intellect.”3 This could include creations such as song lyrics, a computer program, a
new type of toaster, or even a sculpture.
Practically speaking, it is very difficult to protect an idea. Instead, intellectual property laws are written
to protect the tangible results of an idea. In other words, just coming up with a song in your head is not
protected, but if you write it down it can be protected.
Protection of intellectual property is important because it gives people an incentive to be creative.
Innovators with great ideas will be more likely to pursue those ideas if they have a clear understanding
of how they will benefit. In the US Constitution, Article I, Section 8, the authors saw fit to recognize the
importance of protecting creative works:
Congress shall have the power . . . To promote the Progress of Science and useful Arts, by
securing for limited Times to Authors and Inventors the exclusive Right to their respective
Writings and Discoveries.
An important point to note here is the “limited time” qualification. While protecting intellectual property is
important because of the incentives it provides, it is also necessary to limit the amount of benefit that can
be received and allow the results of ideas to become part of the public domain.
Outside of the US, intellectual property protections vary. You can find out more about a specific
country's intellectual property laws by visiting the World Intellectual Property Organization (http://www.wipo.int/).
In the following sections we will review three of the best-known intellectual property protections:
copyright, patent, and trademark.
Copyright
Copyright is the protection given to songs, computer programs, books, and other creative works; any work
that has an “author” can be copyrighted. Under the terms of copyright, the author of a work controls what
can be done with the work, including:
• Who can make copies of the work.
• Who can make derivative works from the original work.
• Who can perform the work publicly.
• Who can display the work publicly.
• Who can distribute the work.
3. http://www.merriam-webster.com/dictionary/intellectual%20property
Many times, a work is not owned by an individual but is instead owned by a publisher with whom the
original author has an agreement. In return for the rights to the work, the publisher will market and
distribute the work and then pay the original author a portion of the proceeds.
Copyright protection lasts for the life of the original author plus seventy years. In the case of a
copyrighted work owned by a publisher or another third party, the protection lasts for ninety-five years
from the original creation date. For works created before 1978, the protections vary slightly. You can see
the full details on copyright protections by reviewing the Copyright Basics document (http://www.copyright.gov/circs/circ01) available at the US Copyright Office's website.
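For readers comfortable with a little code, the term arithmetic can be made concrete. The following Python sketch applies only the simplified rules just described (life of the author plus seventy years, or ninety-five years for a work owned by a third party); real copyright terms depend on publication date, registration, and other factors, and works created before 1978 are not handled.

    def copyright_expires(author_death_year=None, creation_year=None, work_for_hire=False):
        """Return the last calendar year of protection under the simplified rules above.
        The work enters the public domain at the start of the following year."""
        if work_for_hire:
            return creation_year + 95       # ninety-five years from creation
        return author_death_year + 70       # life of the author plus seventy years

    print(copyright_expires(author_death_year=2000))                  # 2070
    print(copyright_expires(creation_year=1990, work_for_hire=True))  # 2085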
Obtaining Copyright Protection
In the United States, a copyright is obtained by the simple act of creating the original work. In other words,
when an author writes down that song, makes that film, or designs that program, he or she automatically
has the copyright. However, for a work that will be used commercially, it is advisable to register for a
copyright with the US Copyright Office. A registered copyright is needed in order to bring legal action
against someone who has used a work without permission.
First Sale Doctrine
If an artist creates a painting and sells it to a collector who then, for whatever reason, proceeds to destroy
it, does the original artist have any recourse? What if the collector, instead of destroying it, begins making
copies of it and sells them? Is this allowed? The first sale doctrine is a part of copyright law that addresses
this, as shown below4:
The first sale doctrine, codified at 17 U.S.C. § 109, provides that an individual who knowingly
purchases a copy of a copyrighted work from the copyright holder receives the right to sell,
display or otherwise dispose of that particular copy, notwithstanding the interests of the copyright
owner.
So, in our examples, the copyright owner has no recourse if the collector destroys her artwork. But the
collector does not have the right to make copies of the artwork.
Fair Use
Another important provision within copyright law is that of fair use. Fair use is a limitation on copyright
law that allows for the use of protected works without prior authorization in specific cases. For example,
if a teacher wanted to discuss a current event in her class, she could pass out copies of a copyrighted news
story to her students without first getting permission. Fair use is also what allows a student to quote a small
portion of a copyrighted work in a research paper.
Unfortunately, the specific guidelines for what is considered fair use and what constitutes copyright
violation are not well defined. Fair use is a well-known and respected concept and will only be challenged
when copyright holders feel that the integrity or market value of their work is being threatened. The
following four factors are considered when determining if something constitutes fair use:5
1. The purpose and character of the use, including whether such use is of commercial nature or is
for nonprofit educational purposes;
4. http://www.justice.gov/usao/eousa/foia_reading_room/usam/title9/crm01854.htm
5. http://www.copyright.gov/fls/fl102.html
2. The nature of the copyrighted work;
3. The amount and substantiality of the portion used in relation to the copyrighted work as a
whole;
4. The effect of the use upon the potential market for, or value of, the copyrighted work.
If you are ever considering using a copyrighted work as part of something you are creating, you may be
able to do so under fair use. However, it is always best to check with the copyright owner to be sure you
are staying within your rights and not infringing upon theirs.
Sidebar: The History of Copyright Law
As noted above, current copyright law grants copyright protection for seventy years after the author’s death,
or ninety-five years from the date of creation for a work created for hire. But it was not always this way.
The first US copyright law, which only protected books, maps, and charts, provided protection for
only 14 years with a renewable term of 14 years. Over time, copyright law was revised to grant protections
to other forms of creative expression, such as photography and motion pictures. Congress also saw fit
to extend the length of the protections, as shown in the chart below. Today, copyright has become big
business, with many companies relying on copyright-protected works for their income.
Many now think that the protections last too long. The Sonny Bono Copyright Term Extension Act has
been nicknamed the “Mickey Mouse Protection Act,” as it was enacted just in time to protect the copyright
on the Walt Disney Company’s Mickey Mouse character. Because of this term extension, many works from
the 1920s and 1930s that would have been available now in the public domain are not available.
Evolution of copyright term length. (CC-BY-SA: Tom Bell)
The Digital Millennium Copyright Act
As digital technologies have changed what it means to create, copy, and distribute media, a policy vacuum
has been created. In 1998, the US Congress passed the Digital Millennium Copyright Act (DMCA), which
extended copyright law to take into consideration digital technologies. Two of the best-known provisions
from the DMCA are the anti-circumvention provision and the “safe harbor” provision.
• The anti-circumvention provision makes it illegal to create technology to circumvent technology
that has been put in place to protect a copyrighted work. This provision includes not just the
creation of the technology but also the publishing of information that describes how to do it.
While this provision does allow for some exceptions, it has become quite controversial and has
led to a movement to have it modified (see http://fixthedmca.org/).
• The “safe harbor” provision limits the liability of online service providers when someone using
their services commits copyright infringement. This is the provision that allows YouTube, for
example, not to be held liable when someone posts a clip from a copyrighted movie. The
provision does require the online service provider to take action when they are notified of the
violation (a "takedown" notice). For an example of how takedown works, see YouTube's copyright infringement notification process (http://www.youtube.com/yt/copyright/copyright-complaint.html).
Many think that the DMCA goes too far and ends up limiting our freedom of speech. The Electronic
Frontier Foundation (EFF) is at the forefront of this battle. For example, in discussing the anti-
circumvention provision, the EFF states:
Yet the DMCA has become a serious threat that jeopardizes fair use, impedes competition and
innovation, chills free expression and scientific research, and interferes with computer intrusion
laws. If you circumvent DRM [digital rights management] locks for non-infringing fair uses or
create the tools to do so you might be on the receiving end of a lawsuit.
Sidebar: Creative Commons
In chapter 2, we learned about open-source software. Open-source software has few or no copyright
restrictions; the creators of the software publish their code and make their software available for others to
use and distribute for free. This is great for software, but what about other forms of copyrighted works? If
an artist or writer wants to make their works available, how can they go about doing so while still protecting
the integrity of their work? Creative Commons is the solution to this problem.
Creative Commons is a nonprofit organization that provides legal tools for artists and authors. The
tools offered make it simple to license artistic or literary work for others to use or distribute in a manner
consistent with the author's intentions. Creative Commons licenses are indicated with the CC symbol. It
is important to note that Creative Commons and public domain are not the same. When something is in
the public domain, it has absolutely no restrictions on its use or distribution. Works whose copyrights have
expired, for example, are in the public domain.
By using a Creative Commons license, authors can control the use of their work while still making
it widely accessible. By attaching a Creative Commons license to their work, a legally binding license is
created. Here are some examples of these licenses:
• CC-BY: This is the least restrictive license. It lets others distribute and build upon the work, even
commercially, as long as they give the author credit for the original work.
• CC-BY-SA: This license restricts the distribution of the work via the “share-alike” clause. This
means that others can freely distribute and build upon the work, but they must give credit to the
original author and they must share using the same Creative Commons license.
• CC-BY-NC: This license is the same as CC-BY but adds the restriction that no one can make
money with this work. NC stands for “non-commercial.”
• CC-BY-NC-ND: This license is the same as CC-BY-NC but also adds the ND restriction, which
means that no derivative works may be made from the original.
These are a few of the more common licenses that can be created using the tools that Creative Commons
makes available. For a full listing of the licenses and to learn much more about Creative Commons, visit
their web site.
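For illustration only, the licenses described above can be summarized as a small lookup table in Python. The permission flags are a simplification of the license terms, not legal text.

    # Simplified summary of the Creative Commons licenses described above.
    # Every CC-BY variant requires attribution; "SA" adds a share-alike condition.
    CC_LICENSES = {
        "CC-BY":       {"commercial_use": True,  "derivatives": True,  "share_alike": False},
        "CC-BY-SA":    {"commercial_use": True,  "derivatives": True,  "share_alike": True},
        "CC-BY-NC":    {"commercial_use": False, "derivatives": True,  "share_alike": False},
        "CC-BY-NC-ND": {"commercial_use": False, "derivatives": False, "share_alike": False},
    }

    def allows_commercial_remix(license_code):
        terms = CC_LICENSES[license_code]
        return terms["commercial_use"] and terms["derivatives"]

    print(allows_commercial_remix("CC-BY"))        # True
    print(allows_commercial_remix("CC-BY-NC-ND"))  # False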
Patent
Another important form of intellectual property protection is the patent. A patent creates protection for
someone who invents a new product or process. The definition of invention is quite broad and covers many
different fields. Here are some examples of items receiving patents:
• circuit designs in semiconductors;
• prescription drug formulas;
• firearms;
• locks;
• plumbing;
• engines;
• coating processes; and
• business processes.
Once a patent is granted, it provides the inventor with protection from others infringing on his or her
patent. A patent holder has the right to “exclude others from making, using, offering for sale, or selling the
invention throughout the United States or importing the invention into the United States for a limited time
in exchange for public disclosure of the invention when the patent is granted.”6
As with copyright, patent protection lasts for a limited period of time before the invention or process
enters the public domain. In the US, a patent lasts twenty years. This is why generic drugs are available to
replace brand-name drugs after twenty years.
6. From the US Patent and Trademark Office, “What Is A Patent?” http://www.uspto.gov/patents/
Obtaining Patent Protection
Unlike copyright, a patent is not automatically granted when someone has an interesting idea and writes
it down. In most countries, a patent application must be submitted to a government patent office. A patent
will only be granted if the invention or process being submitted meets certain conditions:
• It must be original. The invention being submitted must not have been submitted before.
• It must be non-obvious. You cannot patent something that anyone could think of. For example,
you could not put a pencil on a chair and try to get a patent for a pencil-holding chair.
• It must be useful. The invention being submitted must serve some purpose or have some use that
would be desired.
The job of the patent office is to review patent applications to ensure that the item being submitted
meets these requirements. This is not an easy job: in 2012, the US Patent Office received 576,763 patent
applications and granted 276,788 patents. The current backlog for a patent approval is 18.1 months. Over
the past fifty years, the number of patent applications has risen from just 100,000 a year to almost 600,000;
digital technologies are driving much of this innovation.
Increase in patent applications, 1963–2012 (Source: US Patent and Trademark Office)
Sidebar: What Is a Patent Troll?
The advent of digital technologies has led to a large increase in patent filings and therefore a large number
of patents being granted. Once a patent is granted, it is up to the owner of the patent to enforce it; if someone
is found to be using the invention without permission, the patent holder has the right to sue to force that
person to stop and to collect damages.
The rise in patents has led to a new form of profiteering called patent trolling. A patent troll is a person
or organization who gains the rights to a patent but does not actually make the invention that the patent
protects. Instead, the patent troll searches for those who are illegally using the invention in some way and
sues them. In many cases, the infringement being alleged is questionable at best. For example, companies
have been sued for using Wi-Fi or for scanning documents, technologies that have been on the market for
many years.
Recently, the US government has begun taking action against patent trolls. Several pieces of
legislation are working their way through Congress that will, if enacted, limit the ability of patent trolls
to threaten innovation. You can learn a lot more about patent trolls by listening to a detailed investigation
conducted by the radio program This American Life (http://www.thisamericanlife.org/radio-archives/episode/441/when-patents-attack).
Trademark
A trademark is a word, phrase, logo, shape, or sound that identifies a source of goods or services. For
example, the Nike “Swoosh,” the Facebook “f”, and Apple’s apple (with a bite taken out of it) are
all trademarked. The concept behind trademarks is to protect the consumer. Imagine going to the local
shopping center to purchase a specific item from a specific store and finding that there are several stores all
with the same name!
Two types of trademarks exist – a common-law trademark and a registered trademark. As with
copyright, an organization will automatically receive a trademark if a word, phrase, or logo is being used in
the normal course of business (subject to some restrictions, discussed below). A common-law trademark is
designated by placing “TM” next to the trademark. A registered trademark is one that has been examined,
approved, and registered with the trademark office, such as the Patent and Trademark Office in the US. A
registered trademark has the circle-R (®) placed next to the trademark.
While most any word, phrase, logo, shape, or sound can be trademarked, there are a few limitations.
A trademark will not hold up legally if it meets one or more of the following conditions:
1. The trademark is likely to cause confusion with a mark in a registration or prior application.
2. The trademark is merely descriptive for the goods/services. For example, trying to register the
trademark “blue” for a blue product you are selling will not pass muster.
3. The trademark is a geographic term.
4. The trademark is a surname. You will not be allowed to trademark “Smith’s Bookstore.”
5. The trademark is ornamental as applied to the goods. For example, a repeating flower pattern that
is a design on a plate cannot be trademarked.
As long as an organization uses its trademark and defends it against infringement, the protection afforded
by it does not expire. Because of this, many organizations defend their trademark against other companies
whose branding even only slightly copies their trademark. For example, Chick-fil-A has trademarked the
phrase “Eat Mor Chikin” and has vigorously defended it against a small business using the slogan “Eat
More Kale.” Coca-Cola has trademarked the contour shape of its bottle and will bring legal action against
any company using a bottle design similar to theirs. Examples of trademarks that have been diluted and have now lost their protection in the US include "aspirin" (originally trademarked by Bayer), "escalator" (originally trademarked by Otis), and "yo-yo" (originally trademarked by Duncan).
Information Systems and Intellectual Property
The rise of information systems has forced us to rethink how we deal with intellectual property. From the
increase in patent applications swamping the government’s patent office to the new laws that must be put
in place to enforce copyright protection, digital technologies have impacted our behavior.
Privacy
The term privacy has many definitions, but for our purposes, privacy will mean the ability to control
information about oneself. Our ability to maintain our privacy has eroded substantially in the past decades,
due to information systems.
Personally Identifiable Information
Information about a person that can be used to uniquely establish that person's identity is called personally
identifiable information, or PII. This is a broad category that includes information such as:
• name;
• social security number;
• date of birth;
• place of birth;
• mother's maiden name;
• biometric records (fingerprint, face, etc.);
• medical records;
• educational records;
• financial information; and
• employment information.
Organizations that collect PII are responsible for protecting it. The Department of Commerce recommends that
“organizations minimize the use, collection, and retention of PII to what is strictly necessary to accomplish
their business purpose and mission.” They go on to state that “the likelihood of harm caused by a breach
involving PII is greatly reduced if an organization minimizes the amount of PII it uses, collects, and
stores.”7 Organizations that do not protect PII can face penalties, lawsuits, and loss of business. In the US,
7. Guide to Protecting the Confidentiality of Personally Identifiable Information (PII). National Institute of Standards and Technology, US
Department of Commerce Special Publication 800-122. http://csrc.nist.gov/publications/nistpubs/800-122/sp800-122
most states now have laws in place requiring organizations that have had security breaches related to PII to
notify potential victims, as does the European Union.
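What does "minimizing the PII you use, collect, and store" look like in practice? Here is a minimal Python sketch; the record fields are hypothetical, and a real system would also need encryption, access controls, and retention policies.

    # Keep only the fields required for the business purpose; drop the rest before storage.
    FIELDS_NEEDED = {"customer_id", "email", "state"}

    def minimize_pii(record):
        """Strip every field that is not needed for the business purpose."""
        return {k: v for k, v in record.items() if k in FIELDS_NEEDED}

    raw = {
        "customer_id": 1001,
        "email": "pat@example.com",
        "state": "CA",
        "ssn": "123-45-6789",           # not needed: never stored
        "date_of_birth": "1990-01-01",  # not needed: never stored
    }

    print(minimize_pii(raw))
    # {'customer_id': 1001, 'email': 'pat@example.com', 'state': 'CA'}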
Just because companies are required to protect your information does not mean they are restricted
from sharing it. In the US, companies can share your information without your explicit consent (see sidebar
below), though not all do so. Companies that collect PII are urged by the FTC to create a privacy policy and post
it on their website. The state of California requires a privacy policy for any website that does business with a resident of the state
(see http://www.privacy.ca.gov/lawenforcement/laws.htm).
While the privacy laws in the US seek to balance consumer protection with promoting commerce, in
the European Union privacy is considered a fundamental right that outweighs the interests of commerce.
This has led to much stricter privacy protection in the EU, but also makes commerce more difficult between
the US and the EU.
Non-Obvious Relationship Awareness
Digital technologies have given us many new capabilities that simplify and expedite the collection of
personal information. Every time we come into contact with digital technologies, information about us
is being made available. From our location to our web-surfing habits, our criminal record to our credit
report, we are constantly being monitored. This information can then be aggregated to create profiles of
each and every one of us. While much of the information collected was available in the past, collecting
it and combining it took time and effort. Today, detailed information about us is available for purchase
from different companies. Even information not categorized as PII can be aggregated in such a way that an
individual can be identified.
This process of collecting large quantities of a variety of information and then combining it to create
profiles of individuals is known as non-obvious relationship awareness, or NORA. First commercialized by
big casinos looking to find cheaters, NORA is used by both government agencies and private organizations,
and it is big business.
Non-obvious relationship awareness (NORA)
In some settings, NORA can bring many benefits, such as in law enforcement. By being able to
identify potential criminals more quickly, crimes can be solved more quickly or even prevented before they
happen. But these advantages come at a price: our privacy.
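To see why aggregation is so powerful, consider the following toy Python sketch. It links records from two unrelated sources on shared quasi-identifiers (name, ZIP code, birth year), which is the basic move behind NORA. All of the data and field names are invented.

    # Two unrelated data sources: a store loyalty program and a property-tax roll.
    loyalty = [
        {"name": "A. Smith", "zip": "46204", "birth_year": 1980, "favorite_store": "Downtown"},
    ]
    tax_roll = [
        {"owner": "A. Smith", "zip": "46204", "birth_year": 1980, "address": "12 Elm St"},
    ]

    def link(records_a, records_b):
        """Merge records that share the same quasi-identifiers into one profile."""
        profiles = []
        for a in records_a:
            for b in records_b:
                if (a["name"], a["zip"], a["birth_year"]) == (b["owner"], b["zip"], b["birth_year"]):
                    profiles.append({**a, **b})  # one profile now combines both sources
        return profiles

    print(link(loyalty, tax_roll))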
Restrictions on Record Collecting
In the US, the government has strict guidelines on how much information can be collected about its citizens.
Certain classes of information have been restricted by laws over time, and the advent of digital tools has
made these restrictions more important than ever.
Children’s Online Privacy Protection Act
Websites that are collecting information from children under the age of thirteen are required to comply with
the Children’s Online Privacy Protection Act (COPPA), which is enforced by the Federal Trade Commission (FTC). To comply
with COPPA, organizations must make a good-faith effort to determine the age of those accessing their websites and, if users are
under thirteen years old, must obtain parental consent before collecting any information.
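A minimal Python sketch of the kind of age gate this implies is shown below. The under-thirteen cutoff comes from COPPA; the function names and logic are hypothetical and should not be read as a compliance recipe.

    from datetime import date

    COPPA_AGE = 13  # COPPA applies to children under thirteen

    def age_on(birth_date, today=None):
        today = today or date.today()
        return today.year - birth_date.year - ((today.month, today.day) < (birth_date.month, birth_date.day))

    def may_collect_data(birth_date, parental_consent=False):
        """Collect data only if the user is thirteen or older, or a parent has consented."""
        if age_on(birth_date) >= COPPA_AGE:
            return True
        return parental_consent

    print(may_collect_data(date(2015, 6, 1)))                         # False while under 13 and no consent
    print(may_collect_data(date(2015, 6, 1), parental_consent=True))  # True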
Family Educational Rights and Privacy Act
The Family Educational Rights and Privacy Act (FERPA) is a US law that protects the privacy of
student education records. In brief, this law specifies that parents have a right to their child’s educational
information until the child reaches either the age of eighteen or begins attending school beyond the high
school level. At that point, control of the information is given to the child. While this law is not specifically
about the digital collection of information on the Internet, the educational institutions that are collecting
student information are at a higher risk for disclosing it improperly because of digital technologies.
Health Insurance Portability and Accountability Act
The Health Insurance Portability and Accountability Act of 1996 (HIPAA) is the law that specifically singles out
records related to health care as a special class of personally identifiable information. This law gives patients specific rights to
control their medical records, requires health care providers and others who maintain this information to get specific permission in
order to share it, and imposes penalties on the institutions that breach this trust. Since much of this information is now shared via
electronic medical records, the protection of those systems becomes paramount.
Sidebar: Do Not Track
When it comes to getting permission to share personal information, the US and the EU have different
approaches. In the US, the “opt-out” model is prevalent; in this model, the default agreement is that you
have agreed to share your information with the organization and must explicitly tell them that you do not
want your information shared. There are no laws prohibiting the sharing of your data (beyond some specific
categories of data, such as medical records). In the European Union, the “opt-in” model is required to be
the default. In this case, you must give your explicit permission before an organization can share your
information.
To combat this sharing of information, the Do Not Track initiative was created. As its creators
explain8:
Do Not Track is a technology and policy proposal that enables users to opt out of tracking by
websites they do not visit, including analytics services, advertising networks, and social platforms.
At present few of these third parties offer a reliable tracking opt out, and tools for blocking them
are neither user-friendly nor comprehensive. Much like the popular Do Not Call registry, Do Not
Track provides users with a single, simple, persistent choice to opt out of third-party web tracking.
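Technically, the Do Not Track preference is expressed as a simple HTTP request header, DNT: 1. The Python sketch below shows how a web application might honor it, assuming the incoming headers are available as a plain dictionary; real web frameworks expose headers in their own way.

    def should_track(request_headers):
        """Honor the Do Not Track preference: skip third-party tracking when DNT is '1'."""
        return request_headers.get("DNT") != "1"

    print(should_track({"DNT": "1"}))           # False: the user opted out of tracking
    print(should_track({"User-Agent": "..."}))  # True: no preference expressed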
Summary
The rapid changes in information technology in the past few decades have brought a broad array of new
capabilities and powers to governments, organizations, and individuals alike. These new capabilities have
required thoughtful analysis and the creation of new norms, regulations, and laws. In this chapter, we have
seen how the areas of intellectual property and privacy have been affected by these new capabilities and
how the regulatory environment has been changed to address them.
8. http://donottrack.us/
Study Questions
1. What does the term information systems ethics mean?
2. What is a code of ethics? What is one advantage and one disadvantage of a code of ethics?
3. What does the term intellectual property mean? Give an example.
4. What protections are provided by a copyright? How do you obtain one?
5. What is fair use?
6. What protections are provided by a patent? How do you obtain one?
7. What does a trademark protect? How do you obtain one?
8. What does the term personally identifiable information mean?
9. What protections are provided by HIPAA, COPPA, and FERPA?
10. How would you explain the concept of NORA?
Exercises
1. Provide one example of how information technology has created an ethical dilemma that would
not have existed before the advent of information technology.
2. Find an example of a code of ethics or acceptable use policy related to information technology
and highlight five points that you think are important.
3. Do some original research on the effort to combat patent trolls. Write a two-page paper that
discusses this legislation.
4. Give an example of how NORA could be used to identify an individual.
5. How are intellectual property protections different across the world? Pick two countries and do
some original research, then compare the patent and copyright protections offered in those
countries to those in the US. Write a two- to three-page paper describing the differences.
Chapter 13: Future Trends in Information Systems
David T. Bourgeois
Learning Objectives
Upon successful completion of this chapter, you will be able to:
• describe future trends in information systems.
Introduction
Information systems have evolved at a rapid pace ever since their introduction in the 1950s. Today, devices
that we can hold in one hand are more powerful than the computers used to land a man on the moon. The
Internet has made the entire world accessible to us, allowing us to communicate and collaborate with each
other like never before. In this chapter, we will examine current trends and look ahead to what is coming
next.
Global
The first trend to note is the continuing expansion of globalization. The use of the Internet is growing all
over the world, and with it the use of digital devices. The growth is coming from some unexpected places;
countries such as Indonesia and Iran are leading the way in Internet growth.
Global Internet growth, 2008–2012.
(Source: Internet World Stats)
Social
Social media growth is another trend that continues. Facebook now has over one billion users! In 2013,
80% of Facebook users were outside of the US and Canada.1 Countries where Facebook is growing rapidly
include Indonesia, Mexico, and the Philippines.2
Besides Facebook, other social media sites are also seeing tremendous growth. Over 70% of
YouTube’s users are outside the US, with the UK, India, Germany, Canada, France, South Korea, and
Russia leading the way.3 Pinterest gets over 50% of its users from outside the US, with over 9% from
India.4 Twitter now has over 230 million active users.5 Social media sites not based in the US are also
growing. China’s QQ instant-messaging service is the eighth most-visited site in the world.6
Personal
Ever since the advent of Web 2.0 and e-commerce, users of information systems have expected to be able
to modify their experiences to meet their personal tastes. From custom backgrounds on computer desktops
to unique ringtones on mobile phones, makers of digital devices provide the ability to personalize how we
use them. More recently, companies such as Netflix have begun assisting their users with personalizations
by making suggestions. In the future, we will begin seeing devices perfectly matched to our personal
preferences, based upon information collected about us in the past.
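As a toy illustration of how such suggestions can work, the Python sketch below recommends titles liked by users with similar tastes. This is a generic "people who liked what you liked" heuristic with made-up data, not any company's actual algorithm.

    # Each user's set of liked titles (made-up data).
    likes = {
        "alice": {"Movie A", "Movie B", "Movie C"},
        "bob":   {"Movie B", "Movie C", "Movie D"},
        "carol": {"Movie E"},
    }

    def recommend(user):
        """Suggest titles liked by users whose tastes overlap most with this user's."""
        mine = likes[user]
        scores = {}
        for other, theirs in likes.items():
            if other == user:
                continue
            overlap = len(mine & theirs)     # how similar the other user's tastes are
            for title in theirs - mine:      # titles the user has not seen yet
                scores[title] = scores.get(title, 0) + overlap
        return sorted(scores, key=scores.get, reverse=True)

    print(recommend("alice"))  # 'Movie D' ranks first because Bob's tastes overlap most with Alice's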
Mobile
Perhaps the most impactful trend in digital technologies in the last decade has been the advent of mobile
technologies. Beginning with the simple cellphone in the 1990s and evolving into the smartphones and
tablets of today, the growth of mobile has been overwhelming. Here are some key indicators of this trend:
• Mobile device sales. In 2011, smartphones began outselling personal computers.7
• Smartphone subscriber growth. The number of smartphone subscribers grew by 31% in 2013, with China leading the way at 354 million smartphone users.
• Internet access via mobile. In May of 2013, mobile accounted for 15% of all Internet traffic. In
China, 75% of Internet users used their smartphone to access it. Facebook reported that 68% of its
active users used their mobile platform to access the social network.
• The rise of tablets. While Apple defined the smartphone with the iPhone, the iPad sold more than
three times as many units in its first twelve months as the iPhone did in its first twelve months.
Tablet shipments now outpace notebook PCs and desktop PCs. The research firm IDC predicts
that 87% of all connected devices will be either smartphones or tablets by 2017.8
1. http://newsroom.fb.com/Key-Facts
2. http://www.socialbakers.com/blog/38-top-10-countries-on-facebook-in-the-last-six-months
3. http://newmediarockstars.com/2013/03/the-top-10-countries-in-youtube-viewership-outside-the-usa-infographic/
4. http://www.alexa.com/siteinfo/pinterest.com
5. https://about.twitter.com/company
6. http://www.alexa.com/siteinfo/qq.com
7. http://mashable.com/2012/02/03/smartphone-sales-overtake-pcs/
Google Glass. (CC-BY: Flickr, user Tedeytan)
Wearable
The average smartphone user looks at his or her smartphone 150 times a day for functions such as
messaging (23 times), phone calls (22), listening to music (13), and social media (9).9 Many of these
functions would be much better served if the technology were worn on, or even physically integrated into,
our bodies. This technology is known as a “wearable.”
Wearables have been around for a long time, with technologies such as
hearing aids and, later, Bluetooth earpieces. But now, we are seeing an
explosion of new wearable technologies. Perhaps the best known of
these is Google Glass, an augmented reality device that you wear over
your eyes like a pair of eyeglasses. Visible only to you, Google Glass
will project images into your field of vision based on your context and
voice commands. You can find out much more about Google Glass
at http://www.google.com/glass/start/.
Another class of wearables is those related to health care. The UP by Jawbone consists of a wristband and an app that tracks how you sleep, move, and eat, and then helps you use that information to feel your best.10 It records your sleep patterns, moods, eating habits, and other aspects of daily life and reports back to you via an app on
your smartphone or tablet. As our population ages and technology continues to evolve, there will be a large
increase in wearables like this.
Collaborative
As more of us use smartphones and wearables, it will be simpler than ever to share data with each other
for mutual benefit. Some of this sharing can be done passively, such as reporting our location in order to
update traffic statistics. Other data can be reported actively, such as adding our rating of a restaurant to a
review site.
The smartphone app Waze is a community-based tool that keeps track of the route you are traveling
and how fast you are making your way to your destination. In return for providing your data, you can
benefit from the data being sent from all of the other users of the app. Waze will route you around traffic
and accidents based upon real-time reports from other users.
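The underlying idea is straightforward data aggregation. The Python sketch below, with made-up numbers and no connection to Waze's real service, turns passively shared speed reports into a travel-time estimate.

    # Speed reports (in mph) passively shared by drivers currently on each road segment.
    reports = {
        "segment_1": [28, 31, 25],
        "segment_2": [8, 5, 7],   # heavy traffic reported here
    }
    segment_miles = {"segment_1": 2.0, "segment_2": 1.0}

    def estimated_minutes(route):
        """Estimate travel time by averaging the speeds other users are reporting."""
        total = 0.0
        for segment in route:
            avg_speed = sum(reports[segment]) / len(reports[segment])
            total += segment_miles[segment] / avg_speed * 60  # hours converted to minutes
        return round(total, 1)

    print(estimated_minutes(["segment_1", "segment_2"]))  # the slower segment dominates the estimate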
Yelp! allows consumers to post ratings and reviews of local businesses into a database, and then
it provides that data back to consumers via its website or mobile phone app. By compiling ratings of
restaurants, shopping centers, and services, and then allowing consumers to search through its directory,
Yelp! has become a huge source of business for many companies. Unlike data collected passively, however,
Yelp! relies on its users to take the time to provide honest ratings and reviews.
8. http://www.forbes.com/sites/louiscolumbus/2013/09/12/idc-87-of-connected-devices-by-2017-will-be-tablets-and-smartphones/
9. http://communities-dominate.blogs.com/brands/2013/03/the-annual-mobile-industry-numbers-and-stats-blog-yep-this-year-we-will-hit-the-
mobile-moment.html
10. https://jawbone.com/up
Printable
One of the most amazing innovations to be developed recently is the 3-D printer. A 3-D printer allows you
to print virtually any 3-D object based on a model of that object designed on a computer. 3-D printers work
by creating layer upon layer of the model using malleable materials, such as different types of glass, metals,
or even wax.
3-D printing is quite useful for prototyping the designs of products to determine their feasibility and
marketability. 3-D printing has also been used to create working prosthetic legs, handguns, and even an ear
that can hear beyond the range of normal hearing. The US Air Force now uses 3-D printed parts on the F-18
fighter jet.11
3-D printing is one of many technologies embraced by the “maker” movement. Chris Anderson, editor
of Wired magazine, puts it this way12:
In a nutshell, the term “Maker” refers to a new category of builders who are using open-source
methods and the latest technology to bring manufacturing out of its traditional factory context,
and into the realm of the personal desktop computer. Until recently, the ability to manufacture was
reserved for those who owned factories. What’s happened over the last five years is that we’ve
brought the Web’s democratizing power to manufacturing. Today, you can manufacture with the
push of a button.
Findable
The “Internet of Things” refers to the idea of physical objects being connected to the Internet. Advances
in wireless technologies and sensors will allow physical objects to send and receive data about themselves.
Many of the technologies to enable this are already available – it is just a matter of integrating them
together.
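The basic pattern is simple: a "thing" with a sensor periodically packages a reading as a small message and sends it to a collection service. The Python sketch below illustrates this; the endpoint URL and payload fields are invented, and real deployments typically use protocols such as MQTT or authenticated HTTP.

    import json, random, time
    from urllib import request

    ENDPOINT = "https://iot.example.com/readings"   # hypothetical collection endpoint

    def read_soil_moisture():
        """Stand-in for a real sensor driver; returns a percentage."""
        return round(random.uniform(20.0, 40.0), 1)

    def send_reading(sensor_id):
        payload = {
            "sensor_id": sensor_id,
            "timestamp": int(time.time()),
            "soil_moisture_pct": read_soil_moisture(),
        }
        req = request.Request(
            ENDPOINT,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with request.urlopen(req) as resp:   # would fail here: the endpoint is fictional
            return resp.status

    # On a real device this would run on a timer, e.g. once a minute:
    # send_reading("field-7-probe-2")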
In a 2010 report by McKinsey & Company on the Internet of Things13, six broad applications are
identified:
• Tracking behavior. When products are embedded with sensors, companies can track the
movements of these products and even monitor interactions with them. Business models can be
fine-tuned to take advantage of this behavioral data. Some insurance companies, for example, are
offering to install location sensors in customers’ cars. That allows these companies to base the
price of policies on how a car is driven as well as where it travels.
• Enhanced situational awareness. Data from large numbers of sensors deployed, for example, in
infrastructure (such as roads and buildings), or to report on environmental conditions (including
soil moisture, ocean currents, or weather), can give decision makers a heightened awareness of
real-time events, particularly when the sensors are used with advanced display or visualization
technologies. Security personnel, for instance, can use sensor networks that combine video, audio,
and vibration detectors to spot unauthorized individuals who enter restricted areas.
11. http://www.economist.com/news/technology-quarterly/21584447-digital-manufacturing-there-lot-hype-around-3d-printing-it-fast
12. Makers: The New Industrial Revolution by Chris Anderson. Crown Business; 2012.
13. http://www.mckinsey.com/insights/high_tech_telecoms_internet/the_internet_of_things
• Sensor-driven decision analysis. The Internet of Things also can support longer-range, more
complex human planning and decision making. The technology requirements – tremendous
storage and computing resources linked with advanced software systems that generate a variety of
graphical displays for analyzing data – rise accordingly.
• Process optimization. Some industries, such as chemical production, are installing legions of
sensors to bring much greater granularity to monitoring. These sensors feed data to computers,
which in turn analyze the data and then send signals to actuators that adjust processes – for
example, by modifying ingredient mixtures, temperatures, or pressures.
• Optimized resource consumption. Networked sensors and automated feedback mechanisms can
change usage patterns for scarce resources, such as energy and water. This can be accomplished
by dynamically changing the price of these goods to increase or reduce demand.
• Complex autonomous systems. The most demanding use of the Internet of Things involves the
rapid, real-time sensing of unpredictable conditions and instantaneous responses guided by
automated systems. This kind of machine decision-making mimics human reactions, though at
vastly enhanced performance levels. The automobile industry, for instance, is stepping up the
development of systems that can detect imminent collisions and take evasive action.
Autonomous
A final trend that is emerging is an extension of the Internet of Things: autonomous robots and vehicles. By combining software,
sensors, and location technologies, devices that can operate themselves to perform specific functions are being developed. These
take the form of creations such as medical nanotechnology robots (nanobots), self-driving cars, or unmanned aerial vehicles (UAVs).
A nanobot is a robot whose components are on the scale of about a nanometer, which is one-billionth
of a meter. While still an emerging field, it is showing promise for applications in the medical field. For
example, a set of nanobots could be introduced into the human body to combat cancer or a specific disease.
In March of 2012, Google introduced the world to their driverless car by releasing a video on YouTube
showing a blind man driving the car around the San Francisco area. The car combines several technologies,
including a laser radar system, worth about $150,000. While the car is not available commercially yet, three
US states (Nevada, Florida, and California) have already passed legislation making driverless cars legal.
A UAV, often referred to as a “drone,” is a small airplane or helicopter that can fly without a pilot.
Instead of a pilot, UAVs are either flown autonomously by computers in the vehicle or operated by a person
using a remote control. While most drones today are used for military or civil applications, there is a
growing market for personal drones. For around $300, a consumer can purchase a drone for personal use.
Summary
As the world of information technology moves forward, we will be constantly challenged by new
capabilities and innovations that will both amaze and disgust us. As we learned in chapter 12, many times
the new capabilities and powers that come with these new technologies will test us and require a new way
of thinking about the world. Businesses and individuals alike need to be aware of these coming changes
and prepare for them.
Study Questions
1. Which countries are the biggest users of the Internet? Social media? Mobile?
2. Which country had the largest Internet growth (in %) between 2008 and 2012?
3. How will most people connect to the Internet in the future?
4. What are two different applications of wearable technologies?
5. What are two different applications of collaborative technologies?
6. What capabilities do printable technologies have?
7. How will advances in wireless technologies and sensors make objects “findable”?
8. What is enhanced situational awareness?
9. What is a nanobot?
10. What is a UAV?
Exercises
1. If you were going to start a new technology business, which of the emerging trends do you
think would be the biggest opportunity? Do some original research to estimate the market size.
2. What privacy concerns could be raised by collaborative technologies such as Waze?
3. Do some research about the first handgun printed using a 3-D printer and report on some of the
concerns raised.
4. Write up an example of how the Internet of Things might provide a business with a competitive
advantage.
5. How do you think wearable technologies could improve overall healthcare?
6. What potential problems do you see with a rise in the number of driverless cars? Do some
independent research and write a two-page paper that describes where driverless cars are legal and
what problems may occur.
7. Seek out the latest presentation by Mary Meeker on “Internet Trends” (if you cannot find it, the
video from 2013 is available at http://allthingsd.com/20130529/mary-meekers-2013-internet-trends-deck-the-full-video/). Write a one-page paper describing what the top three trends are, in
your opinion.
Answers to Study Questions
Chapter 1
1. What are the five components that make up an information system?
a. hardware, software, data, people, process
2. What are three examples of information system hardware?
a. There are a number of possible answers: a PC, a printer, a mouse, tablets, mobile
phones, etc.
3. Microsoft Windows is an example of which component of information systems?
a. It is an operating system, which is a part of the software component.
4. What is application software?
a. Software that does something useful.
5. What roles do people play in information systems?
a. The text includes examples such as helpdesk support, systems analyst, programmer, and
CIO.
6. What is the definition of a process?
a. A process is a series of steps undertaken to achieve a desired outcome or goal.
7. What was invented first, the personal computer or the Internet (ARPANET)?
a. The Internet was activated in 1969; the personal computer was introduced in 1975.
8. In what year were restrictions on commercial use of the Internet first lifted? When were eBay
and Amazon founded?
a. Restrictions were lifted in 1991, Amazon was founded in 1994, and eBay was founded in
1995.
9. What does it mean to say we are in a “post-PC world”?
a. The personal computer will no longer be the primary way that people interact and do
business.
10. What is Carr’s main argument about information technology?
a. That information technology is just a commodity and cannot be used to gain a
competitive advantage.
Chapter 2
1. Write your own description of what the term information systems hardware means.
a. Answers will vary, but should say something about information systems hardware
consisting of the physical parts of computing devices that can actually be touched.
2. What is the impact of Moore’s Law on the various hardware components described in this
chapter?
a. The student should pick one of the components and discuss the impact of the fact that computing power roughly doubles every two years. Most devices are becoming smaller, faster, and cheaper, and this should be indicated in the answer.
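As an illustration (not from the text), a quick calculation shows how fast repeated doubling compounds under the simplified "doubles every two years" rule of thumb used in the answer above; the starting figure is arbitrary.

    # Rough illustration of "doubling every two years" over a decade.
    initial_transistors = 1_000_000   # hypothetical chip in year 0
    years = 10
    doublings = years // 2            # one doubling per two-year period

    final_transistors = initial_transistors * 2 ** doublings
    print(f"After {years} years: {final_transistors:,} transistors "
          f"(a {2 ** doublings}x increase)")
    # After 10 years: 32,000,000 transistors (a 32x increase)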
3. Write a summary of one of the items linked to in the “Integrated Computing” section.
a. The student should write a summary of one of the linked articles.
4. Explain why the personal computer is now considered a commodity.
a. The PC has become a commodity in the sense that there is very little differentiation
between computers, and the primary factor that controls their sale is their price.
5. The CPU can also be thought of as the _____________ of the computer.
a. brain
6. List the following in increasing order (slowest to fastest): megahertz, kilohertz, gigahertz.
a. kilohertz, megahertz, gigahertz
7. What is the bus of a computer?
a. The bus is the electrical connection between different computer components.
8. Name two differences between RAM and a hard disk.
a. RAM is volatile; the hard disk is non-volatile. Data access in RAM is faster than on the
hard disk.
9. What are the advantages of solid-state drives over hard disks?
a. The main advantage is speed: an SSD has much faster data-access speeds than a
traditional hard disk.
10. How heavy was the first commercially successful portable computer?
a. The Compaq Portable weighed 28 pounds.
Chapter 3
1. Come up with your own definition of software. Explain the key terms in your definition.
a. A variety of answers are possible, but should be similar to the definition in the text:
Software is the set of instructions that tell the hardware what to do. Software is created through
the process of programming.
2. What are the functions of the operating system?
a. The operating system manages the hardware resources of the computer, provides
the user-interface components, and provides a platform for software developers to write
applications.
3. Which of the following are operating systems and which are applications: Microsoft Excel,
Google Chrome, iTunes, Windows, Android, Angry Birds.
a. Microsoft Excel (application), Google Chrome (application), iTunes (application), Windows (operating system), Android (operating system), Angry Birds (application)
4. What is your favorite software application? What tasks does it help you accomplish?
a. Students will have various answers to this question. They should pick an application, not
an operating system. They should be able to list at least one thing that it helps them accomplish.
5. What is a “killer” app? What was the killer app for the PC?
a. A killer app is application software that is so useful that people will purchase the
hardware just so they can run it. The killer app for the PC was the spreadsheet (VisiCalc).
6. How would you categorize the software that runs on mobile devices? Break down these apps
into at least three basic categories and give an example of each.
a. There are various ways to answer this question. Students should identify that there are
mobile operating systems and mobile apps. Most likely, students will break down mobile apps
into multiple categories: games, GPS, reading, communication, etc.
7. Explain what an ERP system does.
a. An ERP (enterprise resource planning) system is a software application with a
centralized database that is implemented across the entire organization.
8. What is open-source software? How does it differ from closed-source software? Give an
example of each.
a. Open-source software is software that makes the source code available for anyone to
copy and use. It is free to download, copy, and distribute. Closed-source software does not make
the source code available and generally is not free to download, copy, and distribute. There
are many examples of both, such as: Firefox (open source), Linux (open source), iTunes (closed
source), Microsoft Office (closed source).
9. What does a software license grant?
a. Software licenses are not all the same, but generally they grant the user the right to use
the software on a limited basis. The terms of the license dictate users’ rights in detail.
10. How did the Y2K (year 2000) problem affect the sales of ERP systems?
a. Organizations purchased ERP software to replace their older systems in order to avoid
any problems with the year 2000 in their software.
Chapter 4
1. What is the difference between data, information, and knowledge?
a. Data are the raw bits and pieces of facts and statistics with no context. Data can
be quantitative or qualitative. Information is data that has been given context. Knowledge is
information that has been aggregated and analyzed and can be used for making decisions.
2. Explain in your own words how the data component relates to the hardware and software
components of information systems.
a. There are numerous answers to this question, but all should be variations on the
following: Data is processed by the hardware via software. A database is software that runs on
the hardware. Hardware stores the data, software processes the data.
3. What is the difference between quantitative data and qualitative data? In what situations could
the number 42 be considered qualitative data?
a. Quantitative data is numeric, the result of a measurement, count, or some other
mathematical calculation. Qualitative data is descriptive. The number 42 could be qualitative if
it is a designation instead of a measurement, count, or calculation. For example: that player’s
jersey has number 42 on it.
4. What are the characteristics of a relational database?
a. A relational database is one in which data is organized into one or more tables. Each
table has a set of fields, which define the nature of the data stored in the table. A record is one
instance of a set of fields in a table. All the tables are related by one or more fields in common.
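A minimal sketch of these ideas using Python's built-in sqlite3 module (the table and field names here are invented for illustration): each table has typed fields, each row is a record, and the two tables are related by a field they share.

    import sqlite3

    conn = sqlite3.connect(":memory:")   # throwaway in-memory database
    cur = conn.cursor()

    # Each table has fields with defined data types; each row is one record.
    cur.execute("CREATE TABLE artists (artist_id INTEGER PRIMARY KEY, name TEXT)")
    cur.execute("""CREATE TABLE albums (
                       album_id  INTEGER PRIMARY KEY,
                       title     TEXT,
                       artist_id INTEGER REFERENCES artists(artist_id))""")

    cur.execute("INSERT INTO artists VALUES (1, 'Miles Davis')")
    cur.execute("INSERT INTO albums  VALUES (10, 'Kind of Blue', 1)")

    # The tables are related by the artist_id field they have in common.
    cur.execute("""SELECT artists.name, albums.title
                   FROM albums JOIN artists USING (artist_id)""")
    print(cur.fetchall())                # [('Miles Davis', 'Kind of Blue')]
    conn.close()

The same sketch also bears on questions 6 and 8 below: the declared field types tell the database what operations and how much storage make sense for each field, and SQL is the standardized query language a spreadsheet lacks.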
5. When would using a personal DBMS make sense?
a. When working on a smaller database for personal use, or when disconnected from the
network.
6. What is the difference between a spreadsheet and a database? List three differences between
them.
a. A database is generally more powerful and complex than a spreadsheet, with the ability to handle multiple types of data and link them together. Some differences: a database has defined field types while a spreadsheet does not; a database uses a standardized query language (such as SQL) while a spreadsheet does not; and a database can hold much larger amounts of data than a spreadsheet.
7. Describe what the term normalization means.
a. To normalize a database means to design it in a way that: 1) reduces duplication of data
between tables and 2) gives the table as much flexibility as possible.
8. Why is it important to define the data type of a field when designing a relational database?
a. A data type tells the database what functions can be performed with the data. The
second important reason to define the data type is so that the proper amount of storage space is
allocated for the data.
9. Name a database you interact with frequently. What would some of the field names be?
a. The student can choose any sort of system that they interact with, such as Amazon or
their school’s online systems. The fields would be the names of data being collected, such as
“first name”, or “address”.
10. What is metadata?
a. Metadata is data about data. It refers to the data used to describe other data, such as
the length of a song in iTunes, which describes the music file.
11. Name three advantages of using a data warehouse.
a. The text lists the following (the student should pick at least three of these):
i. The process of developing a data warehouse forces an organization to better
understand the data that it is currently collecting and, equally important, what data is
not being collected.
ii. A data warehouse provides a centralized view of all data being collected across
the enterprise and provides a means of determining data that is inconsistent.
iii. Once all data is identified as consistent, an organization can generate one
version of the truth. This is important when the company wants to report consistent
statistics about itself, such as revenue or number of employees.
iv. By having a data warehouse, snapshots of data can be taken over time. This
creates a historical record of data, which allows for an analysis of trends.
v. A data warehouse provides tools to combine data, which can provide new
information and analysis.
12. What is data mining?
a. Data mining is the process of analyzing data to find previously unknown trends, patterns,
and associations in order to make decisions.
Chapter 5
1. What were the first four locations hooked up to the Internet (ARPANET)?
a. UCLA, Stanford, MIT, and the University of Utah
2. What does the term packet mean?
a. The fundamental unit of data transmitted over the Internet. Each packet has the sender’s
address, the destination address, a sequence number, and a piece of the overall message to be
sent.
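A simplified model of a packet (illustrative only, not a real protocol implementation) showing the four pieces of information listed above and how a message could be split up and reassembled:

    from dataclasses import dataclass

    @dataclass
    class Packet:
        sender: str        # sender's address
        destination: str   # destination address
        sequence: int      # sequence number, so the message can be reassembled in order
        payload: str       # one piece of the overall message

    def packetize(message: str, sender: str, destination: str, size: int = 10):
        """Split a message into fixed-size packets."""
        return [Packet(sender, destination, i, message[start:start + size])
                for i, start in enumerate(range(0, len(message), size))]

    packets = packetize("Hello from the ARPANET era!", "10.0.0.1", "10.0.0.2")
    reassembled = "".join(p.payload for p in sorted(packets, key=lambda p: p.sequence))
    print(reassembled)     # Hello from the ARPANET era!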
3. Which came first, the Internet or the World Wide Web?
a. the Internet
4. What was revolutionary about Web 2.0?
a. Anyone could post content to the web, without the need for understanding HTML or web-
server technology.
5. What was the so-called killer app for the Internet?
a. electronic mail (e-mail)
6. What makes a connection a broadband connection?
a. A broadband connection is defined as one that has speeds of at least 256,000 bps.
7. What does the term VoIP mean?
a. Voice over Internet protocol – a way to have voice conversations over the Internet.
8. What is a LAN?
a. A LAN is a local area network, usually operating in the same building or on the same
campus.
9. What is the difference between an intranet and an extranet?
a. An intranet consists of the set of web pages and resources available on a company’s
internal network. These items are not available to those outside of the company. An extranet is a
part of the company’s network that is made available securely to those outside of the company.
Extranets can be used to allow customers to log in and check the status of their orders, or for
suppliers to check their customers’ inventory levels.
10. What is Metcalfe’s Law?
a. Metcalfe’s Law states that the value of a telecommunications network is proportional to
the square of the number of connected users of the system.
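A small illustration (user counts chosen arbitrarily) of why value grows roughly with the square of the user count: the number of possible pairwise connections is n(n-1)/2, which is proportional to n² for large n.

    def possible_connections(n: int) -> int:
        """Number of distinct pairs of users who could connect."""
        return n * (n - 1) // 2

    for users in (10, 100, 1000):
        print(users, "users ->", possible_connections(users), "possible connections")
    # 10 users -> 45 possible connections
    # 100 users -> 4950 possible connections
    # 1000 users -> 499500 possible connections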
Chapter 6
1. Briefly define each of the three members of the information security triad.
a. The three members are as follows:
i. Confidentiality: we want to be able to restrict access to those who are allowed to
see given information.
ii. Integrity: the assurance that the information being accessed has not been altered
and truly represents what is intended.
iii. Availability: information can be accessed and modified by anyone authorized to
do so in an appropriate timeframe.
2. What does the term authentication mean?
a. The process of ensuring that a person is who he or she claims to be.
3. What is multi-factor authentication?
a. The use of more than one method of authentication. The methods are: something you
know, something you have, and something you are.
4. What is role-based access control?
a. With role-based access control (RBAC), instead of giving specific users access rights
to an information resource, users are assigned to roles and then those roles are assigned the
access.
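A toy sketch of the idea (the role and permission names are invented): rights are attached to roles, and a user's access is checked through the roles the user holds rather than through individual grants.

    # Permissions are granted to roles, not directly to users.
    ROLE_PERMISSIONS = {
        "hr_manager": {"read_payroll", "edit_payroll"},
        "employee":   {"read_own_record"},
    }

    USER_ROLES = {
        "alice": {"hr_manager", "employee"},
        "bob":   {"employee"},
    }

    def has_access(user: str, permission: str) -> bool:
        """A user has a permission if any of their roles grants it."""
        return any(permission in ROLE_PERMISSIONS.get(role, set())
                   for role in USER_ROLES.get(user, set()))

    print(has_access("alice", "edit_payroll"))   # True
    print(has_access("bob", "edit_payroll"))     # False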
5. What is the purpose of encryption?
a. To keep transmitted data secret so that only those with the proper key can read it.
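A brief sketch of symmetric encryption using the third-party cryptography package (an assumption of this example, not something the text prescribes; it must be installed separately, e.g. with pip install cryptography): only a holder of the key can recover the plaintext.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()     # the shared secret; whoever holds it can decrypt
    cipher = Fernet(key)

    token = cipher.encrypt(b"Quarterly numbers: confidential")
    print(token)                    # unreadable ciphertext
    print(cipher.decrypt(token))    # b'Quarterly numbers: confidential'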
6. What are two good examples of a complex password?
a. There are many examples of this. Students need to provide examples of passwords that
are a minimum of eight characters, with at least one upper-case letter, one special character,
and one number.
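A short sketch that checks the criteria listed in the answer above (at least eight characters, one upper-case letter, one special character, and one number); the rules here are examples for teaching, not a security standard.

    import string

    def is_complex(password: str) -> bool:
        """True if the password meets the complexity rules listed above."""
        return (len(password) >= 8
                and any(c.isupper() for c in password)
                and any(c.isdigit() for c in password)
                and any(c in string.punctuation for c in password))

    print(is_complex("correcthorse"))    # False - no upper case, digit, or symbol
    print(is_complex("C0rrect!horse"))   # True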
7. What is pretexting?
a. Pretexting occurs when an attacker calls a helpdesk or security administrator and
pretends to be a particular authorized user having trouble logging in. Then, by providing some
personal information about the authorized user, the attacker convinces the security person to
reset the password and tell him what it is.
8. What are the components of a good backup plan?
a. Knowing what needs to be backed up, regular backups of all data, offsite storage of all
backed-up data, and a test of the restoration process.
9. What is a firewall?
a. A firewall can be either a hardware firewall or a software firewall. A hardware firewall
is a device that is connected to the network and filters the packets based on a set of rules.
A software firewall runs on the operating system and intercepts packets as they arrive at a computer.
10. What does the term physical security mean?
a. Physical security is the protection of the actual hardware and networking components
that store and transmit information resources.
Chapter 7
1. What is the productivity paradox?
a. The productivity paradox refers to Erik Brynjolfsson’s finding, from research he conducted in the early 1990s, that the addition of information technology to business had not improved productivity at all.
2. Summarize Carr’s argument in “Does IT Matter.”
a. Information technology is now a commodity and cannot be used to provide an
organization with competitive advantage.
3. How is the 2008 study by Brynjolfsson and McAfee different from previous studies? How is it
the same?
a. It is different because it shows that IT can bring a competitive advantage, given the right
conditions. It is the same in the sense that it shows that IT, by itself, does not bring competitive
advantage.
4. What does it mean for a business to have a competitive advantage?
a. A company is said to have a competitive advantage over its rivals when it is able to sustain profits that exceed the average for its industry.
5. What are the primary activities and support activities of the value chain?
a. The primary activities are those that directly impact the creation of a product or service.
The support activities are those that support the primary activities. Primary: inbound logistics,
operations, outbound logistics, sales/marketing, and service. Support: firm infrastructure,
human resources, technology development, and procurement.
6. What has been the overall impact of the Internet on industry profitability? Who has been the
true winner?
a. The overall impact has been a reduction in average industry profitability. The consumer
has been the true winner.
7. How does EDI work?
a. EDI is the computer-to-computer exchange of business documents in a standard
electronic format between business partners.
8. Give an example of a semi-structured decision and explain what inputs would be necessary to
provide assistance in making the decision.
a. A semi-structured decision is one in which most of the factors needed for making the
decision are known but human experience and other outside factors may still play a role. The
student should provide an example of a decision that uses an information system to provide
information but is not made by the system. Examples would include: budgeting decisions,
diagnosing a medical condition, and investment decisions.
9. What does a collaborative information system do?
a. A collaborative system is software that allows multiple users to interact on a document
or topic in order to complete a task or make a decision.
10. How can IT play a role in competitive advantage, according to the 2008 article by Brynjolfsson
and McAfee?
a. The article suggests that IT can influence competitive advantage when good
management develops and delivers IT-supported process innovation.
Chapter 8
1. What does the term business process mean?
a. A process is a series of tasks that are completed in order to accomplish a goal. A
business process, therefore, is a process that is focused on achieving a goal for a business.
2. What are three examples of business process from a job you have had or an organization you
have observed?
a. Students can answer this in almost any way. The examples should consist of more than
a single step.
3. What is the value in documenting a business process?
a. There are many answers to this. From the text: it allows for better control of the process,
and for standardization.
4. What is an ERP system? How does an ERP system enforce best practices for an organization?
a. An ERP (enterprise resource planning) system is a software application with a
centralized database that is implemented across the entire organization. It enforces best
practices through the business processes embedded in the software.
5. What is one of the criticisms of ERP systems?
a. ERP systems can lead to the commoditization of business processes, meaning that every
company that uses an ERP system will perform business processes the same way.
6. What is business process reengineering? How is it different from incrementally improving a
process?
a. Business process reengineering (BPR) occurs when a business process is redesigned
from the ground up. It is different from incrementally improving a process in that it does not
simply take the existing process and modify it.
7. Why did BPR get a bad name?
a. BPR became an excuse to lay off employees and try to complete the same amount of
work using fewer employees.
8. List the guidelines for redesigning a business process.
a. The guidelines are as follows:
i. Organize around outcomes, not tasks.
ii. Have those who use the outcomes of the process perform the process.
iii. Subsume information-processing work into the real work that produces the information.
iv. Treat geographically dispersed resources as though they were centralized.
v. Link parallel activities instead of integrating their results.
vi. Put the decision points where the work is performed, and build controls into the process.
vii. Capture information once, at the source.
9. What is business process management? What role does it play in allowing a company to
differentiate itself?
a. Business process management (BPM) can be thought of as an intentional effort to plan,
document, implement, and distribute an organization’s business processes with the support of
information technology. It can play a role in differentiation through built-in reporting, and by
empowering employees, enforcing best practices, and enforcing consistency.
10. What does ISO certification signify?
a. ISO certification shows that you know what you do, do what you say, and have
documented your processes.
Chapter 9
1. Describe the role of a systems analyst.
a. To understand business requirements and translate them into the requirements of an
information system.
2. What are some of the different roles for a computer engineer?
a. hardware engineer, software engineer, network engineer, systems engineer
3. What are the duties of a computer operator?
a. Duties include keeping the operating systems up to date, ensuring available memory and
disk storage, and overseeing the physical environment of the computer.
4. What does the CIO do?
a. The CIO aligns the plans and operations of the information systems with the strategic
goals of the organization. This includes tasks such as budgeting, strategic planning, and
personnel decisions relevant to the information-systems function.
5. Describe the job of a project manager.
a. A project manager is responsible for keeping projects on time and on budget. This
person works with the stakeholders of the project to keep the team organized and communicates
the status of the project to management.
6. Explain the point of having two different career paths in information systems.
a. To allow for career growth for those who do not want to manage other employees but
instead want to focus on technical skills.
7. What are the advantages and disadvantages of centralizing the IT function?
a. There are several possible answers here. Advantages of centralizing include more
control over the company’s systems and data. Disadvantages include a more limited availability
of IT resources.
8. What impact has information technology had on the way companies are organized?
a. The organizational structure has been flattened, with fewer layers of management.
9. What are the five types of information-systems users?
a. innovators, early adopters, early majority, late majority, laggards
10. Why would an organization outsource?
a. Because it needs a specific skill for a limited amount of time, and/or because it can cut
costs by outsourcing.
Chapter 10
1. What are the steps in the SDLC methodology?
a. The steps are Preliminary Analysis, System Analysis, System Design, Programming,
Testing, Implementation, and Maintenance.
2. What is RAD software development?
a. Rapid application development (RAD) is a software-development (or systems-
development) methodology that focuses on quickly building a working model of the software,
getting feedback from users, and then using that feedback to update the working model.
3. What makes the lean methodology unique?
a. The biggest difference between the lean methodology and the other methodologies is that
the full set of requirements for the system is not known when the project is launched.
4. What are three differences between second-generation and third-generation languages?
a. Three key differences are as follows:
i. The words used in the language: third-generation languages use more English-like words than second-generation languages do.
ii. Hardware specificity: third-generation languages are not specific to hardware, while second-generation languages are.
iii. Learning curve: third-generation languages are easier to learn and use.
5. Why would an organization consider building its own software application if it is cheaper to buy
one?
a. They may wish to build their own in order to have something that is unique (different
from their competitors), and/or something that more closely matches their business processes.
They also may choose to do this if they have more time and/or more money available to do it.
6. What is responsive design?
a. Responsive design is a method of developing websites that allows them to be viewed
on many different types of devices without losing capability or effectiveness. With a responsive
website, images resize themselves based on the size of the device’s screen, and text flows and
sizes itself properly for optimal viewing.
7. What is the relationship between HTML and CSS in website design?
a. While HTML is used to define the components of a web page, cascading style sheets
(CSS) are used to define the styles of the components on a page.
8. What is the difference between the pilot implementation methodology and the parallel
implementation methodology?
a. The pilot methodology implements new software for just one group of people while
the rest of the users use the previous version of the software. The parallel implementation
methodology uses both the old and the new applications at the same time.
9. What is change management?
a. The oversight of the changes brought about in an organization.
10. What are the four different implementation methodologies?
a. direct cutover, pilot, parallel, phased
Chapter 11
1. What does the term globalization mean?
a. Globalization refers to the integration of goods, services, and cultures among the nations
of the world.
2. How does Friedman define the three eras of globalization?
a. The three eras are as follows:
i. “Globalization 1.0” occurred from 1492 until about 1800. In this era,
globalization was centered around countries. It was about how much horsepower, wind
power, and steam power a country had and how creatively it was deployed. The world
shrank from size “large” to size “medium.”
ii. “Globalization 2.0” occurred from about 1800 until 2000, interrupted only by the two World Wars. In this era, the dynamic force driving change was the multinational company. The world shrank from size “medium” to size “small.”
iii. “Globalization 3.0” is our current era, beginning in the year 2000. The
convergence of the personal computer, fiber-optic Internet connections, and software
has created a “flat-world platform” that allows small groups and even individuals to go
global. The world has shrunk from size “small” to size “tiny.”
3. Which technologies have had the biggest effect on globalization?
a. There are several answers to this. Probably the most obvious are the Internet, the
graphical interface of Windows and the World Wide Web, and workflow software.
4. What are some of the advantages brought about by globalization?
a. Advantages include the ability to locate expertise and labor around the world, the ability
to operate 24 hours a day, and a larger market for products.
5. What are the challenges of globalization?
a. Challenges include infrastructure differences, labor laws and regulations, legal
restrictions, and different languages, customs, and preferences.
6. What does the term digital divide mean?
a. The separation between those who have access to the global network and those who do
not. The digital divide can occur between countries, regions, or even neighborhoods.
7. What are Jakob Nielsen’s three stages of the digital divide?
a. economic, usability, and empowerment
8. What was one of the key points of The Rise of the Network Society?
a. There are two key points to choose from. One is that economic activity was, when
the book was published in 1996, being organized around the networks that the new
telecommunication technologies had provided. The other is that this new, global economic
activity was different from the past, because “it is an economy with the capacity to work as a
unit in real time on a planetary scale.”
9. Which country has the highest average Internet speed? How does your country compare?
a. According to the chart in the chapter, South Korea has the highest Internet speeds.
Students will need to look up their own to compare.
10. What is the OLPC project? Has it been successful?
a. One Laptop Per Child. By most measures, it has not been a successful program.
Chapter 12
1. What does the term information systems ethics mean?
a. There are various ways of answering this question, but the answer should include
something about the application of ethics to the new capabilities and cultural norms brought
about by information technology.
2. What is a code of ethics? What is one advantage and one disadvantage of a code of ethics?
a. A code of ethics is a document that outlines a set of acceptable behaviors for a
professional or social group. Answers may differ for the second part, but from the text: one
advantage of a code of ethics is that it clarifies the acceptable standards of behavior for a
professional group. One disadvantage is that it does not necessarily have legal authority.
3. What does the term intellectual property mean? Give an example.
a. Intellectual property is defined as “property (as an idea, invention, or process) that
derives from the work of the mind or intellect.”
4. What protections are provided by a copyright? How do you obtain one?
a. Copyright protections address the following: who can make copies of the work, who can
make derivative works from the original work, who can perform the work publicly, who can
display the work publicly, and who can distribute the work. You obtain a copyright as soon as
the work is put into tangible form.
5. What is fair use?
a. Fair use is a limitation on copyright law that allows for the use of protected works
without prior authorization in specific cases.
6. What protections are provided by a patent? How do you obtain one?
a. Once a patent is granted, it provides the inventor with protection from others infringing
on the patent. In the US, a patent holder has the right to “exclude others from making, using,
offering for sale, or selling the invention throughout the United States or importing the invention
into the United States for a limited time in exchange for public disclosure of the invention when
the patent is granted.” You obtain a patent by filing an application with the patent office. A
patent will be granted if the work is deemed to be original, useful, and non-obvious.
7. What does a trademark protect? How do you obtain one?
a. A trademark protects a word, phrase, logo, shape, or sound that identifies a source of goods or services. You can obtain one by registering it with the Patent and Trademark Office (US); a common-law trademark can also arise simply from using the mark in commerce, without registration.
8. What does the term personally identifiable information mean?
a. Information about a person that can be used to uniquely establish that person’s identity
is called personally identifiable information, or PII.
9. What protections are provided by HIPAA, COPPA, and FERPA?
a. The answers are as follows:
i. HIPAA: protects records related to health care as a special class of personally
identifiable information.
ii. COPPA: protects information collected from children under the age of thirteen.
iii. FERPA: protects student educational records.
10. How would you explain the concept of NORA?
a. There are various ways to answer this. The basic answer is that NORA (non-obvious
relationship awareness) is the process of collecting large quantities of a variety of information
and then combining it to create profiles of individuals.
Chapter 13
1. Which countries are the biggest users of the Internet? Social media? Mobile?
a. Students will need to look outside the text for this, as it changes all the time. There are also different ways of measuring: number of users, % of population, most active users, etc. Some good sites to use are Internet World Stats (http://www.internetworldstats.com/stats.htm), Kissmetrics (http://blog.kissmetrics.com/facebook-statistics/), and the World Bank (http://data.worldbank.org/indicator/IT.CEL.SETS.P2).
2. Which country had the largest Internet growth (in %) between 2008 and 2012?
a. Iran, at 205%
3. How will most people connect to the Internet in the future?
a. via mobile devices
4. What are two different applications of wearable technologies?
a. There are many answers to this question; two examples are Google Glass and Jawbone
UP.
5. What are two different applications of collaborative technologies?
a. There are many answers to this; two examples are software that routes us to our
destination in the shortest amount of time, and websites that review different companies.
6. What capabilities do printable technologies have?
a. Using 3-D printers, designers can quickly test prototypes or build something as a proof
of concept. Printable technologies also make it possible to bring manufacturing to the desktop
computer.
7. How will advances in wireless technologies and sensors make objects “findable”?
a. Advances in wireless technologies and sensors will allow physical objects to send and
receive data about themselves.
8. What is enhanced situational awareness?
a. Data from large numbers of sensors can give decision makers a heightened awareness of
real-time events, particularly when the sensors are used with advanced display or visualization
technologies.
9. What is a nanobot?
a. A nanobot is a robot whose components are on the scale of about a nanometer.
10. What is a UAV?
a. An unmanned aerial vehicle – a small airplane or helicopter that can fly without a pilot.
UAVs are run by computer or remote control.
Bibliography
Anderson, Chris. Makers: The New Industrial Revolution. New York: Crown Business, 2012.
Brynjolfsson, Erik. “The Productivity Paradox of Information Technology: Review and
Assessment.” Communications of the ACM, December, 1993. http://ccs.mit.edu/papers/CCSWP130/
ccswp130.html.
Brynjolfsson, Erik and Lorin Hitt. “Beyond the Productivity Paradox: Computers are the Catalyst
for Bigger Changes.” Communications of the ACM, August 1998, vol. 41(8): pp.
49–55. http://ebusiness.mit.edu/erik/bpp
Castells, Manuel. The Rise of the Network Society. 2nd ed. Cambridge, MA: Blackwell Publishers, 2000.
Valacich, Joseph, and Christoph Schneider. Information Systems Today: Managing in the Digital World. 4th ed. Upper Saddle River, NJ: Prentice-Hall, 2010.
Chui, Michael, Markus Löffler, and Roger Roberts. “The Internet of Things.” McKinsey Quarterly,
March 2010. http://www.mckinsey.com/insights/high_tech_telecoms_internet/the_internet_of_things
Columbus, Louis. “IDC: 87% of Connected Devices Sales by 2017 Will Be Tablets and Smartphones.”
Tech section of forbes.com, September 12, 2013. http://www.forbes.com/sites/louiscolumbus/2013/09/12/
idc-87-of-connected-devices-by-2017-will-be-tablets-and-smartphones/
Friedman, T. L. The World Is Flat: A Brief History of the Twenty-First Century. New York: Farrar,
Straus and Giroux, 2005.
Gallagher, Sean. “Born to Be Breached: The Worst Passwords Are Still the Most
Common.” Arstechnica, November 3, 2012. Retrieved from http://arstechnica.com/information-
technology/2012/11/born-to-be-breached-the-worst-passwords-are-still-the-most-common/ on May 15,
2013.
Godin, Seth. Really Bad PowerPoint (and How to Avoid It). Do You Zoom, Inc.,
2001. http://www.sethgodin.com/freeprize/reallybad-1 .
Guel, Michele D. “A Short Primer for Developing Security Policies.” SANS Institute, 2007. Accessed
from http://www.sans.org/security-resources/policies/Policy_Primer on May 31, 2013.
Hammer, Michael. “Reengineering Work: Don’t Automate, Obliterate.” Harvard
Business Review, 68.4 (1990): 104–112.
Kibum, Kim. “Challenges in HCI: Digital Divide.” Crossroads, vol. 12, issue 2 (December 2005),
2–2, doi: 10.1145/1144375.1144377. http://doi.acm.org/10.1145/1144375.1144377.
Kim, P., E. Buckner, T. Makany, and H. Kim. “A Comparative Analysis of a Game-Based Mobile
Learning Model in Low-Socioeconomic Communities of India.” International Journal of Educational
Development, vol. 32, issue 2 (March 2012), pp. 205–366, doi:10.1016/j.ijedudev.2011.05.008.
Kraemer, Kenneth L., Jason Dedrick, and Prakul Sharma. “One Laptop Per Child: Vision vs.
Reality.” Communications of the ACM, vol. 52, no. 6, pp. 66–73.
Laudon, Kenneth C., and Jane P. Laudon. Management Information Systems: Managing the Digital
Firm. 12th ed. Upper Saddle River, NJ: Prentice-Hall, 2012.
McAfee, Andrew and Erik Brynjolfsson. “Investing in the IT That Makes a Competitive
Difference.” Harvard Business Review, July-August, 2008.
McCallister, Erika, Tim Grance, and Karen Scarfone. Guide to Protecting the Confidentiality of
Personally Identifiable Information (PII). National Institute of Standards and Technology, US Department
of Commerce Special Publication 800-122, April 2010. http://csrc.nist.gov/publications/nistpubs/800-122/
sp800-122
Moore, Gordon E. “Cramming More Components onto Integrated Circuits.” Electronics, pp. 114–117,
April 19, 1965.
Porter, Michael. “Strategy and the Internet.” Harvard Business Review, vol. 79, no. 3, March
2001. http://hbswk.hbs.edu/item/2165.html
Rogers, E. M. Diffusion of Innovations. New York: Free Press, 1962.
Whitney, Lance. “Smartphone Shipments to Surpass Feature Phones This Year.” CNet, June 4, 2013.
http://news.cnet.com/8301-1035_3-57587583-94/smartphone-shipments-to-surpass-feature-phones-this-
year/
Wiseman, C., and I. C. MacMillan. “Creating Competitive Weapons from Information
Systems.” Journal Of Business Strategy, 5(2) (1984), p. 42.
Information Technology and Organizational Learning: Managing Behavioral Change in the Digital Age, Third Edition
Arthur M. Langer
CRC Press
Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487-2742
© 2018 by Taylor & Francis Group, LLC
CRC Press is an imprint of Taylor & Francis Group, an Informa business
No claim to original U.S. Government works
Printed on acid-free paper
International Standard Book Number-13: 978-1-4987-7575-5 (Paperback)
International Standard Book Number-13: 978-1-138-23858-9 (Hardback)
This book contains information obtained from authentic and highly regarded sources. Reasonable
efforts have been made to publish reliable data and information, but the author and publisher cannot
assume responsibility for the validity of all materials or the consequences of their use. The authors and
publishers have attempted to trace the copyright holders of all material reproduced in this publication
and apologize to copyright holders if permission to publish in this form has not been obtained. If any
copyright material has not been acknowledged please write and let us know so we may rectify in any
future reprint.
Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced,
transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or
hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers.
For permission to photocopy or use material electronically from this work, please access www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222
Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that
provides licenses and registration for a variety of users. For organizations that have been granted a
photocopy license by the CCC, a separate system of payment has been arranged.
Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are
used only for identification and explanation without intent to infringe.
Visit the Taylor & Francis Web site at
http://www.taylorandfrancis.com
and the CRC Press Web site at
http://www.crcpress.com
Contents
Foreword xi
Acknowledgments xiii
Author xv
Introduction xvii
Chapter 1 The “Ravell” Corporation 1
Introduction 1
A New Approach 3
The Blueprint for Integration 5
Enlisting Support 6
Assessing Progress 7
Resistance in the Ranks 8
Line Management to the Rescue 8
IT Begins to Reflect 9
Defining an Identity for Information Technology 10
Implementing the Integration: A Move toward Trust and
Reflection 12
Key Lessons 14
Defining Reflection and Learning for an Organization 14
Working toward a Clear Goal 15
Commitment to Quality 15
Teaching Staff “Not to Know” 16
Transformation of Culture 16
Alignment with Administrative Departments 17
Conclusion 19
Chapter 2 The IT Dilemma 21
Introduction 21
Recent Background 23
IT in the Organizational Context 24
IT and Organizational Structure 24
The Role of IT in Business Strategy 25
Ways of Evaluating IT 27
Executive Knowledge and Management of IT 28
IT: A View from the Top 29
Section 1: Chief Executive Perception of the Role of IT 32
Section 2: Management and Strategic Issues 34
Section 3: Measuring IT Performance and Activities 35
General Results 36
Defining the IT Dilemma 36
Recent Developments in Operational Excellence 38
Chapter 3 Technology as a Variable and Responsive Organizational Dynamism 41
Introduction 41
Technological Dynamism 41
Responsive Organizational Dynamism 42
Strategic Integration 43
Summary 48
Cultural Assimilation 48
IT Organization Communications with “Others” 49
Movement of Traditional IT Staff 49
Summary 51
Technology Business Cycle 52
Feasibility 53
Measurement 53
Planning 54
Implementation 55
Evolution 57
Drivers and Supporters 58
Santander versus Citibank 60
Information Technology Roles and Responsibilities 60
Replacement or Outsource 61
Chapter 4 Organizational Learning Theories and Technology 63
Introduction 63
Learning Organizations 72
Communities of Practice 75
Learning Preferences and Experiential Learning 83
Social Discourse and the Use of Language 89
Identity 91
Skills 92
Emotion 92
Linear Development in Learning Approaches 96
Chapter 5 Managing Organizational Learning and Technology 109
The Role of Line Management 109
Line Managers 111
First-Line Managers 111
Supervisor 111
Management Vectors 112
Knowledge Management 116
Change Management 120
Change Management for IT Organizations 123
Social Networks and Information Technology 134
Chapter 6 Organizational Transformation and the Balanced Scorecard 139
Introduction 139
Methods of Ongoing Evaluation 146
Balanced Scorecards and Discourse 156
Knowledge Creation, Culture, and Strategy 158
Chapter 7 Virtual Teams and Outsourcing 163
Introduction 163
Status of Virtual Teams 165
Management Considerations 166
Dealing with Multiple Locations 166
Externalization 169
Internalization 171
Combination 171
Socialization 172
Externalization Dynamism 172
Internalization Dynamism 173
Combination Dynamism 173
Socialization Dynamism 173
Dealing with Multiple Locations and Outsourcing 177
Revisiting Social Discourse 178
Identity 179
Skills 180
Emotion 181
Chapter 8 Synergistic Union of IT and Organizational Learning 187
Introduction 187
Siemens AG 187
Aftermath 202
ICAP 203
Five Years Later 224
HTC 225
IT History at HTC 226
Interactions of the CEO 227
The Process 228
Transformation from the Transition 229
Five Years Later 231
Summary 233
Chapter 9 Forming a Cyber Security Culture 239
Introduction 239
History 239
Talking to the Board 241
Establishing a Security Culture 241
Understanding What It Means to be Compromised 242
Cyber Security Dynamism and Responsive Organizational
Dynamism 242
Cyber Strategic Integration 243
Cyber Cultural Assimilation 245
Summary 246
Organizational Learning and Application Development 246
Cyber Security Risk 247
Risk Responsibility 248
Driver/Supporter Implications 250
Chapter 10 Digital Transformation and Changes in Consumer Behavior 251
Introduction 251
Requirements without Users and without Input 254
Concepts of the S-Curve and Digital Transformation
Analysis and Design 258
Organizational Learning and the S-Curve 260
Communities of Practice 261
The IT Leader in the Digital Transformation Era 262
How Technology Disrupts Firms and Industries 264
Dynamism and Digital Disruption 264
Critical Components of “Digital” Organization 265
Assimilating Digital Technology Operationally and Culturally 267
Conclusion 268
Chapter 11 Integrating Generation Y Employees to Accelerate Competitive Advantage 269
Introduction 269
The Employment Challenge in the Digital Era 270
Gen Y Population Attributes 272
Advantages of Employing Millennials to Support Digital
Transformation 272
Integration of Gen Y with Baby Boomers and Gen X 273
Designing the Digital Enterprise 274
Assimilating Gen Y Talent from Underserved and Socially
Excluded Populations 276
Langer Workforce Maturity Arc 277
Theoretical Constructs of the LWMA 278
The LWMA and Action Research 281
Implications for New Pathways for Digital Talent 282
Demographic Shifts in Talent Resources 282
Economic Sustainability 283
Integration and Trust 283
Global Implications for Sources of Talent 284
Conclusion 284
Chapter 12 Toward Best Practices 287
Introduction 287
Chief IT Executive 288
Definitions of Maturity Stages and Dimension Variables in
the Chief IT Executive Best Practices Arc 297
Maturity Stages 297
Performance Dimensions 298
Chief Executive Officer 299
CIO Direct Reporting to the CEO 305
Outsourcing 306
Centralization versus Decentralization of IT 306
CIO Needs Advanced Degrees 307
Need for Standards 307
Risk Management 307
The CEO Best Practices Technology Arc 313
Definitions of Maturity Stages and Dimension Variables in
the CEO Technology Best Practices Arc 314
Maturity Stages 314
Performance Dimensions 315
Middle Management 316
The Middle Management Best Practices Technology Arc 323
Definitions of Maturity Stages and Dimension Variables in
the Middle Manager Best Practices Arc 325
Maturity Stages 325
Performance Dimensions 326
Summary 327
Ethics and Maturity 333
Chapter 13 Conclusions 339
Introduction 339
Glossary 357
References 363
Index 373
Foreword
Digital technologies are transforming the global economy. Increasingly,
firms and other organizations are assessing their opportunities, develop-
ing and delivering products and services, and interacting with custom-
ers and other stakeholders digitally. Established companies recognize
that digital technologies can help them operate their businesses with
greater speed and lower costs and, in many cases, offer their custom-
ers opportunities to co-design and co-produce products and services.
Many start-up companies use digital technologies to develop new prod-
ucts and business models that disrupt the present way of doing busi-
ness, taking customers away from firms that cannot change and adapt.
In recent years, digital technology and new business models have dis-
rupted one industry after another, and these developments are rapidly
transforming how people communicate, learn, and work.
Against this backdrop, the third edition of Arthur Langer’s
Information Technology and Organizational Learning is most welcome.
For decades, Langer has been studying how firms adapt to new or
changing conditions by increasing their ability to incorporate and use
advanced information technologies. Most organizations do not adopt
new technology easily or readily. Organizational inertia and embed-
ded legacy systems are powerful forces working against the adoption
of new technology, even when the advantages of improved technology
are recognized. Investing in new technology is costly, and it requires
aligning technology with business strategies and transforming cor-
porate cultures so that organization members use the technology to
become more productive.
Information Technology and Organizational Learning addresses these
important issues— and much more. There are four features of the new
edition that I would like to draw attention to that, I believe, make
this a valuable book. First, Langer adopts a behavioral perspective
rather than a technical perspective. Instead of simply offering norma-
tive advice about technology adoption, he shows how sound learn-
ing theory and principles can be used to incorporate technology into
the organization. His discussion ranges across the dynamic learning
organization, knowledge management, change management, com-
munities of practice, and virtual teams. Second, he shows how an
organization can move beyond technology alignment to true technol-
ogy integration. Part of this process involves redefining the traditional
support role of the IT department to a leadership role in which IT
helps to drive business strategy through a technology-based learn-
ing organization. Third, the book contains case studies that make the
material come alive. The book begins with a comprehensive real-life
case that sets the stage for the issues to be resolved, and smaller case
illustrations are sprinkled throughout the chapters, to make concepts
and techniques easily understandable. Lastly, Langer has a wealth of
experience that he brings to his book. He spent more than 25 years
as an IT consultant and is the founder of the Center for Technology
Management at Columbia University, where he directs certificate and
executive programs on various aspects of technology innovation and
management. He has organized a vast professional network of tech-
nology executives whose companies serve as learning laboratories for
his students and research. When you read the book, the knowledge
and insight gained from these experiences is readily apparent.
If you are an IT professional, Information Technology and Organi
zational Learning should be required reading. However, anyone who
is part of a firm or agency that wants to capitalize on the opportunities
provided by digital technology will benefit from reading the book.
Charles C. Snow
Professor Emeritus, Penn State University
CoEditor, Journal of Organization Design
Acknowledgments
Many colleagues and clients have provided significant support during
the development of the third edition of Information Technology and
Organizational Learning.
I owe much to my colleagues at Teachers College, namely, Professor
Victoria Marsick and Lyle Yorks, who guided me on many of the the-
ories on organizational learning, and Professor Lee Knefelkamp, for
her ongoing mentorship on adult learning and developmental theo-
ries. Professor David Thomas from the Harvard Business School also
provided valuable direction on the complex issues surrounding diver-
sity, and its importance in workforce development.
I appreciate the corporate executives who agreed to participate
in the studies that allowed me to apply learning theories to actual
organizational practices. Stephen McDermott from ICAP provided
invaluable input on how chief executive officers (CEOs) can success-
fully learn to manage emerging technologies. Dana Deasy, now global
chief information officer (CIO) of JP Morgan Chase, contributed
enormous information on how corporate CIOs can integrate tech-
nology into business strategy. Lynn O’Connor Vos, CEO of Grey
Healthcare, also showed me how technology can produce direct mon-
etary returns, especially when the CEO is actively involved.
And, of course, thank you to my wonderful students at Columbia
University. They continue to be at the core of my inspiration and love
for writing, teaching, and scholarly research.
Author
Arthur M. Langer, EdD, is professor of professional practice
of management and the director of the Center for Technology
Management at Columbia University. He is the academic direc-
tor of the Executive Masters of Science program in Technology
Management, vice chair of faculty and executive advisor to the dean
at the School of Professional Studies and is on the faculty of the
Department of Organization and Leadership at the Graduate School
of Education (Teachers College). He has also served as a member of
the Columbia University Faculty Senate. Dr. Langer is the author
of Guide to Software Development: Designing & Managing the Life
Cycle, 2nd Edition (2016), Strategic IT: Best Practices for Managers
and Executives (2013 with Lyle Yorks), Information Technology and
Organizational Learning (2011), Analysis and Design of Information
Systems (2007), Applied Ecommerce (2002), and The Art of Analysis
(1997), and has numerous published articles and papers, relating
to digital transformation, service learning for underserved popula-
tions, IT organizational integration, mentoring, and staff develop-
ment. Dr. Langer consults with corporations and universities on
information technology, cyber security, staff development, man-
agement transformation, and curriculum development around the
Globe. Dr. Langer is also the chairman and founder of Workforce
Opportunity Services (www.wforce.org), a non-profit social venture
that provides scholarships and careers to underserved populations
around the world.
Dr. Langer earned a BA in computer science, an MBA in
accounting/finance, and a Doctorate of Education from Columbia
University.
Introduction
Background
Information technology (IT) has become a more significant part of
workplace operations, and as a result, information systems person-
nel are key to the success of corporate enterprises, especially with
the recent effects of the digital revolution on every aspect of business
and social life (Bradley & Nolan, 1998; Langer, 1997, 2011; Lipman-
Blumen, 1996). This digital revolution is defined as a form of “disruption.” Indeed, the big question facing many enterprises today is,
How can executives anticipate the unexpected threats brought on by
technological advances that could devastate their business? This book
focuses on the vital role that information and digital technology orga-
nizations need to play in the course of organizational development
and learning, and on the growing need to integrate technology fully
into the processes of workplace organizational learning. Technology
personnel have long been criticized for their inability to function as
part of the business, and they are often seen as a group outside the
corporate norm (Schein, 1992). This is a problem of cultural assimila-
tion, and it represents one of the two major fronts that organizations
now face in their efforts to gain a grip on the new, growing power of
technology, and to be competitive in a global world. The other major
front concerns the strategic integration of new digital technologies
into business line management.
Because technology continues to change at such a rapid pace, the
ability of organizations to operate within a new paradigm of dynamic
change emphasizes the need to employ action learning as a way to
build competitive learning organizations in the twenty-first century.
Information Technology and Organizational Learning integrates some
of the fundamental issues bearing on IT today with concepts from
organizational learning theory, providing comprehensive guidance,
based on real-life business experiences and concrete research.
This book also focuses on another aspect of what IT can mean to
an organization. IT represents a broadening dimension of business life
that affects everything we do inside an organization. This new reality is
shaped by the increasing and irreversible dissemination of technology.
To maximize the usefulness of its encroaching presence in everyday
business affairs, organizations will require an optimal understanding
of how to integrate technology into everything they do. To this end,
this book seeks to break new ground on how to approach and concep-
tualize this salient issue— that is, that the optimization of information
and digital technologies is best pursued with a synchronous imple-
mentation of organizational learning concepts. Furthermore, these
concepts cannot be implemented without utilizing theories of strategic
learning. Therefore, this book takes the position that technology liter-
acy requires individual and group strategic learning if it is to transform
a business into a technology-based learning organization. Technology-
based organizations are defined as those that have implemented a means
of successfully integrating technology into their process of organiza-
tional learning. Such organizations recognize and experience the real-
ity of technology as part of their everyday business function. It is what
many organizations are calling “being digital.”
This book will also examine some of the many existing organi-
zational learning theories, and the historical problems that have
occurred with companies that have used them, or that have failed
to use them. Thus, the introduction of technology into organizations
actually provides an opportunity to reassess and reapply many of the
past concepts, theories, and practices that have been used to support
the importance of organizational learning. It is important, however,
not to confuse this message with a reason for promoting organizational
learning, but rather, to understand the seamless nature of the relation-
ship between IT and organizational learning. Each needs the other to
succeed. Indeed, technology has only served to expose problems that
have existed in organizations for decades, e.g., the inability to drive
down responsibilities to the operational levels of the organization, and
to be more agile with their consumers.
This book is designed to help businesses and individual manag-
ers understand and cope with the many issues involved in developing
organizational learning programs, and in integrating an important
component: their IT and digital organizations. It aims to provide a
combination of research case studies, together with existing theories
on organizational learning in the workplace. The goal is also to pro-
vide researchers and corporate practitioners with a book that allows
them to incorporate a growing IT infrastructure with their exist-
ing workforce culture. Professional organizations need to integrate
IT into their organizational processes to compete effectively in the
technology-driven business climate of today. This book responds to
the complex and various dilemmas faced by many human resource
managers and corporate executives regarding how to actually deal
with many marginalized technology personnel who somehow always
operate outside the normal flow of the core business.
While the history of IT, as a marginalized organization, is rela-
tively short, in comparison to that of other professions, the problems
of IT have been consistent since its insertion into business organiza-
tions in the early 1960s. Indeed, while technology has changed, the
position and valuation of IT have continued to challenge how execu-
tives manage it, account for it, and, most important, ultimately value
its contributions to the organization. Technology personnel continue
to be criticized for their inability to function as part of the business,
and they are often seen as outside the business norm. IT employees
are frequently stereotyped as “techies,” and are segregated in such a
way that they become isolated from the organization. This book pro-
vides a method for integrating IT, and redefining its role in organiza-
tions, especially as a partner in formulating and implementing key
business strategies that are crucial for the survival of many companies
in the new digital age. Rather than provide a long and extensive list of
common issues, I have decided it best to uncover the challenges of IT
integration and performance through the case study approach.
IT continues to be one of the most important yet least understood
departments in an organization. It has also become one of the most
significant components for competing in the global markets of today.
IT is now an integral part of the way companies become successful,
and is now being referred to as the digital arm of the business. This
is true across all industries. The role of IT has grown enormously in
companies throughout the world, and it has a mission to provide stra-
tegic solutions that can make companies more competitive. Indeed,
the success of IT, and its ability to operate as part of the learning
organization, can mean the difference between the success and failure
of entire companies. However, IT must be careful that it is not seen as
just a factory of support personnel, and that it does not lose its justification
as a driver of competitive advantage. We see in many organizations that
other digital-based departments are being created, due to frustration
with the traditional IT culture, or because they simply do not see IT
as meeting the current needs for operating in a digital economy.
This book provides answers to other important questions that have
challenged many organizations for decades. First, how can manag-
ers master emerging digital technologies, sustain a relationship with
organizational learning, and link it to strategy and performance?
Second, what is the process by which to determine the value of using
technology, and how does it relate to traditional ways of calculating
return on investment, and establishing risk models? Third, what are
the cyber security implications of technology-based products and
services? Fourth, what are the roles and responsibilities of the IT
executive, and the department in general? To answer these questions,
managers need to focus on the following objectives:
• Address the operational weaknesses in organizations, in
terms of how to deal with new technologies, and how to bet-
ter realize business benefits.
• Provide a mechanism that both enables organizations to deal
with accelerated change caused by technological innovations,
and integrates them into a new cycle of processing, and han-
dling of change.
• Provide a strategic learning framework, by which every new
technology variable adds to organizational knowledge and
can develop a risk and security culture.
• Establish an integrated approach that ties technology account-
ability to other measurable outcomes, using organizational
learning techniques and theories.
To realize these objectives, organizations must be able to
• create dynamic internal processes that can deal, on a daily
basis, with understanding the potential fit of new technologies
and their overall value within the structure of the business;
• provide the discourse to bridge the gaps between IT- and non-
IT-related investments and uses, merging them into one integrated system;
• monitor investments and determine modifications to the life
cycle;
• implement various organizational learning practices, includ-
ing learning organization, knowledge management, change
management, and communities of practice, all of which help
foster strategic thinking, and learning, and can be linked to
performance (Gephardt & Marsick, 2003).
The strengths of this book are that it integrates theory and practice
and provides answers to the four common questions mentioned. Many
of the answers provided in these pages are founded on theory and
research and are supported by practical experience. Thus, evidence of
the performance of the theories is presented via case studies, which
are designed to assist the readers in determining how such theories
and proven practices can be applied to their specific organization.
A common theme in this book involves three important terms:
dynamic, unpredictable, and acceleration. Dynamic is a term that rep-
resents spontaneous and vibrant things—a motive force. Technology
behaves with such a force and requires organizations to deal with its
capabilities. Glasmeier (1997) postulates that technology evolution,
innovation, and change are dynamic processes. The force then is tech-
nology, and it carries many motives, as we shall see throughout this
book. Unpredictable suggests that we cannot plan what will happen
or will be needed. Many organizational individuals, including execu-
tives, have attempted to predict when, how, or why technology will
affect their organization. Throughout our recent history, especially
during the “digital disruption” era, we have found that it is difficult,
if not impossible, to predict how technology will ultimately benefit or
hurt organizational growth and competitive advantage. I believe that
technology is volatile and erratic at times. Indeed, harnessing tech-
nology is not at all an exact science; certainly not in the ways in which
it can and should be used in today’s modern organization. Finally, I
use the term acceleration to convey the way technology is speeding up
our lives. Not only have emerging technologies created this unpre-
dictable environment of change, but they also continue to change it
rapidly— even from the demise of the dot-com era decades ago. Thus,
what becomes important is the need to respond quickly to technology.
The inability to be responsive to change brought about by technologi-
cal innovations can result in significant competitive disadvantages for
organizations.
This new edition shows why this is the case, especially when examining
the shrinking S-curve. So, we look at these three words—dynamic,
unpredictable, and acceleration—as a way to define how technology
affects organizations; that is, technology is an accelerating motive
force that occurs irregularly. These words name the challenges that
organizations need to address if they are to manage technological
innovations and integrate them with business strategy and competi-
tive advantage. It only makes sense that the challenge of integrating
technology into business requires us first to understand its potential
impact, determine how it occurs, and see what is likely to follow.
There are no quick remedies to dealing with emerging technologies,
just common practices and sustained processes that must be adopted
for organizations to survive in the future.
I had four goals in mind in writing this book. First, I am inter-
ested in writing about the challenges of using digital technologies
strategically. What particularly concerns me is the lack of literature
that truly addresses this issue. What is also troublesome is the lack
of reliable techniques for the evaluation of IT, especially since IT
is used in almost every aspect of business life. So, as we increase
our use and dependency on technology, we seem to understand less
about how to measure and validate its outcomes. I also want to
convey my thoughts about the importance of embracing nonmon-
etary methods for evaluating technology, particularly as they relate
to determining return on investment. Indeed, indirect and non-
monetary benefits need to be part of the process of assessing and
approving IT projects.
Second, I want to apply organizational learning theory to the field
of IT and use proven learning models to help transform IT staff into
becoming better members of their organizations. Everyone seems to
know about the inability of IT people to integrate with other depart-
ments, yet no one has really created a solution to the problem. I find
that organizational learning techniques are an effective way of coach-
ing IT staff to operate more consistently with the goals of the busi-
nesses that they support.
Third, I want to present cogent theories about IT and organiza-
tional learning; theories that establish new ways for organizations to
adapt new technologies. I want to share my experiences and those of
other professionals who have found approaches that can provide posi-
tive outcomes from technology investments.
Fourth, I have decided to express my concerns about the valid-
ity and reliability of organizational learning theories and practices as
they apply to the field of IT. I find that most of these models need to
be enhanced to better fit the unique aspects of the digital age. These
modified models enable the original learning techniques to address
IT-specific issues. In this way, the organization can develop a more
holistic approach toward a common goal for using technology.
Certainly, the balance of how technology ties in with strategy is
essential. However, there has been much debate over whether tech-
nology should drive business strategy or vice versa. We will find that
the answer to this is “yes.” Yes, in the sense that technology can affect
the way organizations determine their missions and business strate-
gies; but “no” in that technology should not be the only component
for determining mission and strategy. Many managers have realized
that business is still business, meaning that technology is not a “sil-
ver bullet.” The challenge, then, is to determine how best to fit tech-
nology into the process of creating and supporting business strategy.
Few would doubt today that technology is, indeed, the most signifi-
cant variable affecting business strategy. However, the most viable
approach is to incorporate technology into the process of determin-
ing business strategy. I have found that many businesses still formu-
late their strategies first, and then look at technology, as a means to
efficiently implement objectives and goals. Executives need to better
understand the unique and important role that technology provides
us; it can drive business strategy, and support it, at the same time.
Managers should not solely focus their attention on generating
breakthrough innovations that will create spectacular results. Most
good uses of technology are much subtler, and longer-lasting. For this
reason, this book discusses and defines new technology life cycles
that blend business strategy and strategic learning. Building on this
theme, I introduce the idea of responsive organizational dynamism as
the core theory of this book. Responsive organizational dynamism
defines an environment that can respond to the three important
terms (dynamic, unpredictable, and acceleration). Indeed, technology
requires organizations that can sustain a system, in which individu-
als can deal with dynamic, unpredictable, and accelerated change, as
part of their regular process of production. The basis of this concept
is that organizations must create and sustain such an environment to
be competitive in a global technologically-driven economy. I further
analyze responsive organizational dynamism in its two subcompo-
nents: strategic integration and cultural assimilation, which address
how technology needs to be measured as it relates to business strategy,
and what related social–structural changes are needed, respectively.
Change is an important principle of this book. I talk about the
importance of how to change, how to manage such change, and why
emerging technologies are a significant agent of change. I support
the need for change, as an opportunity to use many of the learning
theories that have been historically difficult to implement. That is,
implementing change brought on by technological innovation is an
opportunity to make the organization more “change ready” or, as we
define it today, more “agile.” However, we also know that little is
known about how organizations should actually go about modifying
existing processes to adapt to new technologies and become digital
entities— and to be accustomed to doing this regularly. Managing
through such periods of change requires that we develop a model that
can deal with dynamic, unpredictable, and accelerated change. This is
what responsive organizational dynamism is designed to do.
We know that over 20% of IT projects still fail to be completed.
Another 54% fail to meet their projected completion date. We now sit
at the forefront of another technological spurt of innovations that will
necessitate major renovations to existing legacy systems, requiring that
they be linked to sophisticated e-business systems. These e-business
systems will continue to utilize the Internet, and emerging mobile
technologies. While we tend to focus primarily on what technology
generically does, organizations need urgently to prepare themselves
for the next generation of advances, by forming structures that can
deal with continued, accelerated change, as the norm of daily opera-
tions. For this edition, I have added new sections and chapters that
address the digital transformation, ways of dealing with changing
consumer behavior, the need to form evolving cyber security cultures,
and the importance of integrating Gen Y employees to accelerate
competitive advantage.
This book provides answers to a number of dilemmas but ultimately
offers an imbricate cure for the problem of latency in performance and
quality afflicting many technology-based projects. Traditionally,
management has attempted to improve IT performance by increasing
technical skills and project manager expertise through new processes.
While there has been an effort to educate IT managers to become
more interested and participative in business issues, their involvement
continues to be based more on service than on strategy. Yet, at the
heart of the issue is the entirety of the organization. It is my belief that
many of the programmatic efforts conducted in traditional ways and
attempting to mature and integrate IT with the rest of the organiza-
tion will continue to deliver disappointing results.
My personal experience goes well beyond research; it draws from
living and breathing the IT experience for the past 35 years, and
from an understanding of the dynamics of what occurs inside and
outside the IT department in most organizations. With such experi-
ence, I can offer a path that engages the participation of the entire
management team and operations staff of the organization. While
my vision for this kind of digital transformation is different from
other approaches, it is consistent with organizational learning theo-
ries that promote the integration of individuals, communities, and
senior management to participate in more democratic and vision-
ary forms of thinking, reflection, and learning. It is my belief that
many of the dilemmas presented by IT have existed in other parts of
organizations for years, and that the Internet revolution only served
to expose them. If we believe this to be true, then we must begin
the process of integrating technology into strategic thinking and
stop depending on IT to provide magical answers and holding inappropriate
expectations of its performance.
Technology is not the responsibility of any one person or depart-
ment; rather, it is part of the responsibility of every employee. Thus,
the challenge is to allow organizations to understand how to modify
their processes, and the roles and responsibilities of their employees,
to incorporate digital technologies as part of normal workplace activi-
ties. Technology then becomes more a subject and a component of
discourse. IT staff members need to emerge as specialists who par-
ticipate in decision making, development, and sustained support of
business evolution. There are also technology-based topics that do
not require the typical expertise that IT personnel provide. This is
a literacy issue that requires different ways of thinking and learning
during the everyday part of operations. For example, using desktop
tools, communicating via e-mail, and saving files and data are inte-
gral to everyday operations. These activities affect projects, yet they
are not really part of the responsibilities of IT departments. Given
the knowledge that technology is everywhere, we must change the
approach that we take to be successful. Another way of looking at this
phenomenon is to define technology more as a commodity, readily
available to all individuals. This means that the notion of technology
as organizationally segregated into separate cubes of expertise is prob-
lematic, particularly on a global front.
Thus, the overall aim of this book is to promote organizational
learning that disseminates the uses of technology throughout a busi-
ness, so that IT departments are a partner in its use, as opposed to
being its sole owner. The cure to IT project failure, then, is to engage
the business in technology decisions in such a way that individuals
and business units are fundamentally involved in the process. Such
processes need to be designed to dynamically respond to technology
opportunities and thus should not be overly bureaucratic. There is a
balance between establishing organizations that can readily deal with
technology versus those that become too complex and inefficient.
This balance can only be attained using organizational learning
techniques as the method to grow and reach technology maturation.
Overview of the Chapters
Chapter 1 provides an important case study of the Ravell Corporation
(a pseudonym), where I was retained for over five years. During this
period, I applied numerous organizational learning methods toward
the integration of the IT department with the rest of the organiza-
tion. The chapter allows readers to understand how the theories of
organizational learning can be applied in actual practice, and how
those theories are particularly beneficial to the IT community. The
chapter also shows the practical side of how learning techniques can
be linked to measurable outcomes, and ultimately related to business
strategy. This concept will become the basis of integrating learning
with strategy (i.e., “strategic learning”). The Ravell case study also
sets the tone of what I call the IT dilemma, which represents the
core problem faced by organizations today. Furthermore, the Ravell
case study becomes the cornerstone example throughout the book and
is used to relate many of the theories of learning and their practical
applicability in organizations. The Ravell case has also been updated
in this second edition to include recent results that support the impor-
tance of alignment with the human resources department.
Chapter 2 presents the details of the IT dilemma. This chapter
addresses issues such as isolation of IT staff, which results in their
marginalization from the rest of the organization. I explain that while
executives want technology to be an important part of business strat-
egy, few understand how to accomplish it. In general, I show that
individuals have a lack of knowledge about how technology and busi-
ness strategy can, and should, be linked, to form common business
objectives. The chapter provides the results of a three-year study of
how chief executives link the role of technology with business strat-
egy. The study captures information relating to how chief executives
perceive the role of IT, how they manage it, and use it strategically,
and the way they measure IT performance and activities.
Chapter 3 focuses on defining how organizations need to respond
to the challenges posed by technology. I analyze technological dyna-
mism in its core components so that readers understand the different
facets that comprise its many applications. I begin by presenting tech-
nology as a dynamic variable that is capable of affecting organizations
in a unique way. I specifically emphasize the unpredictability of tech-
nology, and its capacity to accelerate change— ultimately concluding
that technology, as an independent variable, has a dynamic effect on
organizational development. This chapter also introduces my theory
of responsive organizational dynamism, defined as a disposition in
organizational behavior that can respond to the demands of tech-
nology as a dynamic variable. I establish two core components of
responsive organizational dynamism: strategic integration and cultural
assimilation. Each of these components is designed to tackle a specific
problem introduced by technology. Strategic integration addresses the
way in which organizations determine how to use technology as part
of business strategy. Cultural assimilation, on the other hand, seeks
to answer how the organization, both structurally and culturally, will
accommodate the actual human resources of an IT staff and depart-
ment within the process of implementing new technologies. Thus,
strategic integration will require organizational changes in terms of
cultural assimilation. The chapter also provides a perspective of the
technology life cycle so that readers can see how responsive organi-
zational dynamism is applied, on an IT project basis. Finally, I define
the driver and supporter functions of IT and how these contribute to
managing technology life cycles.
Chapter 4 introduces theories on organizational learning, and
applies them specifically to responsive organizational dynamism. I
emphasize that organizational learning must result in individual, and
organizational transformation, that leads to measurable performance
outcomes. The chapter defines a number of organizational learning
theories, such as reflective practices, learning organization, communi-
ties of practice, learning preferences and experiential learning, social
discourse, and the use of language. These techniques and approaches
to promoting organizational learning are then configured into various
models that can be used to assess individual and organizational devel-
opment. Two important models are designed to be used in responsive
organizational dynamism: the applied individual learning wheel and
the technology maturity arc. These models lay the foundation for my
position that learning maturation involves a steady linear progression
from an individual focus toward a system or organizational perspec-
tive. The chapter also addresses implementation issues— political
challenges that can get in the way of successful application of the
learning theories.
Chapter 5 explores the role of management in creating and sustain-
ing responsive organizational dynamism. I define the tiers of middle
management in relation to various theories of management partici-
pation in organizational learning. The complex issues of whether
organizational learning needs to be managed from the top down,
bottom up, or middle-top-down are discussed and applied to a model
that operates in responsive organizational dynamism. This chapter
takes into account the common three-tier structure in which most
organizations operate: executive, middle, and operations. The execu-
tive level includes the chief executive officer (CEO), president, and
senior vice presidents. The middle is the most complex, ranging from
vice president/director to supervisory roles. Operations covers what is
commonly known as “staff,” including clerical functions. The knowl-
edge that I convey suggests that all of these tiers need to participate in
management, including operations personnel, via a self-development
model. The chapter also presents the notion that knowledge manage-
ment is necessary to optimize competitive advantage, particularly as
it involves transforming tacit knowledge into explicit knowledge. I
view the existing theories on knowledge management, create a hybrid
model that embraces technology issues, and map them to responsive
organizational dynamism. Discussions on change management are
included as a method of addressing the unique ways that technol-
ogy affects product development. Essentially, I tie together respon-
sive organizational dynamism with organizational change theory, by
offering modifications to generally accepted theories. There is also a
specific model created for IT organizations that maps onto organi-
zational-level concepts. Although I have used technology as the basis
for the need for responsive organizational dynamism, I show that the
needs for its existence can be attributed to any variable that requires
dynamic change. As such, I suggest that readers begin to think about
the next “ technology” or variable that can cause the same needs to
occur inside organizations. The chapter has been extended to address
the impact of social networking and the leadership opportunities it
provides to technology executives.
Chapter 6 examines how organizational transformation occurs.
The primary focus of the chapter is to integrate transformation theory
with responsive organizational dynamism. The position taken is that
organizational learning techniques must inevitably result in orga-
nizational transformation. Discussions on transformation are often
addressed at the organizational level, as opposed to focusing on individual
development. As in other sections of the book, I extend a number
of theories so that they can operate under the auspices of responsive
organizational dynamism, specifically, the works of Yorks and Marsick
(2000) and Aldrich (2001). I expand organizational transformation
to include ongoing assessment within technology deliverables. This
is accomplished through the use of a modified Balanced Scorecard
originally developed by Kaplan and Norton (2001). The Balanced
Scorecard becomes the vehicle for establishing a strategy-focused and
technology-based organization.
Chapter 7 deals with the many business transformation projects
that require outsource arrangements and virtual team management.
This chapter provides an understanding of when and how to consider
outsourcing and the intricacies of considerations once operating with
virtual teams. I cover such issues as management considerations and
the challenges of dealing in multiple locations. The chapter extends the
models discussed in previous chapters so that they can be aligned with
operating in a virtual team environment. Specifically, this includes
communities of practice, social discourse, self-development, knowl-
edge management, and, of course, responsive organizational dyna-
mism and its corresponding maturity arcs. Furthermore, I expand the
conversation to include IT and non-IT personnel, and the arguments
for the further support needed to integrate all functions across the
organization.
Chapter 8 presents updated case studies that demonstrate how my
organizational learning techniques are actually applied in practice.
Three case studies are presented: Siemens AG, ICAP, and HTC.
Siemens AG is a diverse international company with 20 discrete
businesses in over 190 countries. The case study offers a perspec-
tive of how a corporate chief information officer (CIO) introduced
e-business strategy. ICAP is a leading international money and secu-
rities broker. This case study follows the activities of the electronic trad-
ing community (ETC) entity, and how the CEO transformed the
organization and used organizational learning methods to improve
competitive advantage. HTC (a pseudonym) provides an example of
why the chief IT executive should report to the CEO, and how a
CEO can champion specific projects to help transform organizational
norms and behaviors. This case study also maps the transformation of
the company to actual examples of strategic advantage.
Chapter 9 focuses on the challenges of forming a “cyber security”
culture. The growing challenges of protecting companies from outside
attacks have established the need to create a cyber security culture.
This chapter addresses the ways in which information technology
organizations must further integrate with business operations, so
that their firms are better equipped to protect against outside threats.
Since the general consensus is that no system can be 100% protected,
and that most system compromises occur as a result of internal expo-
sures, information technology leaders must educate employees on
best practices to limit cyberattacks. Furthermore, while prevention is
the objective, organizations must be internally prepared to deal with
attacks and thus have processes in place should a system become pen-
etrated by third-party agents.
Chapter 10 explores the effects of the digital global economy on
the ways in which organizations need to respond to the consumeriza-
tion of products and services. From this perspective, digital transfor-
mation involves a type of social reengineering that affects the ways in
which organizations communicate internally, and how they consider
restructuring departments. Digital transformation also affects the
risks that organizations must take in what has become an accelerated
changing consumer market.
Chapter 11 provides conclusions and focuses on Gen Y employ-
ees who are known as “digital natives” and represent the new supply
chain of talent. Gen Y employees possess the attributes to assist com-
panies to transform their workforce to meet the accelerated change in
the competitive landscape. Most executives across industries recog-
nize that digital technologies are the most powerful variable to main-
taining and expanding company markets. Gen Y employees provide a
natural fit for dealing with emerging digital technologies. However,
success with integrating Gen Y employees is contingent upon Baby
Boomer and Gen X management adopting new leadership philoso-
phies and procedures suited to meet the expectations and needs of
these new workers. Ignoring the unique needs of Gen Y employees
will likely result in an incongruent organization that suffers high
turnover of young employees who will ultimately seek a more entre-
preneurial environment.
Chapter 12 seeks to define best practices to implement and sus-
tain responsive organizational dynamism. The chapter sets forth a
model that creates separate, yet linked, best practices and maturity
arcs that can be used to assess stages of the learning development
of the chief IT executive, the CEO, and the middle management. I
discuss the concept of common threads, by which each best practices
arc links through common objectives and outcomes to the responsive
organizational dynamism maturity arc presented in Chapter 4. Thus,
these arcs represent an integrated and hierarchical view of how each
component of the organization contributes to overall best practices. A
new section has been added that links ethics to technology leadership
and maturity.
Chapter 13 summarizes the many aspects of how IT and organi-
zational learning operate together to support the responsive organi-
zational dynamism environment. The chapter emphasizes the specific
key themes developed in the book, such as evolution versus revolu-
tion; control and empowerment; driver and supporter operations; and
responsive organizational dynamism and self-generating organiza-
tions. Finally, I provide an overarching framework for “organizing”
reflection and integrate it with the best practices arcs.
As a final note, I need to clarify my use of the words information
technology, digital technology, and technology. In many parts of the book,
they are used interchangeably, although there is a defined difference.
Of course, not all technology is related to information or digital; some
is based on machinery or the like. For the purposes of this book, the
reader should assume that IT and digital technology are the primary
variables that I am addressing. However, the theories and processes
that I offer can be scaled to all types of technological innovation.
1
The “Ravell” Corporation
Introduction
Launching into an explanation of information technology (IT),
organizational learning, and the practical relationship into which I
propose to bring them is a challenging undertaking. I choose,
therefore, to begin this discussion by presenting an actual case study
that exemplifies many key issues pertaining to organizational learn-
ing, and how it can be used to improve the performance of an IT
department. Specifically, this chapter summarizes a case study of
the IT department at the Ravell Corporation (a pseudonym) in New
York City. I was retained as a consultant at the company to improve
the performance of the department and to solve a mounting politi-
cal problem involving IT and its relation to other departments. The
case offers an example of how a company, growing as a “learn-
ing organization”—one in which employees are constantly learning
during the normal workday (Argyris, 1993; Watkins & Marsick,
1993)—utilized reflective practices to help it achieve the practical stra-
tegic goals it sought. Individuals in learning organizations integrate
processes of learning into their work. Therefore, a learning organiza-
tion must advocate a system that allows its employees to interact, ask
questions, and provide insight to the business. The learning organiza-
tion will ultimately promote systematic thinking, and the building
of organizational memory (Watkins & Marsick, 1993). A learning
organization (discussed more fully in Chapter 4) is a component of
the larger topic of organizational learning.
The Ravell Corporation is a firm with over 500 employees that,
over the years, had become dependent on the use of technology to
run its business. Its IT department, like that of many other compa-
nies, was isolated from the rest of the business and was regarded as
a peripheral entity whose purpose was simply to provide technical
support. This was accompanied by actual physical isolation—IT was
placed in a contained and secure location away from mainstream
operations. As a result, IT staff rarely engaged in active discourse
with other staff members unless specific meetings were called relat-
ing to a particular project. The Ravell IT department, therefore, was
not part of the community of organizational learning—it did not
have the opportunity to learn along with the rest of the organiza-
tion, and it was never asked to provide guidance in matters of gen-
eral relevance to the business as a whole. This marginalized status
resulted in an us-versus-them attitude on the part of IT and non-IT
personnel alike.
Much has been written about the negative impact of marginal-
ization on individuals who are part of communities. Schlossberg
(1989) researched adults in various settings and how marginal-
ization affected their work and self-efficacy. Her theory on mar-
ginalization and mattering is applied to this case study because of
its relevance and similarity to her prior research. For example, IT
exhibits characteristics similar to those of a separate group on a college
campus or in a workplace environment. Its physical isolation can
also be related to how marginalized groups move away from the
majority population and function without contact. The IT direc-
tor, in particular, had cultivated an adversarial relationship with his
peers. The director had shaped a department that fueled his view of
separation. This had the effect of further marginalizing the posi-
tion of IT within the organization. Hand in hand with this form of
separatism came a sense of actual dislike on the part of IT personnel
for other employees. IT staff members were quick to point fingers
at others and were often noncommunicative with members of other
departments within the organization. As a result of this kind of
behavior, many departments lost confidence in the ability of IT to
provide support; indeed, the quality of support that IT furnished
had begun to deteriorate. Many departments at Ravell began to hire
their own IT support personnel and were determined to create their
own information systems subdepartments. This situation eventually
became unacceptable to management, and the IT director was ter-
minated. An initiative was begun to refocus the department and its
position within the organization. I was retained to bring about this
change and to act as the IT director until a structural transforma-
tion of the department was complete.
A New Approach
My mandate at Ravell was initially unclear—I was to “fix” the
problem; the specific solution was left up to me to design and imple-
ment. My goal became one of finding a way to integrate IT fully into
the organizational culture at Ravell. Without such integration, IT
would remain isolated, and no amount of “fixing” around this issue
would address the persistence of what was, as well, a cultural prob-
lem. Unless IT became a true part of the organization as a whole,
the entire IT staff could be replaced without any real change having
occurred from the organization’s perspective. That is, just replacing
the entire IT staff was an acceptable solution to senior management.
The fact that this was acceptable suggested to me that the knowledge
and value contained in the IT department did not exist or was mis-
understood by the senior management of the firm. In my opinion,
just eliminating a marginalized group was not a solution because I
expected that such knowledge and value did exist, and that it needed
to be investigated properly. Thus, I rejected management’s option and
began to formulate a plan to better understand the contributions that
could be made by the IT department. The challenge was threefold: to
improve the work quality of the IT department (a matter of perfor-
mance), to help the department begin to feel itself a part of the orga-
nization as a whole and vice versa (a matter of cultural assimilation),
and to persuade the rest of the organization to accept the IT staff as
equals who could contribute to the overall direction and growth of the
organization (a fundamental matter of strategic integration).
My first step was to gather information. On my assignment to the
position of IT director, I quickly arranged a meeting with the IT
department to determine the status and attitudes of its personnel.
The IT staff meeting included the chief financial officer (CFO), to
whom IT reported. At this meeting, I explained the reasons behind
the changes occurring in IT management. Few questions were asked;
as a result, I immediately began scheduling individual meetings with
each of the IT employees. These employees varied in terms of their
position within the corporate hierarchy, in terms of salary, and in
terms of technical expertise. The purpose of the private meetings was
to allow IT staff members to speak openly, and to enable me to hear
their concerns. I drew on the principles of action science, pioneered
by Argyris and Schön (1996), designed to promote individual self-
reflection regarding behavior patterns, and to encourage a produc-
tive exchange among individuals. Action science encompasses a range
of methods to help individuals learn how to be reflective about their
actions. By reflecting, individuals can better understand the outcomes
of their actions and, especially, how they are seen by others. This was
an important approach because I felt learning had to start at the indi-
vidual level as opposed to attempting group learning activities. It was
my hope that the discussions I orchestrated would lead the IT staff to
a better understanding than they had previously shown, not only of
the learning process itself, but also of the significance of that process.
I pursued these objectives by guiding them to detect problem areas in
their work and to undertake a joint effort to correct them (Argyris,
1993; Arnett, 1992).
Important components of reflective learning are single-loop and
double-loop learning. Single-loop learning requires individuals to
reflect on a prior action or habit that needs to be changed in the future
but does not require individuals to change their operational proce-
dures with regard to values and norms. Double-loop learning, on the
other hand, does require both change in behavior and change in oper-
ational procedures. For example, people who engage in double-loop
learning may need to adjust how they perform their job, as opposed to
just the way they communicate with others, or, as Argyris and Schön
(1996, p. 22) state, “the correction of error requires inquiry through
which organizational values and norms themselves are modified.”
Despite my efforts and intentions, not all of the exchanges were
destined to be successful. Many of the IT staff members felt that the
IT director had been forced out, and that there was consequently
no support for the IT function in the organization. There was also
clear evidence of internal political division within the IT department;
members openly criticized each other. Still other interviews resulted
in little communication. This initial response from IT staff was disap-
pointing, and I must admit I began to doubt whether these learning
methods would be an antidote for the department. Replacing people
began to seem more attractive, and I now understood why many man-
agers prefer to replace staff, as opposed to investing in their transfor-
mation. However, I also knew that learning is a gradual process and
that it would take time and trust to see results.
I realized that the task ahead called for nothing short of a total cul-
tural transformation of the IT organization at Ravell. Members of the
IT staff had to become flexible and open if they were to become more
trusting of one another and more reflective as a group (Garvin, 2000;
Schein, 1992). Furthermore, they had to have an awareness of their
history, and they had to be willing to institute a vision of partnering
with the user community. An important part of the process for me
was to accept the fact that the IT staff were not habitually inclined to
be reflective. My goal then was to create an environment that would
foster reflective learning, which would in turn enable a change in
individual and organizational values and norms (Senge, 1990).
The Blueprint for Integration
Based on information drawn from the interviews, I developed a pre-
liminary plan to begin to integrate IT into the day-to-day operations
at Ravell, and to bring IT personnel into regular contact with other
staff members. According to Senge (1990), the most productive learn-
ing occurs when skills are combined in the activities of advocacy and
inquiry. My hope was to encourage both among the staff at Ravell. The
plan for integration and assimilation involved assigning IT resources
to each department; that is, following the logic of the self-dissemina-
tion of technology, each department would have its own dedicated IT
person to support it. However, just assigning a person was not enough,
so I added the commitment to actually relocate an IT person into each
physical area. This way, rather than clustering together in an area of
their own, IT people would be embedded throughout the organiza-
tion, getting first-hand exposure to what other departments did, and
learning how to make an immediate contribution to the productiv-
ity of these departments. The on-site IT person in each department
would have the opportunity to observe problems when they arose—
and hence, to seek ways to prevent them—and, significantly, to share
in the sense of accomplishment when things went well. To reinforce
their commitment to their respective areas, I specified that IT person-
nel were to report not only to me but also to the line manager in their
respective departments. In addition, these line managers were to have
input on the evaluation of IT staff. I saw that making IT staff offi-
cially accountable to the departments they worked with was a tangible
way to raise their level of commitment to the organization. I hoped
that putting line managers in a supervisory position would help build
a sense of teamwork between IT and non-IT personnel. Ultimately,
the focus of this approach was to foster the creation of a tolerant and
supportive cultural climate for IT within the various departments; an
important corollary goal here was also to allow reflective reviews of
performance to flourish (Garvin, 1993).
Enlisting Support
Support for this plan had to be mustered quickly if I was to create an
environment of trust. I had to reestablish the need for the IT func-
tion within the company, show that it was critical for the company’s
business operations, and show that its integration posed a unique
challenge to the company. However, it was not enough just for me
to claim this. I also had to enlist key managers to claim it. Indeed,
employees will cooperate only if they believe that self-assessment and
critical thinking are valued by management (Garvin, 2000). I decided
to embark on a process of arranging meetings with specific line man-
agers in the organization. I selected individuals who would represent
the day-to-day management of the key departments. If I could get
their commitment to work with IT, I felt it could provide the stimulus
we needed. Some line managers were initially suspicious of the effort
because of their prior experiences with IT. However, they generally
liked the idea of integration and assimilation that was presented to
them, and agreed to support it, at least on a trial basis.
Predictably, the IT staff were less enthusiastic about the idea. Many
of them felt threatened, fearing that they were about to lose their
independence or lose the mutual support that comes from being in a
cohesive group. I had hoped that holding a series of meetings would
help me gain support for the restructuring concept. I had to be care-
ful to ensure that the staff members would feel that they also had an
opportunity to develop a plan that they were confident would work.
During a number of group sessions, we discussed various scenarios of
how such a plan might work. I emphasized the concepts of integra-
tion and assimilation, and that a program of their implementation
would be experimental. Without realizing it, I had engaged IT staff
members in a process of self-governance. Thus, I empowered them
to feel comfortable with voicing new ideas, without being concerned
that they might be openly criticized by me if I did not agree. This pro-
cess also encouraged individuals to begin thinking more as a group.
Indeed, by directing the practice of constructive criticism among
the IT staff, I had hoped to elicit a higher degree of reflective action
among the group and to show them that they had the ability to learn
from one another as well as the ability to design their own roles in the
organization (Argyris, 1993). Their acceptance of physical integration
and, hence, cultural assimilation became a necessary condition for
the ability of the IT group to engage in greater reflective behavior
(Argyris & Schön, 1996).
Assessing Progress
The next issue concerned individual feedback. How was I to let each
person know how he or she was doing? I decided first, to get feedback
from the larger organizational community. This was accomplished
by meeting with the line managers and obtaining whatever feed-
back was available from them. I was surprised at the large quantity
of information they were willing to offer. The line managers were not
shy about participating, and their input allowed me to complete two
objectives: (1) to understand how the IT staff was being perceived in
its new assignment and (2) to create a social and reflective relation-
ship between IT individuals and the line managers. The latter objec-
tive was significant, for if we were to be successful, the line managers
would have to assist us in the effort to integrate and assimilate IT
functions within their community.
After the discussions with managers were completed, individual
meetings were held with each IT staff member to discuss the feedback.
I chose not to attribute the feedback to specific line managers but rather
to address particular issues by conveying the general consensus about
them. Mixed feelings were also disclosed by the IT staff. After convey-
ing the information, I listened attentively to the responses of IT staff
members. Not surprisingly, many of them responded to the feedback
negatively and defensively. Some, for example, felt that many technology
users were unreasonable in their expectations of IT. It was important for
me as facilitator not to find blame among them, particularly if I was to
be a participant in the learning organization (Argyris & Schön, 1996).
Resistance in the Ranks
Any major organizational transformation is bound to elicit resistance
from some employees. The initiative at Ravell proved to be no excep-
tion. Employees are not always sincere, and some individuals will
engage in political behavior that can be detrimental to any organiza-
tional learning effort. Simply put, they are not interested in partici-
pating, or, as Marsick (1998) states, “It would be naïve to expect that
everyone is willing to play on an even field (i.e., fairly).” Early in the
process, I became concerned that members of the IT department spent
much of their time trying to figure out how best to position themselves
for the future instead of attending to matters at hand. I heard from
other employees that the IT staff felt that they would live through my
tenure; that is, just survive until a permanent IT director was hired. It
became difficult at times to elicit the truth from some members of the
IT staff. These individuals would skirt around issues and deny making
statements that were reported by other employees rather than con-
front problems head on. Some IT staff members would criticize me in
front of other groups and use the criticism as proof that the plan for
a general integration was bound to fail. I realized in a most tangible
sense that pursuing change through reflective practice does not come
without resistance, and that this resistance needs to be factored into
the planning of any such organizationally transformative initiative.
Line Management to the Rescue
At the time that we were still working through the resistance within
IT, the plan to establish a relationship with line management began
to work. A number of events occurred that allowed me to be directly
involved in helping certain groups solve their IT problems. Word
spread quickly that there was a new direction in IT that could be
trusted. Line management support is critical for success in such trans-
formational situations. First, line management is typically composed
of people from the ranks of supervisors and middle managers, who are
responsible for the daily operations of their department. Assuming
they do their jobs, senior management will cater to their needs and
listen to their feedback. The line management of any organiza-
tion, necessarily engaged to some degree in the process of learning
(a “learning organization”), is key to its staff. Specifically, line manag-
ers are responsible for operations personnel; at the same time, they
must answer to senior management. Thus, they understand both exec-
utive and operations perspectives of the business (Garvin, 2000). They
are often former staff members themselves and usually have a high
level of technical knowledge. Upper management, while important
for financial support, has little effect at the day-to-day level, yet this is
the level at which the critical work of integration and the building of
a single learning community must be done.
Interestingly, the line management organization had previously
had no shortage of IT-related problems. Many of these line managers
had been committed to developing their own IT staffs; however, they
quickly realized that the exercise was beyond their expertise, and that
they needed guidance and leadership. Their participation in IT staff
meetings had begun to foster a new trust in the IT department, and
they began to see the possibilities of working closely with IT to solve
their problems. Their support began to turn toward what Watkins and
Marsick (1993, p. 117) call “creating alignment by placing the vision
in the hands of autonomous, cross-functional synergetic teams.” The
combination of IT and non-IT teams began to foster a synergy among
the communities, which established new ideas about how best to use
technology.
IT Begins to Reflect
Although it was initially difficult for some staff members to accept,
they soon realized that providing feedback opened the door to the
process of self-reflection within IT. We undertook a number of exer-
cises, to help IT personnel understand how non-IT personnel per-
ceived them, and how their own behavior may have contributed to
these perceptions. To foster self-reflection, I adopted a technique
developed by Argyris called “the left-hand column.” In this technique,
individuals use the right-hand column of a piece of paper to transcribe
dialogues that they felt had not resulted in effective communication.
In the left-hand column of the same page, participants are to write
what they were really thinking at the time of the dialogue but did not
say. This exercise is designed to reveal underlying assumptions that
speakers may not be aware of during their exchanges and that may be
impeding their communication with others by giving others a wrong
impression. The exercise was extremely useful in helping IT personnel
understand how others in the organization perceived them.
Most important, the development of reflective skills, according to
Schön (1983), starts with an individual’s ability to recognize “leaps
of abstraction”—the unconscious and often inaccurate generalizations
people make about others based on incomplete information. In the
case of Ravell, such generalizations were deeply entrenched among its
various personnel sectors. Managers tended to assume that IT staffers
were “just techies,” and that they therefore held fundamentally differ-
ent values and had little interest in the organization as a whole. For
their part, the IT personnel were quick to assume that non-IT people
did not understand or appreciate the work they did. Exposing these
“leaps of abstraction” was key to removing the roadblocks that pre-
vented Ravell from functioning as an integrated learning organization.
Defining an Identity for Information Technology
It was now time to start the process of publicly defining the identity
of IT. Who were we, and what was our purpose? Prior to this time,
IT had no explicit mission. Instead, its members had worked on an
ad hoc basis, putting out fires and never fully feeling that their work
had contributed to the growth or development of the organization as
a whole. This sense of isolation made it difficult for IT members to
begin to reflect on what their mission should or could be. I organized
a series of meetings to begin exploring the question of a mission, and I
offered support by sharing exemplary IT mission statements that were
being implemented in other organizations. The focus of the meetings
was not on convincing them to accept any particular idea but rather on
facilitating a reflective exercise with a group that was undertaking such
a task for the first time (Senge, 1990).
The identity that emerged for the IT department at Ravell was dif-
ferent from the one implicit in their past role. Our new mission would
be to provide technical support and technical direction to the organi-
zation. Of necessity, IT personnel would remain specialists, but they
were to be specialists who could provide guidance to other depart-
ments in addition to helping them solve and prevent problems. As
they became more intimately familiar with what different departments
did—and how these departments contributed to the organization as a
whole—IT professionals would be able to make better informed rec-
ommendations. The vision was that IT people would grow from being
staff who fixed things into team members who offered their expertise
to help shape the strategic direction of the organization and, in the
process, participate fully in organizational growth and learning.
To begin to bring this vision to life, I invited line managers to
attend our meetings. I had several goals in mind with this invita-
tion. Of course, I wanted to increase contact between IT and non-IT
people; beyond this, I wanted to give IT staff an incentive to change
by making them feel a part of the organization as a whole. I also got
a commitment from IT staff that we would not cover up our prob-
lems during the sessions, but would deal with all issues with trust
and honesty. I also believed that the line managers would reciprocate
and allow us to attend their staff meetings. A number of IT indi-
viduals were concerned that my approach would only further expose
our problems with regard to quality performance, but the group as
a whole felt compelled to stick with the belief that honesty would
always prevail over politics. Having gained insight into how the rest of
the organization perceived them, IT staff members had to learn how
to deal with disagreement and how to build consensus to move an
agenda forward. Only then could reflection and action be intimately
intertwined so that after-the-fact reviews could be replaced with peri-
ods of learning and doing (Garvin, 2000).
The meetings were constructive, not only in terms of content issues
handled in the discussions, but also in terms of the number of line
managers who attended them. Their attendance sent a strong message
that the IT function was important to them, and that they under-
stood that they also had to participate in the new direction that IT
was taking. The sessions also served as a vehicle to demonstrate how
IT could become socially assimilated within all the functions of the
community while maintaining its own identity.
The meetings were also designed as a venue for group members to
be critical of themselves. The initial meetings were not successful in
this regard; at first, IT staff members spent more time blaming oth-
ers than reflecting on their own behaviors and attitudes. These ses-
sions were difficult in that I would have to raise unpopular questions
and ask whether the staff had truly “looked in the mirror” concerning
some of the problems at hand. For example, one IT employee found
it difficult to understand why a manager from another department
was angry about the time it took to get a problem resolved with his
computer. The problem had been identified and fixed within an hour,
a time frame that most IT professionals would consider very respon-
sive. As we looked into the reasons why the manager could have been
justified in his anger, it emerged that the manager had a tight deadline
to meet. In this situation, being without his computer for an hour was
a serious problem.
Although under normal circumstances a response time of one hour
is good, the IT employee had failed to ask about the manager’s par-
ticular circumstance. On reflection, the IT employee realized that
putting himself in the position of the people he was trying to support
would enable him to do his job better. In this particular instance, had
the IT employee only understood the position of the manager, there
were alternative ways of resolving the problem that could have been
implemented much more quickly.
Implementing the Integration: A Move toward Trust and Reflection
As communication became more open, a certain synergy began to
develop in the IT organization. Specifically, there was a palpable rise
in the level of cooperation and agreement, with regard to the over-
all goals set during these meetings. This is not to suggest that there
were no disagreements but rather that discussions tended to be more
constructive in helping the group realize its objective of providing
outstanding technology support to the organization. The IT staff
also felt freer to be self-reflective by openly discussing their ideas and
their mistakes. The involvement of the departmental line manag-
ers also gave IT staff members the support they needed to carry out
the change. Slowly, a shift in behavior developed in which the group
sharpened its focus on the transformation of the department, on the
acknowledgment of successes and failures, and on acquiring new
knowledge to advance the integration of IT into the core business units.
Around this time, an event presented itself that I felt would allow
the IT department to establish its new credibility and authority to
the other departments: the physical move of the organization to a
new location. The move was to be a major event, not only because
it represented the relocation of over 500 people and the technologi-
cal infrastructure they used on a day-to-day basis, but also because
the move was to include the transition of the media communications
systems of the company, to digital technology. The move required
tremendous technological work, and the organization decided to
perform a “technology acceleration,” meaning that new technology
would be introduced more quickly because of the opportunity pre-
sented by the move. The entire moving process was to take a year, and
I was immediately summoned to work with the other departments in
determining the best plan to accomplish the transition.
For me, the move became an emblematic event for the IT group at
Ravell. It would provide the means by which to test the creation of,
and the transitioning into, a learning organization. It was also to pro-
vide a catalyst for the complete integration and assimilation of IT into
the organization as a whole. The move represented the introduction
of unfamiliar processes in which “conscious reflection is … necessary
if lessons are to be learned” (Garvin, 2000, p. 100). I temporarily
reorganized IT employees into “SWAT” teams (subgroups formed
to deal with defined problems in high-pressure environments), so
that they could be fully immersed in the needs of their com-
munity partners. Dealing with many crisis situations helped the IT
department change the existing culture by showing users how to bet-
ter deal with technology issues in their everyday work environment.
Indeed, because of the importance of technology in the new location,
the core business had an opportunity to embrace our knowledge and
to learn from us.
The move presented new challenges every day, and demanded
openness and flexibility from everyone. Some problems required that
IT listen intently to understand and meet the needs of its commu-
nity partners. Other situations put IT in the role of teaching: assess-
ing needs and explaining to other departments what was technically
possible, and then helping them to work out compromises based on
technical limitations. Suggestions for IT improvement began to come
from all parts of the organization. Ideas from others were embraced
by IT, demonstrating that employees throughout the organization
were learning together. IT staff behaved assertively and without fear
of failure, suggesting that, perhaps for the first time, their role had
extended beyond that of fixing what was broken to one of helping
to guide the organization forward into the future. Indeed, the move
established the kind of “special problem” that provided an opportunity
for growth in personal awareness through reflection (Moon, 1999).
The move had proved an ideal laboratory for implementing the
IT integration and assimilation plan. It provided real and important
opportunities for IT to work hand in hand with other departments—
all focusing on shared goals. The move fostered tremendous cama-
raderie within the organization and became an excellent catalyst for
teaching reflective behavior. It was, if you will, an ideal project in
which to show how reflection in action can allow an entire organiza-
tion to share in the successful attainment of a common goal. Because
it was a unique event, everyone—IT and non-IT personnel alike—
made mistakes, but this time, there was virtually no finger-pointing.
People accepted responsibility collectively and cooperated in finding
solutions. When the company recommenced operations from its new
location—on time and according to schedule—no single group could
claim credit for the success; it was universally recognized that success
had been the result of an integrated effort.
Key Lessons
The experience of the reorganization of the IT department at Ravell
can teach us some key lessons with respect to the cultural transformation
of marginalized technical departments generally.
Defining Reflection and Learning for an Organization
IT personnel tend to view learning as a vocational event. They gener-
ally look to increase their own “technical” knowledge by attending
special training sessions and programs. However, as Kegan (1998)
reminds us, there must be more: “Training is really insufficient as a
sole diet of education—it is, in reality a subset of education.” True
education involves transformation, and transformation, according to
Kegan, is the willingness to take risks, to “get out of the bedroom of
our comfortable world.” In my work at Ravell, I tried to augment this
“diet” by embarking on a project that delivered both vocational train-
ing and education through reflection. Each IT staff person was given
one week of technical training per year to provide vocational develop-
ment. But beyond this, I instituted weekly learning sessions in which
IT personnel would meet without me and produce a weekly memo of
“reflection.” The goal of this practice was to promote dialogue, in the
hope that IT would develop a way to deal with its fears and mistakes
on its own. Without knowing it, I had begun the process of creating
a discursive community in which social interactions could act as insti-
gators of reflective behavior leading to change.
Working toward a Clear Goal
The presence of clearly defined, measurable, short-term objectives
can greatly accelerate the process of developing a “learning organiza-
tion” through reflective practice. At Ravell, the move into new physi-
cal quarters provided a common organizational goal toward which
all participants could work. This goal fostered cooperation among IT
and non-IT employees and provided an incentive for everyone to work
and, consequently, learn together. Like an athletic team before an
important game, or even an army before battle, the IT staff at Ravell
rallied around a cause and were able to use reflective practices to help
meet their goals. The move also represented what has been termed an
“eye-opening event,” one that can trigger a better understanding of a
culture whose differences challenge one’s presuppositions (Mezirow,
1990). It is important to note, though, that while the move accelerated
the development of the learning organization as such, the move itself
would not have been enough to guarantee the successes that followed
it. Simply setting a deadline is no substitute for undergoing the kind
of transformation necessary for a consummately reflective process.
Only as the culmination of a process of analysis, socialization, and
trust building, can an event like this speed the growth of a learning
organization.
Commitment to Quality
Apart from the social challenges it faced in merging into the core
business, the IT group also had problems with the quality of its out-
put. Often, work was not performed in a professional manner. IT
organizations often suffer from an inability to deliver on schedule,
and Ravell was no exception. The first step in addressing the qual-
ity problem was to develop IT’s awareness of the importance of the
problem, not only in my estimation but in that of the entire company.
The IT staff needed to understand how technology affected the day-
to-day operations of the entire company. One way to start the dia-
logue on quality is to first initiate one about failures. If something was
late, for instance, I asked why. Rather than addressing the problems
from a destructive perspective (Argyris & Schön, 1996; Schein, 1992;
Senge, 1990), the focus was on encouraging IT personnel to under-
stand the impact of their actions—or lack of action—on the company.
Through self-reflection and recognition of their important role in the
organization, the IT staff became more motivated than before to per-
form higher quality work.
Teaching Staff “Not to Know”
One of the most important factors that developed out of the process
of integrating IT was the willingness of the IT staff “not to know.”
The phenomenology of “not knowing” or “knowing less” became the
facilitator of listening; that is, by listening, we as individuals are better
able to reflect. This sense of not knowing also “allows the individual
to learn an important lesson: the acceptance of what is, without our
attempts to control, manipulate, or judge” (Halifax, 1999, p. 177). The
IT staff improved their learning abilities by suggesting and adopting
new solutions to problems. An example of this was the creation of a
two-shift help desk that provided user support during both day and
evening. The learning process allowed IT to contribute new ideas to
the community. More important, their contributions did not dramat-
ically change the community; instead, they created gradual adjust-
ments that led to the growth of a new hybrid culture. The key to
this new culture was its ability to share ideas, accept error as a reality
(Marsick, 1998), and admit to knowing less (Halifax, 1999).
Transformation of Culture
Cultural changes are often slow to develop, and they occur in small
intervals. Furthermore, small cultural changes may even go unnoticed
or may be attributed to factors other than their actual causes. This
raises the issue of the importance of cultural awareness and our ability
to measure individual and group performance. The history of the IT
problems at Ravell made it easy for me to make management aware of
what we were newly attempting to accomplish and of our reasons for
creating dialogues about our successes and failures. Measurement and
evaluation of IT performance are challenging because of the intrica-
cies involved in determining what represents success. I feel that one
form of measurement can be found in the behavioral patterns of an
organization. When it came time for employee evaluations, reviews
were held with each IT staff member. Discussions at evaluation
reviews focused on the individuals’ perceptions of their role, and how
they felt about their job as a whole. The feedback from these review
meetings suggested that the IT staff had become more devoted, and
more willing to reflect on their role in the organization, and, gen-
erally, seemed happier at their jobs than ever before. Interestingly,
and significantly, they also appeared to be having fun at their jobs.
This happiness propagated into the community and influenced other
supporting departments to create similar infrastructures that could
reproduce our type of successes. This interest was made evident by
frequent inquiries I received from other departments about how the
transformation of IT was accomplished, and how it might be trans-
lated to create similar changes in staff behavior elsewhere in the com-
pany. I also noticed that there were fewer complaints and a renewed
ability for the staff to work with our consultants.
Alignment with Administrative Departments
Ravell provided an excellent lesson about the penalties of not align-
ing properly with other strategic and operational partners in a firm.
Sometimes, we become insistent on forcing change, especially when
placed in positions that afford a manager power—the power to get
results quickly and through force. The example of Ravell teaches us
that an approach of power will not ultimately accomplish transforma-
tion of the organization. While senior management can authorize and
mandate change, change usually occurs much more slowly than they
wish, if it occurs at all. The management ranks can still push back
and cause problems, if not sooner, then later. While I aligned with
the line units, I failed to align with important operational partners,
particularly human resources (HR). HR in my mind at that time
was impeding my ability to accomplish change. I was frustrated and
determined to get things done by pushing my agenda. This approach
worked early on, but I later discovered that the HR management was
bitter and devoted to stopping my efforts. The problems I encountered
at Ravell are not unusual for IT organizations. The historical issues
that affect the relationship between HR and IT are as follows:
• IT has unusual staff roles and job descriptions that can be
inconsistent with the rest of the organization.
• IT tends to have complex working hours and needs.
• IT has unique career paths that do not “fit” with HR standards.
• IT salary structures shift more dynamically and are very sen-
sitive to market conditions.
• IT tends to operate in silos.
To overcome these impediments, then, IT must
• reduce silos and IT staff marginalization
• achieve better organization-wide alignment
• develop shared leadership
• define and create an HR/IT governance model
The success of IT/HR alignment should follow practices similar
to those I instituted with the line managers at Ravell, specifically the
following:
• Successful HR/IT integration requires organizational learn-
ing techniques.
• Alignment requires an understanding of the relationship
between IT investments and business strategy.
• An integration of IT can create new organizational cultures
and structures.
• HR/IT alignment will likely continue to be dynamic in
nature, and evolve at an accelerated pace.
The oversight of not integrating better with HR cost IT dearly at
Ravell. HR became an undisclosed enemy—that is, a negative force
against the entire integration. I discovered this problem only later, and
was never able to bring the HR department into the fold. Without
HR being part of the learning organization, IT staff continued to
struggle with aligning their professional positions with those of the
other departments. Fortunately, within two years the HR vice presi-
dent retired, which inevitably opened the doors for a new start.
In large IT organizations, it is not unusual to have an HR member
assigned to focus specifically on IT needs. Typically, it is a joint position
in which the HR individual in essence works for the IT executive. This
is an effective alternative in that the HR person becomes versed in IT
needs and can properly represent IT in the area of head count needs and
specific titles. Furthermore, the unique aspect of IT organizations is in
the hybrid nature of their staff. Typically, a number of IT staff members
are consultants, a situation that presents problems similar to the one I
encountered at Ravell—that is, the resentment of not really being part
of the organization. Another issue is that many IT staff members are
outsourced across the globe, a situation that brings its own set of chal-
lenges. In addition, the role of HR usually involves ensuring compliance
with various regulations. For example, in many organizations, a con-
sultant is permitted to work on site for only one year before U.S. gov-
ernment regulations force the company to hire them as employees. The
HR function must work closely with IT to enforce these regulations.
Yet another important component of IT and HR collaboration is talent
management. That is, HR must work closely with IT to understand new
roles and responsibilities as they develop in the organization. Another
challenge is the integration of technology into the day-to-day business
of a company, and the question of where IT talent should be dispersed
throughout the organization. Given this complex set of challenges, IT
alone cannot facilitate or properly represent itself unless it aligns with
the HR department. This becomes even more complex with the proliferation
of IT virtual teams across the globe, which create structures
that often have different HR ramifications, both legally and culturally.
Virtual team management is discussed further in the book.
Conclusion
This case study shows that strategic integration of technical resources
into core business units can be accomplished by using those aspects of
organizational learning that promote reflection in action. This kind of
integration also requires a concomitant form of assimilation on the
cultural level (see Chapter 3). Reflective thinking fosters the
development of a learning organization, which in turn allows for the
integration of the “other” in its various organizational manifestations.
The experience of this case study also shows that the success of organi-
zational learning will depend on the degree of cross fertilization achiev-
able in terms of individual values and on the ability of the community
to combine new concepts and beliefs, to form a hybrid culture. Such a
new culture prospers with the use of organizational learning strategies
to enable it to share ideas, accept mistakes, and learn to know less as a
regular part their discourse and practice in their day-to-day operations.
Another important conclusion from the Ravell experience is that
time is an important factor in the success of organizational learning
approaches. One way of dealing with the problem of time is with
patience—something that many organizations do not have. Another
element of success came in the acceleration of events (such as the relo-
cation at Ravell), which can foster a quicker learning cycle and help
us see results faster. Abandoning organizational learning methods out
of impatience, however, is not an acceptable approach, because it will
not render results that change individual and organizational behavior.
Indeed, I almost changed my approach when I did not get the results
I had hoped for early in the Ravell engagement. Nevertheless, my per-
sistence paid off. Finally, the belief that one should replace the staff, as
opposed to investing in its knowledge, results from a faulty generalization. I
found that most of the IT staff had much to contribute to the orga-
nization and, ultimately, to help transform the culture. Subsequent
chapters of this book build on the Ravell experience and discuss spe-
cific methods for integrating organizational learning and IT in ways
that can improve competitive advantage.
Another recent perception, which I discuss further in Chapter 4,
is the commitment to “complete” integration. Simply put, IT cannot
select which departments to work with, or choose to participate only
with line managers; as they say, it is “all or nothing at all.” Furthermore,
as Friedman (2007, p. 8) states, “The world is flat.” Certainly, part of
the “flattening” of the world has been initiated by technology, but it
has also created overwhelming challenges for seamless integration of
technology within all operations. The flattening of the world has cre-
ated yet another opportunity for IT to better integrate itself into what
is now an everyday challenge for all organizations.
2
The IT Dilemma
Introduction
We have seen much discussion in recent writing about how informa-
tion technology has become an increasingly significant component of
corporate business strategy and organizational structure (Bradley &
Nolan, 1998; Levine et al., 2000; Siebel, 1999). But, do we know
about the ways in which this significance takes shape? Specifically,
what are the perceptions and realities regarding the importance of
technology from organization leaders, business managers, and core
operations personnel? Furthermore, what forms of participation
should IT assume within the rest of the organization?
The isolation of IT professionals within their companies often pre-
vents them from becoming active participants in the organization.
Technology personnel have long been criticized for their inability to
function as part of the business and are often seen as a group falling
outside business cultural norms (Schein, 1992). They are frequently
stereotyped as “techies” and segregated into areas of the business
where they become marginalized and isolated from the rest of the
organization. It is my experience, based on case studies such as the
one reviewed in Chapter 1 (the Ravell Corporation), that if an orga-
nization wishes to absorb its IT department into its core culture, and
if it wishes to do so successfully, the company as a whole must be pre-
pared to consider structural changes and to seriously consider using
organizational learning approaches.
The assimilation of technical people into an organization presents
a special challenge in the development of true organizational learning
practices (developed more fully in Chapter 3). This challenge stems
from the historical separation of a special group that is seen as stand-
ing outside the everyday concerns of the business. IT is generally
acknowledged as having a key support function in the organization as
a whole. However, empirical studies have shown that it is a challenging
endeavor to successfully integrate IT personnel into the learning fold
and to do so in such a way that they not only are accepted, but also
understood to be an important part of the social and cultural struc-
ture of the business (Allen & Morton, 1994; Cassidy, 1998; Langer,
2007; Schein, 1992; Yourdon, 1998).
In his book In Over Our Heads, Kegan (1994) discusses the chal-
lenges of dealing with individual difference. IT personnel have been
consistently regarded as “different” fixtures, as outsiders who do not
fit easily into the mainstream organization. Perhaps because of their
technical practices, which may at times seem “foreign,” or because of
perceived differences in their values, IT personnel can become
marginalized, imagined as outside the core social structures of
business. As in any social structure, marginalization can result in
the withdrawal of the individual from the community (Schlossberg,
1989). As a result, many organizations are choosing to outsource their
IT services rather than confront and address the issues of cultural
absorption and organizational learning. The outsourcing alternative
tends to further distance the IT function from the core organiza-
tion, thus increasing the effects of marginalization. Not only does the
outsourcing of IT personnel separate them further from their peers,
but it also invariably robs the organization of a potentially important
contributor to the social growth and organizational learning of the
business. For example, technology personnel should be able to offer
insight into how technology can support further growth and learning
within the organization. In addition, IT personnel are usually trained
to take a logical approach to problem solving; as a result, they should
be able to offer a complementary focus on learning. Hence, the inte-
gration of IT staff members into the larger business culture can offer
significant benefits to an organization in terms of learning and orga-
nizational growth.
Some organizations have attempted to improve communications
between IT and non-IT personnel through the use of an intermedi-
ary who can communicate easily with both groups. This intermediary
is known in many organizations as the business analyst. Typically, the
business analyst will take responsibility for the interface between IT
and the larger business community. Although a business analyst may
help facilitate communication between IT and non-IT personnel,
this arrangement cannot help but carry the implication that different
“languages” are spoken by these two groups and, by extension, that
direct communication is not possible. Therefore, the use of such an
intermediary suffers the danger of failing to promote integration
between IT and the rest of the organization; in fact, it may serve to
keep the two camps separate. True integration, in the form of direct
contact between IT and non-IT personnel, represents a greater chal-
lenge for an organization than this remedy would suggest.
Recent Background
Since the 1990s, IT has been seen as a variable with great potential
to reinvent business. Aspects of this promise affected
many of the core business rules used by successful chief executives and
business managers. While organizations have used IT for the process-
ing of information, decision-support processing, and order processing,
the impact of the Internet and e-commerce systems has initiated
revolutionary responses in every business sector. This economic phe-
nomenon became especially self-evident with the formation of dot-coms
in the mid- and late 1990s. The advent of this phenomenon stressed
the need to challenge fundamental business concepts. Many financial
wizards surmised that new technologies were indeed changing the very
infrastructure of business, affecting how businesses would operate and
compete in the new millennium. Much of this hoopla seemed justified
by the extraordinary potential that technology offered, particularly with
respect to the revolutionizing of old-line marketing principles, for it
was technology that came to violate what was previously thought to be
protected market conditions and sectors. Technology came to reinvent
these business markets and to allow new competitors to cross market in
sectors they otherwise could not have entered.
With this new excitement also came fear— fear that fostered unnat-
ural and accelerated entry into technology because any delay might
sacrifice important new market opportunities. Violating some of their
traditional principles, many firms invested in creating new organi-
zations that would “incubate” and, eventually, capture large market
segments using the Internet as the delivery vehicle. By 2000, many of
these dot-coms were in trouble, and it became clear that their notion
of new business models based on the Internet contained significant
flaws and shortfalls. As a result of this crisis, the role and valuation
of IT are again going through a transformation, and once more we are
skeptical about the value IT can provide a business and about the way
to measure the contributions of IT.
IT in the Organizational Context
Technology not only plays a significant role in workplace operations,
but also continues to increase its relevance among other traditional
components of any business, such as operations, accounting, and
marketing (Earl, 1996b; Langer, 2001a; Schein, 1992). Given this
increasing relevance, IT gains significance in relation to
1. The impact it bears on organizational structure
2. The role it can assume in business strategy
3. The ways in which it can be evaluated
4. The extent to which chief executives feel the need to manage
operational knowledge and thus to manage IT effectively
IT and Organizational Structure
Sampler’s (1996) research explores the relationship between IT and
organizational structure. His study indicated that there is no clear-cut
relationship that has been established between the two. However, he
concluded that there are five principal positions that IT can take in
this relationship:
1. IT can lead to centralization of organizational control.
2. Conversely, IT can lead to decentralization of organizational
control.
3. IT can bear no impact on organizational control, its signifi-
cance being based on other factors.
4. Organizations and IT can interact in an unpredictable
manner.
5. IT can enable new organizational arrangements, such as net-
worked or virtual organizations.
According to Sampler (1996), the pursuit of explanatory models for
the relationship between IT and organizational structure continues
to be a challenge, especially since IT plays dual roles. On the one
hand, it enhances and constrains the capabilities of workers within
the organization; on the other, because of this, it also possesses the
ability to create a unique cultural component. While both roles are active,
their impact on the organization cannot be predicted; instead, they
evolve as unique social norms within the organization. Because IT
has changed so dramatically over the past decades, it continues to be
difficult to compare prior research on the relationship between IT and
organizational structure.
Earl (1996a) studied the effects of applying business process reen-
gineering (BPR) to organizations. BPR is a process that organizations
undertake to determine how best to use technology, to improve busi-
ness performance. Earl concludes that BPR is “an unfortunate title: it
does not reflect the complex nature of either the distinctive underpin-
ning concept of BPR [i.e., to reevaluate methods and rules of business
operations] or the essential practical challenges to make it happen
[i.e., the reality of how one goes about doing that]” (p. 54).
In my 2001 study of the Ravell Corporation (“Fixing Bad Habits,”
Langer, 2001b), I found that BPR efforts require buy-in from business
line managers, and that such efforts inevitably require the adaptation
by individuals of different cultural norms and practices.
Schein (1992) recognizes that IT culture represents a subculture in
collision with many others within an organization. He concludes that if
organizations are to be successful in using new technologies in a global
context, they must cope with ceaseless flows of information to ensure
organizational health and effectiveness. His research indicates that chief
executive officers (CEOs) have been reluctant to implement a new sys-
tem of technology unless their organizations felt comfortable with it and
were ready to use it. While many CEOs were aware of cost and effi-
ciency implications in using IT, few were aware of the potential impact
on organizational structure that could result from “adopting an IT view
of their organizations” (p. 293). Such results suggest that CEOs need
to be more active and more cognizant than they have been of potential
shifts in organizational structure when adopting IT opportunities.
The Role of IT in Business Strategy
While many chief executives recognize the importance of IT in
the day-to-day operations of their business, their experience with
attempting to utilize IT as a strategic business tool has been frustrat-
ing. Typical executive complaints about IT, according to Bensaou and
Earl (1998), fall into five problem areas:
1. A lack of correspondence between IT investments and busi-
ness strategy
2. Inadequate payoff from IT investments
3. The perception of too much “technology for technology’s
sake”
4. Poor relations between IT specialists and users
5. The creation of system designs that fail to incorporate users’
preferences and work habits
McFarlan created a strategic grid (as presented in Applegate et al.,
2003) designed to assess the impact of IT on operations and strategy.
The grid shows that IT has maximum value when it affects both oper-
ations and core business objectives. Based on McFarlan’s hypothesis,
Applegate et al. established five key questions about IT that may be
used by executives to guide strategic decision making:
1. Can IT be used to reengineer core value activities, and change
the basis of competition?
2. Can IT change the nature of the relationship, and the balance
of power, between buyers and sellers?
3. Can IT build or reduce barriers to entry?
4. Can IT increase or decrease switching costs?
5. Can IT add value to existing products and services, or create
new ones?
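
To make the grid concrete, the following is a minimal sketch of how an IT portfolio might be placed along the two dimensions the grid implies: impact on current operations and impact on business strategy. The quadrant labels (support, factory, turnaround, strategic) are the names commonly associated with McFarlan's grid rather than terms used in this chapter, and every initiative, score, and threshold in the example is hypothetical.

```python
# Illustrative sketch only: classify IT initiatives onto a two-by-two
# strategic grid (operational impact vs. strategic impact). The quadrant
# names are those commonly associated with McFarlan's grid; the example
# initiatives, scores, and threshold are hypothetical.

from dataclasses import dataclass


@dataclass
class Initiative:
    name: str
    operational_impact: float  # 0.0 (low) to 1.0 (high)
    strategic_impact: float    # 0.0 (low) to 1.0 (high)


def grid_quadrant(item: Initiative, threshold: float = 0.5) -> str:
    """Return the grid quadrant for an initiative."""
    high_ops = item.operational_impact >= threshold
    high_strategy = item.strategic_impact >= threshold
    if high_ops and high_strategy:
        return "strategic"   # affects both operations and core objectives
    if high_ops:
        return "factory"     # keeps the business running day to day
    if high_strategy:
        return "turnaround"  # strategically promising, not yet operationally critical
    return "support"         # useful, but neither dimension is critical


portfolio = [
    Initiative("payroll system upgrade", 0.8, 0.2),
    Initiative("e-commerce platform", 0.7, 0.9),
    Initiative("desktop refresh", 0.3, 0.1),
]

for item in portfolio:
    print(f"{item.name}: {grid_quadrant(item)}")
```

An initiative that scores high on both dimensions lands in the quadrant where, as noted above, IT has maximum value because it affects both operations and core business objectives.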
The research and analysis conducted by McFarlan and Applegate,
respectively, suggest that when operational strategy and its results
are maximized, IT is given its highest valuation as a tool that can
transform the organization. It then receives the maximum focus
from senior management and board members. However, Applegate
et al. (2003) also focus on the risks of using technology. These risks
increase when executives have a poor understanding of competitive
dynamics, when they fail to understand the long-term implications
of a strategic system that they have launched, or when they fail to
account for the time, effort, and cost required to ensure user adop-
tion, assimilation, and effective utilization. Applegate’s conclusion
underscores the need for IT management to educate senior man-
agement, so that the latter will understand the appropriate indi-
cators for what can maximize or minimize their investments in
technology.
Szulanski and Amin (2000) claim that while emerging technologies
shrink the window in which any given strategy can be implemented,
if the strategy is well thought out, it can remain viable. Mintzberg’s
(1987) research suggests that it would be useful to think of strategy as
an art, not a science. This perspective is especially true in situations
of uncertainty. The rapidly changing pace of emerging technologies,
we know, puts a strain on established approaches to strategy— that is
to say, it becomes increasingly difficult to implement technological
strategies comfortably in such fast-moving environments, which require
sophisticated organizational infrastructure and capabilities.
Ways of Evaluating IT
Firms have been challenged to find a way to best evaluate IT,
particularly using traditional return on investment (ROI) approaches.
Unfortunately, in this regard, many components of IT do not generate
direct returns. Cost allocations based on overhead formulas (e.g., costs
of IT as a percentage of revenues) are not applicable to most IT spend-
ing needs. Lucas (1999) established nonmonetary methods for evalu-
ating IT. His concept of conversion effectiveness places value on the
ability of IT to complete its projects on time and within its budgets.
This alone is a sufficient factor for providing ROI, assuming that the
project was approved for valid business reasons. He called this overall
process for evaluation the “garbage can” model. It allows organizations
to present IT needs through a funneling pipeline of conversion effec-
tiveness that filters out poor technology plans and that can determine
which projects will render direct and indirect benefits to the organiza-
tion. Indirect returns, according to Lucas, are those that do not pro-
vide directly measurable monetary returns but do provide significant
value that can be measured using his IT investment opportunities
matrix. Utilizing statistical probabilities of returns, the opportunities
matrix provides an effective tool for evaluating the impact of indirect
returns.
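
The mechanics of Lucas's matrix are not reproduced in this chapter, so the sketch below is only a generic expected-value calculation, with hypothetical benefit categories, dollar figures, probabilities, and project cost, meant to show the kind of probability-weighted arithmetic that valuing indirect returns implies.

```python
# Illustrative sketch only: a probability-weighted valuation of indirect
# returns, in the spirit of an "opportunities matrix." The benefit
# categories, dollar values, probabilities, and project cost below are
# hypothetical, not taken from Lucas's model or from this chapter.

indirect_benefits = [
    # (description, estimated annual value, probability of realization)
    ("faster customer response", 120_000, 0.6),
    ("reduced rework from better data", 80_000, 0.5),
    ("improved staff retention", 40_000, 0.3),
]

project_cost = 150_000

expected_indirect_value = sum(value * prob for _, value, prob in indirect_benefits)

print(f"Expected annual indirect value: ${expected_indirect_value:,.0f}")
print(f"Project cost:                   ${project_cost:,.0f}")
print(f"Expected first-year coverage:   {expected_indirect_value / project_cost:.2f}x")
```

The point of such a calculation is that indirect benefits can still be compared against project cost once each is weighted by its estimated probability of being realized, even when no directly measurable monetary return exists.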
Executive Knowledge and Management of IT
While much literature and research have been produced on how IT
needs to participate in and bring value to an organization, there has
been relatively little analysis conducted on what non-IT chief execu-
tives need to know about technology. Applegate et al. (2003) suggest
that non-IT executives need to understand how to differentiate new
technologies from older ones, and how to gauge the expected impact
of these technologies on the businesses, in which the firm competes
for market share. This is to say that technology can change the rela-
tionship between customer and vendor, and thus, should be examined
as a potential for providing competitive advantage. The authors state
that non-IT business executives must become more comfortable with
technology by actively participating in technology decisions rather than
delegating them to others. They need to question experts as they would
in the financial areas of their businesses. Lou Gerstner, former CEO
of IBM, is a good example of a non-IT chief executive who acquired
sufficient knowledge and understanding of a technology firm. He was
then able to form a team of executives who better understood how to
develop the products, services, and overall business strategy of the firm.
Allen and Percival (2000) also investigate the importance of non-
IT executive knowledge and participation with IT: “If the firm lacks
the necessary vision, insights, skills, or core competencies, it may be
unwise to invest in the hottest [IT] growth market” (p. 295). The
authors point out that success in using emerging technologies is dif-
ferent from success in other traditional areas of business. They con-
cluded that non-IT managers need to carefully consider expected
synergies to determine whether an IT investment can be realized and,
especially, whether it can earn its cost of capital.
Recent studies have focused on four important components in the
linking of technology and business: its relationship to organizational
structure, its role in business strategy, the means of its evaluation, and
the extent of non-IT executive knowledge in technology. The chal-
lenge in determining the best organizational structure for IT is posed
by the accelerating technological advances since the 1970s and by the
difficulty in comparing organizational models to consistent business
cases. Consequently, there is no single organizational structure that
has been adopted by businesses.
While most chief executives understand the importance of using
technology as part of their business strategy, they express frustra-
tion in determining how to effectively implement a technology-based
strategic approach. This frustration results from difficulties in under-
standing how IT investments relate to other strategic business issues,
from difficulty in assessing payoff and performance of IT generally
and from perceived poor relations between IT and other departments.
Because most IT projects do not render direct monetary returns, exec-
utives find themselves challenged to understand technology investments.
They have difficulty measuring value since traditional ROI formulas are
not applicable. Thus, executives would do better to focus on valuing tech-
nology investments by using methods that can determine payback based
on a matrix of indirect returns, which do not always include monetary
sources. There is a lack of research on the question of what general knowl-
edge non-IT executives need to have to effectively manage the strategic
use of technology within their firms. Non-IT chief executives are often
not engaged in day-to-day IT activities, and they often delegate dealing
with strategic technology issues to other managers. The remainder of this
chapter examines the issues raised by the IT dilemma in its various guises
especially as they become relevant to, and are confronted from, the top
management or chief executive point of view.
IT: A View from the Top
To investigate further the critical issues facing IT, I conducted a study
in which I personally interviewed over 40 chief executives in vari-
ous industries, including finance/investment, publishing, insurance,
wholesale/retail, and hotel management. Executives interviewed
were either the CEO or president of their respective corporations. I
canvassed a population of New York-based midsize corporations for
this interview study. Midsize firms, in our case, comprise businesses
of between 200 and 500 employees. Face-to-face interviews were
conducted to allow participants the opportunity to articulate their
responses, in contrast to answering printed survey questions; executives
were therefore able to expand and clarify their responses to
questions. An interview guide (see questions in Tables 2.1 through
2.3) was designed to raise issues relevant to the challenges of using
technology, as reported in the recent research literature, and to
consider significant phenomena that could affect changes in the uses
of technology, such as the Internet. The interview discussions focused
on three sections: (1) chief executive perception of the role of IT, (2)
management and strategic issues, and (3) measuring IT performance
and activities. The results of the interviews are summarized next.
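
As a minimal sketch of how interview answers of this kind are typically turned into the percentages reported in Tables 2.1 through 2.3, the following code tallies hypothetical coded responses to a single question; the codes and counts are illustrative only and are not the study's data.

```python
# Illustrative sketch only: tallying coded interview answers into the
# percentage form used in Tables 2.1 through 2.3. The response codes and
# counts are hypothetical (chosen to roughly mirror the proportions shown
# for the first question in Table 2.1), not the study's actual data.

from collections import Counter

# One coded answer per executive for a single interview question.
coded_answers = ["reactive"] * 12 + ["market driven"] * 6 + ["other"] * 3

counts = Counter(coded_answers)
total = len(coded_answers)

for code, n in counts.most_common():
    print(f"{code}: {n}/{total} ({100 * n / total:.0f}%)")
```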
Table 2.1 Perception and Role of IT

1. How do you define the role and the mission of IT in your firm?
Analysis: Fifty-seven percent responded that their IT organizations were reactive and did not really have a mission. Twenty-eight percent had an IT mission that was market driven; that is, their IT departments were responsible for actively participating in marketing and strategic processes.

2. What impact has the Internet had on your business strategy?
Analysis: Twenty-eight percent felt the impact was insignificant, while 24% felt it was critical. The remaining 48% felt that the impact of the Internet was significant to daily transactions.

3. Does the firm have its own internal software development activity? Do you develop your own in-house software or use software packages?
Analysis: Seventy-six percent had an internal development organization. Eighty-one percent had internally developed software.

4. What is your opinion of outsourcing? Do you have the need to outsource technology? If so, how is this accomplished?
Analysis: Sixty-two percent had outsourced certain aspects of their technology needs.

5. Do you use consultants to help formulate the role of IT? If yes, what specific roles do they play? If not, why?
Analysis: Sixty-two percent of the participants used consultants to assist them in formulating the role of IT.

6. Do you feel that IT will become more important to the strategy of the business? If yes, why?
Analysis: Eighty-five percent felt that IT had recently become more important to the strategic planning of the business.

7. How is the IT department viewed by other departments? Is the IT department liked, or is it marginalized?
Analysis: Twenty-nine percent felt that IT was still marginalized. Another 29% felt it was not very integrated. Thirty-eight percent felt IT was sufficiently integrated within the organization, but only one chief executive felt that IT was very integrated with the culture of his firm.

8. Do you feel there is too much “hype” about the importance and role of technology?
Analysis: Fifty-three percent felt that there was no hype. However, 32% felt that there were levels of hype attributed to the role of technology; 10% felt it was “all hype.”

9. Have the role and the uses of technology in the firm significantly changed over the last 5 years? If so, what are the salient changes?
Analysis: Fourteen percent felt little had changed, whereas 43% stated that there were moderate changes. Thirty-eight percent stated there was significant change.
Table 2.2 Management and Strategic Issues

1. What is the most senior title held by someone in IT? Where does this person rank on the organization hierarchy?
Analysis: Sixty-six percent called the highest position chief information officer (CIO). Ten percent used managing director, while 24% used director as the highest title.

2. Does IT management ultimately report to you?
Analysis: Fifty percent of IT leaders reported directly to the chief executive (CEO). The other half reported to either the chief financial officer (CFO) or the chief operating officer (COO).

3. How active are you in working with IT issues?
Analysis: Fifty-seven percent stated that they are very active—on a weekly basis. Thirty-eight percent were less active or inconsistently involved, usually stepping in when an issue becomes problematic.

4. Do you discuss IT strategy with your peers from other firms?
Analysis: Eighty-one percent did not communicate with peers at all. Only 10% actively engaged in peer-to-peer communication about IT strategy.

5. Do IT issues get raised at board, marketing, and/or strategy meetings?
Analysis: Eighty-six percent confirmed that IT issues were regularly discussed at board meetings. However, only 57% acknowledged IT discussion during marketing meetings, and only 38% confirmed like discussions at strategic sessions.

6. How critical is IT to the day-to-day business?
Analysis: Eighty-two percent of the chief executives felt it was very significant or critical to the business.
Table 2.3 Measuring IT Performance and Activities

1. Do you have any view of how IT should be measured and accounted for?
Analysis: Sixty-two percent stated that they had a view on measurement; however, there was significant variation in how executives defined measurement.

2. Are you satisfied with IT performance in the firm?
Analysis: There was significant variation in IT satisfaction. Only 19% were very satisfied. Thirty-three percent were satisfied, another 33% were less satisfied, and 14% were dissatisfied.

3. How do you budget IT costs? Is it based on a percentage of gross revenues?
Analysis: Fifty-seven percent stated that they did not use gross revenues in their budgeting methodologies.

4. To what extent do you perceive technology as a means of increasing marketing or productivity or both?
Analysis: Seventy-one percent felt that technology was a significant means of increasing both marketing and productivity in their firms.

5. Are Internet/Web marketing activities part of the IT function?
Analysis: Only 24% stated that Internet/Web marketing efforts reported directly to the IT organization.
Section 1: Chief Executive Perception of the Role of IT
This section of the interview focuses on chief executive perceptions of
the role of IT within the firm. For the first question, about the role
and mission of IT, over half of the interviewees responded in ways
that suggested their IT organizations were reactive, without a strate-
gic mission. One executive admitted, “IT is not really defined. I guess
its mission is to meet our strategic goals and increase profitability.”
Another response betrays a narrowly construed understanding of its
potential: “The mission is that things must work— zero tolerance for
failure.” These two responses typify the vague and generalized percep-
tion that IT “has no explicit mission” except to advance the important
overall mission of the business itself. A little over a quarter of respon-
dents could confirm a market-driven role for IT; that is, actively par-
ticipating in marketing and strategic processes. Question 2, regarding
the impact of the Internet on business strategy, drew mixed responses.
Some of these revealed the deeply reflective challenges posed by the
Internet: “I feel the Internet forces us to take a longer-term view and a
sharper focus to our business.” Others emphasized its transformative
potential: “The Internet is key to decentralization of our offices and
business strategy.”
Questions 3 and 4 focused on the extent to which firms have their own
software development staffs, whether they use internally developed or
packaged software, and whether they outsource IT services. Control over
internal development of systems and applications remained important to
the majority of chief executives: “I do not like outsourcing— surrender
control, and it’s hard to bring back.” Almost two-thirds of the partici-
pants employed consultants to assist them in formulating the role of IT
within their firms but not always without reservation: “Whenever we
have a significant design issue we bring in consultants to help us— but
not to do actual development work.” Only a few were downright skepti-
cal: “I try to avoid consultants—what is their motivation?” The use
of outsourcing is still low in midsize firms, as compared with the recent
increase in IT outsourcing abroad. The lower use could be related to the
initial costs and management overhead required to properly
implement outsourced operations in foreign countries.
A great majority of chief executives recognized some form of the
strategic importance of IT to business planning: “More of our business
is related to technology and therefore I believe IT is more important
to strategic planning.” Still, this sense of importance remained some-
what intuitive: “I cannot quantify how IT will become more strategic
to the business planning— but I sense that job functions will be dra-
matically altered.” In terms of how IT is viewed by other departments
within the firm, responses were varied. A little over a third of respon-
dents felt IT was reasonably integrated within the organization: “The
IT department is vitally important— but rarely noticed.” The major-
ity of respondents, however, recognized a need for greater integra-
tion: “IT was marginalized— but it is changing. While IT drives the
system— it needs to drive more of the business.” Some articulated
clearly the perceived problems: “IT needs to be more proactive— they
do not seem to have good interpersonal skills and do not understand
corporate politics.” A few expressed a sense of misgiving (“IT people
are strange— personality is an issue”) and even a sense of hopeless-
ness: “People hate IT— particularly over the sensitivity of the data. IT
sometimes is viewed as misfits and incompetent.”
Question 8 asked participants whether they felt there was too
much “hype” attributed to the importance of technology in business.
Over half responded in the negative, although not without reserva-
tion: “I do not think there is too much hype— but I am disappointed.
I had hoped that technology at this point would have reduced paper,
decreased cost— it just has not happened.” Others felt that there is
indeed some degree of sensationalism: “I definitely think there is too
much hype— everyone wants the latest and greatest.” Hype in many
cases can be related to a function of evaluation, as in this exclama-
tion: “The hype with IT relates more to when will we actually see
the value!” The last question in this section asked whether the uses of
technology within the firm had significantly changed over the last
five years. A majority agreed that it had: “The role of IT has changed
significantly in the last five years—we need to stay up-to-date because
we want to carry the image that we are ‘on the ball’.” Many of these
stressed the importance of informational flows: “I find the ‘I’ [infor-
mation] part to be more and more important and the ‘T’ [technol-
ogy] to be diminishing in importance.” Some actively downplayed the
significance: “I believe in minimizing the amount of technology we
use—people get carried away.”
Section 2: Management and Strategic Issues
This section focuses on questions pertaining to executive and man-
agement organizational concerns. The first and second questions
asked executives about the most senior title held by an IT officer
and about the reporting structure for IT. Two-thirds of the par-
ticipants ranked their top IT officer as a chief information officer
(CIO). In terms of organizational hierarchy, half of the IT leaders
were at the second tier, reporting directly to the CEO or presi-
dent, while the other half were at the third tier, reporting either
to the chief financial officer (CFO) or to the chief operating offi-
cer (COO). As one CEO stated, “Most of my activity with IT is
through the COO. We have a monthly meeting, and IT is always
on the agenda.”
The third question asked executives to consider their level of
involvement with IT matters. Over half claimed a highly active rela-
tionship, engaging on a weekly basis: “I like to have IT people close
and in one-on-one interactions. It is not good to have artificial barri-
ers.” For some, levels of involvement may be limited: “I am active with
IT issues in the sense of setting goals.” A third of participants claimed
less activity, usually becoming active when difficulties arose. Question
four asked whether executives spoke to their peers at other firms about
technology issues. A large majority did not take advantage of this
potential channel of communication with their peers; only about one
in 10 actively pursued it.
Question 5 asked about the extent to which IT issues were
discussed at board meetings, marketing meetings, and business
strategy sessions. Here, a great majority confirmed that there was
regular discussion regarding IT concerns, especially at board meet-
ings. A smaller majority attested to IT discussions during market-
ing meetings. Over a third reported that IT issues maintained a
presence at strategic sessions. The higher incidence at board meet-
ings may still be attributable to the effects of Year 2000 (Y2K)
preparations. The final question in this section concerned the level
of criticality for IT in the day-to-day operations of the business. A
high majority of executives responded affirmatively in this regard:
“IT is critical to our survival, and its impact on economies of scale
is significant.”
Section 3: Measuring IT Performance and Activities
This section is concerned with how chief executives measured IT per-
formance and activities within their firms. The first question of this
section asked whether executives had a view about how IT performance
should be measured. Almost two-thirds affirmed having some formal
or informal way of measuring performance: “We have no formal pro-
cess of measuring IT other than predefined goals, cost constraints, and
deadlines.” Their responses demonstrated great variation, sometimes
leaning on cynicism: “I measure IT by the number of complaints I
get.” Many were still grappling with this challenge: “Measuring IT is
unqualified at this time. I have learned that hours worked is not the way
to measure IT— it needs to be more goal- oriented.” Most chief execu-
tives expressed some degree of quandary: “We do not feel we know
enough about how IT should be measured.” Question two asked execu-
tives to rate their satisfaction with IT performance. Here, also, there
was significant variation. A little more than half expressed some degree
of satisfaction: “Since 9/11 IT has gained a lot of credibility because of
the support that was needed during a difficult time.” Slightly fewer than
half revealed a degree of dissatisfaction: “We had to overhaul our IT
department to make it more customer-service oriented.”
Question three concerned budgeting; that is, whether or not chief
executives budgeted IT costs as a percentage of gross revenues. Over
half denied using gross revenues in their budgeting method: “When
handling IT projects we look at it on a request-by-request basis.”
The last two questions asked chief executives to assess the impact of
technology on marketing and productivity. Almost three quarters of
the participants felt that technology represented a significant means of
enhancing both marketing and productivity. Some maintained a cer-
tainty of objective: “We try to get IT closer to the customer— having
them understand the business better.” Still, many had a less-defined
sense of direction: “I have a fear of being left behind, so I do think IT
will become more important to the business.” And others remained
caught in uncertainty: “I do not fully understand how to use technol-
ogy in marketing— but I believe it’s there.” Chief executive certainty,
in this matter, also found expression in the opposite direction: “IT
will become less important— it will be assumed as a capability and a
service that companies provide to their customers.” Of the Internet/
Web marketing initiatives, only one quarter reported directly
to the IT organization: “IT does not drive the Web activities because
they do not understand the business.” Often, Web development and IT
were seen as separate or even competing technology functions: “Having Web develop-
ment report to IT would hinder the Internet business’s growth poten-
tial.” Yet, some might be willing to explore a synergistic potential:
“We are still in the early stages of understanding how the Internet
relates to our business strategy and how it will affect our product line.”
General Results
Section 1 revealed that the matter of defining a mission for the IT
organization remains as unresolved as finding a way to reckon with the
potential impact of IT on business strategy. Executives still seemed to
be at a loss on the question of how to integrate IT into the workplace— a
human resource as well as a strategic issue. There was uncertainty regard-
ing the dependability of the technology information received. Most
agreed, however, in their need for software development departments to
support their internally developed software, in their need to outsource
certain parts of technology, and in their use of outside consultants to
help them formulate the future activities of their IT departments.
Section 2 showed that while the amount of time that executives spent
on IT issues varied, there was a positive correlation between a structure in
which IT managers reported directly to the chief executive and the degree
of activity that executives stated they had with IT matters. Section 3
showed that chief executives understood the potential value that technol-
ogy can bring to the marketing and productivity of their firms. They did
not believe, however, that technology can go unmeasured; there needs
to be some rationale for allotting a spending figure in the budget. For
most of the firms in this study, the use of the Internet as a technological
vehicle for future business was not determined by IT. This suggests that
IT does not manage the marketing aspects of technology, and that it has
not achieved significant integration in strategic planning.
Defining the IT Dilemma
The variations found in this study in terms of where IT reports, how
it is measured, and how its mission is defined were consistent with
existing research. But, the wide-ranging inconsistencies and uncer-
tainties among executives described here left many of them wonder-
ing whether they should be using IT as part of their business strategy
and operations. While this quandary does not in itself suggest an
inadequacy, it does point to an absence of a “best practices” guideline
for using technology strategically. Hence, most businesses lacked a
clear plan on how to evolve IT contributions toward business develop-
ment. Although a majority of respondents felt that IT was critical to
the survival of their businesses, the degree of IT assimilation within
the core culture of organizations still varied. This suggests that the
effects of cultural assimilation lag behind the actual involvement of
IT in the strategic direction of the company.
While Sampler (1996) attributes many operational inconsistencies to
the changing landscape of technology, the findings of this study suggest
that there is also a lack of professional procedures, rules, and established
governance that could support the creation of best practices for the
profession. Bensaou and Earl (1998), on the one hand, have addressed
this concern by taking a pro-Japanese perspective in extrapolating from
five “Western” problems five “general” principles, presumably not cul-
ture bound, and thence a set of “best principles” for managing IT. But,
Earl et al. (1995), on the other hand, have sidestepped any attempt to
incorporate Earl’s own inductive approach discussed here; instead, they
favor a market management approach, based on a supply-and-demand
model to “balance” IT management. Of course, best practices already
embody the implicit notion of best principles; however, the problems
confronting executives— the need for practical guidelines— remain. For
instance, this study shows that IT performance is measured in many
different ways. It is this type of practical inconsistency that leaves chief
executives with the difficult challenge of understanding how technol-
ogy decisions can be managed.
On a follow-up call related to this study, for example, a CEO
informed me of a practical yet significant difference she had instituted
since our interview. She stated:
The change in reporting has allowed IT to become part of the main-
stream vision of the business. It now is a fundamental component of all
discussions with human resources, sales and marketing, and accounting.
The change in reporting has allowed for the creation of a critical system,
which has generated significant direct revenues for the business. I attri-
bute this to my decision to move the reporting of technology directly
to me and to my active participation in the uses of technology in our
business.
This is an example of an executive whom Schein (1994) would
call a “change agent”— someone who employs “cognitive redefinition
through scanning,” in this case to elicit the strategic potential of IT.
We might also call this activity reflective thinking (Langer, 2001b).
Schein’s change agents, however, go on to “acknowledge that future
generations of CEOs will have been educated much more thoroughly
in the possibilities of the computer and IT, thus enabling them to take
a hands-on adopter stance” (p. 343). This insight implies a distanc-
ing (“future”) of present learning responsibilities among current chief
executives. The nearer future of this insight may instead be seen in
the development of organizational learning.* These are two areas of
contemporary research that begin to offer useful models in the pursuit
of a best practices approach to the understanding and managing of IT.
If the focus of this latter study was geared toward the evaluation of
IT based on the view of the chief executive, it was, indeed, because
their views necessarily shape the very direction for the organizations
that they manage. Subsequent chapters of this book examine how
the various dilemmas surrounding IT that I have discussed here are
affecting organizations and how organizational learning practices can
help answer many of the issues of today as raised by executives, man-
agers, and operations personnel.
Recent Developments in Operational Excellence
The decline in financial markets in 2009 and the continued increase
in mergers and acquisitions due to global competition have created an
interesting opportunity for IT that reinforces the need for integration
via organizational learning. During difficult economic periods, IT
has traditionally been viewed as a cost center and had its operations
* My case study “Fixing Bad Habits” (Langer, 2001b) has shown that integrating
the practices of reflective thinking, to support the development of organizational
learning, has greatly enhanced the adaptation of new technologies, their strategic
valuation to the firm, and their assimilation into the social norms of the business.
reduced (I discuss this further in Chapter 3, in which I introduce
the concept of drivers and supporters). However, with the growth in
the role of technology, IT management has now been asked to help
improve efficiency through the use of technology across departments.
That is, IT is emerging as an agent for business transformation in a
much stronger capacity than ever before. This phenomenon has placed
tremendous pressure on the technology executive to align with his or
her fellow executives in other departments and to get them to partici-
pate in cost reductions by implementing more technology. Naturally,
using technology to facilitate cuts to the workforce is often unpopular,
and there has been much bitter fallout from such cross-department
reductions. Technology executives thus face the challenge of position-
ing themselves as the agents of a necessary change. However, opera-
tional excellence is broader than just cutting costs and changing the
way things operate; it is about doing things efficiently and with qual-
ity measures across corporate operations. Now that technology affects
every aspect of operations, it makes sense to charge technology execu-
tives with a major responsibility to get it accomplished.
The assimilation of technology as a core part of the entire orga-
nization is now paramount for survival, and the technology execu-
tive of today and certainly tomorrow will be one who understands
that operational excellence through efficiency must be accomplished
by educating business units in self-managing the process. The IT
executive, then, supports the activity as a leader, not as a cost cut-
ter who invades the business. The two approaches are very different,
and adopting the former can result in significant long-term results in
strategic alignment.
My interviews with CEOs supported this notion: The CEO does
not want to be the negotiator; change must be evolutionary within the
business units themselves. While taking this kind of role in organiza-
tional change presents a new dilemma for IT, it can also be an oppor-
tunity for IT to position itself successfully within the organization.
3
Technology as a Variable and Responsive Organizational Dynamism
Introduction
This chapter focuses on defining the components of technology and
how they affect corporate organizations. In other words, if we step
back momentarily from the specific challenges that information tech-
nology (IT) poses, we might ask the following: What are the generic
aspects of technology that have made it an integral part of strategic and
competitive advantage for many organizations? How do organizations
respond to these generic aspects as catalysts of change? Furthermore,
how do we objectively view the role of technology in this context, and
how should organizations adjust to its short- and long-term impacts?
Technological Dynamism
To begin, technology can be regarded as a variable, independent
of others, that contributes to the life of a business operation. It is
capable of producing an overall, totalizing, yet distinctive, effect on
organizations— it has the unique capacity to create accelerations of
corporate events in an unpredictable way. Technology, in its aspect of
unpredictability, is necessarily a variable, and in its capacity as accel-
erator— its tendency to produce change or advance— it is dynamic.
My contention is that, as a dynamic kind of variable, technology, via
responsive handling or management, can be tapped to play a special
role in organizational development. It can be pressed into service as
the dynamic catalyst that helps bring organizations to maturity in
dealing not only with new technological quandaries, but also with
other agents of change. Change generates new knowledge, which in
turn requires a structure of learning that should, if managed properly,
result in transformative behavior, supporting the continued evolution
of organizational culture. Specifically, technology speeds up events,
such as the expectation of getting a response to an e-mail, and requires
organizations to respond to them in ever-quickening time frames.
Such events are not as predictable as those experienced by individuals
in organizations prior to the advent of new technologies— particu-
larly with the meteoric advance of the Internet. In viewing technology
then as a dynamic variable, and one that requires systemic and cul-
tural organizational change, we may regard it as an inherent, internal
driving force— a form of technological dynamism.
Dynamism is defined as a process or mechanism responsible for the
development or motion of a system. Technological dynamism charac-
terizes the unpredictable and accelerated ways in which technology,
specifically, can change strategic planning and organizational behav-
ior/culture. This change is based on the acceleration of events and
interactions within organizations, which in turn create the need to
better empower individuals and departments. Another way of under-
standing technological dynamism is to think of it as an internal drive
recognized by the symptoms it produces. The new events and interac-
tions brought about by technology are symptoms of the dynamism
that technology manifests. The next section discusses how organiza-
tions can begin to make this inherent dynamism work in their favor
on different levels.
Responsive Organizational Dynamism
The technological dynamism at work in organizations has the power
to disrupt any antecedent sense of comfortable equilibrium or an
unwelcome sense of stasis. It also upsets the balance among the vari-
ous factors and relationships that pertain to the question of how we
might integrate new technologies into the business— a question of
what we will call strategic integration— and how we assimilate the cul-
tural changes they bring about organizationally— a question of what
we call cultural assimilation. Managing the dynamism, therefore, is a
way of managing the effects of technology. I propose that these orga-
nizational ripples, these precipitous events and interactions, can be
addressed in specific ways at the organizational management level.
The set of integrative responses to the challenges raised by technology
is what I am calling responsive organizational dynamism, which will
also receive further explication in the next few chapters. For now, we
need to elaborate the two distinct categories that present themselves
in response to technological dynamism: strategic integration and cul-
tural assimilation. Figure 3.1 diagrams the relationships: technology,
acting as an independent variable, creates organizational dynamism (an
acceleration of events that requires different infrastructures and
organizational processes), which in turn requires the responses of
strategic integration and cultural assimilation, with their attendant
symptoms and implications.
Strategic Integration
Strategic integration is a process that addresses the business- strategic
impact of technology on organizational processes. That is, the
business-strategic impact of technology requires immediate orga-
nizational responses and in some instances zero latency. Strategic
integration recognizes the need to scale resources across traditional
business–geographic boundaries, to redefine the value chain in the
life cycle of a product or service line, and generally to foster more
agile business processes (Murphy, 2002). Strategic integration, then,
is a way to address the changing requirements of business processes
caused by the sharp increases in uses of technology. Evolving tech-
nologies have become catalysts for competitive initiatives that create
new and different ways to determine successful business investment.
Thus, there is a dynamic business variable that drives the need for
technology infrastructures capable of greater flexibility and of exhib-
iting greater integration with all business operations.
Historically, organizational experiences with IT investment have
resulted in two phases of measured returns. The first phase often
shows negative or declining productivity as a result of the investment;
in the second phase, we often see a lagging, though eventual, return
to productivity. The lack of returns in the first phase has been
attributed to the nature of the early stages of technology exploration
and experimentation, which tend to slow the process of organizational
adaptation to technology. The production phase then lags behind
the ability of the organization to integrate new technologies with
its existing processes. Another complication posed by technological
dynamism via the process of strategic integration is a phenomenon we
can call factors of multiplicity — essentially, what happens when several
new technology opportunities overlap and create myriad projects that
are in various phases of their developmental life cycle. Furthermore,
the problem is compounded by lagging returns in productivity, which
are complicated to track and to represent to management. Thus, it is
important that organizations find ways to shorten the period between
investment and technology’s effective deployment. Murphy (2002)
identifies several factors that are critical to bridging this delta:
1. Identifying the processes that can provide acceptable business
returns from new technological investments
2. Establishing methodologies that can determine these processes
3. Finding ways to actually perform and realize expected benefits
4. Integrating IT projects with other projects
5. Adjusting project objectives when changes in the business
require them
Technology complicates these actions, making them more difficult
to resolve; hence the need to manage the complications. To tackle
these compounded concerns, strategic integration can shorten life
cycle maturation by focusing on the following integrating factors:
• Addressing the weaknesses in management organizations in
terms of how to deal with new technologies, and how to bet-
ter realize business benefits
• Providing a mechanism that both enables organizations to
deal with accelerated change caused by technological innova-
tions and integrates them into a new cycle of processing and
handling change
• Providing a strategic learning framework by which every new
technology variable adds to organizational knowledge, par-
ticularly using reflective practices (see Chapter 4)
• Establishing an integrated approach that ties technology
accountability to other measurable outcomes using organiza-
tional learning techniques and theories
To realize these objectives, organizations must be able to
• Create dynamic internal processes that can function on a
daily basis to deal with understanding the potential fit of new
technologies and their overall value to the business
• Provide the discourse to bridge the gaps between IT- and
non-IT-related investments and uses into an integrated system
• Monitor investments and determine modifications to the life
cycle
• Implement various organizational learning practices, includ-
ing learning organization, knowledge management, change
management, and communities of practice, all of which help
foster strategic thinking and learning that can be linked to
performance (Gephardt & Marsick, 2003)
Another important aspect of strategic integration is what Murphy
(2002) calls “consequential interoperability,” in which “the conse-
quences of a business process” are understood to “dynamically trigger
integration” (p. 31). This integration occurs in what he calls the five
pillars of benefits realization:
1. Strategic alignment: The alignment of IT strategically with
business goals and objectives.
2. Business process impact: The impact on the need for the organi-
zation to redesign business processes and integrate them with
new technologies.
3. Architecture: The actual technological integration of appli-
cations, databases, and networks to facilitate and support
implementation.
4. Payback: The basis for computing return on investment (ROI)
from both direct and indirect perspectives.
5. Risk: Identifying the exposure for underachievement or fail-
ure in the technology investment.
Murphy’s (2002) pillars are useful in helping us understand how
technology can engender the need for responsive organizational dyna-
mism (ROD), especially as it bears on issues of strategic integration.
They also help us understand what becomes the strategic integration
component of ROD. His theory on strategic alignment and business
process impact supports the notion that IT will increasingly serve as an
undergirding force, one that will drive enterprise growth by identify-
ing the initiators (such as e-business on the Internet) that best fit busi-
ness goals. Many of these initiators will be accelerated by the growing
use of e-business, which becomes the very driver of many new market
realignments. This e-business realignment will require the ongoing
involvement of executives, business managers, and IT managers. In
fact, the Gartner Group forecasted that 70% of new software applica-
tion investments and 5% of new infrastructure expenditures by 2005
would be driven by e-business. Indeed, this has occurred and contin-
ues to expand.
The combination of evolving business drivers with accelerated and
changing customer demands has created a business revolution that
best defines the imperative of the strategic integration component of
ROD. The changing and accelerated way businesses deal with their
customers and vendors requires a new strategic integration to become
a reality rather than remain a concept discussed but affecting little
action. Without action directed toward new strategic integration,
organizations would lose competitive advantage, which would affect
profits. Most experts see e-business as the mechanism that will ulti-
mately require the integrated business processes to be realigned, thus
providing value to customers and modifying the customer–vendor
relationship. The driving force behind this realignment emanates from
the Internet, which serves as the principal accelerator of the change
in transactions across all businesses. The general need to optimize
resources forces organizations to rethink and to realign business pro-
cesses to gain access to new business markets.
Murphy’s (2002) pillar of architecture brings out yet another aspect
of ROD. By architecture we mean the focus on the effects that technol-
ogy has on existing computer applications or legacy systems (old exist-
ing systems). Technology requires existing IT systems to be modified
or replacement systems to be created that will mirror the new busi-
ness realignments. These changes respond to the forces of strategic
integration and require business process reengineering (BPR) activi-
ties, which represent the reevaluation of existing systems based on
changing business requirements. It is important to keep in mind the
acceleration factors of technology and to recognize the amount of
organizational effort and time that such projects take to complete. We
must ask the following question: How might organizations respond to
these continual requirements to modify existing processes? I discuss
in other chapters how ROD represents the answer to this question.
Murphy’s (2002) pillar of direct return is somewhat limited and nar-
row because not all IT value can be associated with direct returns, but
it is important to discuss. Technology acceleration is forcing organiza-
tions to deal with broader issues surrounding what represents a return
from an investment. The value of strategic integration relies heavily on
the ability of technology to encapsulate itself within other departments
where it ultimately provides the value. We show in Chapter 4 that
this issue also has significance in organizational formation. What this
means is simply that value can be best determined within individual
business units at the microlevel and that these appropriate-level busi-
ness units also need to make the case for why certain investments need
to be pursued. There are also paybacks that are indirect; for example,
Lucas (1999) demonstrates that many technology investments are non-
monetary. The IT department (among others) becomes susceptible to
great scrutiny and subject to budgetary cutbacks during economically
difficult times. This does not suggest that IT “hide” itself but rather
that its investment be integrated within the unit where it provides the
most benefit. Notwithstanding the challenge to map IT expenditures
to their related unit, there are always expenses that are central to all
departments, such as e-mail and network infrastructure. These types
of expenses can rarely provide direct returns and are typically allocated
across departments as a cost of doing business.
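To make the arithmetic of direct and indirect paybacks concrete, the following minimal Python sketch is offered purely as an illustration; the figures, function names, and the headcount-based allocation rule are assumptions of mine, not taken from Murphy (2002) or Lucas (1999).

# Hypothetical illustration: the figures, names, and allocation rule are
# assumptions for this sketch, not taken from Murphy (2002) or Lucas (1999).

def simple_roi(direct_benefit, indirect_benefit, investment):
    """Return ROI as a fraction of the investment. Indirect (nonmonetary)
    benefits must first be expressed as an estimated dollar value, which is
    precisely the step many executives found difficult to quantify."""
    return (direct_benefit + indirect_benefit - investment) / investment

def allocate_shared_cost(total_cost, headcounts):
    """Spread a central cost (e.g., e-mail or network infrastructure) across
    departments in proportion to headcount, as a 'cost of doing business'."""
    total_staff = sum(headcounts.values())
    return {dept: total_cost * n / total_staff for dept, n in headcounts.items()}

# A business unit's project: $500,000 invested, $450,000 in direct revenue
# gains, and $150,000 in estimated indirect value (retention, service quality).
print(f"ROI: {simple_roi(450_000, 150_000, 500_000):.0%}")          # ROI: 20%

# $300,000 of central infrastructure allocated across three departments.
print(allocate_shared_cost(300_000, {"sales": 40, "operations": 35, "finance": 25}))

The allocation step mirrors the point above: shared expenses such as e-mail rarely show a direct return and are simply distributed across departments as a cost of doing business.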
Because of the increased number of technology opportuni-
ties, Murphy’s (2002) risk pillar must be a key part of strategic
integration. The concept of risk assessment is not new to an organiza-
tion; however, it is somewhat misunderstood as it relates to technology
assessment. Technology assessment, because of the acceleration factor,
must be embedded within the strategic decision-making process. This
can only be accomplished by having an understanding of how to align
technology opportunities for business change and by understanding
the cost of forgoing the opportunity as well as the cost of delays in
delivery. Many organizations use risk assessment in an unstructured
way, which does not provide a consistent framework to dynamically
deal with emerging technologies. Furthermore, such assessment needs
to be managed at all levels in the organization as opposed to being an
event-driven activity controlled only by executives.
Summary
Strategic integration represents the objective of dealing with emerg-
ing technologies on a regular basis. It is an outcome of ROD, and it
requires organizations to deal with a variable that forces acceleration
of decisions in an unpredictable fashion. Strategic integration thus
requires businesses to realign the ways in which they include technol-
ogy in strategic decision making.
Cultural Assimilation
Cultural assimilation is a process that focuses on the organizational
aspects of how technology is internally organized, including the role
of the IT department, and how it is assimilated within the organiza-
tion as a whole. The inherent, contemporary reality of technologi-
cal dynamism requires not only strategic but also cultural change.
This reality demands that IT organizations connect to all aspects of
the business. Such affiliation would foster a more interactive culture
rather than one that is regimented and linear, as is too often the case.
An interactive culture is one that can respond to emerging technology
decisions in an optimally informed way, and one that understands the
impact on business performance.
The kind of cultural assimilation elicited by technological dyna-
mism and formalized in ROD is divided into two subcategories: the
study of how the IT organization relates and communicates with
“others,” and the actual displacement or movement of traditional
IT staff from an isolated “core” structure to a firm-wide, integrated
framework.
IT Organization Communications with “Others”
The Ravell case study shows us the limitations and consequences of
an isolated IT department operating within an organization. The case
study shows that the isolation of a group can lead to marginalization,
which results in the kind of organization in which not all individuals
can participate in decision making and implementation, even though
such individuals have important knowledge and value. Technological
dynamism is forcing IT departments to rethink their strategic posi-
tion within the organizational structure of their firm. No longer can
IT be a stand-alone unit designed just to service outside departments
while maintaining its separate identity. The acceleration factors of
technology require more dynamic activity within and among depart-
ments, which cannot be accomplished through discrete communica-
tions between groups. Instead, the need for diverse groups to engage
in more integrated discourse, and to share varying levels of techno-
logical knowledge, as well as business-end perspectives, requires new
organizational structures that will of necessity give birth to a new
and evolving business— social culture. Indeed, the need to assimilate
technology creates a transformative effect on organizational cultures,
the way they are formed and re-formed, and what they will need from
IT personnel.
Movement of Traditional IT Staff
To facilitate cultural assimilation from an IT perspective, IT must
become better integrated with non-IT personnel. This form of inte-
gration can require the actual movement of IT staff into other depart-
ments, which begins the process of a true assimilation of resources
among business units. While this may seem like the elimination of
the integrity or identity of IT, such a loss is far from the case. The
elimination of the IT department is not at all what is called for here;
on the contrary, the IT department is critical to the function of cul-
tural assimilation. However, the IT department may need to be struc-
tured differently from the way it has been so that it can deal primarily
with generic infrastructure and support issues, such as e-mail, net-
work architecture, and security. IT personnel who focus on business-
specific issues need to become closely aligned with the appropriate
units so that ROD can be successfully implemented.
Furthermore, we must acknowledge that, given the wide range of
available knowledge about technology, not all technological knowl-
edge emanates from the IT department. The question becomes
one of finding the best structure to support a broad assimilation of
knowledge about any given technology; then, we should ask how that
knowledge can best be utilized by the organization. There is a pitfall
in attempting to find a “standard” IT organizational structure that
will address the cultural assimilation of technology. Sampler’s (1996)
research and my recent research with chief executives confirm that
no such standard structure exists. It is my position that organizations
must find their own unique blend, using organizational learning con-
structs. This simply means that the cultural assimilation of IT may
be unique to the organization. What is then more important for the
success of organizational development is the process of assimilation as
opposed to the transplanting of the structure itself.
Today, many departments still operate within “silos” where they
are unable to meet the requirements of the dynamic and unpredictable
nature of technology in the business environment. Traditional orga-
nizations do not often support the necessary communications needed
to implement cultural assimilation across business units. However,
business managers can no longer make decisions without considering
technology; they will find themselves needing to include IT staff in
their decision-making processes. On the other hand, IT departments
can no longer make technology-based decisions without concerted
efforts toward assimilation (in contrast to occasional partnering or
project-driven participation) with other business units. This assimi-
lation becomes mature when new cultures evolve synergistically as
opposed to just having multiple cultures that attempt to work in con-
junction with each other. The important lesson from Ravell to keep
in mind here is that the process of assimilating IT can create new
cultures that in turn evolve to better support the requirements estab-
lished by the dynamism of technology.
Eventually, these new cultural formations will not perceive them-
selves as functioning within an IT or non-IT decision framework
but rather as operating within a more central business operation that
understands how to incorporate varying degrees of IT involvement
as necessary. Thus, organizational cultures will need to fuse together
to respond to new business opportunities and requirements brought
about by the ongoing acceleration of technological innovation. This
was also best evidenced by subsequent events at Ravell. Three years
after the original case study, it became necessary at Ravell to inte-
grate one of its business operations with a particular group of IT staff
members. The IT personnel actually transferred to the business unit
to maximize the benefits of merging both business and technical cul-
tures. Interestingly, this business unit is currently undergoing cultural
assimilation and is developing its own behavioral norms influenced by
the new IT staff. However, technology decisions within such groups
are not limited to the transferred IT personnel. IT and non-IT staff
need to formulate decisions using various organizational learning
techniques. These techniques are discussed in the next chapter.
Summary
Without appropriate cultural assimilation, organizations tend to have
staff that “take shortcuts, [then] the loudest voice will win the day, ad
hoc decisions will be made, accountabilities lost, and lessons from suc-
cesses and failures will not become part of … wisdom” (Murphy, 2002,
p. 152). As in the case of Ravell Corporation, it is essential, then, to
provide for consistent governance that fits the profile of the existing cul-
ture or can establish the need for a new culture. While many scholars
and managers suggest the need to have a specific entity responsible for
IT governance, one that is to be placed within the operating structure
of the organization, such an approach creates a fundamental problem.
It does not allow staff and managers the opportunity to assimilate tech-
nologically driven change and understand how to design a culture that
can operate under ROD. In other words, the issue of governance is
misinterpreted as a problem of structural positioning or hierarchy when
it is really one of cultural assimilation. As a result, many business solu-
tions to technology issues often lean toward the prescriptive, instead of
the analytical, in addressing the real problem.
Murphy’s (2002) risk pillar theory offers us another important
component relevant to cultural assimilation. This approach addresses
the concerns that relate to the creation of risk cultures formed to deal
with the impact of new systems. New technologies can actually cause
changes in cultural assimilation by establishing the need to make cer-
tain changes in job descriptions, power structures, career prospects,
degree of job security, departmental influence, or ownership of data.
Each of these potential risks needs to be factored in as an important
part of considering how best to organize and assimilate technology
through ROD.
Technology Business Cycle
To better understand technology dynamism, or how technology acts as
a dynamic variable, it is necessary to define the specific steps that occur
during its evolution in an organization. The evolution or business cycle
depicts the sequential steps during the maturation of a new technology
from feasibility to implementation and through subsequent evolution.
Table 3.1 shows the five components that comprise the cycle: feasibil-
ity, measurement, planning, implementation, and evolution.
Table 3.1  Technology Business Cycle

Feasibility: Understanding how to view and evaluate emerging technologies from a
technical and business perspective.

Measurement: Dealing with both the direct monetary returns and indirect
nonmonetary returns; establishing driver and support life cycles.

Planning: Understanding how to set up projects; establishing participation across
multiple layers of management, including operations and departments.

Implementation: Working with the realities of project management; operating with
political factions and constraints; meeting milestones; dealing with setbacks;
having the ability to go live with new systems.

Evolution: Understanding how acceptance of new technologies affects cultural
change, and how uses of technology will change as individuals and organizations
become more knowledgeable about technology and generate new ideas about how it
can be used; the objective is established through organizational dynamism,
creating new knowledge and an evolving organization.
Feasibility
The stage of feasibility focuses on a number of issues surrounding
the practicality of implementing a specific technology. Feasibility
addresses the ability to deliver a product when it is needed in com-
parison to the time it takes to develop it. Risk also plays a role in
feasibility assessment; of specific concern is the question of whether
it is possible or probable that the product will become obsolete before
completion. Cost is certainly a major factor, but it is viewed at a “high
level” (i.e., as a general cost range) and is usually geared toward
meeting the expected ROI of the firm. The feasibility process must be
one that incorporates individuals in a way that allows them to respond
to the accelerated and dynamic process brought forth by technological
innovations.
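The following is a minimal sketch, offered only as an illustration, of how these feasibility questions might be screened; the inputs, thresholds, and function name are hypothetical and would differ by firm.

# Hypothetical feasibility screen; inputs and thresholds are invented for
# illustration and would differ by firm and by technology.

def passes_feasibility(needed_by_months, est_build_months,
                       est_obsolescence_months, rough_cost, cost_ceiling):
    """Screen a proposed technology against the concerns discussed above:
    can it be delivered when it is needed, is it likely to outlive its own
    build time rather than become obsolete first, and does its high-level
    cost range fit the firm's expectations?"""
    delivered_in_time = est_build_months <= needed_by_months
    outlives_its_build = est_obsolescence_months > est_build_months
    within_cost_range = rough_cost <= cost_ceiling
    return delivered_in_time and outlives_its_build and within_cost_range

# Needed in 12 months, 9-month build, ~36 months before likely obsolescence,
# roughly $750,000 against a $1,000,000 ceiling.
print(passes_feasibility(12, 9, 36, 750_000, 1_000_000))   # True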
Measurement
Measurement is the process of understanding how an investment in
technology is calculated, particularly in relation to the ROI of an
organization. The complication with technology and measurement
is that it is simply not that easy to determine how to calculate such
a return. This problem comes up in many of the issues discussed by
Lucas (1999) in his book Information Technology and the Productivity
Paradox. His work addresses many comprehensive issues, surround-
ing both monetary and nonmonetary ROI, as well as direct ver-
sus indirect allocation of IT costs. Aside from these issues, there
is the fact that for many investments in technology the attempt to
compute ROI may be an inappropriate approach. As stated, Lucas
offered a “garbage can” model that advocates trust in the operational
management of the business and the formation of IT representatives
into productive teams that can assess new technologies as a regu-
lar part of business operations. The garbage can is an abstract con-
cept for allowing individuals a place to suggest innovations brought
about by technology. The inventory of technology opportunities
needs regular evaluation. Lucas does not really offer an explana-
tion of exactly how this process should work internally. ROD, how-
ever, provides the strategic processes and organizational–cultural
needs that can provide the infrastructure to better understand and
evaluate the potential benefits from technological innovations using
the garbage can model. The graphic depiction of the model is shown
in Figure 3.2 (garbage can model of IT value, from Lucas, H.C.,
Information Technology and the Productivity Paradox, Oxford University
Press, New York, 1999), in which failed systems, direct benefits, indirect
benefits, and user needs pass through conversion effectiveness into the
IT value pipeline.
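Purely as an illustrative sketch of the inventory-of-opportunities idea, and not of Lucas's model itself, such a pool might be recorded and revisited on a regular cycle as follows; the fields and the 90-day review rule are my own assumptions.

# Hypothetical sketch of an inventory of technology opportunities; the fields
# and the 90-day review rule are assumptions for illustration only.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Opportunity:
    name: str
    proposer: str             # anyone in the business can contribute a suggestion
    direct_benefit: str       # e.g., "reduce invoice processing cost"
    indirect_benefit: str     # e.g., "better customer responsiveness"
    added_on: date = field(default_factory=date.today)
    status: str = "open"      # open -> under review -> adopted or retired

def due_for_review(inventory, today, max_age_days=90):
    """Return open suggestions that have waited longer than the review cutoff,
    so the pool is evaluated regularly rather than only when a formal ROI
    case happens to exist."""
    return [o for o in inventory
            if o.status == "open" and (today - o.added_on).days >= max_age_days]

inventory = [
    Opportunity("Customer self-service portal", "operations manager",
                "fewer support calls", "customer goodwill",
                added_on=date(2023, 1, 10)),
]
print([o.name for o in due_for_review(inventory, date(2023, 6, 1))])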
Planning
Planning requires a defined team of user and IT representatives. This
appears to be a simple task, but it is more challenging to understand
how such teams should operate, from whom they need support, and
what resources they require. Let me be specific. There are a number
of varying types of “users” of technology. They typically exist in three
tiers: executives, business line managers, and functional (operations) users. Each
of these individuals offers valuable yet different views of the benefits
of technology (Langer, 2002). I define these user tiers as follows:
1. Executives: These individuals are often referred to as execu-
tive sponsors. Their role is twofold. First, they provide input
into the system, specifically from the perspective of pro-
ductivity, ROI, and competitive edge. Second, and per-
haps more important, their responsibility is to ensure that
users are participating in the requisite manner (i.e., made
to be available, in the right place, etc.). This area can be
problematic because internal users are typically busy doing
their jobs and sometimes neglect to provide input or to
attend project meetings. Furthermore, executive sponsors
can help control political agendas that can hurt the success
of the project.
2. Business line managers: This interface provides the most
information from a business unit perspective. These indi-
viduals are responsible for two aspects of management.
First, they are responsible for the day-to-day productivity
of their unit; therefore, they understand the importance
of productive teams, and how software can assist in this
endeavor. Second, they are responsible for their staff. Thus,
line managers need to know how software will affect their
operational staff.
3. Functional users: These are the individuals in the trenches who
understand exactly how processing needs to get done. While
their purview of the benefits of the system is relatively nar-
rower than that of the executives and managers, they provide
the concrete information that is required to create the feature/
functions that make the system usable.
The planning process becomes challenging when attempting to
get the three user communities to integrate their needs and “agree to
agree” on how a technology project needs to be designed and managed.
Implementation
Implementation is the process of actually using a technology.
Implementing technology systems requires wider integration across
departments than implementing other systems in an organization,
because multiple business units are usually affected. Implementation
must combine traditional IT development methods with the
constraints, assumptions, and cultural
(perhaps political) environments of different departments. Cultural
assimilation is therefore required at this stage because it delves into
the structure of the internal organization and requires individual
participation in every phase of the development and implementation
cycle. The following are some of the unique challenges facing the
implementation of technological projects:
1. Project managers as complex managers: Technology projects
require multiple interfaces that often lie outside the traditional
user community. They can include interfacing with writers,
editors, marketing personnel, customers, and consumers, all
of whom are stakeholders in the success of the system.
2. Shorter and dynamic development schedules: Due to the dynamic
nature of technology, its process of development is less lin-
ear than that of other systems. Because there is less experience in
the general user community, and there are more stakeholders,
there is a tendency by those in IT, and executives, to underes-
timate the time and cost to complete the project.
3. New untested technologies: There is so much new technol-
ogy offered to organizations that there is a tendency by IT
organizations to implement technologies that have not yet
matured— that are not yet the best products they will eventu-
ally be.
4. Degree of scope changes: Technology, because of its dynamic
nature, tends to be prone to scope creep — the scope of the orig-
inal project expanding during development.
5. Project management: Project managers need to work closely
with internal users, customers, and consumers to advise
them on the impact of changes to the project schedule.
Unfortunately, scope changes that are influenced by changes
in market trends may not be avoidable. Thus, part of a good
strategy is to manage scope changes rather than attempt to
stop them, which might not be realistic.
6. Estimating completion time: IT has always had difficulties in
knowing how long it will take to implement a technology.
Application systems are even more difficult because of the
number of variables and unknowns.
7. Lack of standards: The technology industry remains a profession
without a governing body. Thus, it is
impossible to have real enforced standards that other pro-
fessions enjoy. While there are suggestions for best prac-
tices, many of them are unproven and not kept current with
changing developments. Because of the lack of successful
application projects, there are few success stories to create new
and better sets of best practices.
8. Less-specialized roles and responsibilities: The IT team tends to
have staff members with varying responsibilities. Unlike in
traditional projects, in new technology-driven projects the separation of roles
and responsibilities is more difficult when operating in more
dynamic environments. The reality is that many roles have not
been formalized and integrated using something like ROD.
9. Broad project management responsibilities: Project management
responsibilities need to go beyond those of the traditional IT
manager. Project managers are required to provide manage-
ment services outside the traditional software staff. They need
to interact more with internal and external individuals, as well
as with non-traditional members of the development team,
such as Web text and content staff. Therefore, there are many
more obstacles that can cause implementation problems.
Evolution
The many ways to form a technological organization with a natural
capacity to evolve have been discussed from an IT perspective in this
chapter. However, another important factor is the changing nature
of application systems, particularly those that involve e-businesses.
E-business systems are those that utilize the Internet and engage
in e-commerce activities among vendors, clients, and internal users
in the organization. The ways in which e-business systems are built
and deployed suggest that they are evolving systems. This means
that they have a long life cycle involving ongoing maintenance and
enhancement. They are, if you will, “living systems” that evolve
in a manner similar to organizational cultures. So, the traditional
beginning-to-end life cycle does not apply to an e-business proj-
ect that must be implemented in inherently ongoing and evolving
phases. The important focus is that technology and organizational
development have parallel evolutionary processes that need to be in
balance with each other. This philosophy is developed further in the
next chapter.
Drivers and Supporters
There are essentially two types of generic functions performed by
departments in organizations: driver functions and supporter func-
tions. These functions relate to the essential behavior and nature of
what a department contributes to the goals of the organization. I
first encountered the concept of drivers and supporters at Coopers
& Lybrand, which was at that time a Big 8* accounting firm. I stud-
ied the formulation of driver versus supporter as it related to the role
of our electronic data processing (EDP) department. The firm was
attempting to categorize the EDP department as either a driver or a
supporter.
Drivers were defined in this instance as those units that engaged
in frontline or direct revenue-generating activities. Supporters were
units that did not generate obvious direct revenues but rather were
designed to support frontline activities. For example, operations such
as internal accounting, purchasing, or office management were all
classified as supporter departments. Supporter departments, due to
their nature, were evaluated on their effectiveness and efficiency or
economies of scale. In contrast, driver organizations were expected to
generate direct revenues and other ROI value for the firm. What was
also interesting to me at the time was that drivers were expected to
be more daring— since they must inevitably generate returns for the
business. As such, drivers engaged in what Bradley and Nolan (1998)
coined “sense and respond” behaviors and activities. Let me explain.
Marketing departments often generate new business by quickly
“sensing” an opportunity, under competitive forces in the marketplace,
and investing in a response. Thus, they must sense an opportunity and be
allowed to respond to it in a timely fashion. The process of sensing
opportunity, and responding with competitive products or services,
is a stage in the cycle that organizations need to support. Failures in
the cycles of sense and respond are expected. Take, for example, the
* The original “Big 8” consisted of the eight large accounting and management con-
sulting firms— Coopers & Lybrand, Arthur Andersen, Touche Ross, Deloitte
Haskins & Sells, Arthur Young, Price Waterhouse, Peat Marwick Mitchell, and
Ernst & Whinney— until the late 1980s, when these firms began to merge. Today,
there are four: PricewaterhouseCoopers, Deloitte & Touche, Ernst & Young, and
KPMG (Peat Marwick and others).
launching of new fall television shows. Each of the major stations
goes through a process of sensing which shows might be interesting to
the viewing audience. They respond, after research and review, with a
number of new shows. Inevitably, only a few of these selected shows
are actually successful; some fail almost immediately. While relatively
few shows succeed, the process is acceptable and is seen by manage-
ment as the consequence of an appropriate set of steps for competing
effectively— even though the percentage of successful new shows is
low. Therefore, it is safe to say that driver organizations are expected
to engage in high-risk operations, of which many will fail, for the sake
of creating ultimately successful products or services.
The preceding example raises two questions: (1) How does sense
and respond relate to the world of IT? and (2) Why is it important?
IT is unique in that it is both a driver and a supporter. The latter is the
generally accepted norm in most firms. Indeed, most IT functions are
established to support myriad internal functions, such as
• Accounting and finance
• Data center infrastructure (e-mail, desktop, etc.)
• Enterprise-level application (enterprise resource planning, ERP)
• Customer support (customer relationship management, CRM)
• Web and e-commerce activities
As one would expect, these IT functions are viewed as overhead
related, as somewhat of a commodity, and thus are constantly man-
aged on an economy-of-scale basis— that is, how can we make this
operation more efficient, with a particular focus on cost containment?
So, what then are IT driver functions? By definition, they are those
that generate direct revenues and identifiable ROI. How, though, do we
define such functions in IT, given that most such activities are sheltered
under the umbrella of marketing organization domains? (Excluding, of course,
software application development firms that engage in marketing for
their actual application products.) I define IT driver functions as those
projects that, if delivered, would change the relationship between the
organization and its customers; that is, those activities that directly
affect the classic definition of a market: forces of supply and demand,
which are governed by the customer (demand) and the vendor (sup-
plier) relationship. This concept can be shown in the case example that
follows.
Santander versus Citibank
Santander Bank, the major bank of Spain, had enjoyed a dominant
market share in its home country. Citibank had attempted for years to
penetrate Santander’s dominance using traditional approaches (open-
ing more branch offices, marketing, etc.) without success, until, that
is, they tried online banking. Using technology as a driver, Citibank
made significant penetration into the market share of Santander
because it changed the customer–vendor relationship. Online bank-
ing, in general, has had a significant impact on how the banking
industry has established new markets, by changing this relationship.
What is also interesting about this case is the way in which Citibank accounted for its investment in online banking; it knew little about its total investment and essentially did not care about its direct payback. Rather, Citibank saw its ROI in a way that reflects driver/marketing behavior; the payback is seen in broader terms, affecting not only revenue generation but also customer support and quality recognition.
Information Technology Roles and Responsibilities
The preceding section focuses on how IT can be divided into two dis-
tinct kinds of business operations. As such, the roles and responsibili-
ties within IT need to change accordingly and be designed under the
auspices of driver and supporter theory. Most traditional IT depart-
ments are designed to be supporters, so that they have a close-knit
organization that is secure from outside intervention and geared to
respond to user needs based on requests. While in many instances
this type of formation is acceptable, it is limited in providing the IT
department with the proper understanding of the kind of business
objectives that require driver-type activities. This was certainly the
experience in the Ravell case study. In that instance, I found that
making the effort to get IT support personnel “out from their comfortable shells” made a huge difference in providing better service to the organization at large. Because more and more technology is becoming driver essential, this development will require of IT personnel an increasing ability to communicate with managers and executives and to assimilate within other departments.
The Ravell case, however, also brought to light the huge vacuum of
IT presence in driver activities. The subsequent chief executive inter-
view study also confirmed that most marketing IT-oriented activities,
such as e-business, do not fall under the purview of IT in most orga-
nizations. The reasons for this separation are correlated with the lack
of IT executive presence within the management team.
Another aspect of driver and supporter functions is the concept of
a life cycle. A life cycle, in this respect, refers to the stages that occur
before a product or service becomes obsolete. Technology products
have a life cycle of value just as any other product or service. It is
important not to confuse this life cycle with processes during devel-
opment as discussed elsewhere in this chapter.
Many technical products are adopted because they are able to deliver
value that is typically determined based on ROI calculations. However,
as products mature within an organization, they tend to become more of
a commodity, and as they are normalized, they tend to become support-
oriented. Once they reach the stage of support, the rules of economies
of scale become more important and relevant to evaluation. As a prod-
uct enters the support stage, replacement based on economies of scale
can be maximized by outsourcing to an outside vendor who can provide
the service more cheaply. New technologies then can be expected to follow
this kind of life cycle, by which their initial investment requires some
level of risk to provide returns to the business. This initial investment
is accomplished in ROD using strategic integration. Once the evalua-
tions are completed, driver activities will prevail during the maturation
process of the technology, which will also require cultural assimilation.
Inevitably, technology will change organizational behavior and struc-
ture. However, once the technology is assimilated and organizational
behavior and structures are normalized, individuals will use it as a per-
manent part of their day-to-day operations. Thus, driver activities give
way to those of supporters. Senior managers become less involved, and
line managers then become the more important group that completes
the transition from driver to supporter.
Replacement or Outsource
After the technology is absorbed into operations, executives will seek
to maximize the benefit by increased efficiency and effectiveness.
Certain product enhancements may be pursued during this phase; they can create “mini-loops” of driver-to-supporter activities. Ultimately, a
technology, viewed in terms of its economies of scale and longevity,
is considered for replacement or outsourcing. Figure 3.3 graphically
shows the cycle.
The final stage of maturity of an evolving driver therefore includes
becoming a supporter, at which time it becomes a commodity and,
finally, an entity with potential for replacement or outsourcing. The
next chapter explores how organizational learning theories can be
used to address many of the issues and challenges brought forth in
this chapter.
Figure 3.3 Driver-to-supporter life cycle. (Figure labels: technology driver, evaluation cycle, driver maturation, support status, economies of scale, replacement or outsource, with mini-loop technology enhancements.)
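A minimal sketch of the cycle in Figure 3.3 follows, under the assumption that the stages advance in the order shown and that a product enhancement during support status creates the mini-loop described above. The Stage enumeration and next_stage function are illustrative names, not part of the original model.

    from enum import Enum, auto

    class Stage(Enum):
        EVALUATION = auto()            # strategic integration: assess the new technology
        DRIVER_MATURATION = auto()     # driver activities prevail; cultural assimilation occurs
        SUPPORT_STATUS = auto()        # the technology becomes a support-oriented commodity
        ECONOMIES_OF_SCALE = auto()    # efficiency and cost containment dominate
        REPLACEMENT_OR_OUTSOURCE = auto()

    def next_stage(current: Stage, enhancement: bool = False) -> Stage:
        """Advance one step through the driver-to-supporter life cycle.

        An enhancement while in support status loops back to driver maturation,
        modeling the 'mini-loops' of driver-to-supporter activity noted above.
        """
        if enhancement and current is Stage.SUPPORT_STATUS:
            return Stage.DRIVER_MATURATION
        stages = list(Stage)
        index = stages.index(current)
        return stages[min(index + 1, len(stages) - 1)]

    # Walk a technology through the cycle once, with no enhancements
    stage = Stage.EVALUATION
    for _ in range(len(Stage)):
        print(stage.name)
        stage = next_stage(stage)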
4
Organizational Learning Theories and Technology
Introduction
The purpose of this chapter is to provide readers with an understanding of organizational learning theory. The chapter covers some aspects
of the history and context of organizational learning. It also defines
and explains various learning protocols, and how they can be used to
promote organizational learning. The overall objective of organiza-
tional learning is to support a process that guides individuals, groups,
and entire communities through transformation. Indeed, evidence of
organizational transformation provides the very proof that learning
has occurred, and that changes in behavior are occurring. What is
important in this regard is that transformation remains internal to
the organization so that it can evolve in a progressive manner while
maintaining the valuable knowledge base that is contained within
the personnel of an organization. Thus, the purpose of organiza-
tional learning is to foster evolutionary transformation that will lead
to change in behaviors and that is geared toward improving strategic
performance.
Approaches to organizational learning typically address how indi-
viduals, groups, and organizations “notice and interpret information
and use it to alter their fit with their environments” (Aldrich, 2001,
p. 57). As such, however, organizational learning does not direct itself
toward, and therefore has not been able to show, an inherent link to
success—which is a critical concern for executive management. There
are two perspectives on organizational learning theory. On the one
hand, the adaptive approach, pioneered by Cyert and March (1963),
treats organizations as goal-oriented activity systems. These systems
generate learning when repeating experiences that have either suc-
ceeded or failed, discarding, of course, processes that have failed.
Knowledge development, on the other hand, treats organizations as
sets of interdependent members with shared patterns of cognition and
belief (Argyris & Schön, 1996). Knowledge development empha-
sizes that learning is not limited to simple trial and error, or direct
experience. Instead, learning is understood also to be inferential and
vicarious; organizations can generate new knowledge through experi-
mentation and creativity. It is the knowledge development perspec-
tive that fits conceptually and empirically with work on technological
evolution and organizational knowledge creation and deployment
(Tushman & Anderson, 1986).
There is a complication in the field of organizational learning over
whether it is a technical or social process. Scholars disagree on this
point. From the technical perspective, organizational learning is
about the effective processing of, interpretation of, and response to
information both inside and outside the organization. “An organiza-
tion is assumed to learn if any of its units acquires knowledge that it
recognizes as potentially useful to the organization” (Huber, 1991,
p. 89). From the social perspective, on the other hand, comes the con-
cept that learning is “something that takes place not with the heads of
individuals, but in the interaction between people” (Easterby-Smith
et al., 1999, p. 6). The social approach draws from the notion that
patterns of behavior are developed, via patterns of socialization, by
evolving tacit knowledge and skills. There is, regrettably, a lack of
ongoing empirical investigation in the area of organizational learning
pertaining, for example, to in-depth case studies, to micropractices
within organizational settings, and to processes that lead to outcomes.
Indeed, measuring learning is a difficult process, which is why there
is a lack of research that focuses on outputs. As Prange (1999, p. 24)
notes: “The multitude of ways in which organizational learning has
been classified and used purports an ‘organizational learning jungle,’
which is becoming progressively dense and impenetrable.” Mackenzie
(1994, p. 251) laments that the “scientific community devoted to organizational learning has not produced discernable intellectual progress.”
Ultimately, organizational learning must provide transformation
that links to performance. Most organizations seeking improved per-
formance expect changes that will support new outcomes. The study of
organizational learning needs an overarching framework under which
an inquiry into the pivotal issues surrounding organizational change
can be organized. Frameworks that support organizational learning,
whether their orientation is on individuals, groups, or infrastructure,
need to allow for natural evolution within acceptable time frames for
the organization. This is the problem of organizational learning the-
ory. It lacks a method of producing measurable results that executives
can link to performance. While scholars seek outcomes through stra-
tegic learning, there must be tangible evidence of individual and orga-
nizational performance to ensure future investments in the concepts
of learning. Technology, we should remember, represents the oppor-
tunity to provide outcomes through strategic learning that addresses
transitions and transformations over a specific life cycle.
We saw this opportunity occur in the Ravell case study; the
information technology (IT) department used organizational learn-
ing. Specifically, individual reflective practices were used to provide
measurable outcomes for the organization. In this case, the out-
comes related to a specific event, the physical move of the business
to a different location. Another lesson we can derive (with hindsight)
from the Ravell experience is that learning was converted to strategic
benefit for the organization. The concept of converting learning to
strategic benefit was pioneered by Pietersen (2002). He established a
strategic learning cycle composed of four component processes that he
identified with the action verbs learn, focus, align, and execute. These
are stages in the learning cycle, as follows (a brief illustrative sketch of the loop appears after the list):
1. Learn: Conduct a situation analysis to generate insights into
the competitive environment and into the realities of the
company.
2. Focus: Translate insights into a winning proposition that out-
lines key priorities for success.
3. Align: Align the organization and energize the people behind
the new strategic focus.
4. Execute: Implement strategy and experiment with new con-
cepts. Interpret results and continue the cycle.
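The sketch below expresses the four stages as a repeating loop. It is only a schematic rendering of Pietersen’s cycle; the function bodies and the situation and organization placeholders are assumptions made for illustration.

    def learn(situation):
        """Situation analysis: generate insights about the environment and the company."""
        return {"insights": f"analysis of {situation}"}

    def focus(insights):
        """Translate insights into a winning proposition with key priorities."""
        return {"priorities": insights["insights"]}

    def align(priorities, organization):
        """Align and energize the organization behind the new strategic focus."""
        organization["focus"] = priorities["priorities"]
        return organization

    def execute(organization):
        """Implement the strategy, experiment, and interpret results for the next cycle."""
        return f"results from executing on {organization['focus']}"

    situation = "competitive environment"
    organization = {}
    for cycle in range(2):  # results of one pass become input to the next 'learn' step
        results = execute(align(focus(learn(situation)), organization))
        print(f"cycle {cycle + 1}: {results}")
        situation = results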
At Ravell, technology assisted in driving the learning cycle because,
by its dynamic nature, it mandated the acceleration of the cycle that
Pietersen (2002) describes in his stage strategy of implementation.
Thus, Ravell required the process Pietersen outlined to occur within
6 months, and therein established the opportunity to provide outcomes.
It also altered the culture of the organization (i.e., the evolution in cul-
ture was tangible because the transformation was concrete).
We see from the Ravell case that technology represents the best opportunity to apply organizational learning techniques because its use requires forms of evolutionary change. Organizations are continually seeking to improve their operations and competitive advantage through efficient and effective processes. As I have
discussed in previous chapters, today’s businesses are experiencing
technological dynamism (defined as causing accelerated and dynamic
transformations), and this is due to the advent of technologically driven
processes. That is, organizations are experiencing more pressure to
change and compete as a result of the accelerations that technology
has brought about. Things happen quicker, and more unpredictably,
than before. This situation requires organizations to sense the need for
change and execute that change. The solution I propose is to tie orga-
nizational theory to technological implementation. Another way of
defining this issue is to provide an overarching framework that orga-
nizes an inquiry into the issues surrounding organizational change.
Another dimension of organizational learning is political. Argyris
(1993) and Senge (1990) argue that politics gets “in the way of good
learning.” In my view, however, the political dimension is very much
part of learning. It seems naïve to assume that politics can be eliminated from the daily commerce of organizational communication.
Instead, it needs to be incorporated as a factor in organizational learn-
ing theory rather than attempting to disavow or eliminate it, which is
not realistic. Ravell also revealed that political factors are simply part
of the learning process. Recall that during my initial efforts to create
a learning organization there were IT staff members who deliberately
refused to cooperate, assuming that they could “outlast” me in my
interim tenure as IT director. But politics, of course, is not limited to
internal department negotiations; it was also a factor at Ravell with,
and among, departments outside IT. These interdepartmental rela-
tionships applied especially to line managers, who became essential
advocates for establishing and sustaining necessary forms of learning
at the organizational level. But, not all line managers responded with
the same enthusiasm, and a number of them did not display a sense of
authentically caring about facilitating synergies across departments.
The irrepressible existence of politics in social organizations, however,
must not in itself deter us from implementing organizational learn-
ing practices; it simply means that we must factor it in as part
of the equation. At Ravell, I had to work within the constraints of
both internal and external politics. Nevertheless, in the end I was able
to accomplish the creation of a learning organization. Another way
one might look at the road bumps of politics is to assume that they
will temporarily delay or slow the implementation of organizational
learning initiatives. But, let us make no mistake about the potentially
disruptive nature of politics because, as we know, in its extreme cases
of inflexibility, it can be damaging.
I have always equated politics with the dilemma of blood cholesterol.
We know that there are two types of cholesterol: “good” cholesterol
and “bad” cholesterol. We all know that bad cholesterol in your blood
can cause heart disease, among other life-threatening conditions.
However, good cholesterol is essential to the body. My point is simple: the general word politics carries damaging connotations. When most people discuss the topic of cholesterol, they focus on the bad type, not the good. The same is true of politics—that is, most individuals discuss the bad type, which often corresponds with their personal experiences. My colleague Professor Lyle Yorks, at Columbia University, often lectures on the importance of politics and its positive aspects for establishing strategic advocacy, defined as the ability to establish personal and functional influence by cultivating alliances and defining opportunities for adding value to either the top or bottom line (Langer & Yorks, 2013). Thus, politics can add value for individuals by allowing them to initiate and influence relationships and conversations with other leaders. This, then, is “good” politics!
North American cultural norms account for much of what goes
into organizational learning theory, such as individualism, an empha-
sis on rationality, and the importance of explicit, empirical informa-
tion. IT, on the other hand, has a broadening, globalizing effect on
organizational learning because of the sheer increase in the number of
multicultural organizations created through the expansion of global
firms. Thus, technology also affects the social aspects of organizational
learning, particularly as it relates to the cultural evolution of commu-
nities. Furthermore, technology has shown us that what works in one
culture may not work in another. Dana Deasy, the former CIO of the
Americas region/sector for Siemens AG, experienced the difficulties
and challenges of introducing technology standards on a global scale.
He quickly learned that what worked in North America did not oper-
ate with the same expectations in Asia or South America. I discuss
Siemens AG as a case study in Chapter 8.
It is my contention, however, that technology can be used as an
intervention that can actually increase organizational learning. In
effect, the implementation of organizational learning has lacked and
has needed concrete systemic processes that show results. A solution
to this need can be found, as I have found it, in the incorporation of
IT itself into the process of true organizational learning. The prob-
lem with IT is that we keep trying to simplify it—trying to reduce
its complexity. However, dealing with the what, when, and how of
working with technology is complex. Organizations need a kind of
mechanism that can provide a way to absorb and learn all of the com-
plex pieces of technology.
It is my position that organizational change often follows learn-
ing, which to some extent should be expected. What controls whether
change is radical or evolutionary depends on the basis on which
new processes are created (Argyris & Schön, 1996; Senge, 1990; Swieringa & Wierdsma, 1992). Indeed, at Ravell the learning followed the Argyris and Schön approach: radical change occurs when there are major events that support the need for accelerated change. In other words, critical events become catalysts that promote change through reflection. On the other hand, there can be non-event-related learning that is not so much radical in nature as it is evolutionary. Thus, evolutionary learning is characterized as an ongoing process that slowly establishes the need for change over time. This
evolutionary learning process compares to what Senge (1990, p. 15)
describes as “learning in wholes as opposed to pieces.”
This concept of learning is different from an event-driven perspec-
tive, and it supports the natural tendency that groups and organiza-
tions have to protect themselves from open confrontation and critique.
However, technology provides an interesting variable in this regard.
It is generally accepted as an agent of change that must be addressed
by the organization. I believe that this agency can be seized as an
opportunity to promote such change because it establishes a reason
why organizations need to deal with the inevitable transitions brought
about by technology. Furthermore, as Huysman (1999) points out, the
history of organizational learning has not often created measurable
improvement, particularly because implementing the theories has not
always been efficient or effective. Much of the impetus for implement-
ing a new technology, however, is based on the premise that its use
will result in such benefits. Therefore, technology provides compelling
reasons for why organizational learning is important: to understand
how to deal with agents of change, and to provide ongoing changes in
the processes that improve competitive advantage.
There is another intrinsic issue here. Uses of technology have not
always resulted in efficient and effective outcomes, particularly as
they relate to a firm’s expected ROI. In fact, IT projects often cost
more than expected and tend to be delivered late. Indeed, research
performed by the Gartner Group and CIO Magazine (Koch, 1999)
reports that 54% of IT projects are late and that 22% are never com-
pleted. In May 2009, McGraw reported similar trends, so industry
performance has not materially improved. This is certainly a disturb-
ing statistic for a dynamic variable of change that promises outcomes
of improved efficiency and effectiveness. The question then is why is
this occurring? Many scholars might consider the answer to this ques-
tion as complex. It is my claim, however, based on my own research,
that the lack of organizational learning, both within IT and within
other departments, poses, perhaps, the most significant barrier to the
success of these projects in terms of timeliness and completion. Langer
(2001b) suggests that the inability of IT organizations to understand how to deal with larger communities within the organization and to establish realistic and measurable outcomes is relevant both to many of the core values of organizational learning and to its importance in attaining results. What better opportunity is there to combine IT and organizational learning so that the strengths of each can offset the weaknesses of the other?
Perhaps what is most interesting—and, in many ways, lacking
within the literature on organizational learning—is the actual way
individuals learn. To address organizational learning, I believe it is
imperative to address the learning styles of individuals within the
organization. One fundamental consideration to take into account
is that of individual turnover within departments. Thus, methods
to measure or understand organizational learning must incorporate
the individual; how the individual learns, and what occurs when
individuals change positions or leave, as opposed to solely focusing
on the event-driven aspect of evolutionary learning. There are two
sociological positions about how individual learning occurs. The first
suggests that individual action derives from determining influences
in the social system, and the other suggests that it emanates from
individual action. The former proposition supports the concept that
learning occurs at the organizational, or group level, and the lat-
ter supports it at the individual level of action and experience. The
“system” argument focuses on learning within the organization as a
whole and claims that individual action functions within its boundar-
ies. The “individual” argument claims that learning emanates from
the individual first and affects the system as a result of outcomes from
individual actions. Determining a balance between individual and
organizational learning is an issue debated by scholars and an impor-
tant one that this book must address.
Why is this issue relevant to the topic of IT and organizational
learning? Simply put, understanding the nature of evolving technologies requires recognizing that learning—and subsequent learning outcomes—will be heavily affected by the processes through which it is delivered. Therefore,
without understanding the dynamics of how individuals and organi-
zations learn, new technologies may be difficult to assimilate because
of a lack of process that can determine how they can be best used in
the business. What is most important to recognize is the way in which
responsive organizational dynamism (ROD) needs both the system
and individual approaches. Huysman (1999) suggests (and I agree)
that organizational versus individual belief systems are not mutually
exclusive pairs but dualities. In this way, organizational processes are
not seen as just top-down or bottom-up affairs, but as accumulations
of history, assimilated in organizational memory, which structures
and positions the agency or capacity for learning. In a similar way,
organizational learning can be seen as occurring through the actions
of individuals, even when they are constrained by institutional forces.
The strategic integration component of ROD lends itself to the system
model of learning to the extent that it almost mandates change—
change that, if not addressed, will inevitably affect the competitive
advantage of the organization. On the other hand, the cultural assim-
ilation component of ROD is also involved because of its effect on
individual behavior. Thus, the ROD model needs to be expanded to
show the relationship between individual and organizational learning
as shown in Figure 4.1.
An essential challenge to technology comes from the fact that
organizations are not sure about how to handle its overall potential.
Thus, in a paradoxical way, this quandary provides a springboard to
learning by utilizing organizational learning theories and concepts to
create new knowledge, by learning from experience, and ultimately by
linking technology to learning and performance. This perspective can
be promoted from within the organization because chief executives
are generally open to investing in learning as long as core business
principles are not violated. This position is supported by my research
with chief executives that I discussed in Chapter 2.
Figure 4.1 ROD and organizational learning. (Figure labels: technology; organizational dynamism, the acceleration of events that require different infrastructures and organizational processes; requires strategic integration and cultural assimilation; organization structures (system) and individual actions, with renegotiation of the relationship; symptoms and implications; organizational learning techniques.)
Organizational learning can also assist in the adoption of
technologies by providing a mechanism to help individuals manage
change. This notion is consistent with Aldrich (2001), who observes
that many organizations reject technology-driven changes or “pio-
neering ventures,” which he called competence-destroying ventures
because they threaten existing norms and processes. Organizations
would do well to understand the value of technology, particularly for
those who adopt it early (early adopters), and how it can lead to com-
petitive advantages. Thus, organizations that position themselves to
evolve, to learn, and to create new knowledge are better prepared to
foster the handling, absorption, and acceptance of technology-driven
change than those that are not. Another way to view this ethic is to
recognize that organizations need to be “ready” to deal with change—
change that is accelerated by technology innovations. Although
Aldrich (2001) notes that organizational learning has not been tied
to performance and success, I believe it will be the technology revolu-
tion that establishes the catalyst that can tie organizational learning
to performance.
The following sections of this chapter expand on the core concept
that the success of ROD is dependent on the uses of organizational
learning techniques. In each section, I correlate this concept to many
of the organizational learning theories and show how they can be
tailored and used to provide important outcomes that assist the pro-
motion of both technological innovation and organizational learning.
Learning Organizations
Business strategists have realized that the ability of an organization
to learn faster, or “better,” than its competitors may indeed be the key
to long-term business success (Collis, 1994; Dodgson, 1993; Grant,
1996; Jones, 1975). A learning organization is defined as a form of
organization that enables, in an active sense, the learning of its mem-
bers in such a way that it creates positive outcomes, such as innovation,
efficiency, improved alignment with the environment, and competi-
tive advantage. As such, a learning organization is one that acquires
knowledge from within. Its evolution, then, is primarily driven by
itself without the need for interference from outside forces. In this
sense, it is a self-perpetuating and self-evolving system of individual
and organizational transformations integrated into the daily processes
of the organization. It should be, in effect, a part of normal organiza-
tional behavior. The focus of organizational learning is not so much
on the process of learning but more on the conditions that allow suc-
cessful outcomes to flourish. Learning organization literature draws
from organizational learning theory, particularly as it relates to inter-
ventions based on outcomes. This provides an alternative to social
approaches.
In reviewing these descriptions of what a learning organization
does, and why it is important, we can begin to see that technology may
be one of the few agents that can actually show what learning organi-
zations purport to do. Indeed, Ravell created an evolving population
that became capable of dealing with environmental changes brought
on by technological innovation. The adaptation of these changes
created those positive outcomes and improved efficiencies. Without
organizational learning, specifically the creation of a learning organi-
zation, many innovations brought about by technology could produce
chaos and instability. Organizations generally tend to suffer from, and
spend too much time reflecting on, their past dilemmas. However,
given the recent phenomenon of rapid changes in technology, orga-
nizations can no longer afford the luxury of claiming that there is
simply too much else to do to be constantly worrying about technol-
ogy. Indeed, Lounamaa and March (1987) state that organizations
can no longer support the claim that too-frequent changes will inhibit
learning. The fact is that such changes must be taken as evolutionary,
and as a part of the daily challenges facing any organization. Because
a learning organization is one that creates structure and strategies, it
is positioned to facilitate the learning of all its members, during the
ongoing infiltration of technology-driven agents of change. Boland
et al. (1994) show that information systems based on multimedia
technologies may enhance the appreciation of diverse interpretations
within organizations and, as such, support learning organizations.
Since learning organizations are deliberately created to facilitate the
learning of their members, understanding the urgency of technologi-
cal changes can provide the stimulus to support planned learning.
Many of the techniques used in the Ravell case study were based
on the use of learning organizational techniques, many of which were
pioneered by Argyris and Schön (1996). Their work focuses on using
“action science” methods to create and maintain learning organiza-
tions. A key component of action science is the use of reflective prac-
tices—including what is commonly known among researchers and
practitioners as reflection in action and reflection on action. Reflection
with action is the term I use as a rubric for these various methods,
involving reflection in relation to activity. Reflection has received
a number of definitions, from different sources in the literature.
Depending on the emphasis, whether on theory or practice, defini-
tions vary from philosophical articulation (Dewey, 1933; Habermas,
1998), to practice-based formulations, such as Kolb’s (1984b) use of
reflection in the experiential learning cycle. Specifically, reflection
with action carries the resonance of Schön’s (1983) twin constructs:
reflection on action and reflection in action, which emphasize reflec-
tion in retrospect, and reflection to determine which actions to take
in the present or immediate future, respectively. Dewey (1933) and
Hullfish and Smith (1978) also suggest that the use of reflection sup-
ports an implied purpose: individuals reflect for a purpose that leads
to the processing of a useful outcome. This formulation suggests the
possibility of reflection that is future oriented—what we might call
“reflection to action.” These are methodological orientations covered
by the rubric.
Reflective practices are integral to ROD because so many
technology-based projects are event driven and require individu-
als to reflect before, during, and after actions. Most important to
this process is that these reflections are individually driven and that
technology projects tend to accelerate the need for rapid decisions.
In other words, there are more dynamic decisions to be made in less
time. Without operating in the kind of formation that is a learning
organization, IT departments cannot maintain the requisite infra-
structure to develop products on time and support business
units—something that clearly is not happening if we look at the
existing lateness of IT projects. With respect to the role of reflec-
tion in general, the process can be individual or organizational.
While groups can reflect, it is in being reflective that individuals
bring about “an orientation to their everyday lives,” according to
Moon (1999). “For others reflection comes about when conditions
in the learning environment are appropriate” (p. 186). However,
IT departments have long suffered from not having the conditions
to support such an individual learning environment. This is why
implementing a learning organization is so appealing as a remedy
for a chronic problem.
Communities of Practice
Communities of practice are based on the assumption that learning
starts with engagement in social practice and that this practice is the
fundamental construct by which individuals learn (Wenger, 1998).
Thus, communities of practice are formed to get things done by using
a shared way of pursuing interest. For individuals, this means that
learning is a way of engaging in, and contributing to, the practices
of their communities. For specific communities, on the other hand,
it means that learning is a way of refining their distinctive practices
and ensuring new generations of members. For entire organizations,
it means that learning is an issue of sustaining interconnected com-
munities of practice, which define what an organization knows and
contributes to the business. The notion of communities of practice
supports the idea that learning is an “inevitable part of participat-
ing in social life and practice” (Elkjaer, 1999, p. 75). Communities of
practice also include assisting members of the community, with the
particular focus on improving their skills. This is also known as situated learning. Thus, communities of practice are very much a social
learning theory, as opposed to one that is based solely on the indi-
vidual. Communities of practice have been called learning in working,
in which learning is an inevitable part of working together in a social
setting. Much of this concept implies that learning, in some form or
other will occur, and that it is accomplished within a framework of
social participation, not solely or simply in the individual mind. In a
world that is changing significantly due to technological innovations,
we should recognize the need for organizations, communities, and
individuals to embrace the complexities of being interconnected at an
accelerated pace.
There is much that is useful in the theory of communities of practice
and that justifies its use in ROD. While so much of learning technol-
ogy is event driven and individually learned, it would be shortsighted
to believe that it is the only way learning can occur in an organization.
Furthermore, the enormity and complexity of technology requires a
community focus. This would be especially useful within the confines of
specific departments that are in need of understanding how to deal with
technological dynamism. That is, preparation for using new technolo-
gies cannot be accomplished by waiting for an event to occur. Instead,
preparation can be accomplished by creating a community that can
assess technologies as a part of the normal activities of an organization.
Specifically, this means that, through the infrastructure of a commu-
nity, individuals can determine how they will organize themselves to
operate with emerging technologies, what education they will need, and
what potential strategic integration they will need to prepare for changes
brought on by technology. Action in this context can be viewed as a
continuous process, much in the same way that I have presented technol-
ogy as an ongoing accelerating variable. However, Elkjaer (1999) argues
that the continuous process cannot exist without individual interaction.
As he states: “Both individual and collective activities are grounded in
the past, the present, and the future. Actions and interactions take place
between and among group members and should not be viewed merely as
the actions and interactions of individuals” (p. 82).
Based on this perspective, technology can be handled by the
actions (community) and interactions (individuals) of the organiza-
tion as shown in Figure 4.2.
Figure 4.2 Technology relationship between communities and individuals. (Figure labels: communities of practice, the social actions of how to deal with technology, which allow groups to engage in discourse and examine the ongoing effects on the department or unit, including short- and long-term education requirements, skills transfer and development, organizational issues, and relationships with other departments and customers; and event-driven, individual-based learning, in which the individual interacts with others, determines new methods of utilizing technology within his or her specific business objectives, and uses reflection as the basis of transformative learning.)
It seems logical that communities of practice provide the mecha-
nism to assist, particularly, with the cultural assimilation component
of ROD. Indeed, cultural assimilation targets the behavior of the
community, and its need to consider what new organizational struc-
tures can better support emerging technologies. I have, in many ways,
already established and presented the challenge of what should be
called the “community of IT practice” and its need to understand how
to restructure to meet the needs of the organization. This is the kind
of issue that does not lend itself to event-driven, individual learning,
but rather to a more community-based process that can deal with the
realignment of departmental relationships.
Essentially, communities of IT practice must allow for the con-
tinuous evolution of learning based on emergent strategies. Emergent
strategies acknowledge unplanned action. Such strategies are defined
as patterns that develop in the absence of intentions (Mintzberg &
Waters, 1985). Emergent strategies can be used to gather groups that
can focus on issues not based on previous plans. These strategies can
be thought of as creative approaches to proactive actions. Indeed, a
frustrating aspect of technology is its uncertainty. Ideas and concepts
borrowed from communities of practice can help departments deal
with the evolutionary aspects of technological dynamism.
The relationship, then, between communities of practice and tech-
nology is significant. Many of the projects involving IT have been tra-
ditionally based on informal processes of learning. While there have
been a number of attempts to computerize knowledge using various
information databases, they have had mixed results. A “structured”
approach to creating knowledge reporting is typically difficult to estab-
lish and maintain. Many IT departments have utilized International
Organization for Standardization (ISO) 9000 concepts. The ISO is
a worldwide organization that defines quality processes through for-
mal structures. It attempts to take knowledge-based information and
transfer it into specific and documented steps that can be evaluated as
they occur. Unfortunately, the ISO 9000 approach, even if realized,
is challenging when such knowledge and procedures are undergoing
constant and unpredictable change. Technological dynamism cre-
ates too many uncertainties to be handled by the extant discourses on
how organizations have dealt with change variables. Communities of
practice provide an umbrella of discourses that are necessary to deal
with ongoing and unpredictable interactions established by emerging
technologies.
Support for this position is found in the fact that technology requires
accumulative collective learning that needs to be tied to social prac-
tices; this way, project plans can be based on learning as a participatory
act. One of the major advantages of communities of practice is that
they can integrate key competencies into the very fabric of the organi-
zation (Lesser et al., 2000). The typical disadvantage of IT is that its
staff needs to serve multiple organizational structures simultaneously.
This requires that priorities be set by the organization. Unfortunately,
it is difficult, if not impossible, for IT departments to establish such
priorities without engaging in concepts of communities of practice that
allow for a more integrated process of negotiation and determination.
Much of the process of communities of practice would be initiated by
strategic integration and result in many cultural assimilation changes;
that is, the process of implementing communities of practice will
necessitate changes in cultural behavior and organization processes.
As stated, communities-of-practice activities can be initiated via
the strategic integration component of ROD. According to Lesser et
al. (2000), a knowledge strategy based on communities of practice
consists of seven basic steps (Table 4.1).
Lesser and Wenger (2000) suggest that communities of practice
are heavily reliant on innovation: “Some strategies rely more on inno-
vation than others for their success. … Once dependence on innova-
tion needs have been clarified, you can work to create new knowledge
where innovation matters” (p. 8). Indeed, electronic communities of
practice are different from physical communities. IT provides another
dimension to how technology affects organizational learning. It does
so by creating new ways in which communities of practice operate. In
the complexity of ways that it affects us, technology has a dichoto-
mous relationship with communities of practice. That is, there is a
two-sided issue: (1) the need for communities of practice to imple-
ment IT projects and integrate them better into learning organiza-
tions, and (2) the expansion of electronic communities of practice
invoked by technology, which can, in turn, assist in organizational
learning, globally and culturally.
The latter issue establishes the fact that a person can now readily
be a member of many electronic communities, and in many different
capacities. Electronic communities are different, in that they can
have memberships that are short-lived and transient, forming and
re-forming according to interest, particular tasks, or commonality of
issue. Communities of practice themselves are utilizing technologies
to form multiple and simultaneous relationships. Furthermore, the
growth of international communities resulting from ever-expanding
global economies has created further complexities and dilemmas.
Thus far, I have presented communities of practice as an infra-
structure that can foster the development of organizational learn-
ing to support the existence of technological dynamism. Most of
what I presented has an impact on the cultural assimilation com-
ponent of ROD—that is, affecting organizational structure and the way things need to be done.
Table 4.1 Extended Seven Steps of Community of Practice Strategy

Step 1. Understanding strategic knowledge needs: What knowledge is critical to success. Technology extension: Understanding how technology affects strategic knowledge, and what specific technological knowledge is critical to success.

Step 2. Engaging practice domains: People form communities of practice to engage in and identify with. Technology extension: Technology identifies groups based on business-related benefits and requires domains to work together toward measurable results.

Step 3. Developing communities: How to help key communities reach their full potential. Technology extension: Technologies have life cycles that require communities to continue; the life cycle is treated as a supporter for attaining maturation and full potential.

Step 4. Working the boundaries: How to link communities to form broader learning systems. Technology extension: Technology life cycles require new boundaries to be formed; this links other communities that were previously outside discussions and thus expands input into technology innovations.

Step 5. Fostering a sense of belonging: How to engage people’s identities and sense of belonging. Technology extension: The process of integrating communities; IT and other organizational units will create new evolving cultures that foster belonging as well as new social identities.

Step 6. Running the business: How to integrate communities of practice into running the business of the organization. Technology extension: Cultural assimilation provides the new organizational structures necessary to operate communities of practice and to support new technological innovations.

Step 7. Applying, assessing, reflecting, renewing: How to deploy knowledge strategy through waves of organizational transformation. Technology extension: The active process of dealing with multiple new technologies accelerates the deployment of knowledge strategy; emerging technologies increase the need for organizational transformation.
However, technology, particularly the
strategic integration component of ROD, fosters a more expanded
vision of what can represent a community of practice. What does
this mean? Communities of practice, through the advent of strate-
gic integration, have expanded to include electronic communities.
While technology can provide organizations with vast electronic
libraries that end up as storehouses of information, they are only
valuable if they are allowed to be shared within the community.
Although IT has led many companies to imagine a new world of
leveraged knowledge, communities have discovered that just storing
information does not provide for effective and efficient use of knowl-
edge. As a result, many companies have created these “electronic”
communities so that knowledge can be leveraged, especially across
cultures and geographic boundaries. These electronic communities
are predictably more dynamic as a result of what technology pro-
vides to them. The following are examples of what these communi-
ties provide to organizations:
• Transcending boundaries and exchanging knowledge with
internal and external communities. In this circumstance,
communities are extending not only across business units,
but also into communities among various clients—as we
see developing in advanced e-business strategies. Using the
Internet and intranets, communities can foster dynamic inte-
gration of the client, an important participant in competitive
advantage. However, the expansion of an external commu-
nity, due to emergent electronics, creates yet another need for
the implementation of ROD.
• Creating “Internet” or electronic communities as sources
of knowledge (Teigland, 2000), particularly for technical-
oriented employees. These employees are said to form “com-
munities of techies”: technical participants, composed largely
of the IT staff, who have accelerated means to come into con-
tact with business-related issues. In the case of Ravell, I cre-
ated small communities by moving IT staff to allow them to
experience the user’s need; this move is directly related to the
larger, and expanded, ability of using electronic communities
of practice.
• Connecting social and workplace communities through
sophisticated networks. This issue links well to the entire
expansion of issues surrounding organizational learning, in
particular, learning organization formation. It enfolds both
the process and the social dialectic issues so important to cre-
ating well-balanced communities of practice that deal with
organizational-level and individual development.
• Integrating teleworkers and non-teleworkers, including the
study of gender and cultural differences. The growth of dis-
tance workers will most likely increase with the maturation of
technological connectivity. Videoconferencing and improved
media interaction through expanded broadband will support
further developments in virtual workplaces. Gender and cul-
ture will continue to become important issues in the expan-
sion of existing models that are currently limited to specific
types of workplace issues. Thus, technology allows for the
“globalization” of organizational learning needs, especially
due to the effects of technological dynamism.
• Assisting in computer-mediated communities. Such media-
tion allows for the management of interaction among com-
munities, of who mediates their communications criteria, and
of who is ultimately responsible for the mediation of issues.
Mature communities of practice will pursue self-mediation.
• Creating “flame” communities. A flame is defined as a lengthy,
often personally insulting, debate in an electronic commu-
nity that provides both positive and negative consequences.
Difference can be linked to strengthening the identification
of common values within a community but requires organiza-
tional maturation that relies more on computerized commu-
nication to improve interpersonal and social factors to avoid
miscommunications (Franco et al., 2000).
• Storing collective knowledge in large-scale libraries and
databases. As Einstein stated: “Knowledge is experience.
Everything else is just information.” Repositories of informa-
tion are not knowledge, and they often inhibit organizations
from sharing important knowledge building blocks that affect
technical, social, managerial, and personal developments that
are critical for learning organizations (McDermott, 2000).
Ultimately, these communities of practice are forming new social
networks, which have established the cornerstone of “global connectiv-
ity, virtual communities, and computer-supported cooperative work”
(Wellman et al., 2000, p. 179). These social networks are creating
new cultural assimilation issues, changing the very nature of the way
organizations deal with and use technology to change how knowledge
develops and is used via communities of practice. It is not, therefore,
that communities of practice are new infrastructure or social forces;
rather, the difference is in the way they communicate. Strategic inte-
gration forces new networks of communication to occur (the IT effect
on communities of practice), and the cultural assimilation component
requires communities of practice to focus on how emerging technolo-
gies are to be adopted and used within the organization.
In sum, what we are finding is that technology creates the need
for new organizations that establish communities of practice. New
members enter the community and help shape its cognitive schemata.
Aldrich (2001) defines cognitive schemata as the “structure that repre-
sents organized knowledge about persons, roles, and events” (p. 148).
This is a significant construct in that it promotes the importance of a
balanced evolutionary behavior among these three areas. Rapid learn-
ing, or organizational knowledge, brought on by technological inno-
vations can actually lessen progress because it can produce premature
closure (March, 1991). Thus, members emerge out of communities of
practice that develop around organizational tasks. They are driven by
technological innovation and need constructs to avoid premature clo-
sure, as well as ongoing evaluation of perceived versus actual realities.
As Brown and Duguid (1991, p. 40) state:
The complex of contradictory forces that put an organization’s assump-
tions and core beliefs in direct conflict with members’ working, learn-
ing, and innovating arises from a thorough misunderstanding of what
working, learning, and innovating are. As a result of such misunder-
standings, many modern processes and technologies, particularly those
designed to downskill, threaten the robust working, learning, and inno-
vating communities and practice of the workplace.
This perspective can be historically justified. We have seen time
and time again how a technology’s original intention is not realized, yet the technology is still productive. For instance, many uses of e-mail by individuals were hard to predict. It may indeed be difficult, if not impossible, to predict the eventual impact of a technology on an organization or the competitive advantages it may provide. However, based on evolutionary
theories, it may be beneficial to allow technologies to progress from
driver-to-supporter activity. Specifically, this means that communi-
ties of practice can provide the infrastructure to support growth from
individual-centered learning; that is, to a less event-driven process
that can foster systems thinking, especially at the management levels
of the organization. As organizations evolve into what Aldrich (2001)
calls “bounded entities,” interaction behind boundaries heightens the
salience of cultural difference. Aldrich’s analysis of knowledge cre-
ation is consistent with what he called an “adaptive organization”—one
that is goal oriented and learns from trial and error (individual-based
learning)—and a “knowledge development” organization (system-
level learning). The latter consists of a set of interdependent members
who share patterns of belief. Such an organization uses inferential and
vicarious learning and generates new knowledge from both experi-
mentation and creativity. Specifically, learning involves sense mak-
ing and builds on the knowledge development of its members. This
becomes critical to ROD, especially in dealing with change driven
by technological innovations. The advantages and challenges of vir-
tual teams and communities of practice are expanded in Chapter 7, in
which I integrate the discussion with the complexities of outsourcing
teams.
Learning Preferences and Experiential Learning
The previous sections of this chapter focused on organizational learn-
ing, particularly two component theories and methods: learning
organizations and communities of practice. Within these two meth-
ods, I also addressed the approaches to learning; that is, learning that
occurs on the individual and the organizational levels. I advocated
the position that both system and individual learning need to be part
of the equation that allows a firm to attain ROD. Notwithstanding
how and when system and individual learning occurs, the investi-
gation of how individuals learn must be a fundamental part of any
theory-to-practice effort, such as the present one. Indeed, whether
one favors a view of learning as occurring on the organizational or
on the individual level (and it occurs on both), we have to recog-
nize that individuals are, ultimately, those who must continue to
learn. Dewey (1933) first explored the concepts and values of what
he called “experiential learning.” This type of learning comes from
the experiences that adults have accrued over the course of their
individual lives. These experiences provide rich and valuable forms
of “literacy,” which must be recognized as important components
to overall learning development. Kolb (1984a) furthered Dewey’s
research and developed an instrument that measures individual
preferences or styles in which adults learn, and how they respond
to day-to-day scenarios and concepts. Kolb’s (1999) Learning Style
Inventory (LSI) instrument allows adults to better understand how
they learn. It helps them understand how to solve problems, work in
teams, manage conflicts, make better career choices, and negotiate
personal and professional relationships. Kolb’s research provided a
basis for comprehending the different ways in which adults prefer to
learn, and it elaborated the distinct advantages of becoming a bal-
anced learner.
The instrument schematizes learning preferences and styles into
four quadrants: concrete experience, reflective observation, abstract conceptualization, and active experimentation. Adults who prefer to learn
through concrete experience are those who need to learn through
actual experience, or compare a situation with reality. In reflective
observation, adults prefer to learn by observing others, the world
around them, and what they read. These individuals excel in group
discussions and can effectively reflect on what they see and read.
Abstract conceptualization refers to learning based on the assimilation of facts and information that are presented and read. Those who prefer
to learn by active experimentation do so through a process of evaluat-
ing consequences; they learn by examining the impact of experimen-
tal situations. For any individual, these learning styles often work in
combinations. After classifying an individual’s responses to questions,
Kolb’s instrument determines the nature of these combinations. For
example, an individual can have a learning style in which he or she
prefers to learn from concrete experiences using reflective observation
as opposed to actually “doing” the activity. Figure 4.3 shows Kolb’s
model in the form of a “learning wheel.” The wheel graphically shows
an individual’s learning style inventory, reflecting a person’s strengths
and weaknesses with respect to each learning style.
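For readers who prefer a concrete rendering, the four preferences and the idea of a "combined" style can be sketched in code. The sketch below is purely illustrative; it is not Kolb's LSI instrument, and its simple tallying stands in for the inventory's actual scoring.

from enum import Enum
from collections import Counter

class LearningMode(Enum):
    CONCRETE_EXPERIENCE = "concrete experience"
    REFLECTIVE_OBSERVATION = "reflective observation"
    ABSTRACT_CONCEPTUALIZATION = "abstract conceptualization"
    ACTIVE_EXPERIMENTATION = "active experimentation"

def dominant_combination(responses):
    # Tally how often each mode was chosen and return the two most frequent,
    # a rough stand-in for the combined preferences discussed above.
    counts = Counter(responses)
    return tuple(mode.value for mode, _ in counts.most_common(2))

# A hypothetical respondent who leans on concrete experience processed through
# reflective observation rather than active "doing."
sample = ([LearningMode.CONCRETE_EXPERIENCE] * 5
          + [LearningMode.REFLECTIVE_OBSERVATION] * 4
          + [LearningMode.ACTIVE_EXPERIMENTATION] * 1)
print(dominant_combination(sample))  # ('concrete experience', 'reflective observation')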
Kolb’s research suggests that learners who are less constrained by
learning preferences within a distinct style are more balanced and are
better learners because they have available to them more dimensions
in which to learn. This is a significant concept; it suggests that adults
who have strong preferences may not be able to learn when faced with
learning environments that do not fit their specific preference. For
example, an adult who prefers group discussion and enjoys reflective
conversation with others may feel uncomfortable in a less interper-
sonal, traditional teaching environment. The importance of Kolb’s
LSI is that it helps adults become aware that such preferences exist.
McCarthy’s (1999) research furthers Kolb’s work by investigating
the relationship between learning preferences and curriculum devel-
opment. Her Learning Type Measure (4Mat) instrument mirrors
and extends the Kolb style quadrants by expressing preferences from
an individual’s perspective on how to best achieve learning. Another
important contribution in McCarthy’s extension of Kolb’s work is the
inclusion of brain function considerations, particularly in terms of
hemisphericity. McCarthy focuses on the cognitive functions asso-
ciated with the right hemisphere (perception) and left hemisphere
(process) of the brain.
Figure 4.3 Kolb's Learning Style Inventory. [The figure shows a four-quadrant learning wheel: concrete experience (learns from hands-on experience), reflective observation (observes a concrete situation and reflects on its meaning), abstract conceptualization (interested in abstract ideas and concepts), and active experimentation (seeks to find practical uses for ideas and theories).]
Her 4Mat system shows how adults, in each style quadrant, perceive
learning with the left hemisphere of the brain and how it is related to
processing in the right hemisphere.
For example, for Type 1 learners (concrete experience and reflective
observation), adults perceive in a concrete way and process in a reflec-
tive way. In other words, these adults prefer to learn by actually doing
a task and then processing the experience by reflecting on what they
experienced during the task. Type 2 learners (reflective observation
and abstract conceptualization), however, perceive a task by abstract
thinking and process it by developing concepts and theories from
their initial ideas. Figure 4.4 shows McCarthy's rendition of the
Kolb learning wheel.
Figure 4.4 McCarthy rendition of the Kolb Learning Wheel. [The figure maps the four quadrants (QI-QIV) of the wheel to the guiding questions Why?, What?, How?, and If?, with corresponding emphases on meaning, concepts, skills, and adaptations.]
The practical claim to make here is that practitioners who acquire
an understanding of the concepts of the experiential learning models
will be better able to assist individuals in understanding how
they learn, how to use their learning preferences during times of
transition, and the importance of developing other dimensions of
learning. The last is particularly useful in developing expertise in
learning from individual reflective practices, learning as a group
in communities of practice, and participating in both individual
transformative learning, and organizational transformations. How,
then, does experiential learning operate within the framework of
organizational learning and technology? This is shown in Figure 4.5
as a combined wheel, called the applied individual learning for
technology model, which creates a conceptual framework for linking the
technology life cycle with organizational learning and experiential
learning constructs.
Figure 4.5 Combined applied learning wheel. [The wheel maps each quadrant to a technology life-cycle phase and a learning construct: QI, feasibility–Why? (communities of practice); QII, measurement and analysis–What? (transformative learning); QIII, planning and design–How? (knowledge management); QIV, creation/implementation–What if? (action learning).]
Figure 4.5 expands the wheel into two other dimensions. The
first quadrant (QI) represents the feasibility stage of technology. It
requires communities to work together, to ascertain why a particular
technology might be attractive to the organization. This quadrant is
best represented by individuals who engage in group discussions to
make better connections from their own experiences. The process
of determining whether a technology is feasible requires integrated
discourse among affected communities, who then can make better
decisions, as opposed to centralized or individual and predetermined
decisions on whether to use a specific technology. During this phase,
individuals need to operate in communities of practice, as the infra-
structure with which to support a democratic process of consensus
building.
The second quadrant (QII) corresponds to measurement and analy-
sis. This operation requires individuals to engage in specific details
to determine and conceptualize driver and supporter life cycles ana-
lytically. Individuals need to examine the specific details to under-
stand “ what” the technology can do, and to reflect on what it means to
them, and their business unit. This analysis is measured with respect
to what the ROI will be, and which driver and supporter functions
will be used. This process requires transformation theory that allows
individuals to perceive and conceptualize which components of the
technology can transform the organization.
Quadrant 3 (QIII), design and planning, defines the “how”
component of the technology life cycle. This process involves explor-
ing technology opportunities after measurement and analysis have
been completed. The process of determining potential uses for
technology requires knowledge of the organization. Specifically, it
needs the abstract concepts developed in QII to be integrated with
tacit knowledge, to then determine possible applications where the
technology can succeed. Thus, knowledge management becomes the
predominant mechanism for translating what has been conceptual-
ized into something explicit (discussed further in Chapter 5).
Quadrant 4 (QIV) represents the implementation-and-creation
step in the technology life cycle. It addresses the hypothetical ques-
tion of “What if?” This process represents the actual implementation
of the technology. Individuals need to engage in action learning tech-
niques, particularly those of reflective practices. The implementation
step in the technology life cycle is heavily dependent on the indi-
vidual. Although there are levels of project management, the essential
aspects of what goes on inside the project very much relies on the
individual performances of the workers.
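The quadrant structure just described can be summarized as a small lookup table. The following sketch is illustrative only; the field names are shorthand for the life-cycle phase, guiding question, and learning construct associated with each quadrant in the text, not terminology from the model itself.

from dataclasses import dataclass

@dataclass(frozen=True)
class Quadrant:
    phase: str
    question: str
    learning_construct: str

# Quadrants of the applied individual learning for technology model.
APPLIED_LEARNING_WHEEL = {
    "QI": Quadrant("feasibility", "Why?", "communities of practice"),
    "QII": Quadrant("measurement and analysis", "What?", "transformative learning"),
    "QIII": Quadrant("planning and design", "How?", "knowledge management"),
    "QIV": Quadrant("implementation and creation", "What if?",
                    "action learning (reflective practices)"),
}

def construct_for(phase_keyword):
    # Return the learning construct the model associates with a life-cycle phase.
    for quadrant in APPLIED_LEARNING_WHEEL.values():
        if phase_keyword.lower() in quadrant.phase:
            return quadrant.learning_construct
    raise KeyError(phase_keyword)

print(construct_for("feasibility"))  # communities of practice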
Social Discourse and the Use of Language
The successful implementation of communities of practice depends
heavily on social structures. Indeed, without understand-
ing how social discourse and language behave, creating and sustaining
the internal interactions within and among communities of practice
are not possible. In taking individuals as the central component for
continued learning and change in organizations, it becomes impor-
tant to work with development theories that can measure and support
individual growth and can promote maturation together with
organizational/system thinking (Watkins & Marsick, 1993). Thus,
the basis for establishing a technology-driven world requires the inclu-
sion of linear and circular ways of promoting learning. While there
is much that we will use from reflective action concepts designed by
Argyris and Schön (1996), it is also crucial to incorporate other theo-
ries, such as marginality, transitions, and individual development.
Senge (1990) also compares learning organizations with engineer-
ing innovation; he calls these engineering innovations “technologies.”
However, he also relates innovation to human behavior and distin-
guishes it as a “discipline.” He defines discipline as “a body of theory
and technique that must be studied and mastered to be put into prac-
tice, as opposed to an enforced order or means of punishment” (p. 10).
A discipline, according to Senge, is a developmental path for acquir-
ing certain skills or competencies. He maintains the concept that cer-
tain individuals have an innate “gift”; however, anyone can develop
proficiency through practice. To practice a discipline is a lifelong
learning process—in contrast to the work of a learning organization.
Practicing a discipline is different from emulating a model. This book
attempts to bring the arenas of discipline and technology into some
form of harmony. What technology offers is a way of addressing the
differences that Senge proclaims in his work. Perhaps this is what is
so interesting and challenging about attempting to apply and under-
stand the complexities of how technology, as an engineering innova-
tion, affects the learning organization discipline—and thereby creates
a new genre of practices. After all, I am not sure that one can master
technology as either an engineering component, or a discipline.
Technology dynamism and ROD expand the context of the glo-
balizing forces that have added to the complexity of analyzing “the
language and symbolic media we employ to describe, represent,
interpret, and theorize what we take to be the facticity of organi-
zational life” (Grant et al., 1998, p. 1). ROD needs to create what
I call the “language of technology.” How do we then incorporate
technology in the process of organizing discourse, or how has tech-
nology affected that process? We know that the concept of dis-
course includes language, talk, stories, and conversations, as well
as the very heart of social life, in general. Organizational discourse
goes beyond what is just spoken; it includes written text and other
informal ways of communication. Unfortunately, the study of dis-
course is seen as being less valuable than action. Indeed, discourse
is seen as a passive activity, while “doing” is seen as supporting
more tangible outcomes. However, technology has increased the
importance of sensemaking media as a means of constructing and
understanding organizational identities. In particular, technology,
specifically the use of e-mail, has added to the instability of lan-
guage, and the ambiguities associated with metaphorical analysis—
that is, meaning making from language as it affects organizational
behavior. Another way of looking at this issue is to study the meta-
phor, as well as the discourse, of technology. Technology is actually
less understood today, a situation that creates even greater reason
than before for understanding its metaphorical status in organiza-
tional discourse—particularly with respect to how technology uses
are interpreted by communities of practice. This is best shown using
the schema of Grant et al. of the relationship between content and
activity and how, through identity, skills, and emotion, it leads to
action (Figure 4.6).
To best understand Figure 4.6 and its application to technology,
it is necessary to understand the links between talk and action. It
is the activity and content of conversations that discursively produce
identities, skills, and emotions, which in turn lead to action. Talk,
in respect to conversation and content, implies both oral and writ-
ten forms of communications, discourse, and language. The written
aspect can obviously include technologically fostered communications
over the Internet. It is then important to examine the unique condi-
tions that technology brings to talk and its corresponding actions.
Identity
Individual identities are established in collaborations on a team, or
in being a member of some business committee. Much of the theory
of identity development is related to how individuals see themselves,
particularly within the community in which they operate. Thus, how
active or inactive we are within our communities, shapes how we see
ourselves and how we deal with conversational activity and content.
Empowerment is also an important part of identity. Indeed, being
excluded or unsupported within a community establishes a different
identity from other members of the group and often leads to margin-
ality (Schlossberg, 1989).
Identities are not only individual but also collective, which to
a large extent contributes to cultures of practice within organiza-
tional factions. It is through common membership that a collec-
tive identity can emerge. Identity with the group is critical during
discussions regarding emerging technologies and determining how
they affect the organization. The empowerment of individuals, and
the creation of a collective identity, are therefore important in fos-
tering timely actions that have a consensus among the involved
community.
Figure 4.6 Grant's schema: relationship between content and activity. [Conversational activity and conversational content produce identity, skills, and emotions, which in turn lead to action.]
Skills
According to Hardy et al. (1998, p. 71), conversations are “arenas in
which particular skills are invested with meaning.” Watson (1995)
suggests that conversations not only help individuals acquire “techni-
cal skills” but also help develop other skills, such as being persuasive.
Conversations that are about technology can often be skewed toward
the recognition of those individuals who are most “technologically
talented.” This can be a problem when discourse is limited to who
has the best “credentials” and can often lead to the undervaluing of
social production of valued skills, which can affect decisions that lead
to actions.
Emotion
Given that technology is viewed as a logical and rational field, the
application of emotion is not often considered a factor of action.
Fineman (1996) defines emotion as “personal displays of affected, or
‘moved’ and ‘agitated’ states—such as joy, love, fear, anger, sadness,
shame, embarrassment,”—and points out that these states are socially
constructed phenomena. There is a positive contribution from emo-
tional energy as well as a negative one. The consideration of positive
emotion in the organizational context is important because it drives
action (Hardy et al., 1998). Indeed, action is driven more by emotion
than by rational calculation. Unfortunately, the study of emotion often
focuses on its negative aspects. Emotion, however, is an important part of how
action is established and carried out, and therefore warrants attention
in ROD.
Identity, skills, and emotion are important factors in how talk actu-
ally leads to action. Theories that foster discourse, and its use in orga-
nizations, on the other hand, are built on linear paths of talk and
action. That is, talk can lead to action in a number of predefined paths.
Indeed, talk is typically viewed as “cheap” without action or, as is often
said, “action is valued,” or “action speaks louder than words.” Talk,
from this perspective, constitutes the dynamism of what must occur
with action science, communities of practice, transformative learn-
ing, and, eventually, knowledge creation and management. Action,
by contrast, can be viewed as the measurable outcomes that have been
eluding organizational learning scholars. However, not all actions
lead to measurable outcomes. Marshak (1998) established three types
of talk that lead to action: tool-talk, frame-talk, and mythopoetic-talk:
1. Tool-talk includes "instrumental communities required to:
discuss, conclude, act, and evaluate outcomes” (p. 82). What
is most important in its application is that tool-talk be used to
deal with specific issues for an identified purpose.
2. Frame-talk focuses on interpretation to evaluate the mean-
ings of talk. Using frame-talk results in enabling implicit and
explicit assessments, which include symbolic, conscious, pre-
conscious, and contextually subjective dimensions.
3. Mythopoetic-talk communicates ideogenic ideas and images
(i.e., myths and cosmologies) that can be used to communicate
the nature of how to apply tool-talk and frame-talk within the
particular culture or society. This type of talk allows for con-
cepts of intuition and ideas for concrete application.
Furthermore, it has been shown that organizational members
experience a difficult and ambiguous relationship, between discourse
that makes sense, and non-sense—what is also known as “the struggle
with sense” (Grant et al., 1998). There are two parts that comprise
non-sense: The first is in the difficulties that individuals experience in
understanding why things occur in organizations, particularly when
their actions “make no sense.” Much of this difficulty can be cor-
related with political issues that create “nonlearning” organizations.
However, the second condition of non-sense is more applicable, and
more important, to the study of ROD than the first—that is, non-
sense associated with acceleration in the organizational change pro-
cess. This area comes from the taken-for-granted assumptions about
the realities of how the organization operates, as opposed to how it can
operate. Studies performed by Wallemacq and Sims (1998) provide
examples of how organizational interventions can decompose stories
about non-sense and replace them with new stories that better address
a new situation and can make sense of why change is needed. This
phenomenon is critical to changes established, or responded to, by the
advent of new technologies. Indeed, technology has many nonsensi-
cal or false generalizations regarding how long it takes to implement
a product, what might be the expected outcomes, and so on. Given
the need for ROD—due to the advent of technology—there is a con-
comitant need to reexamine “old stories” so that the necessary change
agents can be assessed and put into practice. Ultimately, the challenge
set forth by Wallemacq and Sims is especially relevant, and critical,
since the very definition of ROD suggests that communities need
to accelerate the creation of new stories—stories that will occur at
unpredictable intervals. Thus, the link between discourse, organiza-
tional learning, and technology is critical to providing ways in which
to deal with individuals and organizations facing the challenge of
changing and evolving.
Grant’s (1996) research shows that sense making using media and
stories provided effective ways of constructing and understanding
organizational identities. Technology affects discourse in a similar
way that it affects communities of practice; that is, it is a variable that
affects the way discourse is used for organizational evolution. It also
provides new vehicles on how such discourse can occur. However, it is
important not to limit discourse analysis to merely being about “texts,”
emotion, stories, or conversations in organizations. Discourse analysis
examines “the constructing, situating, facilitating, and communicat-
ing of diverse cultural, instrumental, political, and socio-economic
parameters of ‘organizational being’” (Grant, 1996, p. 12). Hence,
discourse is the essential component of every organizational learn-
ing effort. Technology accelerates the need for such discourse, and
language, in becoming a more important part of the learning matura-
tion process, especially in relation to “system” thinking and learning.
I propose then, as part of a move toward ROD, that discourse theories
must be integrated with technological innovation and be part of the
maturation in technology and in organizational learning.
The overarching question is how to apply these theories of dis-
course and language to learning within the ROD framework and par-
adigm. First, let us consider the containers of types of talk discussed
by Marshak (1998) as shown in Figure 4.7.
These types of talk can be mapped onto the technology wheel, so that
the most appropriate oral and written behaviors can be set forth within
each quadrant, and development life cycle, as shown in Figure 4.8.
Mythopoetic-talk is most appropriate in Quadrant 1 (QI), where
the fundamental ideas and issues can be discussed in communities of
practice. These technological ideas and concepts, deemed feasible, are
then analyzed through frame-talk, by which the technology can be
evaluated in terms of how it meets the fundamental premises estab-
lished in QI. Frame-talk also reinforces the conceptual legitimacy
of how technology will transform the organization while provid-
ing appropriate ROI. Tool-talk represents the process of identifying
applications and actually implementing them. For this reason, tool-
talk exists in both QIII and QIV.
Figure 4.7 Marshak's type of talk containers. [The figure nests the three containers of talk: mythopoetic-talk (ideogenic), frame-talk (interpretive), and tool-talk (instrumental).]
Figure 4.8 Marshak's model mapped to the technology learning wheel. [QI, feasibility–Why?: mythopoetic-talk, grounding ideas using communities of practice; QII, measurement and analysis–What?: frame-talk, transformative; QIII, planning and design–How?: tool-talk, discuss-decide using knowledge management; QIV, implementation–What if?: tool-talk, doing using reflective practices.]
The former quadrant represents
the discussion-to-decision portion, and the latter represents the actual
doing and completion of the project itself. In QIII, tool-talk requires
knowledge management to transition technology concepts into real
options. QIV transforms these real options into actual projects, in
which reflecting on actual practices during implementation provides
an opportunity for individual- and organizational-level learning.
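A hedged sketch of this mapping, reusing the quadrant labels from the earlier wheel, might look as follows; the short purpose strings are paraphrases of the roles described above, not Marshak's wording.

# Marshak's talk types mapped to the technology learning wheel (Figures 4.7 and 4.8).
TALK_BY_QUADRANT = {
    "QI": ("mythopoetic-talk", "ground ideas within communities of practice"),
    "QII": ("frame-talk", "evaluate concepts against premises, transformation, and ROI"),
    "QIII": ("tool-talk", "discuss and decide, supported by knowledge management"),
    "QIV": ("tool-talk", "do, using reflective practices during implementation"),
}

for quadrant, (talk, purpose) in TALK_BY_QUADRANT.items():
    print(f"{quadrant}: {talk} -- {purpose}")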
Marshak’s (1998) concept of containers and cycles of talk and
action are adapted and integrated with cyclical and linear matu-
rity models of learning. However, discourse and language must
be linked to performance, which is why it needs to be part of the
discourse and language-learning wheel. By integrating discourse
and language into the wheel, individual and group activities can
use discourse and language as part of ref lective practices to create
an environment that can foster action that leads to measurable
outcomes. This process, as explained throughout this book, is of
paramount importance in understanding how discourse operates
with ROD in the information age.
Linear Development in Learning Approaches
Focusing only on the role of the individual in the company is an incom-
plete approach to formulating an effective learning program. There is
another dimension to consider that is based on learning maturation.
That is, where in the life cycle of learning are the individuals and the
organization? The best explanation of this concept is the learning mat-
uration experience at Ravell. During my initial consultation at Ravell,
the organization was at a very early stage of organizational learning.
This was evidenced by the dependence of the organization on event-
driven and individual reflective practice learning. Technology acted
as an accelerator of learning—it required IT to design a new network
during the relocation of the company. Specifically, the acceleration,
operationalized by a physical move, required IT to establish new rela-
tionships with line management. The initial case study concluded that
there was a cultural change as a result of these new relationships—
cultural assimilation started to occur using organizational learning
techniques, specifically reflective practices.
After I left Ravell, another phase in the evolution of the company
took place. A new IT director was hired in my stead, who attempted
to reinstate the old culture: centralized infrastructure, stated opera-
tional boundaries, and separations that mandated anti-learning orga-
nizational behaviors. After six months, the line managers, faced with
having to revert back to a former operating culture, revolted and
demanded the removal of the IT director. This outcome, regrettable
as it may be, is critical in proving the conclusion of the original study
that the culture at Ravell had indeed evolved from its state, at the time
of my arrival. The following are two concrete examples that support
this notion:
1. The attempt of the new IT director to “roll back” the process
to a former cultural state was unsuccessful, showing that a
new evolving culture had indeed occurred.
2. Line managers came together from the established learning
organization to deliver a concerted message to the execu-
tive team. Much of their learning had now shifted to a social
organization level that was based less on events and was
more holistic with respect to the goals and objectives of the
organization.
Thus, we see a shift from an individual-based learning process
to one that is based more on the social and organizational issues to
stimulate transformation. This transformation in learning method
occurred within the same management team, suggesting that changes
in learning do occur over time and from experience. Another way of
viewing the phenomenon is to see Ravell as reaching the next level of
organizational learning or maturation with learning. Consistent with
the conclusion of the original study, technology served to accelerate
the process of change or accelerate the maturation process of organi-
zational learning.
Another phase (Phase II) of Ravell transpired after I returned
to the company. I determined at that time that the IT department
needed to be integrated with another technology-based part of the
business—the unit responsible for media and engineering services
(as opposed to IT). While I had suggested this combination eight
months earlier, the organization had not reached the learning matu-
ration to understand why such a combination was beneficial. Much
of the reason it did not occur earlier, can also be attributed to the
organization’s inability to manage ROD, which, if implemented,
would have made the integration more obvious. The initial Ravell
study served to bring forth the challenges of cultural assimilation,
to the extent that the organization needed to reorganize itself and
change its behavior. In phase II, the learning process matured by
accelerating the need for structural change in the actual reporting
processes of IT.
A year later, yet another learning maturation phase (phase III)
occurred. In Ravell, Phase III, the next stage of learning matura-
tion, allowed the firm to better manage ROD. After completing
the merger of the two technically related business units discussed
(phase II), it became necessary to move a core database depart-
ment completely out of the combined technology department, and
to integrate it with a business unit. The reason for this change was
compelling and brought to light a shortfall in my conclusions from
the initial study. It appears that as organizational learning matures
within ROD, there is an increasing need to educate the executive
management team of the organization. This was not the case during
the early stages of the case study. The limitation of my work, then,
was that I predominantly interfaced with line management and
neglected to include executives in the learning. During that time,
results were encouraging, so there was little reason for me to include
executives in event-driven issues, as discussed. Unfortunately, lack-
ing their participation fostered a disconnection with the strategic
integration component of ROD. Not participating in ROD created
executive ignorance of the importance that IT had on the strategy of
the business. Their lack of knowledge resulted in chronic problems
with understanding the relationship and value of IT on the business
units of the organization. This shortcoming resulted in continued
conflicts over investments in the IT organization. It ultimately left
IT with the inability to defend many of its cost requirements. As
stated, during times of economic downturns, firms tend to reduce
support organizations. In other words, executive management did
not understand the driver component of IT.
After the move of the cohort of database developers to a formal
business line unit, the driver components of the group provided
the dialogue and support necessary to educate executives. However,
this education did not occur based on events, but rather, on using
the social and group dynamics of organizational learning. We see
here another aspect of how organizational and individual learning
methods work together, but evolve in a specific way, as summarized
in Table 4.2.
Table 4.2 Analysis of Ravell's Maturation with Technology

Type of learning. Phase I: Individual reflective practices used to establish operations and line management. Phase II: Line managers defend the new culture and participate in less event-driven learning. Phase III: Movement away from a holistic formation of IT into separate driver and supporter attributes; learning approaches are integrated using both individual and organizational methods and are based on functionality as opposed to being organizationally specific.

Learning outcomes. Phase I: Early stage of learning organization development. Phase II: Combination of event-driven and early-stage social organizational learning formation. Phase III: Movement toward social-based organizational decision making, relative to the different uses of technology.

Responsive organizational dynamism: cultural assimilation. Phase I: Established new culture; no change in organizational structure. Phase II: Cultural assimilation stability with existing structures; early phase of IT organizational integration with similar groups. Phase III: Mature use of cultural assimilation, based on IT behaviors (drivers and supporters).

Responsive organizational dynamism: strategic integration. Phase I: Limited integration due to lack of executive involvement. Phase II: Early stages of value/needs based on similar strategic alignment. Phase III: Social structures emphasize strategic integration based on business needs.

Another way of representing the relationship between individual
and organizational learning over time is to chart a "maturity" arc
to illustrate the evolutionary life cycle of technology and
organizational learning. I call this arc the ROD arc. The arc is designed
to assess individual development in four distinct sectors of ROD, each
in relation to five developmental stages of organizational learning.
Thus, each sector of ROD can be measured in a linear and
integrated way. Each stage in the course of the learning development
of an organization reflects an underlying principle that guides the
process of ROD norms and behaviors; specifically, it guides orga-
nizations in how they view and use the ROD components available
to them.
The arc is a classificatory scheme that identifies progressive
stages in the assimilated uses of ROD. It reflects the perspective—
paralleling Knefelkamp’s (1999) research—that individuals in an
organization are able to move through complex levels of thinking,
and to develop independence of thought and judgment, as their
careers progress within the management structures available to
them. Indeed, assimilation to learning at specific levels of opera-
tions and management is not necessarily an achievable end but
one that fits into the psychological perspective of what productive
employees can be taught about ROD adaptability. Figure 4.9 illus-
trates the two axes of the arc.
The profile of an individual who assimilates the norms of ROD
can be characterized in five developmental stages (vertical axis)
along four sectors of literacy (horizontal axis). The arc character-
izes an individual at a specific level in the organization. At each
level, the arc identifies individual maturity with ROD, specifically
strategic integration, cultural assimilation, and the type of learning
process (i.e., individual vs. organizational). The arc shows how each
tier integrates with another, what types of organizational learning
theory best apply, and who needs to be the primary driver within
the organization. Thus, the arc provides an organizational schema
for how each conceptual component of organizational learning
applies to each sector of ROD. It also identifies and constructs a
path for those individuals who want to advance in organizational
rank; that is, it can be used to ascertain an individual’s ability to
cope with ROD requirements as a precursor for advancement in
management. Each position within a sector, or cell, represents a
specific stage of development within ROD. Each cell contains spe-
cific definitions that can be used to identify developmental stages
of ROD and organizational learning maturation. Figure 4.10 represents
the ROD arc with its cell definitions.
Figure 4.9 Responsive organizational dynamism arc model. [The arc plots four sectors of responsive organizational dynamism (strategic integration, cultural assimilation, organizational learning constructs, and management level) against five developmental stages: operational knowledge, department/unit view as other, integrated disposition, stable operations, and organizational leadership.]
Figure 4.10 Responsive organizational dynamism arc. [The figure fills in the arc's cell definitions: each sector variable (strategic integration, cultural assimilation, organizational learning constructs, and management level) is defined at each of the five developmental stages, from operational knowledge through organizational leadership. The organizational learning constructs progress from individual-based reflective practice, to small group-based reflective practices, to communities of practice involving individuals and middle management, to social discourse between middle management and executives to promote transformation, to organizational learning at the executive level using knowledge management. The corresponding management levels progress from operations, to operations and middle management, to middle management, to middle management and executive, to executive.]
The five stages of the arc are outlined as follows:
1. Operational knowledge: Represents the capacity to learn, con-
ceptualize, and articulate key issues relating to how technology
can have an impact on existing processes and organizational
structure. Organizational learning is accomplished through
individual learning actions, particularly reflective practices.
This stage typically is the focus for operations personnel, who
are usually focused on their personal perspectives of how
technology affects their daily activities.
2. Department/unit view as other: Indicates the ability to inte-
grate points of view about using technology from diverse indi-
viduals within the department or business unit. Using these
new perspectives, the individual is in position to augment
his or her understanding of technology and relate it to others
within the unit. Operations personnel participate in small-
group learning activities, using reflective practices. Lower
levels of middle managers participate in organizational learn-
ing that is in transition, from purely individual to group-level
thinking.
3. Integrated disposition: Recognizes that individual and depart-
mental views on using technology need to be integrated to
form effective business unit objectives. Understanding that
organizational and cultural shifts need to include all mem-
ber perspectives, before formulating departmental decisions,
organizational learning is integrated with middle managers,
using communities of practice at the department level.
4. Stable operations: Develops in relation to competence in sec-
tors of ROD appropriate for performing job duties for emerg-
ing technologies, not merely adequately, but competitively,
with peers and higher-ranking employees in the organization.
Organizational learning occurs at the organizational level
and uses forms of social discourse to support organizational
transformation.
5. Organizational leadership: Ability to apply sectors of ROD to
multiple aspects of the organization. Department concepts
can be propagated to organizational levels, including strate-
gic and cultural shifts, relating to technology opportunities.
Organizational learning occurs using methods of knowledge
management with executive support. Individuals use their
technology knowledge for creative purposes. They are will-
ing to take risks using critical discernment and what Heath
(1968) calls “freed” decision making.
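As a rough illustration of the arc's linear character, the five stages can be paired with the learning construct and management level emphasized at each stage. The encoding below is a simplification of the cell definitions in Figure 4.10 and is illustrative only.

# Each tuple: (stage, dominant organizational learning construct, management level).
ROD_ARC = [
    ("operational knowledge",
     "individual-based reflective practice", "operations"),
    ("department/unit view as other",
     "small group-based reflective practices", "operations and middle management"),
    ("integrated disposition",
     "communities of practice at the department level", "middle management"),
    ("stable operations",
     "social discourse between middle management and executives",
     "middle management and executive"),
    ("organizational leadership",
     "knowledge management with executive support", "executive"),
]

def next_stage(stage_name):
    # Return the following maturity stage; the top stage has no successor.
    names = [name for name, _, _ in ROD_ARC]
    index = names.index(stage_name)
    return names[index + 1] if index + 1 < len(names) else names[index]

print(next_stage("integrated disposition"))  # stable operations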
The ROD arc addresses both individual and organizational
learning. There are aspects of Senge’s (1990) “organizational”
approach that are important and applicable to this model. I
have mentioned its appropriateness in regard to the level of the
manager—suggesting that the more senior manager is better posi-
tioned to deal with nonevent learning practices. However, there is
yet another dimension within each stage of matured learning. This
dimension pertains to timing. The timing dimension focuses on
a multiple-phase approach to maturing individual and organiza-
tional learning approaches. The multiple phasing of this approach
suggests a maturing or evolutionary learning cycle that occurs
over time, in which individual learning fosters the need and the
acceptance of organizational learning methods. This process can
be applied within multiple tiers of management and across differ-
ent business units.
The ROD arc can also be integrated with the applied individual
learning wheel. The combined models show the individual’s cycle of
learning along a path of maturation. This can be graphically shown
to reflect how the wheel turns and moves along the continuum of the
arc (Figure 4.11).
Figure 4.11 shows that an experienced technology learner can
maximize learning by utilizing all four quadrants in each of the
maturity stages. It should be clear that certain quadrants of indi-
vidual learning are more important to specific stages on the arc.
However, movement through the arc is usually not symmetrical;
that is, individuals do not move equally from stage to stage, within
the dimensions of learning (Langer, 2003). This integrated and
multiphase method uses the applied individual learning wheel
with the arc. At each stage of the arc, an individual will need
to draw on the different types of learning that are available in
the learning wheel. Figure 4.12 provides an example of this con-
cept, which Knefelkamp calls “multiple and simultaneous” (1999),
meaning that learning can take on multiple meanings across dif-
ferent sectors simultaneously.
Figure 4.12 shows that the dimension variables are not necessarily
parallel in their linear maturation. This phenomenon is not unusual
with linear models, and in fact, is quite normal. However, it also reflects
the complexity of how variables mature, and the importance of having
the capability and infrastructure to determine how to measure such
levels of maturation within dimensions. There are both qualitative
and quantitative approaches to this analysis.
Figure 4.11 ROD arc with applied individual learning wheel. [The figure repeats the four-quadrant applied individual learning wheel along the arc's five maturity stages (operational knowledge, department/unit view as other, integrated disposition, stable operations, and organizational leadership), indicating increased levels of maturity with organizational dynamism.]
Figure 4.12 Sample ROD arc. [The sample arc plots the dimension variables (strategic integration, cultural assimilation, organizational learning constructs, and management level) across the five developmental stages, showing that their maturation is not necessarily parallel.]
Qualitative approaches typically include interviewing, ethnographic-type experiences over
some predetermined time period, individual journals or diaries, group
meetings, and focus groups. Quantitative measures involve the cre-
ation of survey-type measures; they are based on statistical results
from answering questions that identify the level of maturation of the
individual.
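A quantitative instrument of this kind could be prototyped very simply. The sketch below is hypothetical: the Likert items, the averaging, and the thresholds are invented for illustration and are not a validated measure of ROD maturation.

STAGES = ["operational knowledge", "department/unit view as other",
          "integrated disposition", "stable operations", "organizational leadership"]

def stage_from_scores(scores, max_score=5.0):
    # Average 1-5 Likert responses for one dimension and bucket the result
    # into one of the five arc stages.
    mean = sum(scores) / len(scores)
    index = min(int(mean / max_score * len(STAGES)), len(STAGES) - 1)
    return STAGES[index]

print(stage_from_scores([2, 3, 2, 3]))  # integrated disposition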
The learning models that I elaborate in this chapter are suggestive
of the rich complexities surrounding the learning process for indi-
viduals, groups, and entire organizations. This chapter establishes a
procedure for applying these learning models to technology-specific
situations. It demonstrates how to use different phases of the learning
process to further mature the ability of an organization to integrate
technology strategically and culturally.
5
Managing Organizational Learning and Technology
The Role of Line Management
In Chapter 1, the results of the Ravell case study demonstrated the
importance of the role that line managers have, for the success of imple-
menting organizational learning, particularly in the objective of inte-
grating the information technology (IT) department. There has been
much debate related to the use of event-driven learning. In particular,
there is Senge’s (1990) work from his book, The Fifth Discipline. While
overall, I agree with his theories, I believe that there is a need to critique
some of his core concepts and beliefs. That is, Senge tends to make
broad generalizations about the limits of event-driven education and
learning in organizations. He believes that learning from experience is
inherently limited because we cannot always observe the outcomes of our
actions—as he asks: "What happens when we can no longer observe
the consequences of our actions?” (Senge, 1990, p. 23).
My research has found that event-driven learning is essential to
most workers who have yet to learn through other means. I agree with
Senge that not all learning can be obtained through event-oriented
thinking, but I feel that much of what occurs at this horizon pertains
more to the senior levels than to what many line managers have to deal
with as part of their functions in business. Senge’s concern with learn-
ing methods that focus too much on the individual, perhaps, is more
powerful, if we see the learning organization as starting at the top and
then working its way down. My position, however, particularly with
respect to the integration of technology, is that too much dependence
on executive-driven programs to establish and sustain organizational
learning is dangerous. Rather, the line management—or middle
managers who fundamentally run the business—is best positioned
to make the difference. My hypothesis here is that both top-down
and bottom-up approaches to organizational learning are riddled with
problems, especially in their ability to sustain outcomes. We cannot
be naïve—even our senior executives must drive results to maintain
their positions. As such, middle managers, as the key business drivers,
must operate in an event- and results-driven world—let us not under-
estimate the value of producing measurable outcomes, as part of the
ongoing growth of the organizational learning practicum.
To explore the role of middle managers further, I draw on the inter-
esting research done by Nonaka and Takeuchi (1995). These research-
ers examined how Japanese companies manage knowledge creation,
by using an approach that they call “middle-up-down.” Nonaka and
Takeuchi found that middle managers “best communicate the contin-
uous iterative process by which knowledge is created” (p. 127). These
middle managers are often seen as leaders of a team, or task, in which
a “spiral conversion process” operates and that requires both executive
and operations management personnel. Peters and Waterman (1982),
among others, have often attacked middle managers as representing a
layer of management that creates communication problems and
inefficiencies in business processes, leaving U.S. workers
trailing behind their international competitors during the automobile
crisis in the 1970s. They advocate a “flattening” of the never-ending
levels of bureaucracy responsible for inefficient operations. However,
executives often are not aware of details within their operating depart-
ments and may not have the ability or time to acquire those details.
Operating personnel, on the other hand, do not possess the vision
and business aptitudes necessary to establish the kind of knowledge
creation that fosters strategic learning.
Middle managers, or what I prefer to identify as line managers
(Langer, 2001b), possess an effective combination of skills that can pro-
vide positive strategic learning infrastructures. Line managers under-
stand the core issues of productivity in relation to competitive operations
and return on investment, and they are much closer to the day-to-day
activities that bring forth the realities of how, and when, new strategic
processes can be effectively implemented. While many researchers, such
as Peters and Waterman, find them to be synonymous with backward-
ness, stagnation, and resistance to change, middle managers are the
core group that can provide the basis for continuous innovation through
strategic learning. It is my perspective that the difference of opinion
regarding the positive or negative significance middle managers have
in relation to organizational learning has to do with the wide-ranging
variety of employees who fall into the category of “middle.” It strikes
me that Peters and Waterman were somewhat on target with respect to
a certain population of middle managers, although I would not char-
acterize them as line managers. To justify this position, it is important
to clearly establish the differences. Line managers should be defined as
pre-executive employees who have reached a position of managing a
business unit that contains some degree of return on investment for the
business. In effect, I am suggesting that focusing on “middle” manag-
ers, as an identifiable group, is too broad. Thus, there is a need to further
delineate the different levels of what comprises middle managers, and
their roles in the organization.
Line Managers
These individuals usually manage an entire business unit and have
“return-on-investment” responsibilities. Line managers should be
categorized as those who have middle managers reporting to them;
they are, in effect, managers of managers, or, as in some organiza-
tions, they serve a “directorial” function. Such individuals are, in
many ways, considered future executives and perform many low-end
executive tasks. They are, if you will, executives in training. What
is significant about this managerial level is the knowledge it carries
about operations. However, line managers are still involved in daily
operations and maintain their own technical capabilities.
First-Line Managers
First-line individuals manage nonmanagers but can have supervisory
employees who report to them. They do not carry the responsibility
for a budget line unit but for a department within the unit. These
managers have specific goals that can be tied to their performance and
to the department’s productivity.
Supervisor
A supervisor is the lowest-level middle manager. These individu-
als manage operational personnel within the department. Their
management activities are typically seen as “functions,” as opposed
to managing an entire operation. These middle managers do not have
other supervisors or management-level personnel reporting to them.
We should remember that definitions typically used to character-
ize the middle sectors of management, as described by researchers
like Peters, Nonaka, and others, do not come from exact science. The
point must be made that middle managers cannot be categorized by a
single definition. The category requires distinctive definitions within
each level of stratification presented. Therefore, being more specific
about the level of the middle manager can help us determine the man-
ager’s role in the strategic learning process. Given that Nonaka and
Takeuchi (1995) provide the concept of middle-up-down as it relates
to knowledge management, I wish to broaden it into a larger sub-
ject of strategic learning, as a method of evolving changes in culture
and organizational thinking. Furthermore, responsive organizational
dynamism (ROD), unlike other organizational studies, represents
both situational learning and ongoing evolutionary learning require-
ments. Evolutionary learning provides a difficult challenge to organi-
zational learning concepts. Evolutionary learning requires significant
contribution from middle managers. To understand the complexity of
the middle manager, all levels of the organization must be taken into
consideration. I call this process management vectors.
Management Vectors
Senge’s (1990) work addresses some aspects of how technology might
affect organizational behavior: “The central message of the Fifth
Discipline is more radical than ‘ radical organization redesign’—
namely that our organizations work the way they work, ultimately
because of how we think and how we interact” (p. xiv). Technology
aspires to be a new variable or catalyst that can change everyday
approaches to things—to be the radical change element that forces
us to reexamine norms no longer applicable to business operations.
On the other hand, technology can be dangerous if perceived unre-
alistically as a power that possesses new answers to organizational
performance and efficiency. In the late 1990s, we experienced the
“bust” of the dot-com explosion, an explosion that challenged conven-
tional norms of how businesses operate. Dot-coms sold the concepts
that brick-and-mortar operations could no longer compete with new
technology-driven businesses and that “older” workers could not be
transformed in time to make dot-com organizations competitive.
Dot-coms allowed us to depart from our commitment to knowledge
workers and learning organizations, a departure that persists today.
For example, in 2003, IBM at its corporate office in Armonk, New
York, laid off 1,000 workers who possessed skills that were no lon-
ger perceived as needed or competitive. Rather than retrain work-
ers, IBM determined that hiring new employees to replace them
was simply more economically feasible and easier in terms of trans-
forming their organization behaviors. However, in my interview
with Stephen McDermott, chief executive officer (CEO) of ICAP
Electronic Trading Community (ETC), it became apparent that
many of the mystiques of managing technology were incorrect. As he
stated, “Managing a technology company is no different from manag-
ing other types of businesses.” While the technical skills of the IBM
workers may no longer be necessary, why did the organization not
provide enough opportunities to migrate important knowledge work-
ers to another paradigm of technical and business needs? Widespread
worker replacements tell us that few organizational learning infra-
structures actually exist. The question is whether technology can pro-
vide the stimulus to prompt more organizations to commit to creating
infrastructures that support growth and sustained operation. Most
important is the question of how we establish infrastructures that can
provide the impetus for initial and ongoing learning organizations.
This question suggests that the road to working successfully with tech-
nology will require the kind of organizational learning that is driven
by both individual and organization-wide initiatives. This approach
can be best explained by referring to the concept of driver and sup-
porter functions and life cycles of technology presented in Chapter 3.
Figure 5.1 graphically shows the relationship between organizational
structure and organizational learning needs. We also see that this
relationship maps onto driver and supporter functionality.
Figure 5.1 provides an operational overview of the relations between
the three general tiers of management in most organizations. These
levels or tiers are mapped onto organizational learning approaches;
that is, organizational/system or individual. This mapping follows a
general view based on what individuals at each of these tiers view or
seek as their job responsibilities and what learning method best sup-
ports their activities within their environment. For example, execu-
tive learning focuses on system-level thinking and learning because
executives need to view their organizations in a longer-term way (e.g.,
return on investment), as opposed to viewing learning on an indi-
vidual, transactional event way. Yet, executives play an integral part in
long-term support for technology, as an accelerator. Their role within
ROD is to provide the stimulus to support the process of cultural
assimilation, and they are also very much a component of strategic
integration. Executives do not require as much event-driven reflective
change, but they need to be part of the overall “social” structure that
paves the way for marrying the benefits of technology with organi-
zational learning. What executives do need to see, are the planned
measurable outcomes linked to performance from the investment of
coupling organizational learning with technology. The lack of execu-
tive involvement and knowledge will be detrimental to the likelihood
of making this relationship successful.
Operations, on the other hand, are based more on individual prac-
tices of learning. Attempting to incorporate organizational vision
and social discourse at this level is problematic until event-driven
learning is experienced individually to prove the benefits that can be
derived from reflective practices. In addition, there is the problem of
the credibility of a learning program.
Figure 5.1 Three-tier organizational structure. [The figure maps the management/operational layers to a learning approach, an organizational learning method, and driver/supporter life-cycle involvement: the executive tier uses an organization/system approach with knowledge management on the driver side of the life cycle; the middle management tiers use organization/system learning on drivers and individual learning on supporters, through communities of practice (driver) and reflective practices (supporter); and the operations tier uses event-driven individual learning and reflective practices on the supporter side.]
Workers are often wary of new
programs designed to enhance their development and productivity.
Many question the intentions of the organization and why it is mak-
ing the investment, especially given what has occurred in corporations
over the last 20 years: Layoffs and scandals have riddled organizations
and hurt employee confidence in the credibility of employer programs.
Ravell showed us that using reflective practices during events pro-
duces accelerated change, driven by technological innovation, which
in turn, supports the development of the learning organization. It is
important at this level of operations to understand the narrow and
pragmatic nature of the way workers think and learn. The way opera-
tions personnel are evaluated is also a factor. Indeed, operations per-
sonnel are evaluated based on specific performance criteria.
The most complex, yet combined, learning methods relate to the
middle management layers. Line managers, within these layers, are
engrossed in a double-sided learning infrastructure. On one side, they
need to communicate and share with executives what they perceive to
be the “overall” issues of the organization. Thus, they need to learn
using an organizational learning approach, which is less dependent
on event-driven learning and uses reflective practice. Line managers
must, along with their senior colleagues, be able to see the business
from a more proactive perspective and use social-oriented methods
if they hope to influence executives. Details of events are more of an
assumed responsibility to them than a preferred way of interacting. In
other words, most executives would rather interface with line manag-
ers on how they can improve overall operations efficiently and effec-
tively, as opposed to dealing with them on a micro, event-by-event
basis. The assumption, then, is that line managers are expected to deal
with the details of their operations, unless there are serious problems
that require the attention of executives; such problems are usually cor-
related to failures in the line manager’s operations.
On the other side are the daily relationships and responsibilities
managers face for their business units. They need to incorporate more
individual-based learning techniques that support reflective practices
within their operations to assist in the personal development of their
staff. The middle management tier described in Figure 5.1 is shown at a summary level and needs further elaboration. Figure 5.2 provides a more detailed analysis based on the three types of middle managers. The figure shows the ratio of organizational learning
to individual learning based on manager type. The more senior the
manager, the more learning is based on systems and social processes.
Knowledge Management
There is an increasing recognition that the competitive advantage of
organizations depends on their “ability to create, transfer, utilize, and
protect difficult-to-imitate knowledge assets” (Teece, 2001, p. 125).
Indeed, according to Bertels and Savage (1998), the dominant logic
of the industrial era requires an understanding of how to break the
learning barrier to comprehending the information era. While we
have developed powerful solutions to change internal processes and
organizational structures, most organizations have failed to address
the cultural dimensions of the information era. Organizational
knowledge creation is a result of organizational learning through stra-
tegic processes. Nonaka and Takeuchi (1995) define organizational
knowledge as “the capability of a company as a whole to create new
knowledge, disseminate it throughout the organization, and embody
it in products, services, and systems” (p. 3). Nonaka and Takeuchi use
the steps shown in Figure 5.3 to assess the value and chain of events
surrounding the valuation of organization knowledge.
[Figure 5.2 Organizational/system versus individual learning by middle manager level: supervisors rely mostly on individual-based learning, directors mostly on organization/system-based learning, with managers in between.]

[Figure 5.3 Nonaka and Takeuchi steps to organizational knowledge: knowledge creation leads to continuous innovation, which leads to competitive advantage.]
If we view the Figure 5.3 processes as leading to competitive advan-
tage, we may ask how technology affects the chain of actions that
Nonaka and Takeuchi (1995) identify. Without violating the model,
we may insert technology and observe the effects it has on each step,
as shown in Figure 5.4.
According to Nonaka and Takeuchi (1995), to create new knowl-
edge means to re-create the company, and everyone in it, in an ongo-
ing process that requires personal and organizational self-renewal.
That is, knowledge creation is the responsibility of everyone in the
organization. The viability of this definition, however, must be ques-
tioned. Can organizations create personnel that will adhere to such
parameters, and under what conditions will senior management sup-
port such an endeavor?
Again, technology has a remarkable role to play in substantiat-
ing the need for knowledge management. First, executives are still
challenged to understand how they need to deal with emerging tech-
nologies as this relates to whether their organizations are capable
of using them effectively and efficiently. Knowledge management
provides a way for the organization to learn how technology will be
used to support innovation and competitive advantage. Second, IT
departments need to understand how they can best operate within
the larger scope of the organization—they are often searching for a
true mission that contains measurable outcomes, as defined by the
entire organization, including senior management. Third, both execu-
tives and IT staff agree that understanding the uses of technology is a
continuous process that should not be utilized solely in a reactionary
and event-driven way.

[Figure 5.4 Nonaka and Takeuchi organizational knowledge with technology extension. Knowledge creation: technology provides more dynamic shifts in knowledge, thus accelerating the number of knowledge-creation events that can occur. Continuous innovation: innovations are accelerated because of the dynamic nature of events and the time required to respond; continuous innovation procedures therefore become more significant in each department in order to respond to technological opportunities on an ongoing basis. Competitive advantage: technology has generated more global competition, and competitive advantages that depend on technological innovation are more common.]

Finally, most employees accept the fact that
technology is a major component of their lives at work and at home,
that technology signifies change, and that participating in knowledge
creation is an important role for them.
Again, we can see that technology provides the initiator for
understanding how organizational learning is important for com-
petitive advantage. The combination of IT and other organizational
departments, when operating within the processes outlined in ROD,
can significantly enhance learning and competitive advantage. To
expand on this point, I now focus on the literature specifically relat-
ing to tacit knowledge and its important role in knowledge man-
agement. Scholars theorize that knowledge management is the ability to transfer individual tacit knowledge into explicit knowledge. Kulkki
and Kosonen (2001) define tacit knowledge as an experience-based
type of knowledge and skill and as the individual capacity to give
intuitive forms to new things; that is, to anticipate and preconcep-
tualize the future. Technology, by its very definition and form of
being, requires this anticipation and preconceptualization. Indeed,
it provides the perfect educational opportunity in which to practice
the transformation of tacit into explicit knowledge. Tacit knowledge
is an asset, and having individual dynamic abilities to work with
such knowledge commands a “higher premium when rapid organic
growth is enabled by technology” (Teece, 2001, p. 140). Thus,
knowledge management is likely to be greater when technological
opportunity is richer.
Because evaluating emerging technologies requires the ability to
look into the future, it also requires that individuals translate valuable tacit knowledge and creatively see how these opportunities should be judged if implemented. Examples of applicable tacit knowledge
in this process are here extracted from Kulkki and Kosonen (2001):
• Cultural and social history
• Problem-solving modes
• Orientation to risks and uncertainties
• Worldviews
• Organizing principles
• Horizons of expectations
I approach each of these forms of tacit knowledge from the per-
spective of the components of ROD as shown in Table 5.1.
Table 5.1 Mapping Tacit Knowledge to Responsive Organizational Dynamism

• Cultural and social history. Strategic integration: how the IT department and other departments translate emerging technologies into their existing processes and organization. Cultural assimilation: technology opportunities may require organizational and structural changes to transfer tacit knowledge to explicit knowledge.
• Problem-solving modes. Strategic integration: individual reflective practices that assist in determining how specific technologies can be useful and how they can be applied. Cultural assimilation: utilization of tacit knowledge to evaluate probabilities for success.
• Orientation to risks and uncertainties. Strategic integration: technology offers many risks and uncertainties; all new technologies may not be valid for the organization. Cultural assimilation: tacit knowledge is a valuable component to fully understand realities, risks, and uncertainties.
• Worldviews. Strategic integration: technology has global effects and changes market boundaries that cross business cultures; it requires tacit knowledge to understand existing dispositions on how others work together. Cultural assimilation: review how technology affects the dynamics of operations.
• Organizing principles. Strategic integration: how will new technologies actually be integrated? What are the organizational challenges to “rolling out” products and to implementation timelines? What positions are needed, and who in the organization might be best qualified to fill new responsibilities? Cultural assimilation: identify limitations of the organization; that is, tacit knowledge versus explicit knowledge realities.
• Horizons of expectations. Strategic integration: individual limitations in the tacit domain that may hinder or support whether a technology can be strategically integrated into the organization.

It is not my intention to suggest that all technologies should be, or can be, used to generate competitive advantage. To this extent, some technologies may indeed get rejected because they cannot assist the organization in terms of strategic value and competitive advantage. As Teece (2001) states, “Information transfer is not knowledge transfer and information management is not knowledge management, although the former can assist the latter. Individuals and organizations can suffer from information overload” (p. 129). While this is a significant issue for many firms, the ability to have an organization that can select, interpret,
and integrate information is a valuable part of knowledge management.
Furthermore, advances in IT have propelled much of the excitement
surrounding knowledge management. It is important to recognize that
learning organizations, reflective practices, and communities of prac-
tice all participate in creating new organizational knowledge. This is
why knowledge management is so important. Knowledge must be built
on its own terms, which requires intensive and laborious interactions
among members of the organization.
Change Management
Because technology requires that organizations accelerate their
actions, it is necessary to examine how ROD corresponds to theories
in organizational change. Burke (2002) states that most organiza-
tional change is evolutionary; however, he defines two distinct types
of change: planned versus unplanned and revolutionary versus evolu-
tionary. Burke also suggests that the external environmental changes
are more rapid today and that most organizations “are playing catch
up.” Many rapid changes to the external environment can be attrib-
uted to emerging technologies, which have accelerated the divide
between what an organization does and what it needs to do to remain
competitive. This is the situation that creates the need for ROD.
The catching-up process becomes more difficult because the amount
of change required is only increasing given ever-newer technologies.
Burke (2002) suggests that this catching up will likely require planned
and revolutionary change. Such change can be mapped onto much of
my work at Ravell. Certainly, change was required; I planned it, and
change had to occur. However, the creation of a learning organiza-
tion, using many of the organizational learning theories addressed
in Chapter 4, supports the eventual establishment of an operating
organization that can deal with unplanned and evolutionary change.
When using technology as the reason for change, it is then important
that the components of ROD be integrated with theories of organi-
zational change.
History has shown that most organizational change is not success-
ful in providing its intended outcomes, because of cultural lock-in.
Cultural lock-in is defined by Foster and Kaplan (2001) as the inability
of an organization to change its corporate culture even when there
are clear market threats. Based on their definition, then, technology
may not be able to change the way an organization behaves, even
when there are obvious competitive advantages to doing so. My con-
cern with Foster and Kaplan’s conclusion is whether individuals truly
understand exactly how their organizations are being affected—or are
we to assume that they do understand? In other words, is there a pro-
cess to ensure that employees understand the impact of not changing?
I believe that ROD provides the infrastructure required to resolve
this dilemma by establishing the processes that can support ongoing
unplanned and evolutionary change.
To best show the relationship of ROD to organizational change
theory, I use Burke’s (2002) six major points in assisting change in
organizations:
1. Understanding the external environment: What are competitors
and customers’ expectations? This is certainly an issue, specif-
ically when tracking whether expected technologies are made
available in the client–vendor relationship. But more critical
is the process of how emerging technologies, brought about
through external channels, are evaluated and put into produc-
tion; that is, having a process in place. Strategic integration of
ROD is the infrastructure that needs to facilitate the moni-
toring and management of the external environment.
2. Evaluation of the inside of the organization: This directly relates
to technology and how it can be best utilized to improve
internal operations. While evaluation may also relate to a
restructuring of an organization’s mission, technology is often
an important driver for why a mission needs to be changed
(e.g., expanding a market due to e-commerce capabilities).
3. Readiness of the organization: The question here is not whether
to change but how fast the organization can change to address
technological innovations. The ROD arc provides the steps
necessary to create organizations that can sustain change as a
way of operation, blending strategic integration with cultural
assimilation. The maturation of learning toward system-based learning also supports the creation of infrastructures that are vitally prepared for changes from emerging
technologies.
4. Cultural change as inevitable: Cultural assimilation essentially
demands that organizations must dynamically assimilate new
technologies and be prepared to evolve their cultures. Such
evolution must be accelerated and be systemic within business
units, to be able to respond effectively to the rate of change
created by technological innovations.
5. Making the case for change: It is often difficult to explain why
change is inevitable. Much of the need for change can be sup-
ported using the reflective practices implemented at Ravell.
However, such acceptance is directly related to the passage of time. Major events can assist in establishing the many needs
for change, as discussed by Burke (2002).
6. Sustaining change: Perhaps the strongest part of ROD is its
ability to create a process that is evolutionary and systemic. It
focuses on driving change to every aspect of the organization
and provides organizational learning constructs to address
each level of operation. It addresses what Burke (2002) calls
the “prelaunch, launch, postlaunch, and sustaining,” in the
important sequences of organizational change (p. 286).
Another important aspect of change management is leadership.
Leadership takes many forms and has multiple definitions. Technology
plays an interesting role in how leadership can be presented to orga-
nizations, especially in terms of the management style of leadership,
or what Eisenhardt and Bourgeois (1988) coined “power centralization.” Their study examines high-velocity environments in the
microcomputer industry during the late 1980s. By high velocity, they
refer to “those environments in which there is a rapid and discon-
tinuous change in demand, competitors, technology, or regulation, so
that information is often inaccurate, unavailable, or obsolete” (p. 738).
During the period of their study, the microcomputer industry was
undergoing substantial technological change, including the introduc-
tion of many new competitors. As it turns out, the concept of high
velocity is becoming the norm today, given that organizations find themselves needing to operate in a constant state of flux. The
term power centralization is defined as the amount of decision-making
control wielded by the CEO. Eisenhardt and Bourgeois’s study finds
that the more the CEO engages in power-centralized leadership,
the greater the degree of politics, which has a negative impact on the
strategic performance of the firms examined. This finding suggests
that the less democratic the leadership is in high-velocity environ-
ments, the less productive the organization will be. Indeed, the study
found that when individuals engaged in team learning, political ten-
sion was reduced, and the performance of the firms improved.
The structure of ROD provides the means of avoiding the high-
velocity problems discovered by the Eisenhardt and Bourgeois (1988)
study. This is because ROD allows for the development of more indi-
vidual learning, as well as system thinking, across the executive ranks
of the business. If technology is to continue to establish such high
velocities, firms need to examine the Eisenhardt and Bourgeois study
for its relevance to everyday operations. They also need to use orga-
nizational learning theories as a basis for establishing leadership that
can empower employees to operate in an accelerated and unpredict-
able environment.
Change Management for IT Organizations
While change management theories address a broad population in
organizations, there is a need to create a more IT-specific approach to
address the unique needs of this group. Lientz and Rea (2004) estab-
lish five specific goals for IT change managers:
1. Gain support for change from employees and non-IT
managers.
2. Implement change along with measurements of the work so that the results of the change can be clearly determined.
3. Implement a new culture of collaboration in which employees
share more information and work more in teams.
4. Raise the level of awareness of the technology process and
work so that there is less of a tendency for reversion.
5. Implement an ongoing measurement process for the work to
detect any problems.
Lientz and Rea’s (2004) position is that when a new culture is
instilled in IT departments, it is particularly important that it should
not require massive management intervention. IT people need to be
self-motivated to keep up with the myriad accelerated changes in the
world of technology. These changes occur inside IT in two critical
areas. The first relates to the technology itself. For example, how do
IT personnel keep up with new versions of hardware and software?
Many times, these changes come in the form of hardware (often called system) and software upgrades from vendors, who require them as a condition of maintaining support contracts. The ongoing self-management
of how such upgrades and changes will ultimately affect the rest
of the organization is a major challenge and one that is difficult to
manage top-down. The second area is the impact of new or emerg-
ing technologies on business strategy. The challenge is to develop IT
personnel who can transform their technical knowledge into busi-
ness knowledge and, as discussed, take their tacit knowledge and
convert it into explicit, strategic knowledge. Further understanding
of the key risks to the components of these accelerated changes is
provided as follows:
System and software version control: IT personnel must continue
to track and upgrade new releases and understand the impact
of product enhancements. Some product-related enhance-
ments have no bearing on strategic use; they essentially fix
problems in the system or software. On the other hand, some
new releases offer new features and functions that need to be communicated to both IT and business managers (a minimal, hypothetical sketch of this kind of release tracking appears after this list).
Existing legacy systems: Many of these systems cannot support
the current needs of the business. This often forces IT staff to
figure out how to create what are called “workarounds” (quick
fixes) to these systems. This can be problematic given that
workarounds might require system changes or modifications
to existing software. The risk of these changes, both short and
long term, needs to be discussed between user and IT staff
communities of practice.
Software packages (off-the-shelf software): Since the 1990s, the use of preprogrammed third-party software packages has become a preferred option among business users. However,
many of these packages can be inflexible and do not support
the exact processes required by business users. IT personnel
need to address users’ false expectations about what software
packages can and cannot do.
System or software changes: Replacement of systems or software
applications is rarely 100% complete. Most often, remnants of
old systems will remain. IT personnel can at times be insensi-
tive to the lack of a complete replacement.
Project completion: IT personnel often misevaluate when their
involvement is finished. Projects are rarely finished when the
software is installed and training completed. IT staff tend to
move on to other projects and tasks and lose focus on the like-
lihood that there will be problems discovered or last-minute
requests made by business users.
Technical knowledge: IT staff members need to keep their techni-
cal skills up to date. If this is not done, emerging technolo-
gies may not be evaluated properly as there may be a lack of
technical ability inside the organization to map new technical
developments onto strategic advantage.
Pleasing users: While pleasing business users appears to be a
good thing, it can also present serious problems with respect
to IT projects. What users want, and what they need, may
not be the same. IT staff members need to judge when they
might need assistance from business and IT management
because users may be unfairly requesting things that are not
feasible within the constraints of a project. Thus, IT staff must
have the ability to articulate what the system can do and what
might be advisable. These issues tend to occur when certain
business users want new systems to behave like old ones.
Documentation: This, traditionally, is prepared by IT staff and
contains jargon that can confuse business users. Furthermore,
written procedures prepared by IT staff members do not con-
sider the entire user experience and process.
Training: This is often carried out by IT staff and is restricted
to covering system issues, as opposed to the business realities
surrounding when, how, and why things are done.
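As a concrete illustration of the version-control risk above, the following is a minimal sketch, in Python, of how an IT group might log vendor releases and separate pure bug-fix upgrades from releases that carry new features worth communicating to business managers. The Release record, the sample products, and the strategic_releases helper are hypothetical illustrations, not part of the framework described here.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Release:
    """A vendor release tracked by the IT group (illustrative structure only)."""
    product: str
    version: str
    bug_fixes: List[str] = field(default_factory=list)
    new_features: List[str] = field(default_factory=list)  # features worth a business review

def strategic_releases(releases: List[Release]) -> List[Release]:
    """Return releases that add features, i.e., candidates for joint IT/business review."""
    return [r for r in releases if r.new_features]

if __name__ == "__main__":
    # Hypothetical release catalog maintained by IT.
    catalog = [
        Release("GL-System", "4.2.1", bug_fixes=["corrects posting error"]),
        Release("GL-System", "4.3.0", new_features=["multi-currency ledger"]),
    ]
    for r in strategic_releases(catalog):
        print(f"Review with business managers: {r.product} {r.version} -> {r.new_features}")
```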
These issues essentially define key risks to the success of imple-
menting technology projects. Much of this book, thus far, has focused
on the process of organizational learning from an infrastructure per-
spective. However, the implementation component of technology
possesses new risks to successfully creating an organization that can
learn within the needs of ROD. These risks, from the issues enumer-
ated, along with those discussed by Lientz and Rea (2004) are sum-
marized as follows:
Business user involvement: Continuous involvement from busi-
ness users is necessary. Unfortunately, during the life of a proj-
ect there are so many human interfaces between IT staff and
business users that it is unrealistic to attempt to control these
communications through tight management procedures.
Requirements definition and scope: These relate to the process
by which IT personnel work with business users to deter-
mine exactly what software and systems need to accomplish.
Determining requirements is a process, not a predetermined
list that business users will necessarily have available to
them. The discourse that occurs in conversations is critical to
whether such communities are capable of developing require-
ments that are unambiguous in terms of expected outcomes.
Business rules: These rules have a great effect on how the organi-
zation handles data and transactions. The difference between
requirements and business rules is subtle. Specifically, busi-
ness rules, unlike requirements, are not necessarily related to
processes or events of the business. As such, the determination of business rules cannot be made simply by reviewing procedures; a business rule might state, for example, that all account numbers must be numeric.
Documentation and training materials: IT staff members need to
interact with business users and establish joint processes that
foster the development of documentation and training that
best fit user needs and business processes.
Data conversion: New systems and applications require that data
from legacy systems be converted into the new formats. This
process is called data mapping; IT staff and key business users
review each data field to ensure that the proper data are rep-
resented correctly in the new system. IT staff members should not do this work without user involvement (a minimal sketch of such a field-level mapping appears after this list).
Process measurement: Organizations typically perform a post-
completion review after the system or software application
is installed. However, process measurement should occur both during and after project completion.
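To illustrate the data conversion and business-rule risks above, here is a minimal sketch, assuming a hypothetical legacy record layout: it maps legacy field names onto a new schema and applies one sample business rule (the text's example that all account numbers must be numeric). The field names and mapping table are assumptions for illustration only; in practice, IT staff and business users would review the mapping jointly.

```python
# Hypothetical mapping of legacy field names to the new system's field names.
FIELD_MAP = {"CUST_NO": "account_number", "CUST_NM": "customer_name", "BAL": "balance"}

def convert_record(legacy: dict) -> dict:
    """Map a legacy record into the new schema and apply a sample business rule."""
    new_record = {new: legacy[old] for old, new in FIELD_MAP.items() if old in legacy}
    # Sample business rule (from the text's example): all account numbers must be numeric.
    if not str(new_record.get("account_number", "")).isdigit():
        raise ValueError(f"Business rule violation: non-numeric account number in {new_record}")
    return new_record

if __name__ == "__main__":
    legacy_row = {"CUST_NO": "10042", "CUST_NM": "Acme Corp", "BAL": "1250.00"}
    print(convert_record(legacy_row))  # mapped record in the new schema
```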
IT change management poses some unique challenges to imple-
menting organizational learning, mostly because managers cannot
conceivably be available for all of the risks identified. Furthermore,
the very nature of new technologies requires that IT staff mem-
bers develop the ability to self-manage more of their daily functions
and interactions, particularly with other staff members outside the
IT department. The need for self-development is even more critical
because of the existence of technological dynamism, which focuses
on dynamic and unpredictable transactions that often must be han-
dled directly by IT staff members and not their managers. Finally,
because so many risks during technology projects require business
user interfaces, non-IT staff members also need to develop better and
more efficient self-management than they are accustomed to doing.
Technological dynamism, then, has established another need for
change management theory. This need relates to the implementation
of self-development methods. Indeed, part of the reason for the lack
of success of IT projects can be attributed to the inability of the core
IT and business staff to perform in a more dynamic way. Historically, adding more management has not provided the necessary learning and reduction of risk.
The idea of self-development became popular in the early 1980s as
an approach to the training and education of managers and managers-to-be. Thus, the focus of management self-development is to increase
the ability and willingness of managers to take responsibility for
themselves, particularly for their own learning (Pedler et al., 1988).
I believe that management self-development theory can be applied to
nonmanagers, or to staff members, who need to practice self-manage-
ment skills that can assist them in transitioning to operating under
the conditions of technological dynamism.
Management self-development draws on an idea that many people emphasize: the need for learner centeredness. This is an impor-
tant concept in that it ties self-development theory to organizational
learning, particularly to the work of Chris Argyris and Malcolm
Knowles. The concept of learner centeredness holds that individuals
must take prime responsibility for their own learning: when and how
to learn. The teacher (or manager) is assigned the task of facilitator—a
role that fosters guidance as opposed to direct initiation of learning.
In many ways, a facilitator can be seen as a mentor whose role it is to
guide an individual through various levels of learning and individual
development.
What makes self-development techniques so attractive is that
learners work on actual tasks and then reflect on their own efforts.
The methods of reflective practice theory, therefore, are applicable
and can be integrated with self-development practices. Although self-
development places the focus on the individual’s own efforts, manag-
ers still have responsibilities to mentor, coach, and counsel their staff.
This support network allows staff to receive appropriate feedback and
guidance. In many ways, self-development relates to the professional
process of apprenticeship but differs from it in that the worker may not
aspire to become the manager but may wish simply to develop better
management skills. Workers are expected to make mistakes and to be
guided through a process that helps them reflect and improve. This is
why self-development can be seen as a management issue as opposed
to just a learning theory.
A mentor or coach can be a supervisor, line manager, director, or
an outside consultant. The bottom line is that technological dyna-
mism requires staff members who can provide self- management
to cope with constant project changes and risks. These individu-
als must be able to learn, be self-aware of what they do not know,
and possess enough confidence to initiate the required learning
and assistance that they need to be successful (Pedler et al., 1988).
Self-development methods, like other techniques, have risks.
Most notable is the initial decrement in performance, followed by
a slow increment as workers become more comfortable with the
process and learn from their mistakes. However, staff members
must be given support and time to allow this process to occur;
self-development is a trial-and-error method founded on the basis
of mastery learning (i.e., learning from one’s mistakes). Thus, the
notion of self-development is both continuous and discontinuous
and must be implemented in a series of phases, each having unique
outcomes and maturity. The concept of self-development is also
consistent with the ROD arc, in which early phases of maturation
require more individual learning, particularly reflective practices.
Self-development, in effect, becomes a method of indirect man-
agement to assist in personal transformation. This personal trans-
formation will inevitably better prepare individuals to participate
in group- and organizational-level learning at later stages of
maturation.
The first phase of establishing a self-development program is to
create a “learning-to-learn” process. Teaching individuals to learn is a
fundamental need before implementing self-development techniques.
Mumford (1988) defines learning to learn as
1. Helping staff to understand the stages of the learning process
and the pitfalls to not learning
2. Helping staff to find their own preferences to learning
3. Assisting staff in understanding their present learning prefer-
ences and how to deal with, and overcome, learning weaknesses
4. Helping staff to build on their learning experience and apply
it to their current challenges in their job
The first phase of self-development clearly embraces the Kolb
(1999) Learning Style Inventory and the applied individual learn-
ing wheel that were introduced in Chapter 4. Thus, all staff members
should be provided with both of these learning wheels, made aware
of their natural learning strengths and weaknesses, and provided with
exercises to help them overcome their limitations. Most important is
that the Kolb system will make staff aware of their shortfalls with
learning. The applied individual learning wheel will provide a per-
spective on how individuals can link generic learning preferences into
organizational learning needs to support ROD.
The second phase of self-development is to establish a formal learn-
ing program in which staff members
1. Are responsible for their own learning, coordinated with a
mentor or coach
2. Have the right to determine how they will meet their own
learning needs, within available resources, time frames, and
set outcomes
3. Are responsible for evaluating and assessing their progress
with their learning
In parallel, staff coaches or mentors
1. Have the responsibility to frame the learning objectives so
that they are consistent with agreed-on individual weaknesses
2. Are responsible for providing access and support for staff
3. Must determine the extent of their involvement with mentor-
ing and their commitment to assisting staff members in achieving stated outcomes
4. Are ultimately responsible for the evaluation of each individual’s progress and success
This program must also have a formal process and structure.
According to Mossman and Stewart (1988), formal programs, called
self-managed learning (SML), need the following organization and
materials:
1. Staff members should work in groups as opposed to on their
own. This is a good opportunity to intermix IT and non-
IT staff with similar issues and objectives. The size of these
groups is typically from four to six members. Groups should meet every two to three weeks and should develop what are known as learning contracts, which specifically state what the individual and management have agreed on.
Essentially, the structure of self-development allows staff
members to experience communities of practice, which by
their very nature, will also introduce them to group learning
and system-level thinking.
2. Mentors or coaches should preside over a group as opposed to
presiding over just one individual. There are two benefits to
doing this: (1) There are simply economies of scale for which
managers cannot cover staff on an individual basis, and (2)
facilitating a group with similar objectives benefits interac-
tion among the members. Coaches obviously need to play an
important role in defining the structure of the sessions, in
offering ideas about how to begin the self-development pro-
cess, and in providing general support.
3. Staff members need to have workbooks, films, courses,
study guides, books, and specialists in the organization,
all of which learners can use to help them accomplish their
goals.
4. Typically, learning contracts will state the assessment meth-
ods. However, assessment should not be limited only to indi-
viduals but also should include group accomplishments.
An SML program should be designed to ensure that the learning program for staff members represents a commitment by management to a formal process that can assist in the improvement of project teams.
The third phase of self-development is evaluation. This process is a
mixture of individual and group assessments from phase II, coupled
with assessments from actual practice results. These are results from
proven outcomes during normal workday operations. To garner the
appropriate practice evaluation, mentors and coaches must be involved
in monitoring results and noting the progress on specific events that
occur. For example, if a new version of software is implemented, we
will want to know if IT staff and business users worked together to
determine how and when it should be implemented. These results
need to be formally communicated back to the learning groups. This
process needs to be continued on an ongoing basis to sustain the
effects of change management. Figure 5.5 represents the flow of the
three phases of the process.
[Figure 5.5 Phases of self-development: Phase 1 establishes learning-to-learn objectives (learning styles inventory); Phase 2 creates the formal self-managed learning program (individual learning contracts, communities of practice with IT and non-IT staff); Phase 3 implements evaluation (individual and group assessment, monitoring operations for measurable outcomes), with necessary changes fed back into the self-development learning program.]

The process for self-development provides an important approach in assisting staff to perform better under the conditions of technological dynamism. It is one thing to teach reflective practice; it is another
to get staff members to learn how to think in a manner that takes into
consideration the many risks that have plagued systems and software
projects for decades. While the role of management continues to play
a major part in getting things done within strategic objectives, self-
development can provide a strong learning method that can foster
sustained bottom-up management, which is missing in most learning
organizations.
The Ravell case study provides some concrete evidence on how
self-development techniques can indeed get results. Because of the
time pressures at Ravell, I was not able to invest in the learning-to-
learn component at the start of the process. However, I used informal
methods to determine the learning preferences of the staff. This can
be accomplished through interviews in which staff responses can pro-
vide a qualitative basis for evaluating how specific personnel prefer to
learn. This helped me to formulate a specific training program that
involved group meetings with IT and non-IT-oriented groups.
In effect, phase II at Ravell had two communities. The first com-
munity was the IT staff. We met each week to review progress and
to set short-term objectives of what the community of IT wanted to
accomplish. I acted as a facilitator, and although I was in a power
position as their manager, I did not use my position unless there were
clear signs of resistance in the team (which there were in specific situ-
ations). The second community was formed with various line manager
departments. This is where I formed “dotted-line” reporting struc-
tures, which required IT staff members also to join other commu-
nities of practice. This proved to be an invaluable strategy because
it brought IT and business users together and formed the links that
eventually allowed IT staff members to begin to learn and to form
relationships with the user community, which fostered reflective
thinking and transformation.
As stated, there are setbacks at the start of any self-development
program, and the experience at Ravell was no exception. Initially,
IT staff members had difficulty understanding what was expected
of them; they did not immediately perceive the learning program as
an opportunity for their professional growth. Ongoing, motivated discourse in and outside of the IT community helped achieve measurable increments of self-developmental growth.
Furthermore, I found it necessary to integrate individual coaching
sessions with IT staff. While group sessions were useful, they were
not a substitute for individual discussions, which at times allowed
IT staff members to personally discuss their concerns and learning
requirements. I found the process to be ultimately valuable, and I
maintained the role of coach, as opposed to that of a manager who
tells IT staff members what to do in every instance. I knew that direct management alone would never allow learning to develop.
Eventually, self-development through discourse will foster identity
development. Such was the case at Ravell, where both user and IT
groups eventually came together to form specific and interactive com-
munities of practice. This helped form a clearer identity for IT staff
members, and they began to develop the ability to address the many
project risk issues that I defined in this chapter. Most important for
the organization was that Ravell phase I built the foundation for later
phases that required more group and system thinking among the IT
ranks.
Evaluation of the performance at Ravell (phase III of the self-
development process) was actually easier than expected, which suggests that if the first two phases are successful, evaluation follows naturally. As reflective thinking became more evident in the
group, it was easier to see the growth in transformative behavior; the
IT groups became more proactive and critical by themselves, without
necessarily needing my input. In fact, my participation fell into more
of a supporter role; I was asked to participate when the group felt I was needed for a specific task. Evaluation based on perfor-
mance was also easier to determine, mainly because we had formed
interdepartmental communities and because of the relationships I
established with line managers.
Another important decision, and one that nurtured our evaluation capabilities, was to have line managers regularly join our IT staff meetings, so getting feedback on actual results was always open for discussion.
Viewing self-development in the scope of organizational learning
and management techniques provides an important support method
for later development in system thinking. The Ravell experience did
just that, as the self-development process inevitably laid the foun-
dation for more sophisticated organizational learning, required as a
business matures under ROD.
Social Networks and Information Technology
The expansion of social networks, through the use of technological
innovations, has substantially changed the way information flows in
and out of a business community. Some companies, particularly in the
financial services communities, have attempted to “lock out” social
network capabilities. These attempts are ways for organizations to
control, as opposed to change, behavior. Historically, such controls
to enforce compliance have not worked. This is particularly relevant
because of the emergence of a younger generation of workers who use
social networking tools as a regular way to communicate and carry out
discourse. Indeed, social networking has become the main vehicle for
social discourse both inside and outside organizations. There are those
who feel that the end of confidentiality may be on the horizon. This
is not to suggest that technology executives give up on security—we
all know this would be ludicrous. On the other hand, the increasing
pressure to “open” the Web will inevitably become too significant to
ignore. Thus, the technology executive of the future must be prepared
to provide desired social and professional networks to their employees
while figuring out how to minimize risk—certainly not an easy objec-
tive. Organizations will need to provide the necessary learning tech-
niques to help employees understand the limits of what can be done.
We must remember that organizations, governments, and busi-
nesses have never been successful at controlling the flow of information
to any population to or from any specific interest group—inevitably,
information flows through. As stated by Cross and Thomas (2009),
“The network perspective could trigger new approaches to organiza-
tion design at a time when environmental and competitive conditions
seem to be exhausting conventional wisdom” (p. 186). Most important
is the understanding that multinational organizations need to think
globally and nationally at the same time. To do this, employees must
transform their behavior and how they interact. Controlling access
does not address this concern; it only makes communication more
difficult and therefore does not provide a solution. Controls typically
manifest themselves in the form of new processes and procedures. I
often see technology executives proclaiming the need to change pro-
cesses in the name of security without really understanding that they
are not providing a solution, but rather, fostering new procedures that
will allow individuals to evade the new security measures. As Cross and
Thomas (2009) point out, “Formal structures often overlook the fact
that every formal organization has in its shadow an informal or ‘ invis-
ible’ organization” (p. 1). Instead, technology executives concerned
with security, need to focus on new organizational design to assist
businesses to be “social network ready.” ROD must then be extended
to allow for the expansion of social network integration, including,
but not limited to, such products as LinkedIn, Facebook, and Twitter.
It may also be necessary to create new internal network infrastruc-
tures that specifically cater to social network communication.
Many software application companies have learned that compat-
ibility in an open systems environment is a key factor for success-
ful deployment of an enterprise-wide application solution. Thus, all
applications developed within or for an organization need to have
compatibility with the common and popular social network products.
This popularity is not static, but rather, a constant process of deter-
mining which products will become important social networks that
the company may want to leverage. We see social networks having
such an impact within the consumer environment—or what we can
consider to be the “market.” I explained in my definition of ROD that
it is the acceleration of market changes—or the changing relationship
between a buyer and seller—that dictates the successes and failures of
businesses. That said, technology executives must focus their attention
on how such networks will require their organizations to embrace
them. Obviously, this change carries risks. Adapting too early could
be overreacting to market hype, while lagging could mean late entry.
The challenge, then, for today’s technology leaders is to create
dynamic, yet functional, social networks that allow businesses to
compete while maintaining the controls they must have to protect
themselves. The IT organization must concentrate on how to provide
the infrastructure that allows these dynamic connections to be made
without overcontrol. The first mission for the technology executive is
to negotiate this challenge by working with the senior management
of the organization to reach consensus on the risk factors. The issues
typically involve the processes, behavior patterns, and risks shown in
Figure 5.6.
[Figure 5.6 Social network management issues.]

• Business process: design a social network that allows participants to respond dynamically to customer and business needs. Aspired behavior patterns: users understand the inherent limits to what can be communicated outside the organization, limit personal transactions, and use judgment when foreign e-mails are forwarded. Risks: users cannot properly determine the ethics of behavior and will not take the necessary precautions to avoid exposing the organization to outside security breaches.
• Business process: discern which critical functions are required for the social network to work effectively and maintain the firm’s competitive positioning. Aspired behavior patterns: users are active and form strategic groups (communities of practice) that define needs on a regular basis and work closely with IT and senior management. Risks: users cannot keep up with changes in social networks, and it is impossible to track individual needs and behaviors.
• Business process: provide a network design that can be scaled as needs change within the budget limitations of the organization. Aspired behavior patterns: the organization must understand that hard budgets for social networking may not be feasible; rather, the network needs are dynamic, and costs must be assessed dynamically within the appropriate operating teams in the organization. Risks: reality tells us that all organizations operate within budget limitations; large organizations find it difficult to govern dynamically, and smaller organizations cannot afford the personnel necessary to manage dynamically.
• Business process: create a social network that “flattens” the organization so that all levels are accessible. Aspired behavior patterns: particularly large organizations need a network that allows their people better access to departments, talent, and management; in the 1980s, the book In Search of Excellence (Peters & Waterman, 1982) was the first effort to present the value of a “flatter” organizational structure, and social networks provide the infrastructure to make this a reality. Risks: with access come the challenges of responding to all who connect to the system; the organization needs to provide the correct etiquette of how individuals respond dynamically without creating anarchy.

Ultimately, the technology executive must provide a new road map that promotes interagency and cross-customer collaboration in a way that will assist the organization to attain a ROD culture. Social networks are here to stay and will continue to necessitate 24/7 access for everyone. This inevitably raises salient issues relating to the management structure within businesses and how best to manage them.

In Chapter 2, I defined the IT dilemma in a number of contexts. During an interview, a chief executive raised an interesting issue that relates to the subject: “My direct reports have been complaining that because of all this technology that they cannot get away from—that their days never seem to end.” I responded to this CEO by asking,
“Why are they e-mailing and calling you? Is it possible that tech-
nology has exposed a problem that has always existed?” The CEO
seemed surprised at my response and said, “What do you mean?”
Again, I responded by suggesting that technology allowed access,
but perhaps, that was not really the problem. In my opinion, the real
problem was a weakness in management or organizational structure.
I argued that good managers build organizations that should handle
the questions that were the subject of these executives’ complaints.
Perhaps the real problem was that the organization or management
was not handling day-to-day issues. This case supports my thesis that
technology dynamism requires reevaluation of how the organization
operates and stresses the need to understand the cultural assimilation
abilities of dealing with change.
Another interesting aspect of social networks is the emergence of
otherwise invisible participants. Technology-driven networks have
allowed individuals to emerge not only because of access but also because of the statistics that networks generate. Let me be specific. Network traffic can easily be tracked, as can individual access. Even with limited history, organizations are discovering the valued members of their companies simply by seeing who is active and why (a minimal sketch of this kind of activity tally appears after the examples below). This should not suggest
that social networks are spy networks. Indeed, organizations need to
provide learning techniques to guide how access is tracked and to
highlight the value that it brings to a business. As with other issues,
the technology executive must align with other units and individuals;
the following are some examples:
• Human resources (HR): This department has specific needs
that can align effectively with the entire social network.
Obviously, there are compliance issues that limit what can
be done over a network. Unfortunately, this is an area that
requires reassessment: In general, governance and controls do
not drive an organization to adopt ROD. There are other fac-
tors related to the HR function. First is the assimilation of new employees and the new talents that they might bring to the network. Second is the challenge of adapting to ongoing change within the network. Third is the knowledge lost from those who leave the organization yet may still want to participate socially within it (friends of the company).
• Gender: Face-to-face meetings have always shown differences
in participation by gender. Men tend to dominate meetings
and the positions they hold in an organization. However, the
advent of social virtual networks has begun to show a shift
in the ways women participate and hold leadership positions
among their peers. In an article in Business Week (May 19,
2008), Auren Hoffman reports that women dominate social
network traffic. This may result in seeing more women-centric
communication. The question, then, is whether the expan-
sion of social networks will give rise to more women in senior
management positions.
• Marketing: The phenomenon of social networking has allowed
for the creation of more targeted connectivity; that is, the abil-
ity to connect with specific clients in special ways. Marketing
departments are undergoing an extraordinary transformation
in the way they target and connect with prospective custom-
ers. The technology executive is essentially at the center of
designing networks that provide customizable responses and
facilitate complex matrix structures. Having such abilities
could be the differentiator between success and failure for
many organizations.
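To illustrate the earlier point about discovering active participants from network statistics, the following is a minimal sketch that tallies events from a hypothetical in-house activity log; the log format, names, and helper function are assumptions for illustration, and any real analysis would need to respect the privacy and learning guidance discussed above.

```python
from collections import Counter

# Hypothetical internal social-network activity log: (user, action) events.
activity_log = [
    ("ana", "post"), ("raj", "comment"), ("ana", "comment"),
    ("lee", "post"), ("ana", "post"), ("raj", "post"),
]

def most_active(log, top_n=3):
    """Tally events per user to surface otherwise 'invisible' contributors."""
    counts = Counter(user for user, _ in log)
    return counts.most_common(top_n)

if __name__ == "__main__":
    for user, events in most_active(activity_log):
        print(f"{user}: {events} events")
```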
One can see that the expansion of social networks is likely to have
both good and bad effects. Thus far, in this section I have discussed the
good. The bad relates to the expansion of what seems to be an unlim-
ited network. How does one manage such expansion? The answer lies
within the concept of alignment. Alignment has always been critical
to attain organizational effectiveness. The heart of alignment is deal-
ing with cultural values, goals, and processes that are key to meet
strategic objectives (Cross & Thomas, 2009). While the social net-
work acts to expose these issues, it does not necessarily offer solutions
to these differences. Thus, the challenge for the technology executive
of today is to balance the power of social networks while providing
direction on how to deal with alignment and control—not an easy
task but clearly an opportunity for leadership. The following chapters
offer some methods to address the challenges discussed in this chap-
ter, and the opportunities they provide for technology executives.
6

Organizational Transformation and the Balanced Scorecard
Introduction
The purpose of this chapter is to examine the nature of organiza-
tional transformation, how it occurs, and how it can be measured.
Aldrich (2001) defines organizational transformation along three
possible dimensions: changes in goals, boundaries, and activities.
According to Aldrich, transformations “must involve a qualita-
tive break with routines and a shift to new kinds of competencies
that challenge existing organizational knowledge” (p. 163). He
warns us that many changes in organizations disguise themselves
as transformative but are not. Thus, focusing on the qualifications
of authentic or substantial transformation is key to understanding
whether it has truly occurred in an organization. Technology, as
with any independent variable, may or may not have the capacity to
instigate organizational transformation. Therefore, it is important
to integrate transformation theory with responsive organizational
dynamism (ROD). In this way, the measurable outcomes of orga-
nizational learning and technology can be assessed in organizations
that implement ROD. Most important in this regard, is that organi-
zational transformation, along with knowledge creation, be directly
correlated to the results of implementing organizational learning.
That is, the results of using organizational learning techniques must
result in organizational transformation.
Organizational transformation is significant for three key reasons:
1. Organizations that cannot change will fundamentally be at
risk against competitors, especially in a quickly changing
market.
2. If the organization cannot evolve, it will persist in its norms
and be unwilling to change unless forced to do so.
3. If the community population is forced to change and is con-
strained in its evolutionary path, it is likely that it will not be
able to transform and thus will need to be replaced.
Aldrich (2001) establishes three dimensions of organizational
transformation. By examining them, we can apply technology-
specific changes and determine within each dimension what consti-
tutes authentic organizational transformation.
1. Goals: There are two types of goal-related transformations: (a) a change in the market or target population of the organization; (b) a change in the overall goal of the organization itself. I
have already observed that technology can affect the mission
of an organization, often because it establishes new market
niches (or changes them). Changed mission statements also
inevitably modify goals and objectives.
2. Boundaries: Organizational boundaries transform when there
is expansion or contraction. Technology has historically
expanded domains by opening up new markets that could
not otherwise be reached without technological innovation.
E-business is an example of a transformation brought about
by an emerging technology. Of course, a business can also contract as a result of not assimilating a technology; in that case, too, technology creates organizational transformation.
3. Activity systems: Activity systems define the way things are
done. They include the processing culture, such as behav-
ioral roles. Changes in roles and responsibilities alone do
not necessarily represent organizational transformation
unless they are accompanied by cultural shifts in behavior. The
cultural assimilation component of ROD provides a method
with which to facilitate transformations that are unpredict-
able yet evolutionary. Sometimes, transformations in activ-
ity systems deriving from technological innovations can
be categorized by the depth and breadth of their impact on
other units. For example, a decision could be made to use
technology as part of a total quality management (TQM)
effort. Thus, activity transformations can be indirect and
need to be evaluated based on multiple and simultaneous
events.
Aldrich's (2001) concept of organizational transformation bears
on the issue of frequency of change. In general, he concludes that
the changes that follow a regular cycle are part of normal evolution
and “flow of organizational life” (p. 169) and should not be treated as
transformations. Technology, on the other hand, presents an inter-
esting case in that it can be perceived as normal in its persistence
and regularity of change while being unpredictable in its dynamism.
However, Aldrich's definition of transformation poses an interesting
issue for determining transformations resulting from technological
innovations. Specifically, under what conditions is a technological
innovation considered to have a transformative effect on the organi-
zation? And, when is it to be considered as part of regular change? I
refer to Figure 6.1, first presented in Chapter 3 on driver and sup-
porter life cycles to respond to this question.
The flows in this cycle can be used as the method to determine
technological events that are normal change agents versus transforma-
tive ones. To understand this point, one should view all driver-related
technologies as transformational agents because they, by definition,
affect strategic innovation and are approved based on return on
investment (ROI). Aldrich's (2001) "normal ebb and flows" represent the "mini-loops" that are new enhancements or subtechnologies, which are part of the normal everyday changes necessary to mature a technological innovation. Thus, driver variables that result from mini-loops would not be considered transformational agents of change.
Figure 6.1 Driver-to-supporter life cycle. (The cycle runs from technology driver through an evaluation cycle, driver maturation, and support status to replacement or outsourcing and economies of scale, with mini-loop technology enhancements feeding the evaluation cycle.)
It is important to recognize that Aldrich's (2001) definition of
organizational transformation should not be confused with theories
of transformative learning. As West (1996) proclaims, “The goal of
organizational learning is to transform the organization” (p. 54). The
study of transformative learning has been relevant to adult education,
and has focused on individual, as opposed to organizational, devel-
opment and learning. Thus, transformative learning has been better
integrated in individual learning and reflective practice theories than
in organizational ones. While these modes of learning are related to
the overall learning in organizations, they should not be confused
with organizations that are attempting to realize their performance
objectives.
Yorks and Marsick (2000) offer two strategies that can produce
transformative learning for individuals, groups, or organizations:
action learning and collaborative inquiry. I covered action science in
Chapter 4, particularly reflective practices, as key interventions to fos-
ter both individual and group evolution of learning, specifically in
reference to how to manage ROD. Aspects of collaborative inquiry
are applied to later stages of maturation and to more senior levels of
management based on systems-level learning. As Yorks and Marsick
(2000) state, “For the most part the political dimensions of how the
organization functions is off limits, as are discussions of larger social
consequences” (p. 274).
Technological innovations provide acceleration factors and foster
the need for ROD. Technology also furnishes the potential tangible
and measurable outcomes necessary to normalize Yorks and Marsick's
(2000) framework for transformative learning theory into organiza-
tional contexts as follows:
1. Technology, specifically e-business, has created a critical need
for organizations to engage with clients and individuals in a
new interactive context. This kind of discourse has established
accelerated needs, such as understanding the magnitude of
alternative courses of action between customer and vendor.
The building of sophisticated intranets (internal Internets) and
their evolution to assimilate with other Internet operations
has also fueled the need for learning to occur more often than
before and at the organizational level.
Because technology can produce measurable outcomes,
individuals are faced with accelerated reflections about the
cultural impact of their own behaviors. This is directly related
to the implementation of the cultural assimilation component
of ROD, by which individuals determine how their behaviors
are affected by emerging technologies.
2. Early in the process of implementing strategic integration,
reflective practices are critical for event-driven technology
projects. These practices force individuals to continually reex-
amine their existing meaning perspectives (specifically, their
views and habits of mind). Individual reflection in, on, and to
practice will evolve to system-level group and organizational
learning contexts, as shown in the ROD arc.
3. The process of moving from individual to system-level learn-
ing during technology maturation is strengthened by the
learners’ abilities to comprehend why historical events have
influenced their existing habits of mind.
4. The combination of strategic integration and cultural assimi-
lation lays the foundation for organizational transformation
to occur. Technology provides an appropriate blend of being
both strategic and organizational in nature, thus allow-
ing learners to confront their prior actions and develop new
practices.
Aldrich (2001) also provides an interesting set of explanations for
why it is necessary to recognize the evolutionary aspect of organiza-
tional transformations. I have extended them to operate within the
context of ROD, as follows:
Variation : Defined as “change from current routines and compe-
tencies and change in organizational forms” (Aldrich, 2001,
p. 22). Technology provides perhaps the greatest amount of
variation in routines and thereby establishes the need for
something to manage it: ROD. The higher the frequency of
variation, the greater the chance that organizational transfor-
mation can occur. Variation is directly correlated to cultural
assimilation.
Selection : This is the process of determining whether to use a
technology variation. Selections can be affected by external
(outside the organization) and internal (inside the organi-
zation) factors, such as changes in market segments or new
business missions, respectively. The process of selection can be
related to the strategic integration component of ROD.
Retention : Selected variations are retained or preserved by the
organization. Retention is a key way of validating whether
organizational transformation has occurred. As Aldrich
states: “Transformations are completed when knowledge
required for reproducing the new form is embodied in a com-
munity of practice” (p. 171).
Because of the importance of knowledge creation as the basis of
transformation, communities of practice are the fundamental struc-
tures of organizational learning to support organizational transforma-
tion. Aldrich (2001) also goes beyond learning; he includes policies,
programs, and networks as parts of the organizational transformative
process. Figure 6.2 shows Aldrich's evolutionary process and its rela-
tionship to ROD components.
Thus, we see from Figure 6.2 the relationships between the pro-
cesses of creating organizational transformation, the stages required
to reach it, the ROD components in each stage, and the correspond-
ing organizational learning method that is needed. Notice that the
mapping of organizational learning methods onto Aldrich's (2001)
scheme for organizational transformation can be related to the ROD
arc. It shows us that as we get closer to retention, organizational learn-
ing evolves from an individual technique to a system/organizational
learning perspective. Aldrich's model is consistent with my driver-versus-supporter concept. He notes, "When the new form becomes a taken-for-granted aspect of everyday life in the organization, its
legitimacy is assumed” (p. 175).
Hence, the assimilation of new technologies cannot be consid-
ered transformative until it behaves as a supporter. Only then can we
determine that the technology has changed organizational biases and
norms. Representing the driver and supporter life cycle to include this
important relationship is shown in Figure 6.3.
Figure 6.2 Stages of organizational transformation and ROD. (The evolutionary stages of technology variation, selection, and retention are mapped to ROD components: strategic integration to assess the value of a technology, cultural assimilation to assess the extent of what to implement and determine its effects on structure, and strategic integration to determine which technologies best fit corporate needs and provide the highest ROI. The corresponding organizational learning methods progress from individual reflective practices to group-based reflective practices, social discourse using communities of practice, and communities of practice with knowledge management, ending in validation of organizational transformation: technology has provided strategic outcomes and modified structures and processes.)
Figure 6.3 Organizational transformation in the driver-to-supporter life cycle. (The driver-to-supporter cycle of Figure 6.1 is overlaid with the learning progression from individual reflective practice to group-based reflective practice, communities of practice, knowledge management, and, finally, organizational transformation.)
Methods of Ongoing Evaluation
If we define organizational transformation as the retention of knowl-
edge within the body of communities of practice, the question to be
answered is how this retention is actually determined in practice. Transformations are often partial or in some phase of completion; that is, the transformation is incomplete and needs to continue through further phases.
Indeed, cultural assimilation does not occur immediately, but rather,
over periods of transition. Much of the literature on organizational
transformation does not address the practical aspects of evaluation
from this perspective. This lack of information is particularly prob-
lematic with respect to technology, since so much of how technology
is implemented relates to phased steps that rarely happen in one major
event. Thus, it is important to have some method of ongoing evalua-
tion to determine the extent of transformation that has occurred and
which organizational learning methods need to be applied to help
continue the process toward complete transformation.
Aldrich's (2001) retention can also be misleading. We know that
organizational transformation is an ongoing process, especially as
advocated in ROD. It is probable that transformations continue and
move from one aspect of importance to another, so a completed trans-
formation may never exist. Another way of viewing this concept is to
treat transformations as event milestones. Individuals and communi-
ties of practice are able to track where they are in the learning process.
It also fits into the phased approach of technology implementation.
Furthermore, the notion of phases allows for integration of organiza-
tional transformation concepts with stage and development theories.
With the acceptance of this concept, there needs to be a method or
model that can help organizations define and track such phases of
transformation. Such a model would also allow for mapping outcomes
onto targeted business strategies. Another way of understanding the
importance of validating organizational transformation is to recognize
its uniqueness, since most companies fail to execute their strategies.
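One minimal way to picture such phase tracking is sketched below: each expected transformation is treated as an event milestone that moves through a set of phases, so a community of practice can see where it stands. This is an illustration only; the phase names and milestones are invented here, and the balanced scorecard discussed next supplies the fuller, strategy-linked version of this idea.

    from dataclasses import dataclass

    # Ordered phases a transformation milestone can move through; the names
    # are illustrative, not a prescribed maturity model.
    PHASES = ["not started", "individual learning", "group learning",
              "system/organizational learning", "retained"]

    @dataclass
    class Milestone:
        description: str
        phase: str = "not started"

        def advance(self) -> None:
            """Move the milestone to the next phase, if one remains."""
            i = PHASES.index(self.phase)
            if i < len(PHASES) - 1:
                self.phase = PHASES[i + 1]

    # Hypothetical milestones for a technology-driven transformation.
    milestones = [
        Milestone("IT staff adopt reflective practices"),
        Milestone("New communities of practice formed with user departments"),
    ]

    milestones[0].advance()  # individual learning has begun on the first milestone
    for m in milestones:
        print(f"{m.description}: {m.phase}")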
The method that can be applied to the validation of organizational
transformation is a management tool called the balanced scorecard.
The balanced scorecard was introduced by Kaplan and Norton (2001)
in the early 1990s as a tool to solve measurement problems. The ability
of an organization to develop and operationalize its intangible assets
has become more and more a critical component for success. As I
have already expressed regarding the work of Lucas (1999), financial
measurement may not be capable of capturing all IT value. This is
particularly true in knowledge-based theories. The balanced score-
card can be used as a solution for measuring outcomes that are not
always financial and tangible. Furthermore, the balanced scorecard
is a “living” document that can be modified as certain objectives or
measurements require change. This is a critical advantage because, as
I have demonstrated, technology projects often change in scope and in
objectives as a result of internal and external factions.
The ultimate value of the balanced scorecard in this context, then, is that it provides a means of evaluating transformation: not only measuring completion against set targets but also defining how expected transformations map onto the strategic objectives of the organization. In effect, it measures the ability of the organization to execute
its strategy. Before explaining the details of how a balanced scorecard
can be applied specifically to ROD, I offer Figure 6.4, which shows
exactly where the scorecard fits into the overall picture of transitioning emerging technologies into concrete strategic benefit.
Figure 6.4 Balanced scorecard. (From Kaplan, R.S., & Norton, D.P., The Strategy-Focused Organization, Harvard University Press, Cambridge, MA, 2001.) The balanced scorecard and strategy sit at the center of five principles: translate the strategy to operational terms, align the organization to the strategy, make strategy everyone's job, make strategy a continual process, and mobilize change through executive leadership.
The generic objectives of a balanced scorecard are designed to cre-
ate a strategy-focused organization. Thus, all of the objectives and
measurements should be derived from the vision and strategy of the
organization (Kaplan & Norton, 2001). These measurements rest on the fundamental principles of any strategically focused organization: alignment and focus. Kaplan and Norton define these
principles as the core of the balanced scorecard:
1. Translate the strategy to operational terms : This principle
includes two major components that allow an organization to
define its strategy from a cause-and-effect perspective using
a strategy map and scorecard. Thus, the strategy map and its
corresponding balanced scorecard provide the basic measure-
ment system.
2. Align the organization to the strategy: Kaplan and Norton
define this principle as favoring synergies among organiza-
tional departments that allow communities of practice to have
a shared view and a common understanding of their roles.
3. Make strategy everyone's everyday job: This principle supports the notion of a learning organization that requires everyone's
participation, from the chief executive officer (CEO) to cleri-
cal levels. To accomplish this mission, the members of the
organization must be aware of business strategy; individuals
may need “personal” scorecards and a matching reward sys-
tem for accomplishing the strategy.
4. Make strategy a continual process: This process requires the
linking of important, yet fundamental, components, includ-
ing organizational learning, budgeting, management reviews,
and a process of adaptation. Much of this principle falls into
the areas of learning organization theories that link learning
and strategy in ongoing perpetual cycles.
5. Mobilize change through executive leadership: This principle
stresses the need for a strategy-focused organization that
incorporates the involvement of senior management and can
mobilize the organization and provide sponsorship to the
overall process.
Using the core balanced scorecard schematic, I have modified it to
operate with technology and ROD, as shown in Figure 6.5.
1. Evaluation of technology: The first step is to have an infrastruc-
ture that can determine how technology fits into a specific
strategy. Once this is targeted, the evaluation team needs to
define it in operational terms. This principle requires the stra-
tegic integration component of ROD.
2. Align technology with business strategy : Once technology is
evaluated, it must be integrated into the business strategy.
This involves ascertaining whether the addition of technology
will change the current business strategy. This principle is also
connected to the strategic integration component of ROD.
3. Make technology projects part of communities of practice : Affected
communities need to be strategically aware of the project.
Organizational structures must determine how they distrib-
ute rewards and objectives across departments. This principle
requires the cultural assimilation component of ROD.
4. Phased-in technology implementation: Short- and long-term
project objectives are based on driver and supporter life cycles.
This will allow organizational transformation phases to be linked to implementation milestones. This principle maps onto the cultural assimilation component of ROD.
5. Executive interface: CEO and senior managers act as executive sponsors and project champions. Communities of practice and their common "threads" need to be defined, including middle management and operations personnel, so that top-down, middle-up-down, and bottom-up information flows can occur.
Figure 6.5 Balanced scorecard ROD. (The responsive organizational dynamism strategy sits at the center of the modified scorecard's five principles: evaluation of technology, align technology with business strategy, make technology projects part of communities of practice, phase technology implementation, and executive interfaces.)
The balanced scorecard ultimately provides a framework to view
strategy from four different perspectives:
1. Financial : ROI and risk continue to be important components
of strategic evaluation.
2. Customer : This involves the strategic part of how to create
value for the customers of the organization.
3. Internal business processes : This relates to the business pro-
cesses that provide both customer satisfaction and operational
efficiency.
4. Learning and growth : This encompasses the priorities and
infrastructure to support organizational transformation
through ROD.
The generic balanced scorecard framework needs to be extended to address technology and ROD. I propose the following adjustments, with an illustrative sketch after the list:
1. Financial : Requires the inclusion of indirect benefits from
technology, particularly as Lucas (1999) specifies, in nonmon-
etary methods of evaluating ROI. Risk must also be factored
in, based on specific issues for each technology project.
2. Customer : Technology-based products are integrated with
customer needs and provide direct customer package inter-
faces. Further, web systems that use the Internet are depen-
dent on consumer use. As such, technology can modify
organizational strategy because of its direct effect on the cus-
tomer interface.
3. Internal business processes : Technology requires business pro-
cess reengineering (BPR), which is the process of reevaluat-
ing existing internal norms and behaviors before designing a
new system. This new evaluation process addresses customers,
operational efficiencies, and cost.
4. Learning and growth : Organizational learning techniques,
under the umbrella of ROD, need to be applied on an ongo-
ing and evolutionary basis. Progress needs to be linked to the
ROD arc.
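To make the structure of such a scorecard concrete, the short sketch below models perspectives, objectives, and measures as simple records and reports which objectives have met their targets. It is an illustration only: the perspective names follow the framework above, but the sample objectives, measures, and figures are invented and are not drawn from any scorecard in this book. The financial entry includes an indirect-benefit measure alongside ROI, in the spirit of the adjustment proposed in the list above.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Measure:
        name: str
        target: float        # the value the organization wants to reach
        actual: float = 0.0  # the most recent observed value

        def attained(self) -> bool:
            """A measure counts as attained when the actual value meets the target."""
            return self.actual >= self.target

    @dataclass
    class Objective:
        description: str
        measures: List[Measure] = field(default_factory=list)

    @dataclass
    class Perspective:
        name: str  # financial, customer, internal business processes, learning and growth
        objectives: List[Objective] = field(default_factory=list)

    # A hypothetical ROD-style scorecard with one objective per perspective.
    scorecard = [
        Perspective("Financial", [
            Objective("Improve return on project investments",
                      [Measure("project ROI (%)", target=12.0, actual=9.5),
                       Measure("indirect benefit index", target=3.0, actual=3.4)])]),
        Perspective("Customer", [
            Objective("More satisfied users",
                      [Measure("user satisfaction (1-5)", target=4.0, actual=4.1)])]),
        Perspective("Internal business processes", [
            Objective("Improved systems",
                      [Measure("systems upgraded", target=5, actual=2)])]),
        Perspective("Learning and growth", [
            Objective("New ways of staff interaction",
                      [Measure("communities of practice formed", target=3, actual=1)])]),
    ]

    # Report which objectives have met all of their measures so far.
    for p in scorecard:
        for o in p.objectives:
            status = "met" if all(m.attained() for m in o.measures) else "in progress"
            print(f"{p.name}: {o.description} -> {status}")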
The major portion of the balanced scorecard strategy is in its initial
design; that is, in translating the strategy or, as in the ROD scorecard,
the evaluation of technology. During this phase, a strategy map and
actual balanced scorecards are created. This process should begin by
designing a balanced scorecard that articulates the business strategy.
Remember, every organization needs to build a strategy that is unique
and based on its evaluation of the external and internal situation (Olve
et al., 2003). To clarify the definition of this strategy, it is easier to
consider drawing the scorecard initially in the form of a strategy map.
A generic strategy map essentially defines the components of each
perspective, showing specific strategies within each one, as shown in
Figure 6.6.
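Because a strategy map is essentially a set of cause-and-effect links running from learning and growth up to financial outcomes, it can also be sketched as a small directed structure, as below. The example is hypothetical: the objective names loosely echo the generic map in Figure 6.6, and mapping each objective to a single successor is a simplification, since a real map may link one objective to several others.

    # Each entry links a lower-level objective to the higher-level objective it
    # is expected to influence (cause -> effect), reading upward from the
    # learning and growth perspective toward the financial perspective.
    strategy_map = {
        "Improve staff skills":     "Increase efficiency",
        "Improve technology":       "Increase efficiency",
        "Increase efficiency":      "More satisfied customers",
        "More satisfied customers": "Increase customer base",
        "Increase customer base":   "Improve profitability",
        "Improve profitability":    "Stronger finances",
    }

    def trace(objective: str) -> list:
        """Follow the cause-and-effect chain from one objective to the top."""
        chain = [objective]
        while objective in strategy_map:
            objective = strategy_map[objective]
            chain.append(objective)
        return chain

    # Tracing a learning-and-growth objective shows how it is expected to
    # ripple upward to a financial outcome.
    print(" -> ".join(trace("Improve staff skills")))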
Figure 6.6 Strategy map. (From Olve, N., et al., Making Scorecards Actionable: Balancing Strategy and Control, Wiley, New York, 2003.) The generic map links strategies across the financial, customer, process, and learning and growth perspectives, including improve technology, improve staff skills, increase efficiency, increase customer service, establish new markets, more satisfied customers, increase customer base, improve profitability, and stronger finances.
We can apply the generic strategy map to an actual case study,
Ravell phase I, as shown in Figure 6.7.
Recall that Ravell phase I created a learning organization using
reflective practices and action science. Much of the organizational transformation at Ravell was accelerated by a major event: the relo-
cation of the company. The move was part of a strategic decision for
the organization, specifically the economies of scale for rental expense
and an opportunity to retire old computers and replace them with
a much needed state-of-the-art network. Furthermore, there was a
grave need to replace old legacy applications that were incapable of
operating on the new equipment and were also not providing the
competitive advantage that the company sought. In using the strategy
map, a balanced scorecard can be developed containing the specific
outcomes to achieve the overall mission. The balanced scorecard is
shown in Figure 6.8.
The Ravell balanced scorecard has an additional column that defines
the expected organizational transformation from ROD. This model
addresses the issue of whether a change is truly a transformation. This
method also provides a systematic process to forecast, understand, and
present what technology initiatives will ultimately change in the strategic integration and cultural assimilation components of ROD.
Figure 6.7 Technology strategy map. (Mission: To accelerate investment in technology during the relocation of the company for reasons of economies of scale and competitive advantage. The map links strategies across four perspectives: financial (improve return on project investments, reduce technology overhead), users (more satisfied users, increase user IT support), process (provide accurate and timely information, improved systems), and learning and growth (new technology products, new ways of staff interaction, establish new organization structure).)
There are two other important factors embedded in this modified
balanced scorecard technique. First, scorecards can be designed at
varying levels of detail. Thus, two more balanced scorecards could
be developed that reflect the organizational transformations that occurred in Ravell phases II and III, or the three phases could be summarized as one large balanced scorecard or some combination of summary and detail together. Second, the scorecard can be modified to reflect unexpected changes during implementation of a technology. These changes could be related to a shifting mission statement or to external changes in the market that require a change in business strategy. Most important, though, are the expected outcomes and transformations that occur during the course of a project. Essentially, it is difficult to predict how organizations will actually react to changes during an IT project and how they will transform.
Figure 6.8 Ravell phase I balanced scorecard. For each strategy map perspective, the scorecard lists the measurable outcomes, strategic objectives, and expected organizational transformation:
Financial. Measurable outcomes: improve returns on project investments; reduce technology overhead costs. Strategic objectives: combine IT expenses with relocation and capitalize the entire expense; integrate new telephone system with computer network expenses; leverage engineering and communications expenses with technology; retire old equipment from financial statements. Organizational transformation: the combination of expenses requires the formation of new communities of practice, which include finance, engineering, and IT.
Users. Measurable outcomes: more satisfied users; increase user IT support. Strategic objectives: increase access to central applications; integrate IT within other departments to improve dynamic customer support requirements; provide new products to replace the old e-mail system and make standard applications available to all users; establish help desk personnel. Organizational transformation: the process of supporting users requires IT staff to embrace reflective practices; user relationships form through new communities of practice and cultural assimilation with the user community; a new culture at Ravell is established.
Process. Measurable outcomes: provide accurate and timely information; improved systems. Strategic objectives: improve decision support for improved reporting and strategic marketing; upgrade internal systems, including customer relationship management (CRM), general ledger, and rights and royalties; investigate new voice-messaging technology to improve integration of e-mail and telephone systems. Organizational transformation: strategic integration occurs through increased discourse and language among communities of practice engaged in making the relocation successful; new knowledge is created and needs knowledge management.
Learning and growth. Measurable outcomes: new technology products; new ways of staff interaction; establish new organization structure. Strategic objectives: physically relocate IT staff across departments; modify the IT reporting structure with a "dotted line" to business units. Organizational transformation: IT becomes more critically reflective and understands the value of its participation in the learning organization; IT staff seeks to know less and understands the view of the "other."
The balanced scorecard provides a checklist and tracking system
that is structured and sustainable— but not perfect. Indeed, many
of the outcomes from the three phases of Ravell were unexpected or
certainly not exactly what I expected. The salient issue here is that it
allows an organization to understand when such unexpected changes
have occurred. When this does happen, organizations need to have
an infrastructure and a structured system to examine what a change
in their mission, strategy, or expectations means to all of the com-
ponents of the project. This can be described as a “rippling effect,” in
which one change can instigate others, affecting many other parts of
the whole. Thus, the balanced scorecard, particularly using a strat-
egy map, allows practitioners to reconcile how changes will affect the
entire plan.
Another important component of the balanced scorecard, and the
reason why I use it as the measurement model for outcomes, is its
applicability to organizational learning. In particular, the learning
and growth perspective shows how the balanced scorecard ensures
that learning and strategy are linked in organizational development
efforts.
Implementing balanced scorecards is another critical part of the
project— who does the work, what the roles are, and who has the
responsibility for operating the scorecards? While many companies
use consultants to guide them, it is important to recognize that bal-
anced scorecards reflect the unique features and functions of the com-
pany. As such, the rank and file need to be involved with the design
and support of balanced scorecards.
Every business unit that has a scorecard needs to have someone
assigned to it, someone accountable for it. A special task force may
often be required to launch the training for staff and to agree on how
the scorecard should be designed and supported. It is advisable that the
scorecard be implemented using application software and made available over a network. This provides a number of benefits: it reduces reliance on paper and local files that might get lost or left unsecured; it allows for easy "roll-up" of multiple scorecards to a summary level; and access via the Internet (using a secured external connection) allows the scorecard to be maintained from multiple locations. This is particularly attractive for staff members and managers who travel.
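A minimal sketch of the "roll-up" idea follows: each business unit maintains its own records of measures, and a summary view aggregates them by perspective. The unit names and figures are invented for illustration, and a real implementation would of course sit behind whatever scorecard software and secured network access the organization chooses.

    from collections import defaultdict

    # One record per measure, as a business unit might maintain it locally.
    unit_scorecards = [
        {"unit": "IT",        "perspective": "Financial", "measure": "project ROI (%)",    "target": 12, "actual": 9},
        {"unit": "IT",        "perspective": "Users",     "measure": "satisfaction (1-5)", "target": 4,  "actual": 4.2},
        {"unit": "Marketing", "perspective": "Financial", "measure": "campaign ROI (%)",   "target": 10, "actual": 11},
        {"unit": "Marketing", "perspective": "Users",     "measure": "satisfaction (1-5)", "target": 4,  "actual": 3.6},
    ]

    def roll_up(records):
        """Summarize, per perspective, how many measures across all units meet their targets."""
        summary = defaultdict(lambda: {"met": 0, "total": 0})
        for r in records:
            summary[r["perspective"]]["total"] += 1
            if r["actual"] >= r["target"]:
                summary[r["perspective"]]["met"] += 1
        return dict(summary)

    for perspective, counts in roll_up(unit_scorecards).items():
        print(f"{perspective}: {counts['met']} of {counts['total']} measures on target")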
According to Olve et al. (2003), there are four primary responsi-
bilities that can support balanced scorecards:
1. Business stakeholders : These are typically senior managers
who are responsible for the group that is using the score-
card. These individuals are advocates of using scorecards and
require compliance if deemed necessary. Stakeholders use
scorecards to help them manage the life cycle of a technology
implementation.
2. Scorecard designers : These individuals are responsible for the
“look and feel” of the scorecard as well as its content. To some
extent, the designers set standards for appearance, text, and
terminology. In certain situations, the scorecard designers
have dual roles as project managers. Their use of scorecards
helps them understand how the technology will operate.
3. Information providers : These people collect, measure, and
report on the data in the balanced scorecard. This function
can be implemented with personnel on the business unit level
or from a central services department. Reporting informa-
tion often requires support from IT staff, so it makes sense to
have someone from IT handle this responsibility. Information
providers use the scorecard to perform the measurement of
project performance and the handling of data.
4. Learning pilots : These individuals link the scorecard to organi-
zational learning. This is particularly important when measur-
ing organizational transformation and individual development.
The size and complexity of an organization will ultimately deter-
mine the exact configuration of roles and responsibilities that are
needed to implement balanced scorecards. Perhaps the most appli-
cable variables are:
Competence : Having individuals who are knowledgeable about
the business and its processes, as well as knowledgeable
about IT.
Availability : Individuals must be made available and appropri-
ately accommodated in the budget. Balanced scorecards that
do not have sufficient staffing will fail.
Executive management support: As with most technology proj-
ects, there needs to be a project advocate at the executive level.
Enthusiasm : Implementation of balanced scorecards requires a
certain energy and excitement level from the staff and their
management. This is one of those intangible, yet invaluable,
variables.
Balanced Scorecards and Discourse
In Chapter 4, I discussed the importance of language and discourse
in organizational learning. Balanced scorecards require ongoing dia-
logues that need to occur at various levels and between different com-
munities of practice. Therefore, it is important to integrate language
and discourse and communities of practice theory with balanced
scorecard strategy. The target areas are as follows:
• Developing of strategy maps
• Validating links across balanced scorecard perspectives
• Setting milestones
• Analyzing results
• Evaluating organizational transformation
Figure 6.9 indicates a community of practice relationship that
exists at a company. Each of these three levels was connected by a
concept I called “common threads of communication.” This model can
be extended to include the balanced scorecard.
The first level of discourse occurs at the executive community
of practice. The executive management team needs to agree on the
specific business strategy that will be used as the basis of the mis-
sion statement for the balanced scorecard. This requires conversations
and meetings that engage the CEO, executive board members (when
deemed applicable), and executive managers, like the chief operat-
ing officer (COO), chief financial officer (CFO), chief information
officer (CIO), and so on. Each of these individuals needs to represent
his or her specific area of responsibility and influence from an execu-
tive perspective. The important concept is that the balanced scorecard
mission and strategy should be a shared vision and responsibility for
the executive management team as a whole. To accomplish this task,
the executive team needs to be instructed on how the balanced score-
card operates and on its potential for accomplishing organizational
transformation that leads to strategic performance. Ultimately, the
discourse must lead to a discussion of the four balanced scorecard
perspectives: financial, customer, process, and learning and growth.
From a middle management level, the balanced scorecard allows
for a measurable model to be used as the basis of discourse with
executives. For example, the strategy map can be the vehicle for conducting meaningful conversations on how to transform executive-level thinking and meaning into a more operationally focused strategy. Furthermore, the scorecard outlines the intended outcomes for strategy and organizational learning and transformation.
Figure 6.9 Community of practice "threads." (Three linked levels are shown: an executive community of practice (CEO Americas, executive board, consultants), an operations management community of practice (senior and middle management), and an implementation community of practice (operations), with new ideas and adjustments flowing between the levels as a result of discourse.)
The concept of using the balanced scorecard as a method with
which to balance thinking and meaning across communities of prac-
tice extends to the operational level as well. Indeed, the challenge of
making the transition from thinking and meaning at the executive level to the operations level is complicated, especially since these communities rarely speak the same language. The measurable outcomes section
of the scorecard provides the concrete layer of outcomes that opera-
tions staff tend to embrace. At the same time, this section provides
corresponding strategic impact and organizational changes needed to
satisfy business strategies set by management.
An alternative method of fostering the needed forms of discourse is to
create multiple-tiered balanced scorecards designed to fit the language
of each community of practice, as shown in Figure 6.10. The diagram
in Figure 6.10 shows that each community can maintain its own lan-
guage and methods while establishing “common threads” to foster a
transition of thinking and meaning between it and other communi-
ties. The common threads from this perspective look at communica-
tion at the organizational/group level, as opposed to the individual
level. This relates to my discussion in Chapter 4, which identified
individual methods of improving personal learning and development within the organization. This suggests that each balanced scorecard
must embrace language that is common to any two communities to
establish a working and learning relationship— in fact, this common
language is the relationship.
Knowledge Creation, Culture, and Strategy
Balanced scorecards have been used as a measure of knowledge creation. Knowledge creation, especially in technology, has significant meaning, specifically in the relationship between data and information. Understanding the sequence from data to information is instructive.
We know that organizations, through their utilization of software
applications, inevitably store data in file systems called databases.
The information stored in these databases can be accessed by many
different software applications across the organization. Accessing
multiple databases and integrating them across business units creates
further valuable information. Indeed, the definition of information
is “organized data.” These organized data are usually stored in data
infrastructures called data warehouses or data marts, where the infor-
mation can be queried and reported on to assist managers in their
decision-making processes. We see, in the Ravell balanced scorecard,
that decision-support systems were actually one of the strategic objec-
tives for the process perspective.
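The step from stored data to "organized data" can be illustrated in a few lines of code. The sketch below uses an in-memory SQLite database purely as an example: the table and column names are invented, and an actual decision-support query would run against the organization's own data warehouse or data mart.

    import sqlite3

    # Raw data, as two operational systems might store it separately.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER, region TEXT)")
    conn.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO customers VALUES (?, ?)",
                     [(1, "East"), (2, "West"), (3, "East")])
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [(1, 250.0), (2, 400.0), (3, 125.0), (1, 80.0)])

    # Organizing the data: joining and aggregating across the two sources turns
    # raw records into information a manager can act on (revenue by region).
    report = conn.execute("""
        SELECT c.region, SUM(o.amount) AS revenue
        FROM customers c JOIN orders o ON o.customer_id = c.id
        GROUP BY c.region
    """).fetchall()

    for region, revenue in report:
        print(f"{region}: {revenue:.2f}")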
Figure 6.10 Community of practice "common threads." (Executive-level, management-level, and operational-level balanced scorecards are each linked through common discourse threads to an organization-level balanced scorecard.)
Unfortunately, information does not ensure new knowledge cre-
ation. New knowledge can only be created by individuals who evolve
in their roles and responsibilities. Individuals, by participating in
groups and communities of practice, can foster the creation of new
organizational knowledge. However, to change or evolve one's behav-
ior, there must be individual or organizational transformation. This
means that knowledge is linked to organizational transformation. The
process to institutionalize organizational transformation is dependent
on management interventions at various levels. Management needs to
concentrate on knowledge management and change management and
to act as a catalyst and advocate for the successful implementation of
organizational learning techniques. These techniques are necessary to
address the unique needs of ROD.
Ultimately, the process must be linked to business strategy. ROD
changes the culture of an organization, through the process of cul-
tural assimilation. Thus, there is an ongoing need to reestablish align-
ment between culture and strategy, with culture altered to fit new
strategy, or strategy first, then culture (Pietersen, 2002). We see this
as a recurring theme, particularly from the case studies, that busi-
ness strategy must drive organizational behavior, even when technol-
ogy acts as a dynamic variable. Pietersen identifies what he called six
myths of corporate culture:
1. Corporate culture is vague and mysterious.
2. Corporate culture and strategy are separate and distinct
things.
3. The first step in reducing our company should be defining our
values.
4. Culture cannot be measured or rewarded.
5. Our leaders must communicate what our culture is.
6. Our culture is the one constant that never changes.
Resulting from these myths, Pietersen (2002) establishes four basic
rules of success for creating a starting point for the balance between
culture and strategy:
1. Company values should directly support strategic priorities.
2. They should be described as behaviors.
3. They should be simple and specific.
4. They should be arrived at through a process of enrollment
(motivation).
Once business synergy is created, sustaining the relationship
becomes an ongoing challenge. According to Pietersen (2002), this
must be accomplished by continual alignment, measurement, set-
ting examples, and a reward system for desired behaviors. To lead
change, organizations must create compelling statements of the case
for change, communicate constantly and honestly with their employ-
ees, maximize participation, remove ongoing resistance in the ranks,
and generate some wins. The balanced scorecard system provides the
mechanism to address the culture– strategy relationship while main-
taining an important link to organizational learning and ROD. These
linkages are critical because of the behavior of technology. Sustaining
the relationship between culture and strategy is simply more critical
with technology as the variable of change.
Ultimately, the importance of the balanced scorecard is that it
forces an understanding that everything in an organization is con-
nected to some form of business strategy. Strategy calls for change,
which requires organizational transformation.
7
Virtual Teams and Outsourcing
Introduction
Much has been written and published about virtual teams. Most
define virtual teams as those that are geographically dispersed,
although others state that virtual teams are those that primarily inter-
act electronically. Technology has been the main driver of the growth
of virtual teams. In fact, technology organizations, due mostly to the
advent of competitive outsourcing abroad, have pushed information
technology (IT) teams to learn how to manage across geographical
locations, in such countries as India, China, Brazil, Ireland, and many
others. These countries are not only physically remote but also present
barriers of culture and language. These barriers often impede commu-
nications about project status and affect the likelihood of delivering a project on time and within forecasted budgets.
Despite these major challenges, outsourcing remains attractive due
to the associated cost savings and talent supply. These two advantages
are closely associated. Consider the migration of IT talent that began
with the growth of India in providing cheap and educated talent. The
promise of cost savings caused many IT development departments to
begin using more India-based firms. The ensuing decline in IT jobs
in the United States resulted in fewer students entering IT curricu-
lums at U.S. universities for fear that they would not be able to find
work. Thus began a cycle of lost jobs in the United States and further
demand for talent abroad. Now, technology organizations are faced
with the fact that they must learn to manage virtually because the tal-
ent they need is far away.
From an IT perspective, successful outsourcing depends on effec-
tive use of virtual teams. However, the converse is not true; that is,
virtual teams do not necessarily imply outsourcing. Virtual teams can
be made up of workers anywhere, even those in the United States
who are working from a distance rather than reporting to an office
for work. A growing number of employees in the United States want
more personal flexibility; in response, many companies are allow-
ing employees to work from home more often— and have found the
experience most productive. This type of virtual team management
generally follows a hybrid model, with employees working at home
most of the time but reporting to the office for critical meetings, an arrangement that dramatically helps with communication and allows management to have quality checkpoints.
This chapter addresses virtual teams working both within the
United States and on an outsource basis and provides readers with
an understanding of when and how to consider outsource partners.
Chapter topics include management considerations, dealing with
multiple locations, contract administration, and in-house alternatives.
Most important, this chapter examines organizational learning as a
critical component of success in using virtual teams. Although the
advent of virtual teams creates another level of complexity for design-
ing and maintaining learning organizations, organizational learning
approaches represent a formidable solution to the growing dilemma of
how teams work, especially those that are 100% virtual.
Most failures in virtual management are caused by poor communi-
cation. From an organizational learning perspective, we would define
this as differences in meaning making— stemming mostly from cul-
tural differences in the meaning of words and differing behavioral
norms. There is also no question that time zone differences play a role
in certain malfunctions of teams, but the core issues remain commu-
nication related.
As stated, concerning the Ravell case study, cultural transformation
is slow to occur and often happens in small intervals. In many virtual
team settings, team members may never do more than communicate
via e-mail. As an example, I had a client who was outsourcing produc-
tion in China. One day, they received an e-mail stating, “ We cannot
do business with you.” Of course, the management team was confused
and worried, seeking to understand why the business arrangement
was ending without any formal discussions of the problem. A trans-
lator in China was hired to help clarify the dilemma. As it turned
out, the statement was meant to suggest that the company needed
to provide more business— more work, that is. The way the Chinese
communicated that need was different from the Western interpre-
tation. This is just a small example of what can happen without a
well-thought-out organizational learning scheme. That is, individuals
need to develop more reflective abilities to comprehend the meaning
of words before they take action, especially in virtual environments
across multiple cultures. The development of such abilities— the
continual need for organizations to respond effectively to dynamic
changes, brought about by technology, in this case, e-mail— is consis-
tent with my theory of responsive organizational dynamism (ROD).
The e-mail established a new dynamic of communication. Think how
often specifications and product requirements are changing and need
virtual teams to somehow come together and agree on how to get the
work done— or think they agree.
Prior research and case studies provide tools and procedures as ways
to improve productivity and quality of virtual team operations. While
such processes and methodologies are helpful, they will not necessarily ensure the successful outcomes that IT operations seek unless the people who use them also change. Specifically, new processes alone are not sufficient or a
substitute for learning how to better communicate and make mean-
ing in a virtual context. Individuals must learn how to develop new
behaviors when working virtually. We must also remember that vir-
tual team operations are not limited to IT staffs. Business users often
need to be involved as they would in any project, particularly when
users are needed to validate requirements and test the product.
Status of Virtual Teams
The consensus tells us that virtual teams render results. According to
Bazarova and Walther (2009), “ Virtual groups whose members com-
municate primarily or entirely via email, computer conferencing, chat,
or voice— have become a common feature of twenty-first century
organizations” (p. 252). Lipnack and Stamps (2000) state that virtual
teams will become the accepted way to work and will likely reshape
the work world. While this prediction seems accurate, there has also
been evidence of negative attribution or judgment about problems that
arise in virtual team performance. Thus, it is important to understand
how virtual teams need to be managed and how realistic expectations
of such teams might be formed. So, while organizations understand
the need for virtual teams, they are not necessarily happy with proj-
ect results. Most of the disappointment relates to a lack of the individual development that helps change the mindset of how people need to communicate, coupled with processes that have not been updated.
Management Considerations
Attribution theory “ describes how people typically generate explana-
tions for outcomes and actions— their own and others” (Bazarova &
Walther, 2009, p. 153). This theory explains certain behavior patterns that manifest when dysfunction occurs in managing virtual teams. Virtual teams are especially vulnerable to such
problems because their limited interactions can lead to members not
having accurate information about one another. Members of virtual
teams can easily develop perceptions of each other's motives that are
inaccurate or distorted by differing cultural norms. Research also shows
us that virtual team members typically attribute failures to external factors and successes to internal factors. Problems are blamed on the virtual or outside members for not being available or accountable to the physical community. Such attributions then tend to reinforce the perception that virtual teams are problematic because of their very nature. This establishes the dilemma of virtual teams and organizations: their use will continue to increase and dominate workplace structures, yet they will present challenges to organizations that do not want to change. The reluctance to change will appear substantiated whenever expected outcomes are not met. Some of the failures, however, can and should be attributed to distance. As Olson and Olson (2000) state: "Distance
will persist as an important element of human experience” (p. 172). So,
despite the advent of technology, it is important not to ignore the social
needs that teams must satisfy to be effective.
Dealing with Multiple Locations
Perhaps the greatest difficulty in implementing virtual teams is the
reality that they span multiple locations. More often than not, these locations are in different time zones and encompass multiple cultures. To prop-
erly understand the complexity of interactions, it makes sense to revisit
the organizational learning tools discussed in prior chapters. Perhaps another way of viewing virtual teams and their effect on organizational learning is to perceive them as another dimension, one similar to multiple layers in a spreadsheet. In this view, virtual teams do not upset the prior two-dimensional relationship between technology and the organization; rather, they add depth to how technology affects that relationship along a third dimension. Figure 7.1 reflects how this dimension should be perceived.
In other words, the study of virtual teams should be viewed as
a subset of the study of organizations. When we talk about work-
place activities, we need to address issues at the component level. In
this example, the components are the physical organization and the
virtual organization. The two together make up the superset, or the entire organization. To be fruitful, any discussion of virtual organizations must be grounded in the context of the entire organization and address the complete topic of workplace learning and transformation. In Chapter 4, I discussed organizational learning in communities of practice (COP). In this section, I expand that discussion to include virtual organizational structures.
Figure 7.1 The three-dimensional ROD. (Technology as an independent variable creates total organizational dynamism, shown here with a physical organizational dynamism dimension and a virtual organizational dynamism dimension; strategic integration and cultural assimilation each acquire virtual dimensions, and a virtual acceleration dimension captures the acceleration of events that require different infrastructures and organizational processes.)
The growing use of virtual teams may facilitate the complete inte-
gration of IT and non-IT workers. The ability to connect from various
locations using technology itself has the potential to expand COP.
But, as discussed in Chapter 4, it also presents new challenges, most
of which relate to the transient nature of members, who tend to par-
ticipate on more of a subject or transactional basis, rather than being
permanent members of a group. Table 7.1 reflects some of the key
differences between physical and virtual teams.
There has been much discussion about whether every employee is
suited to perform effectively in a virtual community. The consensus is
that effective virtual team members need to be self-motivated, able to
work independently, and able to communicate clearly and in a posi-
tive way. However, given that many workers lack some or all of these
skills, it seems impractical to declare that workers who do not meet
these criteria should be denied the opportunity to work in virtual
teams. A more productive approach might be to encourage workers to recognize that they must adapt to changing work environments at the risk of becoming marginal in their organizations.
Table 7.1 Operating Differences between Traditional and Virtual Teams
Traditional or physical teams | Virtual teams
Teams tend to have fixed participation and members. | Membership shifts based on topics and needs.
Members tend to be from the same organization. | Team members can include people from outside the organization (clients and collaborators).
Team members are 100% dedicated. | Members are assigned to multiple teams.
Team members are collocated geographically and by organization. | Team members are distributed geographically and by organization.
Teams tend to have a fixed term of membership; that is, start and stop dates. | Teams are reconfigured dynamically and may never terminate.
Teams tend to have one overall manager. | Teams have multiple reporting relationships with different parts of the organization at different times.
Teamwork is physical and practiced in face-to-face interactions. | Teamwork is basically social.
Engagement is often during group events and can often be hierarchical in nature. | Individual engagement is inseparable from empowerment.
To better understand this issue, I extended the COP matrix,
presented in Chapter 4, to include virtual team considerations in
Table 7.2.
Item 7 in Table 7.2 links the study of knowledge management with
COP. Managing knowledge in virtual communities within an orga-
nization has become associated directly with the ability of a firm to
sustain competitive advantage. Indeed, Peddibhotla and Subramani
(2008) state that “ virtual communities are not only recognized as
important contributors to both the development of social networks
among individuals but also towards individual performance and firm
performance” (p. 229). However, technology-enabled facilities and
support, while providing a repository for better documentation, also
create challenges in maintaining such knowledge. The process of how
information might become explicit has also dramatically changed
with the advent of virtual team communications. For example, much
technology-related documentation evolves from bottom-up sources,
rather than the traditional top-down process. In effect, virtual com-
munities share knowledge more on a peer-to-peer basis or through
mutual consensus of the members. As a result, virtual communities
have historically failed to meet expectations, particularly those of
management, because managers tend to be uninvolved in communi-
cation. While physical teams can meet with management more often
before making decisions, virtual teams have no such contact available.
To better understand the complexities of knowledge management and
virtual teams, Sabherwal and Becerra-Fernandez (2005) expand on
Nonaka's (1994) work on knowledge management, which outlined
four modes of knowledge creation: externalization, internalization,
combination, and socialization. Each of these modes is defined and
discussed next.
Externalization
Externalization is the process of converting or translating tacit knowl-
edge (undocumented knowledge) into explicit forms. The problem with
this concept is whether individuals really understand what they know
Table 7.2 Communities of Practice: Virtual Team Extensions
Step | Communities-of-practice step | Technology extension | Virtual extension
1 | Understanding strategic knowledge needs: what knowledge is critical to success. | Understanding how technology affects strategic knowledge and what specific technological knowledge is critical to success. | Understanding how to integrate multiple visions of strategic knowledge and where it can be found across the organization.
2 | Engaging practice domains: where people form communities