
Simple. Obvious. Intuitive. Impossible.

Fred Davis has heard his theories characterized as all of these, sometimes by the same person in the same sentence, by everyone from information technology professionals to CEOs. And within those competing descriptions may lie the reason that so many major technology implementations fail.

Failure rates for new information systems hover near 25 percent, with another 17 percent of projects reporting cost overruns, according to a 1999 report from Software Productivity Research. The report, which tracked 6,700 projects across 500 enterprises, found that failure rates rose as project complexity increased: for highly complex systems, the failure rate was 65 percent, with another 35 percent of projects experiencing cost overruns.

According to Fred Davis, who chairs the information systems department and holds the David D. Glass Chair in the Sam M. Walton College of Business, most software implementations fail because of a discrepancy between specified user requirements and actual user requirements. He points to a 2002 article in Technology Review reporting that in many instances, the purpose of new software is not spelled out before programmers begin writing it.

This lack of clarity about user requirements leads to expensive and difficult changes later in the development cycle, causing project delays, cost overruns and failure to gain user acceptance after implementation. Since this is where the bulk of the costs arise, Davis reasoned that it would be valuable to focus on identifying, correcting and preventing these errors.

Davis is best known for development of the Technology Acceptance Model (TAM) and its extension, TAM2, which can predict whether a company will be successful in implementing a new computer technology. However, his research interests also include software development practices, computer training and skills acquisition, computer-assisted decision-making and the management of emerging technologies.

The Art of Technology

Davis traces his unique perspective on technology to muralist Diego Rivera. One of Mexico’s most famous artists, Rivera came to the United States in 1930 and painted a series of murals in San Francisco and New York, including the controversial Man at the Crossroads, which was commissioned for Rockefeller Center and later destroyed on the Rockefellers’ orders.

However, what many have called Rivera’s greatest mural, Detroit Industry, was painted on the walls of the courtyard at the Detroit Institute of Arts (DIA). Spanning 27 panels, it depicts workers at the Ford and Chrysler assembly plants along with allusions to science and technology and the chemical and pharmaceutical industries.

As a young engineering student at Wayne State University, Davis worked on a research project in the health care industry for a professor. He often took books and papers to read in the courtyard of the DIA, which adjoined the university campus.

“I studied Rivera’s murals for hours and drew inspiration from them,” said Davis. “I decided that I wanted to learn how to harness the power of technology to make good things happen.”

In 1979, that research project also gave Davis his first experience with a failed technology implementation. His part in the project was to write a complex program to schedule employees in a large hospital. Although his software worked exactly as specified, the client requested a number of rewrites and modifications. However, the client’s employees never accepted the new procedures and the implementation was unsuccessful.

After earning an engineering degree from Wayne State, Davis entered the Sloan School of Management at the Massachusetts Institute of Technology. While working on his Ph.D., which would evolve into TAM, he first encountered the resistance he learned to associate with challenging “sacred assumptions.”

“A major technology company was going to let me conduct my research there. They sent a helicopter for me and flew me to their headquarters for a meeting with their executives,” Davis said. “They explained their project, and I pointed out why it could not succeed. By the looks on their faces, I knew that some of them realized the problem. But the software was written, and the client was already struggling with implementation.”

The software, an expert system to improve specification of the myriad parts of a major computer system, was designed to reduce costs, improve delivery and installation, enhance plant and warehouse efficiency and increase customer satisfaction. However, it provided no benefit to the sales force, which was expected to use the system.

“There was no reward at all for the sales people — no incentive to spend time learning the system or using it,” said Davis. “In fact, they looked at it as a negative. It did not improve their jobs, but it took away time they could be using to make money for the company. It was not useful to them.”

Because the company discontinued his project, Davis switched his research to technology implementation in the hospitality industry. The fact that he could easily switch to another failed implementation in a different industry showed Davis how pervasive the problem of technology acceptance is across all industries.

Such skepticism no longer surprises Davis. He encountered it often while developing TAM in the early 1980s. Now TAM is considered a foundation for hundreds of research projects.

When Good Systems Go Bad

Technology development includes two types of design features — interface features and functional features. Interface features are directly associated with ease of use, and are relatively easy to modify. Functional features, on the other hand, relate to usefulness and are drawn from what the company says it requires, which makes them difficult to change.

In most software development cycles, users first see a system after it has been developed and a working prototype created. Drawing on his research into usefulness and ease of use, Davis developed a model for pre-prototype user acceptance testing that would allow developers to determine if a product would be useful before the code was written.

Dubbed the “crystal ball,” his expanded model can predict whether a company will succeed in implementing a new computer technology, such as the introduction of proprietary software, before a single line of code is written. In other words, it allows managers to avoid the costly development and implementation of a product that will ultimately fail.

“This approach challenges the received wisdom that software systems can only be evaluated from a ‘hands-on’ perspective. While this is true for ease of use, it is not true for usefulness,” said Davis. “Most people think that if it is true for one, it must be true for the other. But our research has shown that this is not the case. And no amount of ease of use will compensate for a lack of usefulness.”
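To make the asymmetry concrete, the sketch below scores a system with a simple weighted sum of the two perceptions. The 1-to-7 scales and the weights are illustrative assumptions for demonstration only, not coefficients from Davis’s published model; the point is that when usefulness carries most of the weight, no ease-of-use rating can rescue a system users do not find useful.

```python
# A minimal sketch of a TAM-style acceptance score. The 1-7 Likert
# scales and the weights are illustrative assumptions, not values
# from Davis's published model.

def intention_score(usefulness: float, ease_of_use: float) -> float:
    """Combine perceived usefulness and perceived ease of use into a
    single score, weighting usefulness more heavily to reflect the
    finding that it is the stronger determinant of acceptance."""
    W_USEFUL, W_EASY = 0.8, 0.2  # assumed weights; usefulness dominates
    return W_USEFUL * usefulness + W_EASY * ease_of_use

# Even a near-perfect ease-of-use rating cannot compensate for
# low usefulness:
print(intention_score(usefulness=6.0, ease_of_use=2.0))  # 5.2
print(intention_score(usefulness=2.0, ease_of_use=7.0))  # 3.0
```

Under any weighting like this, a clunky but useful system outscores a polished but useless one, which is exactly the pattern Davis describes.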

The Crystal Ball

In the process of extending his TAM, Fred Davis encountered a conundrum. “Corporate America is littered with failed information technology (IT) projects,” said Davis. “Most IT managers and business leaders have horror stories of software implementations that failed after huge investments of time and money because they were not adopted by the critical users.”

The ability to prevent this enormous waste of time and resources should be beneficial to decision-makers. And when Davis presents his model to information technology managers, business leaders, faculty members and consultants, they say it is so simple that it is obvious. But within minutes, they will also insist that it is impossible.

Davis, it seems, has discovered that usability, the core element essential to technology adoption, is a multistable perception. For most people, a multistable perception is an interesting mental game; in information technology, it may cost businesses billions of dollars.

“It is like the drawing you see in textbooks. One way you look at it, it seems to be a vase, but if you shift your perception, it seems to be two faces,” explained Davis. “You can see both versions, but not at the same time. And when you try to see both, they both disappear and you are looking at a meaningless drawing.”

Introduced by Danish psychologist Edgar Rubin in 1915, the vase/face drawing is probably the most famous example of multistable perception. It represents a puzzle that has challenged psychologists for more than a century: something can take multiple, distinct, but mutually inconsistent forms. Most observers shift back and forth between the forms but cannot hold awareness of both at the same time.

“Usability seems to be like that,” Davis added. “In IT we have subsumed two constructs — usefulness and ease of use — into the concept of usability, which seems to be multistable. Usability is the most critical factor in user acceptance of technology, but major IT implementations still fail because only one component of usability — ease of use — is addressed.”

What Do Users Want?

Davis has spent his career studying why users adopt — or fail to adopt — technologies. Introduced in 1986, his TAM has become central to most discussions of user acceptance. TAM showed that perceived usefulness and ease of use were the only two factors that consistently influenced user acceptance.

During the past decade, many studies have shown that, while both are necessary, perceived usefulness is the strongest determinant of user acceptance. If a technology is useful, the user will tolerate some difficulty to use it. But if the user doesn’t find the technology useful for his or her particular job, the implementation is likely to fail.

Because of its central role in employees’ intention to use a computer technology, Davis worked with Viswanath Venkatesh of the University of Maryland to study perceived usefulness in detail. Their research examined social influences (voluntariness, image and the opinions of people important to the employee) and cognitive influences (the employee’s own judgment of the new technology’s usefulness in his or her job).

The researchers tracked the implementations of new computer technologies at four manufacturing and financial services companies over time. At two companies, use of the technology was voluntary; at the other two, it was mandatory.

In studying employee opinions about the usefulness of the new technology, Davis and Venkatesh focused on job relevance, output quality, ease of use and tangible results. They found that employees are more positive toward a new technology if they think that it is directly useful to their job and performs the tasks well.

Although perceived relevance and quality are closely related, employees use them differently. Job relevance eliminates from consideration systems judged less relevant; output quality selects one system from among the remaining options.
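Read as a decision procedure, this is a two-stage screen: relevance filters the field, then quality picks the winner. The sketch below illustrates the idea; the system names, scores and cutoff are all hypothetical examples, not data from the study.

```python
# Sketch of the two-stage judgment described above. The candidate
# systems, their scores and the relevance cutoff are hypothetical.

candidates = [
    {"name": "System A", "job_relevance": 6.1, "output_quality": 4.9},
    {"name": "System B", "job_relevance": 2.3, "output_quality": 6.8},
    {"name": "System C", "job_relevance": 5.6, "output_quality": 5.7},
]

RELEVANCE_CUTOFF = 4.0  # assumed threshold on a 1-7 scale

# Stage 1: job relevance eliminates systems from consideration.
relevant = [c for c in candidates if c["job_relevance"] >= RELEVANCE_CUTOFF]

# Stage 2: output quality selects among the remaining options.
choice = max(relevant, key=lambda c: c["output_quality"])
print(choice["name"])  # System C
```

Note that System B’s high output quality never enters the choice: once a system is screened out on relevance, quality cannot bring it back.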

“Even effective systems can fail to get user acceptance because of lack of result demonstrability,” explained Davis. “If users can’t tie gains in their job performance directly to the use of the new system, they are unlikely to think of it as useful.”

Prior to the introduction of the technology, the opinion of other people significant to the employee had a great deal of influence, Davis found, but only if the implementation was mandatory. Even then, the importance of this element diminished over time as the employee used the new technology.

“Organizational mandates don’t always have a positive effect on technology use,” notes Davis. “Even when usage is mandatory, usage intentions vary because some users are unwilling to comply with such mandates.”

Managerial Mandates

It seems intuitive that once a software product is developed, managers can merely order employees to use it. However, extensive research and business experiences demonstrate that this is not the case. No matter what the incentives or disincentives, managerial mandates are insufficient to ensure adoption. In fact, Davis’ research showed that managerial mandates are the least effective means to achieve compliance in a technical environment.

To determine what factors influenced employee intentions to adopt a new methodology for developing software, Davis and colleagues Bill Hardgrave, director of the Information Technology Research Center, and Cynthia Riemenschneider, assistant professor of information systems, conducted a study of 128 developers at a Fortune 1000 firm implementing a methodology custom-created for the company’s internal use. Implementing development methodologies has been called one of the most serious areas of concern in IT.

“We expected that making the usage mandatory would increase employees’ intention to comply, but it didn’t,” said Hardgrave. “In fact, of all of the parameters we evaluated, managerial mandate was the least likely to induce compliance.”

Methodologies are comprehensive systems that standardize the steps in the development process. Although they can provide increased productivity and profitability, development methodologies are used in only about half of all companies that develop software because the systems are difficult to deploy.

“Many organizations are trying to improve their software development by implementing methodologies,” explained Davis. “But this usually represents a substantial change from their previous practices. Developer resistance can prevent the company from fully deploying or realizing the benefits of the methodology.”

Overcoming this resistance requires knowledge of the factors that make an employee intend to use the methodology. According to Hardgrave, the new methodology represented a radical change for the developers, who moved from an environment with no prescribed processes in place to an environment guided by an organization-wide methodology.

The developers were introduced to the methodology in a presentation, trained on it for six weeks and given written and online instructions. At the end of the training period, they were instructed in writing to begin using the new methodology. After 12 weeks they were given a questionnaire to assess their intention to use it on the basis of five common determinants: usefulness, compatibility with their existing practices, social pressure, complexity of the new methodology and organizational mandate.

“As we expected, complexity had little impact on intentions,” Hardgrave added. “But neither did organizational mandate. The greatest influence was usefulness, followed by compatibility and social pressure.”
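The kind of analysis behind such a finding can be illustrated with an ordinary least-squares regression of stated intention on the five determinants. The sketch below uses fabricated questionnaire responses, not the study’s data; in the actual study, the usefulness coefficient dominated while complexity and mandate had little weight.

```python
# Illustrative regression of usage intention on the five determinants.
# All responses below are fabricated placeholders, not the study's data.
import numpy as np

# Columns: usefulness, compatibility, social pressure, complexity, mandate
X = np.array([
    [6.0, 5.0, 4.0, 3.0, 6.5],
    [3.0, 4.0, 5.0, 5.0, 7.0],
    [5.5, 5.5, 3.5, 2.5, 6.0],
    [2.0, 3.0, 4.5, 6.0, 6.5],
    [6.5, 4.5, 5.0, 3.5, 5.5],
    [4.0, 3.5, 2.5, 4.0, 6.0],
    [5.0, 6.0, 5.5, 3.0, 7.0],
    [3.5, 2.5, 3.0, 5.5, 5.0],
])
y = np.array([6.2, 3.5, 5.8, 2.4, 6.6, 4.0, 5.9, 3.1])  # stated intention

# Fit intention = b0 + b1*usefulness + ... by least squares; larger
# coefficients mark determinants that carry more predictive weight.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
for name, b in zip(["intercept", "usefulness", "compatibility",
                    "social pressure", "complexity", "mandate"], coef):
    print(f"{name:>15}: {b:+.2f}")
```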

According to Davis, these findings point to training strategies that could improve acceptance of new methodologies. Since perceived usefulness is crucial, demonstrating the individual productivity benefits of the methodology could help address this issue. Managers also might improve adoption by explicitly demonstrating how the new methodology is compatible with existing work practices and by designing a migration path that introduces parts of the methodology incrementally rather than in a single step.

Training

The importance of training in technology acceptance has led Davis to explore the ways in which training strategies impact skill acquisition. His recent work focuses on the role of symbolic mental rehearsal (SMR) as an integral component of a training program.

“Insufficient computer skills are a key reason why organizational investments in information technology so often fail to deliver the desired productivity gains,” Davis said. “Improvements in computer skill training represent a key driver of ongoing productivity improvements.”

Behavior modeling is a common approach to computer skill training. In this approach, trainees watch a model demonstrate computer skills and then repeat the demonstrated behaviors. This approach has proven to be more effective than approaches like computer-aided instruction, lectures or self-study. Davis speculated that the addition of SMR to traditional behavior modeling could significantly enhance learning.

“SMR is a specific form of mental rehearsal that establishes a cognitive link between visual images and symbolic memory codes,” explained Davis. “In essence, trainees imagine themselves performing behaviors that they saw performed by the trainer.”

Davis conducted experiments to determine if the addition of SMR would facilitate the development of knowledge structures. Knowledge structures are the rules and strategies, the “scaffolding,” that guide the construction of complex behaviors like using a new software system. His research showed that SMR does more than improve knowledge and task performance.

“The research also showed why SMR has this positive effect on training outcomes,” Davis said. “It showed that changes to relevant knowledge structures are a key mediational process by which SMR produces training improvements.”

New Worlds to Conquer

Davis constructed his unique approach to understanding user acceptance of technology by drawing on theories from psychology, sociology, education and many other fields. He has demonstrated that focusing on the key elements of TAM can give employers a “crystal ball” that can be used to determine if a technology implementation will succeed before a single line of code is written. And he has demonstrated that the addition of SMR to traditional behavior modeling can significantly enhance computer skills training and, consequently, productivity.

“Although hundreds of projects and articles have focused on TAM, it certainly has not been exhausted as a research field. For example, none of the recommended practices have incorporated a user acceptance model,” Davis said. “But I have become interested in exploring self-regulated learning strategies. I want to know if you can train people to be self-directed learners.”
