“This isn’t what I need!” objected the admissions officer at Northwest Regional Hospital.

Judy sighed. “But this is the software you asked us to create for you.”

“I don’t care what I said at the time. This system won’t work for us the way you have it set up. You’ll have to fix it.”

“But any fixes are going to set this project back at least four months,” Judy warned. “Why don’t you work with it for a while and get used to the features? I’m sure you’ll find that it works fine.”

Judy’s attempt at reassurance only set off an even more negative response from the admissions officer: “Look, we needed the registration screens in a different format. I can’t read this one. And on top of that, it’s missing the insurance check function.”

“But you didn’t ask for any of those features last April when we developed the specifications for the system.”

“At the time, I didn’t know they were available. Since then, we’ve gotten new information and some new federal regulations. You’ll have to make big changes before I can authorize our staff to switch over to this system.”

As Judy reflected on this conversation later, she realized it had become a recurring problem at the hospital. As head of the IT department, Judy was responsible for upgrading the hospital’s software and adding new reporting and information system functions on an ongoing basis. It seemed as if every new effort was met by clients with initial enthusiasm and high expectations. After the preliminary scope meetings, the members of the IT group would head back to the department and work over several months to create a prototype so their clients could see the system in use, try it out, and realize its value. Unfortunately, more often than not, that sequence just didn’t happen. By the time the programmers and system developers had finished the project and presented it to the customer, the hospital staff had forgotten what they asked for, didn’t like what they received, or had a new list of “critical features” the IT representative had to include immediately.

Later, at the lunch table, Judy related the latest demonstration-and-rejection meeting to some of her colleagues from the IT group. To a person, they were not surprised. Tom, her second-in-command, shrugged. “It happens all the time. When was the last time you had a department act happy with what we created for them? Look on the bright side: it’s steady work!”

Judy shook her head. “No, there’s got to be something wrong with our processes. This shouldn’t keep happening. Think about it. What’s the average length of one of our software upgrade projects? Five or six months?”

Tom thought a moment. “Yes, something like that.”

“OK,” Judy continued, “during a typical development cycle, how often do we interact with the client?”

“As little as possible! You know that the more we talk to them, the more changes they ask for. It’s better to just lock the specs in up front and get working. Anything else leads to delays.”

Judy objected, “Does it really delay things that much, especially when the alternative is to keep developing systems that no one wants to use because they’re ‘not what they asked for’?”

Tom thought about this and then looked at Judy. “Maybe this is a no-win situation. If we ask them for input, we’ll never hear the end of it. If we create a system for them, they don’t like it. What’s the alternative?”
Questions
1. Why does the classic waterfall project planning model fail in this situation? What is it about the IT department’s processes that leads to their finished systems being constantly rejected?
2. How would an Agile methodology correct some of these problems? What new development cycle would you propose?
3. Why are “user stories” and system “features” critical components of an effective IT software development process?
4. Using the terms “Scrum,” “Sprint,” and “User stories,” create an alternative development cycle for a hypothetical software development process at Northwest Regional Hospital.