 Buggy Software: Up From a Low-Quality Quagmire

Posted by raj_mmm9 -- Fri 11 Apr - 15:33

By using application life-cycle management, some companies are trying to exterminate software bugs and reduce the costs they incur.

The horror stories have become all too familiar:
* In April, a software glitch resulted in the loss of thousands of dollars for US Airways Group Inc. when some tickets were mistakenly priced at $1.86.
* In the latest U.S. presidential election, reports of incorrect tallies surfaced in several districts that were using new computerized voting machines.
* A software bug apparently caused the largest power outage in North America, the Northeast blackout of August 2003, which threw millions of people into darkness.


The list could go on and on. And the problem, it seems, is only getting worse. According to one oft-quoted number from the National Institute of Standards and Technology, flawed software cost the U.S. economy $60 billion in 2002. No one doubts that the number is even higher today.

Bad software plagues nearly every organization that uses computers, causing lost work hours during computer downtime, lost or corrupted data, missed sales opportunities, high IT support and maintenance costs, and low customer satisfaction. In frustration, CIOs are taking a hard look at how bugs get into the application development process and why they seem to be so hard to prevent. The consensus: It's not one specific failure but a series of disconnects and miscommunications among the IT specialists involved in the planning, development, testing and maintenance of each application.

The problem, say those who study bad software, is a failure to manage the life cycle of software and recognize that any effort to improve software quality must span all of the stages of the application's life, from initial planning to postdeployment and maintenance. Berkshire Life Insurance Company of America, a subsidiary of The Guardian Life Insurance Company of America in New York, has been examining ways to improve quality throughout the application life cycle.

"In the past year, we have looked at our development process, at our requirements-gathering methodology and at the way we monitor systems," explains Sorin Fiscu, project manager and IT rapid application development team leader at Berkshire Life.

Fiscu's team has implemented changes such as involving the quality assurance (QA) staff in the early planning stages, soliciting input from business analysts and automating more of the testing phase. These changes have enabled the company to meet or exceed two of its goals for postdeployment: application availability and overall user satisfaction with the application.

One of the first steps in the development of an application at Berkshire Life is bringing business users and IT together to agree upon the functional specifications of the application, listing every feature and function that the business users need, from the flow of screens to the names of data fields.

"It's a very detailed picture of the application and how it will be used," says Fiscu. "The key is to get everybody talking upfront. Testers, analysts and developers need to communicate as much as possible."

The basic goals of application life-cycle management (ALM) are fairly straightforward. They include ensuring adequate communication between the teams responsible for each stage and preventing errors from progressing through the cycle, since it costs more to fix errors later in the development process than at the beginning.

"The life cycle may appear obvious, but most organizations -- close to 90% -- do not know how to effectively manage it," asserts Theresa Lanowitz, an analyst at Gartner Inc. "If the life cycle were truly embraced with the right people, processes and technologies, we would see better-quality software and more efficient and effective IT organizations. As it is, most IT organizations waste quite a bit of their budget because they have bad business practices, fail to deliver on requirements and fail to manage projects to meet schedule, cost and quality goals."

Quality From the Start
Establishing clear communication channels among developers, testers and the business users is critical to successful life-cycle management. This needs to be made part of the process during the planning stage.

At Staples Inc., the emphasis is on collaboration among everyone involved in the application's development, testing and use, according to Kathy Murray, senior manager of quality management at the Framingham, Mass.-based office products retailer.

"We meet with our business partners to discuss the business requirements, with QA there as well so they understand the requirements," she says. "The more time we spend in the definition phase, the better later phases go. There are studies that say 60% to 70% of bugs are introduced during the definition stage, and we find that to be true."

Poor requirements are the root of most QA problems, says Arthur Povlot, an Atlanta-based business development manager at Tescom Software Systems Testing Ltd., a provider of QA services. "Very seldom do companies implement quality 'gates' at the requirements stage. For instance, you should have the requirements audited and signed off on by the people involved -- business analysts, marketing managers, subject matter experts, etc.," he says.

Programmers tend to do things their own way. And though it's probably counterproductive to bog developers down with red tape, it's still a good idea to implement some processes and procedures for consistency and quality control.

Fiscu highly recommends requiring developers to perform specific QA tests on their code before handing it off to the QA staff. "Our development team receives a set of unit test scripts, like a high-level checklist. Development is done only when the checklist is done," he says. "This way, we make sure we don't push high-level defects from development into the test environment."
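A developer-side gate of the kind Fiscu describes can be sketched as a short checklist of unit checks that must pass before code reaches QA. The premium function and its checks below are hypothetical stand-ins, not Berkshire Life's actual scripts:

```python
# Hypothetical code under test: handed to QA only once every
# item on the pre-handoff checklist passes.
def monthly_premium(age, base_rate=25.0):
    """Return a monthly premium; rejects out-of-range ages."""
    if not 18 <= age <= 99:
        raise ValueError("age out of range")
    return round(base_rate * (1 + (age - 18) * 0.02), 2)

def run_pre_handoff_checklist():
    """High-level unit checks a developer runs before QA handoff."""
    assert monthly_premium(40) == 36.0    # typical input
    assert monthly_premium(18) == 25.0    # lower boundary
    assert monthly_premium(99) == 65.5    # upper boundary
    try:
        monthly_premium(17)               # invalid input must be rejected
    except ValueError:
        pass
    else:
        raise AssertionError("invalid age accepted")
    return "checklist passed"

print(run_pre_handoff_checklist())
```

The point is less the individual assertions than the policy: development is "done" only when the whole checklist runs clean.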

Another common difficulty in development that breeds software errors is keeping track of changes and versions. Configuration management and change management policies and tools help enforce a standard process for creating and testing code.

American Greetings Corp. in Cleveland, for instance, relies on AllFusion Change Manager from Computer Associates International Inc. to track changes to its code throughout the development process and enforce company standards for development.

"Someone can't decide to use a different compiler, for instance, or skip a test, because it's all built into the process" in AllFusion, says Tom Brown, software manager at American Greetings. "To manage the life cycle means to keep the source code current and consistent with the processes and compilers that we use."

Testing and More Testing
While developers should do some early testing as they go, a full-blown testing process is crucial to finding and fixing bugs. After developers pass off the code, it should be subjected to a variety of thorough checks, including functional testing to evaluate the flow and functional correctness of the program, integration testing, performance testing, security testing, and regression testing of updates and changes to a program.

The Chicago Board of Trade performs a number of manual and automated tests on applications, including unit testing by developers, performance testing using QACenter from Compuware Corp. and user-acceptance testing or functional testing by the traders and brokers who will use the software. CBOT also tests with an eye toward growth and heavier traffic in the future.

"We are proactive, not reactive, so we test for future loads the systems may experience," says David Burkhart, director of quality assurance at CBOT.

Because of limits on time, technology and human capabilities, even the most sensitive, mission-critical systems can't be tested to 100% assurance. The question becomes one of how many tests to run and how much time to spend. Povlot advises creating test cases for 100% of the application's most critical requirements. (Test cases are lists of the inputs and expected responses needed to test a particular feature.) Overall, he says, you should be testing 90% of all requirements.
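Povlot's definition of a test case -- inputs paired with expected responses -- maps naturally onto a data-driven table, one row per requirement being verified. The shipping-rate function here is a made-up example, not code from any company quoted above:

```python
# Hypothetical function under test.
def shipping_cost(weight_kg, express=False):
    """Flat base fee plus a per-kilogram rate; express doubles the total."""
    base = 5.0 + 1.5 * weight_kg
    return round(base * (2.0 if express else 1.0), 2)

# Each test case: the inputs plus the expected response,
# one row per critical requirement.
TEST_CASES = [
    ({"weight_kg": 1.0, "express": False}, 6.5),   # standard shipping
    ({"weight_kg": 1.0, "express": True}, 13.0),   # express doubles cost
    ({"weight_kg": 0.0, "express": False}, 5.0),   # base fee only
]

def run_cases(fn, cases):
    """Return the list of cases whose actual output differs from expected."""
    failures = []
    for inputs, expected in cases:
        actual = fn(**inputs)
        if actual != expected:
            failures.append((inputs, expected, actual))
    return failures

assert run_cases(shipping_cost, TEST_CASES) == []
```

Keeping cases as data rather than code makes it easy to count coverage against the requirements list -- which is exactly what a "test 100% of critical requirements" policy asks you to measure.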

Automated tools can help speed test planning and execution, especially for regression testing. "We've decreased our test cycle man-hours by 50%, enabling us to increase test coverage by 300%," says Murray, who credits the improvement to Staples' use of SilkTest by Segue Software Inc. and StarTest from Star Quality in Hopkinton, Mass.

Berkshire Life Insurance uses Empirix Inc.'s e-Test Suite to manage the testing process and speed regression testing. "The more enhancements we added, the more time the regression phase of testing took. Now automation frees up resources and also ensures consistency."
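Automated regression testing of the kind described above amounts to replaying a recorded suite after every change and flagging any behavior that drifted from a known-good baseline. This generic runner is a sketch of the idea, not the commercial tools named in the article; the discount function is hypothetical:

```python
# Hypothetical function whose behavior we want to keep stable.
def discount(price, loyalty_years):
    """Apply 5% off per loyalty year, capped at 25%."""
    rate = min(0.05 * loyalty_years, 0.25)
    return round(price * (1 - rate), 2)

# Baseline captured from a known-good release: (args, expected output).
BASELINE = [
    ((100.0, 1), 95.0),
    ((100.0, 5), 75.0),
    ((100.0, 10), 75.0),  # discount capped at 25%
]

def regression_report(fn, baseline):
    """Replay the baseline; report any case whose output changed."""
    regressions = [
        (args, expected, fn(*args))
        for args, expected in baseline
        if fn(*args) != expected
    ]
    return "PASS" if not regressions else f"FAIL: {regressions}"

print(regression_report(discount, BASELINE))
```

Because the suite is mechanical, it can run on every build -- which is why teams like Staples' report large cuts in test-cycle hours alongside increased coverage.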