

“Fitting” Performance into the Software Development Lifecycle

05.13.2012

This is a short excerpt from Chapter 1 of Pro .NET Performance, scheduled to appear in August 2012. I might be publishing a few more of these before and after the book is out. We have an Amazon page and a cover image now!

Where do you fit performance in the software development lifecycle? This innocent question carries the mental baggage of having to retrofit performance into an existing process. Although that is possible, a healthier approach is to treat every step of the development lifecycle as an opportunity to better understand the application's performance: first, the performance goals and the important metrics; next, whether the application meets or exceeds those goals; and finally, whether maintenance, user loads, and requirement changes introduce any regressions.

During the requirements gathering phase, you should start thinking about the performance goals you would like to set.

During the architecture phase, you should refine the performance metrics that are important for your application and define concrete performance goals. For example, a concrete goal might read: "the product search page must return results within 300 milliseconds, at the 95th percentile, under a load of 1,000 concurrent users."

During the development phase, you should frequently perform exploratory performance testing on prototype code or partially complete features to verify that you are well within the system’s performance goals.
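
Even a crude micro-benchmark can give you an early read on a prototype's performance. The following sketch times repeated invocations with System.Diagnostics.Stopwatch; ProcessOrders is a hypothetical placeholder for whatever feature you are prototyping:

using System;
using System.Diagnostics;

class ExploratoryBenchmark
{
    static void Main()
    {
        const int iterations = 1000;

        // Warm-up run so that JIT compilation is not included in the measurement.
        ProcessOrders();

        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            ProcessOrders();
        }
        sw.Stop();

        Console.WriteLine("Average: {0:F3} ms per call",
            sw.Elapsed.TotalMilliseconds / iterations);
    }

    // Hypothetical placeholder for the prototype feature being measured.
    static void ProcessOrders()
    {
        // ... prototype code under test ...
    }
}

Numbers from a sketch like this are rough, but they are usually enough to tell whether a prototype is within an order of magnitude of its goal.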

During the testing phase, you should perform significant load testing and performance testing to fully validate your system's performance goals.
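
Dedicated load testing tools are usually the right choice at this stage, but the essence of a load test fits in a few lines. In the sketch below, HandleRequest is a hypothetical stand-in for a single unit of user work; the driver applies concurrent load and reports throughput:

using System;
using System.Diagnostics;
using System.Threading.Tasks;

class LoadTestDriver
{
    static void Main()
    {
        const int concurrentClients = 50;
        const int requestsPerClient = 100;

        Stopwatch sw = Stopwatch.StartNew();
        Task[] clients = new Task[concurrentClients];
        for (int c = 0; c < concurrentClients; c++)
        {
            clients[c] = Task.Run(() =>
            {
                // Each simulated client issues its requests sequentially.
                for (int i = 0; i < requestsPerClient; i++)
                {
                    HandleRequest();
                }
            });
        }
        Task.WaitAll(clients);
        sw.Stop();

        double totalRequests = concurrentClients * requestsPerClient;
        Console.WriteLine("Throughput: {0:F1} requests/second",
            totalRequests / sw.Elapsed.TotalSeconds);
    }

    // Hypothetical placeholder for a single simulated user operation.
    static void HandleRequest()
    {
        // ... call into the system under test ...
    }
}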

During subsequent development and maintenance, you should perform additional load testing and performance testing with every release (and preferably on a daily or weekly basis) to quickly identify any performance regressions introduced into the system.
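
One inexpensive way to automate such regression checks is to fold a timing assertion into your regular test suite. The following sketch assumes NUnit as the test framework; the 200-millisecond budget and the ProcessOrders method are hypothetical stand-ins for your own goals and code paths:

using System.Diagnostics;
using NUnit.Framework;

[TestFixture]
public class PerformanceRegressionTests
{
    [Test]
    public void ProcessOrders_StaysWithinTimeBudget()
    {
        const int iterations = 100;
        const double budgetMilliseconds = 200.0; // hypothetical budget from your goals

        // Warm-up run so that JIT compilation does not skew the result.
        ProcessOrders();

        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            ProcessOrders();
        }
        sw.Stop();

        double averageMs = sw.Elapsed.TotalMilliseconds / iterations;
        Assert.That(averageMs, Is.LessThan(budgetMilliseconds),
            "Average call time exceeded the performance budget: possible regression.");
    }

    // Hypothetical placeholder for the production code path under test.
    private static void ProcessOrders()
    {
        // ... production code path ...
    }
}

Run on every check-in or nightly build, a test like this flags a regression close to the change that introduced it, when it is still cheap to fix.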

Developing a suite of automated load tests and performance tests, setting up an isolated lab environment in which to run them, and carefully analyzing their results to make sure no regressions are introduced takes considerable time. Nevertheless, the benefits of systematically measuring and improving performance, and of making sure regressions don't creep slowly into the system, are worth the initial investment in a robust performance development process.

 
