Most developers have a sense that using 99%+ of memory and CPU cycles is too full. But it is much less clear where to draw the line. Is 95% too full? How about 90%?
As it turns out, there is very little guidance in this area. But performing a what-if analysis with some classic software cost data leads to a rather startling conclusion:
If your memory or CPU is more than 80% full
and you are making fewer than 1 million units
then you should get more memory and a faster CPU.
This may seem too conservative. But between about 60% full and 80% full, software gradually becomes more difficult to develop. In part that is because a lot of optimizations have to be added, and in part because there is limited room left for run-time monitoring and debug support. You have to crunch the data to see the curves (chapter 18 of my book has the details), but this is where you end up.
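To make the what-if analysis concrete, here is a minimal sketch of the break-even arithmetic in Python. Every number in it is an illustrative assumption, not data from chapter 18: classic cost models such as COCOMO attach effort multipliers to tight time and storage constraints, and the 1.4 multiplier below is a made-up stand-in for that kind of penalty.

```python
# Minimal break-even sketch. All numbers are illustrative assumptions,
# not the actual data behind the 80% rule of thumb.

BASE_DEV_COST = 1_000_000      # assumed baseline development cost, $
EFFORT_MULTIPLIER = 1.4        # assumed dev-cost penalty at ~90% full
EXTRA_HW_COST_PER_UNIT = 0.50  # assumed cost of the next-size-up part, $

# Extra development cost paid once for squeezing into the smaller part.
extra_dev_cost = BASE_DEV_COST * (EFFORT_MULTIPLIER - 1.0)

# Volume at which the per-unit hardware savings pay back that one-time cost.
break_even_units = extra_dev_cost / EXTRA_HW_COST_PER_UNIT
print(f"Break-even volume: {break_even_units:,.0f} units")  # 800,000 units
```

Below the break-even volume, the one-time development penalty dominates and buying more memory and a faster CPU is cheaper; above it, squeezing into the smaller part can start to pay off. With these made-up numbers the crossover lands near the million-unit scale the rule of thumb cites.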
This is a rule of thumb rather than an absolute cutoff. 81% isn't much different from 80%, and neither is 79%. But by the time you are getting up to 90% full, you are spending dramatically more on the overall product than you really should be. Chapter 18 of my book discusses the true cost of nearly full resources and explains where these numbers come from.