I read this piece about "code rot" this morning. Curious.
Every Windows PC I have ever owned has seemed to get slower over time - and that includes servers. Sometimes defragging the disk or rebooting speeds it up a bit - but mostly not. People presumably think this is just how computers are supposed to be.
Well, it isn't.
Apple kit doesn't seem to have this problem, and neither, apparently, does Linux. I recall that back in the 90s our trusty NetWare server ran for months on end without a reboot or any loss of performance.
So is it a real effect or are we all imagining it? What's your experience?