I read this piece about "code rot" this morning. Curious.
Every Windows PC I have ever owned has seemed to get slower as time goes by - and that includes servers. Sometimes defragging the disk and/or a reboot speeds it up a bit - but mostly not. People presumably think this is how computers are supposed to be.
Well, it isn't.
Apple kit doesn't seem to have this problem, and neither, apparently, does Linux. I recall back in the 90s our trusty NetWare server went for months on end without a reboot or any loss of performance.
So is it a real effect or are we all imagining it? What's your experience?
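If anyone fancies putting numbers on it rather than trusting memory, here's a minimal sketch of how you might track it over time. Everything here is my own assumption, not anything from the original piece: it assumes Python is installed, and it uses a fixed-size disk write/read as a rough proxy for the machine "feeling slow". The file names and workload size are arbitrary choices.

```python
# Minimal sketch: time a fixed disk workload and append the result to a CSV,
# so repeated runs over weeks/months show whether the machine is actually
# slowing down. All names and sizes here are illustrative assumptions.
import os
import time
from datetime import datetime

WORKLOAD_BYTES = 256 * 1024 * 1024  # 256 MB written per run
TEST_FILE = "benchmark.tmp"         # hypothetical scratch file
LOG_FILE = "disk_timings.csv"       # hypothetical log of timings

def timed_write() -> float:
    """Write WORKLOAD_BYTES of random data and return elapsed seconds."""
    chunk = os.urandom(1024 * 1024)  # 1 MB chunk, reused for each write
    start = time.perf_counter()
    with open(TEST_FILE, "wb") as f:
        for _ in range(WORKLOAD_BYTES // len(chunk)):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # force data to disk, not just the page cache
    return time.perf_counter() - start

def timed_read() -> float:
    """Read the file back; often served from OS cache, so treat as a rough signal."""
    start = time.perf_counter()
    with open(TEST_FILE, "rb") as f:
        while f.read(1024 * 1024):
            pass
    return time.perf_counter() - start

if __name__ == "__main__":
    w, r = timed_write(), timed_read()
    os.remove(TEST_FILE)
    with open(LOG_FILE, "a") as log:
        log.write(f"{datetime.now().isoformat()},{w:.3f},{r:.3f}\n")
    print(f"write: {w:.2f}s  read: {r:.2f}s  (appended to {LOG_FILE})")
```

The point isn't a rigorous benchmark, just a trend line: if the same workload takes noticeably longer after six months of updates and uptime, the slowdown is real rather than remembered.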