
Is your PC slowing down?

Written by Chris Clay, Contributor

Take anybody who uses a computer, and one of the most common topics of discussion is the fact that the PC keeps slowing down over time. But take a look at the root cause of the slowdown and you may make some interesting discoveries. I've heard everything from "my computer is getting tired" to "the hardware is wearing out". There is really little truth to those statements. PC hardware does not get tired. It can break, but replacing the failed component has it up and running just like normal. Sure, fans, hard drives, and other moving parts can wear out, but again, they can be replaced. That said, there is truth to the fact that software is constantly evolving, and newer software running on older hardware will almost certainly be slower than software released around the same time as the hardware, because newer software by nature requires more system resources.

The cause of general PC slowdown is the software running on the PC itself. And now that we have different operating systems available, you will find different results with each. Windows is notorious for slowing down over time, and there are many reasons for this. Take a look at the solutions offered on the Internet and you will soon see why it is so prone to it. The most common steps for fixing Windows slowdown issues are: cleaning out temp files and folders, removing old log files that are no longer needed, defragmenting the hard drive, and uninstalling or stopping applications that are no longer needed. And these steps are just the beginning of a long checklist that systems administrators usually work through.
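To make the first item on that checklist concrete, here is a minimal Python sketch of an age-based temp-file cleanup. It is an illustration only, not a tool that Windows ships with; the 30-day cutoff and the dry-run default are my own assumptions, so review the output before letting it delete anything.

```python
# Minimal sketch: report (or delete) files in the temp directory older than a
# cutoff. The 30-day threshold and dry_run default are illustrative assumptions.
import os
import tempfile
import time

def clean_old_temp_files(temp_dir=None, max_age_days=30, dry_run=True):
    """Walk temp_dir and remove files whose mtime is older than max_age_days."""
    temp_dir = temp_dir or tempfile.gettempdir()
    cutoff = time.time() - max_age_days * 86400
    for root, _dirs, files in os.walk(temp_dir):
        for name in files:
            path = os.path.join(root, name)
            try:
                if os.path.getmtime(path) < cutoff:
                    print(("would delete: " if dry_run else "deleting: ") + path)
                    if not dry_run:
                        os.remove(path)
            except OSError:
                pass  # files may vanish or be locked mid-scan; skip them

if __name__ == "__main__":
    clean_old_temp_files()  # dry run by default; pass dry_run=False to delete
```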

What you may not have heard is that other operating systems like Linux are not prone to these slowdown issues. I've seen installations of Linux that are over 7-8 years old, and they run as well as the day they were installed. How can this be? First, Linux is extremely good at keeping itself cleaned up. There is a process called "logrotate", driven by a cron job, that periodically rotates the log files on the system; by default it keeps about four weekly rotations per log and purges anything older, which keeps the active log files small and the services that write to them fast. There is another cron job, "tmpwatch", that periodically cleans out temporary files in /tmp (the equivalent of the "temp" folders in Windows), purging them based on age. This is simply brilliant, and best of all it is set up by default on most mainstream Linux distributions and happens behind the scenes without the user having to configure anything.
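For readers who have never looked under the hood, the rotation behaviour described above can be sketched in a few lines of Python. The real logrotate is configured through files under /etc/logrotate.d and run from cron; the sketch below only mirrors the core idea of shifting numbered, compressed backups and dropping the oldest. The log path and the keep-four count are illustrative assumptions.

```python
# Sketch of log rotation: compress the live log into .1.gz, shift older
# rotations up by one, and drop the oldest so only `keep` backups remain.
import gzip
import os
import shutil

def rotate(log_path, keep=4):
    """Rotate log_path into log_path.1.gz .. log_path.<keep>.gz."""
    if not os.path.exists(log_path):
        return
    for i in range(keep, 0, -1):
        src = f"{log_path}.{i}.gz"
        if not os.path.exists(src):
            continue
        if i == keep:
            os.remove(src)                      # oldest backup falls off the end
        else:
            os.rename(src, f"{log_path}.{i + 1}.gz")
    with open(log_path, "rb") as plain, gzip.open(f"{log_path}.1.gz", "wb") as gz:
        shutil.copyfileobj(plain, gz)           # compress the current log
    open(log_path, "w").close()                 # start a fresh, empty log

if __name__ == "__main__":
    rotate("/var/log/myapp.log")                # hypothetical log file
```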

Then there is defragmenting the hard drive. It's hard to believe that even Windows 7, the latest operating system from Microsoft, is still prone to this problem. The NTFS filesystem (used by Windows NT and up) has other quirks, but it slowly becomes fragmented and requires defragmenting from time to time. This process can take a long time depending on your hardware, and it really has to happen when you are not using your computer. It is more of a band-aid for the problem, whereas Linux filesystems largely avoid the problem up front by allocating blocks in a way that keeps fragmentation to a minimum. This has been the case since the ext3 filesystem was first used for Linux, and is still the case today with the ext4 filesystem. To quote the Linux System Administrator Guide: "Modern Linux filesystem(s) keep fragmentation at a minimum by keeping all blocks in a file close together, even if they can't be stored in consecutive sectors. Some filesystems, like ext3, effectively allocate the free block that is nearest to other blocks in a file. Therefore it is not necessary to worry about fragmentation in a Linux system." Again, this is brilliant.
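One way to see this on your own system is to count extents per file with the filefrag utility from e2fsprogs; a file stored in one or a few extents is effectively unfragmented. The Python wrapper below is a rough sketch under that assumption, and the /var/log starting directory and top-10 cutoff are arbitrary choices of mine.

```python
# Sketch: report the files with the most extents under a directory, using the
# filefrag utility from e2fsprogs (assumed installed). Many extents suggest
# fragmentation; one or two extents per file is the common case on ext3/ext4.
import os
import re
import subprocess

def extent_count(path):
    """Return the extent count filefrag reports for path, or None on failure."""
    try:
        out = subprocess.run(["filefrag", path], capture_output=True, text=True)
    except FileNotFoundError:
        return None                     # filefrag not installed
    match = re.search(r"(\d+) extents? found", out.stdout)
    return int(match.group(1)) if match else None

def most_fragmented(directory, top=10):
    counts = []
    for root, _dirs, files in os.walk(directory):
        for name in files:
            path = os.path.join(root, name)
            n = extent_count(path)
            if n is not None:
                counts.append((n, path))
    return sorted(counts, reverse=True)[:top]

if __name__ == "__main__":
    for extents, path in most_fragmented("/var/log"):
        print(f"{extents:4d} extents  {path}")
```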

Uninstalling software that is no longer needed is, again, mainly a Windows concern. Usually this step is needed because of extra bloatware installed on the PC, either by the PC manufacturer or as trialware that is no longer wanted. The base installation of Linux does include some services that can be disabled, which I would recommend, as some just don't need to be running. But Linux does not have the bloatware problem, because it is not tied to the proprietary-software marketing deals that lead PC manufacturers to preload that software in the first place.
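If you want to see which services your own installation starts at boot before deciding what to disable, the short sketch below lists them. It assumes a distribution that uses systemd (on the older SysV-style systems of this article's era, chkconfig or the rc scripts played the same role), and the unit name in the closing comment is purely hypothetical.

```python
# Sketch: list service units enabled to start at boot on a systemd-based
# distribution, so you can decide which ones are worth disabling.
import subprocess

def enabled_services():
    """Return the names of systemd service units enabled to start at boot."""
    out = subprocess.run(
        ["systemctl", "list-unit-files", "--type=service",
         "--state=enabled", "--no-legend", "--no-pager"],
        capture_output=True, text=True, check=True,
    )
    return [line.split()[0] for line in out.stdout.splitlines() if line.strip()]

if __name__ == "__main__":
    for unit in enabled_services():
        print(unit)
    # To stop and disable a service you decide is unnecessary (hypothetical name):
    #   systemctl disable --now example.service
```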

So in the end, we can see that some operating systems (Windows) make little attempt to keep themselves efficient and therefore tend to bog down over time, while others (Linux) automatically keep themselves tidy along the way and avoid the issues altogether. These problems are compounded on server systems, where performance can affect far more than just one user.
