Turn Your Computer Off When You’re Not Using It – No Matter What They Tell You

I recently came across this article, in which the question was posed: "Is it better to turn your computer off or leave it on?"

http://www.digitaltrends.com/computing/should-you-turn-off-your-computer-at-night/

I get this question quite frequently myself from people who have heard the old, familiar, and totally incorrect advice: "Don't turn your computer off, because the wear-and-tear of start-up is more damaging than just leaving it on and running."

As I read through the article, I was so discouraged at seeing the same old misinformation, no discussion at all of the real heart of the matter, and, once again, the wrong advice, that I finally sat down and wrote this out.

For those of you who just want to get to the end of it and get on with your life, the answer is in these two titled excerpts:

THE WRONG

"Don't turn your computer off, because the wear-and-tear of start-up is more damaging than just leaving it on and running."

THE RIGHT

"Always turn your computer off when you are not using it, unless it needs to be accessed remotely."

For anyone who would like to know the reasons why, the history of the misinformation that still persists, and why this is still such a badly understood subject, read on.

THE HISTORY OF THE MISINFORMATION

There is actually an origin to this thinking that is based in truth.

Back in the early days of home PCs, the technology was new, and two crucial things led to the formation of this now commonly known and widely circulated bad advice.

OLD HARD DRIVE TECHNOLOGY

The old hard drives had heads that would literally float above the discs they were reading, riding on a cushion of air created by the discs spinning at thousands of revolutions per minute.

When the hard drive was turned off, the discs would slow down and eventually stop, and as that cushion of air dissipated, the heads would settle down onto the discs and stay in contact with them through the final slowing revolutions.

The heads dragging against the discs as they spun down to a stop was a considerable moment of "wear and tear", certainly much more so than regular usage.

As the technology evolved to handle such problems, one of the first solutions implemented was a process that was dependent upon the user. Before the user shut down the computer, they needed to go to the DOS command prompt and enter the hard drive commands for parking the heads.

This command locked the heads in place so that they would not descend onto the discs as the discs spun down.

Like all things back then, this was not just a simple matter of point and click. The user had to terminate whatever program he was running, get to the correct DOS directory for accessing the command, type in the commands with correct syntax or write a batch file to execute the commands for him, and then sit and wait for the heads to park - and remember, old technology was painfully slow. And should the user forget to park the heads altogether and just reach over and shut off the computer, he would cringe with eyes squeezed shut as he imagined the hard drive heads coming to rest with a clunk and dragging along the spinning discs that held his precious data.

There and then, it became a whole lot easier (and depending on how forgetful you were, safer) to just leave the damn thing turned on.

In fact, I can still remember my father, in 1993, sitting at his desk in front of his 286 computer and saying with a bit of resignation at the end of explaining it all, "it's so much easier to just leave it running."

This is the true origin of the whole "leaving your computer on is better than enduring the damage of turning it on and off" myth.

Supporting this myth, and carrying it past the point at which manufacturers were building self-parking hard drives with Teflon-coated heads, was the matter of the power-on surge.

THE EVOLUTION OF SURGE PROTECTION

Just like self-parking hard drive heads, PC surge protection technology had to evolve.

Aside from the hard-core home-entertainment enthusiasts of the day - more commonly referred to back then as audiophiles, since home entertainment was more about a plethora of high-end audio components than today's big-screen TVs - most people were largely ignorant of the idea of power-strip surge protection. TVs were simply unplugged during thunderstorms and plugged back in after the storm passed, and that was as far as it went in the average household.

So, accordingly, many of us lost our very first computers to surges in power. And not just to lightning strikes - we already knew to unplug everything during thunderstorms - but to power surges coming down the line from the power companies, which were enough to blow out our first PCs.

And so we became obsessed with power-strip surge-protection.

But, as we've already established here, these were the bad old days of PC computing; the technology was new, and we were all still trying to figure it out.

And so it was that many of us would still lose our computers to power surges - not from lightning strikes or power-plant irregularities, but because early-generation PC power supplies, upon firing up, would produce an initial surge that took its toll upon the motherboard circuitry, sometimes eventually frying it.

For those not familiar with the term, a PC's "power supply" is a small metal box inside the computer that converts your home's 120-volt AC electricity into the low-voltage DC electricity (typically 3.3, 5, and 12 volts) used by your computer's small electronic components. The port that you plug your computer's power cable into is part of the power supply, and the fan that you see in the back of your computer is actually housed inside it.

Like everything else, the technology evolved and improved. Power supplies became better, surged less and less on power-up as they improved, and more expensive ones now have a bit of their own output surge regulation built in. Motherboards were given large capacitors and voltage regulators at the start of the power chain to absorb the excess surge produced at the moment of being turned on.

Better materials, along with improvements in design from lessons learned, reduced internal power surging upon power-up to the point that today's modern devices suffer negligible to no power surge when starting.

So here we have the two principal elements that have fueled this 30-year paradigm of "It's better to leave your computer turned on all the time than to endure the wear-and-tear that happens when you turn it on." With today's modern hard drives and built-in power-up surge protection, the foundation of this myth is gone.

So what is the evidence for always turning off your computer when you are not using it? And why does this myth of leaving it always on to avoid wear-and-tear still exist?

HERE'S WHY YOU HAVE TO TURN THEM OFF

Computers are built with hundreds to thousands of integrated circuits. Integrated circuits, or "ICs", are most commonly pictured as those tiny black blocks with spider-like metal legs on either side - the things most people call "computer chips". Alongside the ICs are resistors, capacitors, transistors, and the like, all surface-soldered to the circuit boards of your motherboard, your sound card, your video card, your hard drive. All of those little components together make up your computer and all of its devices.

Integrated circuits are manufactured out of materials like silicon, ceramic, copper, and gold - metals and minerals that are superheated and shaped and etched into computer chips, transistors, resistors, capacitors, and gates, just to name the most commonly known.

Depending on the quality of the materials and the quality of the manufacturing, these ICs - these computer chips - will perform for a finite amount of time before dying. That amount of time is called a "lifespan", and it is rated in hours.

And the lifespan of a component is not an arbitrary, random number. It is based not only upon all of the knowledge that has been acquired along the way, but on thorough testing. Every batch of ICs that is created is expected to have a certain lifespan according to the quality of the materials and the care of engineering that went into creating it. But this lifespan is not simply assumed to be achieved: a sample of units is taken from every batch and tested, to ensure and correctly quantify the expected lifespan of those ICs.
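
To make the idea concrete, here is a small, purely illustrative Python sketch of how a batch's rated lifespan might be derived from a tested sample. The failure hours, the function name, and the safety margin are all made-up assumptions for the sake of the example - not any manufacturer's actual procedure.

```python
# Purely illustrative sketch - not any manufacturer's actual test procedure.
# A handful of units from a batch are stress-tested until they fail, and a
# conservative rated lifespan for the whole batch is derived from the results.

from statistics import mean

# Hypothetical hours-to-failure observed for a small sample of ICs
sample_hours_to_failure = [11_200, 10_450, 12_100, 9_800, 11_700, 10_900]

def rated_lifespan(hours_to_failure, safety_margin=0.9):
    """Return a conservative rated lifespan (in hours) for the batch.

    The rating is a fraction of the sample's average time to failure, so
    the published number understates what most units actually achieve.
    """
    return int(mean(hours_to_failure) * safety_margin)

print(f"Rated lifespan for this batch: {rated_lifespan(sample_hours_to_failure):,} hours")
```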

So, let us say, for purposes of easy math, that we have built a computer out of ICs that are rated for lifespans of 10,000 hours. And let us say, for purposes of our example, that we give one of these computers to Marge, and one of these computers to Jeff.

Marge is very clean and organized and frugal, turns off the lights when she's not using them, and tends not to leave things like televisions or radios running in the background. Accordingly, she turns off her computer when she is not using it.

Marge uses her computer a couple of times during the day to check email, and streams an hour's worth of online TV during the evening. So let's say, on average, that Marge uses her computer 2.75 hours a day. With her computer built of ICs that are rated for a lifespan of 10,000 hours, we can assume that Marge's computer will last about 3,636 days, or nearly 10 years.

Jeff follows the "it's worse to turn it on and off than to just leave it on" crowd, and leaves his computer on 24/7. At 24 hours a day, with a computer built of ICs with rated life spans of 10,000 hours, we can assume that Jeff's computer will last about 416 days, or 1 year and 2 months.

Even if Jeff actively uses his computer more than twice as much as Marge - say 6 hours a day - he will only ever put about a quarter of his computer's entire lifetime to actual use, and that lifetime will be over within the second year, while Marge will have hers for the better part of a decade.
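
For anyone who would rather run the numbers than read them, here is a quick Python sketch of the same arithmetic. The 10,000-hour rating and the daily usage figures are just the example values from above; everything else is unit conversion.

```python
# A quick sketch of the Marge-and-Jeff arithmetic above.
# The 10,000-hour rating and the daily usage figures are the example values.

RATED_LIFESPAN_HOURS = 10_000

def days_until_worn_out(hours_powered_on_per_day):
    """Calendar days until the rated hours are used up."""
    return RATED_LIFESPAN_HOURS / hours_powered_on_per_day

marge_days = days_until_worn_out(2.75)  # powered on only while actually in use
jeff_days = days_until_worn_out(24)     # left running around the clock

print(f"Marge: {int(marge_days):,} days (about {marge_days / 365:.1f} years)")
print(f"Jeff:  {int(jeff_days):,} days (about {jeff_days / 365:.1f} years)")

# Even at 6 hours of active use a day, only a quarter of Jeff's
# powered-on lifetime is spent actually being used.
print(f"Share of Jeff's lifetime spent in active use: {6 / 24:.0%}")
```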

By now it's pretty obvious. When you're not using your computer it should be turned off.

SO WHY DOES THE MYTH STILL EXIST?

Why does any myth exist? Why do old wives' tales persist? In some cases, like this one, it is because those old wives' tales were born from a kernel of truth - a kernel of truth that has been washed away by ignorance of the original reasons why, and by new technology that has made those previous truths irrelevant.

But in this case, more than the old truths persisting or having any sort of relevance to today's technology, I feel the reason people don't want to turn their computer off and on is simple laziness. In this "instant-on" world, people really hate the minutes ticking by while they wait for their computer to boot up. "It takes FOREVER," they tell me. And just like my father, echoing words from so long ago, they inevitably say, "it's so much easier to just leave it running."

And anything they've heard that supports that position, they're gonna believe.

===

Matt Gillmore bought his first computer in 1981 at the tender age of 13, had completely Frankensteined it beyond recognition by 15 and built his first clone from components by 17. His 20-year career includes PC, Mainframe and Network technician, IC Purchaser, IT Director, and Interactive Designer, as well as several Director positions relating to IT and IM.
