Cleaning Your Laptop’s Heatsink to Solve Overheating Issues


In the past few months I’ve run into a number of people who have experienced problems with their laptops running hot. I had an HP/Compaq nx6110 laptop sitting in a drawer that I recalled had been exhibiting the same issue. I wanted to loan it to my nephew to use during a visit, so I thought I’d take the opportunity to put in a new battery and check to see if I could find the root cause of the overheating issue.

I had checked a number of sites on the web that talked about overheating issues with laptops, but none of them talked about the potential for the fan/heatsink assembly to collect dust and block the airflow through the heatsink. If you think about it, most laptops operate like a vacuum cleaner in that they suck air up through a hole in the bottom of the case and blow it out the side. The air must flow through a radiator with fins that can collect lint, dust, pet hair and anything else that it might find in ample quantities when the laptop is placed on a carpet, blanket, or your lap. So the odds are pretty good that if you’ve had your laptop for any length of time, dust has accumulated inside and has negatively affected the heatsink’s ability to remove heat from the CPU.

In preparation for removing the fan from the laptop, I downloaded a PDF of the HP nx6110 service manual from the HP website. You can do a Google search for your laptop’s model and the words ‘service manual’ to see if the manufacturer makes the service manual available online. The manual may make it look like removing the fan requires taking all kinds of other parts off the laptop first, but in my case it only involved removing the keyboard, which turned out to be quite easy.

The steps were:

1. Remove the battery and the memory cover (1 Phillips screw).
2. Remove the 2 T8 Torx screws exposed after the memory cover is removed. These screws hold in the keyboard.
3. Slide the four latches on the keyboard downward to release the keyboard.
4. Remove the fan (2 Phillips screws).

Although I disconnected the fan and keyboard cables, I found out later that I really didn’t need to do this to get at the heatsink; you can simply lay the fan and keyboard aside, as long as you’re careful not to move them in a way that puts strain on the cables.

I’ve attached some images of the fan and heatsink below. Click on any image for a higher resolution version of it.

HP nx6110 fan

Removal of the keyboard exposes the fan/heatsink

As you can see from the picture above, the fan is exposed once the keyboard is removed. It doesn’t appear to be too dirty, but the dust is hidden between the fan and the heatsink.

HP nx6110 heatsink

Heatsink with dust

Now that the fan has been removed, you can see that there is a lot of dust that has accumulated on the heatsink. You can use a brush to remove the dust and then blow it out with compressed air.

HP nx6110 heatsink

Heatsink after cleaning

The dust that accumulates on the heatsink blocks the airflow, so the air that does get through is very hot, since only a small portion of the heatsink is actually removing heat from the CPU. The restriction can also increase the velocity of the exiting air, because it’s being forced through a smaller opening. Eventually, if the heatsink isn’t cleaned, it will no longer be able to do its job at all and the computer could shut down due to overheating.

Once the heatsink was cleaned, the air that exited the vent was much cooler, and the fan didn’t need to work as hard to keep the CPU cool (around 45 °C). In doing some research, I found that the BIOS uses various CPU temperature thresholds to determine when to turn the fan on and when to increase its speed. In my case, the BIOS was the original version and it turned the fan on at 40 °C, so I decided to see if any improvements were possible by updating the BIOS. Sure enough, a newer BIOS available from HP’s website raised this limit to 45 °C, which turned out to be much better: that is the temperature where the CPU tends to stabilize at idle, so the fan stays off unless you’re doing something that keeps the CPU busy.

I have a friend with an HP dv6 laptop that had an overheating problem so severe it would reach 90 °C and shut itself down whenever she used it for more than 15 minutes. After searching through forums for many weeks, she finally came across a thread that suggested updating the video driver and the BIOS to fix an overheating issue. That turned out to be the fix in her case; now her computer runs very cool. So make sure you’re running all the latest updates from the manufacturer.

If you clean your heatsink and your overheating problems persist, you may want to check whether any processes are keeping the CPU busy all the time. Any time the CPU utilization goes up, the fan will come on at a higher speed. You can use Windows’ built-in Task Manager to monitor CPU utilization by pressing the CTRL-ALT-DEL keys simultaneously. A better tool than Task Manager for examining CPU utilization is Process Explorer, which is a free download from Microsoft. I also downloaded a free utility called Core Temp to monitor the CPU temperature. I found that I had multiple virus scanners running (you only need one of these) and some other processes I didn’t need, so I removed the software responsible for running them.
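If you’d rather watch CPU utilization from the command line, XP (at least the Professional edition) also includes a counter-logging tool called typeperf. This is just a quick sketch of one way to use it:

typeperf "\Processor(_Total)\% Processor Time" -si 5 -sc 12

That samples the overall CPU utilization every 5 seconds for 12 samples (one minute) and prints the values to the console; press CTRL-C if you want to stop it early.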

I found that although Core Temp was helpful, it sometimes interfered with the BIOS’s own reading of the CPU temperature. A better program for measuring CPU temperature on this model of laptop was SpeedFan.

can of compressed air
If taking your computer apart sounds frightening to you, or if you have a laptop where absolutely everything must be removed to get to the heatsink, another option is to use a can of compressed air, which you can buy at any office store, and blow air backward through the fan’s vent. Feel for which direction the fan blows and determine where it’s exhausting the warm air. Then shut down the computer and aim the straw into the exhaust vent; if you see dust coming out through the intake vents, you’re making progress. When dust collects on the heatsink, it continues to attract more dust, like a log jam in a river. Blowing air backward through the vent can clear this log jam.

How to schedule a recurring backup of a Windows XP folder to a Network share drive


I have worked on computer backup products for more than 20 years and I still find most of them to be complicated to set up and use. They sometimes include so many features and options that people just give up in despair while trying to configure them.

If you have a network share drive and want to make periodic backups to it, you can do it without purchasing any new software and without having to resort to Microsoft’s built-in backup utility, which stores backups in a proprietary file format that requires a special restore program to examine them or copy them back to your PC. The fact that Microsoft’s backup utility is so well hidden speaks volumes about their confidence in customers being able to find and use it successfully.

If all you want is a simple automated backup of a folder, here is an easy 3-step process to set up a recurring backup on Windows XP. I haven’t investigated an equivalent procedure for Windows Vista or Win7, but I assume this technique would work there as well, since those operating systems have similar built-in capabilities.

Step 1: Map the network share to a drive letter.

The first step is to set up the network drive as a drive letter on your PC. To do that, open any folder in Windows Explorer and, under the Tools menu, select “Map Network Drive”. Your network drive will likely have a name such as “server” and a share that you wish to use such as “backup”. In the dialog box that pops up, you can just type in \\server\backup. XP will also let you use the ‘Browse’ feature to find your network drive and share if you don’t happen to know them by name. Once located, assign it an available drive letter, and make sure to check the box so your PC reconnects to it at logon.
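If you prefer the command line, the same mapping can be done with the net use command (shown here with the example \\server\backup share and the X: drive letter used below; adjust both for your setup):

net use x: \\server\backup /persistent:yes

The /persistent:yes switch tells XP to reconnect the drive each time you log on, which is the same as checking the ‘Reconnect at logon’ box in the dialog.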

Step 2: Create a simple xcopy command in a batch file.

Let’s say the first step gave your network drive the letter X:.

We will now create a simple batch file called backup.bat in Notepad containing a single line like this example:

xcopy "c:\Documents and Settings\Lee\My Documents\SWfiles\*.*" x:\Backup\SWfiles /d /e /y

(that should all be on a single line, but it got wrapped here)

This is the DOS xcopy command which is built into XP. It works like this:

xcopy "source files" "destination folder" /options.

I’ve selected the source files as “c:\Documents and Settings\Lee\My Documents\SWfiles\*.*”. That will back up every file and folder under the folder called SWfiles. I had to put quotes around it because some of the folder names have spaces in them. If you have spaces in any of the folder names, you will need these quotes around the source and/or destination name.

The purpose of the options is as follows:

/d – This option only backs up the file if the source file is newer than the destination file that may already exist. This allows the backup to avoid unnecessary writing if the file hasn’t changed since the last backup.

/e – This option makes the xcopy command search in sub-folders so those files get backed up too.

/y – This is to avoid having the job ask for your permission to overwrite existing files.

Let’s save this file as backup.bat in any convenient folder. I put mine in My Documents.

Test it by double clicking on it to confirm it backs up the folder to your network share.
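If you’d like a record of when each backup ran, here is one possible variation of backup.bat that appends a timestamp and xcopy’s output to a log file (the log location here is just an example; put it wherever you like, and remember that each echo and xcopy command belongs on a single line):

@echo off
rem Note when the backup started
echo Backup started %date% %time% >> "c:\Documents and Settings\Lee\My Documents\backup.log"
rem Copy only newer files (/d), include sub-folders (/e), don't prompt to overwrite (/y)
xcopy "c:\Documents and Settings\Lee\My Documents\SWfiles\*.*" x:\Backup\SWfiles /d /e /y >> "c:\Documents and Settings\Lee\My Documents\backup.log"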

Step 3: Set up a scheduled task to run the batch file.

To open Scheduled Tasks, click Start, click All Programs, point to Accessories, point to System Tools, and then click Scheduled Tasks.

Double-click Add Scheduled Task to start the Scheduled Task Wizard, and then click Next in the first dialog box. Just follow the wizard to select the backup.bat file, and don’t worry if you can’t find the proper options to select; the task is very easy to edit once the wizard is finished. Unless you select ‘run only when logged on’, you will have to supply a user name and password to run the task, so if you don’t already have a password set for your account, you will need to set one up.

I set up a My Documents backup to run once a day, but I wanted my SWfiles folder backed up once an hour. If you don’t get it right the first time, you can edit the task by double-clicking on it. Then select the ‘Advanced’ options to set it to run every hour, and set the duration to 24 hours so it runs every hour of the day.
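If you’d rather skip the wizard entirely, XP also includes a schtasks command that can create the same task from the command line. This is just a sketch; it assumes you saved backup.bat to a folder without spaces in its path (C:\backup in this example) so you don’t have to fiddle with nested quotes:

schtasks /create /tn "SWfiles Hourly Backup" /tr c:\backup\backup.bat /sc hourly /ru Lee

schtasks will prompt for the password of the account named with /ru. You can verify the result with schtasks /query, or simply open Scheduled Tasks and edit the new entry there just as if the wizard had created it.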

Here’s an example of what the Scheduled Tasks screen looks like:

The Task Scheduler for Windows XP

The Task Scheduler for Windows XP. Click for a larger image.

Please keep in mind that if you do this on a laptop, the backups will only happen when you’re connected to your network, so if you’re off traveling with your laptop, make sure you use another backup method (perhaps a Picture Keeper customized to back up all of your favorite file types).

Preserving Your Old Photos


As you may have noticed by reading this blog and checking out my other web pages, I am the VP of Engineering at Simplified IT Products, LLC, the company that makes the Picture Keeper. It’s the easiest way I know of to protect your digital pictures. The people at AM Northwest recently ran a segment with organizing expert Krista Colvin that talks about the Picture Keeper along with several other tips for managing your digital photo collection. If you are concerned about the security of your digital photo collection, the video linked above is well worth watching.

To summarize it, Krista recommends getting all of your photos stored in one place. The Picture Keeper is excellent for pulling your photos from multiple computers and putting them in one location. Then make a backup of that collection. But even before you do that, she recommends sorting your old printed photos into 3 categories, A-List, B-List, and C-List, which work like this:

A-List: I love this photo and really want to keep it.

B-List: This is a nice photo that I want to keep, or it may be an A-List photo for someone else. If so, give it to them.

C-List: The photo is flawed (for example, a finger blocking the lens) or is otherwise unimportant; throw these away.

She then talks about scanning all of your photos with a product called Flip-Pal, which I have reviewed here. The Flip-Pal is an excellent way to digitize old photos, particularly if you don’t already have a scanner or if you need the convenience of a scanner you can take to the photos instead of bringing the photos to the scanner. And there’s also the option of taking the photos to a scanning service.

Getting all of your old photos scanned, stored on your computer, and backed up gives you a lot more options for what you can do with them. The most popular reason for having a digital copy, other than preservation, is to share the photos with others who might be interested in them. People love looking at old pictures, and if you really want to brighten someone’s day, send them a copy of a photo that is bound to bring back some fond memories.

The LightSquared Debacle in Layman’s Terms


I’ve been reading about the LightSquared debacle for months. My aviation-related sources have been covering the GPS industry’s objection to LightSquared and how its service would be disastrous for GPS receivers, essentially causing a loss of satellite lock or, at minimum, accuracy problems with serious consequences. The story received a boost in interest when it was found that there was some political chicanery associated with the White House administration pressuring a reversal in the testimony of an Air Force general. It’s hard to ferret out how this problem got this far, with so much money being invested in a technology that LightSquared should have known would hit a wall during its deployment, namely when it produced interference with a critical service like GPS that is adjacent to its bandwidth allocations. So I decided to do some research and summarize it here.

The problem of radio interference isn’t new. The FCC and similar organizations in other countries were created primarily to help prevent interference problems by licensing radio spectrum and settling disputes among the radio spectrum’s users. All radio transmitters generate some amount of radio frequency (RF) energy on adjacent bands. All receivers are influenced by signals that are in adjacent bands because there is no such thing as a perfect filter to ignore nearby signals. So one must ask the question, is this an interference problem on the part of LightSquared, or a susceptibility problem for the GPS manufacturers? And since there is no such thing as a perfect filtering technique, how much can it help to apply filters to GPS receivers? Can the problem be solved with a 5-cent change to GPS receivers as suggested by LightSquared, a solution promptly dismissed as absurd by the GPS industry? I can tell you one thing that does not work, and that is to expect an industry to accept a problem introduced by some third party AFTER its products have already shipped and are in the hands of its customers. Sure, you can ask for a change to future products, assuming the change actually produces the desired result and isn’t too costly, but if an industry is entrenched, and I think that after 2 decades of shipping millions of products, GPS can be categorized as such, you can’t expect them to accept a problem that wasn’t a problem until your service came along.

Nor can you expect all existing customers to ‘upgrade’ their equipment just to solve some newly introduced interference issue. Yet this is apparently what LightSquared was expecting, and I find that attitude arrogant and ridiculous. Anyone who invested money in a technology that depended on such a scheme working should expect to see that investment lost, thanks to the people who dismissed or talked around these issues when they were first raised.

One might tend to lay the blame on LightSquared and its naivete, but I think the FCC is just as culpable. The FCC needed to realize that any service occupying frequencies adjacent to GPS must necessarily be compatible with it. GPS satellites transmit signals from a distance of about 12,500 miles above the earth. Because of this vast distance, the signals are at a very low level once they arrive on earth, about -130 dBm (roughly 10^-16 watts, or 100 x 10^-18 W). The land-based LightSquared 4G transmitters can use as much as 70 dBm (10,000 watts). So you can see that there is a vast difference of roughly 10^20 (200 dB) in signal strength between the two services. The low signal strength is one of the reasons why most Space-to-Earth signals require dishes or other types of high-gain antennas pointed at the satellites to amplify only those signals and simultaneously ignore any signals originating from other directions. But GPS receivers cannot do that. First of all, the constellation of 24 satellites is in constant orbiting motion, and secondly, a GPS receiver needs an antenna that can receive from several satellites at once in order to do its job, so it cannot use a directional antenna. A GPS receiver has none of the amplification and signal isolation benefits provided by a directional antenna. This means that the signals a GPS receiver has to deal with are extremely weak, actually below the noise floor, and must be dug out of that noise using sophisticated signal processing techniques.
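For anyone who wants to check that arithmetic: dBm is decibels relative to one milliwatt, so the conversion to watts is P = 10^((dBm - 30)/10). Plugging in the two figures above:

\[
P_{\mathrm{GPS}} = 10^{(-130-30)/10}\ \mathrm{W} = 10^{-16}\ \mathrm{W}, \qquad
P_{\mathrm{LS}} = 10^{(70-30)/10}\ \mathrm{W} = 10^{4}\ \mathrm{W}, \qquad
\frac{P_{\mathrm{LS}}}{P_{\mathrm{GPS}}} = 10^{20}\ (= 200\ \mathrm{dB})
\]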

GPS and LightSquared satellite allocations

As shown in the graphic above (source), the bands adjacent to the GPS spectrum were intended to be used for similar purposes, that is, to send signals from space to earth or earth to space, and based on what I’ve been reading about LightSquared, this was how they initially intended to use the spectrum. But most broadband solutions that depend on satellites are not very compelling due to the 44,000-mile round trip the signals need to make to the geosynchronous satellites. This trip adds about a half-second of delay, which is too much latency to provide a satisfactory experience compared with terrestrial broadband solutions, especially with modern Internet applications, some of which cannot tolerate that kind of latency. People tend to use satellite broadband only when there are no terrestrial broadband offerings in their area.
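A quick back-of-the-envelope check on that half-second figure, taking the speed of light as roughly 186,000 miles per second:

\[
t_{\mathrm{hop}} = \frac{44{,}000\ \text{mi}}{186{,}000\ \text{mi/s}} \approx 0.24\ \text{s}, \qquad
t_{\mathrm{request+response}} \approx 2 \times 0.24\ \text{s} \approx 0.5\ \text{s}
\]

Every request has to travel up to the satellite and back down, and so does the reply, which is where the roughly half second of delay comes from.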

In 2004, presumably to make its service more financially attractive, LightSquared’s predecessor lobbied for and received authorization from the FCC to deploy thousands of land-based transmitters in the same frequency range as their satellite-to-earth band. I think that this authorization from the FCC is where things went awry. LightSquared, when it was a space-based wireless service that could hypothetically offer 100% coverage over the U.S., had a formidable calling card, namely that it could provide mobile wireless service to previously under-served rural areas. Telling a government bureaucrat that you’re going to provide ‘service to rural and under-served areas’ is tantamount to telling them you’re going to cure world hunger or help the blind to see. Everyone knows there is little or no profit in serving the under-served; it just makes for a good story to soften up government bureaucrats so they’ll grant you favors. Indeed, earlier this year the FCC allowed LightSquared to offer devices with just the terrestrial capability, making them nothing more than another mobile wireless provider, which might be viewed as a clever bait-and-switch maneuver since those devices would no longer have the large size and expense of a hybrid phone. This would allow them to rake in some real profits by taking business away from the incumbents in the lucrative mobile wireless market rather than being just some quirky satellite phone and data service.

So more than any other factor, it was the decision to take its space-based frequency allocation and have the FCC re-authorize it for terrestrial transmitters that made it incompatible with GPS receivers. Even a very low-power transmitter that is in close proximity to a receiver will have signal strength that is many orders of magnitude stronger than one that is located 22,000 miles away. But if you can influence politicians by explaining away the problem, and hoping that the GPS industry looks upon it as an opportunity to force their customers to purchase new receivers that deal with the interference, then it would be a win-win for all parties, except those who have to buy new GPS receivers, namely consumers, who have no lobbyists to protect them. But it appears that all the hand waving about potential technical solutions may not make the GPS interference problem go away. There may be no filtering technique available at any cost that would fix it and still allow a GPS receiver to maintain the accuracy customers rely on. And so, in order for a company and its investors to enrich themselves, they appear to have no qualms about completely destroying another much larger industry that provides an invaluable service to many sectors of the economy. Some might think of this as free market capitalism. I think of it as sociopathic behavior so extreme that it makes me ashamed for the company and the politicians who did the company’s bidding.

I have to wonder whether it’s even possible to provide an economical hybrid mobile wireless device that can be used with geosynchronous satellites and land-based cells. Iridium provides mobile phone service based on satellites, although that service nearly went broke and was only revived when its multi-billion-dollar investment in satellites was picked up for pennies on the dollar. But Iridium is a completely different technology: its satellites are in low earth orbit, just 485 miles above the earth, so they are only about 2% as far from the earth as a geosynchronous satellite, thereby requiring much less power from the mobile device to establish a connection. But the phones and service are very expensive compared with standard mobile phones. The phones tend to be large and bulky and cost upward of $1200, and the service is metered at $1.30/min or more in addition to a $50 monthly fee. Compared with standard mobile phones this would not be a competitive offering, so getting the go-ahead from the FCC to use terrestrial transmitters was a key win for LightSquared, because a phone that communicated with geostationary satellites would be very large, power hungry, and costly.
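To put those distances in perspective (and assuming simple free-space path loss, which is only a rough approximation and not something the post above relies on):

\[
\frac{485\ \text{mi}}{22{,}000\ \text{mi}} \approx 2.2\%, \qquad
\left(\frac{22{,}000}{485}\right)^{2} \approx 2{,}000\ (\approx 33\ \text{dB less path loss for the low-orbit link})
\]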

The amount of power and antenna you’d need to communicate with a geosynchronous satellite would be difficult to implement in a handheld device that fits in one’s pocket, if it could be done at all, unless they intended for it to go through some form of roof-mounted gateway. But then it wouldn’t really be a true hybrid mobile device, as this service had been promoted to be. And you couldn’t use a satellite handset from inside a car or house without a roof-mounted antenna and transceiver, due to blockage of the satellite signals, making the service feel like a throwback to 1980s technology.

Hughes has offered a satellite/terrestrial mobile phone solution called GMR1-3G for some time. The hardware looks like something you’d need if you were deployed to some remote corner of the earth. In fact, LightSquared initially had planned to use that service before switching to something called EGAL (Earth Geostationary Air Link) from Qualcomm. EGAL appears to be some new hypothetical hardware/service that has yet to be deployed. Interestingly, Qualcomm is the company that came up with the 5-cent estimate for the filter that would fix the GPS issue.

It is usually not a good sign when a company gathering large sums from investors is basing its future success on a yet-to-be-proven technology while simultaneously ramrodding its agenda by pressuring a government agency to grant approval and thumbing its nose at its spectrum neighbors. These folks need a wake-up call. Maybe the sound of a few billion dollars of their investment swirling around a drain will provide that wake-up call for LightSquared, its investors, and anyone foolish enough to embark on a similar venture in the future.

UPDATE (2011-11-11): If you would like to know more about the testing that showed the significant interference on GPS receivers, the Coalition to Save Our GPS has a complete list of test reports on their website. The summary is that during these tests, nearly all of the GPS devices tested couldn’t receive a signal when they were within a few miles of the tower, even though the LightSquared transmitter was operating at 10% of the power they would be permitted to use. In addition, LightSquared claimed that if they simply moved their signals to the first 10 MHz of their allocated bandwidth, then 99% of the GPS receivers would not have been affected, even though there is not a single shred of evidence from these tests to support that claim.