This post covers what has turned out to be a remarkable odyssey of technical misfortune. One thing going wrong after another. From a hard drive with a seemingly innocent “imminent failure” warning through Ubuntu deleting the stored image and the inability to clone to a disk that is smaller, finally ending up with rebuilding from recovery discs and a user backup. We’ll start at the beginning…
My aunt had a hard drive failure on her HP dv7 laptop. When she booted, it said there was an “imminent failure” with the hard drive. She probably waited too long to do something about it so it ended up worse than it could have been. The issue with the drive turned out to be a “SMART” failure. Unfortunately, the SMART errors aren’t really that smart – they know that the drive is not functioning well, but they only tell you after there’s already been trouble and they don’t tell you what the problem is or how to fix it.
The laptop’s internal BluRay drive failed shortly before the hard drive did. I believe the failure of the optical drive was due to cat hair in the mechanics and possibly second-hand smoke on the optics. My aunt did admit to performing a bit of “percussive maintenance” on the optical drive to get it to work. And that may well have caused a problem with the hard drive that triggered the bad sector issue that begot the SMART failure.
The hard drive is a Western Digital Scorpio Blue 750 Gbyte hard drive. I did some research, and it seems that to fit 750 Gbyte on the two platters that a thin internal drive can hold, WD played some games with the platters, which may have yielded a less robust hard drive. That’s ironic considering that my aunt’s stated purpose in going for the 750 Gbyte drive was to get more longevity out of the laptop. And yet, it was the future-leaning 750 Gbyte hard drive and BluRay drive that were the failure points in her laptop. In other words, if she had gone for less leading-edge stuff when she bought the computer two and a half years ago, she likely wouldn’t have had these issues.
Regardless, the hard drive is failing, she’s got stuff on it she wants to keep, so the smart (sorry about the pun) thing to do is clone the hard drive to a new working one. And before we start cloning, a user backup is the obvious place to start. Unfortunately, the hard drive was in such bad shape already that every individual read was failing. It was taking 15 minutes just to boot up the computer. Clearly bad news for the prospect of recovering the OS install, but I figured we’d see how things looked after the clone. She did manage to complete a user backup to a 16 Gbyte USB drive.
The original plan was to do a disk to disk backup using Clonezilla. I would create an image of the whole drive and all 4 of the partitions originally defined by HP when the computer was manufactured. I have a nifty eSATA cable that connects a laptop hard drive to the eSATA port for just this sort of temporary use. (Interestingly, her laptop has a cavity for holding a second hard drive but unfortunately, it doesn’t include the wiring and connector for the second hard drive, so external would be the way we need to go.) The first problem with my plan was that since her optical drive is broken, we couldn’t boot from a Clonezilla CD, and if we have one hard drive connected through the normal internal connection and the other connected through the eSATA port, that doesn’t leave any space for a CD. Also, the BIOS doesn’t allow for booting from an external optical drive on this HP. I could have gone through some pains to get Clonezilla on a USB drive, but that seemed like too much trouble at the time.
Instead, I figured the first step would be to get an optical drive working in her computer. I discovered that even though my Dell optical drives were 9mm instead of 12mm, they fit far enough into the slot and could connect (after I removed the Dell drive’s push/pull mechanism – hope I can reattach that!). Now with a CD drive working, I could boot from the Clonezilla CD on her computer, right?
No, actually, I couldn’t. It turns out that her computer has some hardware component that the Clonezilla CD doesn’t recognize or like, and it forces a reboot. Which, if you have the BIOS set to boot from CD as the first choice, causes an interesting infinite reboot loop. Entertaining, but not getting us anywhere.
Okay, let’s put the HP aside and switch to my Dell. Fortunately, no such Clonezilla problems with my laptop. So I pulled my main Windows 7 64-bit SSD out of the main slot and put in my triple boot hard drive with partitions for MacOS, Ubuntu, and Windows 7 32-bit. (I haven’t blogged about my triple boot in my E6430 yet because it isn’t working ideally yet, but it works enough for this sort of thing.) I figured I’d be better off storing the image created from the clone process on Ubuntu, and I also liked the idea of keeping my main drive far away from any of these shenanigans – clear on the other side of my desk.
Why not do a disk to disk clone? That was my original plan, as stated above. But I suggested getting a 320 Gbyte hard drive for the new one and had never considered that she might have a drive larger than that in the laptop. Clonezilla can’t do a clone to a smaller drive, which I knew before starting. It was silly of me to not check on the size of the failing hard drive first. Though, even if we did know it was that big initially, I think we still would have tried to replace it with a smaller disk – both because a smaller one is cheaper and because a smaller one would be less likely to fail the same way – at least that’s my guess. And with the space used two years later only at 100 Gbyte, having a spare 650 Gbyte is entirely unnecessary.
I also considered shrinking the size of the C: volume before creating the disk image. Only 100 Gbyte of the 750 Gbyte drive was used so shrinking to 200 Gbyte (which would fit comfortably on the 320 Gbyte drive) would be easy. But since the drive was failing so badly with all of the SMART errors, I figured it would take forever to edit the partition. (In retrospect, that may have been the case from within Windows, but I don’t think that’s correct if I was using Gparted because the bad parts of the disk appear to be in the part of the partition that does have data on it and the blank parts probably wouldn’t have had read errors. But shrinking at this point would have introduced other issues that I’ll get to later.)
Finally, it’s time to create the disk image. Clonezilla wasn’t too hard to figure out. It found the source disk and the disk where I wanted to save the image, and even though I used “expert” mode, I left the options mostly unchanged. I only selected the two options for continuing to create the image when disk read errors are encountered. It took a while to create the image, but the image creation reported successful and I had even selected the option to check the image when it was complete to ensure it could be restored. I chose to save the image to the Ubuntu partition figuring a Linux partition was a smart place to put an image file created by a Linux program, and I chose the “tmp” directory because Clonezilla wants a top level directory and none of the other top level directories made sense for a file that would only hang out there long enough to do the other half of the cloning.
Great, now with the image created, the original plan was to write the image to the new hard drive. Except I knew I’d have a problem with the size of the image being too large for the new drive and I was wondering if I could clear up any of the read errors by writing the image back to the original drive. I figured it was a long shot, but before completely abandoning the disk, I’d give it a go. The new hard drive package hadn’t been opened yet so that could be returned if the old drive could be fixed. I started by booting up in the Ubuntu partition where I stored my image figuring that was a good place to run gparted and reformat the partition on the failing disk. I got into gparted and deleted the partition and I may have created a new one – I’m not sure at this point. It shouldn’t have mattered because when I wrote the image back to the original location, it would create whatever it needed to.
I rebooted into Clonezilla now and went through the prompts to choose the restore option. It didn’t come up. That’s weird. I tried a few more times, still couldn’t find the restore option. I Googled Clonezilla instructions and found where the selection for restore was supposed to be, but restore was not an option. A little more Googling said that it wouldn’t show the restore when it couldn’t find the image. That’s weird – I know the image is right…. Hey!!! Where the hell did the image go!!!? A little more Googling and I found the answer: Ubuntu has an automatic clean out of the /tmp directory on every reboot! (I think it is on startup, not on shutdown.) So just by me starting up Ubuntu to do the partition work, the image file was already wiped out. Aiigghh!!! Whose dumb idea is this? For servers that stay running for weeks or months at a time, every reboot is too long to wait. And when you are trying to fix something and rebooting every few minutes, every reboot is way too frequent. At least there’s an easy fix once you know about it. But that doesn’t help me now!!!
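For the record, the easy fix I mentioned: on Ubuntu releases of this era, the /tmp cleanup is controlled by the TMPTIME variable in /etc/default/rcS – 0 means wipe on every boot, -1 means never wipe. A minimal sketch of the change, shown here against a scratch copy of the file so it is safe to try anywhere:

```shell
# work on a scratch copy so this is safe to run on any machine
cp /etc/default/rcS /tmp/rcS.demo 2>/dev/null || printf 'TMPTIME=0\n' > /tmp/rcS.demo
grep -q '^TMPTIME=' /tmp/rcS.demo || printf 'TMPTIME=0\n' >> /tmp/rcS.demo
# TMPTIME=-1 tells the boot scripts to never age-out files in /tmp
sed -i 's/^TMPTIME=.*/TMPTIME=-1/' /tmp/rcS.demo
grep '^TMPTIME' /tmp/rcS.demo
```

On the real system you would edit /etc/default/rcS itself (as root); of course, a better habit is simply not to stage anything you care about in /tmp.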
At this point, I had no image and no partition. All the data was deleted, except for whatever made it into the Windows Backup (which definitely would not include the OS). But the question is whether I could undo either the new partition or undo the Ubuntu delete. First up, I figured I’d try to recover the image file that Ubuntu deleted on me.
Lots of Googling led me to PhotoRec. The name belies its current function: to recover deleted files of many types. It uses pattern matching to discover file data and then comes up with a temp name for them and an extension based on what type it thinks the data is. It’s pretty cool software and amazing that it is freeware. I worked at this much longer than I should have. In the end, there are two problems: 1.) By the time Ubuntu had removed the file from the tmp folder on startup, it had already written new files to disk, including in the tmp folder. That means it’s very likely that some of the disk space used by the very large image data would have been reused by the new file data. 2.) When PhotoRec recovers things, it recovers individual files with no logic to the names or directory structure. Therefore, since I don’t know how to identify the right files amongst those recovered, I’d have little luck getting anything useful out of a pile of recovered files. I did find a number of “gz” files but I couldn’t figure out how to use them in an image. (More on this below.)
PhotoRec also comes with something called TestDisk. Again, the name belies the current function: to find and fix partition issues. One of the things it can supposedly do is undelete a partition. Again, I worked at this much longer than I should have. Theoretically this is exactly what I would have needed if I had simply deleted the partition. But there were a few problems: 1.) TestDisk couldn’t correctly identify the disk geometry. I think this is largely because of the SMART errors the disk was having. And since I didn’t know what the disk geometry should have been, I trusted TestDisk’s initial guess of 1/1/512, which turned out not to be correct. I had tried to fix the boot sector (Rebuild BS) in each of the partitions while I had the wrong geometry and I still got disk errors. 2.) I did eventually figure out that the right geometry was 255/63/512 (although for a while there, it looked like 255/63/4096 was correct since it is an “advanced format” disk). But by this point, TestDisk was reporting the correct 4 partitions and claiming that the 2nd partition, the C: drive, had only two things on it: directories named “$RECYCLE.BIN” and “System Volume Information”. The real 100 Gbyte of used data was not associated with this partition and, as I eventually came to discover, never would be again. 3.) I’m not sure that I had “deleted” the partition. I may have formatted it. (I honestly don’t remember what happened to the original C partition since, as I wrote above, it didn’t really matter what I did since the clone would replace what was there.) TestDisk, it seems, isn’t good at unformatting a drive. There are other programs for that. I did try Disk Genius for an unformat, but I think by this point TestDisk’s rebuild of the boot sector may have made it so the unformat wouldn’t work. 4.) The 750 Gbyte drive is so large that any operation in TestDisk would end up taking at least a day or overnight.
And TestDisk is a funny program – after the “Analyse” is complete and I would then want to check on something else in the program, the result of the “Analyse” was cleared and lost such that if I wanted to go back to it, I’d need to wait the day again. I suppose that’s the nature of what TestDisk is doing but I spent the better part of a week waiting for these checks to run. (Fortunately, they could run on their own and I could just return later to find out what had happened but it still took some time to queue things up and figure out what to try next.)
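A sanity check I wish I had done sooner: the geometry numbers here are heads, sectors per track, and bytes per sector, and cylinders × heads × sectors × bytes must land near the drive’s advertised capacity. (Modern drives only emulate CHS, so this is rough, and the byte count below is the standard decimal size of a nominal 750 Gbyte drive, not a figure read from this particular disk.)

```shell
# geometry guess: 255 heads, 63 sectors/track, 512 bytes/sector
heads=255; spt=63; bps=512
cyl_bytes=$((heads * spt * bps))              # bytes per cylinder = 8225280
disk_bytes=750156374016                       # nominal bytes in a "750 GB" drive
echo "$((disk_bytes / cyl_bytes)) cylinders"  # prints "91201 cylinders"
```

With the initial 1/1/512 guess, the same arithmetic implies nearly 1.5 billion one-sector “cylinders” – exactly the kind of absurd number that flags a bad guess.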
At this point, I was getting desperate and started looking around for options other than PhotoRec and TestDisk. I stumbled upon a post that talked about “Hiren’s BootCD”. (The download is from a separate site.) Apparently somebody named Hiren put together a bootable CD that contains all kinds of useful utilities. It saves you from trying a bunch of things individually. It includes Clonezilla and TestDisk/PhotoRec as well as a Linux environment for running Gparted called Parted Magic. It also includes Disk Genius, which I mentioned in the previous paragraph. I have now thrown away the CD I created for Clonezilla since I can boot from this CD in the HP laptop with no trouble. From now on, whenever I need to do any partitioning or partition recovery or file recovery or disk cloning, I am definitely going to start with Hiren’s BootCD. While Hiren’s BootCD would have made the testing and debugging described so far easier, I don’t think I would have ended up anywhere different and I didn’t find any tools on it that would get the data back.
With things looking bleak for the original hard drive, I switched gears and started setting up the new hard drive. I wanted to clone the data from the old drive to the new one, even though the main C: drive data would end up missing. My plan was to get the new hard drive set up the same as the old one was, with the same “SYSTEM”, “RECOVERY”, and “HP_TOOLS” partitions that were on the original drive, and worry about filling in the empty “C” partition later. If my plan went well, I’d even be able to do a Recovery using the Recovery partition. However, trying to proceed with cloning the old disk to the new would still hit the same problem as I had originally: Clonezilla can’t clone to a smaller target than the source. So it was now finally time to resize the partition on the original drive. (I’d been holding off doing this because I wanted to see if I could extract the original partition from the disk, but now that I’d ruled out the ability to do that, I was able to proceed with things that are more destructive to that partition.) Using my new best friend Hiren’s BootCD, I booted into Parted Magic and ran Gparted. Before I even did anything, I had a problem, though. The SMART failure was preventing modifying the C: partition because it knew the disk was damaged there. Well, since the partition is already known to be a lost cause, I just deleted the partition. Then, I moved the 3rd and 4th partitions up to be well within a 320 Gbyte drive’s limits. And then I recreated the 2nd partition.
Now with the 750 Gbyte disk set up the way I want the 320 Gbyte disk to be, I created a clone image of the old disk and tried restoring to the new disk. I got a failure from Clonezilla right away that the disks didn’t match. That’s especially disappointing since it seems to indicate that even if I had a disk larger than the original 750, the clone wouldn’t have worked unless it was exactly the same brand and model of disk. Fortunately, I found a workaround for this at linuxquestions.org. You need to edit the contents of the image metadata files so they appear to have been created from the new disk instead of the old. I didn’t do things exactly the way listed in the workaround, but my technique was similar. Here’s what I did:
- boot up in Parted Magic from Hiren’s BootCD (I chose “live” because it starts faster)
- start Applications -> System Tools -> Mount
- wait for Mount window to open and initialize
- in Mount, click on the appropriate partition so that the partition containing the image is now mounted
- open a terminal window and cd to the directory that contains the disk image you want to restore
- back up these two files (or ones named similarly):
cp -p sdb-chs.sf sdb-chs.sf.orig
cp -p sdb-pt.parted sdb-pt.parted.orig
sudo fdisk -l /dev/sdb
- note the geometry in the output
- edit the file sdb-chs.sf and modify the geometry in the file to match the output of the previous command; save and exit
sudo parted /dev/sda
- now at the (parted) prompt, print the partition table
- use your mouse to select and copy the first four lines of output from the previous step
- edit the file sdb-pt.parted and replace the first four lines in the file with the copied 4 lines from the previous step; save and exit
- either close the terminal window or cd to the root (to move out of the partition that was mounted)
- switch back to the Mount app and click unmount on the partition you mounted earlier
- proceed with Clonezilla as previously planned (Clonezilla is also on Parted Magic – the “Disk Cloning” icon on the desktop)
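The fussiest step above is swapping the first four lines of sdb-pt.parted for the four lines parted printed for the new disk. If you’d rather not juggle that in an editor, tail can do it mechanically – shown here on dummy stand-in files, since the real file names and contents depend on your image:

```shell
# dummy stand-ins for sdb-pt.parted and the copied parted output
printf 'old1\nold2\nold3\nold4\nkeep: partition rows\n' > /tmp/sdb-pt.parted
printf 'new1\nnew2\nnew3\nnew4\n' > /tmp/new-header.txt
cp -p /tmp/sdb-pt.parted /tmp/sdb-pt.parted.orig   # back up first, as above
# new header lines, then everything after line 4 of the original
{ cat /tmp/new-header.txt; tail -n +5 /tmp/sdb-pt.parted.orig; } > /tmp/sdb-pt.parted
cat /tmp/sdb-pt.parted
```

The result keeps the partition rows intact under the replacement header, which is all the workaround needs.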
My clone completed successfully. Finally, something finished the way I wanted!! Except when I booted from the new hard drive, I discovered that nothing was usable. I got the dreaded “0xc000000e Boot Error”. I know from past experience that this means the MBR is screwed up. I suppose that makes sense since the MBR on the new disk is different than it was on the old one. So I used Hiren’s BootCD to boot into Mini XP. First, I ran TestDisk, and did a quick Analyse. It found all the correct partitions so on the next screen, I switched the status to what I remembered from the original disk (*, P, P, P) and then wrote that to the disk. Then I opened a command prompt and ran diskpart. In diskpart, I did the following:
select disk 0
select partition 3
Then I rebooted. The Recovery manager came right up from the Recovery partition. Yay!
Alright, so time to try the recovery to rebuild the C: drive. I followed the instructions in the HP recovery software and confirmed I wanted to rebuild the disk. And then I got an error saying that I couldn’t use the Recovery software on the same disk that contained Recovery. But… What… Why would HP include it there if you could never run it from there? More Googling gave me the answer: you can do a Recovery from the same disk if the disk hasn’t been altered. In other words, that’s a fast way to put the disk back the way it was when you bought the computer but it won’t do for building a new disk since that would be a new install and that recovery partition would need to be built special for that disk. So there are a few different levels of recovery and the recovery you need for a different disk is not the recovery that is installed on the original disk. You can only get that recovery by using the original HP recovery CDs. Which of course HP (and Dell and every one else) no longer ships with consumer laptops. So they had to be ordered. Fortunately, I had anticipated needing the discs so my aunt had ordered them at the start of the process. But I didn’t have them yet.
HP sent the disc kit via standard mail with no tracking information. The HP web site said the discs were shipped and delivered on the same day, December 2, while also showing an expected delivery date of December 10th. In other words, HP marked the shipment as arrived at exactly the moment it was shipped, since there was no tracking information to use. Additionally, the link for tracking was still present with the word “NONE”, so I figured I’d click on it and see why there was no tracking information. Instead, I got a new web page warning me that I was leaving the HP site when I clicked on the link below – where there was none. Evidently the link on the previous page was merely a link to the warning page, with the destination passed as a parameter; the parameter was empty, so the warning page still came up but without a link. Nice web site programming, HP! So not only do I not have any tracking information, but I also found bugs in HP’s site.
In the meantime, I wondered if the recovery would work at all. So, I spent a little more time testing it out. I went back to the original drive, such as it is. I used Parted Magic to put the drive partitions back the way they were originally. Interestingly, I couldn’t move the 4th partition, the FAT32 “HP_TOOLS”, all the way to the end. It would only let me get close to the end. If I tried to move it all the way, it would throw an error saying “we are working on it”. So I discovered that I could move it close to the end graphically and save it. Then do it again and get it closer and save it. The third time, the scale was now so small, I could move it to 1 Mbyte from the end. Then I moved the 3rd partition adjacent to it and grew the second one to fill up the blank space. That put the drive almost back to where it was originally. And I hoped that the “almost” would be good enough for the Recovery tool to work. I rebooted and ran Recovery (F11 on boot up) and it did work. Recovery was able to rebuild the original C: partition and then I saw what my aunt probably saw when she first bought the computer. And most importantly, I was able to test restoring files from her backup. (I had used Clonezilla to clone the USB drive that held the backup.) The files came back just fine, so it seems like we will be in good shape when we get the new drive built cleanly from the recovery discs. The SMART error was still causing “imminent failure” warnings during boot, and in the recovered Windows, it would periodically throw an error about the disk too. So it’s still clear that the drive is toast, but testing out the process was helpful.
While waiting for the recovery discs, I had one more thing to try. I still wanted to try forcing the partition restore from the recovered original image data. One of the recovered gz files was 2.2 Gbyte, which I thought was unlikely to contain the full 100 Gbyte of data that was lost, regardless of how good the compression algorithm Clonezilla used is, but it was worth a shot to see what we got. So I went back to that image I had created of the 320 Gbyte drive and replaced the partition image file with the restored file. (The restored file had a recovery-assigned name, but I renamed it to match the file that was representing the lost partition.) I tried restoring and got an error saying that the image file was corrupt. Not too big a surprise since I had always assumed that the restored file might not be complete due to the disk operations that took place after Ubuntu deleted the file. But I tried doing a gunzip on the recovered file anyway, to see what I got. The result should have been “data” according to the “file” command, but instead, it said it was “ASCII text”. So I opened it to look at it and found a small Linux script. Then I looked at the file size and found that it was only a few bytes. I found a post about PhotoRec that says when it recovers gz files, it doesn’t know where the file ends, so it just keeps going until it finds other data it recognizes, meaning that there will frequently be extra stuff added to the end, but the original file should be extractable. And in my case, that’s exactly what happened. Except the file was 0.1% of the size of the data incorporated in the gz and the remaining 99.9% was the junk. I took another look at the recovered files from Ubuntu and the next largest file was only a few hundred Mbytes – definitely not large enough to be of any use in a backup. That means that the large files recovered were not actually that large and were files from my own Ubuntu disk, not the temp stuff from /tmp.
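The trailing-junk behavior is easy to demonstrate: gzip records where the compressed member ends, so gunzip/zcat extracts the original data and merely complains about whatever follows it. A small simulation of the kind of file PhotoRec hands back (dummy data, obviously, not my actual recovered image):

```shell
# build a "recovered" gz: a real member followed by over-captured junk
printf 'hello recovered data\n' | gzip -c > /tmp/recovered.gz
head -c 4096 /dev/zero >> /tmp/recovered.gz   # the junk PhotoRec tacks on
# zcat still extracts the member; it may warn and exit nonzero over the junk
zcat /tmp/recovered.gz 2>/dev/null || true
```

So the extraction works even on an over-captured file; the problem in my case was simply that the member itself was a few bytes of script, not the lost partition data.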
So that’s it: the end of the line for trying to recover the image file. My original conclusion – that nothing could be recovered because the drive had been in use afterward – proved correct.
Within hours of me realizing that the recovered image file I had was junk, the recovery disc kit arrived with the 3 discs for recovery plus an additional one for drivers. I put the first disc in and booted. It came up with the same screen about doing a recovery, but this time, I was able to proceed. Hooray! Finally, we’re getting somewhere. I went away and checked back in later and it was waiting for disc 2, so I swapped discs, putting in disc 2, and walked away. I came back later and found the computer off. Completely off. Weird. I turned it on and when it booted up, it came up with a note saying that it had finished the recovery and would now reboot to complete installation. So I clicked okay to reboot and it came up with an error saying “Bootmgr is missing”. C’mon. Really? This process is supposed to be solid. What happened? Well, I have no idea. (Could it be that it gave up waiting for disc 3 while I was away, so it shut down? It was plugged in the whole time.) I just redid the whole process again starting with disc 1, and this time, it did ask for disc 3 and it even asked about whether I had an additional software disc. The drivers disc didn’t match by name, but I figured that might be what it was asking about, so I put that in. It appeared to ignore using the disc because it wasn’t spinning that 4th disc, but it did do something for a while with that 4th disc in the drive. And at the end, it said the process was complete and there would be a few reboots. There were, and it took maybe a couple of hours to run through the whole process of automated reboots and setup routines. In the end, I got the same welcome routine that I had when the Recovery worked on the old disk (described above). I went through the prompts to complete the setup.
Now, all I needed to do was restore the user backup data. When I had tested this previously, I noticed that I ended up with two folders in Users – one named “user” and the other named “user1”. (Actually, “user” and “user1” are anonymized for this post and are really the name of the owner of the computer.) In checking with my aunt, it seems that her original account “user” was abandoned a while ago when something went wrong with the computer and the only thing Microsoft could figure out to do was abandon the old account and copy it to a new user account. When I had done the earlier restore test, I didn’t know that, and her files came through no problem. This time, when HP setup asked me for a username, I knew to enter “user1” so that the restore would put the stuff she needed in the right directory. Except the restore didn’t want to restore her AppData directory. All of the files that should have been written to AppData were “skipped”. Ugh. So, the stuff that makes a computer feel like your own computer and not something sterile and new is the stuff that won’t come back automatically through the restore. How stupid, Microsoft. The workaround I figured out on my own was to create a new user with administrative rights, log in as that user, run restore, choose only the AppData directory that failed to restore before, and instead of selecting the original folder, choose a folder in the temp user’s account. When the restore finished (with no skipped files), I copied the files from the temp user account into the right user’s AppData directory. I rebooted and logged in as “user1” and saw things appear as they last did before the backup. Finally.
But that doesn’t fully complete the process. The original OS of Windows 7 Home Premium had been upgraded to Windows 7 Professional. During the file restore, I discovered that the Win 7 Pro upgrade was actually stored in the user1’s Downloads directory. That’s easy, I figured, so I burned the directory to a DL DVD for posterity and then to prove that I could use the DVD, I tried to run the DVD in the drive that had burned it. (Note that this is still my Dell’s DVD drive with the retainer clip removed so it would fit in the HP.) Surprisingly, the disc couldn’t be read. In the drive that just finished burning it. So I hooked up an external DVD drive (again a Dell internal drive in an external case enclosure) and that one read the disc. Okay, so I clicked through the prompts and got to the point where it asked me if I wanted to upgrade the existing install or install new. I chose “upgrade existing” and then it told me that I shouldn’t be using this – I should instead use the Windows Anytime Upgrade program. Ha, ha, Microsoft, you got me again. Good one.
My aunt found the product key for the upgraded Windows 7 and I punched that into Windows Anytime Upgrade and off it went. It seems like there wasn’t much, if any, downloading – everything was there ready to be flipped on based on the key. While that’s probably a smart way to do it, it would have been nice if the existing download hadn’t wasted my time loading only to give me a “thpp” at the end.
With Win 7 Pro installed, I ran through the Windows Update (over and over) until I got through everything including SP1. I also redid the restore from backup, just in case the update touched any of the files from the backup. (Which means I needed to create that tempuser again, do the restore there, and then copy to user1’s AppData again.) And finally, I installed Office, then did all the updates associated with Office. And finally, I wrapped up the system build by installing Avast (have to make sure to do custom install to not get the junk stuff you didn’t ask for).
To help make sure she has a good backup going forward, I set up Windows Backup to run weekly and back up to a USB drive. The only USB drives she has now are “Cruzer Orbit” drives, which do not seem durable. And they are only 16 Gbyte. I’d like to see her upgrade her backup USB drive to something more substantial and with greater capacity. I also set up SyncToy to sync the My Documents folder with another of the Cruzer Orbit drives. That will make it so she can keep the laptop My Documents as the master, have the backups drive from there, and have portable access to My Documents. And I added a task to Task Scheduler to automate syncing every night.
Here’s a summary of the problems I encountered during this process in the order in which they occurred or I encountered them:
- CD/DVD/BD drive failure
- SMART disk failure
- can’t boot from CD because of optical drive failure
- can’t boot HP dv7 from external CD because of BIOS
- can’t boot HP dv7 from Clonezilla CD because of hardware driver problem that causes reboot
- can’t clone to a smaller drive in Clonezilla
- Ubuntu clears tmp directory on every boot wiping out image file
- PhotoRec couldn’t recover image file because Ubuntu drive already had overwritten file data
- TestDisk couldn’t get the disk info correct on the failing drive
- TestDisk couldn’t undo whatever it is that I had done
- Disk Genius couldn’t use what was left of the drive after TestDisk attempts
- TestDisk and PhotoRec are extremely slow on a failing 750 Gbyte drive
- PhotoRec doesn’t recover gz files well
- Gparted can’t modify partition that is damaged due to SMART failure
- Gparted can’t move FAT32 partition to the end of a disk
- Clonezilla won’t restore to disk that is different (not just smaller) without modifying metadata files
- Recovery partition on HP original install only works with original disk
- After recovery, get Boot error because MBR broken
- HP doesn’t include recovery disc kit with PCs when purchased; have to be ordered for a fee
- HP’s shipping doesn’t include a tracking number
- HP marks the product delivered as soon as it is shipped
- HP’s web site has a bug with showing tracking as “NONE” with a link to a broken page
- HP’s Recovery disc process can break while not attended
- Microsoft Backup’s Restore process cannot restore to AppData
- Upgrading Win7 Home Premium to Win7 Pro cannot be done with a file that is already downloaded
One other note that’s worth holding on to but that I didn’t end up needing. If you need to create recovery discs but you have already done it once before, there’s a trick to it. See the page at thinkdigit for the procedure.
Also, it’s worth mentioning that the WD “mainstream” hard drive that my aunt got as a replacement at her local Best Buy turned out to have a “Blue” inside the package. I had previously thought that “mainstream” was below the green/blue/black/red series of drives. But it turns out that “mainstream” is just a “Blue” drive in a box ready for retail sale instead of “OEM”-type raw packaging.