Disabling Data Deduplication on Windows Server 2012 R2


Sometimes you have to give up a technology because it doesn’t suit all scenarios….

Today, I had to remove Windows Server 2012 R2 Data Deduplication, which is a very clever technology that reduces disk space use. The downside in my scenario was that it also seemed to cause a massive slowdown in Windows Server Backup, and whilst it gave me great disk space savings (60%), it meant having to shut down Hyper-V instances. I sure hope that Microsoft, in Windows 10 Server, gives us better use of deduplication on live Hyper-V guest systems, and not just VDI instances.

Anyway… the point of this post is more for my own future reference, as well as maybe helping someone else out there who is removing Data Deduplication from Server 2012 R2.

 

Disabling Data Deduplication doesn’t “undupe” the data that has already been “duped”.

If you disable Data Deduplication via the GUI or PowerShell, it does not actually undo the work it has already done. Worse, once it is disabled, you also cannot run a garbage collection job (which cleans up the chunk data created by the deduplication engine).

So, it’s important that you leave Data Deduplication enabled, but EXCLUDE the entire drive first. Then run the following two commands (which will take ages to complete, depending on the amount of data you have).
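For the exclusion itself, something along these lines should work from PowerShell. Treat it as a rough sketch: E: is just a placeholder for your own volume letter, and excluding the root path is one way of covering the entire drive.

# Keep deduplication enabled on the volume, but exclude the root folder
# so nothing new gets optimised while we undo the existing work.
# E: is a placeholder for your own volume letter.
Set-DedupVolume -Volume "E:" -ExcludeFolder "E:\"

# Confirm the exclusion took effect.
Get-DedupVolume -Volume "E:" | Format-List Volume, Enabled, ExcludeFolder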

 

The unoptimization command:

Start-DedupJob -Volume <VolumeLetter> -Type Unoptimization

 

Check the status:

Get-DedupJob
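If you would rather not keep re-typing that, a rough polling loop along these lines keeps an eye on the running jobs until they are gone (the five-minute interval is an arbitrary choice):

# Poll the running dedup jobs every five minutes until none are left.
# -ErrorAction SilentlyContinue keeps the loop quiet once the jobs have finished.
while (Get-DedupJob -ErrorAction SilentlyContinue) {
    Get-DedupJob | Format-Table Type, State, Progress, Volume -AutoSize
    Start-Sleep -Seconds 300
}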

 

Clean up the Garbage:

Start-DedupJob -Volume <VolumeLetter> -Type GarbageCollection
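Start-DedupJob also accepts -Wait and -Priority, so if you would rather the console block until the job finishes, something like this should work (E: again being a placeholder volume letter):

# Run garbage collection at high priority and block until the job completes.
Start-DedupJob -Volume "E:" -Type GarbageCollection -Priority High -Wait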

 

Check the status:

Get-DedupJob

 

Once both of the above jobs have completed (and it will take a while), you can remove the deduplication role from your server.
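Roughly, the final tidy-up looks like this from PowerShell (E: is a placeholder volume; FS-Data-Deduplication is the feature name for the role):

# Confirm the volume no longer reports deduplication savings.
Get-DedupStatus -Volume "E:"

# Then disable dedup on the volume and remove the role.
Disable-DedupVolume -Volume "E:"
Uninstall-WindowsFeature -Name FS-Data-Deduplication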

23 Replies to “Disabling Data Deduplication on Windows Server 2012 R2”

  1. Robert Affleck
    March 2, 2015 at 5:45 pm

    Thanks for this article. Microsoft really should have put in a prompt when you turn off deduplication, warning that it isn’t doing what you might think.

  2. Mike
    March 10, 2015 at 4:49 pm

    This is exactly what I was looking for. Thank you so much. It also turns out that enabling deduplication causes Windows Search indexing to skip all deduped files.

    Found that out the hard way.

    • Siegfried
      March 13, 2015 at 1:00 pm

      Thanks a lot!!

  3. Joachim
    April 17, 2015 at 12:45 pm

    Do I have to wait until one job is finished, or can I run both jobs at the same time? I started “start-dedupjob -Volume E: -Type Unoptimization” 3 hours ago and still have 0% progress 🙁

    • Joachim
      April 18, 2015 at 12:36 pm

      start-dedupjob -Volume -Type GarbageCollection does not work after Unoptimization: “HRESULT 0x80565323, The specified volume is not enabled for deduplication”. The first command actually disables dedup.

      • August 5, 2015 at 7:26 am

        You’ve disabled dedupe on the volume. You need to re-enable it but exclude the root folder from dedupe, then run the commands as stated.

        • John
          July 16, 2020 at 7:45 pm

          I got the same error. I hadn’t disabled it; it showed enabled for the virtual server. So, based on what you said, I disabled it and applied that, then re-enabled it for the virtual server and applied that. Then it let me run the garbage collection command, then disabled it again. Not sure if this was advisable or not. Unlike the optimization command, which took a long time, this second command was fairly quick. All the data from sysvol appears to be gone and files are reporting true size on disk.

  4. Ajay
    May 26, 2015 at 2:00 am

    Hi Nick,

    Would you have any idea how much data (in GB) can be un-deduped per hour? Just a rough idea.

    thanks
    Ajay

  5. Dave
    March 9, 2016 at 8:42 pm

    What method did you use to exclude the “entire drive” from active deduplication before running the unoptimization job? I’ve seen how to exclude all the folders, but not the whole drive.

  6. MarcK4096
    March 15, 2016 at 3:47 pm

    Joachim is right. “start-dedupjob -Volume -Type Unoptimization” will automatically turn off dedup on the volume. In fact, I’ve found that it will also clear out the ChunkStore folder. (If it completes successfully. More on that in a bit.) So, the method to turn off dedup is just to run “start-dedupjob -Volume -Type Unoptimization”. Perhaps Microsoft changed something since this article was written.

    Now, I did run into the problem that sometimes the unoptimize job would encounter a problem and exit early, leaving the ChunkStore intact. In that case, I was able to rerun the Unoptimization job again.

    So, my recommendation is to run it multiple times. If you try to run it on a volume that has dedup completely reversed, it will display an error. If the job had previously run partially, it will resume. So, my recommendation is to run “start-dedupjob -Volume -Type Unoptimization” multiple times on the volume until the error is encountered.

    • Nick
      March 15, 2016 at 4:51 pm

      Thanks for the comments, all. Being honest, I cannot remember why I added the second command, but I do know there was a reason, because I was left with junk. 🙂

  7. DedupQuestion
    October 7, 2016 at 6:52 pm

    The only previously deduplicated volume on my server has been completely unoptimized and deduplication has been disabled. However, when I try to remove the role, I get a warning. Is that standard, or is there something I’ve missed?

  8. Dave Piehl
    January 18, 2017 at 1:03 am

    Also, be very careful on systems that you may multi-boot between Windows Server 2012 R2 and 2016. The dedup mechanism is different: files deduped in 2016 cannot be read by 2012 R2 unless you boot back into 2016 and unoptimize them first. Learned that the hard way.

  9. Grant
    April 27, 2018 at 12:44 pm

    Worth mentioning: if you leave it to its own devices and come back to find no result from Get-DedupJob, check the “Data Deduplication/Operational” event log for:
    EventID: 6146
    Category: Data Deduplication Unoptimization Task

  10. John W
    August 17, 2018 at 12:37 pm

    Great explanation! I found this out the hard way after turning dedupe off and having old chunks cluttering up system volume information until the drive was almost full from new data.

  11. Anthony
    October 16, 2018 at 12:44 am

    Great article and comments. Helped me out a lot as I go to “undupe” a 4 TB volume on my 2012 R2 server where, due to the type of data, I was only getting about 5% dedupe.
    A few other observations:
    1. The ‘Unoptimization’ process has been running for over 2 weeks on my server and is still not complete. I did have to stop/restart the process once to apply security updates and reboot, but this looks like it is going to take a VERY long time and has a VERY high I/O impact on the server.
    2. Running “get-dedupjob” to get the status (i.e. the Progress value), I found it was sitting at 0% for many days, then suddenly jumped to 51% one day. Then over the next 2-3 days I saw 1% increments, and then it just seemed to stay stuck at around 57%. Given I had to restart the job, I’m back at 0% again.

  12. Corey Bryant
    November 8, 2018 at 6:02 pm

    Great article, but I have a question. I enabled dedupe on our 2012 server and it caused severe issues with one of our applications. I disabled it without going through the proper steps in this article. Should I re-enable it and then exclude and disable after that, or am I screwed?

  13. June 20, 2019 at 11:05 am

    I do not understand this part. Excluding the entire drive or partition, will that delete all the data?

    It could be explained better what needs to be excluded.

  14. mike mcsharry
    July 30, 2019 at 10:11 am

    Awesome advice from Nick – once again. That’s another pint or three I owe him!

    • Nick
      July 31, 2019 at 12:50 am

      I still accept payment in beer or gin. 😉

  15. Steve Irvine
    February 15, 2023 at 9:10 am

    Thanks, very useful…
