Storage
Make Windows Search service work with Data Deduplication service
The Windows Search service does not work together with the Data Deduplication service. Besides saving a lot of your customers a lot of time, it would be a strong feature, and one that customers actually expect.
https://technet.microsoft.com/en-us/windows-server-docs/storage/data-deduplication/interop
Thanks, and kind regards
Sven Andersen
301 votes
DFS Replication and Deduplication Integration
The deduplication feature makes Windows Server an attractive OS to use for a backup repository. There is just one issue: replication. One could use DFS Replication to replicate the backup files off-site. But DFS-R rehydrates the files and then uses its own compression to transmit them. This requires allocating space to DFS-R for staging. Plus, the DFS-R compression is likely not as efficient as deduplication.
It would be great if DFS-R could be configured to replicate the already deduplicated blocks. This would provide the most efficient transmission while eliminating the need for separate staging for DFS-R (in this use case).
3 votes
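The idea above can be sketched in a few lines. This is a toy Python illustration of chunk-level replication, not how DFS-R or the Data Deduplication chunk store actually work internally: the sender hashes each chunk and transmits only chunks the remote store does not already hold, so unchanged backup data never crosses the wire.

```python
import hashlib

CHUNK_SIZE = 64 * 1024  # fixed-size chunks for simplicity; real dedup uses variable-size chunking

def chunks(data: bytes):
    """Split data into fixed-size chunks."""
    return [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]

def replicate(data: bytes, remote_store: dict) -> int:
    """Send only chunks the remote store lacks; return bytes actually transmitted."""
    sent = 0
    recipe = []  # ordered chunk hashes the remote side would use to reassemble the file
    for c in chunks(data):
        h = hashlib.sha256(c).hexdigest()
        recipe.append(h)
        if h not in remote_store:
            remote_store[h] = c  # "transmit" the new chunk
            sent += len(c)
    return sent

store = {}
backup1 = b"A" * CHUNK_SIZE * 3 + b"B" * CHUNK_SIZE
sent1 = replicate(backup1, store)  # only the two distinct chunks are sent
backup2 = b"A" * CHUNK_SIZE * 3 + b"C" * CHUNK_SIZE  # mostly unchanged second backup
sent2 = replicate(backup2, store)  # only the one changed chunk is sent
print(sent1, sent2)
```

In this sketch the second, mostly identical backup costs only one chunk of transmission, which is the saving the request asks DFS-R to deliver without rehydration or staging.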
Have a file extension whitelist for deduplication
I'd like to specify that only certain file types be deduplicated, as our users tend to copy around big movie files, but I still want Windows Search to work for regular files (which are also small enough not to bother deduplicating).
See also the folder whitelist https://windowsserver.uservoice.com/forums/295056-storage/suggestions/7961025-de-duplication-feature-should-be-enabled-on-direct and https://windowsserver.uservoice.com/forums/295056-storage/suggestions/17888647-make-windows-search-service-work-with-data-dedupli
2 votes
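The requested policy is simple to express. Here is a minimal Python sketch, assuming a hypothetical whitelist (the extensions and paths are illustrative, and this is not an existing Data Deduplication setting): only files whose extension is on the list would be handed to the optimizer, leaving everything else untouched and searchable.

```python
from pathlib import PurePosixPath

# Hypothetical whitelist: only these extensions would be eligible for dedup.
DEDUP_EXTENSIONS = {".mp4", ".mkv", ".avi", ".iso"}

def dedup_candidates(paths):
    """Return only the files whose extension is on the whitelist."""
    return [p for p in paths if PurePosixPath(p).suffix.lower() in DEDUP_EXTENSIONS]

files = ["/share/movie.MP4", "/share/report.docx", "/share/image.iso"]
print(dedup_candidates(files))  # the .docx is skipped, so Windows Search still indexes it
```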
Add deduplication support to client OS
Today there is an enthusiast community* that runs the deduplication bits on client for various applications, e.g., keeping multiple virtual disk images on an SSD. Enabling it on a larger scale would let both clients and enterprises save costs by buying smaller SSDs and storing more data in aggregate. While the overall usage may remain small, there would be a dedicated following of fans and users who would find this feature highly useful.
*(These enthusiasts typically capture the dedup package from a Windows Server installation and then use the dism tool to add them to…
858 votes
De-duplication feature should be enabled per directory (not per volume or disk)
De-duplication should be able to be enabled on a per-directory basis (unlike the current volume-based approach). Currently we can enable it per volume and exclude some directories. Some applications in particular write big log files on the OS drive that contain lots of duplicated data. We can compress those drives by enabling the compression checkbox, but as compression works on a per-file basis we lose a lot of space to duplicated log content. For example, our Veeam backup software logs, located in the C:\Programdata\Veeam folder, total nearly 40 GB. This can be reduced to 20 GB with compression enabled…
32 votes
Inline deduplication
We use deduplication on volumes used as a VTL in System Center DPM, and it would be beneficial to have at least a deduplication pass done inline to minimize the storage requirement of the volume, since the backup is first written undeduplicated and a lot of free space is only recuperated once the deduplication job has run.
27 votes
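The difference the request is about can be shown with a toy Python sketch (purely illustrative — not the actual Data Deduplication implementation, and the chunk size is arbitrary): with inline dedup, duplicate chunks are filtered before they hit disk, while post-process dedup stores the full undeduplicated copy first and only reclaims the space later.

```python
import hashlib

CHUNK = 4096

def write_inline(data: bytes, store: dict) -> int:
    """Inline dedup: check each chunk before it lands on disk; return bytes written."""
    written = 0
    for i in range(0, len(data), CHUNK):
        c = data[i:i + CHUNK]
        h = hashlib.sha256(c).hexdigest()
        if h not in store:
            store[h] = c
            written += len(c)
    return written

def write_post_process(data: bytes, store: dict):
    """Post-process dedup: the full copy lands first (peak usage), a later job reclaims space."""
    peak = len(data)                   # the backup is stored undeduplicated first
    final = write_inline(data, store)  # later, the dedup job keeps only unique chunks
    return peak, final

data = b"x" * CHUNK * 100              # a highly redundant backup stream
inline_used = write_inline(data, {})   # never needs more than the deduplicated size
peak, final = write_post_process(data, {})
print(inline_used, peak, final)
```

The gap between `peak` and `final` is exactly the transient free space the volume must hold today, which an inline pass would avoid.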
Dedup dedicated or reserved file space
Dedup should reserve the space needed to successfully run its jobs on the disk so that the disk is never 'too full' to dedup. It would be great if this functioned kind of like page files where an admin could even direct the 'reserved' space to a different dedicated volume.
Dedup requires a certain amount of 'free' disk space to run. When using dedup on user-facing file shares, it never fails that some users suddenly drop a huge number of files onto the share, and then dedup fails all its jobs due to a lack of free…
1 vote
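The reservation policy being requested could look something like this Python sketch (a hypothetical admission check, not an existing Dedup setting; the 10% reserve fraction is an assumed example): the optimization job only starts while the volume still has the reserved headroom free, rather than failing mid-run.

```python
def can_run_dedup(free_bytes: int, volume_bytes: int, reserve_fraction: float = 0.10) -> bool:
    """Hypothetical policy: run the dedup job only if the reserved headroom is still free."""
    return free_bytes >= volume_bytes * reserve_fraction

# 1 TB volume with 50 GB free: below a 10% reserve, so the job would be deferred
print(can_run_dedup(50 * 2**30, 2**40))
# The same volume with 200 GB free clears the reserve, so the job may run
print(can_run_dedup(200 * 2**30, 2**40))
```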