Optimize-Volume -ReTrim gives error "Not enough storage is available to complete this operation"
We are working with a 3PAR SAN combined with a Hyper-V cluster. I have some virtual machines with direct-attached storage. When I issue the command Optimize-Volume -DriveLetter X -ReTrim, I always get the error "Not enough storage is available to complete this operation". The command defrag /L gives me an out-of-memory error. All volumes are over 2 TB.
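For completeness, this is exactly what I am running, plus how I am checking the volumes (X: is a placeholder for any of the affected drive letters; the fsutil output is just to confirm cluster size and volume size, since every affected volume is over 2 TB):

PS C:\> Optimize-Volume -DriveLetter X -ReTrim -Verbose
PS C:\> defrag X: /L /V
PS C:\> fsutil fsinfo ntfsinfo X: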
Re-creating DFS Replication between two folders
I have two DFS servers: SERVER1 and SERVER2. I have a namespace called "COMPANY", and one of the replicated folders is called "USERDATA", which contains around 880 GB of data. The DFS path is \\Company\UserData
SERVER1 had some problems and I had to remove \\SERVER1\UserData as a DFS folder target, and I turned off replication with SERVER2. When it came time to bring it back online, I followed some advice and used Robocopy to make sure the data in \\SERVER1\UserData matched the UserData folder on Server2 exactly. This is the command I used:
robocopy \\Server2\UserData \\Server1\UserData /e /b /copyall /r:6 /w:5 /MT:64 /tee /log+:d:\robocopy_dep.txt /v
To be sure, I then ran the command in reverse:
robocopy \\Server1\UserData \\Server2\UserData /e /b /copyall /r:6 /w:5 /MT:64 /tee /log+:d:\robocopy_dep.txt /v
The results show the two folders are now identical; there were no extra files, and no files were copied when I ran the second command because everything was identical already.
I then made sure to delete the "DfsrPrivate" folder from each server's share. In DFS Management, I deleted the replicated folder "UserData" from the replication group (but not the whole replication group). I made sure the changes had replicated to SERVER1 and SERVER2.
I created the replicated folders again and set SERVER2 as the primary. After about 15 minutes I checked to see if there was a backlog. There was no backlog going from SERVER2 to SERVER1, but going from SERVER1 to SERVER2 there was a backlog of 10k files, which grew to 50k within 10 minutes. The backlog list of the first 100 files was listing every subfolder in the UserData share, so it seems like it is queuing up the entire contents of the folder, even though the two sides are identical.
It does not appear to be moving anything as far as I can tell, since the size of the UserData share on SERVER2 remains the same, as does the UserData share on SERVER1. But having a backlog of hundreds of thousands of files is just not good.
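For reference, this is how I am checking the backlog (the replication group name is a placeholder for ours):

dfsrdiag backlog /rgname:"ReplGroup" /rfname:"UserData" /smem:SERVER1 /rmem:SERVER2

That lists the backlog count and the first 100 backlogged files going from SERVER1 to SERVER2; swapping /smem and /rmem shows the other direction.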
Am I doing something wrong? Can this be avoided or is it just part of the process and I have to wait for it to work itself through?
Thanks
DFSR performance question - high latency and high bandwidth - Windows 2012 R2
Hello,
First question here...
We have a pair of backup servers, connected with a 1G WAN link (290 ms latency).
I am replicating backup files (mainly SQL files) between them (one-way), for DR/BCP purposes.
Get-DfsrState regularly shows hundreds of files in the Waiting state, and only 16 or so being downloaded at a time. The system is using very little bandwidth, much less than the 256 Mbps I have set in the replication properties.
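In case it helps, this is how I am counting them on the downstream server (I'm assuming UpdateState is the right field to group on):

PS C:\> Get-DfsrState | Group-Object UpdateState | Select-Object Name, Count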
Is there any parameter I can tweak to increase the number of files being transferred in parallel?
Before you ask: the WAN circuits are fine and currently about 80% free. I can easily saturate them using other technologies (rsync, Gluster, HTTP). The servers are doing very little as well (2x 6 cores, 32 GB RAM, lots of disks, CPU usually under 10%).
Any ideas on how to speed this up? I saw some registry settings in old articles for Windows 2008 R2, but nothing for 2012 R2.
Thanks
File Corruption During Sync Windows 8.1
Hi
I have recently introduced some Windows 8.1 systems into our environment: a Sony laptop and an unbranded custom build. Both systems are running Windows 8.1 Pro.
Our environment is as follows:
Windows Server 2008 domain
PPTP VPN for laptop users using a Watchguard X515.
Netgear ReadyNAS 2100 for Network Folders.
Folder redirection in place for Documents and Desktop.
Offline files in use. Users have the ability to make shared network files available offline if required.
The first device we had with 8.1 Pro was a Sony laptop for our Marketing Director. He has manually made a few shared network folders available offline for when he is out of the office and unable to connect to the VPN.
We first noticed an issue a few weeks ago when he came to me and told me that one of his documents was corrupt. The file was one of the ones he had made available offline. When he got into the office he ran a manual sync to push the updates to our shared drive. When he next came to open it, the file was corrupt, with the following message:
'Excel cannot open the file 'XXXX XXXX.xlsx' because the file extension is not valid. Verify that the file has not been corrupted and that the file extension matches the format of the file'.
I ran the file through Excel's file repair, which couldn't fix the problem, and I tried a few recovery tools from the internet, all of which said the file was unrecoverable. We tried to recreate the issue but couldn't, so we put it down to a one-off. A few days later, it happened again under the same circumstances.
I put it down to something happening while the laptop was offline; however, while I was researching this, the same problem happened on a Windows 8.1 Pro desktop. That system never leaves the office, which led to some confusion.
Everywhere I look points to ransomware; however, I have run full scans with Sophos and Malwarebytes, and both come back clean. The issue is only occurring on 8.1 Pro systems; 7 Pro and 8 Pro systems are fine.
The user with the 8.1 Pro desktop created a test spreadsheet for me, and this morning we managed to recreate the problem. When running a manual sync, Offline Files comes back with an error stating that the network copy has been deleted and asking whether to copy the local copy to the network or delete it. When we copy the local copy to the network and then try to open it, we get the previous error about file corruption.
Any help would be greatly appreciated, as I will have to roll the systems back to Windows 7 if I can't get this resolved, which would be a shame as the laptop has a flip screen to turn it into a tablet.
Cheers
Jim
Undoing Spanned volume?
I am not sure if I am asking the correct question. Please forgive me if I am asking this in the wrong place.
I have Windows Server 2012. For the purposes of this question, I have 3 disks: the primary partition (Disk 0) and two others (Disk 1 and Disk 2). My question does not involve the primary partition.
I shrunk Disk 1 so that I had unallocated space. I then created a spanned volume onto Disk 2 (extending the volume onto Disk 2). It did not do what I wanted, so I want to 'undo' it. Can I just delete the spanned volume (the space that was added to Disk 2)? Or, more accurately, how can I get back the unallocated space? Do I just shrink again? Or do I need to back everything up, reformat, and re-partition it all?
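From what I've read, the usual way to undo this is to back up the data on the spanned volume and then delete the whole volume in diskpart, roughly like this (the volume number is a placeholder, and as I understand it, delete volume destroys everything on the spanned volume, on both disks):

DISKPART> list volume
DISKPART> select volume 2
DISKPART> delete volume

After that, both extents should show as unallocated space again.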
Windows user cannot see files copied into a folder
File share is on a Windows 2008 (R1) server.
We use shared folders as mailboxes for users. When the file scanning person (Heidi) scans the incoming paper mail, it is placed in each user's individual mail folder on a shared drive. Each user has full control permissions on their mail folder. All of our users are running Windows 7 Pro x64, with the exception of two people who run Windows XP SP3. Heidi is one of those users.
Angela has a mail folder with full control permissions. When Heidi scans something from her desktop and places it in Angela's mail folder, Angela cannot see it from her computer. Anyone else can see it fine.
I set up a Windows 7 VM, logged in as Angela, and I can see the file with no problems. So the only time Angela can't see her files is when she's logged in to her own computer in her office. Does anyone know why?
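In case it's relevant, these are the commands I've been using to compare the ACLs on her folder and on a freshly scanned file (server, share, and file names are placeholders):

icacls \\server\mail\Angela
icacls \\server\mail\Angela\scan_001.pdf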
Combine Hardware RAID1 with SIMPLE TIERED Storage Spaces Windows Server 2012 R2
Configure the existing HDDs as RAID1 pairs, let's say 6 HDDs as 3 RAID1 pairs. Then add 2 SSDs and also configure them as a RAID1 pair.
Now install Windows Server 2012 R2 on the first RAID1 HDD pair, and then create a SIMPLE, TIERED Storage Space out of the remaining two RAID1 HDD pairs and the RAID1 SSD pair. Why do I want to do that?
a) Through this configuration I get a "big" tiered virtual disk with all the advantages of tiering and its performance.
b) I do not have to replace the HP SmartArray controllers.
c) I have full redundancy, since all disks are in RAID1 pairs.
d) I should be able to hot-swap failed disks without any problems, since the SmartArray controller handles the disks at that layer; Windows Server should not even notice the disk failure (apart from the performance degradation).
Does this work? I think it should, because configuring a Storage Space with the "SIMPLE" storage layout means no redundancy and no "software" RAID functionality at the OS layer, so the hardware RAID logic of the SmartArray controller should not interfere with the Windows Server Storage Spaces layer, right?
BTW: NO clustering of the virtual disk is planned; the virtual disk will be used only as a simple file and VHDX container.
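In case it makes the idea clearer, this is roughly what I intend to run (pool and tier names and the sizes are made up; I'm also assuming the RAID1 logical drives will report MediaType Unspecified behind the SmartArray, so they have to be tagged by hand after pooling):

$disks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName Pool1 -StorageSubSystemFriendlyName (Get-StorageSubSystem).FriendlyName -PhysicalDisks $disks
Set-PhysicalDisk -FriendlyName PhysicalDisk1 -MediaType HDD    # one line per RAID1 HDD pair
Set-PhysicalDisk -FriendlyName PhysicalDisk3 -MediaType SSD    # the RAID1 SSD pair
$ssd = New-StorageTier -StoragePoolFriendlyName Pool1 -FriendlyName SSDTier -MediaType SSD
$hdd = New-StorageTier -StoragePoolFriendlyName Pool1 -FriendlyName HDDTier -MediaType HDD
New-VirtualDisk -StoragePoolFriendlyName Pool1 -FriendlyName BigVD -StorageTiers $ssd,$hdd -StorageTierSizes 200GB,2TB -ResiliencySettingName Simple -ProvisioningType Fixed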
Any feedback on this idea?
Mark
DFS namespaces are not providing redundancy
Hello,
Here is my scenario: I have 3 servers, all hosting a namespace in a DFS environment.
Folder1 is replicated on each of these servers.
My shortcut is pointing to the namespace share.
If I shut down one of the namespace servers and try to pull up a shared folder within the namespace, nothing happens. Once I power the server back up, I am able to get to the folder and see my files.
I have checked that the folder is replicating and that it is correctly set up in the namespace, and all seems well.
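One thing I have checked from a client is the DFS referral cache, to see which folder targets it received (run right after opening the namespace path):

dfsutil /pktinfo     (or, on newer builds, dfsutil cache referral)

As I understand it, every folder target should be listed there, with the one in use marked ACTIVE.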
Am I missing a target setting or something along those lines? Any suggestions where to check?
NTBackup & Task Scheduler - Error 0x1f
I'm trying to create an NTBackup solution that backs up my hard disks weekly, combined with Task Scheduler to automate it on my Windows Server 2003 system.
The first time I tried it, everything seemed to work. However, after a physical office move, things somehow don't work anymore. Deleting and recreating the whole thing doesn't help either.
Running the backup manually via Backup & Restore works, but when it is run from Task Scheduler, it returns error code 0x1f:
"Monday Backup.job" (ntbackup.exe)
Started 10/6/2008 10:08:34 AM
"Monday Backup.job" (ntbackup.exe)
Finished 10/6/2008 10:08:35 AM
Result: The task completed with an exit code of (1f).
Trying to run the scheduled task by right clicking and selecting "RUN" yields the same result.
The full command line in scheduler is:
C:\WINDOWS\system32\ntbackup.exe backup "@C:\Documents and Settings\Administrator\Local Settings\Application Data\Microsoft\Windows NT\NTBackup\data\Saturday Full Backup.bks" /n "Saturday Full Backup.bkf created 10/4/2008 at 12:44 PM" /d "Set created 10/4/2008 at 12:44 PM" /v:yes /r:no /rs:no /hc:off /m normal /j "Saturday Full Backup" /l:s /f "Z:\6. Saturday\Saturday Full Backup.bkf"
There should not be any permission problems with my Z: drive, as I can back up manually to that location.
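For what it's worth, exit code 0x1f is decimal 31, the generic Win32 error ERROR_GEN_FAILURE ("A device attached to the system is not functioning"), so it doesn't narrow things down much. This is how I've been dumping the task definition to check the run-as account and other settings (then finding "Monday Backup" in the output):

schtasks /query /v /fo LIST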
Any help would be appreciated.
How to make files/folders get updated ACLs by inheritance when you move them within the same volume?
Hi everyone!
I'm having some issues with my W2008R2 file server (using DFS). The problem: when a user connected to a file server share (from a W7 box) moves (drag and drop) files or folders from one folder to another on the same volume of the server, the ACLs are not automatically updated to those of the new parent folder; in other words, the object does not inherit the new parent folder's ACLs.
I've read this article: http://support.microsoft.com/kb/310316
but it says:
- Make sure that the user account that is used to move the object has the Change Permissions permission set. If the permission is not set, grant the Change Permissions permission to the user account.
Users have the "Modify" permission set on both folders (the permissions are granted to AD domain local groups, of which the users are members). As you know, the "Modify" permission does not grant users the ability to change permissions, which is correct, because you don't want users to be able to modify permissions on a file server.
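I know I can re-apply inheritance by hand after a move with something like this (the path is a placeholder for the destination folder), but that doesn't scale:

icacls "D:\Shares\Dept\MovedFolder" /reset /t /c

(Here /reset replaces the ACLs with the inherited ACLs, /t recurses, and /c continues on errors.)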
The question is: is there a way to fix this issue so that the ACLs are updated automatically?
Thanks!
Profile Login Issue
*** NEWBIE ***
Hi All,
We have been getting errors on our Win2K8 server which are causing problems and are getting worse. Logging in takes an age, and we are getting alerts and event log errors; see below.
appw: wlclntfy (warning) (0) (6005): The winlogon notification subscriber <GPClient> is taking long time to handle the notification event (Logon).
Event IDs:
“Windows detected your registry file is still in use by other applications or services. The file will be unloaded now. The applications or services that hold your registry file may not function properly afterwards.
DETAIL -
2 user registry handles leaked from \Registry\User\S-1-5-21-3423745762-702078250-474996060-296380:
Process 884 (\Device\HarddiskVolume1\Windows\System32\winlogon.exe) has opened key \REGISTRY\USER\S-1-5-21-3423745762-702078250-474996060-296380
Process 908 (\Device\HarddiskVolume1\Windows\System32\svchost.exe) has opened key \REGISTRY\USER\S-1-5-21-3423745762-702078250-474996060-296380\Printers\DevModePerUser
Is anyone else having similar issues? Can someone offer any direction? The client machines are Win7 and the servers are VMs running W2K8 R2. If you require any further details, please let me know.
P.S. I believe there is a profile hive cleanup application; can it be used on W2K8, or is it for W2K and W2K3 exclusively?
Thanks.
Files and/or folders not showing when drive accessed via UNC path (or mapped drive) and not logged into the domain
We have some users who are mobile, and domain authentication across the site-to-site VPN is not always successful. For this reason, I give them UNC paths for a shortcut or "Network Location", and otherwise map drives via a script where the mapping includes, e.g.:
net use S: \\10.10.22.11\Sharename /persistent:no password /USER:domain\username
This consistently gives the user access to desired shares when outside the domain.
However, there are times when a folder or file saved from within the domain does not appear accessible to the remote user via the aforementioned connection.
I have generally assumed this was a timing or propagation issue: if I browse from an affected computer and use the entire path including the filename, the file will open. If, however, no such complete path with filename is used, the files and/or folders remain invisible.
Since this is intermittent, I am afraid I have little further information.
My hope is this is common or at least known and there is something I can do to alleviate the issue.
Thank you in advance for any assistance.
Stuart
Stuart TechnoFile
UAC causing access denied for Domain Admin user w/ Full Control
NOTE: I've determined that this problem is being caused by UAC. Read my second reply after this post.
Hi all, I'm experiencing very strange behavior with share/NTFS permissions. My account (let's call it "Corbin Dallas") is a member of the Domain Admins group and is getting access denied when trying to open a file on a network share locally. The NTFS permissions on the share are configured for Full Control for Domain Admins. This is how I (Corbin Dallas account) created and configured the share:
\\server\TestShare$
Share Permissions:
- Authenticated Users - Full Control
NTFS Permissions:
- Configure folder to not inherit permissions and remove all existing permissions.
- Add Domain Admins with Full Control - This Folder, Subfolders, and Files.
- Add System account with Full Control - This Folder, Subfolders, and Files.
- Add Local Administrators group on server with Full Control - This Folder, Subfolders, and Files.
- Owner Rights with Modify - This Folder, Subfolders, and Files.
- Test User with Modify - This Folder, Subfolders, and Files.
Note that I have not explicitly assigned my Corbin Dallas account any permissions, but it should still have access, as it is a member of the Domain Admins group, which has Full Control.
So the problem itself comes about when I do the following: I log into a Windows 7 computer as "Test User" (which only has Domain Users membership) and access \\server\TestShare$ and create a .txt document in the root of the folder and log off the computer.
Then, on the server hosting TestShare$, I log in locally with my Corbin Dallas account, go into the TestShare folder, and try to open the .txt document that "Test User" created, but I get access denied. That doesn't make sense, since the Corbin Dallas account is a member of the Domain Admins group, which has Full Control NTFS permissions. Even stranger, if I log into another computer as Corbin Dallas and access the share through \\server\TestShare$, the .txt document opens fine without any access denied prompts.
So basically, I'm only denied access when I attempt to open the file locally on the server. What have I configured wrong? Any suggestions are greatly appreciated.
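Following up on the NOTE above: this is how I confirmed the UAC token filtering. From a normal, non-elevated command prompt on the server:

whoami /groups | findstr /i /c:"Domain Admins"

In the filtered token, Domain Admins shows up with the attribute "Group used for deny only", which matches the access denied; from an elevated prompt it shows as an enabled group.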
Event ID 4004 when re-enabling a Replicated Folder
I had DFSR working for a while between a 2K8R2 server and a 2K12 server. It is only being used for backup purposes, and the backup replica server is set to read-only. When I started to get massive amounts of backlogged files, I decided to see if I could redo the replication. Since the replicas are huge, I figured I'd go with the pre-existing data approach, as opposed to deleting the folders altogether. I made sure the hashes matched between the two servers and disabled the replicated folder (in the GUI, under Memberships > Replicated Folder > Disable). I did not delete any members, nor did I touch the data files on either end. I waited for Event ID 4114 on both servers before I did anything else. When I first tried to re-enable the replicated folder, I got an error message that security could not be set on the replicated folder (access denied). This went away after I restarted the DFSR service. However, now I get Event ID 4004 in the DFSR log:
Additional information: Error 9075 (the content set is read-only and can't be a primary). I don't know what's going on. All I did was disable and re-enable the replicated folder; I didn't even delete it!
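For what it's worth, this is how I have been checking which membership is actually flagged read-only (I'm assuming the ReadOnly flag is exposed through the DFSR WMI namespace; run on each server):

wmic /namespace:\\root\microsoftdfs path DfsrReplicatedFolderConfig get ReplicatedFolderName,ReadOnly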
Any help would be greatly appreciated.
Local share asks for a password
Hi all,
I have two servers:
SERVER1 - W2K8R2, running the security camera recording solution.
SERVER2 - W2K8R2, a Windows share storing the recorded files.
SERVER2 is dying, and I'll transfer all recorded files to SERVER1. The solution uses \\SERVER2\FILES as the reference to find old recordings.
Because of this, I need the path \\SERVER2\FILES to be accessible on SERVER1. I created a share called FILES and added two entries to the hosts file on SERVER1:
# localhost name resolution is handled within DNS itself.
#127.0.0.1 localhost
#::1 localhost
127.0.0.1    SERVER2.mydom.local
127.0.0.1    SERVER2
But when I open \\SERVER2\FILES on SERVER1, I am asked for a password.
I need to access this path without a password, like a normal share.
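From what I've read (KB281308 and KB926642), an alias that loops back to the local machine also needs two registry values on SERVER1, otherwise the server rejects the name and prompts for credentials; I haven't confirmed this on my build yet:

reg add HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters /v DisableStrictNameChecking /t REG_DWORD /d 1
reg add HKLM\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0 /v BackConnectionHostNames /t REG_MULTI_SZ /d "SERVER2\0SERVER2.mydom.local"

A reboot (or at least a restart of the Server service) is apparently required afterwards.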
Douglas Filipe http://douglasfilipe.wordpress.com
File/Folder permissions for a scanned drive
I am trying to figure out how to set this up. I have a root folder I created called "scans" with the permissions below:
http://tinypic.com/r/iw7zsx/8
I have a login script which creates the directory for the user (one time) in the root folder above, based on the %username% variable. The permissions come out like this:
http://tinypic.com/r/scx5xc/8
I have a GPO which maps this %username% subfolder for each user as the Z drive.
My scanner uses a generic account, "Filezilla_SVC", to scan to each user's subfolder. The issue I am having is that the permissions on the scanned file do not allow the user to even see the file. The permissions on the file look like this:
http://tinypic.com/r/zvtzpl/8
How can I get the permissions to propagate down to the file so the user has access as well? I have a lot of users so I don't want to edit permissions manually.
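I'm guessing the fix is to have the login script grant the user an inheritable ACE, something like this (server and domain names are placeholders; the (OI)(CI) flags make the grant apply to files created later, e.g. by Filezilla_SVC):

rem Create the user's folder once, then grant inheritable Modify access
mkdir "\\server\scans\%username%" 2>nul
icacls "\\server\scans\%username%" /grant "DOMAIN\%username%":(OI)(CI)M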
Thanks in advance!
2012 R2 Data Deduplication - Unoptimize single directory - part 2
This is a continuation of a similar post from Shaun TPG.
I am attempting to un-dedup files that I recently added to the "excluded" parameters in my dedup job settings (.pst files). I excluded both the "pst" extension type and a specific folder. Now I am trying to unoptimize these files, which are still deduplicated. I can see that my "InPolicyFiles" count no longer matches my "OptimizedFiles" count.
I have a few questions:
- How do I determine which files are deduplicated and which are not? I cannot run the cmdlet Get-DedupStatus against a specific folder. (See the sketch after this list for what I'm trying instead.)
- How do I expand a file that should no longer be deduplicated? The cmdlet Expand-DedupFile was mentioned in a previous thread. However, when I run it on a specific file, I get no output. I also included the -AsJob parameter, but I am unable to get any results from Get-DedupJob, and the optimized file total does not change.
- How do I expand an entire directory? Even if I could get the cmdlet Expand-DedupFile to work, when piping in a directory the command returns error 0x80070005 (Access is denied). I have verified ownership and full permissions, and running the command manually on an individual file works correctly. The previous post about unoptimizing a single directory also has a user experiencing the same issue. I should note that I do not get any access denied message when running the command on an individual file, although I still do not get any output/results.
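Here is the sketch mentioned above. I'm assuming deduplicated files carry the ReparsePoint attribute, and I suspect the access denied in the transcript below happens because dir pipes directories (folder1, folder2) into Expand-DedupFile, so I'm filtering to files only:

PS> Get-ChildItem 'D:\file archives\exchange mail archives' -Recurse -File | Where-Object { $_.Attributes -band [IO.FileAttributes]::ReparsePoint }
PS> Get-ChildItem 'D:\file archives\exchange mail archives' -Recurse -File | ForEach-Object { Expand-DedupFile -Path $_.FullName }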
Dedup is really cool, but I've lost many files to corruption, and I'm beginning to rethink its value in a production environment.
Thanks in advance.
PS D:\file archives\Exchange Mail Archives> dir "d:\file archives\exchange mail archives" | expand-dedupfile
expand-dedupfile : MSFT_DedupVolume.Path='D:\file archives\exchange mail archives\folder1' - HRESULT 0x80070005,
Access is denied.
At line:1 char:49
+ dir "d:\file archives\exchange mail archives" | expand-dedupfile
+
~~~~~~~~~~~~~~~~
+ CategoryInfo : PermissionDenied: (MSFT_DedupVolume:ROOT/Microsoft/...SFT_DedupVolume) [Expand-DedupFile
], CimException
+ FullyQualifiedErrorId : HRESULT 0x80070005,Expand-DedupFile
expand-dedupfile : MSFT_DedupVolume.Path='D:\file archives\exchange mail archives\folder2' - HRESULT 0x80070005,
Access is denied.
At line:1 char:49
+ dir "d:\file archives\exchange mail archives" | expand-dedupfile
+
~~~~~~~~~~~~~~~~
+ CategoryInfo : PermissionDenied: (MSFT_DedupVolume:ROOT/Microsoft/...SFT_DedupVolume) [Expand-DedupFile
], CimException
+ FullyQualifiedErrorId : HRESULT 0x80070005,Expand-DedupFile
Not able to extend the disk partition on our Server 2008
Hi all,
We are using Windows Server 2008 in our setup. On one of my Exchange servers I want to extend a partition. I have already extended this partition before, but now it won't extend and shows the error below.
In the Disk Management view I can see the partition in a split format; whenever I previously extended it, it showed the same way (I have extended it 3 or 4 times). Is it OK to delete the split partition?
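In case the method matters, the command-line equivalent of what I am doing would be roughly this (volume number and size in MB are placeholders):

DISKPART> list volume
DISKPART> select volume 2
DISKPART> extend size=10240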
Eagerly awaiting solutions.
Jags
Should I format the USB flash drive first, or recover the data first?
Extend C partition
Hi,
I have a disk that I had previously partitioned so that I had a separate system drive and data drive.
My system drive is filling up, though, so I shrunk my data drive and intended to extend my system drive into this space, but I can't see how I can extend it and use the space. Can anyone give me some tips or let me know why not?
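My guess, after reading around, is that Extend Volume only works into unallocated space that sits immediately after the volume, and the space I freed sits after the data partition rather than after C:. For reference, the command-line equivalent I was going to try (C is my system drive):

DISKPART> select volume C
DISKPART> extend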
Thanks Andy