A lot has been said and written about the infamous MAX_PATH limitation imposed by the Windows API. Even in 2013 it remains a major pain for Windows File Server administrators, and as such it has been over-discussed but never properly solved. My opinion is that, unless a major architectural rework is done at Redmond, it is up to system administrators to keep an eye on their filesystems and avoid, as much as possible, paths that go over that hardcoded maximum.
That's why I want to share the PowerShell script I use to find the paths that exceed 260 characters in length, 260 being the value of MAX_PATH since the early days of the Windows API.
function Get-Longpaths {
<#
.SYNOPSIS
Retrieves a list of the paths that are too long to be managed with Get-ChildItem.
.DESCRIPTION
Retrieves a list of the paths that are too long to be managed with Get-ChildItem.
An object is returned containing the list of the paths longer than 260 characters.
The aim of this function is to help sysadmins handle paths which could cause the
"System.IO.PathTooLongException" due to their excessive length compared to what is
supported by .NET and the Windows API.
.PARAMETER Path
The parent path that you need to recursively check.
.PARAMETER Csvpath
The name of the CSV log file you optionally want to export your list of long paths to.
.EXAMPLE
Get-Longpaths c:\
Retrieves long paths on the system partition c: and shows them on the current host.
.EXAMPLE
Get-Longpaths f:\documents c:\mylogs\longpathnames.csv -verbose
Retrieves long paths inside the folder f:\documents and saves the
output to a CSV file named longpathnames.csv under c:\mylogs\.
It also shows additional information on the task being performed.
#>
[CmdletBinding()]
param(
    [Parameter(Mandatory=$true)]
    [string] $Path,
    [Parameter(Mandatory=$false)]
    [string] $Csvpath
    )
$options = [System.IO.SearchOption]::AllDirectories
$allfiles = [System.IO.Directory]::GetFiles($Path, "*", $options)
$allfolders = [System.IO.Directory]::GetDirectories($Path, "*", $options)
$toolongcontainer = @()
foreach($file in $allfiles)
    {
    if($file.Length -gt 259)
        {
        Write-Verbose "Adding $file to the list of too long paths"
        $toolongcontainer += $file
        }
    }
foreach($folder in $allfolders)
    {
    if($folder.Length -gt 259)
        {
        Write-Verbose "Adding $folder to the list of too long paths"
        $toolongcontainer += $folder
        }
    }
if(!($toolongcontainer))
    {
    Write-Verbose "No too long paths found. Good."
    $toolongcontainer = "No invalid path under $Path"
    }
if($Csvpath)
    {
    Write-Verbose "Exporting output to $Csvpath"
    # Wrap each path string in an object so that Export-Csv writes the path itself
    # instead of the string's Length property
    $toolongcontainer | Select-Object @{Name='FullPath';Expression={$_}} |
        Export-Csv -Path $Csvpath -NoTypeInformation -UseCulture -ErrorAction Stop
    }
return $toolongcontainer
}
Get-Longpaths C:\longpath -verbose
As you can see, I used the GetFiles and GetDirectories methods of the System.IO.Directory class. In particular, GetFiles returns the names of the files in a specified directory, while GetDirectories gets the names of its subdirectories. The cool thing with these methods is that, for some mystical reasons explained here and here, they return the full paths as string arrays, and that's all we need.
All the items returned by these two methods are then added as strings to a string array named $toolongcontainer for later analysis. The content of this array can be sent to a CSV file by passing the CSV file path as a parameter.
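Just to show what I mean, here's a quick check you can run in a console (the C:\temp folder used here is only an example): what comes back from these methods are plain strings holding the full path, which is exactly what the function compares against the 259 character threshold.

# A minimal check, assuming a folder named C:\temp exists on your machine
$options = [System.IO.SearchOption]::AllDirectories
$sample  = [System.IO.Directory]::GetFiles('C:\temp', '*', $options)

$sample.GetType().FullName   # System.String[]
$sample[0]                   # e.g. C:\temp\subfolder\somefile.txt - the full path, not just the file name
$sample[0].Length            # the character count compared against 259 in the function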
Once you have this information, you can take appropriate action, such as shortening the path, deleting the files and folders that exceed MAX_PATH, or moving them to a shorter path. That is up to you.
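As a rough illustration of the last option, the sketch below relies on robocopy, which copes with paths beyond MAX_PATH, to move a deeply nested folder to a shorter destination. The source and destination paths are made up for the example.

# Hypothetical paths, purely for illustration
$source      = 'D:\shares\projects\very\deep\nested\structure'
$destination = 'D:\archive\structure'

# /E copies all subfolders (including empty ones), /MOVE removes the source after a successful copy
robocopy $source $destination /E /MOVE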
If you need any clarification about this script, feel free to leave a comment and I'll be happy to answer. But remember that the easiest way to get help for a function is to dot-source the script and then use the built-in help system this way: man Get-Longpaths -full, where 'man' is an interesting alias for Get-Help.
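In case the dot-sourcing part is not clear: it just means running the .ps1 file with a leading dot and a space, so that the Get-Longpaths function stays defined in your session. Something along these lines, where the file name and paths are only examples:

# Load the function into the current session (the file name is just an example)
. C:\scripts\Get-Longpaths.ps1

# Read the comment-based help, then run the function against a folder
man Get-Longpaths -Full
Get-Longpaths 'D:\data' 'C:\mylogs\longpathnames.csv' -Verbose

Note that the last line of the script above also calls the function once, so you may want to remove that call before dot-sourcing the file.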
Hi, I am new to scripting and would like to know: first, can I just use the script as it is without making changes? Second, do I copy the script into a text file and run it, or just copy and paste it on the PowerShell screen? Please let me know.
Thanks
Hi, yes, you can save it as a .ps1 file, dot-source it and then call the function Get-Longpaths as it is, passing a parameter as described in the comment-based help. The parameter is the path to analyze.
Carlo
Hello,
I am very interested in this function. I am not Powershell proficient, but I do have Powershell 4.0 on a Windows 8.1 laptop.
Can you explain in detail how to get this function to run on my system or explain what you mean by "dot-source it"?
Thanks,
Tony
I find that when I run the script against files I have made with names longer than 260 characters, the script runs and reports the filename. When I do the same for a folder I am presented with an error.
I created a folder structure which has a couple of files over 260 and a couple of folders over 260. When it scans the folders I get
Exception calling "GetFiles" with "3" argument(s): "Could not find a part of the path
Exception calling "GetDirectories" with "3" argument(s): "Could not find a part of the path
Regards
Richard
This is a great concept for this script and I hope that you can help me get it to work. I've set line 73 as shown below:
Get-Longpaths "s:\Archived Files" s:\longpathnames.csv -verbose
Yet I get the following error:
PS C:\> .\LongFileNames.ps1
VERBOSE: No too long paths found. Good.
VERBOSE: Exporting output to
Export-Csv : Cannot validate argument on parameter 'Path'. The argument is null or empty. Supply an argument that is
not null or empty and then try the command again.
At C:\LongFileNames.ps1:68 char:40
+ $toolongcontainer | Export-Csv -Path $csvpath -NoTypeInformation -USECULTURE - ...
+ ~~~~~~~~
+ CategoryInfo : InvalidData: (:) [Export-Csv], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationError,Microsoft.PowerShell.Commands.ExportCsvCommand
No invalid path under s:\Archived Files
PS C:\>
You can also use Long Path Tool to sort out this problem
In the parameters, you defined the option $csvLOG but in the script, you use the variable $csvPATH which is never defined.
Thanks for pointing this out! I shall update the code. Anyhow, was the script useful to you?
Carlo
The best alternative solution, I think, is to use the Long Path Tool program. It's a recommended tool for the MAX_PATH limitation. :)
When I run this script, I get the following error. How do I change the script to give it full access to the path 'C:\Documents and Settings'?
ReplyDeleteException calling "GetFiles" with "3" argument(s): "Access to the path 'C:\Documents and Settings' is denied."
At C:\Users\admin\Documents\LongPathFinder_Admin.ps1:39 char:1
+ $allfiles = [system.IO.Directory]::GetFiles($path, "*",$options)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : UnauthorizedAccessException
Exception calling "GetDirectories" with "3" argument(s): "Access to the path 'C:\Documents and Settings' is denied."
At C:\Users\admin\Documents\LongPathFinder_Admin.ps1:40 char:1
+ $allfolders = [system.IO.Directory]::GetDirectories($path, "*",$optio ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : UnauthorizedAccessException