Downloading URL artifact can cause Errno::ENAMETOOLONG: File name too long #5
Sad trombone dot gif, didn't think of this. I should probably check what the limit on Windows is too.
Because you filed the bug, you get to have an opinion: switch to a hash for anything over 255 characters (or whatever the max on Windows is)? It won't be as nicely readable, but it should be better than nothing.
I'm not totally sure I understand what we get out of adding anything like that to the file name.
We need some kind of unique name for the cache file, the goal with Base64 was to make something that could be reversed to the original URL during manual debugging if needed, while a hash could not be, but is fixed-length. |
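The reversibility trade-off described above can be sketched as follows. This is a minimal illustration, not the project's actual code, and the URL is a made-up example:

```ruby
require 'base64'

# Hypothetical artifact URL; the real URLs come from the resource definition.
url = 'https://releases.example.com/v1.2.3/consul-template.zip'

# URL-safe Base64 yields a legal filename (no "/" or "+" characters)
# and can be decoded back to the original URL during manual debugging.
cache_name = Base64.urlsafe_encode64(url)
original   = Base64.urlsafe_decode64(cache_name)
```

A hash would lose this round-trip property but has the advantage of a fixed length.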
I would have thought converting the URL to a filename would be sufficient for most cases, if not all. Generating a hash seems simplest; I don't think you lose much by doing that.
I did think that, but as you noticed this can make invalid filenames when very long ;-) |
I meant converting |
Unfortunately that is not unique enough; I ran into problems early on with downloads of the same file name but in different folders.
Anyways, I'll fix this up shortly. |
What about hashing the URL with SHA1? We are also running into this problem. I can submit a PR if this solution is acceptable. |
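The SHA1 suggestion would look roughly like this. This is a hedged sketch of the proposed fix, not the library's implementation, and the URL is a placeholder:

```ruby
require 'digest'

# Hypothetical overly long URL; real ones come from the resource attributes.
url = 'https://releases.example.com/' + ('a' * 300) + '/artifact.tar.gz'

# SHA1 produces a fixed 40-character hex digest regardless of URL length,
# so the resulting cache file name can never exceed the 255-character limit.
cache_name = Digest::SHA1.hexdigest(url)
```

The digest is not reversible, but the same URL always maps to the same cache name, which is all the cache lookup needs.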
I changed this a while ago to use a base64 of the resource name and only the last path component of the URL, you're still hitting something too long though? |
Oh yes. Especially with consul-template. Why not just use …
An example:
Ahh okay, so the part of the workaround I didn't mention is you can use the …
Looks like the code generates a unique name for the cached file by converting the URL to Base64. This can easily exceed the 255-character file name limit if the URL is long enough.
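The failure mode is easy to demonstrate, since Base64 output is about 4/3 the length of its input. A minimal sketch with a made-up URL:

```ruby
require 'base64'

# A URL under 255 characters already yields a Base64 name longer than 255,
# which is what raises Errno::ENAMETOOLONG when used as a file name.
long_url = 'https://releases.example.com/artifacts/' + ('x' * 200) + '.zip'
encoded  = Base64.urlsafe_encode64(long_url)
```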