check_parent performs a recursive search, but it shouldn't? #2
Comments
Yes, I did want to check the current working directory and each of its subdirectories, and not only the current working directory.
So in a way, it may be useless to check all of the subdirectories, but I chose to do this so that the project folder doesn't move. I could also just use os.path.isfile(".videolog") for this. Anyway, what do you think about this method, i.e. having an empty file that is only used for checking whether we are in a specific directory? Another idea - that could be worse - would be to keep in memory the path in which the script was executed, and to compare it with the current working directory.
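As a rough sketch of the two alternatives mentioned above (the names `in_project_root`, `START_DIR` and `still_in_start_dir` are made up for illustration, they are not part of the project):

```python
import os

# Option 1: test for the marker file directly in the current directory only.
in_project_root = os.path.isfile(".videolog")

# Option 2 (hypothetical names): remember where the script was started and
# compare that with the current working directory later on.
START_DIR = os.getcwd()

def still_in_start_dir():
    """Return True as long as the process has not changed directory."""
    return os.getcwd() == START_DIR
```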
Yes, we are talking about the same thing 😄 I feel that my initial issue lacked some important information and might therefore have been confusing 💫, allow me to correct that 😉 In the current implementation, you have chosen to trade usability (the tool takes time to perform the os.walk out of the box) for safety (the tool will not break itself out of the box) ⚖️ But, as you might have noticed, not many tools do that❗
In fact, almost no tool will check whether what you are doing makes sense, so long as it can perform what you requested.
There is no deep philosophical reason for that, it is a limitation inherited from earlier days of computer science (see the C programming language) 😆 This is sometimes (rarely in fact) referred to as a best effort policy, i.e. the tool will try to perform what the human requested to the fullest of its abilities. This is opposed to deterministic tools that will always produce a reliable result, but might restrain themselves. A python interpreter 🐍 that would only execute a script if it terminates would be an example of this: it cannot execute all scripts, but if it does run one then we have a reliable guarantee that the execution will eventually end. If you are interested in that sort of thing I would advise you to investigate functional programming with languages such as Haskell or Rust.

Historically, the reason why most tools didn't do that sort of verification is because it would have been too complex (or impossible). But the whole programming environment evolved to account for this weakness 💡, and in particular the tools 🔧 (applications, utilities) are never stored alongside user data 📖 on the computer. For instance on Windows, tools 🔧 can be found in C:\Program Files\ or C:\Program Files (x86)\ (for system-wide installs) or C:\Users\<username>\AppData\<some more things> (for user-specific tools). On the other hand, user data 📖 would be found in C:\Users\<username>\Documents for instance.

Thus, my issue with this trade-off is that you are giving up on usability (slower on large directories out of the box) to account for an edge case that should never happen. Plus it is kind of surprising when you read it 😄

With that being said, there is at least one notable exception to this: when you are debugging the tool, you will almost always do this next to its source code. Thus the problem you are tackling is a real issue for development, and I guess you wrote it because you had some misfortune in the first place 😉 But as such I believe this verification could be opt-in, and not opt-out as it is now. So instead of asking the tool to remove this safeguard 👮 with "sudo", you could instead add a debug flag that would raise it. That is just an example of how to do that, probably you will find a more suitable way. A hypothetical sketch of that idea is shown below.

About how the verification should be performed, I think that all the ways you have mentioned and tried are valid, although I like the one where you dynamically find the path of the executable ❤️ more, because it should have no false-positives ✔️
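Something along these lines, purely for illustration (the `--debug` flag, the message and the structure are invented, this is not the project's actual CLI):

```python
import argparse
import os

# Hypothetical opt-in safeguard: the verification only runs when --debug is passed.
parser = argparse.ArgumentParser()
parser.add_argument("--debug", action="store_true",
                    help="enable the source-folder safeguard")
args = parser.parse_args()

if args.debug:
    # Dynamically find the directory containing this script and refuse to run
    # if it is the very directory we are about to rearrange.
    script_dir = os.path.dirname(os.path.abspath(__file__))
    if os.path.abspath(os.getcwd()) == script_dir:
        raise SystemExit("Refusing to rearrange the tool's own source folder.")
```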
The tool moves and rearranges files from the current directory only, yet the check_parent function will look for the .videolog file in the current working directory and all of its sub-directories, recursively. This comes from the os.walk call in src/video_logging.py.
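Roughly speaking, the behaviour is of this shape (a sketch of that kind of recursive lookup, not the actual code from the repository):

```python
import os

def check_parent_sketch():
    # Every sub-directory below the current working directory is visited
    # before the search gives up.
    for _, _, files in os.walk(os.getcwd()):
        if ".videolog" in files:
            return True
    return False
```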
I do not think that there is an equivalent of os.listdir that returns only files in python. One way to circumvent this is to take advantage of the fact that os.walk returns a generator object (and not a list). This means that instead of getting all the files recursively at once, os.walk (with its default top-down order) yields those in the current working directory first, then those in the first sub-directory, and so on.
Hence, one way to only get the files in the current working directory without going through every sub-directory would be something like the following sketch, pulling only the first tuple that os.walk yields:
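```python
import os

# Consume only the first (dirpath, dirnames, filenames) tuple that os.walk
# yields: with the default top-down order this is the current working
# directory itself, so no sub-directory is ever visited.
_, _, files = next(os.walk(os.getcwd()))
found = ".videolog" in files
```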
Since the top directory is the first one to be walked through, this shouldn't be too inefficient.
That should also solve the issue mentioned in src/data.json, in the sudo section.
On most modern file systems, I think that a file look-up in a given directory should take almost constant time (correct me if I am wrong). Hence this issue is probably linked to the os.walk thing.
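A quick-and-dirty way to check that claim on a given machine (numbers depend entirely on the directory tree, this is only an illustration, not a benchmark from the project):

```python
import os
import timeit

n = 100
walk_time = timeit.timeit(
    lambda: any(".videolog" in files for _, _, files in os.walk(os.getcwd())),
    number=n) / n
isfile_time = timeit.timeit(lambda: os.path.isfile(".videolog"), number=n) / n

# The direct existence check is a single stat() call, while the recursive
# walk has to visit every sub-directory.
print(f"os.walk lookup:        {walk_time:.6f} s per call")
print(f"os.path.isfile lookup: {isfile_time:.6f} s per call")
```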