
Write better tests #8

Closed
trystyncote opened this issue Nov 13, 2023 · 3 comments
Labels
repository: tests Relates to the tests that Scrivid has in this repository.
Comments

@trystyncote
Owner

My tests for core functionality are... poor. For one thing, I'm not even testing the central function compile_video(), because I didn't know how to write tests for it. Now that I've switched video processing libraries via #7, I feel more comfortable writing tests relating to video creation, using a library that I'm comfortable sticking with (moviepy was in a state of limbo, so I had no guarantee it would keep working in future versions, or whether I'd eventually have to drop it).

I would like to write better, more accurate tests for the entirety of my functionality. I think the tests shouldn't focus on how the code is implemented, but on whether the result matches expectations, such as:

  • Checking if running a function doesn't result in an exception;
  • Checking if a function returns the correct output given the input; or
  • Checking if the side-effects of a function do not exist or exist in the correct way (such as logging).
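The three kinds of checks above could look something like this as pytest-style test functions. This is only a sketch: render_frame is a hypothetical stand-in, not part of Scrivid's real API, and the logger name "scrivid" is an assumption.

```python
import logging

# Hypothetical stand-in for a function under test; the name and
# behavior are illustrative only, not Scrivid's actual API.
def render_frame(index):
    if index < 0:
        raise ValueError("frame index must be non-negative")
    logging.getLogger("scrivid").info("rendered frame %d", index)
    return index * 2  # placeholder "output"

# 1. Running the function does not raise for valid input.
def test_no_exception():
    render_frame(0)  # passes by simply not raising

# 2. The function returns the correct output for a given input.
def test_output():
    assert render_frame(3) == 6

# 3. Side effects (here, logging) happen in the expected way.
def test_logging_side_effect():
    records = []
    handler = logging.Handler()
    handler.emit = records.append  # capture records instead of writing them
    logger = logging.getLogger("scrivid")
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    try:
        render_frame(1)
    finally:
        logger.removeHandler(handler)
    assert any("rendered frame 1" in r.getMessage() for r in records)
```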

I think these tests should be written so I can develop in a red-light-green-light testing fashion, where I center my design around the tests I'm writing alongside it.

@trystyncote trystyncote added pyANY Does not relate to a specific version of Python. repository: tests Relates to the tests that Scrivid has in this repository. labels Nov 13, 2023
@trystyncote
Owner Author

I'm adding a note after commits a17609f / 4fe26d1, a pair of commits that finalized the basic outline for testing the results of compile_video(). It took a while, since I managed to create a cubic-time algorithm by mistake (oops), and I was comparing two video files that I thought were identical but weren't. Oh well.

These new commits demonstrate an outline for future tests of compile_video()'s results, but they currently test it using sample/01_FIRST/main.py, importing it in a slightly hacky way. Nothing is inherently wrong with this approach, and I chose it due to time constraints. However, the sample directory is meant to hold example scripts that handle complex logic and demonstrate the flexibility of this project, whereas the tests need to check specific input/output pairs.

As a result, I want to include a custom set of sample scripts that are much shorter and handle a very limited scope and duration. The sample currently in use is a 20-second video at 30 frames per second, which is a fairly expensive video to check. The smaller samples could instead run around 4-5 seconds with only a little logic. In fact, they are likely to be hard-coded in their construction, rather than reusing the full sample's looping tactics, to discourage stramineous (filler) code, since I think clarity in the samples is critical for a testing environment.

I think this approach also benefits the test_motion_tree.py file, since it uses a series of instruction objects to check motion_tree parsing. The approach I took there, with dedicated functions for specific sets of instructions, is acceptable. But sharing the same samples as the mini-video testing would allow more dynamic testing and reduce repeated kinds of tests.

Ultimately, I think the sub-samples should just return a list of the instructions plus the metadata object. The motion_tree tests would leave the metadata object unused and pass the instructions to motion_tree.parse, while the video-creation tests could take both objects and pass them straight to compile_video().
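The sub-sample shape described above could be sketched like this. Everything here is a stand-in: the Metadata class, the placeholder instruction list, and the parse/compile_video parameters are assumptions used only to show the data flow, not Scrivid's real API.

```python
from dataclasses import dataclass

# Hypothetical metadata container; fields are illustrative only.
@dataclass
class Metadata:
    frame_rate: int
    window_size: tuple

def sample_short():
    """A mini-sample: returns (instructions, metadata) for reuse by
    both the motion_tree tests and the video-creation tests."""
    instructions = ["create_image", "move", "delete"]  # placeholder instruction objects
    metadata = Metadata(frame_rate=30, window_size=(100, 100))
    return instructions, metadata

# motion_tree tests: metadata goes unused, instructions go to the parser.
def check_motion_tree(parse):
    instructions, _ = sample_short()
    return parse(instructions)

# video tests: both objects get passed straight to compile_video.
def check_compile_video(compile_video):
    instructions, metadata = sample_short()
    return compile_video(instructions, metadata)
```

The point of the shared shape is that one sample module feeds both test files, so adding a new sub-sample automatically grows coverage in both places.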

I'm leaving this comment on the issue for my own future reference, since it is 23:30 and I'm not implementing this right now. I'm also putting a fair likelihood on forgetting this thought process, so I'd like to have a note of it.

@trystyncote
Owner Author

With 8042bb1, I am finally... finally done with the video tests. It was a real headache to reroute the program to work with the samples/ directory so I could use a smaller sample set. Since test_motion_tree.py also uses the samples/ directory, I can now say it's time to move on. I'm leaving this issue open for the moment, but that's largely because I need to make absolutely sure that the test coverage is good.

@trystyncote
Owner Author

Okay, I think my coverage is sufficient now. If I decide later that it's not, I'll just re-open this issue.

@trystyncote trystyncote added repository: tests Relates to the tests that Scrivid has in this repository. and removed pyANY Does not relate to a specific version of Python. repository: tests Relates to the tests that Scrivid has in this repository. labels Jan 30, 2024