
Pester tests needed per comments in #33 #35

Open
vercellone opened this issue Apr 13, 2016 · 31 comments

Comments

@vercellone
Contributor

vercellone commented Apr 13, 2016

Reference: PR #33

I currently have 93 Pester tests in my working folder, but I haven't yet worked out how best to share them. Right now I have a lot of stuff specific to my environment that is more or less hard coded. To make it dynamic for others, it either needs to read test input from a file outside the repository ($Home perhaps), or it needs to create a bunch of arbitrary items temporarily. The latter will require some New-* or Set-* functions that don't yet exist.

Anyone have suggestions or willing to help?

@dsolodow
Contributor

What I'd be inclined to do is modify the test scripts so that the environment-specific hard-coded values are variables at the top of the file(s), and then, before committing, set them to generics like example.com or such.

@vercellone
Contributor Author

vercellone commented Apr 13, 2016

Possible, but not very portable. When adding or modifying tests, that would complicate commits and merges, and would likely result in someone (probably me) inadvertently committing private settings.

I was thinking of including default/template XML or JSON file(s) that needn't change unless the tests change. End users could copy the file(s) to $HOME and edit theirs independently. In the absence of said file(s), the Pester script could throw a terminating exception with guidance.
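A minimal sketch of the guard being described (the config file name is assumed from later comments in this thread):

```powershell
# Sketch only: fail fast with guidance if the user-specific config is absent.
$configPath = Join-Path $HOME 'F5-LTM.json'
if (-not (Test-Path $configPath)) {
    throw "Test configuration not found. Copy the template F5-LTM.json from the repository to $HOME and customize it for your environment."
}
$config = Get-Content $configPath -Raw | ConvertFrom-Json
```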

@elijahgagne
Contributor

I think there are a lot of improvements desired in this, but https://github.com/elijahgagne/POSH-LTM-Rest.Tests/ is my first stab at this.

@vercellone
Contributor Author

vercellone commented Apr 26, 2016

I was thinking the F5Session(s) could be serialized to the same json file as the rest of the configuration. Something along the lines of:

{
  "Sessions": [
    {
      "Name": "N.N.N.N",
      "BaseURL": "https://N.N.N.N/mgmt/tm/ltm/",
      "Credential": {
        "UserName": "vercellone",
        "Password": "[encrypted securestring]"
      }
    },
    {
      "Name": "N.N.N.N",
      "BaseURL": "https://N.N.N.N/mgmt/tm/ltm/",
      "Credential": {
        "UserName": "vercellone",
        "Password": "[encrypted securestring]"
      }
    }
  ],
  "Partition1": "partition1",
  "etc..": "etc..."
}

Encrypted secure strings can only be decrypted by the same Windows user, so it still makes sense to me to have user-specific config files somewhere under the $HOME folder. But the location isn't critical. We could use an environment variable to allow overriding the default location.
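For reference, the DPAPI roundtrip being described could be sketched like this (session values are illustrative; ConvertFrom-SecureString without -Key encrypts with the current Windows user's key, which is why the file is user-specific):

```powershell
# Writing the config: serialize a credential with a DPAPI-encrypted password.
$cred = Get-Credential
$config = @{
    Sessions = @(
        @{
            Name       = '192.168.1.20'
            BaseURL    = 'https://192.168.1.20/mgmt/tm/ltm/'
            Credential = @{
                UserName = $cred.UserName
                Password = $cred.Password | ConvertFrom-SecureString
            }
        }
    )
}
$config | ConvertTo-Json -Depth 4 | Set-Content (Join-Path $HOME 'F5-LTM.json')

# Reading it back (only works for the same Windows user on the same machine):
$json     = Get-Content (Join-Path $HOME 'F5-LTM.json') -Raw | ConvertFrom-Json
$secure   = $json.Sessions[0].Credential.Password | ConvertTo-SecureString
$restored = New-Object System.Management.Automation.PSCredential (
    $json.Sessions[0].Credential.UserName, $secure)
```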

If you give me a couple of days, I'll generalize my tests and write a first draft of the json I believe necessary to support them. Even though I have 93 tests, my code coverage is still low (about 40%). So, there's plenty of work left to be done.

@elijahgagne
Contributor

I like the structure of your json config file and I like the idea of storing an encrypted password in the config file.

We use a privileged account manager (PAM) to store and manage the AD credential that I use to connect to the LTMs. That password gets randomized multiple times a day. I think a solution that could work is that we have an optional script that runs at the beginning of the Pester script (if this optional script exists). I could use this script to get my credential, encrypt it, and store it in the config file. Other people could use it to prompt for input and then make arbitrary changes to the config file based on input. People who didn't need this flexibility could ignore this script.

@vercellone
Contributor Author

Edited my previous comments to change $PSHOME to $HOME to more accurately reflect my intent of storing the configuration somewhere under the user's home directory (i.e. C:\Users\username) by default. Sorry for any confusion that may have caused.

@vercellone
Contributor Author

I intended to refine this further before sharing, but I just gave in and checked in a first draft to a new branch (pester). In it I've added 2 files:

  1. F5-LTM\F5-LTM.Tests.ps1 - The Pester tests.
  2. F5-LTM\F5-LTM.json - A sample test configuration file that needs to be customized and copied into your $HOME directory.

F5-LTM.Tests.ps1 includes 2 helper functions:

  1. Set-FullPathAndPartition - Derives full path and partition properties where appropriate based on the hierarchy in the config file in order to minimize repetition.
  2. ConvertPSObjectToHashtable - Facilitates conversion to Pester friendly -TestCases.
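The actual helper lives in F5-LTM.Tests.ps1, but a hypothetical sketch of what ConvertPSObjectToHashtable needs to do is: recursively turn ConvertFrom-Json output (PSCustomObject trees) into hashtables, since Pester -TestCases expects an array of hashtables:

```powershell
# Sketch only (not the committed implementation): recursive conversion of
# PSCustomObjects and arrays into hashtables for use with -TestCases.
function ConvertPSObjectToHashtable {
    param([Parameter(Position = 0, ValueFromPipeline)] $InputObject)
    process {
        if ($InputObject -is [System.Collections.IEnumerable] -and
            $InputObject -isnot [string]) {
            # Arrays: convert each element, preserving the array shape.
            @(foreach ($item in $InputObject) { ConvertPSObjectToHashtable $item })
        }
        elseif ($InputObject -is [psobject]) {
            # Objects: copy each property into a hashtable, recursing into values.
            $hash = @{}
            foreach ($prop in $InputObject.PSObject.Properties) {
                $hash[$prop.Name] = ConvertPSObjectToHashtable $prop.Value
            }
            $hash
        }
        else { $InputObject }   # Scalars pass through unchanged.
    }
}
```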

Proposed discussion points:

  • Is json the best format? Alternatives might be .txt, .yml, or a ps1 script (like elijah proposed above) which returns a collection of -TestCases. How those test cases get created would be at the discretion of that script.
  • Should the structure of the config file remain hierarchical? Or is repetition acceptable if it simplifies configuration file maintenance?
  • Multiple F5Session's are currently handled by looping through all tests for each session. It might be better if we can pass the F5Session and/or session details in the test case hashtables instead.
  • addremove - I've included an "addremove": "true" property for items that I know don't already exist to be used for Add-, New-, and Remove-* tests. If anyone has a better way I am open to suggestions.
  • There is currently no support for testing Apps.
  • Other - My Pester testing experience is limited, so any and all feedback is welcome and encouraged.

@elijahgagne
Contributor

Thanks for working on this. Here are some comments:

  • I like the change to use a json config file because it makes the config seem more standalone from the PowerShell. One improvement I would like to see/add is a way to reference an external config that is not stored in the module folder and is not checked into source control. But I do think we should have a dummy example config file as you have provided.
  • Repetition is acceptable to me if it is simpler
  • I like the idea of storing the sessions in the test case hashtables. This point makes me realize that in my environment, I have different sets of LTMs in different data centers and as a result, each set of LTMs will need different config details (e.g. different networks so different IPs). I could handle that by using different config files for each data center. Alternatively, it might be useful to be able to tie certain tests to certain F5Sessions.
  • addremove seems good to me at this point
  • I am interested in adding more support for Apps

@joel74
Owner

joel74 commented Jun 17, 2016

I agree with elijahgagne's comments. JSON seems a good format to me. That's my preferred format for config files, these days.

[BEGIN EDIT]
One thing that would be helpful to clear up is whether the goal is to provide unit tests or integration tests. If we want to provide unit test coverage, then the best approach seems to be mock objects included in the Pester test config, which don't need to change because no actual communication with an LTM is happening.

If, on the other hand, the goal is to provide integration tests, then I'd be in favor of a JSON-formatted config file, with an obvious sample file in source, and the understanding that all users must create their own, custom config.

[END EDIT]
I also agree that repetition is preferred if it simplifies. I'll defer to your judgment on the sessions as hash tables and addremove items for the moment. Thanks.

@vercellone
Contributor Author

  • The config file provided is an example only. The Pester test reads from $HOME\F5-LTM.json (not the one in the module folder).
  • Obviously, I chose the integration test path. Both would be ideal. But, mocking alone wouldn't catch the integration issues discovered as the result of our diverse environments.
  • Although I like JSON, the more I think about it the more I like the idea of simply dot-sourcing a PowerShell script which constructs Pester friendly TestCases (hashtables).

Reasons

Flexibility

  • Provides extensibility necessary to address elijah's PAM and multiple LTM use cases.
  • Users could build hashtables inline or import settings from csv, a database, JSON, or wherever they like.

Familiarity

  • JSON is unforgiving and can be difficult to troubleshoot even for those familiar with it. For those unfamiliar, it can be a nightmare.
  • It may be safe to assume the target audience knows PowerShell, but probably not safe to assume they know JSON.

Thoughts?

@elijahgagne
Contributor

Either dot-sourcing a PS script or using a JSON file works for me. I agree that a PS script is easier and more flexible, although I would say that a JSON file feels cleaner. I do not have strong feelings either way and would be happy to do it whichever way you and everyone else prefer.

@joel74
Owner

joel74 commented Jul 12, 2016

I'm also okay with either JSON or dot-sourcing PS. It feels like some good thought and time has already gone into this endeavor, and I'd hate to see it lose momentum.
On a separate note, I'm out here in Seattle to visit F5 for the next couple days. If either of you guys have comments, questions or requests you'd like me to pass along to the devcentral team, please do so. You can email me at jnewton(at)springcm(dot)com. Cheers.

@vercellone
Contributor Author

Commit 566bfb3 is available as a preview. It is not PR worthy yet, but IMHO it is a much cleaner solution than JSON. I'll follow up with a sample test case creation script soon.

@vercellone
Contributor Author

Commit bbe943e

  • Uses a single $F5LTMTestCases variable (hashtable of testcase collections) instead of a variable per test case collection. If you have an existing file, you can simply replace $Get_ with $F5LTMTestCases.Get_.
  • The test case creation script is built in. The first time you run Invoke-Pester, if no $HOME\F5-LTM.TestCases.ps1 file exists, one will be created as a template. You can fill in the blanks or inject whatever logic you deem appropriate to hydrate each collection (or not - just comment out or remove lines involving test cases you don't need).
  • I've included a pseudo test case which outputs warnings for empty test cases. Run Invoke-Pester -ExcludeTag 'Validation' to suppress these warnings.
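The first-run template behavior described above could look roughly like this (the template file name and error text are illustrative, not taken from the commit):

```powershell
# Sketch of the built-in template creation: copy a template on first run,
# then dot-source the user's customized copy to populate $F5LTMTestCases.
$testCasePath = Join-Path $HOME 'F5-LTM.TestCases.ps1'
if (-not (Test-Path $testCasePath)) {
    Copy-Item (Join-Path $PSScriptRoot 'F5-LTM.TestCases.template.ps1') $testCasePath
    throw "Created a template at $testCasePath. Customize it, then re-run Invoke-Pester."
}
. $testCasePath   # defines $Sessions and $F5LTMTestCases
```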

Questions, concerns, or suggestions?

@elijahgagne
Contributor

Thanks for working on this. I just got back from vacation. I'll give it a try tomorrow or Tuesday.

@elijahgagne
Contributor

Firstly, I think there is a lot of good work here and I've learned a few things reading through it. I still need to work through this more and I hope to be able to provide better feedback soon. But here are some initial thoughts/suggestions:

  • I like storing the config file in $HOME, but I would like it more if that was a default and there was a way to override it. I've been using an environment variable for this purpose in another module. If the environment variable is present, it's used to specify the folder where the config is. If the environment variable is not present, the default is used.
  • I've found initially reading and understanding the config file a bit daunting. I am finding that I need to comment out everything and then work through each test case one by one. I'm getting a lot of failures and now I'm trying to figure out if it's because (a) I misunderstood the test, (b) the test doesn't work for my environment because of certain assumptions, or (c) there is something broken in the module (unlikely since I assume all the tests pass for you). Right now there are 60 tests. When there are 300+, it will be a real hurdle for a newcomer.
  • I might find it helpful if there was a way to tie certain config/tests to certain F5 sessions. We have LTMs in different data centers and the objects on each set of LTMs are generally unique.
  • I am not positive that I like the repetition that I previously said I did. As I dig into this more, I'll see if I can come up with a suggestion that keeps things simple, but helps reduce the repetition.
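The environment-variable override suggested in the first bullet could be as simple as this (the variable name $env:F5LTMTestCasePath is hypothetical):

```powershell
# Sketch: use an environment variable to locate the config folder,
# falling back to $HOME when the variable is not set.
$configFolder = if ($env:F5LTMTestCasePath) { $env:F5LTMTestCasePath } else { $HOME }
$testCasePath = Join-Path $configFolder 'F5-LTM.TestCases.ps1'
```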

@vercellone
Contributor Author

Thanks for the feedback. Looking forward to improving this greatly. Hopefully my comments below will prove helpful.

  • Conditional environment variable override of the test script location: c4de1ce.
  • I agree that the default script is verbose and difficult to read for those unfamiliar with Pester -TestCases (possibly impacting a large percentage of this module's target audience). But, -TestCases represent the lowest common denominator, and therefore I believe exposing them offers the greatest flexibility. The challenge is to provide simpler alternatives for writing and/or generating them without sacrificing that flexibility.
  • Tying tests to specific sessions is not only possible, it is required. The $Sessions variable is a hashtable where the keys are your desired session names and the values are the F5sessions returned by the New-F5Session cmdlet (with the -PassThrough switch). Individual test cases must include a session name (see below). You can add as many sessions as you need. If you want one file per session, that is easily achieved by having the main script include a sub-script per session without necessarily imposing that convention on all users.
$Sessions = @{}
$Sessions.Add('Internal', (New-F5Session -LTMName '192.168.1.20' -LTMCredentials $credentials -Default -PassThrough))
$F5LTMTestCases.Get_HealthMonitor += @{ session = 'Internal' }
  • I don't like the repetition either. Convention over configuration is the direction I'd like this to evolve. But, regardless, I hope we can support full configurability down to the individual test case when necessary.

@elijahgagne
Contributor

Yes, that was helpful. Thanks, I'm now using the environment variable to specify the test location. I am liking things more as I get to know it better. I can tell you put a lot of work into this.

I currently have 43 passing tests and 17 failing ones. I think most of the failing ones are because of assumptions that are not true in my environment. Some of those are going to be tricky to work through and this is not the thread to try to do that in.

One thing I have started doing to make things less repetitious for me is defining variables at the top that are then used multiple times below.

One question: for a couple of tests, I could only get them to succeed using a fullpath, but I'm not sure that should have been required. For example, with Get_VirtualServer_ByNameArray, you have dummy values of name = '[name1]','[name2]','[name3]'. Is the expectation that just the name, and not the fullpath, should be sufficient?

I am happy with this approach and would be happy to help continue building it out.

@vercellone
Contributor Author

Get-VirtualServer (like all Get-* commands that retrieve elements in partitions) uses Get-ItemPath, which defaults to the Common partition if neither a fullpath nor a partition is specified. If (like @joel74) all elements are in the Common partition, then it should work fine. In that respect it is working as designed.

However, if it offers greater clarity, we could have any combination of Get_VirtualServer_ByNameArray, Get_VirtualServer_ByFullPathArray, and/or Get_VirtualServer_ByNameArrayAndPartition. Or, simply extend Get_VirtualServer_ByNameArray to allow an optional Partition param.

That said, consider the following: hashtables in -TestCases collections can include extraneous key/value pairs that don't map to named test case params. The extraneous pairs are essentially ignored. That could be leveraged to provide some powerful shortcuts. For example:

$MyVirtualServers = @()
$MyVirtualServers += @{ session = 'Internal'; name = 'app1-test.dyn-intl.com'; partition = 'Development'; fullpath = '/Development/app1-test.dyn-intl.com' }
$MyVirtualServers += @{ session = 'Internal'; name = 'app2-test.dyn-intl.com'; partition = 'Development'; fullpath = '/Development/app2-test.dyn-intl.com' }
$F5LTMTestCases.Get_VirtualServer_ByNameAndPartition =
$F5LTMTestCases.Get_VirtualServer_ByPartition =
$F5LTMTestCases.Get_VirtualServer =
$F5LTMTestCases.Get_VirtualServer_ByFullpath = $MyVirtualServers
# PipeLine and Array test cases could also be derived from $MyVirtualServers

That could lead to a shorthand, where you set all the possible attributes of your virtualservers in one collection and the test permutations could be derived from it automatically. Just food for thought at this point.
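To illustrate the derivation idea (a sketch continuing the $MyVirtualServers example above; the derived key names are illustrative):

```powershell
# Sketch: derive an array-style test case from the per-item collection,
# rather than declaring it separately.
$F5LTMTestCases.Get_VirtualServer_ByNameArray = @(
    @{
        session = 'Internal'
        name    = @($MyVirtualServers | ForEach-Object { $_.name })
    }
)
```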

@vercellone
Contributor Author

vercellone commented Jun 1, 2017

What do you think of the following syntax? Test permutations based on the same elements could be automatically derived or declared statically.

F5Test 'Get-HealthMonitor' {
    F5Session 'Internal' -Permutations Dynamic {
        @{ name = 'http'; partition = 'Common' }
    }
    F5Session 'External' -Permutations Static {
        @{ partition = 'Common' }
        @{ name = 'http'; partition = 'Common' }
    }
}

@vercellone
Contributor Author

I've begun experimenting with 'Unit' testing using Mock New-F5Session and Invoke-RestMethodOverride functions. It looks very promising for maximizing code coverage without requiring F5 device connectivity. The trick to this strategy is the implementation of the Mock Invoke-RestMethodOverride function. Does this sound like a worthy pursuit? Any other ideas?

@elijahgagne
Contributor

In regards to the syntax, I'm having trouble understanding. I'm thinking that would replace the objects created in F5-LTM.TestCases.ps1, right? I do like that better than what's in F5-LTM.TestCases.ps1 as it seems more clear to me how I'd fill things in.

I'm also trying to understand Dynamic vs Static permutations. Static seems straightforward. Should I think of Dynamic as a set of objects that can then be decomposed so that the parts of each object (e.g. just the name instead of the name+partition) could be tested? If that's true, then I'm wondering how you would write the tests for that. Dynamic makes it easier to write the object to test, but harder to understand what tests will be done against the objects (if I've understood correctly).

Unit tests using mock New-F5Session and Invoke-RestMethodOverride sounds very interesting. My first thought is, how are you going to implement mock versions of those that behave like the real thing. That said, I am really worried about testing things like deleting a virtual server. I don't currently have access to a non-prod LTM environment so my testing is going against a live environment, which means I need to be really careful. I have heard that my organization has a license that allows me to spin up a LTM vmdk and create my own testing environment. It would be even nicer if I could get that in a Docker container and run different versions of the LTMs. Given how much work we're putting into creating this useful tool for managing BIGIP devices, I wonder if we could get any help from BIGIP in having test environments for the purpose of testing this module.

@vercellone
Contributor Author

vercellone commented Jun 4, 2017

That syntax would serve as a Domain Specific Language that gets translated automatically into $F5LTMTestCases at test run time.

Your understanding of permutations is correct. Static would be 1 test case per hashtable declared. Dynamic would build as many permutations of the test cases as the properties supplied allow. The tests themselves wouldn't really change. There would just be a translation step that builds $F5LTMTestCases from the DSL. Your concern regarding understanding which test will be done is valid. That could possibly be addressed either using ValidationSets or Pester Tags to opt in to certain permutations.

My first draft of the Mock Invoke-RestMethodOverride implementation is reading a json file from disk for a given Uri, Method, and LTMVersion. I hope it will be possible to simplify this in some cases. Negative result tests (e.g. 404 Not Found) should be easy. I haven't tried it yet, but I'm thinking I will enable some sort of tracing on my copy of Invoke-RestMethodOverride so that I can capture a large number of known good responses just by running my current integration tests. Unfortunately, commands that vary depending on the request body (e.g. New-, Set-) may require a more imaginative solution.

In the absence of a dedicated testing environment risk could be mitigated by only performing New/Set/Remove functions on elements created by the test process itself. But, the risk would not be eliminated entirely in the event a test misbehaves. A test environment for integration testing would be ideal. Perhaps @joel74 can pursue that further.
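The canned-response mock being described could be sketched like this inside a Describe block (the Responses folder layout and key scheme are hypothetical, not the committed draft):

```powershell
# Sketch: replay a stored JSON response keyed by request method and URI.
# Within a Pester Mock body, $Method and $Uri are the mocked call's parameters.
Mock Invoke-RestMethodOverride {
    $key  = '{0}_{1}' -f $Method, ($Uri -replace '[^\w]', '_')
    $file = Join-Path $PSScriptRoot "Responses\$key.json"
    if (Test-Path $file) {
        Get-Content $file -Raw | ConvertFrom-Json
    }
    else {
        # No canned response on disk: simulate a 404-style failure.
        throw [System.Net.WebException]::new("404 Not Found: $Uri")
    }
}
```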

@joel74
Owner

joel74 commented Jun 6, 2017

It seems like a lot of the risk with integration tests could be mitigated by utilizing a dedicated Pester partition that's built up and torn down as part of the testing. It feels like the best way to perform the tests would be as integration tests on a real BIG-IP; otherwise the amount of effort in mocking up all the objects and methods needed to simulate the functionality would be overwhelming. Jason, do you have more details about your first draft of the Mock Invoke-RestMethodOverride that you can share?
Cheers,
Joel

vercellone added a commit to vercellone/POSH-LTM-Rest that referenced this issue Aug 23, 2017
Demonstrates mocking for behavior verification testing of
Get-HealthMonitor.
@vercellone
Contributor Author

I do. Finally.

My integration test efforts focus largely on state verification. Verify that the properties of the objects returned match those expected for the given request parameters.

My more recent unit test efforts sought to Mock the behavior of the F5 REST API, including dynamically simulating the output of Invoke-RestMethodOverride to support essentially the same state verification utilized for integration tests. Feasible in theory, but ultimately (as Joel suspected) overwhelming. It involves a ton of technical debt in terms of both raw data and conditional logic. And, although highly flexible, key permutations would quickly get buried.

e24bad7 aims to alleviate those shortcomings by instead focusing on behavior verification. It is much cleaner, and IMO will serve to better highlight edge case permutations such as those necessitated by variations between device versions.

Looking forward to feedback.

vercellone added a commit to vercellone/POSH-LTM-Rest that referenced this issue Aug 23, 2017
Intended for inclusion in previous commit e24bad7.
@elijahgagne
Contributor

This looks pretty amazing to me! I did not realize Pester had this capability. I like how this provides a safe way to run tests where I don't have to worry about a bad test messing up production.

@vercellone
Contributor Author

vercellone commented Aug 29, 2017

Thanks, @elijahgagne. Would you be willing to contribute some similarly structured unit tests for other functions?

I've been considering using Assert-MockCalled -ParameterFilter to verify the requested URI. Any thoughts on that?
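For context, the kind of assertion being proposed looks like this in Pester v3/v4 syntax ($session and the surrounding Mock setup are assumed to exist as in the referenced commits; the URI pattern is illustrative):

```powershell
# Sketch: verify the function called the (mocked) REST layer with the
# expected endpoint, rather than inspecting the returned state.
It 'Requests the health monitor collection from the expected endpoint' {
    Get-HealthMonitor -F5Session $session
    Assert-MockCalled Invoke-RestMethodOverride -Times 1 -Exactly `
        -ParameterFilter { $Uri -match 'mgmt/tm/ltm/monitor' }
}
```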

@joel74
Owner

joel74 commented Aug 29, 2017

I think this looks a lot cleaner and more manageable than the previous iteration, but I'm still trying to understand the end game here. Since the F5 session and external function calls are being mocked, then we're looking for unit test functionality, and testing whether the logic internal to a given function acts as expected. Is that correct?

In this health monitor example, what's being tested is the correctness of our parameter sets, whether the function accepts an object from the pipeline, and whether it can properly add the HealthMonitor object property. There doesn't appear to be any other logic to test in this instance. This is definitely helpful for ensuring our functions work as expected, like when there is internal manipulation of JSON objects, but I'm not clear on how this helps with testing against different versions of a BIG-IP device.

A large part of the functionality of this module involves communicating with a BIG-IP device, and if we're going to mock up all that communication, we have to know how all supported versions of a BIG-IP device would respond to any given call, which seems like a lot more work than just actually creating an integration-style test and making an actual call. What am I missing in regards to using this to test on different versions of the API?

My recent efforts have been around what's involved in making the F5-LTM module a child module of a parent F5-BIGIP module, in an attempt to ensure that the module(s) we're developing here follow the architecture of the BIG-IP modules. I was able to move the F5-LTM module with only minor modifications into this new project, though I have some work yet to do to get ArgumentCompletion.ps1 functioning in this new setup. I'm going to continue in those efforts, and as more pester tests become available, I plan to keep that new F5-BIGIP project in sync with the F5-LTM project.

Cheers,
Joel

@elijahgagne
Contributor

Yes, I can spend some time this weekend on unit tests for other functions. Unfortunately I don't have a lot of free time right now so I can't promise much.

Re: Assert-MockCalled -ParameterFilter, I'll need to get a better understanding of things so I'm sure I'll have some thoughts after this weekend.

@vercellone
Contributor Author

vercellone commented Sep 1, 2017

@joel74 Correct: These are unit tests. Get-HealthMonitor is the Subject Under Test (SUT). There is no version specific behavior for it.

I've added tests for New-F5Session that should help clarify my vision for edge case permutations (version specific in this case).

@elijahgagne I've added a single -ParameterFilter on 1 of the Get-HealthMonitor tests to demonstrate that as well. (Note that $Uri -match '[regex]' might be more suitable for some tests. And $Body -match '[regex]' will likely be useful for New/Set operations.)

As of c4eb502, code coverage is 92.59% for New-F5Session and 83.33% for Get-HealthMonitor.

Keep the feedback coming.

@elijahgagne
Contributor

I got set up for this and read up on mocking, but didn't get much further than that so far. I'll work on it some this week. I'm planning to start with Get-iRule.

joel74 added a commit that referenced this issue Mar 27, 2018
Foundational Pester unit tests for #35