
Tests for exceptions may fail sort of silently #265

Open
ericblade opened this issue Jul 2, 2024 · 3 comments

Comments

@ericblade

ericblade commented Jul 2, 2024

Hi! I have a fairly complex script that I can't share, but the basic gist of it is:

1. Call the Perforce Helix command-line client (p4.exe).
2. If p4.exe exits with error code 1, or produces no output, print a bunch of verbose diagnostic output and then throw.
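A minimal sketch of that shape (entirely hypothetical — the real Invoke-Perforce is more complex and isn't shown in this issue; parameter names beyond -Command and -Port are assumptions) might look like:

```powershell
# Hypothetical sketch of the wrapper described above, not the real script.
function Invoke-Perforce {
    param(
        [Parameter(Mandatory)][string]$Command,
        [string]$Port
    )
    # Run p4.exe and capture stdout and stderr together
    $output = & p4.exe -p $Port $Command 2>&1
    if ($LASTEXITCODE -eq 1 -or -not $output) {
        # Verbose diagnostics, then a terminating error
        Write-Verbose "p4 exited with code $LASTEXITCODE"
        Write-Verbose "p4 output: $output"
        throw "p4 $Command failed against port $Port"
    }
    $output
}
```

The throw here is what the test below expects `Should -Throw` to catch.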

The test is simple:

    It "Port option works" {
        InModuleScope JenkinsModule {
            { Invoke-Perforce -Command "info" -Port "99999" -MaxRetries 0 -RetryDelaySeconds 0 } | Should -Throw
        }
    }

The port option here is expected to fail, because we're telling it to connect to a network port that doesn't exist. So p4.exe fails, the Invoke-Perforce script throws, and... the Pester adapter puts up a bunch of messages in the corner notification area, but the test silently ends up marked neither passed nor failed.

The Pester error log reads:

2024-07-02 12:37:47.568 [info] Test Run Start: 1 test items
2024-07-02 12:37:49.015 [error] PesterInterface Error: Perforce client error:
2024-07-02 12:37:49.021 [error] PesterInterface Error: 	Connect to server failed; check $P4PORT.
2024-07-02 12:37:49.021 [error] PesterInterface Error: 	TCP connect to 99999 failed.
2024-07-02 12:37:49.022 [error] PesterInterface Error: 	TCP port number 99999 is out of range.
2024-07-02 12:37:49.059 [info] Test Run End: PesterInterface stream closed

So, for some reason, the Pester adapter is simply dropping out when these Perforce errors happen. I'm not sure whether it stops the run at that point, or just ignores the rest of the script's results afterward.

I suspect this is easily reproducible, but it might take me some time to build a smaller test case that I can actually provide.

However! If I set a breakpoint inside the function that is expected to throw, tell the Pester adapter to debug the tests, let the test hit that breakpoint, and then hit Continue, the test completes and is correctly marked as a success (or a failure).

@JustinGrote
Collaborator

Thanks for the report! The way the runner works today is basically to trap or try/catch terminating errors as they occur, but it isn't perfect, and some exception situations, such as code running inside a module, may be getting missed for some reason.

When debugging, the tests run in the PowerShell Extension Terminal rather than in a separate PowerShell process, and while it's the same "runner" script, the feedback mechanism may differ. I can't guarantee I'll have much time to look at this soon, but thanks for bringing it to the surface.

@ericblade
Author

ericblade commented Jul 3, 2024

OK, so, I forgot to mention: the notifications that VS Code is throwing up in the notification area are the same messages that the PesterInterface is reporting in its log. So while it's not a totally silent failure, the only visible indication is outside of the testing area, either in the VS Code notification box or in the Pester debug output. It's "sort of" a silent failure because the test window just doesn't say whether the test passed or failed.

And p4.exe is failing with an exit code of 1, so this might be easily reproducible just by making a script that exits with exit 1, then calling it and throwing. But you might also be right that being in a module has something to do with it. Yikes, the complexity of debugging!
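A self-contained repro along those lines (hypothetical file name; assumes Windows, with cmd.exe standing in for the failing p4.exe call) could be:

```powershell
# Repro.Tests.ps1 -- hypothetical minimal reproduction:
# a native command exits with code 1, the wrapper logic throws,
# and the test expects the throw.
Describe "Native exit-code handling" {
    It "throws when the native command fails" {
        {
            cmd /c exit 1   # stands in for the failing p4.exe call
            if ($LASTEXITCODE -ne 0) {
                throw "native command failed with exit code $LASTEXITCODE"
            }
        } | Should -Throw
    }
}
```

If the adapter drops this test the same way, the module aspect is ruled out; if not, the repro would need an InModuleScope variant.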

Although I'm off for the holiday, I'll probably take a poke and see if I can find any more information.

This actually brings me to a question: is there a way to mark a test so that the Pester adapter will ignore it? I think that's a feature of some other VS Code testing adapters, and it would be a handy temporary workaround for this problem, since the code does what it's supposed to and the test works outside of this adapter.

Just thinking aloud: I might try messing around with ErrorAction in the script, but that could make it behave incorrectly in other ways.
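For reference, the kind of ErrorAction experiment being alluded to (hypothetical — whether error records are actually the trigger in this issue is unconfirmed) would rely on the standard PowerShell behavior that `-ErrorAction Stop` promotes a non-terminating error to a terminating one that try/catch can see:

```powershell
# Hypothetical experiment: without -ErrorAction Stop, a non-terminating
# error slips past try/catch; with Stop, it becomes terminating and is
# re-thrown in a controlled way that Should -Throw can observe.
try {
    Get-Item 'C:\does\not\exist' -ErrorAction Stop
}
catch {
    throw "wrapped: $($_.Exception.Message)"
}
```

Whether this changes what the adapter sees is exactly the open question; it could just as easily mask the failure elsewhere.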

@JustinGrote
Collaborator

You cannot exclude individual tests from discovery, but you can exclude test files, so you could put that test in a standalone file, add it to the exclude settings, and potentially avoid this issue temporarily as a workaround.
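As a sketch of what that workaround might look like in `.vscode/settings.json` (the setting name and glob syntax here are assumptions — check the Pester Tests extension's own configuration documentation for the actual test-file discovery setting):

```jsonc
// .vscode/settings.json -- hypothetical; the exact setting name for
// test-file discovery globs belongs to the Pester Tests extension.
{
    "pester.testFilePath": [
        "**/*.Tests.ps1",
        "!**/Perforce.Tests.ps1"
    ]
}
```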

If it comes up in the popup, then you're hitting some kind of exception I'm not handling; handled exceptions show up in the test adapter pane.
