
Replace integration test hand-crafted traces with checking against golden file #1548

Open
anorth opened this issue May 27, 2024 · 1 comment
Labels
enhancement New feature or request testing

Comments

@anorth
Member

anorth commented May 27, 2024

Our integration tests include a mechanism to check an actual execution trace (method calls, params, return values, events, etc.) against an expected one. The trace is not often the primary endpoint of a test, but it provides a way to check some of the internal steps of a multi-actor execution which can be difficult to verify by other means.

These expectations are hand-crafted. Crafting them takes quite some time, and they add a lot of noise to the test files. Because of this, some integration tests don't have traces at all, and some of the available trace expectations are often omitted (some fields are optional).

A better way of doing this would be to generate the expected traces by running the test code, and check in those expectations as files. This would mean:

  • all tests have expectations that are checked
  • all fields in all expectations could be populated
  • authors & reviewers gain complete transparency into the internal execution
  • whenever anything changes, a diff in expectations is created and subject to code review
  • no additional effort is required when writing new tests

After the initial work to create something which generates the trace files, this could both improve coverage and reduce the ongoing effort needed to maintain the tests.
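To illustrate the proposal, here is a minimal sketch of the golden-file pattern in Rust. The function name, the `update` flag, and the serialized trace format are all hypothetical, not the repo's actual API; real tests would serialize the `ExecutionTrace` structure and gate regeneration behind an environment variable or cargo feature.

```rust
use std::fs;
use std::path::Path;

/// Compare an actual trace (already serialized to a string) against a golden
/// file checked into the repo. In update mode, overwrite the golden file
/// instead, so authors can regenerate expectations by re-running the tests.
/// (Hypothetical helper; names are illustrative only.)
fn check_against_golden(actual: &str, golden_path: &Path, update: bool) -> Result<(), String> {
    if update {
        // Record the current trace as the new expectation.
        fs::write(golden_path, actual).map_err(|e| e.to_string())?;
        return Ok(());
    }
    let expected = fs::read_to_string(golden_path)
        .map_err(|e| format!("missing golden file {:?}: {}", golden_path, e))?;
    if expected == actual {
        Ok(())
    } else {
        // A real harness would print a line diff here for code review.
        Err(format!("trace mismatch against {:?}", golden_path))
    }
}

fn main() {
    let golden = std::env::temp_dir().join("trace.golden");
    // First run in update mode records the expectation.
    check_against_golden("Send { to: f01, method: 2 }", &golden, true).unwrap();
    // Subsequent runs compare against the checked-in file.
    assert!(check_against_golden("Send { to: f01, method: 2 }", &golden, false).is_ok());
    // Any change to the trace shows up as a failure (and a golden-file diff).
    assert!(check_against_golden("Send { to: f01, method: 3 }", &golden, false).is_err());
    println!("golden check ok");
}
```

The key property is the last assertion: any behavioural change produces a diff in a checked-in file, which reviewers see in the PR rather than having to reason about hand-written expectations.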

@rvagg
Member

rvagg commented May 29, 2024

I'd really like this to be a thing. Currently it's far too easy to set these expectations to `None`, so the expedient path is to just opt out of checking anything that doesn't feel relevant, but that decreases coverage. I'm noticing that pattern in #1540 for events, which would be great to have if they were easy enough to generate, sanity-check, and backfill.
