
Streams tests guidance #204

Closed
alexjeffburke opened this issue Apr 25, 2016 · 5 comments

Comments

@alexjeffburke

Hi Streams folks,

I'd like to get involved, and I agree that the most pressing need is probably as many tests as can be mustered, so that we can start to move forward relying on the test suite for the correctness of changes, catching regressions, etc.

My current plan is to work on a branch in a fork of node that simply slings in tests. In the absence of another approach, I think it's best just to start somewhere; reviews can tell me whether individual cases are relevant or not.

That being said, having looked at what's there, I'd like to agree on how to structure these additions. Many of the current tests seem to exercise lots of things in one file; for example, the _writev test covers a few different encodings as well as cork() and uncork(), which makes me a little nervous. The following seem like important decisions:

  • What is the scope of each test?
    Should each test do one thing, or cover all cases of a given feature?
  • How explicit should tests be? There are any number of assertions one could make. Given that tests are split across files, the temptation is to assert everything in each one, but can we assume coverage accumulates across the set of files?
  • What syntax should these files use? I've been leaning toward using fat arrows in the node repo under the assumption we want to cover the semantics of the current implementation.

Thanks, Alex J Burke.

@mcollina
Member

Thanks @alexjeffburke! TL;DR get started, and then we review!

A lot of the streams tests were written long ago, so they have accumulated some cruft. I agree with you that most of the tests are complex and verify a full feature (with a lot of assertions). A lot of behavior in the current tests is also extremely hard to reason about.

IMHO we should have more tests with a narrow scope. Any other opinions?

What syntax should these files use? I've been leaning toward using fat arrows in the node repo under the assumption we want to cover the semantics of the current implementation.

Those were never updated to fat arrows, const, and the like; they need updating.

You will probably be waiting a while for consensus on this one, so start working on it and send a PR.

@calvinmetcalf
Contributor

My current plan is to work on a branch in a fork of node that simply slings in tests. In the absence of another approach, I think it's best just to start somewhere; reviews can tell me whether individual cases are relevant or not.

I'd add a single new test first so we can give you some detailed feedback on that one, and I'd suggest sticking to a single new test per pull request while you're getting the hang of it. That way, if 3 out of 4 are good but one needs work, we can merge the 3 without waiting for the 4th to be fixed, which may take a while.

@alexjeffburke
Author

Great! I've added a first couple at alexjeffburke/streams-tests. Looking forward to the feedback :)

@mcollina
Member

@alexjeffburke send a PR, we will review them there!

@alexjeffburke
Author

nodejs/node#6493
