Add integration testing w/external NextJS app (Edge runtime) #304
Conversation
@@ -27,27 +31,68 @@ runs:
uses: GuillaumeFalourd/clone-github-repo-action@main
Just a note on the previous step: `npm pack` bundles the code on the current branch, so the tests run the code devs are proposing to merge in their PR rather than whatever version is published to npm. This removes the need for us to merge things before testing them end to end.
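The pack-and-install flow can be sketched as follows. This is a hypothetical, dependency-free sketch: `npm pack` and `npm install <tarball>` are emulated with `tar`, and all paths and file contents are made up for illustration.

```shell
# Sketch of the pack-and-install flow: "npm pack" bundles the branch's working
# tree into a tarball, and installing that tarball makes the test app run the
# PR's code instead of the version published to npm. npm is emulated with tar
# here so the sketch runs anywhere.
set -e
work=$(mktemp -d)
mkdir -p "$work/ts-client" "$work/test-app/node_modules/client"
echo "pr-branch-code" > "$work/ts-client/index.js"

# "npm pack": bundle the working tree into a tarball
tar -czf "$work/client.tgz" -C "$work/ts-client" .

# "npm install ./client.tgz": unpack the tarball into the app's node_modules
tar -xzf "$work/client.tgz" -C "$work/test-app/node_modules/client"

# The app now resolves the PR's code, not the published release
cat "$work/test-app/node_modules/client/index.js"
```

The key property is that whatever is on disk in the branch's working tree is what ends up inside the app's `node_modules`, with no trip through the npm registry.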
exit 1
fi

if [ "$match_count" -lt 1 ]; then
This is where we assert on the response in CI
response=$(curl --silent --location --request POST "$API_URL" --header "PINECONE_API_KEY: $API_KEY")

indexName=$(echo "$response" | jq -r '.indexName')
Index names are randomized in the app. We then save each index name to the `GITHUB_ENV` environment variable file so we can delete it later; this lets simultaneous runs of this workflow proceed independently, from creating an index through deleting it.
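The handoff between steps can be sketched in plain shell. This is a minimal sketch with hypothetical values: `GITHUB_ENV` is emulated with a temp file, the endpoint's JSON response is hard-coded, and `sed` stands in for the `jq` parse so the sketch has no external dependencies.

```shell
# Emulate the GITHUB_ENV mechanism locally: a file of KEY=value lines that the
# runner exports into the environment of subsequent steps.
GITHUB_ENV=$(mktemp)

# Hypothetical response from the test app's endpoint
response='{"indexName":"edge-test-a1b2c3"}'

# CI uses `jq -r '.indexName'`; plain sed here to keep the sketch dependency-free
indexName=$(printf '%s' "$response" | sed -E 's/.*"indexName":"([^"]*)".*/\1/')

# Persist the randomized name so a later cleanup step can delete the index
echo "indexName=$indexName" >> "$GITHUB_ENV"
cat "$GITHUB_ENV"
```

Because each run writes its own randomized name into its own run-scoped environment file, concurrent workflow runs never fight over the same index.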
Would it be possible to port the logic for the actual testing to an explicit TypeScript / Javascript file that handles the calling / asserting, etc? Something similar to `IntegrationRunner.js`. I think this will generally be easier to build out and manage if it lives in a code-based script rather than in a GH actions file.
Definitely! Done. Now we have a file called src/end-to-end/localRuns.ts
src/end-to-end/edge-end-to-end.sh
Outdated
# Hop into ts-client-e2e-tests repo, install deps, link local ts-client repo, and start the Next.js server
pushd "$dir/ts-client-e2e-tests"
npm install
npm link @pinecone-database/pinecone
This will overwrite the portion in the app's `package.json` that says `"@pinecone-database/pinecone": "^3.0.4-rc.2024-10.0"`. Instead, it'll link to your local version.

You must run it after `npm install`, or else `npm install` will change `package.json` back to `"^3.0.4-rc.2024-10.0"`, since that's the latest version on npm as of this writing.
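The ordering constraint can be pictured with plain symlinks: `npm install` lays down the registry copy (and, run afterwards, would clobber a link the same way), while `npm link` replaces that copy with a symlink to the local checkout. A dependency-free sketch with made-up directory names:

```shell
set -e
app=$(mktemp -d)
mkdir -p "$app/node_modules" "$app/local-client"
echo "local" > "$app/local-client/version.txt"

# "npm install": drops the published copy into node_modules
mkdir -p "$app/node_modules/pinecone"
echo "registry" > "$app/node_modules/pinecone/version.txt"

# "npm link": swaps that copy for a symlink to the local checkout --
# this step must come after the install, or the install undoes it
rm -rf "$app/node_modules/pinecone"
ln -s "$app/local-client" "$app/node_modules/pinecone"

cat "$app/node_modules/pinecone/version.txt"   # resolves to the local checkout
```

Running the "install" step again after the "link" step would recreate the real directory and discard the symlink, which is exactly why the link must come last.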
src/end-to-end/edge-end-to-end.sh
Outdated
next dev
popd

# npm uninstall -g next  # Uncomment if you do not want to keep the global Next.js package
We do have to install NextJS globally, or else the `next dev` command (which spins up the app on your `localhost:3000`) won't work when run from the `ts-client` repo, which is where this bash file is executed. If you don't want Next globally, you can uncomment line 54 and uninstall it.
I'd maybe add a comment to the script here stating the same thing.
which script?
Nice work getting this stood up; there are a lot of moving pieces here. I've left a few questions, but we should maybe sync on this so you can walk me through some of the details. I feel like there are a couple of things I'm confused about, but it's probably just lack of knowledge around some of this stuff.
CONTRIBUTING.md
Outdated
Simply run `npm run test:end-to-end -- <clone y/n> <path>` to run all end-to-end tests, passing in `y` or `n` for whether you want to clone the repo that contains the external app used for testing, and the `path` the repo should be cloned in, or is already cloned in, e.g.:
I'm not clear on why there's an option to clone or not clone the test app repo. What's the difference here? I would think we would just clone it every time to make sure we're running against whatever is fresh there.
Oh that's true, yes.... Okay I'll change!
src/end-to-end/edge-end-to-end.sh
Outdated
# Recommendation: Do not clone the ts-client-e2e-tests repo in the local ts-client repo; it'll just cause npm deps issues

# inputs
clone=$1 # required -- `y` or `n` to clone the ts-client-e2e-tests repo
This feels a little confusing considering the script has an option to clone or not clone. I asked a question about this previously but I feel like I'm missing something.
end-to-end:
  name: end-to-end
  runs-on: ubuntu-latest
  if: always() # Run teardown even if the tests fail
nit: This isn't a teardown job though, right?
Ah must've been a c&p error, apologies! I'll remove.
Ah, no, leaving this here, actually -- the e2e tests include a teardown step, so even if a test fails, the created index is still torn down. This ensures we don't leave orphan indexes behind when these tests fail.
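The same guarantee can be sketched in plain shell: an `EXIT` trap plays the role of `if: always()`, running the index cleanup even when the test body fails. The index name and the `delete_index` helper are hypothetical stand-ins for the real teardown.

```shell
# delete_index stands in for the real index-deletion call
delete_index() { echo "deleting $1"; }

# Run the tests in a subshell; its EXIT trap fires no matter how it exits
run_tests() (
  indexName="edge-test-a1b2c3"
  trap 'delete_index "$indexName"' EXIT
  exit 1   # simulate a failing assertion
)

out=$(run_tests) || true
echo "$out"   # deleting edge-test-a1b2c3
```

Even though the test body exits nonzero, the trap still deletes the index, which is the behavior `if: always()` buys in the workflow.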
Seems like a solid start. There's a lot of moving pieces to this, so nice job.
shell: bash

- name: Deploy Project Artifacts to Vercel
- name: Clean up e2e indexes
This doesn't need to block merging, but I would recommend moving cleanup steps into a different action/job. That lets you retry just the cleanup part if needed, and easily see that the core of the test is passing even when cleanup steps fail. In our python suite, a ton of our flakes were caused by delete requests failing for various reasons.
Similar example of this in python tests:
https://github.com/pinecone-io/pinecone-python-client/blob/main/.github/workflows/testing-dependency.yaml#L171
I'll make a fast-follow ticket!
package.json
Outdated
@@ -26,6 +26,7 @@
    "test:integration:edge": "TEST_ENV=edge jest src/integration/ -c jest.config.integration-edge.js --runInBand --bail",
    "test:integration-local:edge": "TEST_ENV=edge node src/integration/integrationRunner.js",
    "test:integration:cleanup": "npm run build && node utils/cleanupResources.ts",
    "test:end-to-end": "chmod +x ./src/end-to-end/edge-end-to-end.sh && ./src/end-to-end/edge-end-to-end.sh $npm_config_argv",
nit: Seems like a different command name could give better context on what these tests are. All integration tests are "end to end", so it's not obvious from this name how your new suite differs from things like `test:integration:edge` above.
src/end-to-end/README.md
Outdated
## CI runs

The CI runs for the end-to-end tests are defined in the `.github` directory.

The `Action` in Github is called `End to end testing`.

- Workflow
  - The workflow run by CI is called `workflows/e2d-testing.yml`
  - It requires a Pinecone API key and a Vercel token.
  - The Pinecone API key it uses is associated with the `ops@` account and is stored in a Github secret
  - The Vercel token is from the Pinecone Enterprise account and is stored in a Github secret
- Action
  - The action run by the workflow is called `actions/e2e-testing/edge/action.yml`
  - This action packages the `ts-client` code on the current branch into the external app's repo
  - It then installs the Vercel CLI and sends a POST request to the test app's `/api/createSeedQuery` endpoint, which is up and running on Vercel
  - If the response from the endpoint does not include a match (a match to a query composed by the test app), the action fails
  - The action then deletes the index that the test app spun up (note: it will attempt to delete the index even if the assertions fail)
This content doesn't seem that helpful to me. Since a lot of it is a narrative telling of the workflow, it seems likely to quickly go stale as CI configurations evolve. I would probably delete most of this, but add some explanatory comments in the workflow/job/action configs. Keeping that sort of documentation right next to the implementation will give us a better shot at keeping it up to date.
What I think would be useful here are:
- Some conceptual introduction to what these tests are and why we need them, how they differ from other integration tests, etc.
- Specific instructions for how to run locally. What environment variables are needed and how do I set them?
- Links to the workflow source and github UI for the workflow.
README.md
Outdated
## Testing

All unit, integration, and end-to-end testing takes place automatically in CI and is configured using Github actions and workflows, located in the `.github` directory of this repo.

To run tests locally, see [CONTRIBUTING.md](CONTRIBUTING.md) for instructions.
I think this can probably be cut. Maybe leave a note directing people to CONTRIBUTING.md for the development readme. End-users trying to get their app built (the main audience for this readme) shouldn't ever need to run our tests.

Also, explaining about the `.github` directory doesn't seem like the right level of granularity. Github Actions is in such widespread use on github that I think it feels like a waste of the reader's time to explain here. It's kind of like how we don't write a lot to explain other conventional aspects, like how we track module dependencies in our `package.json` file.
- name: Send POST request to Vercel app endpoint api/createSeedQuery
  run: |
    npm install -g typescript
    npm install -g ts-node

    API_KEY="${{ inputs.PINECONE_API_KEY }}"

    indexName=$(ts-node src/end-to-end/localRuns.ts $API_KEY)

    if [[ "$indexName" != *"edge-test"* ]]; then
      echo "Error encountered: $indexName"
      exit 1
    else
      echo "Test succeeded! Index created: $indexName"
    fi

    echo "indexName=$indexName" >> $GITHUB_ENV
This seems fine for now, but it would be cool to eventually have some sort of reusable action that let you do something like this:
- name: Call Vercel test endpoints
  uses: ./actions/post-vercel-endpoint
  with:
    app_url: https://ts-client-e2e-tests.vercel.app
    vercel_token: '${{ secrets.vercel_token }}'
    endpoints:
      - api/createSeedQuery
      - api/test2
      - api/test3
In particular it seems like the ability to parameterize on app_url would be useful, since right now we can only deploy and test one branch at a time.
cool!
…ve re-checkout of repo
LGTM, thanks for taking on all the feedback and iterating on this. I'm really excited to have this test harness in place finally! 🚢
…ncy (#311)

## Problem

This is a fast-follow from [the PR that introduced the external app testing for Edge runtimes](#304). We just need to rename the GH workflow to match the rest of the files/code in the client, which we couldn't do previously w/o creating a new workflow and merging it to `main`.

## Type of Change

- [x] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] This change requires a documentation update
- [ ] Infrastructure change (CI configs, etc)
- [ ] Non-code change (docs, etc)
- [ ] None of the above: (explain here)
Problem
We do not currently have a good way to test how our client behaves end to end in different environments. One of the chief problems our users have brought to our attention in the past is that some functionalities do not work in certain runtimes, e.g. Edge or Bun, and/or with certain frameworks, e.g. NextJS.
Solution
Build an external application in a runtime and framework known to have caused problems in the past and test our client from the end-user's perspective.
This PR introduces an end-to-end test suite that interacts with an external application written using the NextJS framework and the Edge runtime.
Note: the app is automagically run in `Edge` by way of it using middleware (`middleware.ts`) and the `Headers()` API.

Overview of changes

- Locally, the tests run against the app on `localhost:3000`.
- A `testing.yml` file is run for each push to an open PR. In CI, the Github workflow and action hit the application's Vercel endpoint and assert on its response.
- See the `external-app` dir itself for more in-depth information.

Type of Change
Test Plan
If reviewers can pull down the code in this branch and try to run the tests locally, that'd be great.
To Dos
I'll add a README and a CONTRIBUTING file to the external app