corva-dev-advocate-coding-test
Coding test for Corva Developer Advocate candidates. Candidates must query a public API with pagination, process the data, and save it as a CSV file.
This test will assess your ability to:
- Query large amounts of time-based data from a public API.
- Handle pagination when fetching large datasets.
- Process and export data to a `.csv` file.
- Implement logging for debugging and monitoring.
- Write clear, well-documented code in Python, JavaScript, or TypeScript.
Write a Python, JavaScript, or TypeScript script that:
- Fetches time-based data from a public API that supports large datasets and pagination.
- Implements pagination to handle API limits and retrieve all available data.
- Processes the data to extract relevant fields.
- Saves the data to a CSV file in a structured format.
- Includes logging to track progress, errors, and API call status.
- Uses clear code comments to explain the logic.
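The requirements above can be sketched end to end in Python. This is a minimal, hedged outline, not a full solution: `fetch_page` stands in for whatever API call you choose, and the `offset`/`limit` contract is an assumption you would adapt to your API.

```python
import csv
import logging

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger(__name__)

def fetch_all(fetch_page, page_size=100):
    """Collect every record from a paginated source.

    `fetch_page(offset, limit)` is a hypothetical contract: it returns a
    list of dicts, and an empty list once the data is exhausted.
    """
    records, offset = [], 0
    while True:
        log.info("Requesting offset=%d limit=%d", offset, page_size)
        page = fetch_page(offset, page_size)
        if not page:                      # empty page -> no more data
            break
        records.extend(page)
        offset += len(page)
    log.info("Fetched %d records total", len(records))
    return records

def save_csv(records, path, fields):
    """Write the selected fields of each record to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields, extrasaction="ignore")
        writer.writeheader()
        writer.writerows(records)
```

Keeping the page-fetching function injectable like this also makes the pagination loop easy to unit-test without real network calls.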
You may choose any public API with time-based data and pagination. Here are some suggestions:
- NASA API (api.nasa.gov) – e.g., asteroid data, Earth observation.
- OpenWeather API (openweathermap.org/api) – e.g., historical weather data.
- COVID-19 Data API (covid19api.com) – time-based case tracking.
- Financial Data API (alphavantage.co) – stock market time-series data.
- Any other public API of your choice.
- Use Python (`requests`, `csv`, `pandas`), JavaScript (`fetch`, `fs`), or TypeScript.
- Implement pagination (e.g., handling `next` page URLs, `offset` parameters, etc.).
- Save results to a CSV file with properly formatted columns.
- Include logging to track API requests and errors.
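For the `next`-URL style of pagination mentioned above, the loop follows links instead of advancing an offset. The sketch below assumes a common but hypothetical response shape, `{"results": [...], "next": "<url>" or null}`; adjust the keys to whatever your chosen API actually returns.

```python
import logging

log = logging.getLogger(__name__)

def fetch_by_next_url(get_json, first_url):
    """Follow `next` links until the API stops returning one.

    `get_json(url)` is any callable that fetches a URL and returns the
    parsed JSON body (e.g. a thin wrapper around requests.get).
    """
    results, url = [], first_url
    while url:
        log.info("GET %s", url)
        payload = get_json(url)
        results.extend(payload.get("results", []))
        url = payload.get("next")         # None/absent ends the loop
    return results
```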
- Submit a Python script (`.py`), JavaScript file (`.js`), or TypeScript file (`.ts`).
- Write comments and docstrings to explain the code.
You are allowed to use AI tools such as ChatGPT, GitHub Copilot, or others to assist you in completing this task. However, you must:
- Understand and validate the output before using it.
- Ensure all code is properly documented to explain what it does.
- Explain in your submission if and how AI tools helped you solve the problem.
- Use pandas (Python) or similar data-processing libraries for cleaning and analyzing the data.
- Implement basic error handling (e.g., handling failed API requests, retrying on timeouts).
- Optimize API calls to respect rate limits (if applicable).
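Basic error handling with retries can be sketched as exponential backoff around the request. This is one reasonable pattern, not a required design; `request_fn` is any zero-argument callable that raises on failure, and the injectable `sleep` keeps the helper testable and doubles as a simple rate-limit pause.

```python
import logging
import time

log = logging.getLogger(__name__)

def call_with_retries(request_fn, max_attempts=3, base_delay=1.0,
                      sleep=time.sleep):
    """Retry a failing request with exponential backoff.

    Waits base_delay, then 2x, 4x, ... between attempts; re-raises the
    last exception once max_attempts is exhausted.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return request_fn()
        except Exception as exc:
            log.warning("Attempt %d/%d failed: %s", attempt, max_attempts, exc)
            if attempt == max_attempts:
                raise
            sleep(base_delay * 2 ** (attempt - 1))
```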
Submit your completed Python (`.py`), JavaScript (`.js`), or TypeScript (`.ts`) script along with the generated `.csv` file.
Include a brief README or explanation (can be a text block in the script) describing:
- The API used
- How pagination was handled
- Any challenges encountered
- How AI tools assisted you, if used
For example, if fetching weather data, your CSV might look like:
| Timestamp | Temperature | Humidity | City |
|---|---|---|---|
| 2023-01-01 12:00 | 72°F | 40% | New York |
| 2023-01-02 12:00 | 68°F | 45% | New York |
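A table like the one above could be produced with pandas (the data-processing library suggested earlier). The rows and field names here are hypothetical stand-ins for whatever your chosen weather API returns.

```python
import pandas as pd

# Hypothetical rows as they might come back from a weather API.
rows = [
    {"dt": "2023-01-01 12:00", "temp_f": 72, "humidity": 40, "city": "New York"},
    {"dt": "2023-01-02 12:00", "temp_f": 68, "humidity": 45, "city": "New York"},
]

df = pd.DataFrame(rows)
df["dt"] = pd.to_datetime(df["dt"])               # normalize timestamps
df = df.rename(columns={"dt": "Timestamp", "temp_f": "Temperature",
                        "humidity": "Humidity", "city": "City"})
df.to_csv("weather.csv", index=False)             # one row per observation
```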
- Correctness – Does the script correctly fetch and store paginated data?
- Efficiency – Is pagination handled properly to retrieve all data?
- Readability – Is the code well-structured and documented?
- Logging – Are logs useful for debugging API requests?
- Output Quality – Is the CSV well-formatted and complete?
- AI Tool Usage Explanation – If AI tools were used, is there a clear explanation of their role in the solution?