
Task failed in the backend. Please try again. #344

Open
changliao1025 opened this issue Jan 15, 2025 · 7 comments · Fixed by #345 or #350

@changliao1025

When I try to run the "Generating model" step, it always fails with a permission issue. See the terminal log below:

Arguments: 'src.tasks.process_dem', 'exc': "PermissionError(13, 'Permission denied')"
args: ['POLYGON ((172.71629837906798 -43.40100735095039, 172.71629837906798 -43.38586581746695, 172.67209098399104 -43.38586581746695, 172.67209098399104 -43.40100735095039, 172.71629837906798 -43.40100735095039))']
kwargs: {}
description: raised unexpected

Traceback (most recent call last):
  File "/venv/lib/python3.11/site-packages/celery/app/trace.py", line 451, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/venv/lib/python3.11/site-packages/celery/app/trace.py", line 734, in protected_call
    return self.run(*args, **kwargs)
  File "/app/src/tasks.py", line 126, in process_dem
    process_hydro_dem.main(selected_polygon, **parameters)
  File "/app/src/flood_model/process_hydro_dem.py", line 106, in main
    process_dem(selected_polygon_gdf)
  File "/app/src/flood_model/process_hydro_dem.py", line 62, in process_dem
    newzealidar.process.main(selected_polygon_gdf)
  File "/venv/lib/python3.11/site-packages/newzealidar/process.py", line 498, in main
    utils.save_gpkg(lidar_extent, "lidar_extent")
  File "/venv/lib/python3.11/site-packages/newzealidar/utils.py", line 994, in save_gpkg
    pathlib.Path(gpkg_path).mkdir(parents=True, exist_ok=True)
  File "/venv/lib/python3.11/pathlib.py", line 1116, in mkdir
    os.mkdir(self, mode)
PermissionError: [Errno 13] Permission denied: '/stored_data/gpkg'
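The failing call is the `mkdir` inside `newzealidar.utils.save_gpkg`. As a minimal sketch (the helper below is hypothetical, not part of the project), a pre-flight check can reproduce the same call pattern and report whether a mounted path is writable instead of crashing the Celery task:

```python
import os
import pathlib
import tempfile

def can_create_dir(path: str) -> bool:
    """Attempt the same call that failed in newzealidar.utils.save_gpkg:
    pathlib.Path(path).mkdir(parents=True, exist_ok=True).
    Returns False instead of raising on PermissionError."""
    try:
        pathlib.Path(path).mkdir(parents=True, exist_ok=True)
    except PermissionError:
        return False
    return os.access(path, os.W_OK)

# Sanity check against a path we know is writable (a temp directory):
with tempfile.TemporaryDirectory() as tmp:
    print(can_create_dir(os.path.join(tmp, "gpkg")))  # True
```

Running this against `/stored_data/gpkg` inside the container would confirm whether the volume mount, rather than the application code, is the cause.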

@LukeParky
Member

Hi @changliao1025, thank you. It looks like a regression has caused this; I am investigating now.

@LukeParky
Member

LukeParky commented Jan 20, 2025

Hi @changliao1025, I believe the issue has been resolved. You will have to pull/build a new image, and I recommend starting with fresh volumes.

Please ensure your branch is up to date with the new changes.

E.g.
git pull && docker compose pull && docker compose down -v && docker compose up -d && docker compose logs -f backend celery_worker

@changliao1025
Author

Thanks, I will try it again.

@LukeParky
Member

As mentioned in #342, this issue occurs on Ubuntu 22.04 and possibly other operating systems. I am reopening the issue to provide a fix. I believe this is fixed in downstream forks, so I will try to isolate the fix for this release.
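On Linux hosts, this class of failure is typically a UID mismatch: the bind-mounted volume is owned by a host user (often root) whose UID differs from the UID the container process runs as. A small diagnostic sketch (the helper name is illustrative; `/stored_data/gpkg` from the traceback would be the path to check inside the celery_worker container):

```python
import os
import pathlib
import tempfile

def report_ownership(path: str) -> dict:
    """Compare the owner UID of `path` (or its nearest existing parent)
    with the UID of the current process, and report writability."""
    p = pathlib.Path(path)
    while not p.exists():
        p = p.parent  # walk up to the nearest existing directory
    st = p.stat()
    return {
        "checked": str(p),
        "owner_uid": st.st_uid,
        "process_uid": os.getuid(),
        "writable": os.access(p, os.W_OK),
    }

# Example on a directory the current user owns:
with tempfile.TemporaryDirectory() as tmp:
    info = report_ownership(tmp)
    print(info["owner_uid"] == info["process_uid"], info["writable"])  # True True
```

If `owner_uid` and `process_uid` differ and `writable` is False, fixing the volume ownership on the host (or matching the container's user to the host UID) would be the direction to investigate.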

@LukeParky LukeParky reopened this Mar 5, 2025
@LukeParky LukeParky linked a pull request Mar 11, 2025 that will close this issue
@LukeParky
Member

@changliao1025 please let me know if git pull && docker compose pull && docker compose down -v && docker compose up -d && docker compose logs -f backend celery_worker resolves this issue.

@changliao1025
Author

It seems this issue still remains.
[Screenshot of the error attached]
I am using an iMac with an Intel processor.

@LukeParky LukeParky reopened this Mar 24, 2025
@LukeParky
Member

Hi @changliao1025, it is unfortunate that you are still encountering the issue. I am not yet sure how it is happening.

I have not been able to replicate it since pushing my fix.
I am attempting one more way to replicate this.

Could you please provide me with the output of docker compose images?

E.g.

CONTAINER                     REPOSITORY                                 TAG      IMAGE ID       SIZE
backend_digital_twin          lparkinson/flood-resilience-dt             1.4      ae586e         2.93GB
celery_worker_digital_twin    lparkinson/flood-resilience-dt             1.4      ae586e         2.93GB
db_postgres_digital_twin      postgis/postgis                            16-3.4   06287eb8e12c   609MB
geoserver_digital_twin        lparkinson/geoserver-flood-resilience-dt   1.4      efc8ef37f170   1.11GB
message_broker_digital_twin   redis                                      7        ad4b31aa2de6   117MB
www_digital_twin              lparkinson/www-flood-resilience-dt         1.4      96b584736ee6   239MB
