Hi 👋,

Sometimes I need to run a sequence of operations on an image where each step calls `.toBuffer()`, so that the next step of the manipulation can use the output and metadata of the previous one.
The problem is that every time I call `sharp(image)` on the intermediate buffer, the image seems to lose quality incrementally. For example:
```js
image = await sharp(buffer).trim().toBuffer();
// ... some logic ...
image = await sharp(image).resize().toBuffer();
// ... some logic ...
image = await sharp(image).extract().toBuffer();
```
The code above is just an example, but you can imagine that each operation might happen on a different file, with `image` acting as a shared context.
Is there an approach you would recommend for doing this without losing quality every time the buffer goes back through `sharp()`?
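For example, would forcing a lossless format on the intermediate buffers be the right direction? A rough sketch of what I mean (assuming the inputs are lossy, e.g. JPEG; `inputBuffer` and the sizes are placeholders):

```js
const sharp = require('sharp');

// Keep every intermediate round trip lossless (PNG here) and only encode
// to the final delivery format once, at the very end.
let image = await sharp(inputBuffer).trim().png().toBuffer();
// ... some logic ...
image = await sharp(image).resize(800).png().toBuffer();
// ... some logic ...
const output = await sharp(image)
  .extract({ left: 0, top: 0, width: 400, height: 400 })
  .jpeg({ quality: 90 }) // the only lossy encode happens here
  .toBuffer();
```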
I read the earlier discussion in #241, and it looks like it might solve this specific problem, but it seems to be only a "concept" for now.
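In the meantime, would passing raw pixel data between steps be an acceptable workaround? It skips encoding entirely, at the cost of having to re-ingest each buffer with its width/height/channels metadata (again just a sketch, with `inputBuffer` and the sizes as placeholders):

```js
// Decode once, operate, and hand the unencoded pixels to the next step.
let { data, info } = await sharp(inputBuffer)
  .trim()
  .raw()
  .toBuffer({ resolveWithObject: true });
// ... some logic ...
({ data, info } = await sharp(data, {
  raw: { width: info.width, height: info.height, channels: info.channels },
})
  .resize(800)
  .raw()
  .toBuffer({ resolveWithObject: true }));
// ... some logic ...
const finalBuffer = await sharp(data, {
  raw: { width: info.width, height: info.height, channels: info.channels },
})
  .extract({ left: 0, top: 0, width: 400, height: 400 })
  .jpeg({ quality: 90 })
  .toBuffer();
```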
Let me know if I need to provide more info!
Thanks! 🙏