
Fix trainer and Qwen2-VL #179

Merged 2 commits into main on Jan 11, 2025
Conversation

@Blaizzy (Owner) commented on Jan 11, 2025

Closes #176
Closes #173

@Blaizzy Blaizzy merged commit 21fc1b2 into main Jan 11, 2025
1 check passed
@qinxuye commented on Jan 13, 2025

Any plan to release a new version? mlx-lm 0.21.0 has pinned mlx>=0.22.0.

Once mlx-vlm releases a new version, we can relax the mlx<0.22.0 pin in Xinference.

@Blaizzy (Owner, Author) commented on Jan 13, 2025

Yes, there will be a new release this week :)

@Blaizzy (Owner, Author) commented on Jan 13, 2025

> Once mlx-vlm releases a new version, we can relax the mlx<0.22.0 pin in Xinference.

Not sure I follow. Could you elaborate?

@Blaizzy (Owner, Author) commented on Jan 13, 2025

I will pin v0.22.0 as well. There are some recent models and changes that require the latest MLX release.

@qinxuye commented on Jan 13, 2025

> Once mlx-vlm releases a new version, we can relax the mlx<0.22.0 pin in Xinference.
>
> Not sure I follow. Could you elaborate?

In which part do you need help?

@Blaizzy (Owner, Author) commented on Jan 13, 2025

I need more context...

For instance, what is Xinference?

@qinxuye commented on Jan 13, 2025

> I need more context...
>
> For instance, what is Xinference?

Oh, I misunderstood. https://github.com/xorbitsai/inference is our project, and we have adopted your project for the Mac VLM engine.

@qinxuye commented on Jan 13, 2025

https://github.com/xorbitsai/inference/blob/7c6249a3383b2841a9e96243c6a900ce19a7f1d7/setup.cfg#L101

We have pinned the mlx version there, but the better option would be for mlx-vlm to be compatible with mlx>=0.22.0.
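The conflict described in this thread can be sketched as a requirements-style fragment. This is a hypothetical illustration of the pin interaction, not a copy of Xinference's actual setup.cfg (see the link above for the real constraints):

```
# Hypothetical sketch of the dependency conflict (illustrative, not verbatim):
mlx<0.22.0        # cap kept while mlx-vlm still required older mlx
mlx-lm==0.21.0    # but mlx-lm 0.21.0 pins mlx>=0.22.0 -> unresolvable together
# Once mlx-vlm's new release (v0.1.11) moved to mlx>=0.22.0, the cap can go:
# mlx>=0.22.0
```

With the cap removed, a single mlx version (>=0.22.0) satisfies mlx-lm, mlx-vlm, and Xinference simultaneously.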

@Blaizzy (Owner, Author) commented on Jan 13, 2025

> Oh, I misunderstood. https://github.com/xorbitsai/inference is our project, and we have adopted your project for the Mac VLM engine.

No worries!

So awesome, really happy to see this! 🔥

Sure thing, I will update the dependency to use mlx>=0.22.0 in the next release.

@qinxuye commented on Jan 18, 2025

@Blaizzy when will the next release come? Our CI is broken, and we would really appreciate a new release as soon as possible.

@Blaizzy (Owner, Author) commented on Jan 18, 2025

Hey,

It's done, v0.1.11 is out :)

@Blaizzy Blaizzy deleted the pc/bug-fixes branch February 2, 2025 19:23