fix awq backend and fp_layers issue by wenhuach21 · Pull Request #363 · intel/auto-round

Merged: 7 commits merged into main from update_1202 on Dec 2, 2024

Conversation

@wenhuach21 (Contributor) commented on Dec 2, 2024:

fixed #361
fixed #360
fixed #357

@wenhuach21 changed the title from "add requirement for auto-round exllamav2 kernel" to "fix awq backend and fp_layers issue" on Dec 2, 2024

The review discussion below is attached to this snippet from the fp_layers handling:
    if fp_layer in all_layer_names:
        # exact match: keep this layer in full precision
        not_to_quantized_layers.append(fp_layer)
        continue
    if fp_layer[-1] != ".":
Collaborator:

Checking whether the last character is an integer would be more in line with expectations.

Contributor Author (@wenhuach21):

The code handles this scenario: model.layer.1 should exclude only model.layer.1, and should not also exclude model.layer.11. Appending a trailing dot before prefix matching keeps "model.layer.1." from matching the sublayers of model.layer.11.
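
For illustration, here is a minimal sketch of that matching behavior in Python. The surrounding loop, the prefix-matching step, and the helper name collect_fp_layers are assumptions reconstructed from this discussion, not the PR's exact code; only the lines quoted in the snippet above appear in the PR.

    def collect_fp_layers(fp_layers, all_layer_names):
        # Sketch (assumed helper name): gather layers to keep in full precision.
        not_to_quantized_layers = []
        for fp_layer in fp_layers:
            if fp_layer in all_layer_names:
                # Exact name: exclude only this layer.
                not_to_quantized_layers.append(fp_layer)
                continue
            if fp_layer[-1] != ".":
                # Add a trailing dot so "model.layer.1" prefix-matches
                # "model.layer.1.self_attn" but never "model.layer.11.*".
                fp_layer += "."
            for name in all_layer_names:
                if name.startswith(fp_layer):
                    not_to_quantized_layers.append(name)
        return not_to_quantized_layers

    names = ["model.layer.1.self_attn", "model.layer.11.self_attn"]
    print(collect_fp_layers(["model.layer.1"], names))
    # ['model.layer.1.self_attn'] -- model.layer.11 is untouched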

@wenhuach21 merged commit 9b9eeb6 into main on Dec 2, 2024
8 checks passed
@wenhuach21 deleted the update_1202 branch on December 2, 2024 at 09:36