This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Conversation


@Satrat Satrat commented Oct 11, 2023

This PR implements a productionized SmoothQuantModifier for use in OBCQ, with example recipes for OPT and Llama.
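SmoothQuant works by migrating quantization difficulty from activations to weights: each input channel is rescaled so that activation outliers shrink while the weights absorb the scale, leaving the layer's output unchanged. A minimal NumPy sketch of that core transform (this is an illustration of the technique, not SparseML's `SmoothQuantModifier` API; the `alpha=0.5` default and function name are assumptions):

```python
import numpy as np

def smoothquant_scales(x, w, alpha=0.5, eps=1e-8):
    """Per-input-channel scales s_j = max|X_j|^alpha / max|W_j|^(1 - alpha).

    x: activations, shape [n_samples, in_features]
    w: linear-layer weight, shape [out_features, in_features]
    """
    act_max = np.clip(np.abs(x).max(axis=0), eps, None)  # per-channel activation range
    w_max = np.clip(np.abs(w).max(axis=0), eps, None)    # per-channel weight range
    return (act_max ** alpha) / (w_max ** (1.0 - alpha))

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w = rng.normal(size=(16, 8))

s = smoothquant_scales(x, w)
x_smooth = x / s  # activations with outliers damped -> easier to quantize
w_smooth = w * s  # scale folded into the weights

# The smoothed layer is mathematically equivalent to the original:
assert np.allclose(x @ w.T, x_smooth @ w_smooth.T)
```

Because the rescaling is exact, it can be applied offline before quantization with no accuracy cost by itself; the benefit comes from the smoothed activations quantizing with less error.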

Instructions and Testing

To run for OPT and print out perplexity results:

python src/sparseml/transformers/sparsification/obcq/obcq.py facebook/opt-1.3b c4 \
    --recipe src/sparseml/transformers/sparsification/obcq/example.yaml --eval wikitext

To run for Llama, you'll need a local copy of the model:

python src/sparseml/transformers/sparsification/obcq/obcq.py /local/path/to/llama open_platypus \
    --recipe src/sparseml/transformers/sparsification/obcq/example_llama.yaml --eval open_platypus
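The recipe files passed via `--recipe` configure the modifiers applied during OBCQ. A hypothetical sketch of what such a recipe might contain (the stage name, modifier fields such as `smoothing_strength`, and the values shown are assumptions for illustration, not copied from the PR's actual `example.yaml`):

```yaml
# Hypothetical recipe sketch -- see the real example.yaml / example_llama.yaml
# in the PR for the actual fields and values.
test_stage:
  obcq_modifiers:
    SmoothQuantModifier:
      smoothing_strength: 0.5   # alpha: balances activation vs. weight ranges
    SparseGPTModifier:
      sparsity: 0.5
      sequential_update: true
```

In this layout, SmoothQuant runs before the pruning/quantization modifier so the rescaled weights and activations are what downstream compression sees.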

natuan and others added 30 commits August 17, 2023 23:28
…arsegpt/llama2

# Conflicts:
#	src/sparseml/experimental/sparsegpt/dispatch.py
#	src/sparseml/experimental/sparsegpt/layer_compressor.py
#	src/sparseml/experimental/sparsegpt/main.py
#	src/sparseml/experimental/sparsegpt/model_preprocessor.py
#	src/sparseml/experimental/sparsegpt/opt.py
#	src/sparseml/experimental/sparsegpt/sequential.py
#	src/sparseml/experimental/sparsegpt/sparsegpt.py
bfineran previously approved these changes Oct 16, 2023
@Satrat Satrat requested a review from anmarques October 16, 2023 20:56
@Satrat Satrat requested a review from bfineran October 19, 2023 15:45
rahul-tuli previously approved these changes Oct 20, 2023
anmarques previously approved these changes Oct 31, 2023
@bfineran bfineran merged commit 90250f2 into main Oct 31, 2023
@bfineran bfineran deleted the prod_smooth_quant branch October 31, 2023 12:21
bfineran pushed a commit that referenced this pull request Nov 16, 2023