Commit

LLM: fix error of 'AI-ModelScope/phi-2' hosted by ModelScope hub (int…
plusbang authored Mar 11, 2024
1 parent 5bf0208 commit e2836e3
Showing 1 changed file with 3 additions and 1 deletion.
python/llm/src/bigdl/llm/transformers/convert.py: 3 additions & 1 deletion
@@ -1071,7 +1071,9 @@ def _optimize_post(model, lightweight_bmm=False):
         convert_forward(model,
                         module.MixtralBLockSparseTop2MLP,
                         mixtral_mlp_forward)
-    elif model.config.model_type == "phi-msft":
+    elif model.config.model_type == "phi-msft" and \
+            hasattr(model.config, "num_local_experts"):
+        # For phixtral, limit the condition to avoid applying on phi-2 hosted by ModelScope
         modeling_module_name = model.__class__.__module__
         module = importlib.import_module(modeling_module_name)
         from bigdl.llm.transformers.models.phixtral import phixtral_moeblock_forward, \
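For context, the tightened condition works because phixtral (a phi-based mixture-of-experts model) and the phi-2 checkpoint hosted on ModelScope can both report model_type == "phi-msft", but only the phixtral config carries MoE fields such as num_local_experts. The sketch below is a hypothetical illustration of that check, not code from the repository; the config objects are stand-ins built with SimpleNamespace, not real Hugging Face or ModelScope configs.

```python
# Hypothetical illustration of the condition added in this commit.
# The config objects are stand-ins for illustration only.
from types import SimpleNamespace

# phixtral is a mixture-of-experts model, so its config defines MoE fields.
phixtral_config = SimpleNamespace(model_type="phi-msft", num_local_experts=4)
# Assumed for illustration: phi-2 from ModelScope reports the same model_type
# but has no MoE fields, so it previously matched the phixtral branch by mistake.
phi2_modelscope_config = SimpleNamespace(model_type="phi-msft")


def should_apply_phixtral_optimizations(config):
    # Mirrors the tightened check: model_type alone is no longer enough,
    # the config must also expose num_local_experts.
    return config.model_type == "phi-msft" and hasattr(config, "num_local_experts")


assert should_apply_phixtral_optimizations(phixtral_config)
assert not should_apply_phixtral_optimizations(phi2_modelscope_config)
```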
