
Support pipeline parallel for glm-4-9b-chat #11463

Merged (6 commits) on Jul 3, 2024

Conversation

plusbang (Contributor) commented Jun 28, 2024

Description

Support pipeline parallel inference & serving for glm-4-9b-chat

User API changes

N/A

How to test?

@plusbang plusbang changed the title [WIP] Support pipeline parallel for glm-4-9b-chat Support pipeline parallel for glm-4-9b-chat Jun 28, 2024
@plusbang plusbang requested review from qiuxin2012 and glorysdj June 28, 2024 09:35
@glorysdj glorysdj requested a review from xiangyuT July 2, 2024 02:45
    inputs_embeds = self.embedding(input_ids)
else:
    batch_size, seq_length, _ = inputs_embeds.shape
Contributor commented:

If attention_mask is given a not-None and not-all value, input_ids is needed at line 58 in self.get_masks(), and it will raise an error if input_ids is still None. Maybe add an empty tensor here?

input_ids = torch.empty((batch_size, seq_length), device=inputs_embeds.device)

Contributor commented:

Relevant code in modeling_chatglm:

    def get_masks(self, input_ids, past_key_values, padding_mask=None):
        batch_size, seq_length = input_ids.shape
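Putting the two fragments together, here is a minimal runnable sketch of why the placeholder tensor suggested above is needed. The tensor sizes and the surrounding control flow are illustrative only, not the actual model code; `get_masks_shape` stands in for the shape-unpacking line of `get_masks`:

```python
import torch

def get_masks_shape(input_ids):
    # Mirrors the first line of modeling_chatglm's get_masks, which only
    # reads the shape; unpacking raises AttributeError if input_ids is None.
    batch_size, seq_length = input_ids.shape
    return batch_size, seq_length

# Caller passed inputs_embeds directly, so input_ids is None.
inputs_embeds = torch.randn(2, 5, 8)  # (batch, seq, hidden); illustrative sizes
input_ids = None

if input_ids is None:
    batch_size, seq_length, _ = inputs_embeds.shape
    # The suggested fix: an empty placeholder tensor that carries only the
    # shape, so downstream shape-based code keeps working.
    input_ids = torch.empty((batch_size, seq_length), device=inputs_embeds.device)

print(get_masks_shape(input_ids))  # (2, 5)
```

The placeholder's values are never read, so `torch.empty` (uninitialized memory) is sufficient and avoids the cost of zero-filling.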

plusbang (Contributor, Author) commented Jul 2, 2024:

> If attention_mask is given a not-None and not-All value, input_ids is needed in line 58 self.get_masks() and it will raise error if input_ids is still None. Maybe add an empty tensor here?

Have updated in chatglm2.py and chatglm4.py.

@plusbang plusbang requested a review from xiangyuT July 2, 2024 06:15
xiangyuT (Contributor) left a comment:

LGTM

@plusbang plusbang merged commit 9274282 into intel-analytics:main Jul 3, 2024
1 check passed
RyuKosei pushed a commit to RyuKosei/ipex-llm that referenced this pull request Jul 19, 2024