kompute: implement op_getrows_f32 #6403

Merged
merged 1 commit into ggerganov:master from fix-kompute-getrows_f32
Jun 3, 2024

Conversation

woachk
Contributor

@woachk woachk commented Mar 31, 2024

op_getrows_f32 has been required since #6122 for the Vulkan w/ Kompute backend to be functional.

As such, implement this op to make this backend functional again.

This addresses issue #6400
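
For readers unfamiliar with the op: GET_ROWS (ggml_get_rows / GGML_OP_GET_ROWS in ggml) gathers whole rows of a source tensor selected by an i32 index tensor, e.g. for token-embedding lookups; this PR adds the Kompute shader for the f32 source case. Below is a minimal CPU sketch of the semantics only — illustrative, not the actual ggml-kompute or shader code:

```cpp
// Illustrative reference for GET_ROWS(f32) semantics (hypothetical helper,
// not the ggml-kompute implementation): copy the rows of an f32 matrix
// selected by an i32 index vector into the destination buffer.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

static void get_rows_f32(const float * src, const int32_t * rows,
                         float * dst, size_t n_rows_out, size_t n_cols) {
    for (size_t i = 0; i < n_rows_out; ++i) {
        const float * src_row = src + (size_t) rows[i] * n_cols;
        float       * dst_row = dst + i * n_cols;
        for (size_t j = 0; j < n_cols; ++j) {
            dst_row[j] = src_row[j];  // f32 source: a plain copy, no dequantization step
        }
    }
}

int main() {
    // 4 rows x 3 columns; select rows 2 and 0.
    std::vector<float>   src  = {0,0,0, 1,1,1, 2,2,2, 3,3,3};
    std::vector<int32_t> rows = {2, 0};
    std::vector<float>   dst(rows.size() * 3);

    get_rows_f32(src.data(), rows.data(), dst.data(), rows.size(), 3);

    for (float v : dst) printf("%.0f ", v);  // expected: 2 2 2 0 0 0
    printf("\n");
}
```

In the Kompute backend the same gather is executed as a Vulkan compute shader; quantized source types already had their own getrows shaders, and the f32 variant was the missing piece that issue #6400 reports.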

@woachk woachk force-pushed the fix-kompute-getrows_f32 branch from 2b4ac77 to 095647b on March 31, 2024 06:25

📈 llama.cpp server for bench-server-baseline on Standard_NC4as_T4_v3: 534 iterations 🚀

  • Concurrent users: 8, duration: 10m
  • HTTP request: avg=8767.58ms p(90)=25113.61ms fails=0, finish reason: stop=534 truncated=0
  • Prompt processing (pp): avg=235.76tk/s p(90)=696.9tk/s total=205.85tk/s
  • Token generation (tg): avg=98.82tk/s p(90)=272.91tk/s total=130.33tk/s
  • ggml-org/models/phi-2/ggml-model-q4_0.gguf parallel=8 ctx-size=16384 ngl=33 batch-size=2048 ubatch-size=256 pp=1024 pp+tg=2048 branch=fix-kompute-getrows_f32 commit=095647bf5df34ad5529e2536f80d1cf72fe4495e
Time series (charts collapsed)

The comment includes four xychart panels titled "llama.cpp bench-server-baseline on Standard_NC4as_T4_v3, duration=10m, 534 iterations", plotted over the run window (Unix timestamps 1711866489 → 1711867117):

  • llamacpp:prompt_tokens_seconds
  • llamacpp:predicted_tokens_seconds
  • llamacpp:kv_cache_usage_ratio
  • llamacpp:requests_processing

@slaren slaren requested a review from cebtenzzre March 31, 2024 20:13
cebtenzzre added a commit to nomic-ai/llama.cpp that referenced this pull request May 7, 2024
cebtenzzre added a commit to nomic-ai/llama.cpp that referenced this pull request May 8, 2024
cebtenzzre added a commit to nomic-ai/llama.cpp that referenced this pull request May 8, 2024
Co-authored-by: woachk <[email protected]>
Signed-off-by: Jared Van Bortel <[email protected]>
cebtenzzre added a commit to nomic-ai/llama.cpp that referenced this pull request May 9, 2024
Co-authored-by: woachk <[email protected]>
Signed-off-by: Jared Van Bortel <[email protected]>
@mofosyne mofosyne added the Review Complexity : Medium and bugfix labels May 10, 2024
@slaren
Collaborator

slaren commented May 27, 2024

@cebtenzzre Are there any plans to merge this here? The kompute backend has not been functional for quite a while due to this, but I see that you merged it in the nomic repository.

@guilt
Contributor

guilt commented Jun 3, 2024

@ggerganov as discussed in issue #6400, this is still happening today.

(screenshot attached)

@ggerganov ggerganov merged commit 9e405b6 into ggerganov:master Jun 3, 2024
57 of 58 checks passed
Labels
bugfix · Review Complexity : Medium
5 participants