Failed to acquire lock for frecency when Chinese-English mixed input #936

Open · pxwg opened this issue Jan 7, 2025 · 6 comments
Labels: bug (Something isn't working), fuzzy (Filtering and sorting of completion items)

pxwg commented Jan 7, 2025

Make sure you have done the following

  • Updated to the latest version of blink.cmp
  • Searched for existing issues and documentation (try <C-k> on https://cmp.saghen.dev)

Bug Description

Hello everyone, I am a new user of blink-cmp for LaTeX writing. In addition to the basic texlab LSP, I have also configured a Chinese LSP, rime-ls, for mixed Chinese and English input.

blink-cmp is very fast! Many thanks to the developers for their thoughtful work. However, I am now encountering an issue, specifically:

When I input a mixed Chinese-English string like 你好\text{nihao}, I get the following error:

Error executing vim.schedule lua callback: byte index 18446744073709551612 is out of bounds of `你好\text{niha}`

Screenshot as follows:

[Screenshot 2025-01-08 01:37:19]

After deleting the string, the error keeps occurring even if I type pure English:

Error executing vim.schedule lua callback: runtime error: Failed to acquire lock for frecency

The issue persists until I restart Neovim.

This makes the plugin unusable. After testing, the problem is strongly tied to one part of my LSP source configuration: removing the transform_items table resolves the issue. However, that table is important for the Chinese input LSP setup, so I wonder if there is a way to solve the problem while keeping this configuration.

Even when I use my computer's built-in Chinese input method, the same issue occurs as long as this configuration is present. Any help would be greatly appreciated!

P.S. I've tried disabling fuzzy features by setting

fuzzy = { use_typo_resistance = false, use_proximity = false, use_frecency = false, use_unsafe_no_lock = false },

but the issue still exists.

Relevant configuration

            lsp = {
              -- min_keyword_length = 0,
              fallbacks = { "ripgrep", "buffer" },
              --- @param items blink.cmp.CompletionItem[]
              transform_items = function(_, items)
                -- demote snippets
                for _, item in ipairs(items) do
                  if item.kind == require("blink.cmp.types").CompletionItemKind.Snippet then
                    item.score_offset = item.score_offset - 3
                  end
                end
                return items
              end,
            },

neovim version

NVIM v0.10.3
Build type: Release
LuaJIT 2.1.1734355927

blink.cmp version

v0.9.3

pxwg added the bug label Jan 7, 2025
pxwg commented Jan 7, 2025

I've constructed a minimal setup with the LSP and blink.cmp only, and the problem still occurs:

-- blink.lua

return {
  "saghen/blink.cmp",
  opts = {
    sources = {
      -- default = { "lsp", "path", "luasnip", "buffer", "ripgrep", "lazydev" },
      default = { "lsp", "path", "buffer" },
      cmdline = {},
      providers = {
        lsp = {
          min_keyword_length = 2,
          fallbacks = { "ripgrep", "buffer" },
          --- @param items blink.cmp.CompletionItem[]
          transform_items = function(_, items)
            -- demote snippets
            for _, item in ipairs(items) do
              if item.kind == require("blink.cmp.types").CompletionItemKind.Snippet then
                item.score_offset = item.score_offset - 3
              end
            end
            return items
          end,
        },
      },
    },
  },
}

The problem still occurs, as shown in the recording below.

[Video: 2025-01-08.04.33.36.mov]

BTW, 你好 in Chinese means hello.

Saghen added the fuzzy label Jan 8, 2025
pxwg commented Jan 9, 2025

I've checked with the newest blink.cmp; the problem still occurs under the minimal configuration.

wlh320 commented Jan 11, 2025

If I set capabilities.general.positionEncodings = { 'utf-8' } in nvim-lspconfig, this problem does not occur. I suspect that somewhere a wrong cursor position (such as a UTF-16 position) is being passed to the Rust code, where String is always UTF-8 encoded.
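
For reference, a minimal sketch of this workaround, assuming nvim-lspconfig; texlab is used only because it is the server from the original report, and the same override would apply to whichever server supplies the completions:

-- Sketch of the workaround: advertise only UTF-8 position encoding to the server.
local capabilities = vim.lsp.protocol.make_client_capabilities()
capabilities.general = capabilities.general or {}
capabilities.general.positionEncodings = { 'utf-8' }

require('lspconfig').texlab.setup({
  capabilities = capabilities,
})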

wlh320 commented Jan 12, 2025

After reading keyword.rs, I think

    let line_prefix = line.chars().take(line_range.0).collect::<String>();
    let text_prefix = item_text.chars().take(text_range.0).collect::<String>();

should be replaced with

    let line_prefix = &line[..line_range.0];
    let text_prefix = &item_text[..text_range.0];

because the regex Match returns ranges of byte indices (the line_range and text_range variables).
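
As a side illustration (not blink.cmp's own code), the unit mismatch is easy to see in Neovim's Lua with the string from the original report: its byte length and character count differ, so an offset measured in bytes cannot be reused as a character count. The enormous index in the error, 18446744073709551612, is -4 wrapped into an unsigned 64-bit integer, which is consistent with an offset subtraction going negative.

-- Byte count vs. character count for the reporter's string
-- ("你" and "好" are 3 bytes each in UTF-8).
local s = "你好\\text{niha}"
print(#s)                  -- 17 (bytes)
print(vim.fn.strchars(s))  -- 13 (characters)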

aximcore commented

Similar problem when using Hungarian characters in a TypeScript comment.

Error executing vim.schedule lua callback: byte index 18446744073709551611 is out of bounds of // az első esszé egyetemes közül fog ki kerüln
Error executing vim.schedule lua callback: runtime error: Failed to acquire lock for frecency
stack traceback:
[C]: in function 'fuzzy'
...l/share/nvim/lazy/blink.cmp/lua/blink/cmp/fuzzy/init.lua:76: in function 'fuzzy'
...re/nvim/lazy/blink.cmp/lua/blink/cmp/completion/list.lua:113: in function 'fuzzy'
...re/nvim/lazy/blink.cmp/lua/blink/cmp/completion/list.lua:89: in function 'show'
...re/nvim/lazy/blink.cmp/lua/blink/cmp/completion/init.lua:53: in function <...re/nvim/lazy/blink.cmp/lua/blink/cmp/completion/init.lua:29>

Using LazyVim and default config.

theStrangeAdventurer commented

Hi! Same bug with Cyrillic characters.

[screenshot]
