
[Feature Request]: Support storing up to 67,108,864 bytes (64 MB) in BLOB columns #20677

Closed · 1 task done
yangj1211 opened this issue Dec 10, 2024 · 4 comments

Labels: kind/feature, priority/p0 (Critical feature that should be implemented in this version), severity/s0 (Extreme impact: Cause the application to break down and seriously affect the use)

Comments

@yangj1211 (Contributor)

Is there an existing issue for the same feature request?

  • I have checked the existing issues.

Is your feature request related to a problem?

Currently, the BLOB type in MO supports storing at most 64 KB of data, which is not enough to handle larger files.

https://github.com/matrixorigin/matrixone/issues/20666

Describe the feature you'd like

Change the BLOB type so that it supports storing up to 67,108,864 bytes (64 MB) of data.
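
As a rough acceptance sketch (the table name and file paths are hypothetical, and LOAD_FILE assumes the server is allowed to read the files), a payload just under the new cap should be accepted and one over it rejected:

-- hypothetical verification of the proposed 64 MB BLOB cap
CREATE TABLE blob_limit_check (id INT PRIMARY KEY, payload BLOB);
-- a file just under 67,108,864 bytes should be stored successfully
INSERT INTO blob_limit_check VALUES (1, LOAD_FILE('/tmp/just_under_64mb.bin'));
-- a file over 67,108,864 bytes should be rejected with a "Data too long for blob" style error
INSERT INTO blob_limit_check VALUES (2, LOAD_FILE('/tmp/over_64mb.bin'));
-- confirm the stored size
SELECT id, LENGTH(payload) FROM blob_limit_check;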

Describe implementation you've considered

No response

Documentation, Adoption, Use Case, Migration Strategy

No response

Additional information

No response

@yangj1211 yangj1211 added priority/p0 Critical feature that should be implemented in this version kind/feature severity/s0 Extreme impact: Cause the application to break down and seriously affect the use labels Dec 10, 2024
@yangj1211 yangj1211 added this to the 2.0.2 milestone Dec 10, 2024
@allengaoo

Several field projects currently need to store unstructured data; we suggest delivering this feature in version 2.0.2.

@qingxinhome (Contributor)

This feature has been implemented; please verify and test it. @aressu1985 @yangj1211 @allengaoo @sukki37

@qingxinhome qingxinhome removed their assignment Dec 17, 2024
@aressu1985 aressu1985 assigned heni02 and unassigned aressu1985 Dec 19, 2024
@heni02 (Contributor)

heni02 commented Dec 23, 2024

2.0-dev commit: 06354304c7daa5e4c104c84292559f431eaad942

  1. Insert a 56.9 MB compressed file: the result does not match expectations. Tracked in [Bug]: blob length size less than max_allowed_packet, but select blob reported ERROR Got packet bigger than 'max_allowed_packet' bytes #20880
mysql> create table table_blob(c1 int primary key,c2 blob);
Query OK, 0 rows affected (0.03 sec)

mysql> insert into table_blob values(834,LOAD_FILE('/Users/heni/Downloads/blob_test_data/blob04.tar.gz'));
Query OK, 1 row affected (0.66 sec)

mysql> select length(c2) from table_blob;
+------------+
| length(c2) |
+------------+
|   56874074 |
+------------+
1 row in set (0.12 sec)

mysql> select * from table_blob;
ERROR 2020 (HY000): Got packet bigger than 'max_allowed_packet' bytes

mysql> SHOW VARIABLES like "%max_allowed_packet%";
+--------------------+----------+
| Variable_name      | Value    |
+--------------------+----------+
| max_allowed_packet | 67108864 |
+--------------------+----------+
1 row in set (0.00 sec)

There is a question here: max_allowed_packet is 64 MB and the length of the c2 blob column does not exceed that value, yet the SELECT still fails.

For comparison, in MySQL (max_allowed_packet also verified to be 64 MB), selecting the blob column and its LENGTH() both return NULL instead of raising an error.
[screenshot: 企业微信截图_95b08f24-4cce-42ad-85e3-7ac7435cc0bc]
Inserting a 4 MB file also shows NULL.
[screenshot: 企业微信截图_600c1a67-f0fa-4f0f-8564-9d1bbe70c583]
[screenshot: 企业微信截图_dd0637d5-8085-4cac-880f-2a31c2c1c61d]
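
A hedged side note on the failing SELECT (an observation, not a confirmed diagnosis): ERROR 2020 is the client-side CR_NET_PACKET_TOO_LARGE error, and the mysql command-line client enforces its own max_allowed_packet (16 MB by default) independently of the server variable shown above, so the 56.9 MB row may be tripping the client limit rather than the server one. Restarting the client with a larger limit would rule that out (connection parameters are placeholders):

# restart the client with a larger client-side packet limit
mysql --max-allowed-packet=1G -h <host> -P <port> -u <user> -p
# then re-run the failing query in the new session:
#   select * from table_blob;

If the error persists with the larger client limit, server-side packet handling for the 56.9 MB row is the more likely cause, which is what #20880 tracks.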

  2. Insert a 4.1 MB JPEG image file: SELECT on the blob column returns the value.
mysql> insert into table_blob values(1,LOAD_FILE('/Users/heni/Downloads/blob_test_data/blob01.jpg'));
Query OK, 1 row affected (0.66 sec)

mysql> select * from table_blob where c1=1;
(binary blob output omitted)
1 row in set (0.01 sec)

3. Insert a file larger than 64 MB (99 MB): the INSERT fails, as expected.

mysql> insert into table_blob values(3,LOAD_FILE('/Users/heni/test_data/100w20col.tar.gz'));
ERROR 20101 (HY000): internal error: Data too long for blob

4. Insert a 9.7 MB audio file: SELECT returns the value.

mysql> insert into table_blob values(1,LOAD_FILE('/Users/heni/Downloads/blob_test_data/blob3.ncm'));
Query OK, 1 row affected (0.27 sec)

mysql> select * from table_blob;
(binary blob output omitted)
1 row in set (0.03 sec)

mysql> select length(c2) from table_blob;
+------------+
| length(c2) |
+------------+
|    9678155 |
+------------+
1 row in set (0.01 sec)

5. Insert 10,300 rows of 9.7 MB blob data: selecting rows displays as expected (a reproduction sketch follows the output below).

mysql> select count(*) from table_blob;
+----------+
| count(*) |
+----------+
|    10300 |
+----------+
1 row in set (0.01 sec)

mysql> show create table table_blob;
+------------+------------------------------------------------------------------------------------------------------------------+
| Table      | Create Table                                                                                                     |
+------------+------------------------------------------------------------------------------------------------------------------+
| table_blob | CREATE TABLE `table_blob` (
  `c1` int NOT NULL AUTO_INCREMENT,
  `c2` blob DEFAULT NULL,
  PRIMARY KEY (`c1`)
) |
+------------+------------------------------------------------------------------------------------------------------------------+
1 row in set (0.01 sec)
mysql> select length(c2) from table_blob limit 3;
+------------+
| length(c2) |
+------------+
|    9678155 |
|    9678155 |
|    9678155 |
+------------+
3 rows in set (0.21 sec)
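
For anyone reproducing the bulk test, one way to build up the row count (a sketch, not necessarily the procedure used here; the key offsets are illustrative and only need to avoid primary-key collisions):

-- seed one row with the 9.7 MB payload, then repeatedly double the table by self-insertion
INSERT INTO table_blob VALUES (1, LOAD_FILE('/Users/heni/Downloads/blob_test_data/blob3.ncm'));
INSERT INTO table_blob SELECT c1 + 100000, c2 FROM table_blob;   -- 2 rows
INSERT INTO table_blob SELECT c1 + 200000, c2 FROM table_blob;   -- 4 rows
-- ...repeat with increasing offsets until the target row count is reached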

@heni02 (Contributor)

heni02 commented Dec 26, 2024

#20800 has been resolved; test done.
