
Commit

Add AInode sidebar and modify streaming content of English documents (#…
wanghui42 authored Jan 15, 2024
1 parent 3fb38a8 commit bc4a10f
Showing 11 changed files with 796 additions and 33 deletions.
1 change: 1 addition & 0 deletions src/.vuepress/sidebar_timecho/V1.3.x/en.ts
@@ -89,6 +89,7 @@ export const enSidebar = {
{ text: 'Data Sync', link: 'Data-Sync_timecho' },
{ text: 'Tiered Storage', link: 'Tiered-Storage_timecho' },
{ text: 'View', link: 'IoTDB-View_timecho' },
{ text: 'IoTDB AINode', link: 'IoTDB-AINode_timecho' },
{ text: 'Database Programming', link: 'Database-Programming' },
{ text: 'Security Management', link: 'Security-Management_timecho' },
{ text: 'Authority Management', link: 'Authority-Management' },
10 changes: 5 additions & 5 deletions src/UserGuide/Master/User-Manual/Data-Sync_timecho.md
@@ -162,7 +162,7 @@ IoTDB> show pipeplugins

This example is used to demonstrate the synchronisation of all data from one IoTDB to another IoTDB with the data link as shown below:

![](https://alioss.timecho.com/docs/img/w1.png)
![](https://alioss.timecho.com/docs/img/e1.png)

In this example, we create a synchronisation task named A2B to synchronise the full set of data from IoTDB A to IoTDB B. The sink uses the built-in iotdb-thrift-sink plugin, and the address of the receiving end must be specified: either through 'sink.ip' and 'sink.port', or through 'sink.node-urls', as in the following example statement:
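A minimal sketch of such a statement, following the plugin and parameter names described above; the IP address and port are placeholders for the receiving IoTDB B and should be adjusted to your deployment:

```sql
create pipe A2B
with sink (
  'sink' = 'iotdb-thrift-sink',
  -- address of the receiving IoTDB B (placeholder values)
  'sink.ip' = '127.0.0.1',
  'sink.port' = '6668'
)
```

Alternatively, 'sink.node-urls' = '127.0.0.1:6668' can replace the separate ip/port pair when the receiver is a multi-node cluster.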

@@ -180,7 +180,7 @@ with sink (

This example is used to demonstrate the synchronisation of data from a certain historical time range (8:00pm 23 August 2023 to 8:00pm 23 October 2023) to another IoTDB, the data link is shown below:

![](https://alioss.timecho.com/docs/img/w2.png)
![](https://alioss.timecho.com/docs/img/e2.png)

In this example we create a synchronisation task called A2B. First, the range of data to be transferred is defined in source. Since the data to be transferred is historical data (data that existed before the synchronisation task was created), the source.realtime.enable parameter must be set to false. We also need to configure the start-time and end-time of the data and the mode of transmission; it is recommended to set mode to hybrid (a mixed transmission mode that uses real-time transmission when there is no backlog of data, switches to batch transmission when data backs up, and switches automatically based on the internal state of the system).
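One possible shape of the statement, using the parameters named above; the exact time-range parameter keys and the sink address are assumptions here and should be verified against your IoTDB version (the mode setting is omitted for the same reason):

```sql
create pipe A2B
with source (
  'source' = 'iotdb-source',
  -- only historical data: disable real-time capture
  'source.realtime.enable' = 'false',
  -- historical time range (8:00pm 23 Aug 2023 to 8:00pm 23 Oct 2023);
  -- parameter names assumed from the description above
  'source.history.start-time' = '2023-08-23T20:00:00+08:00',
  'source.history.end-time'   = '2023-10-23T20:00:00+08:00'
)
with SINK (
  'sink' = 'iotdb-thrift-sink',
  -- placeholder address of the receiving IoTDB B
  'sink.ip' = '127.0.0.1',
  'sink.port' = '6668'
)
```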

@@ -205,7 +205,7 @@ with SINK (

This example is used to demonstrate a scenario where two IoTDBs are dual-active with each other, with the data link shown below:

![](https://alioss.timecho.com/docs/img/w3.png)
![](https://alioss.timecho.com/docs/img/e3.png)

In this example, in order to avoid an infinite loop of data, the parameter `source.forwarding-pipe-requests` needs to be set to `false` on both A and B, indicating that data transferred in from the other pipe will not be forwarded. Also set `source.history.enable` to `false` so that historical data is not transferred, i.e. data written before the task was created is not synchronised.
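A sketch of the pipe created on A (the mirror pipe on B is identical apart from the sink address); the plugin and parameter names come from the description above, while the address values are placeholders:

```sql
create pipe AB
with source (
  -- do not re-forward data that arrived via the pipe from B (avoids a loop)
  'source.forwarding-pipe-requests' = 'false',
  -- only synchronise data written after the task is created
  'source.history.enable' = 'false'
)
with sink (
  'sink' = 'iotdb-thrift-sink',
  -- placeholder address of the peer instance B
  'sink.ip' = '127.0.0.1',
  'sink.port' = '6668'
)
```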

@@ -245,7 +245,7 @@ with sink (

This example is used to demonstrate a cascading data transfer scenario between multiple IoTDBs, where data is synchronised from cluster A to cluster B and then to cluster C. The data link is shown in the figure below:

![](https://alioss.timecho.com/docs/img/w4.png)
![](https://alioss.timecho.com/docs/img/e4.png)

In this example, in order to synchronise the data from cluster A through to C, the pipe between B and C needs `source.forwarding-pipe-requests` set to `true`; the detailed statement is as follows:
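A sketch of the B-to-C pipe, created on cluster B; parameter names follow the description above, and the sink address is a placeholder for cluster C:

```sql
create pipe B2C
with source (
  -- forward data that arrived via the A-to-B pipe onward to C
  'source.forwarding-pipe-requests' = 'true'
)
with sink (
  'sink' = 'iotdb-thrift-sink',
  -- placeholder address of cluster C
  'sink.ip' = '192.168.0.3',
  'sink.port' = '6667'
)
```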

@@ -277,7 +277,7 @@ with sink (

This example is used to demonstrate a scenario where data from one IoTDB is synchronised to another IoTDB via a unidirectional gate, with the data link shown below:

![](https://alioss.timecho.com/docs/img/w5.png)
![](https://alioss.timecho.com/docs/img/e5.png)

In this example, you need to use the iotdb-air-gap-sink plugin in the sink task (currently only some models of network gates are supported; please contact the staff of Tianmou Technology to confirm the specific model). After configuring the network gate, execute the following statement on IoTDB A, where ip and port are filled with the information of the network gate:
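A sketch of that statement, assuming the gate forwards to IoTDB B; the sink plugin name comes from the text above, while the ip/port values are placeholders for your network gate:

```sql
create pipe A2B
with sink (
  'sink' = 'iotdb-air-gap-sink',
  -- ip and port of the unidirectional network gate, not of IoTDB B
  'sink.ip' = '10.0.0.10',
  'sink.port' = '9780'
)
```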

765 changes: 765 additions & 0 deletions src/UserGuide/Master/User-Manual/Streaming.md

Large diffs are not rendered by default.

16 changes: 7 additions & 9 deletions src/UserGuide/Master/User-Manual/Streaming_timecho.md
@@ -35,7 +35,7 @@ Pipe Extractor is used to extract data, Pipe Processor is used to process data,

**The model of the Pipe task is as follows:**

![Task model diagram](https://alioss.timecho.com/docs/img/%E5%90%8C%E6%AD%A5%E5%BC%95%E6%93%8E.jpeg)
![pipe.png](https://alioss.timecho.com/docs/img/pipe.png)

Describing a data flow processing task essentially describes the properties of Pipe Extractor, Pipe Processor and Pipe Connector plugins.
Users can declaratively configure the specific attributes of the three subtasks through SQL statements, and achieve flexible data ETL capabilities by combining different attributes.
@@ -614,14 +614,12 @@ WITH CONNECTOR (

**When creating a stream processing task, you need to configure the PipeId and the parameters of the three plugin parts:**


| Configuration item | Description | Required or not | Default implementation | Default implementation description | Whether custom implementation is allowed |
| ------------------ | ------------------------------------------------------------ | ------------------------------- | ---------------------- | ------------------------------------------------------------ | ---------------------------------------- |
| PipeId | A globally unique name that identifies a stream processing task | <font color=red>Required</font> | - | - | - |
| extractor | Pipe Extractor plugin, responsible for extracting stream processing data at the bottom of the database | Optional | iotdb-extractor | Integrates the full historical data of the database and subsequent real-time data into the stream processing task | No |
| processor | Pipe Processor plugin, responsible for processing data | Optional | do-nothing-processor | Does not do any processing on the incoming data | <font color=red>Yes</font> |
| connector | Pipe Connector plugin, responsible for sending data | <font color=red>Required</font> | - | - | <font color=red>Yes</font> |

In the example, the iotdb-extractor, do-nothing-processor and iotdb-thrift-connector plugins are used to build the data flow processing task. IoTDB also has other built-in stream processing plugins, **please check the "System Preset Stream Processing plugin" section**.
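A sketch of a task built from exactly those three plugins; the pipe name and connector address are placeholders, and the parameter keys follow the extractor/processor/connector naming used in this section:

```sql
create pipe process_a2b
with extractor (
  -- default extractor: full historical data plus subsequent real-time data
  'extractor' = 'iotdb-extractor'
)
with processor (
  -- default processor: passes data through unchanged
  'processor' = 'do-nothing-processor'
)
with connector (
  'connector' = 'iotdb-thrift-connector',
  -- placeholder address of the receiving instance
  'connector.ip' = '127.0.0.1',
  'connector.port' = '6668'
)
```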

2 changes: 1 addition & 1 deletion src/UserGuide/V1.2.x/User-Manual/Data-Sync.md
@@ -32,7 +32,7 @@

**The model of a Pipe task is as follows:**

![Task model diagram](https://alioss.timecho.com/docs/img/%E6%B5%81%E5%A4%84%E7%90%86%E5%BC%95%E6%93%8E.jpeg)
![pipe.png](https://alioss.timecho.com/docs/img/pipe.png)

It describes a data sync task, which essentially describes the attributes of the Pipe Extractor, Pipe Processor, and Pipe Connector plugins. Users can declaratively configure the specific attributes of the three subtasks through SQL statements. By combining different attributes, flexible data ETL (Extract, Transform, Load) capabilities can be achieved.

2 changes: 1 addition & 1 deletion src/UserGuide/V1.2.x/User-Manual/Data-Sync_timecho.md
@@ -32,7 +32,7 @@

**The model of a Pipe task is as follows:**

![Task model diagram](https://alioss.timecho.com/docs/img/%E6%B5%81%E5%A4%84%E7%90%86%E5%BC%95%E6%93%8E.jpeg)
![pipe.png](https://alioss.timecho.com/docs/img/pipe.png)

It describes a data sync task, which essentially describes the attributes of the Pipe Extractor, Pipe Processor, and Pipe Connector plugins. Users can declaratively configure the specific attributes of the three subtasks through SQL statements. By combining different attributes, flexible data ETL (Extract, Transform, Load) capabilities can be achieved.

2 changes: 1 addition & 1 deletion src/UserGuide/V1.2.x/User-Manual/Streaming.md
@@ -35,7 +35,7 @@ Pipe Extractor is used to extract data, Pipe Processor is used to process data,

**The model for a Pipe task is as follows:**

![Task model diagram](https://alioss.timecho.com/docs/img/%E5%90%8C%E6%AD%A5%E5%BC%95%E6%93%8E.jpeg)
![pipe.png](https://alioss.timecho.com/docs/img/pipe.png)
A data stream processing task essentially describes the attributes of the Pipe Extractor, Pipe Processor, and Pipe Connector plugins.

Users can configure the specific attributes of these three subtasks declaratively using SQL statements. By combining different attributes, flexible data ETL (Extract, Transform, Load) capabilities can be achieved.
3 changes: 2 additions & 1 deletion src/UserGuide/V1.2.x/User-Manual/Streaming_timecho.md
@@ -35,7 +35,8 @@ Pipe Extractor is used to extract data, Pipe Processor is used to process data,

**The model for a Pipe task is as follows:**

![Task model diagram](https://alioss.timecho.com/docs/img/%E5%90%8C%E6%AD%A5%E5%BC%95%E6%93%8E.jpeg)
![pipe.png](https://alioss.timecho.com/docs/img/pipe.png)

A data stream processing task essentially describes the attributes of the Pipe Extractor, Pipe Processor, and Pipe Connector plugins.

Users can configure the specific attributes of these three subtasks declaratively using SQL statements. By combining different attributes, flexible data ETL (Extract, Transform, Load) capabilities can be achieved.
10 changes: 5 additions & 5 deletions src/UserGuide/V1.3.x/User-Manual/Data-Sync_timecho.md
@@ -162,7 +162,7 @@ IoTDB> show pipeplugins

This example is used to demonstrate the synchronisation of all data from one IoTDB to another IoTDB with the data link as shown below:

![](https://alioss.timecho.com/docs/img/w1.png)
![](https://alioss.timecho.com/docs/img/e1.png)

In this example, we create a synchronisation task named A2B to synchronise the full set of data from IoTDB A to IoTDB B. The sink uses the built-in iotdb-thrift-sink plugin, and the address of the receiving end must be specified: either through 'sink.ip' and 'sink.port', or through 'sink.node-urls', as in the following example statement:

@@ -180,7 +180,7 @@ with sink (

This example is used to demonstrate the synchronisation of data from a certain historical time range (8:00pm 23 August 2023 to 8:00pm 23 October 2023) to another IoTDB, the data link is shown below:

![](https://alioss.timecho.com/docs/img/w2.png)
![](https://alioss.timecho.com/docs/img/e2.png)

In this example we create a synchronisation task called A2B. First, the range of data to be transferred is defined in source. Since the data to be transferred is historical data (data that existed before the synchronisation task was created), the source.realtime.enable parameter must be set to false. We also need to configure the start-time and end-time of the data and the mode of transmission; it is recommended to set mode to hybrid (a mixed transmission mode that uses real-time transmission when there is no backlog of data, switches to batch transmission when data backs up, and switches automatically based on the internal state of the system).

@@ -205,7 +205,7 @@ with SINK (

This example is used to demonstrate a scenario where two IoTDBs are dual-active with each other, with the data link shown below:

![](https://alioss.timecho.com/docs/img/w3.png)
![](https://alioss.timecho.com/docs/img/e3.png)

In this example, in order to avoid an infinite loop of data, the parameter `source.forwarding-pipe-requests` needs to be set to `false` on both A and B, indicating that data transferred in from the other pipe will not be forwarded. Also set `source.history.enable` to `false` so that historical data is not transferred, i.e. data written before the task was created is not synchronised.

@@ -245,7 +245,7 @@ with sink (

This example is used to demonstrate a cascading data transfer scenario between multiple IoTDBs, where data is synchronised from cluster A to cluster B and then to cluster C. The data link is shown in the figure below:

![](https://alioss.timecho.com/docs/img/w4.png)
![](https://alioss.timecho.com/docs/img/e4.png)

In this example, in order to synchronise the data from cluster A through to C, the pipe between B and C needs `source.forwarding-pipe-requests` set to `true`; the detailed statement is as follows:

@@ -277,7 +277,7 @@ with sink (

This example is used to demonstrate a scenario where data from one IoTDB is synchronised to another IoTDB via a unidirectional gate, with the data link shown below:

![](https://alioss.timecho.com/docs/img/w5.png)
![](https://alioss.timecho.com/docs/img/e5.png)

In this example, you need to use the iotdb-air-gap-sink plugin in the sink task (currently only some models of network gates are supported; please contact the staff of Tianmou Technology to confirm the specific model). After configuring the network gate, execute the following statement on IoTDB A, where ip and port are filled with the information of the network gate:

2 changes: 1 addition & 1 deletion src/UserGuide/V1.3.x/User-Manual/Streaming.md
@@ -35,7 +35,7 @@ Pipe Extractor is used to extract data, Pipe Processor is used to process data,

**The model for a Pipe task is as follows:**

![Task model diagram](https://alioss.timecho.com/docs/img/%E5%90%8C%E6%AD%A5%E5%BC%95%E6%93%8E.jpeg)
![pipe.png](https://alioss.timecho.com/docs/img/pipe.png)
A data stream processing task essentially describes the attributes of the Pipe Extractor, Pipe Processor, and Pipe Connector plugins.

Users can configure the specific attributes of these three subtasks declaratively using SQL statements. By combining different attributes, flexible data ETL (Extract, Transform, Load) capabilities can be achieved.
16 changes: 7 additions & 9 deletions src/UserGuide/V1.3.x/User-Manual/Streaming_timecho.md
@@ -35,7 +35,7 @@ Pipe Extractor is used to extract data, Pipe Processor is used to process data,

**The model of the Pipe task is as follows:**

![Task model diagram](https://alioss.timecho.com/docs/img/%E5%90%8C%E6%AD%A5%E5%BC%95%E6%93%8E.jpeg)
![pipe.png](https://alioss.timecho.com/docs/img/pipe.png)

Describing a data flow processing task essentially describes the properties of Pipe Extractor, Pipe Processor and Pipe Connector plugins.
Users can declaratively configure the specific attributes of the three subtasks through SQL statements, and achieve flexible data ETL capabilities by combining different attributes.
@@ -614,14 +614,12 @@ WITH CONNECTOR (

**When creating a stream processing task, you need to configure the PipeId and the parameters of the three plugin parts:**


| Configuration item | Description | Required or not | Default implementation | Default implementation description | Whether custom implementation is allowed |
| ------------------ | ------------------------------------------------------------ | ------------------------------- | ---------------------- | ------------------------------------------------------------ | ---------------------------------------- |
| PipeId | A globally unique name that identifies a stream processing task | <font color=red>Required</font> | - | - | - |
| extractor | Pipe Extractor plugin, responsible for extracting stream processing data at the bottom of the database | Optional | iotdb-extractor | Integrates the full historical data of the database and subsequent real-time data into the stream processing task | No |
| processor | Pipe Processor plugin, responsible for processing data | Optional | do-nothing-processor | Does not do any processing on the incoming data | <font color=red>Yes</font> |
| connector | Pipe Connector plugin, responsible for sending data | <font color=red>Required</font> | - | - | <font color=red>Yes</font> |

In the example, the iotdb-extractor, do-nothing-processor and iotdb-thrift-connector plugins are used to build the data flow processing task. IoTDB also has other built-in stream processing plugins, **please check the "System Preset Stream Processing plugin" section**.

