[FLINK-36784][docs][pipeline-connector/mysql] Add op_ts metadata docs… #3949

Merged 1 commit on Mar 14, 2025
9 changes: 9 additions & 0 deletions docs/content.zh/docs/connectors/pipeline-connectors/mysql.md
@@ -312,6 +312,15 @@ pipeline:
<td>Boolean</td>
<td>Whether to treat the TINYINT(1) type as Boolean. Defaults to true.</td>
</tr>
<tr>
<td>metadata.list</td>
<td>optional</td>
<td style="word-wrap: break-word;">false</td>
<td>String</td>
<td>
List of additional metadata from the SourceRecord that can be read and then used directly in the transform module, separated by commas (`,`). Currently available values: op_ts.
</td>
</tr>
</tbody>
</table>
</div>
23 changes: 23 additions & 0 deletions docs/content.zh/docs/core-concept/transform.md
@@ -70,6 +70,29 @@ There are some hidden columns used to access metadata information. They will onl
| __table_name__ | String | Name of the table that contains the row. |
| __data_event_type__ | String | Operation type of data change event. |

Besides the metadata fields above, connectors can also parse additional metadata and put it into the meta map of the DataChangeEvent; this metadata can likewise be used in the transform module.
For example, in the following job, the MySQL connector parses the `op_ts` metadata so that it can be used in a transform.

```yaml
source:
type: mysql
hostname: localhost
port: 3306
username: testuser
password: testpwd
tables: testdb.customer
server-id: 5400-5404
server-time-zone: UTC
metadata.list: op_ts

transform:
- source-table: testdb.customer
projection: \*, __namespace_name__ || '.' || __schema_name__ || '.' || __table_name__ AS identifier_name, __data_event_type__ AS type, op_ts AS opts

sink:
type: values
```

## Metadata relationship

| Type | Namespace | SchemaName | Table |
9 changes: 9 additions & 0 deletions docs/content/docs/connectors/pipeline-connectors/mysql.md
@@ -332,6 +332,15 @@ pipeline:
When 'use.legacy.json.format' = 'false', the data would be converted to {"key1": "value1", "key2": "value2"}, with whitespace before values and after commas preserved.
</td>
</tr>
<tr>
<td>metadata.list</td>
<td>optional</td>
<td style="word-wrap: break-word;">false</td>
<td>String</td>
<td>
List of metadata read from the SourceRecord, passed downstream and available in the transform module; separate multiple entries with `,`. Currently available readable metadata: op_ts.
</td>
</tr>
</tbody>
</table>
</div>
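As a minimal sketch of enabling this option in a source definition (the connection details and table name here are illustrative placeholders, not part of the option's documentation):

```yaml
source:
  type: mysql
  hostname: localhost
  port: 3306
  username: testuser
  password: testpwd
  tables: testdb.customer
  # Expose the operation timestamp as readable metadata
  metadata.list: op_ts
```

With this in place, `op_ts` becomes referenceable as a column in the transform module.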
24 changes: 24 additions & 0 deletions docs/content/docs/core-concept/transform.md
@@ -70,6 +70,30 @@ There are some hidden columns used to access metadata information. They will onl
| __table_name__ | String | Name of the table that contains the row. |
| __data_event_type__ | String | Operation type of data change event. |

Besides these fields, pipeline connectors can parse additional metadata and put it into the meta map of the DataChangeEvent.
This metadata can then be accessed in the transform module.
For example, the MySQL pipeline connector can parse `op_ts` and use it in the transform module.

```yaml
source:
type: mysql
hostname: localhost
port: 3306
username: testuser
password: testpwd
tables: testdb.customer
server-id: 5400-5404
server-time-zone: UTC
metadata.list: op_ts

transform:
- source-table: testdb.customer
projection: \*, __namespace_name__ || '.' || __schema_name__ || '.' || __table_name__ AS identifier_name, __data_event_type__ AS type, op_ts AS opts

sink:
type: values
```

## Metadata relationship

| Type | Namespace | SchemaName | Table |