add aof_writer cmd_writer json_writer #914

Open · wants to merge 3 commits into base: v4
Conversation

@carlvine500 commented Dec 30, 2024

json_writer

json_writer use case:

After converting to JSON, the data can be imported into MongoDB (which stores documents as JSON) and handed to colleagues for data analysis, for example: breaking down each business's data volume and memory share by dbIndex/key prefix, or finding the 100 keys with the largest serialized size under each business prefix (a minimal import/analysis sketch follows the output example below).

Example json_writer output:

{"DbId":0,"Argv":["SELECT","0"],"CmdName":"SELECT","Group":"CONNECTION","Keys":null,"KeyIndexes":null,"Slots":[],"SerializedSize":23}
{"DbId":0,"Argv":["set","key1","1"],"CmdName":"SET","Group":"STRING","Keys":["key1"],"KeyIndexes":[2],"Slots":[9189],"SerializedSize":30}
{"DbId":0,"Argv":["set","key2","2"],"CmdName":"SET","Group":"STRING","Keys":["key2"],"KeyIndexes":[2],"Slots":[4998],"SerializedSize":30}
{"DbId":0,"Argv":["set","key3","3"],"CmdName":"SET","Group":"STRING","Keys":["key3"],"KeyIndexes":[2],"Slots":[935],"SerializedSize":30}
{"DbId":0,"Argv":["sadd","key4","1","2","3","4"],"CmdName":"SADD","Group":"SET","Keys":["key4"],"KeyIndexes":[2],"Slots":[13120],"SerializedSize":52}
{"DbId":0,"Argv":["lpush","key5","1","2","3","4","5"],"CmdName":"LPUSH","Group":"LIST","Keys":["key5"],"KeyIndexes":[2],"Slots":[9057],"SerializedSize":60}
{"DbId":0,"Argv":["zadd","key6","1","2","3","4","5","6"],"CmdName":"ZADD","Group":"SORTED_SET","Keys":["key6"],"KeyIndexes":[2],"Slots":[4866],"SerializedSize":66}
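
The MongoDB workflow described above could look roughly like the following sketch; the file, database, and collection names (dump.json, redis_analysis, commands) are hypothetical, and mongoimport reads newline-delimited JSON (one document per line) by default:

# Import json_writer output into MongoDB (hypothetical names throughout).
mongoimport --db redis_analysis --collection commands --file dump.json

# Example analysis: total serialized bytes and command count per DbId, largest first.
mongosh --quiet --eval '
  db.getSiblingDB("redis_analysis").commands.aggregate([
    { $group: { _id: "$DbId", totalBytes: { $sum: "$SerializedSize" }, count: { $sum: 1 } } },
    { $sort: { totalBytes: -1 } }
  ])'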

cmd_writer

cmd_writer use case:

Used together with a filter to export the primary data; after batch data corrections are done, the corrected commands can be replayed with redis-cli -p 16379 < cmd.txt (see the sketch after the output example below).

Example cmd_writer output:

SELECT 0
set key1 1
set key2 2
set key3 3
sadd key4 1 2 3 4
lpush key5 1 2 3 4 5
zadd key6 1 2 3 4 5 6
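
A minimal sketch of the replay step, assuming the exported and batch-corrected commands were saved to a hypothetical file cmd.txt (port 16379 is the one used in the description above):

# Replay the corrected commands against the target instance.
redis-cli -p 16379 < cmd.txt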

aof_writer

aof_writer use case:

Used together with a filter to export the data for a given business prefix, which can then be quickly imported into the customer's internal network with redis-cli --pipe (see the sketch after the output example below).

Example aof_writer output:

*2
$6
SELECT
$1
0
*3
$3
set
$4
key1
$1
1
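
The import side could be as simple as the following sketch; the host and the file name business_a.aof are hypothetical, and redis-cli --pipe reads the raw RESP stream from stdin:

# Bulk-load the exported RESP file into the target instance.
cat business_a.aof | redis-cli -h 10.0.0.1 -p 6379 --pipe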

@EquentR (Collaborator) commented Dec 30, 2024

In theory aof_writer doesn't seem necessary, because you can switch Redis to pure AOF mode and simply copy the AOF file out.
As for the other two, they could work as export features for heterogeneous targets, but a plain JSON or command file still needs a matching conversion tool before it can be used. Since RedisShake is a tool for migrating Redis data, these two feel like they go beyond the scope of Redis itself 🤣

@carlvine500 (Author) replied:

Suppose appendonly.aof stores data for multiple businesses by key prefix, say business A and business B:
A:keyXXX
B:keyYYY
When business A needs to be migrated to a customer's offline environment, you can filter by the A: prefix, export the result to a file A.aof, and then import that file in the offline environment.

Exporting a file and transferring it into the customer's internal network is still useful when the public network cannot reach it.

We also have bank customers whose systems are isolated from one another and are only allowed to exchange data through a shared file that each side reads and writes periodically.

@EquentR (Collaborator) commented Dec 30, 2024

OK, fair enough, that makes sense 🤣

@carlvine500 (Author) commented Dec 31, 2024

@EquentR The JSON format is very convenient to import into MongoDB for analysis, where you can write SQL-like queries; operations engineers can also analyze it easily with the Linux jq command. For example, to filter the Keys that start with the business prefix A: and sort them by SerializedSize in descending order:

[screenshot: jq query filtering keys with the A: prefix and sorting by SerializedSize in descending order]
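
A sketch of the kind of jq query the screenshot shows, assuming the json_writer output was saved to a hypothetical file dump.json:

# Keep commands whose first key starts with the business prefix "A:",
# then sort by SerializedSize, largest first.
jq -s '[ .[] | select(.Keys != null and (.Keys[0] | startswith("A:"))) ]
       | sort_by(-.SerializedSize) | .[]' dump.json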

@EquentR (Collaborator) commented Dec 31, 2024

@carlvine500 Making this a data adapter for heterogeneous targets feels like a good fit, but writing files of RESP commands personally strikes me as a bit odd 🤣 I also work on features that transfer data across network-isolation devices, using RedisShake to export data from an internal zone to an external zone over one-way TCP, but the target there is still Redis. I'm not sure how generally applicable this feature would be.

@suxb201 What do you think of this PR?

@suxb201 (Member) commented Jan 2, 2025

I think this PR is quite good; it would be even better with documentation and a configuration example. Also, would you consider changing it to a configuration like the following?

[file_writer]
path=""
type="" # json aof cmd

With a configuration like this, though, it would be awkward if the JSON format later needed extra options to fine-tune its output (but there probably won't be such a requirement).

@carlvine500 (Author) replied:

Got it, I'll find time to refactor it.
