In two earlier articles we walked through Spring Cloud Data Flow, with every example driven from the UI. On a Linux server, however, there is often no graphical interface available, and a Jenkins integration cannot use the UI either. Fortunately, the official Data Flow Shell tool lets you do all of this from the command line, which is very convenient.
Related articles:
A First Look at Spring Cloud Data Flow, Running in Local Mode
Deploying Spring Cloud Data Flow on Kubernetes and Running a Task
The Spring Cloud Data Flow Server exposes a REST API, so the Shell tool is essentially a client that interacts with the Server by calling that REST API.
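To make that equivalence concrete, here is a minimal illustration (not part of the original walkthrough, and assuming a default, unsecured local Server): the same information the shell commands return can be fetched directly from the Server's REST API with curl.
$ curl http://localhost:9393/about
$ curl http://localhost:9393/apps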
First, make sure a Java environment is installed and download the executable jar: spring-cloud-dataflow-shell-2.5.3.RELEASE.jar
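If you do not already have the jar, one way to fetch it, assuming the artifact is published to Maven Central under the usual Spring Cloud coordinates, is:
$ wget https://repo.maven.apache.org/maven2/org/springframework/cloud/spring-cloud-dataflow-shell/2.5.3.RELEASE/spring-cloud-dataflow-shell-2.5.3.RELEASE.jar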
Then start it as follows:
$ java -jar spring-cloud-dataflow-shell-2.5.3.RELEASE.jar
By default it connects to the Server at http://localhost:9393; a different address can be specified with --dataflow.uri=<address>. If authentication is required, also pass --dataflow.username=<user> --dataflow.password=<password>.
For example, to connect to the Server we previously installed on Kubernetes:
$ java -jar spring-cloud-dataflow-shell-2.5.3.RELEASE.jar --dataflow.uri=http://localhost:30093
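If that Server were secured, the same command would simply carry the credential options as well; a sketch with made-up credentials (admin/secret are placeholders, not values from this setup):
$ java -jar spring-cloud-dataflow-shell-2.5.3.RELEASE.jar --dataflow.uri=http://localhost:30093 --dataflow.username=admin --dataflow.password=secret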
Let's go through the Application-related commands.
List all currently registered apps:
dataflow:>app list
╔═══╤══════╤═════════╤════╤════════════════════╗
║app│source│processor│sink│        task        ║
╠═══╪══════╪═════════╪════╪════════════════════╣
║   │      │         │    │composed-task-runner║
║   │      │         │    │timestamp-batch     ║
║   │      │         │    │timestamp           ║
╚═══╧══════╧═════════╧════╧════════════════════╝
View details of a specific app:
dataflow:>app info --type task timestamp
Unregister an app:
dataflow:>app unregister --type task timestamp
Successfully unregistered application 'timestamp' with type 'task'.
Unregister all apps:
dataflow:>app all unregister
Successfully unregistered applications.
dataflow:>app list
No registered apps. You can register new apps with the 'app register' and 'app import' commands.
Register an app:
dataflow:>app register --name timestamp-pkslow --type task --uri docker:springcloudtask/timestamp-task:2.1.1.RELEASE
Successfully registered application 'task:timestamp-pkslow'
dataflow:>app list
╔═══╤══════╤═════════╤════╤════════════════╗
║app│source│processor│sink│      task      ║
╠═══╪══════╪═════════╪════╪════════════════╣
║   │      │         │    │timestamp-pkslow║
╚═══╧══════╧═════════╧════╧════════════════╝
Apps can also be imported in bulk, from a URL or from a properties file:
dataflow:>app import https://dataflow.spring.io/task-docker-latest
Successfully registered 3 applications from [task.composed-task-runner, task.timestamp.metadata, task.composed-task-runner.metadata, task.timestamp-batch.metadata, task.timestamp-batch, task.timestamp]
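The properties-file variant is not shown above. As a sketch (the file name, path, and app name here are my own placeholders), each line of the file maps <type>.<name> to an app URI, and the file is imported by its location:
task.timestamp-pkslow=docker:springcloudtask/timestamp-task:2.1.1.RELEASE
dataflow:>app import --uri file:///home/pkslow/pkslow-apps.properties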
Note that when registering or importing an app, a duplicate is rejected by default and will not be overwritten. To overwrite it, add the --force flag.
dataflow:>app register --name timestamp-pkslow --type task --uri docker:springcloudtask/timestamp-task:2.1.1.RELEASE
Command failed org.springframework.cloud.dataflow.rest.client.DataFlowClientException: The 'task:timestamp-pkslow' application is already registered as docker:springcloudtask/timestamp-task:2.1.1.RELEASE
The 'task:timestamp-pkslow' application is already registered as docker:springcloudtask/timestamp-task:2.1.1.RELEASE
dataflow:>app register --name timestamp-pkslow --type task --uri docker:springcloudtask/timestamp-task:2.1.1.RELEASE --force
Successfully registered application 'task:timestamp-pkslow'
Now for the Task-related commands. List the tasks:
dataflow:>task list
╔════════════════╤════════════════════════════════╤═══════════╤═══════════╗
║   Task Name    │        Task Definition         │description│Task Status║
╠════════════════╪════════════════════════════════╪═══════════╪═══════════╣
║timestamp-pkslow│timestamp                       │           │COMPLETE   ║
║timestamp-two   │<t1: timestamp || t2: timestamp>│           │ERROR      ║
║timestamp-two-t1│timestamp                       │           │COMPLETE   ║
║timestamp-two-t2│timestamp                       │           │COMPLETE   ║
╚════════════════╧════════════════════════════════╧═══════════╧═══════════╝
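For reference, the composed task timestamp-two in the table would have been created with the composed-task DSL, along these lines (a sketch reconstructed from the definition shown above, not a command taken from the original article):
dataflow:>task create timestamp-two --definition "<t1: timestamp || t2: timestamp>"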
Delete a task. Here we delete a composed task, so its child tasks are deleted along with it:
dataflow:>task destroy timestamp-two
Destroyed task 'timestamp-two'
Delete all tasks; the shell asks for confirmation because of the risk:
dataflow:>task all destroy
Really destroy all tasks? [y, n]: y
All tasks destroyed
dataflow:>task list
╔═════════╤═══════════════╤═══════════╤═══════════╗
║Task Name│Task Definition│description│Task Status║
╚═════════╧═══════════════╧═══════════╧═══════════╝
Create a task:
dataflow:>task create timestamp-pkslow-t1 --definition "timestamp --format=\"yyyy\"" --description "pkslow timestamp task"
Created new task 'timestamp-pkslow-t1'
Launch a task and check its status. Note the execution id returned when the task is launched, then query the status by that execution id:
dataflow:>task launch timestamp-pkslow-t1
Launched task 'timestamp-pkslow-t1' with execution id 8
dataflow:>task execution status 8
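Launch-time command-line arguments can also be passed with --arguments on task launch; a hedged sketch (the argument value below is illustrative only, not taken from the article):
dataflow:>task launch timestamp-pkslow-t1 --arguments "--format=yyyy-MM-dd"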
List all task executions and view an execution's log:
dataflow:>task execution list
dataflow:>task execution log 8

  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
 :: Spring Boot ::        (v2.1.13.RELEASE)

2020-08-01 17:20:51.626  INFO 1 --- [       Thread-5] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown initiated...
2020-08-01 17:20:51.633  INFO 1 --- [       Thread-5] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Shutdown completed.
HTTP requests can also be issued from the shell:
dataflow:>http get https://www.pkslow.com
dataflow:>http post --target https://www.pkslow.com --data "data"
> POST (text/plain) https://www.pkslow.com data
> 405 METHOD_NOT_ALLOWED
Error sending data 'data' to 'https://www.pkslow.com'
Now prepare a script file to hold Data Flow Shell commands, named pkslow.shell, with the following content:
version
date
app list
Run it; the execution and result look like this:
dataflow:>script pkslow.shell
version
2.5.3.RELEASE
date
Sunday, August 2, 2020 1:59:34 AM CST
app list
╔═══╤══════╤═════════╤════╤════════════════════╗
║app│source│processor│sink│        task        ║
╠═══╪══════╪═════════╪════╪════════════════════╣
║   │      │         │    │timestamp-pkslow    ║
║   │      │         │    │composed-task-runner║
║   │      │         │    │timestamp-batch     ║
║   │      │         │    │timestamp           ║
╚═══╧══════╧═════════╧════╧════════════════════╝
Script required 0.045 seconds to execute
dataflow:>
In a CI/CD pipeline, however, we don't want to start an interactive shell first and then run a script inside it. We want to do it in one step: run the commands directly and drop out of the shell once they finish. That is also possible: specify the file at startup with --spring.shell.commandFile, and separate multiple files with commas. For example:
$ java -jar spring-cloud-dataflow-shell-2.5.3.RELEASE.jar --dataflow.uri=http://localhost:30093 --spring.shell.commandFile=pkslow.shell
Successfully targeted http://localhost:30093
2020-08-02T02:03:49+0800  INFO main o.s.c.d.s.DataflowJLineShellComponent:311 - 2.5.3.RELEASE
2020-08-02T02:03:49+0800  INFO main o.s.c.d.s.DataflowJLineShellComponent:311 - Sunday, August 2, 2020 2:03:49 AM CST
2020-08-02T02:03:49+0800  INFO main o.s.c.d.s.DataflowJLineShellComponent:309 -
╔═══╤══════╤═════════╤════╤════════════════════╗
║app│source│processor│sink│        task        ║
╠═══╪══════╪═════════╪════╪════════════════════╣
║   │      │         │    │timestamp-pkslow    ║
║   │      │         │    │composed-task-runner║
║   │      │         │    │timestamp-batch     ║
║   │      │         │    │timestamp           ║
╚═══╧══════╧═════════╧════╧════════════════════╝
$
When it finishes, we are not left at the shell prompt but are returned to the Linux terminal, which is exactly what we need.
Let's try a script that registers an application, creates a task, and launches the task:
version
date
app register --name pkslow-app-1 --type task --uri docker:springcloudtask/timestamp-task:2.1.1.RELEASE
task create pkslow-task-1 --definition "pkslow-app-1"
task launch pkslow-task-1
Run it; the execution and result look like this:
$ java -jar spring-cloud-dataflow-shell-2.5.3.RELEASE.jar --dataflow.uri=http://localhost:30093 --spring.shell.commandFile=pkslow.shell
Successfully targeted http://localhost:30093
2020-08-02T02:06:41+0800  INFO main o.s.c.d.s.DataflowJLineShellComponent:311 - 2.5.3.RELEASE
2020-08-02T02:06:41+0800  INFO main o.s.c.d.s.DataflowJLineShellComponent:311 - Sunday, August 2, 2020 2:06:41 AM CST
2020-08-02T02:06:41+0800  INFO main o.s.c.d.s.DataflowJLineShellComponent:311 - Successfully registered application 'task:pkslow-app-1'
2020-08-02T02:06:42+0800  INFO main o.s.c.d.s.DataflowJLineShellComponent:311 - Created new task 'pkslow-task-1'
2020-08-02T02:06:51+0800  INFO main o.s.c.d.s.DataflowJLineShellComponent:311 - Launched task 'pkslow-task-1' with execution id 9
This way, we can automate packaging, deployment, and execution.
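As a rough sketch of how this might sit in a CI job (the file name, commands, and Server address simply mirror the examples above; a real pipeline would also need to handle re-runs, for example by destroying the old task first or using unique names):
#!/usr/bin/env bash
set -e
# Generate the Data Flow Shell command file on the fly...
cat > pkslow.shell <<'EOF'
app register --name pkslow-app-1 --type task --uri docker:springcloudtask/timestamp-task:2.1.1.RELEASE --force
task create pkslow-task-1 --definition "pkslow-app-1"
task launch pkslow-task-1
EOF
# ...then run it non-interactively and return to the CI runner when done.
java -jar spring-cloud-dataflow-shell-2.5.3.RELEASE.jar \
  --dataflow.uri=http://localhost:30093 \
  --spring.shell.commandFile=pkslow.shell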
The shell is a powerful tool with many commands. There is no need to memorize them all; you can list every command with help:
dataflow:>help
If you are only interested in a particular group of commands, get help for just that group with help xxx:
dataflow:>help version
* version - Displays shell version
dataflow:>help app
* app all unregister - Unregister all applications
* app default - Change the default application version
* app import - Register all applications listed in a properties file
* app info - Get information about an application
* app list - List all registered applications
* app register - Register a new application
* app unregister - Unregister an application
The shell also supports tab completion for commands.
This article covers quite a few commands, and to keep it from getting too long some of the output has been omitted; refer to the official documentation for the full details.