A script for syncing the hive directory across the cluster

A programmer's job is to turn anything done by hand into something the computer does, so that the programmer can be a little lazy.

Below is a rather crude hive-directory sync script. If your cluster has more than 100 or 1,000 nodes, you can wrap it in a loop (a loop-based sketch follows the script below).

#!/bin/sh
#=============== hive install-package sync ================#
# This script syncs the hive directory from the name node #
# to the data nodes. Whenever the hive install changes,   #
# the data nodes must be re-synced; otherwise, when oozie #
# calls hive through a shell action, the hive install on  #
# the assigned node is out of sync and the job errors.    #
#==========================================================#

# 1. Remove the old hive
ssh -t hadoop@dwprod-dataslave1 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave2 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave3 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave4 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave5 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave6 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave7 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave8 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave9 rm -r /opt/local/hive
ssh -t hadoop@dwprod-dataslave10 rm -r /opt/local/hive

# 2. Copy over the new hive
scp -r -q /opt/local/hive hadoop@dwprod-dataslave1:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave2:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave3:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave4:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave5:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave6:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave7:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave8:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave9:/opt/local/
scp -r -q /opt/local/hive hadoop@dwprod-dataslave10:/opt/local/
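
For a larger cluster, the per-host lines above can be replaced with a loop. Below is a minimal sketch assuming the data nodes follow the dwprod-dataslave<N> naming pattern used above; the node count of 100 is a placeholder, not something from the original script.

#!/bin/sh
# Loop-based variant: same remove-then-copy steps as above,
# applied to each data node in turn.
# NOTE: the host pattern and node count (100) are assumptions;
# adjust them to match your cluster.
for i in $(seq 1 100); do
    node="dwprod-dataslave$i"
    ssh -t hadoop@"$node" rm -r /opt/local/hive
    scp -r -q /opt/local/hive hadoop@"$node":/opt/local/
done

Running rsync -a --delete per node instead would skip unchanged files and make the sync idempotent, but the plain rm + scp here mirrors the original approach.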