潇湘夜雨移动版


Common website maintenance commands for an LNMP environment

1. Archiving and compression commands
[root@www data]# tar -zcf blog-`date +%F`.tar.gz fenglin/   # create a gzipped archive
[root@www data]# ls
blog-2016-09-16.tar.gz
[root@www data]# tar -zxvf blog-2016-09-16.tar.gz   # extract the archive
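As a sanity check before overwriting anything, an archive can be listed without extracting, and extracted into a separate directory. A minimal self-contained sketch (the /tmp paths and sample file are placeholders, not from the original setup):

```shell
# Build a sample tree and archive it with a date-stamped name
mkdir -p /tmp/demo/fenglin && echo hello > /tmp/demo/fenglin/index.html
cd /tmp/demo
tar -zcf blog-$(date +%F).tar.gz fenglin/

# List the archive contents without extracting (-t)
tar -tzf blog-$(date +%F).tar.gz

# Extract into a separate directory (-C) instead of the current one
mkdir -p restore
tar -zxf blog-$(date +%F).tar.gz -C restore
```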
2. MySQL full backup and single-database backup commands
mysqldump -u root -p dream >dream-`date +%F-%H-%M-%S`.sql   # back up the dream database
mysqldump -u root -p --master-data=1 --events --lock-all-tables --flush-logs --all-databases >mysql-`date +%F-%H-%M-%S`.sql  # full backup of all databases
[root@www ~]# ls    # list the backup files
mysql-2016-09-16-19-43-13.sql
dream-2016-09-16-19-48-20.sql
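SQL dumps compress well and are usually stored gzipped. A sketch using a simulated dump file (the restore commands are shown commented out, since they need a running MySQL server; filenames follow the timestamp pattern above):

```shell
cd /tmp
# Simulated dump file standing in for real mysqldump output
echo 'CREATE TABLE t (id INT);' > dream-2016-09-16-19-48-20.sql

# Compress it, keeping the original (-k) and overwriting any old .gz (-f)
gzip -kf dream-2016-09-16-19-48-20.sql

# Restoring streams a dump back into the server, e.g.:
#   mysql -u root -p dream < dream-2016-09-16-19-48-20.sql
#   gunzip -c dream-2016-09-16-19-48-20.sql.gz | mysql -u root -p dream
```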
3. Checking MySQL format-related variables
mysql> show variables like '%format%';
+--------------------------+-------------------+
| Variable_name            | Value             |
+--------------------------+-------------------+
| binlog_format            | STATEMENT         |
| date_format              | %Y-%m-%d          |
| datetime_format          | %Y-%m-%d %H:%i:%s |
| default_week_format      | 0                 |
| innodb_file_format       | Antelope          |
| innodb_file_format_check | ON                |
| innodb_file_format_max   | Antelope          |
| time_format              | %H:%i:%s          |
+--------------------------+-------------------+

4. Finding and filtering IPs in logs
   sed '/127\.0\.0\.1/d' /var/log/nginx/access.log # print the log with lines containing "127.0.0.1" removed

   sed -n '/172\.18\.109\.207/p' /var/log/nginx/access.log # print only the lines containing "172.18.109.207"
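The same filtering can be done with grep; a self-contained sketch on sample data (-F treats the pattern as a fixed string, so the dots in the IP are literal rather than regex wildcards):

```shell
# Sample log standing in for /var/log/nginx/access.log
printf '127.0.0.1 - GET /\n172.18.109.207 - GET /index.html\n' > /tmp/access.sample

grep -vF '127.0.0.1' /tmp/access.sample      # drop lines containing 127.0.0.1
grep -F '172.18.109.207' /tmp/access.sample  # keep only lines containing 172.18.109.207
```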
 
5. Replacing content:
   [root@m-proxy1 log]# awk '{print $7}' access.log |head -3
www.xzitv.com/t/jscell/2d/1_8_111_6_22305.php
www.xzitv.com/t/1/4/js/posterTvGrid.js
www.xzitv.com/m2o/player/keep_alive.php?access%5Ftoken=&id=15&time=1481384520676&extend=
[root@m-proxy1 log]# awk '{print $7}' access.log |head -3|sed 's#www#ftp#g'
ftp.xzitv.com/t/jscell/2d/1_8_111_6_22305.php
ftp.xzitv.com/t/1/4/js/posterTvGrid.js
ftp.xzitv.com/m2o/player/keep_alive.php?access%5Ftoken=&id=15&time=1481384520676&extend=
Example: using sed to replace a string and change file contents
[root@youxiang ~]# cat connect.php # original file contents
<?php
 $conn=mysql_connect('171.18.109.234','root','123456');
 if ($conn)
echo "Success...";
 else
echo "Failure...";
?>
[root@youxiang ~]# sed 's#root#mysql#g' connect.php # display the file with "root" replaced by "mysql" (the file itself is unchanged)
<?php
 $conn=mysql_connect('171.18.109.234','mysql','123456');
 if ($conn)
echo "Success...";
 else
echo "Failure...";
?>
[root@youxiang ~]# sed -i 's#root#mysql#g' connect.php # replace the file contents in place
[root@youxiang ~]# cat connect.php 
<?php
 $conn=mysql_connect('171.18.109.234','mysql','123456');
 if ($conn)
echo "Success...";
 else
echo "Failure...";
?>
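Because -i overwrites the file with no undo, GNU sed's -i.bak form is a safer habit: it performs the same in-place edit but keeps the original under a .bak suffix. A sketch on a throwaway file (not the real connect.php):

```shell
# Throwaway file standing in for connect.php
printf 'user=root\n' > /tmp/demo.conf

# Edit in place, saving the original as /tmp/demo.conf.bak
sed -i.bak 's#root#mysql#g' /tmp/demo.conf

cat /tmp/demo.conf       # edited copy
cat /tmp/demo.conf.bak   # untouched original
```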
 
6. Per-domain log statistics
[root@m-proxy1 log]# awk '{print $7}' access.log |head -3
www.xzitv.com/t/jscell/2d/1_8_111_6_22305.php
www.xzitv.com/t/1/4/js/posterTvGrid.js
www.xzitv.com/m2o/player/keep_alive.php?access%5Ftoken=&id=15&time=1481384520676&extend=
[root@m-proxy1 log]# awk '{print $7}' access.log |head -3|sed 's/\/.*//' # strip everything from the first "/" onward
www.xzitv.com
www.xzitv.com
www.xzitv.com
Or: awk -F/ '{print $1}' filename # split on "/" and print the first field
[root@m-proxy1 log]# awk '{print $7}' access.log |sed 's/\/.*//'|sort|uniq -c|sort -rn|head -5
 19752 www.xzitv.com
 18949 img.xzitv.cn
  5893 m2o.xzitv.cn
  2551 mobile.xzitv.cn
  2336 adv.xzitv.cn
Example: extracting the domain from web access log URLs
[root@youxiang ~]# cat weburl 
http://www.baidu.com/index.html  
http://www.baidu.com/1.html  
http://post.baidu.com/index.html  
http://mp3.baidu.com/index.html  
http://www.baidu.com/3.html  
http://post.baidu.com/2.html  
[root@youxiang ~]# sed -e 's/http:\/\///' -e 's/\/.*//' weburl 
www.baidu.com
www.baidu.com
post.baidu.com
mp3.baidu.com
www.baidu.com
post.baidu.com
 
Or with an awk command:
[root@youxiang ~]# awk -F/ '{print $3}' weburl |sort -r|uniq -c|awk '{print $1"\t",$2}'
3 www.baidu.com
2 post.baidu.com
1 mp3.baidu.com
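The sort | uniq -c | sort -rn pipeline above is a general-purpose frequency counter; for example, the same idea ranks client IPs, which are field 1 in the common/combined log formats. A self-contained sketch on sample data:

```shell
# Sample access log: two requests from 1.1.1.1, one from 2.2.2.2
printf '1.1.1.1 - GET /\n2.2.2.2 - GET /\n1.1.1.1 - GET /a\n' > /tmp/ips.sample

# Count requests per client IP, most active first
awk '{print $1}' /tmp/ips.sample | sort | uniq -c | sort -rn | head -5
```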
 
Loki log query (LogQL) syntax:
Replacement in JSON-formatted logs
{app="loki"}| json| line_format `{{ regexReplaceAll "([0-9]{1,3}\\.){3}[0-9]{1,3}" .log "11111" }}`
Replacement in non-JSON logs
{app="loki"}| line_format `{{ regexReplaceAll "([0-9]{1,3}\\.){3}[0-9]{1,3}" __line__ "11111" }}`
{app="loki"}| json| .log =~ "([0-9]{1,3}\\.){3}[0-9]{1,3}"|line_format `{{ regexReplaceAll "([0-9]{1,3}\\.){3}[0-9]{1,3}" .log "11111" }}`
 
{app="loki"}|~ "([0-9]{1,3}\\.){3}[0-9]{1,3}"| json| line_format `{{ regexReplaceAll "([0-9]{1,3}\\.){3}[0-9]{1,3}" .log "11111" }}`
 
{cluster=~"prod-core", container=~"nginx-ingress-controller"} |json |path =~ `/data-receiver/\d+v/file-blocks/\d+/v\d+`| line_format `{{ regexReplaceAll "/data-receiver/\\d+v/file-blocks/\\d+/v\\d+" .path "data-receiver-api" }}`
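The regexReplaceAll calls above mask IPv4 addresses at query time. Before embedding such a pattern in LogQL, it can be sanity-checked locally, since GNU sed's -E mode accepts essentially the same extended regex. A minimal sketch:

```shell
# Mask IPv4 addresses in a log line, mirroring the regexReplaceAll pattern above
echo 'client 172.18.109.207 connected from 10.0.0.1' |
  sed -E 's/([0-9]{1,3}\.){3}[0-9]{1,3}/11111/g'
# -> client 11111 connected from 11111
```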
 
 
avg_over_time({container="data-receiver"} | regexp `useTime:(?P<use_time>\d+)` | unwrap use_time [1s]) by(container)
 
upload block fun merge====
 
upload block fun append====
 
{cluster=~"prod-core", container=~"nginx-ingress-controller"} |json |path =~ `/data-receiver/\d+v/file-blocks/\d+/v\d+`| line_format `{{ regexReplaceAll "/data-receiver/\\d+v/file-blocks/\\d+/v\\d+" __line__ "data-receiver-api" }}`
 
{cluster=~"prod-core", container=~"nginx-ingress-controller"} |json |path =~ `/data-receiver/\d+v/file-blocks/\d+/v\d+`| request_time > 5
 
upstream_response_time
 
{cluster=~"prod-core", container=~"nginx-ingress-controller"} |json |path =~ `/data-receiver/\d+v/file-blocks/\d+/v\d+`|upstream_response_time =~ `\d+\.\d+\,\d+\.\d+`
 
 
 
{cluster=~"prod-core", container=~"nginx-ingress-controller"} |json |path =~ `/data-receiver/\d+v/file-blocks/\d+/v\d+`|upstream_response_time !~ `\d+\.\d+`
{app="data-receiver"} |~ "OssTime"
 
 
avg_over_time({app="data-receiver"} |~ "OssTime" |~ "upload block fun append===="| regexp `OssTime:(?P<oss_time>\d+)` | unwrap oss_time [1s]) by(container)
 
avg_over_time({app="data-receiver"} |~ "OssTime" |~ "upload block fun append===="| regexp `kafka1:(?P<kafka1>\d+)` | unwrap kafka1 [1s]) by(container)
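The regexp | unwrap | avg_over_time queries above extract a numeric field and average it per container; the extraction half can be checked locally with grep and awk. A sketch on sample lines (the OssTime: marker comes from the log format queried above; the values are made up):

```shell
# Extract OssTime values and average them, mimicking regexp + unwrap + avg_over_time
printf 'upload block fun append==== OssTime:10\nupload block fun append==== OssTime:20\n' |
  grep -oE 'OssTime:[0-9]+' |
  cut -d: -f2 |
  awk '{ sum += $1; n++ } END { if (n) print sum / n }'
# -> 15
```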
 
prod-data-model-analysis
 
admission.datakit/java-lib.version
v1.25.2-guance
 
{container="data-receiver"} |~ "upload block fun append===="|~ "kafka1:\\d\\d"
 
 
avg_over_time({cluster=~"prod-core", container=~"nginx-ingress-controller"} |json |path =~ `/data-receiver/\d+v/file-blocks/\d+/v\d+` and upstream_response_time !~ `-`| line_format `{{ regexReplaceAll "/data-receiver/\\d+v/file-blocks/\\d+/v\\d+" __line__ "data-receiver-api" }}`|unwrap upstream_response_time[1s]) by (vhost)
 
 
avg_over_time({source="ilogtail"} |json |contents_request_uri =~ `/data-receiver/\d+v/file-blocks/\d+/v\d+` and contents_upstream_response_time !~ `-`| line_format `{{ regexReplaceAll "/data-receiver/\\d+v/file-blocks/\\d+/v\\d+" __line__ "data-receiver-api" }}`|unwrap contents_upstream_response_time[1s]) by (contents_domain)
 
 
 
 
avg_over_time({cluster=~"prod-core", container=~"nginx-ingress-controller"} | json | vhost="ocs.DD" | unwrap request_time[1s]) by (vhost)

Configuring Loki alert rules in Grafana:
When configuring Loki alert rules in Grafana, to pull the value= part out of the {{ $value }} variable string, use: {{ .Values.A.Value }}
 
[ var='A' labels={cluster=prod-core, container=async-recorder, pod=prod-async-recorder-57b467fc89-vjx9v} value=204 ], [ var='B' labels={cluster=prod-core, container=async-recorder, pod=prod-async-recorder-57b467fc89-vjx9v} value=204 ]
 
 
 
 
Error-log alert: service {{ $labels.container }} has {{ .Values.A.Value }} error log lines
 
 
This expression matches time series where status="200" or where the status label is empty:
 
http_requests_total{method="GET", status=~"200|^$"}
 
 
 
Excluding with a regular expression
For fuzzy (substring) exclusion, use the !~ operator with a regular expression:
 
{namespace!~".*150.*"}
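One difference worth remembering: Prometheus/Loki label matchers are fully anchored, which is why the pattern needs .* on both sides of 150, while grep matches substrings by default and a bare pattern suffices. A local equivalent of the exclusion (sample namespace names are made up):

```shell
# Exclude names containing "150", like {namespace!~".*150.*"}
printf 'team-150-dev\nteam-200-prod\n' | grep -vE '150'
# -> team-200-prod
```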
 
