This post shares a solution for analyzing storage account logs. As usual, let's start with the background and why we need this. We all know a storage account is a very practical place to keep data: it is highly scalable, easy to use, maintenance-free, and pay-as-you-go, so more and more data ends up in storage accounts. At the same time, we need to pay more and more attention to storage account security and compliance. For example, we can use the capabilities in Azure Security Center (global Azure) to scan files in a storage account for viruses, and we can use a service endpoint or private endpoint to turn the storage account into a private service.
None of that is the focus today, though. What we will cover is how to use Log Analytics to analyze storage account access logs. Why do this? The storage access logs are stored inside the storage account itself and are not convenient to read: the fields look much like an IIS log. My usual approach is to download the logs, manually convert them to CSV, and then analyze them in Excel. That works, but the steps are fairly tedious. If we could analyze the logs directly in Log Analytics instead, it would feel like querying a database, with no repeated data conversion. Here is how to do it, step by step.
The first step is to enable access logging on the storage account. Some people may not realize that detailed access logs are not recorded by default; they have to be turned on under the diagnostic settings.
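If you prefer scripting over the portal, the classic logging settings can also be switched on from PowerShell. The snippet below is only a minimal sketch: it assumes the Az.Storage module is installed and you are already signed in, and the resource group name is a placeholder; adjust the logged operations and retention to your own needs.

# Assumption: Az.Storage is installed and Connect-AzAccount has been run; "myResourceGroup" is a placeholder
$ctx = (Get-AzStorageAccount -ResourceGroupName "myResourceGroup" -Name "mxyarmtemplate").Context

# Enable classic Storage Analytics logging for the Blob service (read/write/delete), keeping logs for 10 days
Set-AzStorageServiceLoggingProperty -ServiceType Blob `
    -LoggingOperations Read,Write,Delete `
    -RetentionDays 10 `
    -Version 2.0 `
    -Context $ctx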
After turning it on, access the account a few times; after a short wait, the logs will show up in Storage Explorer.
The data you get is, by default, in this format:
2.0;2020-12-29T04:09:17.6971151Z;GetBlob;SASSuccess;200;681;3;sas;;mxyarmtemplate;blob;"https://mxyarmtemplate.blob.core.windows.net:443/template/Linuxvmdeploy.json?sv=2019-12-12&si=testpolicy&sr=c&sig=XXXXX";"/mxyarmtemplate/template/Linuxvmdeploy.json";c76b0385-401e-004d-3f98-dd4ba9000000;0;124.126.17.6:50735;2019-12-12;294;0;410;6805;0;;;""0x8D89F61D1748E94"";Sunday, 13-Dec-20 12:22:54 GMT;;"Mozilla/5.0 (Windows NT; Windows NT 10.0; zh-CN) WindowsPowerShell/5.1.18362.1171";;;;;;;;;;
2.0;2020-12-29T04:09:23.4252152Z;GetBlob;SASSuccess;200;728;4;sas;;mxyarmtemplate;blob;"https://mxyarmtemplate.blob.core.windows.net:443/template/Linuxvmdeploy.json?sv=2019-12-12&si=testpolicy&sr=c&sig=XXXXX";"/mxyarmtemplate/template/Linuxvmdeploy.json";c76b0db9-401e-004d-2a98-dd4ba9000000;0;124.126.17.6:50735;2019-12-12;294;0;410;6805;0;;;""0x8D89F61D1748E94"";Sunday, 13-Dec-20 12:22:54 GMT;;"Mozilla/5.0 (Windows NT; Windows NT 10.0; zh-CN) WindowsPowerShell/5.1.18362.1171";;;;;;;;;;
Log Analytics cannot ingest this format. The approach we take here is to download the storage account access logs and then upload them to Log Analytics manually. Log Analytics accepts JSON, so we first need to convert this format into JSON, which can be done with the script below.
Function ConvertSemicolonToURLEncoding([String] $InputText)
{
    # Semicolons inside quoted fields (for example the user-agent string) would break the
    # split below, so temporarily replace them with the URL-encoded form %3B
    $ReturnText = ""
    $chars = $InputText.ToCharArray()
    $StartConvert = $false

    foreach($c in $chars)
    {
        if($c -eq '"')
        {
            $StartConvert = ! $StartConvert
        }

        if($StartConvert -eq $true -and $c -eq ';')
        {
            $ReturnText += "%3B"
        }
        else
        {
            $ReturnText += $c
        }
    }

    return $ReturnText
}

Function FormalizeJsonValue($Text)
{
    # Quote the value if it is not already quoted, then restore any escaped semicolons
    $Text1 = ""
    if($Text.IndexOf("`"") -eq 0)
    {
        $Text1 = $Text
    }
    else
    {
        $Text1 = "`"" + $Text + "`""
    }

    if($Text1.IndexOf("%3B") -ge 0)
    {
        $ReturnText = $Text1.Replace("%3B", ";")
    }
    else
    {
        $ReturnText = $Text1
    }

    return $ReturnText
}

Function ConvertLogLineToJson([String] $logLine)
{
    $logLineEncoded = ConvertSemicolonToURLEncoding($logLine)
    $elements = $logLineEncoded.split(';')

    $FormattedElements = New-Object System.Collections.ArrayList
    foreach($element in $elements)
    {
        $NewText = FormalizeJsonValue($element)
        $FormattedElements.Add($NewText) | Out-Null
    }

    # Column names of the storage analytics log format, in order
    $Columns = (
        "version-number",
        "request-start-time",
        "operation-type",
        "request-status",
        "http-status-code",
        "end-to-end-latency-in-ms",
        "server-latency-in-ms",
        "authentication-type",
        "requester-account-name",
        "owner-account-name",
        "service-type",
        "request-url",
        "requested-object-key",
        "request-id-header",
        "operation-count",
        "requester-ip-address",
        "request-version-header",
        "request-header-size",
        "request-packet-size",
        "response-header-size",
        "response-packet-size",
        "request-content-length",
        "request-md5",
        "server-md5",
        "etag-identifier",
        "last-modified-time",
        "conditions-used",
        "user-agent-header",
        "referrer-header",
        "client-request-id"
    )

    # Build the JSON payload
    $logJson = "[{";
    For($i = 0; $i -lt $Columns.Length; $i++)
    {
        $logJson += "`"" + $Columns[$i] + "`":" + $FormattedElements[$i]
        if($i -lt $Columns.Length - 1)
        {
            $logJson += ","
        }
    }
    $logJson += "}]";

    return $logJson
}
These functions convert a storage account access log line directly into JSON.
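As a quick sanity check, you can feed a single record into the function and inspect the output; the sample below simply reuses the first log line shown earlier.

# Convert one raw log record and print the resulting JSON
$sampleLine = '2.0;2020-12-29T04:09:17.6971151Z;GetBlob;SASSuccess;200;681;3;sas;;mxyarmtemplate;blob;"https://mxyarmtemplate.blob.core.windows.net:443/template/Linuxvmdeploy.json?sv=2019-12-12&si=testpolicy&sr=c&sig=XXXXX";"/mxyarmtemplate/template/Linuxvmdeploy.json";c76b0385-401e-004d-3f98-dd4ba9000000;0;124.126.17.6:50735;2019-12-12;294;0;410;6805;0;;;""0x8D89F61D1748E94"";Sunday, 13-Dec-20 12:22:54 GMT;;"Mozilla/5.0 (Windows NT; Windows NT 10.0; zh-CN) WindowsPowerShell/5.1.18362.1171";;;;;;;;;;'
ConvertLogLineToJson $sampleLine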
Once the conversion is done, combine it with the script below to post the logs to Log Analytics. Note that you need to replace variables such as $CustomerId and $SharedKey with your own values.
$TimeStampField = "" $LogType = "" $SharedKey = "" $CustomerId = "" $ResourceGroup = "" Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource) { $xHeaders = "x-ms-date:" + $date $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash) $keyBytes = [Convert]::FromBase64String($sharedKey) $sha256 = New-Object System.Security.Cryptography.HMACSHA256 $sha256.Key = $keyBytes $calculatedHash = $sha256.ComputeHash($bytesToHash) $encodedHash = [Convert]::ToBase64String($calculatedHash) $authorization = 'SharedKey {0}:{1}' -f $customerId,$encodedHash return $authorization } # # Create the function to create and post the request # Function Post-LogAnalyticsData($customerId, $sharedKey, $body, $logType) { <# CustomerID Log Analytics 工做区的惟一标识符。 #> $method = "POST" $contentType = "application/json" $resource = "/api/logs" $rfc1123date = [DateTime]::UtcNow.ToString("r") $contentLength = $body.Length $signature = Build-Signature ` -customerId $customerId ` -sharedKey $sharedKey ` -date $rfc1123date ` -contentLength $contentLength ` -method $method ` -contentType $contentType ` -resource $resource $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01" $headers = @{ "Authorization" = $signature; "Log-Type" = $logType; "x-ms-date" = $rfc1123date; "time-generated-field" = $TimeStampField; } $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing return $response.StatusCode } $storageAccount = Get-AzStorageAccount -ResourceGroupName $ResourceGroup -Name $StorageAccountName -ErrorAction SilentlyContinue if($null -eq $storageAccount) { throw "The storage account specified does not exist in this subscription." 
} $storageContext = $storageAccount.Context $containers = New-Object System.Collections.ArrayList $container = Get-AzStorageContainer -Context $storageContext -Name "$ContainerName" -ErrorAction SilentlyContinue | ForEach-Object { $containers.Add($_) } | Out-Null Write-Output("> Container count: {0}" -f $containers.Count) $token = $Null $maxReturn = 5000 $successPost = 0 $failedPost = 0 # Enumerate containers $containers | ForEach-Object { $container = $_.CloudBlobContainer Write-Output("> Reading container {0}" -f $container.Name) do { $blobs = Get-AzStorageBlob -Context $storageContext -Container $container.Name -MaxCount $maxReturn -ContinuationToken $token if($Null -eq $blobs) { break } #Set-StrictMode will cause Get-AzStorageBlob returns result in different data types when there is only one blob if($blobs.GetType().Name -eq "AzureStorageBlob") { $token = $Null } else { $token = $blobs[$blobs.Count - 1].ContinuationToken; } # Enumerate log blobs foreach($blob in $blobs) { Write-Output("> Downloading blob: {0}" -f $blob.Name) $filename = ".\log.txt" Get-AzStorageBlobContent -Context $storageContext -Container $container.Name -Blob $blob.Name -Destination $filename -Force > Null Write-Output("> Posting logs to log analytic workspace: {0}" -f $blob.Name) $lines = Get-Content $filename foreach($line in $lines) { $json = ConvertLogLineToJson($line) $response = Post-LogAnalyticsData -customerId $customerId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($json)) -logType $logType if($response -eq "200") { $successPost++ } else { $failedPost++ Write-Output "> Failed to post one log to Log Analytics workspace" } } remove-item $filename -Force } } While ($token -ne $Null) Write-Output "> Log lines posted to Log Analytics workspace: success = $successPost, failure = $failedPost" }
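If you are not sure where the CustomerId and SharedKey come from, one way to look them up is through the Az.OperationalInsights module. Treat the snippet below as a sketch; the resource group and workspace names are placeholders.

# Assumption: Az.OperationalInsights is installed; replace the names with your own workspace
$workspace  = Get-AzOperationalInsightsWorkspace -ResourceGroupName "myResourceGroup" -Name "myWorkspace"
$CustomerId = $workspace.CustomerId
$SharedKey  = (Get-AzOperationalInsightsWorkspaceSharedKey -ResourceGroupName "myResourceGroup" -Name "myWorkspace").PrimarySharedKey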
Once the script runs, you can see it downloading, converting, and then uploading the logs.
The logs can now be queried in Log Analytics. The StorageAccountLog name comes from the LogType we set in the script; it is not generated automatically.
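From here the analysis is just ordinary Kusto queries. The example below is only an illustrative sketch run from PowerShell: it assumes the custom table shows up as StorageAccountLog_CL (the Data Collector API appends _CL to the LogType) and that fields were mapped to columns such as operation_type_s and requester_ip_address_s; check the actual table and column names in your workspace before relying on them.

# Assumptions: table name StorageAccountLog_CL and the _s column names below; verify them in your workspace
$query = @"
StorageAccountLog_CL
| summarize requests = count() by operation_type_s, http_status_code_s, requester_ip_address_s
| order by requests desc
"@
Invoke-AzOperationalInsightsQuery -WorkspaceId $CustomerId -Query $query |
    Select-Object -ExpandProperty Results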