zipkin-java-brave Source Code Analysis (3)

What is brave

brave is the official Java zipkin-client implementation provided by the zipkin project.

Features provided by brave

<modules>
    <module>brave-core</module>
    <module>brave-benchmarks</module>
    <module>brave-http</module>
    <module>brave-core-spring</module>
    <module>brave-resteasy-spring</module>
    <module>brave-resteasy3-spring</module>
    <module>brave-spancollector-http</module>
    <module>brave-spancollector-scribe</module>
    <module>brave-spancollector-kafka</module>
    <module>brave-spancollector-local</module>
    <module>brave-sampler-zookeeper</module>
    <module>brave-jersey</module>
    <module>brave-jersey2</module>
    <module>brave-jaxrs2</module>
    <module>brave-grpc</module>
    <module>brave-apache-http-interceptors</module>
    <module>brave-spring-web-servlet-interceptor</module>
    <module>brave-spring-resttemplate-interceptors</module>
    <module>brave-mysql</module>
    <module>brave-web-servlet-filter</module>
    <module>brave-okhttp</module>
  </modules>

brave source code analysis based on HTTP

brave-spancollector-http

Provides the HttpSpanCollector, which reports spans to zipkin over HTTP.

brave-web-servlet-filter

Provides a servlet filter for incoming HTTP requests.

brave-apache-http-interceptors

Provides interceptors for requests sent with the Apache HTTP client.

Starting up with Spring Boot

Step 1: configuration

  • SpanCollector for collecting spans
  • BraveServletFilter for incoming HTTP requests
  • Brave for sending the data
  • BraveHttpRequestInterceptor and BraveHttpResponseInterceptor for outgoing http-client requests
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.github.kristofa.brave.Brave;
import com.github.kristofa.brave.Brave.Builder;
import com.github.kristofa.brave.EmptySpanCollectorMetricsHandler;
import com.github.kristofa.brave.Sampler;
import com.github.kristofa.brave.SpanCollector;
import com.github.kristofa.brave.http.DefaultSpanNameProvider;
import com.github.kristofa.brave.http.HttpSpanCollector;
import com.github.kristofa.brave.http.HttpSpanCollector.Config;
import com.github.kristofa.brave.httpclient.BraveHttpRequestInterceptor;
import com.github.kristofa.brave.httpclient.BraveHttpResponseInterceptor;
import com.github.kristofa.brave.servlet.BraveServletFilter;

@Configuration
public class ZipkinConfig {

    // Collector for span information (a span describes one request / one hop of the call chain)
    @Bean
    public SpanCollector spanCollector() {
        Config config = HttpSpanCollector.Config.builder()
                .compressionEnabled(false) // default false: whether spans are gzipped before transport
                .connectTimeout(5000)
                .flushInterval(1)
                .readTimeout(6000)
                .build();
        return HttpSpanCollector.create("http://localhost:9411", config, new EmptySpanCollectorMetricsHandler());
    }

    // As one link in the call chain, this service only has to send data in the expected format to zipkin
    @Bean
    public Brave brave(SpanCollector spanCollector) {
        Builder builder = new Builder("service1"); // specify the serviceName
        builder.spanCollector(spanCollector);
        builder.traceSampler(Sampler.create(1)); // sampling rate
        return builder.build();
    }

    // Server-side filter (records when the server receives the request and when it
    // finishes processing and sends the result back to the client)
    @Bean
    public BraveServletFilter braveServletFilter(Brave brave) {
        BraveServletFilter filter = new BraveServletFilter(brave.serverRequestInterceptor(),
                brave.serverResponseInterceptor(), new DefaultSpanNameProvider());
        return filter;
    }

    // Client-side interceptors (record when the request is sent and when the response is received)
    @Bean
    public CloseableHttpClient httpClient(Brave brave) {
        CloseableHttpClient httpclient = HttpClients.custom()
                .addInterceptorFirst(new BraveHttpRequestInterceptor(brave.clientRequestInterceptor(), new DefaultSpanNameProvider()))
                .addInterceptorFirst(new BraveHttpResponseInterceptor(brave.clientResponseInterceptor()))
                .build();
        return httpclient;
    }

}
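
With this configuration in place, a plain Spring Boot entry point is enough to start service1. The sketch below assumes a standard project layout (the class name Service1Application is made up); because BraveServletFilter is exposed as a Filter bean, Spring Boot registers it for all request paths automatically.

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Minimal Spring Boot entry point (hypothetical class name). Component scanning
// picks up ZipkinConfig as long as it sits in the same package or below.
@SpringBootApplication
public class Service1Application {
    public static void main(String[] args) {
        SpringApplication.run(Service1Application.class, args);
    }
}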

Server-side handling of an HTTP request

POST or GET url: http://localhost/service1

For the related code, see "A Brief Introduction to zipkin and Environment Setup (1)".

Flow diagram: brave-http-collector-receive point

For an incoming request, if the Sampled header (X-B3-Sampled) is present, the ParentSpanId, TraceId and SpanId are read from the headers and returned directly; otherwise the request is treated as a new one and a new Span is built.
HttpServerRequestAdapter.getTraceData()
public TraceData getTraceData() {
    final String sampled = serverRequest.getHttpHeaderValue(BraveHttpHeaders.Sampled.getName());
    if (sampled != null) {
        if (sampled.equals("0") || sampled.toLowerCase().equals("false")) {
            return TraceData.builder().sample(false).build();
        } else {
            final String parentSpanId = serverRequest.getHttpHeaderValue(BraveHttpHeaders.ParentSpanId.getName());
            final String traceId = serverRequest.getHttpHeaderValue(BraveHttpHeaders.TraceId.getName());
            final String spanId = serverRequest.getHttpHeaderValue(BraveHttpHeaders.SpanId.getName());

            if (traceId != null && spanId != null) {
                SpanId span = getSpanId(traceId, spanId, parentSpanId);
                return TraceData.builder().sample(true).spanId(span).build();
            }
        }
    }
    return TraceData.builder().build();
}
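
For reference, these are the B3 propagation headers the adapter reads. The sketch below (the id values are made up) shows what an upstream caller puts on the wire; with BraveHttpRequestInterceptor registered, the headers are added automatically, so setting them by hand is only for illustration.

import org.apache.http.client.methods.HttpGet;

public class B3HeaderExample {
    public static void main(String[] args) {
        // Header names match BraveHttpHeaders.TraceId / SpanId / ParentSpanId / Sampled
        HttpGet get = new HttpGet("http://localhost/service1");
        get.addHeader("X-B3-TraceId", "463ac35c9f6413ad");      // id of the whole trace
        get.addHeader("X-B3-SpanId", "72485a3953bb6124");       // id of the caller's span
        get.addHeader("X-B3-ParentSpanId", "463ac35c9f6413ad"); // parent of that span
        get.addHeader("X-B3-Sampled", "1");                     // "0"/"false" would skip tracing
        System.out.println(java.util.Arrays.toString(get.getAllHeaders()));
    }
}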
Sampling of the request
traceSampler().isSampled(newTraceId); when zookeeper is not used, the decision is made by CountingSampler:
// From CountingSampler: sampleDecisions holds 100 precomputed decisions and i
// cycles through them, so the decision does not depend on the trace id itself.
public synchronized boolean isSampled(long traceIdIgnored) {
    boolean result = sampleDecisions.get(i++);
    if (i == 100) i = 0;
    return result;
}
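
sampleDecisions and i are fields of CountingSampler. The sketch below is a simplified version of the same idea (my own code, not brave's source): precompute 100 decisions, rate * 100 of which are positive, and hand them out in a cycle, so Sampler.create(1) samples every trace while Sampler.create(0.1f) samples roughly one in ten.

import java.util.BitSet;
import java.util.Random;

// Simplified counting sampler: the decision comes from a rotating set of
// precomputed outcomes, not from the trace id that is passed in.
public class SimpleCountingSampler {
    private final BitSet sampleDecisions = new BitSet(100);
    private int i = 0;

    public SimpleCountingSampler(float rate) {
        int outcomes = Math.round(rate * 100);
        Random random = new Random();
        while (sampleDecisions.cardinality() < outcomes) {
            sampleDecisions.set(random.nextInt(100)); // mark a random slot as "sampled"
        }
    }

    public synchronized boolean isSampled(long traceIdIgnored) {
        boolean result = sampleDecisions.get(i++);
        if (i == 100) i = 0; // wrap around after 100 decisions
        return result;
    }
}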

Sending a request with the Apache HTTP client

Flow diagram: brave-http-client-send

How to add your own annotation or binaryAnnotation in code

Just inject Brave directly. PS: this is not recommended, because it is intrusive to the code, and zipkin discourages attaching large amounts of data.

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.util.EntityUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ZipkinBraveController {

    @Autowired
    private CloseableHttpClient httpClient;
    @Autowired
    private com.github.kristofa.brave.Brave brave;

    @GetMapping("/service1")
    public String myboot() throws Exception {
        brave.serverTracer().submitBinaryAnnotation("status", "success");
        Thread.sleep(100); // 100ms
        HttpGet get = new HttpGet("http://localhost:81/test");
        CloseableHttpResponse execute = httpClient.execute(get);
        /*
         * 1. Before and after execute() runs, the corresponding client interceptors run (cs, cr).
         * 2. Before and after the callee processes the request, the corresponding server interceptors run (sr, ss).
         */
        return EntityUtils.toString(execute.getEntity(), "utf-8");
    }
}

PS

  • If the collector should use kafka, simply switch the SpanCollector; the server side needs the corresponding configuration.
Client side:
        KafkaSpanCollector.create(KafkaSpanCollector.Config.builder().kafkaProperties(null).build(), new EmptySpanCollectorMetricsHandler());

The server side needs the kafka configuration:
final class KafkaZooKeeperSetCondition extends SpringBootCondition {
  static final String PROPERTY_NAME = "zipkin.collector.kafka.zookeeper";

  @Override
  public ConditionOutcome getMatchOutcome(ConditionContext context, AnnotatedTypeMetadata a) {
    String kafkaZookeeper = context.getEnvironment().getProperty(PROPERTY_NAME);
    return kafkaZookeeper == null || kafkaZookeeper.isEmpty() ?
        ConditionOutcome.noMatch(PROPERTY_NAME + " isn't set") :
        ConditionOutcome.match();
  }
}
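
A minimal sketch of the server-side setting (the zookeeper address below is a placeholder): once the property checked by KafkaZooKeeperSetCondition is supplied in the zipkin server's external configuration, the condition matches and the kafka collector is enabled.

# zipkin server external configuration (address is a placeholder)
zipkin.collector.kafka.zookeeper=localhost:2181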
