
Migrate to a New Platform by Creating a Pipeline in Go (Julia Wong 2019)


GopherConAU

November 01, 2019
Transcript

  1. JULIA WONG | DEVELOPER @ ATLASSIAN
     Migrate to a new platform by creating a pipeline in Go
  2. Pipeline needs to solve:

     Trial vendors: multiple vendors, and it can't disrupt our users.
     Send in parallel: can't afford a big-bang cut-over; need time to migrate from the old to the new vendor.
     Visibility: what metrics are being sent, and how much?
  5. Reverse Proxy: Proxy, Metrics, Compression

     return &httputil.ReverseProxy{
         FlushInterval: 5,
         // Modifies the request
         Director: director(),
     }

     func director() func(req *http.Request) {
         return func(req *http.Request) {
             // Emit metrics about payload
         }
     }

  6. As above, with compression added inside the director:

     func director() func(req *http.Request) {
         return func(req *http.Request) {
             // Emit metrics about payload
             // Compress payload
         }
     }
  7. Kinesis: a stream is composed of shards, and shards can split
     and merge. Stream workers need to determine which shards to
     process; a lease associates a worker with one or more shards.
  8. Metric Proxy: JSON conversion

     type OldDatapoint struct {
         Points   []*[2]float64 `json:"points"`
         Metric   string        `json:"metric"`
         Type     string        `json:"type"`
         Interval float64       `json:"interval"`
     }

     var point OldDatapoint
     err := json.Unmarshal(data, &point)
  9. Metric Proxy: JSON conversion (continued)

     func translate(o *OldDatapoint) {
         datapoint := &newVendor.Datapoint{
             Metric: o.Metric,
             Value:  int64(o.Points[0][1]),
             Time:   time.Unix(int64(o.Points[0][0]), 0),
         }
     }
  10. [Architecture diagram: service, statsd server, metrics proxy,
      old vendor, new vendor, Kinesis stream, stream workers.
      github.com/atlassian/gostatsd]
  11. Forwarder and Aggregator
      [Diagram: service → statsd (forwarder) → statsd (aggregator);
      UDP raw metrics, protobuf consolidated metrics, JSON aggregated
      metrics.]
  12. [Diagram: both pipelines running in parallel during migration:
      old vendor, service, statsd server v1; service + statsd sidecar v2,
      statsd server v2, metrics proxy, Kinesis, stream workers, new
      vendor.]
  13. [Diagram: final state after cut-over: service + statsd sidecar v2,
      statsd server v2, metrics proxy, Kinesis, stream workers, new
      vendor.]
  14. Go learnings:

      Benchmarking and expvar: minimal work to use the tooling.
      Kinesis stream worker: Go workers aren't on par with Java workers, yet.
      httputil package: creating a reverse proxy was easy!
  17. Pipeline learnings:

      Insights and control: knowledge is power.
      Experiment safely: have a way to test changes by forking the data.
      Add an interface early: put something between you and your vendor.