Operationalizing Ray Serve

Anyscale
April 14, 2022

In this session, we will introduce you to a new declarative REST API for Ray Serve, which allows you to configure and update your Ray Serve applications without modifying application files. Incorporate this API into your existing CI/CD process to manage applications on Ray Serve as part of your MLOps lifecycle.
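
As a rough illustration of the declarative workflow described in this session, the sketch below pushes a Serve config to the REST API over HTTP. The endpoint path and port are assumptions drawn from later Ray releases (the dashboard agent's /api/serve/deployments/ route on port 52365), and the import path and deployment names are hypothetical; consult the Ray Serve docs for the exact schema in your version.

    # Sketch only: apply a declarative Serve config via the REST API.
    # The port (52365), route (/api/serve/deployments/), and config fields are
    # assumptions based on later Ray releases; verify against your Ray version.
    import requests

    config = {
        "import_path": "my_app:graph",  # hypothetical application import path
        "deployments": [
            {"name": "Model", "num_replicas": 2},
        ],
    }

    resp = requests.put(
        "http://localhost:52365/api/serve/deployments/",
        json=config,
        timeout=30,
    )
    resp.raise_for_status()
    print("Config applied:", resp.status_code)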


Transcript

  1. Outline
     • Existing workflow for deploying Ray Serve
     • Ray Serve's new ops-friendly workflow
     • Walk-through examples from the new Serve CLI
     • Integration with deployment graphs
  2. Challenges with Config in Python
     • No source of truth
     • Configuration mixed with code
     • Tough to build custom ops tooling on top of Serve
  3. Operational Advantages
     • Structured config is the single source of truth
     • Automation: easier access to configuration options
     • Enables custom ops tooling for Serve using the new YAML config interface (see the config-generation sketch after this transcript)
  4. Ray Serve Offers the Best of Both Worlds
     • Developer: quick updates, few replicas, Python (a minimal Python-side sketch follows this transcript)
     • Operator: consistent updates, many replicas, YAML
  5. Future Plans: Improved Kubernetes Support
     • Structured config is the basis for a better Kubernetes integration
     • Easily deploy, update, and monitor Ray Serve on K8s
     • Enable automated workflows like CI/CD and continual learning (a CI/CD sketch follows this transcript)
  6. Future Plans: MLOps Integrations
     • Ray Serve is a scalable compute layer
     • Integrations with best-in-breed MLOps tooling
       ◦ Model monitoring
       ◦ Drift detection
       ◦ Experiment tracking
       ◦ Model management
  7. Please get in touch
     • Join the community
       ◦ discuss.ray.io
       ◦ github.com/ray-project/ray
       ◦ @raydistributed and @anyscalecompute
     • Fill out our survey (QR code) for:
       ◦ Feedback to help shape the future of Ray Serve
       ◦ One-on-one sessions with developers
       ◦ Updates about upcoming features
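
Config-generation sketch (referenced from slide 3). To make "structured config as the single source of truth" concrete, the script below renders a Serve config file from a Python dict so it can be checked into version control. The field names (import_path, runtime_env, deployments, num_replicas) follow the Serve config schema as we understand it, and my_app:graph plus the deployment names are hypothetical; adjust both to your application and Ray version.

    # Sketch of custom ops tooling on top of the YAML config interface:
    # render a Serve config file from structured data and check it into git,
    # so the config file, not the Python code, is the single source of truth.
    # Field names follow the Serve config schema as we understand it (assumption);
    # "my_app:graph" and the deployment names are hypothetical.
    import yaml

    def render_serve_config(num_replicas: int, path: str = "serve_config.yaml") -> None:
        config = {
            "import_path": "my_app:graph",
            "runtime_env": {"pip": ["scikit-learn"]},
            "deployments": [
                {"name": "Preprocessor", "num_replicas": 1},
                {"name": "Model", "num_replicas": num_replicas},
            ],
        }
        with open(path, "w") as f:
            yaml.safe_dump(config, f, sort_keys=False)

    if __name__ == "__main__":
        render_serve_config(num_replicas=4)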
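
Developer-side sketch (referenced from slide 4). The "quick updates, few replicas, Python" column corresponds to defining and running a deployment directly from Python. The example below is a minimal placeholder using the Ray 2.x-style API (serve.run and .bind()); it is not code from the talk.

    # Developer workflow sketch: define and run a deployment directly in Python
    # for fast iteration. Uses the Ray 2.x-style API (serve.run / .bind());
    # the Echo deployment is a placeholder, not code from the talk.
    from ray import serve
    from starlette.requests import Request

    @serve.deployment(num_replicas=2)
    class Echo:
        async def __call__(self, request: Request) -> dict:
            body = await request.json()
            return {"echo": body}

    serve.run(Echo.bind())

The operator-side equivalent is to capture the same application in a YAML config file and apply it with the Serve CLI, as in the CI/CD sketch below.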
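
CI/CD sketch (referenced from slide 5). One way to wire the config into an automated pipeline is a small script that applies the checked-in config and waits for the application to come up. It shells out to the Serve CLI's serve deploy and serve status commands (verify both against your Ray version); serve_config.yaml and the "RUNNING" string check are assumptions, not details from the talk.

    # CI/CD step sketch: apply a checked-in Serve config and wait for it to settle.
    # Assumes the `serve deploy` and `serve status` CLI commands and a "RUNNING"
    # marker in the status output; verify both against your Ray version.
    import subprocess
    import time

    def deploy_and_wait(config_path: str = "serve_config.yaml", timeout_s: int = 300) -> None:
        subprocess.run(["serve", "deploy", config_path], check=True)
        deadline = time.time() + timeout_s
        while time.time() < deadline:
            status = subprocess.run(
                ["serve", "status"], capture_output=True, text=True, check=True
            ).stdout
            if "RUNNING" in status:  # crude readiness check on the CLI output
                print("Serve application is running.")
                return
            time.sleep(10)
        raise TimeoutError("Serve application did not reach a running state in time")

    if __name__ == "__main__":
        deploy_and_wait()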