Customization Conundrum: Wrangling the Sakai development lifecycle

Many schools adopt Sakai because, as an open-source platform, each campus can meet local needs by making changes to the source code. Yet when is it sensible and sustainable to make local customizations? How do you weigh and prioritize the feedback and requests of a diverse population of faculty and students? And once you've committed to making changes, how do you establish an effective and repeatable process for implementing, testing, and evaluating efficacy?

With such a vibrant development community, we don't always have to reinvent the wheel. How can we discover and leverage customizations made by other schools in a sometimes chaotic, open-source community? And how do we best manage campus expectations around new development, when just because we can make local changes doesn't mean we always should?

http://lanyrd.com/2013/apereo/schtxg/
#apereo13

Lou Rinaldi

June 04, 2013

Transcript

  1. Yale's customization development process for Sakai CLE

     Intake: Tier 1 support is contacted with a feature request or bug
     report, which is logged in the IT Service Mgmt tool (ServiceNow) and
     the issue/bug tracking tool (JIRA). Ideas are also generated from
     conferences or internal analysis. Items land on a dev priorities
     spreadsheet; periodic reviews of the items on this spreadsheet may
     reorder their prioritization over time.

     Can/should/will we do this work? Here are some research questions
     that inform our decisions.

     Feature requests:
     1. "Is this feature request recurring?" (If not, is it utilitarian?)
     2. "Is it new/unique in the Sakai community?" (If not, close the loop accordingly)
     3. "Do we have the capacity?"
     4. "Do we have the resources?"
     5. "Assuming yes to questions 1-4, what is our timeframe?"

     Bug reports:
     1. "Is the bug consistently reproducible?"
     2. "Is it impactful enough to warrant a fix?" (If not, close the loop accordingly)

     Pipeline: Proof-of-concept work happens on developers' local
     workstation(s); the Sakai CLE development environment hosts internal
     demoing, iterative tweaks, and initial QA; the Sakai CLE staging
     (test) environment hosts rigorous QA for changes intended to be
     deployed to production as-is; and periodic deployments to the Sakai
     CLE production environment proceed pending QA validation and a
     change (CHG) record.
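     The feature-request and bug-report questions above amount to a small
     decision procedure. As a minimal sketch of that triage logic, it
     could be modeled in Java, Sakai's implementation language. All type
     and field names here are hypothetical illustrations, not Yale's
     actual tooling or any Sakai API:

         // Hypothetical sketch only: these types, names, and outcomes are
         // illustrative and not part of Yale's process or any Sakai API.
         public final class CustomizationTriage {

             public enum Decision { PROCEED, DEFER, CLOSE_LOOP }

             // Answers gathered while researching a feature request.
             public record FeatureRequest(boolean recurring,
                                          boolean utilitarian,
                                          boolean uniqueInSakaiCommunity,
                                          boolean haveCapacity,
                                          boolean haveResources) { }

             // Answers gathered while researching a bug report.
             public record BugReport(boolean consistentlyReproducible,
                                     boolean impactfulEnough) { }

             public static Decision triage(FeatureRequest fr) {
                 // 1. "Is this feature request recurring?" (If not, is it utilitarian?)
                 if (!fr.recurring() && !fr.utilitarian()) return Decision.CLOSE_LOOP;
                 // 2. "Is it new/unique in the Sakai community?"
                 if (!fr.uniqueInSakaiCommunity()) return Decision.CLOSE_LOOP;
                 // 3./4. "Do we have the capacity?" / "Do we have the resources?"
                 if (!fr.haveCapacity() || !fr.haveResources()) return Decision.DEFER;
                 // 5. Yes to questions 1-4: proceed and establish a timeframe.
                 return Decision.PROCEED;
             }

             public static Decision triage(BugReport br) {
                 // 1. "Is the bug consistently reproducible?"
                 if (!br.consistentlyReproducible()) return Decision.DEFER;
                 // 2. "Is it impactful enough to warrant a fix?"
                 return br.impactfulEnough() ? Decision.PROCEED : Decision.CLOSE_LOOP;
             }
         }

     The DEFER outcome stands in for "yes, but not now" items, which in
     the process above would stay on the dev priorities spreadsheet for
     periodic re-prioritization rather than being closed outright.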