Management for Security Life Cycle #appsecapac2014

OWASP Japan
March 20, 2014


Transcript

About Me
• I belong to the team that performs vulnerability verification of services and products in Cybozu, Inc.
• I work to improve our organizational security.
  – Managing vulnerability responses (POC)
  – Organizing "cybozu.com Security Challenge"
The theme of today
• Management for vulnerability life cycle
  – how to handle vulnerability information from outside contributors
[Diagram: Accept → Develop → Publish, spanning vulnerability disclosure and vulnerability handling processes]
Agenda
• History of vulnerability information handling
• Evaluation process of vulnerability information
• Bringing it in our development process
• Improvement of process
• ISO/IEC 29147 and ISO/IEC 30111
Agenda
• History of vulnerability information handling
• Evaluation process of vulnerability information
• Bringing it in our development process
• Improvement of process
• ISO/IEC 29147 and ISO/IEC 30111
History of vulnerability information handling
• Before 2006
[Diagram: Outside contributors send a report to Customer Support, which receives it and responds to the contributor; in-house, the service or product team identifies the vulnerability, verifies the report, implements measures, and issues an advisory]
History of vulnerability information handling
• Before 2006
  – There was no team to deal with vulnerability information.
  – Customer support handled vulnerability information from the outside.
  – The QA manager of a service or a product evaluated vulnerabilities according to rules of "bug priority" determined in-house.
History of vulnerability information handling
• 2006
  – Mr. Hiromitsu Takagi pointed out our problem with improperly released vulnerability information. (Reference)
  – Summary
    • We did not issue an advisory for a vulnerability we fixed.
    • Our customer support was not able to respond properly.
History of vulnerability information handling
• Problems
  – There was no team to respond to vulnerability information from outside contributors or our own employees.
  – No established process of vulnerability inspection was available.
  – There was no policy about how to publish vulnerability information.
History of vulnerability information handling
• 2006 - 2009
  – Established PSIRT
    • Product Security Incident Response Team
  – Responsible for coordination between outside contributors and product/service teams.
  – The service or product development team was still responsible for disclosure and assessment of vulnerabilities.
History of vulnerability information handling
• 2006 - 2009
[Diagram: Outside contributors report to PSIRT, which receives the report, responds to the contributor, and coordinates with JPCERT/CC; in-house, the service or product team identifies the vulnerability, verifies the report, implements measures, and issues an advisory]
History of vulnerability information handling
• 2011
  – Established CSIRT (Cy-SIRT)
  – Aggregated all vulnerability information to standardize the methods of evaluating vulnerabilities.
    • The person in charge of POC verified vulnerabilities.
    • Each team implemented countermeasures based on the results of vulnerability evaluation.
History of vulnerability information handling
• 2011
[Diagram: Outside contributors report to Cy-SIRT, which receives the report, verifies it, responds to the contributor, and coordinates with JPCERT/CC; in-house, the service or product team identifies the vulnerability, implements measures, and issues an advisory]
History of vulnerability information handling
• Many problems occurred …
  – Product teams and CSIRT could not agree on the results of vulnerability assessment.
  – The complexity of the attack was not evaluated.
  – CSIRT could not evaluate vulnerabilities of third-party software.
History of vulnerability information handling
• To solve these problems, we investigated
  – evaluation processes of vulnerability information
  – handling processes of vulnerability information
• We decided to establish a new process based on the survey results.
Agenda
• History of vulnerability information handling
• Evaluation process of vulnerability information
• Bringing it in our development process
• Improvement of process
• ISO/IEC 29147 and ISO/IEC 30111
Evaluation process of vulnerability information
• Requirements
  – To standardize the evaluation criteria for vulnerabilities
  – To make a framework to quantitatively evaluate the risk of a vulnerability
• Investigated candidates
  – OWASP Risk Methodology
  – CVSS v2
Evaluation process of vulnerability information
• We decided to use CVSS v2
  – Consistent with our purpose of standardizing vulnerability assessment
  – Many organizations are using CVSS.
  – See also References 1
Evaluation process of vulnerability information
• CVSS v2
  – Common Vulnerability Scoring System
  – A universal, open, and standardized method for rating IT vulnerabilities.
  – http://www.first.org/cvss
  – http://www.ipa.go.jp/security/vuln/CVSS.html
Evaluation process of vulnerability information
• Base Metrics
  – Access Vector (AV)
  – Access Complexity (AC), Authentication (Au)
  – Impact on Confidentiality (C), Integrity (I), Availability (A)
• Each metric takes one of three values.
  – The Base Score is calculated from them with the formula defined in the specification.
Evaluation process of vulnerability information
• The severity of the vulnerability is determined by the Base Score.
  – Level Ⅲ (HIGH): Base Score 7.0 to 10.0
  – Level Ⅱ (MEDIUM): Base Score 4.0 to 6.9
  – Level Ⅰ (LOW): Base Score 0.1 to 3.9
• Users can evaluate the risk from the Base Score released by the vendor.
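The base metrics and severity bands above can be sketched as a small calculator. The metric weights and Base Score equation below are the ones defined in the CVSS v2 specification; the severity function uses the Level Ⅲ/Ⅱ/Ⅰ bands from this slide.

```python
# CVSS v2 base-score calculator, using the metric weights and base
# equation from the CVSS v2 specification.

WEIGHTS = {
    "AV": {"L": 0.395, "A": 0.646, "N": 1.0},    # Access Vector
    "AC": {"H": 0.35, "M": 0.61, "L": 0.71},     # Access Complexity
    "Au": {"M": 0.45, "S": 0.56, "N": 0.704},    # Authentication
    "C":  {"N": 0.0, "P": 0.275, "C": 0.660},    # Confidentiality impact
    "I":  {"N": 0.0, "P": 0.275, "C": 0.660},    # Integrity impact
    "A":  {"N": 0.0, "P": 0.275, "C": 0.660},    # Availability impact
}

def base_score(av, ac, au, c, i, a):
    impact = 10.41 * (1 - (1 - WEIGHTS["C"][c])
                        * (1 - WEIGHTS["I"][i])
                        * (1 - WEIGHTS["A"][a]))
    exploitability = 20 * WEIGHTS["AV"][av] * WEIGHTS["AC"][ac] * WEIGHTS["Au"][au]
    f_impact = 0.0 if impact == 0 else 1.176
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f_impact, 1)

def severity(score):
    # The Level III/II/I bands used in the talk.
    if score >= 7.0:
        return "Level III (HIGH)"
    if score >= 4.0:
        return "Level II (MEDIUM)"
    return "Level I (LOW)"

# A typical reflected XSS vector (AV:N/AC:M/Au:N/C:N/I:P/A:N) scores 4.3.
print(base_score("N", "M", "N", "N", "P", "N"))  # 4.3
```

For example, a classic SQL injection vector (AV:N/AC:L/Au:N/C:P/I:P/A:P) evaluates to 7.5, i.e. Level Ⅲ (HIGH).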
Evaluation process of vulnerability information
• Good points
  – Many evaluation results
    • 30,000+ evaluation results were published in JVN iPedia by 2012.
  – Evaluation by an external organization
    • 19 of our own vulnerabilities were evaluated by Dec. 2011.
• Disadvantages
  – No reference materials are available other than the original.
Agenda
• History of vulnerability information handling
• Evaluation process of vulnerability information
• Bringing it in our development process
• Improvement of process
• ISO/IEC 29147 and ISO/IEC 30111
Bringing it in our development process
• Preparation
  – Investigation of our vulnerabilities
  – Development of the evaluation process
  – Bringing it in our development process
• Preparation period
  – About one month
Bringing it in our development process
• Investigation of our vulnerabilities
  – Aggregated information on all the vulnerabilities of all our products detected in 2010 - 2011.
  – Evaluated them using CVSS v2.
  – At first, several people did the evaluation, but their results were often in disagreement, so I evaluated all the vulnerabilities myself.
Bringing it in our development process
• Evaluation results by in-house standards
  – Emergency Request: 49
  – Middle: 65
• Evaluation results by CVSS v2
  – Level Ⅲ (HIGH): 17
  – Level Ⅱ (MEDIUM): 59
  – Level Ⅰ (LOW): 38
Bringing it in our development process
• Vulnerability types (CWE)
  – The point of view of a CVSS evaluation can vary depending on the type of the vulnerability.
    • Ex. XSS (AV/AC/Au/C/I/A: -/-/-/N/P/N)
• CWE (Common Weakness Enumeration)
  – aims to provide a common base to identify the type of software weakness (vulnerability).
    • http://www.ipa.go.jp/security/english/vuln/CWE_en.html
Bringing it in our development process
• Selecting CWE types
  – Types detected in our products over the past two years
  – Types adopted in external projects
    • JVN iPedia
    • IPA Web Health Check (Web 健康診断)
    • OWASP Top 10 Project
• We adopted 17 + 2 items.
  – See References 2
Bringing it in our development process
• Making evaluation guidelines
  – More than one person must be able to make a consistent evaluation.
  – Described many specific examples
    • See References 3
• Peer review
  – Based on the guidelines, two or more members review the content.
Bringing it in our development process
• Bringing it in our development process
  – Set a priority according to the severity of the vulnerability.
    • To fix severe ones first
  – Made more specific rules to raise or lower priorities.
    • An SQL injection is given a high priority if the Base Score is greater than 6.5.
    • An XSS is given a high priority if the Base Score is 5.0 or larger.
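The rules above can be sketched as a small triage function. Only the two CWE-specific thresholds (SQL injection > 6.5, XSS ≥ 5.0) come from the talk; the function name and the default mapping from CVSS severity to priority are assumptions for illustration.

```python
# Illustrative sketch of the prioritization rules described above.
# The CWE-specific thresholds are the ones quoted in the slides; the
# fallback mapping from severity band to priority is an assumption.

def fix_priority(cwe_id, base_score):
    # CWE-specific overrides: raise injection flaws to high priority earlier.
    if cwe_id == "CWE-89" and base_score > 6.5:   # SQL Injection
        return "high"
    if cwe_id == "CWE-79" and base_score >= 5.0:  # Cross-site Scripting
        return "high"
    # Otherwise follow the CVSS v2 severity bands (Level III / II / I).
    if base_score >= 7.0:
        return "high"
    if base_score >= 4.0:
        return "medium"
    return "low"

print(fix_priority("CWE-79", 5.0))  # high
print(fix_priority("CWE-20", 5.0))  # medium
```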
Bringing it in our development process
• Severity two years after the introduction
  – Level Ⅲ: 33
  – Level Ⅱ: 99
  – Level Ⅰ: 178
Bringing it in our development process
• Vulnerabilities two years after the introduction
  – CWE-264: 29%
  – CWE-79: 24%
  – CWE-20: 12%
  – CWE-89: 9%
  – CWE-399: 6%
  – CWE-200: 5%
Bringing it in our development process
• Vulnerabilities two years after the introduction
  – The following vulnerabilities account for over 70% of the total:
    • CWE-264 Permissions, Privileges, and Access Controls
    • CWE-79 Cross-site Scripting
    • CWE-20 Improper Input Validation
    • CWE-89 SQL Injection
Bringing it in our development process
• Benefits
  – Assessment of vulnerabilities was standardized.
  – It became possible to explain the seriousness of a vulnerability to the outside.
• Benefit of process improvement
  – Sharing the information with other projects.
Bringing it in our development process
• Challenges
  – Responding to vulnerabilities that cannot be evaluated with CVSS v2
    • Larger incidents occur when two or more vulnerabilities are combined.
    • Problems with authentication in one service spread the damage to other SSO-able systems.
  – Considering CVSS v3
    • Great changes in the scoring system will require a re-evaluation.
Agenda
• History of vulnerability information handling
• Evaluation process of vulnerability information
• Bringing it in our development process
• Improvement of process
• ISO/IEC 29147 and ISO/IEC 30111
Improvement of process
• 2011
[Diagram: Outside contributors report to Cy-SIRT, which receives the report, verifies it, responds to the contributor, and coordinates with JPCERT/CC; in-house, the service or product team identifies the vulnerability, implements measures, and issues an advisory]
Improvement of process
• 2011
  – We released information on a vulnerability before we finished arranging its publishing schedule with an external organization.
  – We did so ahead of schedule because we were receiving many inquiries about it.
  – Many people pointed out the unclearness in our policy about publishing countermeasures.
Improvement of process
• Problems
  – Vulnerability information was not shared between CSIRT and other teams.
  – Reference criteria for vulnerability handling had not been published.
  – Policies about disclosure were not clear.
Improvement of process
• To solve these problems
  – It was necessary to organize the vulnerability information handling workflow into a simple framework.
• At that time, JPCERT/CC provided us with the best information.
  – ISO/IEC 29147 and ISO/IEC 30111
Agenda
• History of vulnerability information handling
• Evaluation process of vulnerability information
• Bringing it in our development process
• Improvement of process
• ISO/IEC 29147 and ISO/IEC 30111
ISO/IEC 29147 / ISO/IEC 30111
• International standards on how to handle vulnerabilities
  – ISO/IEC 29147 Vulnerability disclosure
  – ISO/IEC 30111 Vulnerability handling processes
• In early 2013, both standards were at the DIS stage.
  – DIS (Draft International Standard): a stage in the development of International Standards
ISO/IEC 29147 / ISO/IEC 30111
• Image
[Diagram: Accept → Develop → Publish, spanning vulnerability disclosure (ISO/IEC 29147) and vulnerability handling processes (ISO/IEC 30111)]
ISO/IEC 29147
• Vulnerability disclosure
  – Published on 2014-02-05
  – Focuses on actions visible to the outside.
    • Statement of policy to deal with vulnerabilities
    • How to receive vulnerability information from a coordinator or vulnerability finder
    • How to publish vulnerability information along with the patches, or disclose it to particular persons
ISO/IEC 30111
• Vulnerability handling processes
  – Published on 2013-10-22
  – Focuses on handling procedures within the organization
    • Illustrates the roles and responsibilities of each department
    • Splits the handling process between services provided online and systems that the customer controls
ISO/IEC 29147 / ISO/IEC 30111
• In applying these standards in-house, we noted the following:
  – Take time.
  – Do not change the framework of the standards:
    • Vulnerability disclosure
    • Vulnerability handling processes
  – We do not aim at formal compliance with the international standards.
  46. 47.

    •  2013  and  aSer   ISO/IEC  29147   ISO/IEC  30111

    Inden?fy   vulnerability Implement   measures Respond  to   contributor   Receive   report Cy-­‐SIRT   Issue   advisory Verify   report Coordinate   JPCERT/CC Outside  contributors   In-­‐house   Service  or  product  
Use of ISO/IEC 29147
• Established a contact point for vulnerability information
  – Published the CSIRT statements
  – https://www.cybozu.com/jp/features/management/cysirt.html
• Improved the content of vulnerability information
  – Described the results of the CVSS v2 evaluation and a CVE number in our vulnerability information.
Use of ISO/IEC 29147
• Acknowledgments
  – We acknowledge people who reported a vulnerability to us if they agree to the publication of their name.
  – http://cybozu.co.jp/specialthanks.html
Use of ISO/IEC 30111
• All our teams started to use a database of vulnerability information.
  – Implemented a workflow to publish vulnerability information
  – Cy-SIRT reviews the advisory before it is published.
Use of ISO/IEC 30111
• Formulated the policy to handle vulnerabilities of third-party software
  – Collect vulnerability information about such software every day and share it with interested parties.
  – Investigate whether it affects our products and services; if it does, respond to it by the same standard as we apply to our own vulnerabilities.
  – Contact the developer if the vulnerability is due to third-party software.
ISO/IEC 29147 / ISO/IEC 30111
• Vulnerability disclosure
  – We publish an advisory if:
    • the customer needs to do something to implement the countermeasures we advise,
    • an incident that occurred in our company caused damage to the customer, or
    • an outside contributor pointed out a problem to us.
• Vulnerability handling processes
  – We stated that we provide countermeasures according to the severity of the vulnerability.
ISO/IEC 29147
• Benefits
  – Can standardize advisories
  – Can cooperate more actively with outside contributors
    • cybozu.com Security Challenge (2013-11)
    • Started to provide verification environments to contributors (2014-02)
    • Bug bounty program (2014 Q2)
ISO/IEC 29147
• Challenges for the future
  – To gain experience in receiving reports
    • Vulnerabilities that are not listed in CWE
    • Attack methods using social engineering
  – To gain experience in handling vulnerabilities of third-party software
    • To cooperate with other CERTs
ISO/IEC 30111
• Benefits
  – Clarified the criteria for responding to vulnerabilities
  – Built a vulnerability information database that is referred to by all the teams.
  – Collected and updated vulnerability information on third-party software.
ISO/IEC 30111
• Challenges for the future
  – To improve the accuracy of evaluation
    • Frequent differences between the evaluation results of external institutions and CSIRT could make our teams distrust the evaluations.
  – Which should we fix first: generic bugs or vulnerabilities?
    • Defend the security baseline.
    • Discuss when to fix it.
Summary
• We have revised our vulnerability handling processes in accordance with international standards.
  – Vulnerability disclosure
    • Improvement of our advisories (CVE / CWE)
    • Cooperation with contributors (acknowledgments / bug bounty program)
  – Vulnerability handling processes
    • CVSS v2
    • Handling policies for vulnerabilities of third-party software
Bibliography
• Information Security Early Warning Partnership Guideline (情報セキュリティ早期警戒パートナーシップガイドライン)
  – http://www.ipa.go.jp/security/ciadr/partnership_guide.html
• Overview of the Common Vulnerabilities and Exposures identifier, CVE (共通脆弱性識別子CVE概説) (IPA)
  – http://www.ipa.go.jp/security/vuln/CVE.html
• Effective approaches to vulnerability countermeasures (脆弱性対策の効果的な進め方) (IPA)
  – https://www.ipa.go.jp/files/000035701.pdf
• Materials provided by JPCERT Coordination Center (JPCERT/CC)
References 1
• OWASP Risk Methodology
  – A method for estimating the business risk posed by Web application vulnerabilities
    • Risk = Likelihood * Impact
  1. Evaluate the likelihood and impact in terms of 12-16 factors.
  2. Calculate the average values.
  3. Determine the severity of the vulnerability by mapping the evaluation results onto a matrix.
References 1
• Determining the severity of the risk
  – Low (0 to < 3)
  – Medium (3 to < 6)
  – High (6 to 9)
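The averaging and banding steps above can be sketched as follows. The factor values in the example are made up for illustration; the bands are the ones listed on this slide.

```python
# Sketch of the OWASP Risk Rating steps summarized above: average the
# likelihood factors and the impact factors (each rated 0-9), then band
# each average as LOW / MEDIUM / HIGH.

def band(avg):
    if avg < 3:
        return "LOW"
    if avg < 6:
        return "MEDIUM"
    return "HIGH"

def risk_rating(likelihood_factors, impact_factors):
    likelihood = sum(likelihood_factors) / len(likelihood_factors)
    impact = sum(impact_factors) / len(impact_factors)
    return band(likelihood), band(impact)

# Hypothetical factor scores: moderately likely, highly damaging.
print(risk_rating([6, 4, 5, 3], [8, 7, 9, 6]))  # ('MEDIUM', 'HIGH')
```

The methodology then combines the two bands in a likelihood-versus-impact matrix to obtain the overall severity.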
References 1
• Good points
  – Easy to understand the method of risk assessment
  – Can make a detailed risk assessment
• Examples of assessment items
  – Likelihood - Threat Agent Factor (Skill Level)
    • Security penetration skills (1)
    • Network and programming skills (3)
    • Advanced computer user (4)
    • Some technical skills (6)
    • No technical skills (9)
References 1
• Disadvantages
  – Poor or no record of adoption
    • The cost of explaining it to the outside is high.
    • Cannot refer to evaluation results of other organizations.
  – High cost
    • 12 to 16 items are required for evaluation.
    • Each item is rated on a scale of one to ten.
  – Difficulty in ensuring consistency
References 2
• CWE list to adopt (17 types + 2)
  – CWE-16 Configuration
  – CWE-20 Improper Input Validation
  – CWE-22 Path Traversal
  – CWE-78 OS Command Injection
  – CWE-79 Cross-site Scripting
  – CWE-89 SQL Injection
  – CWE-93 CRLF Injection
  – CWE-113 HTTP Response Splitting
  – CWE-200 Information Exposure
  – CWE-264 Permissions, Privileges, and Access Controls
References 2
• CWE list to adopt (17 types + 2)
  – CWE-287 Improper Authentication
  – CWE-352 CSRF
  – CWE-362 Race Condition
  – CWE-384 Session Fixation
  – CWE-399 Resource Management Errors
  – CWE-601 Open Redirect
  – CWE-614 Sensitive Cookie in HTTPS Session Without 'Secure' Attribute
  – CWE-Other (Others)
  – CWE-DesignError (Problem of system design)
References 3
• Guideline for CVSS v2 evaluation
• AC (Access Complexity): LOW
  – The risk becomes obvious when, for example:
    • the attack code can be registered in any parameter,
    • the value is registered via the input form,
    • values contained in a JSON string are falsified, or
    • the attack pattern is included in mail sent from the outside.