performs vulnerability verification of services and products at Cybozu, Inc.
• I work to improve our organizational security.
– Managing vulnerability responses (POC)
– Organizing the "cybozu.com Security Challenge"
– There was no team to deal with vulnerability information.
– Customer support handled vulnerability information from the outside.
– The QA manager of a service or product evaluated vulnerabilities according to rules of "bug priority" determined in-house.
Hiromitsu Takagi pointed out our problem with improperly released vulnerability information. (Reference)
– Summary
• We did not issue an advisory for a vulnerability we fixed.
• Our customer support was not able to respond properly.
was no team to respond to vulnerability information from outside contributors or our own employees.
– No established process of vulnerability inspection was available.
– There was no policy about how to publish vulnerability information.
– Established PSIRT
• Product Security Incident Response Team
– Responsible for coordination between outside contributors and product/service teams.
– The service or product development team was still responsible for disclosure and assessment of vulnerabilities.
CSIRT (Cy-SIRT)
– Aggregated all vulnerability information to standardize the methods of evaluating vulnerabilities.
• The person in charge of POC verified vulnerabilities.
• Each team implemented countermeasures based on the results of the vulnerability evaluation.
occurred …
– Product teams and CSIRT could not agree on the results of vulnerability assessment.
– The complexity of the attack was not evaluated.
– CSIRT could not evaluate vulnerabilities in third-party software.
– Access Vector (AV)
– Access Complexity (AC), Authentication (Au)
– Impact on Confidentiality (C), Integrity (I), Availability (A)
• Each item is rated on a scale of one to three
– Calculating the Base Score using an original formula.
the vulnerability is determined by the base score.
– Level III (HIGH) (Base Score 7.0 to 10.0)
– Level II (MEDIUM) (Base Score 4.0 to 6.9)
– Level I (LOW) (Base Score 0.1 to 3.9)
• Users can evaluate the risk from the base score released by the vendor.
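The base score computation and the severity mapping above can be sketched as follows. This is a minimal sketch using the metric weights and equations from the public CVSS v2 specification; the function names are illustrative.

```python
# Metric weights from the CVSS v2 specification.
AV = {"L": 0.395, "A": 0.646, "N": 1.0}    # Access Vector
AC = {"H": 0.35, "M": 0.61, "L": 0.71}     # Access Complexity
AU = {"M": 0.45, "S": 0.56, "N": 0.704}    # Authentication
CIA = {"N": 0.0, "P": 0.275, "C": 0.660}   # C/I/A impact values

def base_score(av, ac, au, c, i, a):
    """CVSS v2 base score, rounded to one decimal place."""
    impact = 10.41 * (1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a]))
    exploitability = 20 * AV[av] * AC[ac] * AU[au]
    f = 0.0 if impact == 0 else 1.176
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f, 1)

def severity(score):
    """Map a base score to the severity levels listed above."""
    if score >= 7.0:
        return "Level III (HIGH)"
    if score >= 4.0:
        return "Level II (MEDIUM)"
    return "Level I (LOW)"

# A typical reflected XSS vector (AV:N/AC:M/Au:N/C:N/I:P/A:N):
score = base_score("N", "M", "N", "N", "P", "N")
print(score, severity(score))  # 4.3 Level II (MEDIUM)
```

The 4.3 result for the XSS vector matches the commonly published CVSS v2 score for reflected cross-site scripting.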
– Many evaluation results
• 30,000+ evaluation results had been published in JVN iPedia by 2012.
– Evaluation by an external organization
• 19 of our own vulnerabilities had been evaluated by Dec. 2011.
• Disadvantages
– No reference materials are available other than the original.
of our vulnerabilities
– Aggregated information on all the vulnerabilities in all our products detected in 2010-2011.
– Evaluated them using CVSS v2
– At first, many people were working on this.
• Results were often in disagreement with each other.
• So, I evaluated all vulnerabilities myself.
types (CWE)
– The perspective of a CVSS evaluation can vary depending on the type of vulnerability
• Ex. XSS (AV/AC/Au/C/I/A: -/-/-/N/P/N)
• CWE (Common Weakness Enumeration)
– Aims to provide a common basis for identifying the type of software weakness (vulnerability).
• http://www.ipa.go.jp/security/english/vuln/CWE_en.html
evaluation guidelines
– More than one person must be able to make a consistent evaluation.
– Described many specific examples
• See Reference 3
• Peer Review
– Based on the guidelines, two or more members review the content.
it in our development process
– Set a priority according to the severity of the vulnerability.
• To fix severe ones first
– Made more specific rules to raise or lower priorities.
• An SQL Injection is given high priority if the Base Score is greater than 6.5.
• An XSS is given high priority if the Base Score is 5.0 or higher.
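The priority rules above can be sketched as a small function. Only the SQL injection (> 6.5) and XSS (>= 5.0) thresholds come from the slide; the function name, the "high"/"normal" labels, and the generic 7.0 fallback (the CVSS v2 HIGH boundary) are illustrative assumptions.

```python
def fix_priority(cwe_id: str, base_score: float) -> str:
    """Assign a fix priority from the CWE type and CVSS v2 base score.

    Thresholds for CWE-89 and CWE-79 are from the slide; the 7.0
    fallback for other types is an assumed default (CVSS v2 HIGH).
    """
    if cwe_id == "CWE-89" and base_score > 6.5:   # SQL Injection
        return "high"
    if cwe_id == "CWE-79" and base_score >= 5.0:  # Cross-site Scripting
        return "high"
    if base_score >= 7.0:                         # generic HIGH severity
        return "high"
    return "normal"

print(fix_priority("CWE-89", 6.8))  # high
print(fix_priority("CWE-79", 5.0))  # high
print(fix_priority("CWE-20", 6.8))  # normal
```

Encoding the rules this way keeps the per-type exceptions auditable next to the generic severity cutoff.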
two years after the introduction
– The following vulnerabilities account for 70% of the total
• CWE-264 Permissions, Privileges, and Access Controls
• CWE-79 Cross-site Scripting
• CWE-20 Improper Input Validation
• CWE-89 SQL Injection
– Assessment of vulnerabilities was standardized
– It became possible to explain the seriousness of a vulnerability to outside parties.
• Benefit of improved processes
– Sharing the information with other projects.
– Responding to vulnerabilities that cannot be evaluated with CVSS v2
• Larger incidents occur when two or more vulnerabilities are combined
• Problems with authentication in one service spread the damage to other SSO-enabled systems
– Considering CVSS v3
• Major changes in the assessment system will require a re-evaluation.
a vulnerability before we finished arranging its publication schedule with an external organization.
– We published it ahead of schedule because we were receiving many inquiries about it.
Many people pointed out the lack of clarity in our policy on publishing countermeasures.
how to handle vulnerabilities
– ISO/IEC 29147 Vulnerability disclosure
– ISO/IEC 30111 Vulnerability handling processes
• In early 2013, both standards were at the DIS (Draft International Standard) stage.
– A stage in the development of International Standards
2014-02-05
– Focuses on actions visible to the outside.
• Statement of the policy for dealing with vulnerabilities
• How to receive vulnerability information from a coordinator or a vulnerability finder
• How to publish vulnerability information along with patches, or disclose it to particular persons
on 2013-10-22
– Focuses on handling procedures within the organization
• Illustrates the roles and responsibilities of each department
• Splits the handling process between services provided online and systems that the customer controls
in-house, we noted the following
– Take time
– Do not change the framework of the standard
• Vulnerability disclosure
• Vulnerability handling processes
– We do not intend to comply with the international standards
point for vulnerability information
– Published the CSIRT Statements
– https://www.cybozu.com/jp/features/management/cysirt.html
• Improved the content of vulnerability information
– Described the results of the CVSS v2 evaluation and a CVE number in our vulnerability information.
to handle vulnerabilities of third-party software
– Collect vulnerability information about such software every day and share it with interested parties.
– Investigate whether it affects our products and services; if it does, respond to it by the same standard as we apply to our own.
– Contact the developer if the vulnerability is due to third-party software.
– We publish an advisory if:
• the customer needs to take action to implement the countermeasures we advise,
• an incident that occurred in our company caused damage to the customer, or
• an outside contributor pointed out a problem to us
• Vulnerability handling processes
– We stated that we provide countermeasures according to the severity of the vulnerability.
cooperate more aggressively with outside contributors
• cybozu.com Security Challenge (2013-11)
• Started to provide verification environments to contributors (2014-02)
• Bug bounty programs (2014 Q2)
gain experience in receiving reports
• Vulnerabilities that are not listed in CWE
• Attack methods using social engineering
– To gain experience in handling vulnerabilities of third-party software
• To cooperate with other CERTs
the accuracy of evaluation
• Frequent differences in evaluation results between external institutions and CSIRT could give our teams a sense of distrust
– Which should we fix first: generic bugs or vulnerabilities?
• Defend the security baseline.
• Discuss when to fix them.
the risk posed to the business by Web application vulnerabilities
• Risk = Likelihood * Impact
1. Evaluate the likelihood and impact in terms of 12-16 factors.
2. Calculate the average value
3. Measure the severity of the vulnerability by applying the evaluation results to a matrix
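The three steps above can be sketched as follows. This is a hedged sketch modeled on the OWASP Risk Rating Methodology, which this slide appears to describe; the level cutoffs and the final severity matrix are taken from that methodology, not from the slide, and the sample factor values are invented for illustration.

```python
def level(avg: float) -> str:
    """Map an averaged 0-9 factor score to a likelihood/impact level."""
    if avg < 3:
        return "LOW"
    if avg < 6:
        return "MEDIUM"
    return "HIGH"

# Overall severity matrix: (likelihood level, impact level) -> severity.
SEVERITY = {
    ("LOW", "LOW"): "Note",       ("LOW", "MEDIUM"): "Low",       ("LOW", "HIGH"): "Medium",
    ("MEDIUM", "LOW"): "Low",     ("MEDIUM", "MEDIUM"): "Medium", ("MEDIUM", "HIGH"): "High",
    ("HIGH", "LOW"): "Medium",    ("HIGH", "MEDIUM"): "High",     ("HIGH", "HIGH"): "Critical",
}

def risk(likelihood_factors, impact_factors):
    """Average each factor list, then look up the severity matrix."""
    lh = sum(likelihood_factors) / len(likelihood_factors)
    im = sum(impact_factors) / len(impact_factors)
    return SEVERITY[(level(lh), level(im))]

# Hypothetical factor ratings for one vulnerability:
print(risk([5, 7, 6, 4, 6, 8, 7, 5], [8, 9, 7, 6, 8, 7, 9, 8]))  # Critical
```

Averaging per-factor ratings before the matrix lookup is what drives the high per-evaluation cost noted on the next slide.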
adoption
• The cost of explaining to outside parties is high
• Cannot refer to the evaluation results of other organizations
– High cost
• 12 to 16 items are required for evaluation
• Each item is rated on a scale of one to ten
– Difficulty in ensuring consistency
• AC (Access Complexity): LOW
– The risk becomes obvious by
• registering attack code in any parameter
• a value registered in the input form
• falsifying values contained in a JSON string
• an attack pattern included in mail sent from the outside