

Michel Krämer
December 11, 2014

Processing Smart City Data in the Cloud with Domain-Specific Languages

In this presentation, which I gave at the Smart City Clouds Workshop co-located with the Utility and Cloud Computing Conference (UCC 2014), I talked about a novel user interface that allows domain experts such as urban planners to harness the capabilities of cloud computing. The user interface is based on Domain-Specific Languages (DSLs) that are readable and understandable even for users with no background in computer science.

My DSLs also hide the technical details of cloud computing and enable users to specify what should be done instead of how it should be done.

In this talk I focused specifically on the modelling method I use to specify new Domain-Specific Languages.


Transcript

  1. "As an urban planner I would like to update my existing 3D city
     model based on analysing recent LIDAR point clouds."
     -- Random IQmulus user
  2. 5 Identify relevant verbs
     6 Build sample DSL scripts
     7 Derive grammar
     8 Review and reiterate
  3. ANALYSE USER STORIES

     Topographic object, Urban furniture, LIDAR point cloud, Tree, Car,
     People, Roof, Cable network, Traffic light, Urban area, 3D city
     model, Rubbish bin, Bus stop, Antenna, Street edge, Image, Growth,
     Object, Bike, Facade element, ...
  4. BUILD SAMPLE SCRIPTS

     with recent PointCloud do
       exclude NonStaticObjects
       select added Trees and added FacadeElements
       add to CityModel
     end
  5. BUILD SAMPLE SCRIPTS

     apply SplineInterpolation
         with [PointCloud2010]
         using format: "RGB"
             and tolerance: 0.3
             and iterations: 6
         as surface2010

     apply TrimSurfaceWithPoints
         with [PointCloud2010] and surface2010
         using sensibility: 7
         as trimmedSurface2010
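The `using` clause above carries named parameters. As a hypothetical sketch (the deck derives the real grammar for this on the following slides), such a clause could be folded into a parameter map roughly like this; `parseUsing` is an illustrative helper, not part of the project:

```javascript
// Hypothetical helper: turn a clause such as
//   format: "RGB" and tolerance: 0.3 and iterations: 6
// into { format: "RGB", tolerance: 0.3, iterations: 6 }.
function parseUsing(clause) {
  const params = {};
  // Parameters are joined by the keyword "and".
  for (const pair of clause.split(/\s+and\s+/)) {
    const [key, raw] = pair.split(":").map((s) => s.trim());
    const num = Number(raw);
    // Numeric values become numbers; quoted values lose their quotes.
    params[key] = Number.isNaN(num) ? raw.replace(/^"|"$/g, "") : num;
  }
  return params;
}
```

A real parser would of course tokenize instead of splitting on regular expressions, but the sketch shows the intended shape of the data.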
  6. DERIVE GRAMMAR

     start = SP* statements SP*
     statements = statement ( SP+ statement )*
     statement = block / process
     block = with SP+ statements SP+ "end"
     with = "with" SP+ dataset SP+ "do"
     dataset = "recent" SP+ ID / ID
     ...
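Each rule in such a grammar corresponds to one small parsing function. A minimal hand-rolled recursive-descent sketch of the rules above, assuming whitespace-separated tokens and treating any bare word as a process (the project itself uses a parser generator rather than this hand-written code):

```javascript
// Recursive-descent sketch of the grammar subset:
//   statements = statement+ ; statement = block / process
//   block = "with" dataset "do" statements "end"
//   dataset = "recent" ID / ID
function parse(input) {
  const tokens = input.trim().split(/\s+/);
  let pos = 0;
  const peek = () => tokens[pos];
  const next = () => tokens[pos++];

  function statements() {
    const result = [];
    while (pos < tokens.length && peek() !== "end") {
      result.push(statement());
    }
    return result;
  }

  function statement() {
    // A "with" keyword opens a block; any other word is a process name.
    return peek() === "with" ? block() : { process: next() };
  }

  function block() {
    next();                                      // consume "with"
    const ds = dataset();
    if (next() !== "do") throw new Error("expected 'do'");
    const body = statements();
    if (next() !== "end") throw new Error("expected 'end'");
    return { with: ds, do: body };
  }

  function dataset() {
    if (peek() === "recent") { next(); return { recent: next() }; }
    return { id: next() };
  }

  return statements();
}
```

Usage: `parse("with recent PointCloud do cleanUp end")` yields one block whose dataset is `recent PointCloud` and whose body is the single process `cleanUp`.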
  7. DERIVE GRAMMAR (FINAL)

     {
       var astBuilder = options.astBuilder;

       function ast() {
         return astBuilder.pos(line(), column());
       }

       function extractList(list, index) {
         var result = [];
         for (var i = 0; i < list.length; ++i) {
           result.push(list[i][index]);
         }
         return result;
       }

       function buildList(first, rest, index) {
         return [first].concat(extractList(rest, index));
       }
     }

     start = workflow / empty_workflow

     empty_workflow = SP* { return ast().workflow([]); }

     workflow = SP* s:statements SP* { return ast().workflow(s); }

     statements = first:statement more:( SP+ statement )* {
         return buildList(first, more, 1);
       }

     statement = with / process_statement

     process_statement = p:process datasets:process_statement_with?
         params:process_statement_using? name:process_statement_as? {
         return ast().process(p, datasets, params, name);
       }

     process_statement_with = SP+ WITH SP+ first:dataset_expression
         more:( SP+ AND SP+ dataset_expression )* {
         return buildList(first, more, 3);
       }

     process_statement_using = SP+ USING SP+ p:params { return p; }

     process_statement_as = SP+ AS SP+ n:name { return n; }

     with = WITH SP+ ds:dataset_expression SP+ DO SP+
         s:statements SP+ END {
         return ast().with(ds, s);
       }

     dataset_expression = ds:dataset ( SP+ OF SP+ dataset )? { return ds; }

     recent = RECENT SP+ name:NAME { return ast().recent(name); }

     latest = LATEST SP+ name:NAME { return ast().latest(name); }

     dataset = recent / latest / placeholder / NAME

     params = first:param more:( SP+ AND SP+ param )* {
         return buildList(first, more, 3);
       }

     param = "{" SP* tuple_ids SP* ":" SP* tuple_expression SP* "}"
       / key:NAME SP* ":" SP* value:expression {
         return ast().param(key, value);
       }

     tuple_expression = expression ( "," SP* expression )*

     expression = placeholder / NUMBER / string / NAME

     tuple_ids = NAME ( "," SP* NAME )*

     placeholder = "[" name:NAME "]" { return ast().placeholder(name); }

     ref = n:NAME SP* "." SP* p:ref { return ast().objectRef(n, p); }
       / n:NAME { return ast().ref(n); }

     name = n:NAME { return ast().name(n); }

     string = '"' s:string_character* '"' { return text().slice(1, -1); }
       / "'" s:string_character* "'" { return text().slice(1, -1); }

     string_character = !["\\\r\n] . / "\\" ESCAPE_CHARACTER

     store_process = STORE r:( SP+ ref )? ds:( SP+ TO SP+ dataset )? {
         if (r) {
           r = r[1];
         }
         if (ds) {
           ds = ds[3];
         }
         return ast().store(ds, r);
       }

     visualize_process = VISUALIZE { return ast().visualize(); }

     apply_process = ( APPLY / CALCULATE ) SP+ name:NAME {
         return ast().apply(name);
       }

     process = apply_process / store_process / visualize_process
       / urban_process / marine_process / land_process

     add_process = ADD SP+ TO SP+ ds:dataset { return ast().add(ds); }
       / ADD SP+ r:ref SP+ TO SP+ ds:dataset { return ast().add(ds, r); }

     exclude_process = EXCLUDE ( SP+ urban_dataset_param )? SP+ name:NAME {
         return ast().exclude(name);
       }

     urban_process = add_process / exclude_process
       / JOIN SP+ NAME ( SP+ AND SP+ NAME )*
       / SELECT SP+ urban_dataset_param SP+ NAME
         ( SP+ AND SP+ urban_dataset_param SP+ NAME )*

     urban_dataset_param = ADDED / ALL / CERTAIN / DEFORMED

     marine_process = GENERATE SP+ NAME

     land_process = COLORIZE SP+ NAME SP+ dataset

     /** NAMES/IDENTIFIERS ***********/

     NAME = !ReservedWord first:[_a-zA-Z] more:NAME_MORE* {
         return first + more.join("");
       }

     NAME_MORE = [_a-zA-Z0-9]

     /** RESERVED WORDS **************/

     ReservedWord = Keyword

     Keyword = ADD / ADDED / ALL / AND / APPLY / AS / CALCULATE
       / CERTAIN / COLORIZE / DEFORMED / DO / END / EXCLUDE / GENERATE
       / JOIN / LATEST / OF / RECENT / SELECT / STORE / TO / USING
       / VISUALIZE / WITH

     /** TOKENS **********************/

     NUMBER = [0-9]+ ( "." [0-9]+ )? { return parseFloat(text()); }

     ESCAPE_CHARACTER = ['"\\bfnrtv]

     SP = [ \t\n\r]

     /** KEYWORD TOKENS **************/

     ADD        = "add"        !NAME_MORE
     ADDED      = "added"      !NAME_MORE
     ALL        = "all"        !NAME_MORE
     AND        = "and"        !NAME_MORE
     APPLY      = "apply"      !NAME_MORE
     AS         = "as"         !NAME_MORE
     CALCULATE  = "calculate"  !NAME_MORE
     CERTAIN    = "certain"    !NAME_MORE
     COLORIZE   = "colorize"   !NAME_MORE
                / "colourise"  !NAME_MORE
     DEFORMED   = "deformed"   !NAME_MORE
     DO         = "do"         !NAME_MORE
     END        = "end"        !NAME_MORE
     EXCLUDE    = "exclude"    !NAME_MORE
     GENERATE   = "generate"   !NAME_MORE
     JOIN       = "join"       !NAME_MORE
     LATEST     = "latest"     !NAME_MORE
     OF         = "of"         !NAME_MORE
     RECENT     = "recent"     !NAME_MORE
     SELECT     = "select"     !NAME_MORE
     STORE      = "store"      !NAME_MORE
     TO         = "to"         !NAME_MORE
     USING      = "using"      !NAME_MORE
     VISUALIZE  = "visualize"  !NAME_MORE
                / "visualise"  !NAME_MORE
     WITH       = "with"       !NAME_MORE
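The grammar's initializer defines two list helpers, extractList and buildList, used throughout the actions. Reproduced standalone here (with modern syntax) to show what they do: a repetition such as `( SP+ AND SP+ param )*` yields an array of match arrays, and the element of interest sits at a fixed index in each inner array.

```javascript
// Collect the element at `index` from each inner match array.
function extractList(list, index) {
  const result = [];
  for (let i = 0; i < list.length; ++i) {
    result.push(list[i][index]);
  }
  return result;
}

// Prepend the first match to the elements extracted from the rest.
function buildList(first, rest, index) {
  return [first].concat(extractList(rest, index));
}
```

For `params = first:param more:( SP+ AND SP+ param )*`, each match of the group looks like `[" ", "and", " ", param]`, so `buildList(first, more, 3)` flattens the whole clause into a plain array of params.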
  8. CONCLUSION

     Cloud is scalable! Current UIs are too complicated.
     Cloud is easy! Current solutions are not scalable.