Slide 18
@doanduyhai and @aseigneurin
#SparkCassandra
Spark in practice
Scala:

// Create a SparkContext and turn a local collection into an RDD
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("myapp")
  .setMaster("local[4]")   // run locally with 4 threads
val sc = new SparkContext(conf)

val data = Array(1, 2, 3, 4, 5)
val rdd = sc.parallelize(data)

Java:

import java.util.Arrays;
import java.util.List;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf()
    .setAppName("myapp")
    .setMaster("local[4]");
JavaSparkContext sc = new JavaSparkContext(conf);

List<Integer> data = Arrays.asList(1, 2, 3, 4, 5);   // explicit generics
JavaRDD<Integer> rdd = sc.parallelize(data);
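For illustration only (not on the original slide), a minimal Scala sketch of what can be done with the RDD created above, assuming the same sc and rdd are in scope:

// Assumed example: one transformation and one action on the RDD built above
val doubled = rdd.map(_ * 2)      // transformation: lazily defines a new RDD
val sum = doubled.reduce(_ + _)   // action: triggers the computation, returns 30
println(sum)
sc.stop()                         // release the SparkContext when done

Transformations such as map are lazy; nothing runs on the cluster until an action such as reduce is called.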