Saving data to Cassandra from the spark-shell:
import com.datastax.spark.connector._

var data = normalfill.map(line => line.split("\u0005"))
data.map(line => (line(0), line(1), line(2), line(3))).saveToCassandra("cui", "oper_ios", SomeColumns("user_no", "cust_id", "oper_code", "oper_time"))
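The parsing step above, run outside Spark, looks like this. The sample record and field layout here are hypothetical, chosen only to match the four columns being written:

```scala
// Sketch of the map stage: split a \u0005-delimited record into fields
// and pick out the four values destined for the Cassandra columns.
object ParseDemo {
  def parse(record: String): (String, String, String, String) = {
    val f = record.split("\u0005")
    (f(0), f(1), f(2), f(3))
  }

  def main(args: Array[String]): Unit = {
    // hypothetical record: user_no, cust_id, oper_code, oper_time
    println(parse("u001\u0005c42\u0005login\u00052016-01-21"))
  }
}
```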
When the target column is of type counter, the default behavior of saveToCassandra is to increment: each value written is added to the stored counter rather than overwriting it.
CREATE TABLE cui.incr (
    name text,
    count counter,
    PRIMARY KEY (name)
)
scala> var rdd = sc.parallelize(Array(("cui", 100)))
rdd: org.apache.spark.rdd.RDD[(String, Int)] = ParallelCollectionRDD[820] at parallelize at <console>:42

scala> rdd.saveToCassandra("cui", "incr", SomeColumns("name", "count"))
16/01/21 16:55:35 INFO core.Cluster: New Cassandra host /172.25.1.158:9042 added
......
 name | count
------+-------
  cui |   100
scala> var rdd = sc.parallelize(Array(("cui", 100)))
rdd: org.apache.spark.rdd.RDD[(String, Int)] = ParallelCollectionRDD[821] at parallelize at <console>:42

scala> rdd.saveToCassandra("cui", "incr", SomeColumns("name", "count"))
 name | count
------+-------
  cui |   200