SPARK_HOME: /usr/local/service/spark
spark.master: yarn
spark.submit.deployMode: cluster
spark.app.name: zeppelin-spark
Upload the wordcount.txt file to the /tmp path of EMR HDFS first (a minimal upload command is sketched after the code below). hdfs://HDFS45983 is the value of the fs.defaultFS configuration item in core-site.xml.

%spark
val data = sc.textFile("hdfs://HDFS45983/tmp/wordcount.txt")
case class WordCount(word: String, count: Integer)
val result = data.flatMap(x => x.split(" ")).map(x => (x, 1)).reduceByKey(_ + _).map(x => WordCount(x._1, x._2))
result.toDF().registerTempTable("result")

%sql
select * from result
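For reference, the upload itself can be done from any cluster node with the HDFS CLI. A minimal sketch, assuming wordcount.txt sits in the local working directory:

hdfs dfs -put wordcount.txt hdfs://HDFS45983/tmp/wordcount.txt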
FLINK_HOME: /usr/local/service/flink
flink.execution.mode: yarn
%flink
val data = benv.fromElements("hello world", "hello flink", "hello hadoop")
data.flatMap(line => line.split("\\s")).map(w => (w, 1)).groupBy(0).sum(1).print()
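The paragraph above uses the batch environment (benv). As a further sketch, the same word count can be run on the streaming environment, assuming the interpreter also exposes it as senv (as the Flink Scala shell does):

%flink
val stream = senv.fromElements("hello world", "hello flink")
stream.flatMap(line => line.split("\\s")).map(w => (w, 1)).keyBy(0).sum(1).print()
senv.execute("streaming word count")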
hbase.home: /usr/local/service/hbase
hbase.ruby.sources: lib/ruby
zeppelin.hbase.test.mode: false
Since the dependencies are already placed in the /usr/local/service/zeppelin/local-repo path of the cluster, you don't need to configure them; configuration is required only if you want to use your own JAR packages.

%hbase
help 'get'

%hbase
list
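Beyond listing tables, a fuller smoke test is to write and read a cell with standard HBase shell commands; a minimal sketch, using a hypothetical table named test with one column family cf:

%hbase
create 'test', 'cf'
put 'test', 'row1', 'cf:a', 'value1'
get 'test', 'row1'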
zeppelin.livy.url: http://ip:8998
%livy.spark
sc.version

%livy.pyspark
print("1")

%livy.sparkr
hello <- function( name ) {
    sprintf( "Hello, %s", name );
}
hello("livy")
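To confirm that the Livy session can actually execute jobs rather than just report its version, a minimal sketch that sums the integers 1 through 100 (expected result: 5050):

%livy.spark
sc.parallelize(1 to 100).reduce(_ + _)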
kylin.api.url: http://ip:16500/kylin/api/query
kylin.api.user: ADMIN
kylin.api.password: KYLIN
kylin.query.project: default
%kylin(default)
select count(*) from table1
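Queries run as standard SQL against the cubes in the configured project, so aggregations work the same way; a sketch, assuming a hypothetical dimension column dim1 in table1:

%kylin(default)
select dim1, count(*) from table1 group by dim1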
default.url: jdbc:mysql://ip:3306
default.user: xxx
default.password: xxx
default.driver: com.mysql.jdbc.Driver
Since the dependencies are already placed in the /usr/local/service/zeppelin/local-repo path of the cluster, you don't need to configure them; configuration is required only if you want to use your own JAR packages.

%mysql
show databases
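A slightly richer smoke test is to query the built-in information_schema database, which counts the tables in each schema; this assumes the configured user has permission to read it:

%mysql
select table_schema, count(*) from information_schema.tables group by table_schema;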
default.url: jdbc:hive2://ip:7001
default.user: hadoop
default.password:
default.driver: org.apache.hive.jdbc.HiveDriver
Since the dependencies are already placed in the /usr/local/service/zeppelin/local-repo path of the cluster, you don't need to configure them; configuration is required only if you want to use your own JAR packages.

%hive
show databases

%hive
use default;
show tables;
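To verify that DDL also works through the interpreter, a minimal sketch that creates a hypothetical table t1 in the default database (it also serves as a target for the Presto query sketched in the next section):

%hive
use default;
create table if not exists t1 (id int, name string);
show tables;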
default.url: jdbc:presto://ip:9000?user=hadoop
default.user: hadoop
default.password:
default.driver: io.prestosql.jdbc.PrestoDriver
Since the dependencies are already placed in the /usr/local/service/zeppelin/local-repo path of the cluster, you don't need to configure them; configuration is required only if you want to use your own JAR packages.

%presto
show catalogs;

%presto
show schemas from hive;

%presto
show tables from hive.default;
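Since Presto addresses tables as catalog.schema.table, data written through the Hive interpreter is directly queryable; a sketch, assuming the hypothetical t1 table created in the Hive section above:

%presto
select * from hive.default.t1 limit 10;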