Mapping a table of a Cassandra database using Spark and RDDs



I have to map a table in which the usage history of an application is written. The table has these tuples:

<AppId,date,cpuUsage,memoryUsage>
<AppId,date,cpuUsage,memoryUsage>
<AppId,date,cpuUsage,memoryUsage>
<AppId,date,cpuUsage,memoryUsage>
<AppId,date,cpuUsage,memoryUsage>

AppId is always different, because it is referenced by many different apps; date is expressed in the format dd/mm/yyyy hh/mm; and cpuUsage and memoryUsage are expressed in %, so for example:

<3ghffh3t482age20304, 23/07/2014 22:45, 0.2, 3.5>

I retrieve the data from Cassandra in this way (small snippet):

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.ResultSet;
import com.datastax.driver.core.Row;
import com.datastax.driver.core.Session;

public static void main(String[] args) {
    Cluster cluster;
    Session session;
    // Connect to the local Cassandra cluster
    cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
    session = cluster.connect();
    session.execute("CREATE KEYSPACE IF NOT EXISTS foo WITH replication "
            + "= {'class':'SimpleStrategy', 'replication_factor':3};");
    // Note the closing parenthesis after PRIMARY KEY, and that the
    // clustering order must name an existing column (date, not time)
    String createTableAppUsage = "CREATE TABLE IF NOT EXISTS foo.appusage "
            + "(appid text, date text, cpuusage double, memoryusage double, "
            + "PRIMARY KEY(appid, date)) "
            + "WITH CLUSTERING ORDER BY (date ASC);";
    session.execute(createTableAppUsage);
    // Use select to get the appusage table's rows
    ResultSet resultForAppUsage = session.execute("SELECT appid, cpuusage FROM foo.appusage");
    for (Row row : resultForAppUsage)
        // cpuusage is a double column, so read it with getDouble, not getString
        System.out.println("appid: " + row.getString("appid") + " cpuusage: " + row.getDouble("cpuusage"));
    // Clean up the connection by closing it
    cluster.close();
}
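For illustration, a row matching the example tuple above could be written with the same session before the SELECT runs. This is a hypothetical sketch: the values come from the question's example, reading "3,5" as the decimal 3.5:

session.execute("INSERT INTO foo.appusage (appid, date, cpuusage, memoryusage) "
        + "VALUES ('3ghffh3t482age20304', '23/07/2014 22:45', 0.2, 3.5);");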

So, my problem now is to map the data by key/value into <AppId, cpuusage> tuples, integrating this with the code above (snippet that does not work):

JavaPairRDD<String, Integer> saveTupleKeyValue =
        someStructureFromTakeData.mapToPair(new PairFunction<String, String, Integer>() {
            public Tuple2<String, Integer> call(String x) {
                return new Tuple2(x, y); // y is not defined here
            }
        });

How can I map appId and cpuusage using RDDs, and then apply a reduce/filter, e.g. cpuusage > 50?

Any help?

Assuming you have created a valid SparkContext sparkContext, added the spark-cassandra-connector dependency to your project, and configured your Spark application to talk to the Cassandra cluster (see the documentation), then we can load the data into an RDD like this:

val data = sparkContext.cassandraTable("foo", "appusage").select("appid", "cpuusage")

In Java the idea is the same, but it requires a bit more plumbing, described here.
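For reference, here is a minimal Java sketch of that plumbing, assuming the spark-cassandra-connector's Java API (com.datastax.spark.connector.japi.CassandraJavaUtil) is on the classpath; the class name AppUsageJob and the SparkConf setup are illustrative, not from the original post:

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.PairFunction;
import com.datastax.spark.connector.japi.CassandraRow;
import scala.Tuple2;

public class AppUsageJob {
    public static void main(String[] args) {
        // Tell Spark where the Cassandra cluster lives
        SparkConf conf = new SparkConf()
                .setAppName("AppUsageJob")
                .set("spark.cassandra.connection.host", "127.0.0.1");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Load only the two needed columns and map each row to an <appid, cpuusage> pair.
        // cpuusage is a double in the schema, so the value type is Double, not Integer.
        JavaPairRDD<String, Double> appCpu = javaFunctions(sc)
                .cassandraTable("foo", "appusage")
                .select("appid", "cpuusage")
                .mapToPair(new PairFunction<CassandraRow, String, Double>() {
                    public Tuple2<String, Double> call(CassandraRow row) {
                        return new Tuple2<String, Double>(row.getString("appid"),
                                row.getDouble("cpuusage"));
                    }
                });

        // Keep only the pairs whose cpuusage exceeds 50, as asked in the question
        JavaPairRDD<String, Double> heavyUsage = appCpu.filter(
                new Function<Tuple2<String, Double>, Boolean>() {
                    public Boolean call(Tuple2<String, Double> pair) {
                        return pair._2() > 50;
                    }
                });

        System.out.println(heavyUsage.collect());
        sc.stop();
    }
}

The anonymous PairFunction mirrors the snippet in the question, but it takes a CassandraRow as input and reads both fields from it, instead of relying on an undefined y.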
