Row and column counts in TwoDArrayWritable



I emit a 2-D double array from the mapper as the value and try to access it in the reducer, converting back to double in order to compute the sum of all the 2-D arrays.

public static class DoubleTwoDArrayWritable extends TwoDArrayWritable {
        public DoubleTwoDArrayWritable () { 
             super (DoubleWritable.class) ;
        }
    }
Reducer:

public class ReducerSvm extends Reducer<Text, DoubleTwoDArrayWritable, Text, Text>{
    public void reduce(Text key,Iterable<DoubleTwoDArrayWritable> values,Context context){
        System.out.println("key------"+key.toString());
        Writable [][] getArray = null;
        double C[][] = new double[3][1];
        for (DoubleTwoDArrayWritable value : values)
        {
            getArray = value.get();
            for (int i=0; i<3 ; i++ )
            {
                for (int j=0 ; j<1 ; j++ ){
                    System.out.println("v--> "+((DoubleWritable)getArray[i][j]).get());
                    C[i][j] = ((DoubleWritable)getArray[i][j]).get();
                }
            }
            System.out.println("C array");
            for (int i=0; i<3 ; i++ ){
                for (int j=0 ; j<1 ; j++ ){
                    System.out.println(C[i][j]+" ");
                }
                System.out.println("");
            }
        }
    }
}

I am able to get my double array in the reducer, but I have hardcoded my row and column counts. How do I get the row and column counts in the reducer when using TwoDArrayWritable?

Edit:

As per Balduz's suggestion, I edited my code:
public void reduce(Text key,Iterable<DoubleTwoDArrayWritable> values,Context context){
        for (DoubleTwoDArrayWritable value : values) {
            Writable[][] currentArray = value.get();
            int rowSize = currentArray.length;
            int columnSize = currentArray[0].length;
            System.out.println("row size: "+rowSize);
            double[][] myArray = new double[rowSize][columnSize];
            for (int i = 0; i < currentArray.length; i++) {
                for (int j = 0; i < currentArray[i].length; j++) {
                     myArray[i][j] = ((DoubleWritable)currentArray[i][j]).get();
                }
            }
            System.out.println("myArray array");
            for (int i=0; i<myArray.length ; i++ ){
                for (int j=0 ; j<myArray[0].length ; j++ ){
                    System.out.println(myArray[i][j]+" ");
                }
                System.out.println("");
            }
        }
}
}

I can get the correct row size, but then I get the following exception:

java.lang.ArrayIndexOutOfBoundsException: 1
    at edu.am.bigdata.svmmodel.ReducerTrail.reduce(ReducerTrail.java:26)
    at edu.am.bigdata.svmmodel.ReducerTrail.reduce(ReducerTrail.java:1)
    at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:164)
    at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:610)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:444)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:449)

First of all, please do not name your variable getArray, since it looks like a method name and causes confusion. In order to iterate over every matrix, you need to do the following:

for (DoubleTwoDArrayWritable value : values) {
    Writable[][] currentArray = value.get();
    for (int i = 0; i < currentArray.length; i++) {
        for (int j = 0; j < currentArray[i].length; j++) {
             DoubleWritable valueYouWant = (DoubleWritable)currentArray[i][j];
        }
    }
}
Edit:

In order to store the whole matrix in a single variable, I assume every row has the same number of columns. In that case, you can initialize it like this:

for (DoubleTwoDArrayWritable value : values) {
    Writable[][] currentArray = value.get();
    int rowSize = currentArray.length;
    int columnSize = currentArray[0].length;
    double[][] myArray = new double[rowSize][columnSize];
    for (int i = 0; i < currentArray.length; i++) {
        for (int j = 0; j < currentArray[i].length; j++) {
             myArray[i][j] = ((DoubleWritable)currentArray[i][j]).get();
        }
    }
}

Alternatively, expose two methods in your writable DoubleTwoDArrayWritable and call them from the reducer to get that information:

public static class DoubleTwoDArrayWritable extends TwoDArrayWritable {
    public DoubleTwoDArrayWritable () { 
         super (DoubleWritable.class) ;
    }
    public int getRow() {
        Writable[][] array2d = super.get();
        return array2d.length;
    }
    public int getColumn() {
        Writable[][] array2d = super.get();
        return array2d[0].length;
    }
}
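With those accessors, the reducer can size its local array from the writable itself instead of hardcoding 3 and 1. A hypothetical usage sketch, with the Hadoop writable stubbed by a plain holder class so the shape of the call is clear:

```java
// Stand-in for DoubleTwoDArrayWritable: wraps a 2-D array and exposes its
// dimensions, mirroring the getRow()/getColumn() methods suggested above.
public class RowColumnDemo {
    static class ArrayHolder {
        private final double[][] data;
        ArrayHolder(double[][] data) { this.data = data; }
        int getRow()     { return data.length; }
        int getColumn()  { return data[0].length; }
        double[][] get() { return data; }
    }

    public static void main(String[] args) {
        ArrayHolder value = new ArrayHolder(new double[][] {{1.0}, {2.0}, {3.0}});
        // The reducer would allocate its local array like this,
        // with no hardcoded dimensions:
        double[][] c = new double[value.getRow()][value.getColumn()];
        System.out.println(c.length + "x" + c[0].length); // prints 3x1
    }
}
```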

Hope it helps.
