Splitting an input ARFF file into smaller chunks to handle a very large dataset



I am trying to run a Weka classifier inside a mapper. Loading the entire ARFF file, even one of only about 200 MB, causes a heap space error, so I want to split the ARFF file into chunks. The catch is that each chunk must retain the ARFF header (the attribute declarations) so that the classifier can run in every mapper. Here is the code I tried for splitting the data, but I could not get it to work efficiently:

 List<InputSplit> splits = new ArrayList<InputSplit>();
        for (FileStatus file : listStatus(job)) {
            Path path = file.getPath();
            FileSystem fs = path.getFileSystem(job.getConfiguration());
            // total number of bytes in this file
            long length = file.getLen();
            BlockLocation[] blkLocations = fs.getFileBlockLocations(file, 0, length);
            // make sure this is actually a non-empty file
            if (length != 0) {
                // number of splits to make; configurable via "Run-num.splits"
                int count = job.getConfiguration().getInt("Run-num.splits", 1);
                long chunkSize = (length + count - 1) / count; // ceiling, so every byte is covered
                for (long t = 0; t < count; t++) {
                    long start = t * chunkSize;
                    long size = Math.min(chunkSize, length - start);
                    if (size <= 0) break;
                    // each split now covers a distinct byte range,
                    // instead of the whole file being added `count` times
                    splits.add(new FileSplit(path, start, size, blkLocations[0].getHosts()));
                }
            }
            else {
                // create an empty split for zero-length files
                splits.add(new FileSplit(path, 0, length, new String[0]));
            }
        }
        return splits;
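Byte-range splits alone do not solve the ARFF constraint, because only the first chunk would contain the `@relation`/`@attribute` header. One way around this is to pre-split the file into standalone ARFF chunks, each carrying a copy of the header, before handing them to the job. The sketch below (plain Java, no Hadoop dependency; `ArffChunker` and its method names are hypothetical, not part of Weka or Hadoop) shows the idea, assuming a well-formed ARFF file with a `@data` marker:

```java
import java.util.ArrayList;
import java.util.List;

public class ArffChunker {

    // Split ARFF text into roughly `count` chunks. Each chunk is prefixed
    // with the full header (everything up to and including the @data line),
    // so every chunk is a valid standalone ARFF file a mapper can load.
    public static List<String> split(String arff, int count) {
        StringBuilder header = new StringBuilder();
        List<String> dataLines = new ArrayList<String>();
        boolean inData = false;
        for (String line : arff.split("\n")) {
            if (inData) {
                if (!line.trim().isEmpty()) dataLines.add(line);
            } else {
                header.append(line).append("\n");
                if (line.trim().equalsIgnoreCase("@data")) inData = true;
            }
        }
        List<String> chunks = new ArrayList<String>();
        int perChunk = (dataLines.size() + count - 1) / count; // ceiling division
        for (int i = 0; i < dataLines.size(); i += perChunk) {
            StringBuilder chunk = new StringBuilder(header);
            int end = Math.min(i + perChunk, dataLines.size());
            for (int j = i; j < end; j++) {
                chunk.append(dataLines.get(j)).append("\n");
            }
            chunks.add(chunk.toString());
        }
        return chunks;
    }

    public static void main(String[] args) {
        String arff = "@relation demo\n@attribute x numeric\n@data\n1\n2\n3\n4\n";
        for (String c : split(arff, 2)) {
            System.out.println("---chunk---");
            System.out.print(c);
        }
    }
}
```

For files too large to hold in memory you would stream the data section instead of splitting a `String`, but the invariant is the same: replicate the header into every chunk so each mapper sees a complete ARFF dataset.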

Have you tried this first?

Add this property in mapred-site.xml:

<property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx2048m</value>
</property>

This raises the heap memory allocated to each MR job child task, which may be enough to load the whole ARFF file without splitting it.
