Traversing HDFS with the Layui_Tree Module
Published: 2019-06-15


Note: please credit the author when reposting.

1. Entity

package com.ebd.application.common.Base;

import java.util.List;

public class HDFSDir {

	private String id;              // generated id
	private String pid;             // parent id
	private String name;            // name of the current directory
	private String alias;           // directory alias (optional)
	private String dir;             // full path below the "/" root
	private boolean spread;         // whether the node is expanded (true/false)
	private List<HDFSDir> children; // child directories

	public String getId() { return id; }
	public void setId(String id) { this.id = id; }
	public String getPid() { return pid; }
	public void setPid(String pid) { this.pid = pid; }
	public String getName() { return name; }
	public void setName(String name) { this.name = name; }
	public String getAlias() { return alias; }
	public void setAlias(String alias) { this.alias = alias; }
	public String getDir() { return dir; }
	public void setDir(String dir) { this.dir = dir; }
	public boolean isSpread() { return spread; }
	public void setSpread(boolean spread) { this.spread = spread; }
	public List<HDFSDir> getChildren() { return children; }
	public void setChildren(List<HDFSDir> children) { this.children = children; }
}
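Serialized with json-lib (the library used in the utility classes below), a tree of these nodes becomes the nested JSON that Layui's tree component reads (it looks at name, spread, and children; id and pid let the back end track parent-child links). A minimal sketch, with hard-coded ids standing in for the Identities.uuid() helper used later (that helper is not shown in this post):

import java.util.Arrays;

import com.ebd.application.common.Base.HDFSDir;

import net.sf.json.JSONObject;

public class HdfsDirJsonShape {
	public static void main(String[] args) {
		HDFSDir root = new HDFSDir();
		root.setId("1");            // the post uses Identities.uuid() for ids
		root.setName("/");
		root.setDir("/");
		root.setSpread(true);

		HDFSDir tmp = new HDFSDir();
		tmp.setId("2");
		tmp.setPid("1");
		tmp.setName("tmp");
		tmp.setAlias("tmp");
		tmp.setDir("/tmp");
		tmp.setSpread(false);

		root.setChildren(Arrays.asList(tmp));
		// prints one nested object with a children array, roughly:
		// {"children":[{"dir":"/tmp","name":"tmp",...}],"dir":"/","id":"1","name":"/","spread":true,...}
		System.out.println(JSONObject.fromObject(root));
	}
}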

2. Utility class

package hdfstest;

import java.io.IOException;
import java.net.URI;
import java.util.ArrayList;
import java.util.List;

import org.apache.commons.lang3.StringUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;

import com.ebd.application.common.Base.HDFSDir;
import com.ebd.application.common.utils.Identities;

import net.sf.json.JSONObject;

public class HdfsListTest {

	// HDFS access address
	private static final String HDFS = "hdfs://bigdata.hadoop.com:9000";

	public HdfsListTest(Configuration conf) {
		this(HDFS, conf);
	}

	public HdfsListTest(String hdfs, Configuration conf) {
		this.hdfsPath = hdfs;
		this.conf = conf;
	}

	// HDFS path
	private String hdfsPath;

	// Hadoop configuration
	private Configuration conf;

	// entry point
	public static void main(String[] args) throws IOException {
		JobConf conf = config();
//		System.out.println(conf.get("hadoop.http.staticuser.user"));
//		System.out.println(System.getenv("HADOOP_HOME"));
		HdfsListTest hdfs = new HdfsListTest(conf);
//		hdfs.mkdirs("/testput");
//		hdfs.copyFile("C:\\Users\\Administrator\\Desktop\\testput", "/testput/testput2");
//		hdfs.catFile("/testput/testput");
//		hdfs.download("/testput/testput", "E:\\");
//		hdfs.ls("hdfs://bigdata.hadoop.com:9000/user");
//		hdfs.rmr("/testput");
//		List<String> fileList = hdfs.getTree("/", "/", "|-");
		List<HDFSDir> kk = new ArrayList<HDFSDir>();
		HDFSDir ds1 = new HDFSDir();
		HDFSDir ds2 = new HDFSDir();
		HDFSDir ds3 = new HDFSDir();
		ds1.setId(Identities.uuid());
		ds1.setDir("/testput");
		ds2.setId(Identities.uuid());
		ds2.setDir("/user");
		ds3.setId(Identities.uuid());
		ds3.setDir("/tmp");
//		kk.add(ds1);
//		kk.add(ds2);
//		kk.add(ds3);
		HDFSDir ds = new HDFSDir();
		ds.setId(Identities.uuid());
		ds.setDir("/");
		kk.add(ds);
//		List<HDFSDir> fileList = hdfs.getListTree("/", "/user", 0);
		HDFSDir hdfss = hdfs.getChildNode(ds);
		JSONObject object = JSONObject.fromObject(hdfss);
		System.out.println(dirJsonFunc(object.toString()));
	}

	// load the Hadoop configuration files
	public static JobConf config() {
		JobConf conf = new JobConf(HdfsListTest.class);
		conf.setJobName("HdfsDAO");
		conf.addResource("hadoop/core-site.xml");
		conf.addResource("hadoop/hdfs-site.xml");
		conf.addResource("hadoop/mapred-site.xml");
		return conf;
	}

	// create a directory under the root
	public void mkdirs(String folder) throws IOException {
		Path path = new Path(folder);
		FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf);
		if (!fs.exists(path)) {
			fs.mkdirs(path);
			System.out.println("Create: " + folder);
		}
		fs.close();
	}

	// list the files in a directory
	public FileStatus[] ls(String folder) throws IOException {
		Path path = new Path(folder);
		FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf);
		FileStatus[] list = fs.listStatus(path);
		System.out.println("ls: " + folder);
		System.out.println("==========================================================");
		if (list != null)
			for (FileStatus f : list) {
				System.out.printf("name: %s, folder: %s, size: %d\n", f.getPath(), f.isDir(), f.getLen());
//				System.out.printf("%s, folder: %s, size: %dK\n", f.getPath().getName(), (f.isDir() ? "dir" : "file"), f.getLen() / 1024);
			}
		System.out.println("==========================================================");
		fs.close();
		return list;
	}

	public void copyFile(String local, String remote) throws IOException {
		FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf);
		// remote --- /user/a file or directory under that user
		fs.copyFromLocalFile(new Path(local), new Path(remote));
		System.out.println("copy from: " + local + " to " + remote);
		fs.close();
	}

	public void catFile(String remote) throws IOException {
		FSDataInputStream instream = null;
		FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf);
		Path path = new Path(remote);
		if (fs.isFile(path)) {
			instream = fs.open(path);
			byte[] b = new byte[1024];
			instream.read(b);
			System.out.println(new String(b, "utf-8"));
			fs.close();
		}
	}

	List<String> treeList = new ArrayList<String>();

	public List<String> getTree(String top, String remote, String prefix) throws IOException {
		Path path = new Path(remote);
		FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf);
		FileStatus[] list = fs.listStatus(path);
		if (list != null)
			for (FileStatus f : list) {
//				System.out.printf("name: %s, folder: %s, size: %d\n", f.getPath(), f.isDir(), f.getLen());
				System.out.println(prefix + f.getPath().getName());
				top += f.getPath().getName();
				treeList.add(top);
				if (fs.isDirectory(f.getPath())) {
					getTree(top, f.getPath().toString(), prefix + "-");
				}
			}
		return treeList;
	}

	int id = 0;
	static int pid = 0;
	List<HDFSDir> dirList = new ArrayList<HDFSDir>();
	HDFSDir hdfsDir = null;

	private List<HDFSDir> getListTree(String top, String remote, int pid) throws IOException {
		Path path = new Path(remote);
		FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf);
		FileStatus[] list = fs.listStatus(path);
		if (list != null)
			for (FileStatus f : list) {
				if (f.isDirectory()) {
					hdfsDir = new HDFSDir();
//					hdfsDir.setId(id++);
//					hdfsDir.setPid(pid);
					hdfsDir.setName(f.getPath().getName());
					hdfsDir.setAlias(f.getPath().getName());
					hdfsDir.setDir(f.getPath().toString().substring(HDFS.length()));
					hdfsDir.setSpread(false);
					System.out.println(f.getPath().getName() + "=" + f.getPath().toString().substring(HDFS.length()));
					dirList.add(hdfsDir);
				}
//				System.out.printf("name: %s, folder: %s, size: %d\n", f.getPath(), f.isDir(), f.getLen());
//				System.out.println(prefix + f.getPath().getName());
//				top += f.getPath().getName();
//				if (fs.isDirectory(f.getPath())) {
//					getListTree(top, f.getPath().toString(), pid++);
//				}
			}
		return dirList;
	}

	List<HDFSDir> cDirList = null;

	public HDFSDir getChildNode(HDFSDir pDir) throws IOException {
		Path path = null;
		if (pDir.getChildren() != null && pDir.getChildren().size() >= 1) {
			for (HDFSDir p : pDir.getChildren()) {
				path = new Path(p.getDir());
				FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf);
				FileStatus[] list = fs.listStatus(path);
				if (list != null) {
					cDirList = new ArrayList<HDFSDir>();
					for (FileStatus f : list) {
						if (f.isDirectory()) {
							hdfsDir = new HDFSDir();
							hdfsDir.setId(Identities.uuid());
							hdfsDir.setPid(p.getId());
							hdfsDir.setName(f.getPath().getName());
							hdfsDir.setAlias(f.getPath().getName());
							hdfsDir.setDir(f.getPath().toString().substring(HDFS.length()));
							hdfsDir.setSpread(false);
							cDirList.add(hdfsDir);
						}
					}
					p.setChildren(cDirList);
					for (HDFSDir pp : cDirList) {
						getChildNode(pp);
					}
				}
			}
		} else {
			path = new Path(pDir.getDir());
			FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf);
			FileStatus[] list = fs.listStatus(path);
			if (list != null) {
				cDirList = new ArrayList<HDFSDir>();
				for (FileStatus f : list) {
					if (f.isDirectory()) {
						hdfsDir = new HDFSDir();
						hdfsDir.setId(Identities.uuid());
						hdfsDir.setPid(pDir.getId());
						hdfsDir.setName(f.getPath().getName().equals("") ? "/" : f.getPath().getName());
						hdfsDir.setAlias(f.getPath().getName().equals("") ? "/" : f.getPath().getName());
						hdfsDir.setDir(f.getPath().toString().substring(HDFS.length()));
						hdfsDir.setSpread(false);
						cDirList.add(hdfsDir);
					}
				}
				pDir.setChildren(cDirList);
				for (HDFSDir pp : cDirList) {
					getChildNode(pp);
				}
			}
		}
		return pDir;
	}

	public static String dirJsonFunc(String jsonStr) {
		if (StringUtils.isNotBlank(jsonStr)) {
			String[] reg_array = {"([\"])", "(,['])", "([']:)"};
			String[] rpa_array = {"'", ",", ":"};
			for (int i = 0; i < reg_array.length; i++) {
				jsonStr = jsonStr.replaceAll(reg_array[i], rpa_array[i]);
			}
		}
		return jsonStr;
	}
}
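One thing worth noting: getChildNode calls FileSystem.get at every level of the recursion and never closes the handle. A leaner variant is sketched below (same field and helper names as the class above: hdfsPath, conf, HDFS, Identities; this is not the article's code) that acquires one handle up front and threads it through, since FileSystem implements Closeable:

	public HDFSDir buildTree(HDFSDir root) throws IOException {
		// one handle for the whole walk, closed when the traversal finishes
		try (FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf)) {
			fillChildren(fs, root);
		}
		return root;
	}

	private void fillChildren(FileSystem fs, HDFSDir parent) throws IOException {
		List<HDFSDir> children = new ArrayList<HDFSDir>();
		for (FileStatus f : fs.listStatus(new Path(parent.getDir()))) {
			if (f.isDirectory()) {
				HDFSDir child = new HDFSDir();
				child.setId(Identities.uuid());
				child.setPid(parent.getId());
				child.setName(f.getPath().getName());
				child.setAlias(f.getPath().getName());
				child.setDir(f.getPath().toString().substring(HDFS.length()));
				child.setSpread(false);
				children.add(child);
			}
		}
		parent.setChildren(children);
		for (HDFSDir child : children) {
			fillChildren(fs, child); // reuse the same handle down the tree
		}
	}

With a single handle the resource lifetime is explicit and nothing is left open after the tree is built.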

Conversion utility class

package test;

import org.apache.commons.lang3.StringUtils;

import com.ebd.application.common.Base.HDFSDir;
import com.ebd.application.common.utils.Identities;

import net.sf.json.JSONArray;

public class TestObjectToJson {

	public static void main(String[] args) {
		HDFSDir ds = new HDFSDir();
		ds.setId(Identities.uuid());
		ds.setDir("/testput");
		JSONArray js = JSONArray.fromObject(ds);
//		System.out.println(js.toString());

		String jsonStr = js.toString();
//		String reg_1 = "([\"])";   // double quotes to single quotes
//		String reg_2 = "(,['])";   // drop the single quote after a comma
//		String reg_3 = "([']:)";   // drop the single quote before a colon
//		String reg_4 = "('{'['])"; // drop the single quote after the opening brace
//		Pattern pattern = Pattern.compile(regEx);
//		jsonStr = jsonStr.replaceAll(reg_1, "'");
//		jsonStr = jsonStr.replaceAll(reg_2, ",");
//		jsonStr = jsonStr.replaceAll(reg_3, ":");
//		jsonStr = jsonStr.replaceAll("{'", "{");
//		System.out.println(jsonStr);
	}

	public static String dirJsonFunc(String jsonStr) {
		if (StringUtils.isNotBlank(jsonStr)) {
			String[] reg_array = {"([\"])", "(,['])", "([']:)"};
			String[] rpa_array = {"'", ",", ":"};
			for (int i = 0; i < reg_array.length; i++) {
				jsonStr = jsonStr.replaceAll(reg_array[i], rpa_array[i]);
			}
		}
		return jsonStr;
	}
}
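Tracing the three live regexes on a small input shows what each pass rewrites (a standalone sketch; the outputs in the comments follow mechanically from the patterns above):

public class DirJsonFuncTrace {
	public static void main(String[] args) {
		String json = "{\"id\":\"1\",\"name\":\"tmp\"}";
		String[] reg = {"([\"])", "(,['])", "([']:)"};
		String[] rpa = {"'", ",", ":"};
		for (int i = 0; i < reg.length; i++) {
			json = json.replaceAll(reg[i], rpa[i]);
			System.out.println(json);
		}
		// {'id':'1','name':'tmp'}  after pass 1: every double quote -> single quote
		// {'id':'1',name':'tmp'}   after pass 2: quote following a comma dropped
		// {'id:'1',name:'tmp'}     after pass 3: quote preceding a colon dropped
	}
}

The quote left dangling after the opening brace is what the commented-out reg_4 was meant to strip; note that replaceAll("{'", "{") as written would throw a PatternSyntaxException, since { is a quantifier metacharacter in Java regex and would need escaping as \\{'.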


Console utility class

package hdfstest;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.URI;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.mapred.JobConf;

import com.ebd.application.common.utils.CreateFileUtil;

public class HdfsDirConsoleTest {

	// HDFS access address
	private static final String HDFS = "hdfs://bigdata.hadoop.com:9000";

	public HdfsDirConsoleTest(Configuration conf) {
		this(HDFS, conf);
	}

	public HdfsDirConsoleTest(String hdfs, Configuration conf) {
		this.hdfsPath = hdfs;
		this.conf = conf;
	}

	// HDFS path
	private String hdfsPath;

	// Hadoop configuration
	private Configuration conf;

	// entry point
	public static void main(String[] args) throws IOException {
		JobConf conf = config();
		System.out.println(conf.get("hadoop.http.staticuser.user"));
		System.out.println(System.getenv("HADOOP_HOME"));
		HdfsDirConsoleTest hdfs = new HdfsDirConsoleTest(conf);
//		hdfs.mkdirs("/testput");
//		hdfs.copyFile("C:\\Users\\Administrator\\Desktop\\testput", "/testput/testput2");
//		hdfs.catFile("/testput/testput");
		hdfs.download("/testput/testput", "D:/ss/ss", conf);
//		hdfs.ls("hdfs://bigdata.hadoop.com:9000/");
//		hdfs.rmr("/testput");
//		List<String> fileList = hdfs.getTree("/", "/", "|-");
//		for (int i = 0; i < fileList.size(); i++) {
//			System.out.println(fileList.get(i));
//		}
	}

	// load the Hadoop configuration files
	public static JobConf config() {
		JobConf conf = new JobConf(HdfsDirConsoleTest.class);
		conf.setJobName("HdfsDAO");
		conf.addResource("hadoop/core-site.xml");
		conf.addResource("hadoop/hdfs-site.xml");
		conf.addResource("hadoop/mapred-site.xml");
		return conf;
	}

	List<String> treeList = new ArrayList<String>();

	public List<String> getTree(String top, String remote, String prefix) throws IOException {
		Path path = new Path(remote);
		FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf);
		FileStatus[] list = fs.listStatus(path);
		if (list != null)
			for (FileStatus f : list) {
//				System.out.printf("name: %s, folder: %s, size: %d\n", f.getPath(), f.isDir(), f.getLen());
				System.out.println(prefix + f.getPath().getName());
				top += f.getPath().getName();
				treeList.add(top);
				if (fs.isDirectory(f.getPath())) {
					getTree(top, f.getPath().toString(), prefix + "-");
				}
			}
		return treeList;
	}

	// delete a file or directory
	public void rmr(String folder) throws IOException {
		Path path = new Path(folder);
		FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf);
		fs.deleteOnExit(path);
		System.out.println("Delete: " + folder);
		fs.close();
	}

	// download a file to the local file system
	public void download(String remote, String local, JobConf conf) throws IOException {
//		Path path = new Path(remote);
//		FileSystem fs = FileSystem.get(URI.create(hdfsPath), conf);
//		fs.copyToLocalFile(path, new Path(local));
//		System.out.println("download: from " + remote + " to " + local);
//		fs.close();
		FileSystem fs = FileSystem.get(URI.create(remote), conf);
		FSDataInputStream fsdi = fs.open(new Path(remote));
		if (CreateFileUtil.createDir(local)) {
			OutputStream output = new FileOutputStream(local + remote.substring(remote.lastIndexOf("/")));
			IOUtils.copyBytes(fsdi, output, 4096, true);
		}
	}

	public static void makdir(String path) {
		// ensure the parent directories of the given path exist
		File file = new File(path);
		File fileParent = file.getParentFile();
		if (fileParent != null && !fileParent.exists()) {
			fileParent.mkdirs();
		}
	}
}
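CreateFileUtil is referenced above but not shown anywhere in this post. A plausible minimal stand-in is sketched here (an assumption, not the author's class): all download() needs from it is a guarantee that the target directory exists before the FileOutputStream is opened.

package com.ebd.application.common.utils;

import java.io.File;

public class CreateFileUtil {

	// Hypothetical implementation: return true when the directory
	// already exists or was created successfully.
	public static boolean createDir(String dirPath) {
		File dir = new File(dirPath);
		return dir.exists() || dir.mkdirs();
	}
}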


Reposted from: https://www.cnblogs.com/eRrsr/p/8368110.html
