hadoop job control
- Running jobs sequentially: each job is configured and then submitted with JobClient.runJob(), which blocks, so the next job starts only after the previous one has finished. A fuller driver sketch follows the snippet below.
```java
JobConf job1 = new JobConf(getConf());
// ... configure mapper, reducer, input/output paths for job1 ...
JobClient.runJob(job1);   // blocks until job1 completes

JobConf job2 = new JobConf(getConf());
// ... configure job2 ...
JobClient.runJob(job2);

JobConf job3 = new JobConf(getConf());
// ... configure job3 ...
JobClient.runJob(job3);
```
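For reference, here is a minimal, self-contained sketch of what the elided configuration might look like inside a Tool driver. The job names, paths, and the choice of the stock TokenCountMapper/LongSumReducer and IdentityMapper/IdentityReducer classes are illustrative assumptions, not part of the original post; the second job simply consumes the first job's output directory.

```java
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.lib.IdentityMapper;
import org.apache.hadoop.mapred.lib.IdentityReducer;
import org.apache.hadoop.mapred.lib.LongSumReducer;
import org.apache.hadoop.mapred.lib.TokenCountMapper;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class SequentialJobDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        Path step1Out = new Path(args[1] + "_step1");   // intermediate directory (assumed)

        // Job 1: word count over the raw input, using stock library classes.
        JobConf job1 = new JobConf(getConf(), SequentialJobDriver.class);
        job1.setJobName("step1");
        job1.setMapperClass(TokenCountMapper.class);
        job1.setReducerClass(LongSumReducer.class);
        job1.setOutputKeyClass(Text.class);
        job1.setOutputValueClass(LongWritable.class);
        FileInputFormat.setInputPaths(job1, new Path(args[0]));
        FileOutputFormat.setOutputPath(job1, step1Out);
        JobClient.runJob(job1);   // returns only after job1 has finished

        // Job 2: pass-through over job1's output, just to show the chaining.
        JobConf job2 = new JobConf(getConf(), SequentialJobDriver.class);
        job2.setJobName("step2");
        job2.setMapperClass(IdentityMapper.class);
        job2.setReducerClass(IdentityReducer.class);
        job2.setOutputKeyClass(LongWritable.class);
        job2.setOutputValueClass(Text.class);
        FileInputFormat.setInputPaths(job2, step1Out);
        FileOutputFormat.setOutputPath(job2, new Path(args[1]));
        JobClient.runJob(job2);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new SequentialJobDriver(), args));
    }
}
```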
- Chaining jobs with dependencies: JobControl runs a group of wrapped jobs and honors the declared dependencies, so job3 starts only after job1 and job2 succeed, and job4 only after job3. A polling variant is sketched after the snippet below.
```java
JobConf job1 = new JobConf(getConf());
// ... configure job1 ...
JobConf job2 = new JobConf(getConf());
// ... configure job2 ...
JobConf job3 = new JobConf(getConf());
// ... configure job3 ...
JobConf job4 = new JobConf(getConf());
// ... configure job4 ...

JobControl con = new JobControl("DependencyJob");

// Wrap each JobConf in a jobcontrol Job so dependencies can be declared.
org.apache.hadoop.mapred.jobcontrol.Job wrapperJob1 = new org.apache.hadoop.mapred.jobcontrol.Job(job1);
con.addJob(wrapperJob1);
org.apache.hadoop.mapred.jobcontrol.Job wrapperJob2 = new org.apache.hadoop.mapred.jobcontrol.Job(job2);
con.addJob(wrapperJob2);
org.apache.hadoop.mapred.jobcontrol.Job wrapperJob3 = new org.apache.hadoop.mapred.jobcontrol.Job(job3);
con.addJob(wrapperJob3);

// job3 runs only after job1 and job2 have completed successfully.
wrapperJob3.addDependingJob(wrapperJob1);
wrapperJob3.addDependingJob(wrapperJob2);

org.apache.hadoop.mapred.jobcontrol.Job wrapperJob4 = new org.apache.hadoop.mapred.jobcontrol.Job(job4);
con.addJob(wrapperJob4);

// job4 runs only after job3 has completed successfully.
wrapperJob4.addDependingJob(wrapperJob3);

con.run();
```
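One caveat worth noting: JobControl.run() keeps its control loop alive even after every job has reached a terminal state, so calling it directly on the main thread never returns. The usual pattern is to start it on a separate thread and poll allFinished(). A minimal sketch reusing the con instance above; the five-second poll interval is an arbitrary choice:

```java
// Start the control loop on a background thread; run() does not exit on its own.
Thread controlThread = new Thread(con, "job-control");
controlThread.setDaemon(true);
controlThread.start();

// Poll until every wrapped job is in a terminal state.
while (!con.allFinished()) {
    Thread.sleep(5000);
}

// Report failures, then shut the control loop down.
if (!con.getFailedJobs().isEmpty()) {
    System.err.println("failed jobs: " + con.getFailedJobs());
}
con.stop();
```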
- Multiple Mappers and one Reducer: ChainMapper and ChainReducer compose several map steps and a single reduce step inside one MapReduce job, avoiding the separate jobs (and intermediate HDFS writes) that would otherwise be needed. A post-reduce mapper example follows the snippet below.
```java
// MapperClass and ReducerClass below stand in for the actual implementation classes.

// First mapper in the chain: LongWritable/Text in, Text/IntWritable out.
JobConf tmConf = new JobConf(false);
ChainMapper.addMapper(conf, MapperClass, LongWritable.class, Text.class,
        Text.class, IntWritable.class, true, tmConf);

// Second mapper: consumes the first mapper's Text/IntWritable output.
JobConf umConf = new JobConf(false);
ChainMapper.addMapper(conf, MapperClass, Text.class, IntWritable.class,
        Text.class, IntWritable.class, true, umConf);

// Single reducer at the end of the chain.
JobConf wrConf = new JobConf(false);
ChainReducer.setReducer(conf, ReducerClass, Text.class, IntWritable.class,
        Text.class, IntWritable.class, true, wrConf);

/*
JobClient client = new JobClient(conf);
RunningJob job = client.submitJob(conf);
client.monitorAndPrintJob(conf, job);
*/
RunningJob job = JobClient.runJob(conf);
```
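ChainReducer can also append further mappers after the reduce step, giving the whole job the shape [MAP+ / REDUCE MAP*]. A minimal sketch of adding one post-reduce mapper to the same conf; PostProcessMapper here is a hypothetical Mapper<Text, IntWritable, Text, IntWritable>, not something from the original post:

```java
// Append a mapper that runs on the reducer's output, still inside the same job.
JobConf ppConf = new JobConf(false);
ChainReducer.addMapper(conf, PostProcessMapper.class,
        Text.class, IntWritable.class, Text.class, IntWritable.class, true, ppConf);
```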