Apache Flink docker-compose trial run

Apache Flink is a stream-processing framework. The project publishes an official Docker image and also documents how to run it with docker-compose.

docker-compose file
version: "2.1"
services:
  jobmanager:
    image: flink
    expose:
      - "6123"
    ports:
      - "8081:8081"
    command: jobmanager
    environment:
      - JOB_MANAGER_RPC_ADDRESS=jobmanager
  taskmanager:
    image: flink
    expose:
      - "6121"
      - "6122"
    depends_on:
      - jobmanager
    command: taskmanager
    links:
      - "jobmanager:jobmanager"
    environment:
      - JOB_MANAGER_RPC_ADDRESS=jobmanager
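In this compose file the jobmanager container exposes 6123 (the JobManager RPC port) to the linked taskmanager and publishes 8081, the Flink web UI, on the host. The JOB_MANAGER_RPC_ADDRESS environment variable tells the official image which host name the JobManager is reachable under, so the TaskManager can register with it.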
Run
docker-compose up -d
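Once the stack is up, the web UI mapped above should answer at http://localhost:8081. A quick sanity check, using the service names from the compose file above (the scale step is optional):

docker-compose ps                     # jobmanager and taskmanager should both be "Up"
docker-compose logs -f jobmanager     # follow the JobManager logs
docker-compose scale taskmanager=2    # optionally start a second TaskManager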
Result

Write a simple job

Using Maven

  • Generate the project scaffold

    Enter the groupId and other information when prompted.

mvn archetype:generate \
  -DarchetypeGroupId=org.apache.flink \
  -DarchetypeArtifactId=flink-quickstart-java \
  -DarchetypeVersion=1.6.0
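The article does not spell out the values entered at the prompts; judging from the package name in the code (com.dalong.app) and the cd flink-app build step later on, they appear to be groupId com.dalong and artifactId flink-app. Under that assumption (and taking the usual 1.0-SNAPSHOT default for the version), the project can also be generated non-interactively:

mvn archetype:generate \
  -DarchetypeGroupId=org.apache.flink \
  -DarchetypeArtifactId=flink-quickstart-java \
  -DarchetypeVersion=1.6.0 \
  -DgroupId=com.dalong \
  -DartifactId=flink-app \
  -Dversion=1.0-SNAPSHOT \
  -Dpackage=com.dalong.app \
  -DinteractiveMode=false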
  • Modify the code
Default project structure:

├── pom.xml
└── src
    └── main
        ├── java
        │   └── com
        │       └── dalong
        │           └── app
        │               ├── BatchJob.java
        │               └── StreamingJob.java
        └── resources
            └── log4j.properties

Add a simple batch job (a WordCount). Project structure after the change:

├── pom.xml
└── src
    └── main
        ├── java
        │   └── com
        │       └── dalong
        │           └── app
        │               ├── BatchJob.java
        │               ├── StreamingJob.java
        │               └── util
        │                   └── WordCountData.java
        └── resources
            └── log4j.properties

The code:

BatchJob.java

package com.dalong.app;

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.core.fs.FileSystem.WriteMode;
import org.apache.flink.util.Collector;

import com.dalong.app.util.WordCountData;

public class BatchJob {

    public static class Word {
        // fields
        private String word;
        private int frequency;

        // constructors
        public Word() {}

        public Word(String word, int i) {
            this.word = word;
            this.frequency = i;
        }

        // getters / setters
        public String getWord() {
            return word;
        }

        public void setWord(String word) {
            this.word = word;
        }

        public int getFrequency() {
            return frequency;
        }

        public void setFrequency(int frequency) {
            this.frequency = frequency;
        }

        @Override
        public String toString() {
            return "Word=" + word + " freq=" + frequency;
        }
    }

    public static void main(String[] args) throws Exception {
        final ParameterTool params = ParameterTool.fromArgs(args);

        // set up the execution environment
        final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // make parameters available in the web interface
        env.getConfig().setGlobalJobParameters(params);

        // get input data
        DataSet<String> text;
        if (params.has("input")) {
            // read the text file from the given input path
            text = env.readTextFile(params.get("input"));
        } else {
            // get default test text data
            System.out.println("Executing WordCount example with default input data set.");
            System.out.println("Use --input to specify file input.");
            text = WordCountData.getDefaultTextLineDataSet(env);
        }

        DataSet<Word> counts =
                // split up the lines into Word objects (with frequency = 1)
                text.flatMap(new Tokenizer())
                        // group by the field word and sum up the frequency
                        .groupBy("word")
                        .reduce(new ReduceFunction<Word>() {
                            @Override
                            public Word reduce(Word value1, Word value2) throws Exception {
                                return new Word(value1.word, value1.frequency + value2.frequency);
                            }
                        });

        if (params.has("output")) {
            counts.writeAsText(params.get("output"), WriteMode.OVERWRITE);
            // execute program
            env.execute("WordCount-Pojo Example");
        } else {
            System.out.println("Printing result to stdout. Use --output to specify output path.");
            counts.print();
        }
    }

    public static final class Tokenizer implements FlatMapFunction<String, Word> {
        @Override
        public void flatMap(String value, Collector<Word> out) {
            // normalize and split the line
            String[] tokens = value.toLowerCase().split("\\W+");

            // emit the pairs
            for (String token : tokens) {
                if (token.length() > 0) {
                    out.collect(new Word(token, 1));
                }
            }
        }
    }
}

util/WordCountData.java

package com.dalong.app.util;

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class WordCountData {

    public static final String[] WORDS = new String[] {
            "To be, or not to be,--that is the question:--",
            "Whether 'tis nobler in the mind to suffer",
            "The slings and arrows of outrageous fortune",
            "Or to take arms against a sea of troubles,",
            "And by opposing end them?--To die,--to sleep,--",
            "No more; and by a sleep to say we end",
            "The heartache, and the thousand natural shocks",
            "That flesh is heir to,--'tis a consummation",
            "Devoutly to be wish'd. To die,--to sleep;--",
            "To sleep! perchance to dream:--ay, there's the rub;",
            "For in that sleep of death what dreams may come,",
            "When we have shuffled off this mortal coil,",
            "Must give us pause: there's the respect",
            "That makes calamity of so long life;",
            "For who would bear the whips and scorns of time,",
            "The oppressor's wrong, the proud man's contumely,",
            "The pangs of despis'd love, the law's delay,",
            "The insolence of office, and the spurns",
            "That patient merit of the unworthy takes,",
            "When he himself might his quietus make",
            "With a bare bodkin? who would these fardels bear,",
            "To grunt and sweat under a weary life,",
            "But that the dread of something after death,--",
            "The undiscover'd country, from whose bourn",
            "No traveller returns,--puzzles the will,",
            "And makes us rather bear those ills we have",
            "Than fly to others that we know not of?",
            "Thus conscience does make cowards of us all;",
            "And thus the native hue of resolution",
            "Is sicklied o'er with the pale cast of thought;",
            "And enterprises of great pith and moment,",
            "With this regard, their currents turn awry,",
            "And lose the name of action.--Soft you now!",
            "The fair Ophelia!--Nymph, in thy orisons",
            "Be all my sins remember'd."
    };

    public static DataSet<String> getDefaultTextLineDataSet(ExecutionEnvironment env) {
        return env.fromElements(WORDS);
    }
}
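Two things worth noting about BatchJob: groupBy("word") can reference the field by name because Word follows Flink's POJO rules (a public class with a public no-argument constructor and getters/setters for every field), and the --input/--output arguments are read through ParameterTool, so the same jar can run against a real file or fall back to the bundled Hamlet text and print to stdout.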
  • Build
cd flink-app
mvn clean package
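The build writes the job jar under target/; with the artifactId assumed above it would be roughly target/flink-app-1.0-SNAPSHOT.jar, packaged by the quickstart pom so that it can be submitted to the cluster as-is.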
  • Submit the job

    The generated code is quite simple; a command-line submission sketch is given below.
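One way to submit the built jar, as a sketch assuming the compose service names above and the jar name assumed in the build step, is to copy it into the jobmanager container and use the flink CLI; alternatively, upload it through "Submit new Job" in the web UI at http://localhost:8081.

# jar name assumes artifactId flink-app and version 1.0-SNAPSHOT
docker cp target/flink-app-1.0-SNAPSHOT.jar \
  "$(docker-compose ps -q jobmanager)":/job.jar
docker-compose exec jobmanager flink run -c com.dalong.app.BatchJob /job.jar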

  • Run the job
  • Result

References

https://hub.docker.com/r/library/flink/
https://github.com/apache/flink/blob/master/flink-examples/flink-examples-batch/src/main/java/org/apache/flink/examples/java/wordcount/
https://github.com/rongfengliang/flink-docker-compose-demo