[ https://issues.apache.org/jira/browse/FLINK-176?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Robert Metzger closed FLINK-176.
--------------------------------
Resolution: Fixed
--> new Java API in 0.5
> Add spark-inspired LocalExecutionContext and ClusterExecutionContext
> --------------------------------------------------------------------
>
> Key: FLINK-176
> URL: https://issues.apache.org/jira/browse/FLINK-176
> Project: Flink
> Issue Type: Improvement
> Reporter: GitHub Import
> Labels: github-import
> Fix For: pre-apache
>
>
> I would like to propose a new way to submit jobs, since I find the currently existing way too cumbersome. The idea is heavily inspired by Spark, but still valid.
> With the new way you would do something like this:
> val ec = LocalExecutionContext()
> val input = ec.textFile("<file>")
> val counts = input.flatMap { line => line.split(" ") map { word => (word, 1) } }
>   .groupBy { case (word, _) => word }
>   .reduce { (w1, w2) => (w1._1, w1._2 + w2._2) }
> counts.write("<output>")
> To run on the cluster you would do:
> val ec = ClusterExecutionContext("<job manager ip>", ...)
> ...
> The execution context would have methods to check whether the job is complete and so on.
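> For reference, a minimal sketch of what such an execution context could look like in Scala. All trait, method, and factory names below are illustrative assumptions, not part of the proposal itself:
> // Illustrative sketch only -- names are assumptions based on the proposal above.
> trait DataSet[T] {
>   def flatMap[R](f: T => TraversableOnce[R]): DataSet[R]
>   def map[R](f: T => R): DataSet[R]
>   def groupBy[K](f: T => K): GroupedDataSet[T]
>   def write(path: String): Unit
> }
> trait GroupedDataSet[T] {
>   def reduce(f: (T, T) => T): DataSet[T]
> }
> trait ExecutionContext {
>   def textFile(path: String): DataSet[String]
>   // job-control methods mentioned in the proposal
>   def isJobFinished: Boolean
>   def waitForJobCompletion(): Unit
> }
> // Hypothetical factories for the two proposed variants.
> object LocalExecutionContext {
>   def apply(): ExecutionContext = ???  // run in an embedded local runtime
> }
> object ClusterExecutionContext {
>   def apply(jobManagerAddress: String): ExecutionContext = ???  // submit to a remote JobManager
> }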
> ---------------- Imported from GitHub ----------------
> Url: https://github.com/stratosphere/stratosphere/issues/176
> Created by: [aljoscha|https://github.com/aljoscha]
> Labels: enhancement, scala api
> Assignee: [aljoscha|https://github.com/aljoscha]
> Created at: Fri Oct 18 11:25:36 CEST 2013
> State: open
--
This message was sent by Atlassian JIRA
(v6.2#6252)