陈梓立 created FLINK-9162:
--------------------------
Summary: Scala REPL hanging when running example
Key: FLINK-9162
URL: https://issues.apache.org/jira/browse/FLINK-9162
Project: Flink
Issue Type: Bug
Components: Scala Shell
Affects Versions: 1.5.1
Environment: {code:java}
➜ build-target git:(master) scala -version
Scala code runner version 2.12.4 -- Copyright 2002-2017, LAMP/EPFL and Lightbend, Inc.
➜ build-target git:(master) java -version
java version "1.8.0_161"
Java(TM) SE Runtime Environment (build 1.8.0_161-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.161-b12, mixed mode)
{code}
Flink built from the latest SNAPSHOT on GitHub master
Reporter: 陈梓立
{code:java}
➜ build-target git:(master) bin/start-cluster.sh
Starting cluster.
Starting standalonesession daemon on host localhost.
Starting taskexecutor daemon on host localhost.
➜ build-target git:(master) bin/start-scala-shell.sh local
Starting Flink Shell:
log4j:WARN No appenders could be found for logger (org.apache.flink.configuration.GlobalConfiguration).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Starting local Flink cluster (host: localhost, port: 49592).
Connecting to Flink cluster (host: localhost, port: 49592).
...
scala> val dataStream = senv.fromElements(1, 2, 3, 4)
dataStream: org.apache.flink.streaming.api.scala.DataStream[Int] = org.apache.flink.streaming.api.scala.DataStream@6b576ff8
scala> dataStream.countWindowAll(2).sum(0).print()
res0: org.apache.flink.streaming.api.datastream.DataStreamSink[Int] = org.apache.flink.streaming.api.datastream.DataStreamSink@304e1e4e
scala> val text = benv.fromElements(
| "To be, or not to be,--that is the question:--",
| "Whether 'tis nobler in the mind to suffer",
| "The slings and arrows of outrageous fortune",
| "Or to take arms against a sea of troubles,")
text: org.apache.flink.api.scala.DataSet[String] = org.apache.flink.api.scala.DataSet@1237aa73
scala> val counts = text.flatMap { _.toLowerCase.split("\\W+") }.map { (_, 1) }.groupBy(0).sum(1)
counts: org.apache.flink.api.scala.AggregateDataSet[(String, Int)] = org.apache.flink.api.scala.AggregateDataSet@7dbf92aa
scala> counts.print()
<Hanging>{code}
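For reference, the same word-count pipeline as a standalone batch job (a minimal sketch; the object name WordCountRepro and the idea of submitting it as a packaged job are my assumptions, not part of the shell session above) may help narrow down whether the hang is specific to the Scala shell's embedded execution environment:
{code:scala}
import org.apache.flink.api.scala._

// Same dataset and transformations as in the shell session above.
object WordCountRepro {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val text = env.fromElements(
      "To be, or not to be,--that is the question:--",
      "Whether 'tis nobler in the mind to suffer",
      "The slings and arrows of outrageous fortune",
      "Or to take arms against a sea of troubles,")
    val counts = text
      .flatMap { _.toLowerCase.split("\\W+") }
      .map { (_, 1) }
      .groupBy(0)
      .sum(1)
    // print() triggers execution and collects the result to the client;
    // this is the call that hangs in the Scala shell.
    counts.print()
  }
}
{code}
If this standalone job completes while the same counts.print() call hangs in the shell, that would point at the shell's job submission/execution path rather than the batch API itself.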