[VOTE] Release 1.12.0, release candidate #3


[VOTE] Release 1.12.0, release candidate #3

Robert Metzger
Hi everyone,

We have resolved the licensing issue Chesnay found.

Please review and vote on the release candidate #3 for the version 1.12.0,
as follows:

[ ] +1, Approve the release
[ ] -1, Do not approve the release (please provide specific comments)


The complete staging area is available for your review, which includes:
* JIRA release notes [1a], and website release notes [1b]
* the official Apache source release and binary convenience releases to be
deployed to dist.apache.org [2], which are signed with the key with
fingerprint D9839159 [3],
* all artifacts to be deployed to the Maven Central Repository [4],
* source code tag "release-1.12.0-rc3" [5]

We will soon publish the PR for the release announcement blog post!

The vote will be open for at least 72 hours. It is adopted by majority
approval, with at least 3 PMC affirmative votes.

Thanks,
Dian & Robert

[1a]
https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12348263
[1b] https://github.com/apache/flink/pull/14195
[2] https://dist.apache.org/repos/dist/dev/flink/flink-1.12.0-rc3/
[3] https://dist.apache.org/repos/dist/release/flink/KEYS
[4] https://repository.apache.org/content/repositories/orgapacheflink-1404
[5] https://github.com/apache/flink/releases/tag/release-1.12.0-rc3

Re: [VOTE] Release 1.12.0, release candidate #3

Robert Metzger
+1 (binding)


Checks:
- checksums seem correct (see the verification sketch after this list)
- source archive code compiles
- Compiled a test job against the staging repository
- launched a standalone cluster, ran some test jobs against it
- quickstart contains correct version
- regular jars contain correct NOTICE file
- Looked a bit over the output of
     git diff release-1.11.2...release-1.12 --  "**/pom.xml"
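
For anyone who wants to repeat the checksum/signature checks, a minimal
sketch (the exact artifact names under [2] are assumptions):

    # fetch the staged artifacts [2] and the KEYS file [3]
    svn export https://dist.apache.org/repos/dist/dev/flink/flink-1.12.0-rc3/ flink-1.12.0-rc3
    wget https://dist.apache.org/repos/dist/release/flink/KEYS && gpg --import KEYS

    # verify the GPG signature and the SHA-512 checksum of the source release
    cd flink-1.12.0-rc3
    gpg --verify flink-1.12.0-src.tgz.asc flink-1.12.0-src.tgz
    sha512sum -c flink-1.12.0-src.tgz.sha512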



I noticed that at least one more jar file contains an invalid LICENSE file
in its root. This was already the case with Flink 1.11, and from the
context (it is an Apache Flink jar, and all the other license and notice
files refer to this being an Apache project) it should be clear that the
license file is not meant to cover the whole jar's contents.
I will still extend the automated LicenseChecker to catch this, but I
don't want to cancel the release because of it.

Re: [VOTE] Release 1.12.0, release candidate #3

Robert Metzger
There's now a pull request for the announcement blog post, please help
checking it: https://github.com/apache/flink-web/pull/397

Re: [VOTE] Release 1.12.0, release candidate #3

Xingbo Huang
+1 (non-binding)

Checks:
1. verified checksums and signatures
2. build Flink with Scala 2.11
3. pip install pyflink on MacOS/CentOS under py35, py36, py37, py38 (see the sketch after this list)
4. test Pandas UDAF/General UDAF/Python DataStream MapFunction
5. start standalone cluster and submit a python udf job.
6. verified NOTICE/LICENSE files of some regular modules
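
As a rough sketch of how check 3 can be reproduced (the python/ staging
path and the package file name are assumptions):

    python3.7 -m venv rc3-venv && source rc3-venv/bin/activate
    # install the staged PyFlink sdist from the RC directory [2]
    pip install https://dist.apache.org/repos/dist/dev/flink/flink-1.12.0-rc3/python/apache-flink-1.12.0.tar.gz
    # sanity check that the package imports
    python -c "import pyflink; print(pyflink.__file__)"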

I observed that the NOTICE file of flink-sql-connector-hbase-2.2 lists 3
dependencies that are not actually bundled:
commons-lang:commons-lang:2.6
org.apache.hbase:hbase-hadoop-compat:2.2.3
org.apache.hbase:hbase-hadoop2-compat:2.2.3

I guess listing extra Apache-licensed dependencies shouldn't be a
blocker issue. I have opened a PR [1] to fix it.

[1] https://github.com/apache/flink/pull/14299
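
For reference, such a mismatch can be spotted by comparing the NOTICE
entries against the jar contents, roughly like this (the exact jar name is
an assumption):

    unzip -p flink-sql-connector-hbase-2.2_2.11-1.12.0.jar META-INF/NOTICE | grep commons-lang
    # if the dependency were really bundled, its classes would show up here
    unzip -l flink-sql-connector-hbase-2.2_2.11-1.12.0.jar | grep 'org/apache/commons/lang/' || echo "not bundled"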

Best,
Xingbo

Re: [VOTE] Release 1.12.0, release candidate #3

Rui Li
+1 (non-binding)

Built from source and verified the hive connector tests for different hive
versions (a sketch of the test command follows below).
Set up a cluster to connect to a real hive warehouse and ran some queries
successfully.
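
In case someone wants to rerun the hive connector tests, a sketch of the
kind of command used (the hive profile names are assumptions):

    # from the extracted source release or the release-1.12.0-rc3 tag
    mvn clean verify -pl flink-connectors/flink-connector-hive -am -Phive-2.2.0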

--
Best regards!
Rui Li

Re: [VOTE] Release 1.12.0, release candidate #3

Xintong Song
+1 (non-binding)

   - Verified checksums and signatures
   - No binaries found in source archive
   - Build from source
   - Tried a couple of example jobs in various deployment modes (see the command sketch after this list)
      - Local
      - Standalone
      - Native Kubernetes Application
      - Native Kubernetes Session
      - Yarn Job
      - Yarn Session
   - Changing memory configurations, things work as expected
   - UI looks good
   - Logs look good
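
A sketch of the commands behind the standalone, YARN and K8s checks above,
run from the extracted binary distribution (cluster id and image name are
placeholders):

    # standalone
    ./bin/start-cluster.sh
    ./bin/flink run examples/streaming/WordCount.jar

    # YARN per-job
    ./bin/flink run -t yarn-per-job examples/streaming/WordCount.jar

    # native Kubernetes application mode
    ./bin/flink run-application -t kubernetes-application \
        -Dkubernetes.cluster-id=rc3-test \
        -Dkubernetes.container.image=flink:1.12.0-rc3 \
        local:///opt/flink/examples/streaming/WordCount.jar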



Thank you~

Xintong Song



Re: [VOTE] Release 1.12.0, release candidate #3

Yang Wang
+1 (non-binding)

* Build from source
* Deploy Flink cluster in the following deployment modes with HA enabled
(ZooKeeper and K8s), including killing the JobManager and checking failover
(see the config sketch after this list)
  * Native K8s Session
  * Native K8s Application
  * Yarn Session
  * Yarn Per-Job
  * Yarn Application
* Check webUI and logs in different deployments especially via `kubectl
logs` in K8s
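
For context, a sketch of the flink-conf.yaml settings behind such an HA
setup (the factory class name and the storage paths are assumptions based
on the 1.12 docs):

    # ZooKeeper HA
    high-availability: zookeeper
    high-availability.zookeeper.quorum: zk1:2181,zk2:2181,zk3:2181
    high-availability.storageDir: hdfs:///flink/ha

    # Kubernetes HA services (new in 1.12)
    high-availability: org.apache.flink.kubernetes.highavailability.KubernetesHaServicesFactory
    kubernetes.cluster-id: rc3-ha-test
    high-availability.storageDir: s3://my-bucket/flink/ha

    # then kill the JobManager pod and watch the failover, e.g.
    # kubectl delete pod <jobmanager-pod> && kubectl logs -f <new-jobmanager-pod>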

Best,
Yang

Re: [VOTE] Release 1.12.0, release candidate #3

Wei Zhong-2
+1 (non-binding)

- verified checksums and signatures
- build Flink with Scala 2.11
- pip install pyflink on Windows python 3.7
- run a python job with udfs on Windows
- pyflink shell works well on local mode and remote mode

Best,
Wei

Re: [VOTE] Release 1.12.0, release candidate #3

Till Rohrmann
+1 (binding)

* Verified the checksums
* Ran the RC on a Minikube cluster (see the sketch after this list)
** Session mode
** Application mode
* Built Flink from sources
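
A sketch of the session-mode part of this check on Minikube (the cluster
id is a placeholder):

    minikube start
    # start a session cluster on Kubernetes and submit an example job to it
    ./bin/kubernetes-session.sh -Dkubernetes.cluster-id=rc3-session
    ./bin/flink run -t kubernetes-session \
        -Dkubernetes.cluster-id=rc3-session \
        examples/streaming/WordCount.jar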

Cheers,
Till

Re: [VOTE] Release 1.12.0, release candidate #3

Zhu Zhu
+1 (binding)

- verified signature and checksum
- built from source
- run testing jobs on yarn with manually triggered failures; checked logs
and WebUI of those jobs
  * DataStream job (parallelism=1000) with multiple disjoint pipelined
regions
  * DataSet job (parallelism=1000) with all edges blocking

Thanks,
Zhu

Re: [VOTE] Release 1.12.0, release candidate #3

Guowei Ma
+1 (non-binding)
- build from source
- build a docker image
- start a session on a local k8s cluster
- submit a wordcount job in streaming mode
- submit a wordcount job in batch mode (see the sketch after this list)
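
A sketch of the streaming vs. batch submission, using the
execution.runtime-mode option introduced in 1.12:

    # streaming (default) and batch execution of the same example job
    ./bin/flink run examples/streaming/WordCount.jar
    ./bin/flink run -Dexecution.runtime-mode=BATCH examples/streaming/WordCount.jar
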
Best,
Guowei


Re: [VOTE] Release 1.12.0, release candidate #3

Leonard Xu
+1 (non-binding)

- checked/verified signatures and hashes
- built from source code with Scala 2.11 successfully
- checked that there are no missing artifacts
- started a cluster; the WebUI was accessible, a submitted WordCount job ran successfully, and there was no suspicious log output
- tested submitting a job via the SQL Client; the query results were as expected
- tested read/write from/to the Kafka and upsert-kafka SQL connectors in the SQL Client (see the sketch after this list)
- tested read/write/join of Hive tables in the SQL Client
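
A sketch of the SQL Client part of these checks (the connector jar name,
topic and broker address are assumptions):

    # put the Kafka SQL connector jar into lib/ and start the client
    cp flink-sql-connector-kafka_2.11-1.12.0.jar lib/
    ./bin/sql-client.sh embedded

    -- then, in the SQL CLI:
    CREATE TABLE orders (
      order_id STRING,
      amount   DOUBLE
    ) WITH (
      'connector' = 'kafka',
      'topic' = 'orders',
      'properties.bootstrap.servers' = 'localhost:9092',
      'scan.startup.mode' = 'earliest-offset',
      'format' = 'json'
    );
    SELECT order_id, SUM(amount) FROM orders GROUP BY order_id;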

Best,
Leonard Xu


> 在 2020年12月7日,11:17,Guowei Ma <[hidden email]> 写道:
>
> +1(non-binding)
> - build from source
> - build a docker image
> - start a session from local k8s cluster
> - submit a wordcount job in streaming mode.
> - submit a wordcount job in batch mode.
> Best,
> Guowei
>
>
> On Sat, Dec 5, 2020 at 3:13 PM Zhu Zhu <[hidden email]> wrote:
>
>> +1 (binding)
>>
>> - verified signature and checksum
>> - built from source
>> - run testing jobs on yarn with manually triggered failures. checked logs
>> and WebUI of those jobs
>>  * DataStream job (paralelism=1000) with multiple disjoint pipelined
>> regions
>>  * DataSet job (paralelism=1000) with all edges blocking
>>
>> Thanks,
>> Zhu
>>
>> Till Rohrmann <[hidden email]> 于2020年12月4日周五 下午11:45写道:
>>
>>> +1 (binding)
>>>
>>> * Verified the checksums
>>> * Ran RC on Minikube cluster
>>> ** Session mode
>>> ** Application mode
>>> * Built Flink from sources
>>>
>>> Cheers,
>>> Till
>>>
>>> On Fri, Dec 4, 2020 at 2:15 PM Wei Zhong <[hidden email]> wrote:
>>>
>>>> +1 (non-binding)
>>>>
>>>> - verified checksums and signatures
>>>> - build Flink with Scala 2.11
>>>> - pip install pyflink on Windows python 3.7
>>>> - run a python job with udfs on Windows
>>>> - pyflink shell works well on local mode and remote mode
>>>>
>>>> Best,
>>>> Wei
>>>>
>>>>> 在 2020年12月4日,17:21,Yang Wang <[hidden email]> 写道:
>>>>>
>>>>> +1 (non-binding)
>>>>>
>>>>> * Build from source
>>>>> * Deploy Flink cluster in following deployments with HA
>>> enabled(ZooKeeper
>>>>> and K8s), including kill JobManager and check failover
>>>>> * Native K8s Session
>>>>> * Native K8s Application
>>>>> * Yarn Session
>>>>> * Yarn Per-Job
>>>>> * Yarn Application
>>>>> * Check webUI and logs in different deployments especially via
>> `kubectl
>>>>> logs` in K8s
>>>>>
>>>>> Best,
>>>>> Yang
>>>>>
>>>>> Xintong Song <[hidden email]> 于2020年12月4日周五 下午3:00写道:
>>>>>
>>>>>> +1 (non-binding)
>>>>>>
>>>>>>  - Verified checksums and signatures
>>>>>>  - No binaries found in source archive
>>>>>>  - Build from source
>>>>>>  - Tried a couple of example jobs in various deployment mode
>>>>>>     - Local
>>>>>>     - Standalone
>>>>>>     - Native Kubernetes Application
>>>>>>     - Native Kubernetes Session
>>>>>>     - Yarn Job
>>>>>>     - Yarn Session
>>>>>>  - Changing memory configurations, things work as expected
>>>>>>  - UI looks good
>>>>>>  - Logs look good
>>>>>>
>>>>>>
>>>>>>
>>>>>> Thank you~
>>>>>>
>>>>>> Xintong Song
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Thu, Dec 3, 2020 at 9:18 PM Rui Li <[hidden email]>
>> wrote:
>>>>>>
>>>>>>> +1 (non-binding)
>>>>>>>
>>>>>>> Built from source and verified hive connector tests for different
>>> hive
>>>>>>> versions.
>>>>>>> Setup a cluster to connect to a real hive warehouse and run some
>>>> queries
>>>>>>> successfully.
>>>>>>>
>>>>>>> On Thu, Dec 3, 2020 at 8:44 PM Xingbo Huang <[hidden email]>
>>>> wrote:
>>>>>>>
>>>>>>>> +1 (non-binding)
>>>>>>>>
>>>>>>>> Checks:
>>>>>>>> 1. verified checksums and signatures
>>>>>>>> 2. build Flink with Scala 2.11
>>>>>>>> 3. pip install pyflink in MacOS/CentOS under py35,py36,py37,py38
>>>>>>>> 4. test Pandas UDAF/General UDAF/Python DataStream MapFunction
>>>>>>>> 5. start standalone cluster and submit a python udf job.
>>>>>>>> 6. verified NOTICE/LICENSE files of some regular modules
>>>>>>>>
>>>>>>>> I observed that the NOTICE file of flink-sql-connector-hbase-2.2
>>> lists
>>>>>> 3
>>>>>>>> dependencies that are not bundled in:
>>>>>>>> commons-lang:commons-lang:2.6
>>>>>>>> org.apache.hbase:hbase-hadoop-compat:2.2.3
>>>>>>>> org.apache.hbase:hbase-hadoop2-compat:2.2.3
>>>>>>>>
>>>>>>>> I guess listing more than dependencies with apache licensed
>>> shouldn't
>>>>>> be
>>>>>>> a
>>>>>>>> blocker issue. I have opened a PR[1] to fix it.
>>>>>>>>
>>>>>>>> [1] https://github.com/apache/flink/pull/14299
>>>>>>>>
>>>>>>>> Best,
>>>>>>>> Xingbo
>>>>>>>>
>>>>>>>> Robert Metzger <[hidden email]> 于2020年12月3日周四 下午5:36写道:
>>>>>>>>
>>>>>>>>> There's now a pull request for the announcement blog post, please
>>>>>> help
>>>>>>>>> checking it: https://github.com/apache/flink-web/pull/397
>>>>>>>>>
>>>>>>>>> On Thu, Dec 3, 2020 at 9:03 AM Robert Metzger <
>> [hidden email]
>>>>
>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>> +1 (binding)
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Checks:
>>>>>>>>>> - checksums seem correct
>>>>>>>>>> - source archive code compiles
>>>>>>>>>> - Compiled a test job against the staging repository
>>>>>>>>>> - launched a standalone cluster, ran some test jobs against it
>>>>>>>>>> - quickstart contains correct version
>>>>>>>>>> - regular jars contain correct NOTICE file
>>>>>>>>>> - Looked a bit over the output of
>>>>>>>>>>    git diff release-1.11.2...release-1.12 --  "**/pom.xml"
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> I noticed that at least one more jar file contains an invalid
>>>>>> LICENSE
>>>>>>>>> file
>>>>>>>>>> in its root. This has already been the case with Flink 1.11,
>> and
>>>>>>> from
>>>>>>>>> the
>>>>>>>>>> context (apache flink jar, all the other license and notice
>> files
>>>>>>> talk
>>>>>>>>>> about this being an Apache project) it should be clear that the
>>>>>>> license
>>>>>>>>>> file is not meant for the whole jar file contents.
>>>>>>>>>> I will still extend the automated LicenseChecker to resolve
>> this,
>>>>>>> but I
>>>>>>>>>> don't want to cancel the release because of this.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Wed, Dec 2, 2020 at 11:19 AM Robert Metzger <
>>>>>> [hidden email]>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi everyone,
>>>>>>>>>>>
>>>>>>>>>>> We have resolved the licensing issue Chesnay found.
>>>>>>>>>>>
>>>>>>>>>>> Please review and vote on the release candidate #3 for the
>>> version
>>>>>>>>>>> 1.12.0, as follows:
>>>>>>>>>>>
>>>>>>>>>>> [ ] +1, Approve the release
>>>>>>>>>>> [ ] -1, Do not approve the release (please provide specific
>>>>>>> comments)
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> The complete staging area is available for your review, which
>>>>>>>> includes:
>>>>>>>>>>> * JIRA release notes [1a], and website release notes [1b]
>>>>>>>>>>> * the official Apache source release and binary convenience
>>>>>> releases
>>>>>>>> to
>>>>>>>>>>> be deployed to dist.apache.org [2], which are signed with the
>>> key
>>>>>>>> with
>>>>>>>>>>> fingerprint D9839159 [3],
>>>>>>>>>>> * all artifacts to be deployed to the Maven Central Repository
>>>>>> [4],
>>>>>>>>>>> * source code tag "release-1.12.0-rc3" [5]
>>>>>>>>>>>
>>>>>>>>>>> We will soon publish the PR for the release announcement blog
>>>>>> post!
>>>>>>>>>>>
>>>>>>>>>>> The vote will be open for at least 72 hours. It is adopted by
>>>>>>> majority
>>>>>>>>>>> approval, with at least 3 PMC affirmative votes.
>>>>>>>>>>>
>>>>>>>>>>> Thanks,
>>>>>>>>>>> Dian & Robert
>>>>>>>>>>>
>>>>>>>>>>> [1a]
>>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>
>>>
>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12348263
>>>>>>>>>>> [1b] https://github.com/apache/flink/pull/14195
>>>>>>>>>>> [2]
>>>>>> https://dist.apache.org/repos/dist/dev/flink/flink-1.12.0-rc3/
>>>>>>>>>>> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
>>>>>>>>>>> [4]
>>>>>>>>>>>
>>>>>>>>
>>>> https://repository.apache.org/content/repositories/orgapacheflink-1404
>>>>>>>>>>> [5]
>>>>>> https://github.com/apache/flink/releases/tag/release-1.12.0-rc3
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Best regards!
>>>>>>> Rui Li
>>>>>>>
>>>>>>
>>>>
>>>>
>>>
>>


Re: [VOTE] Release 1.12.0, release candidate #3

dwysakowicz

+1 (binding)

* Verified the checksums
* Verified that the source archives do not contain any binaries
* Built Flink from sources
* Ran the streaming WordCount example in BATCH and STREAMING mode (commands sketched below)
* Ran a slightly heavier WordCount version in BATCH and STREAMING mode
* Verified licensing of the HBase connectors
** I found one issue: the flink-sql-connector-hbase-2.2 NOTICE file does not list protobuf-java-util, which is pulled in through hbase-shaded-miscellaneous (there are a couple more dependencies from it that we do not list, but they are Apache licensed)
* Ran some Table examples (I found that two examples are not runnable [1]), but I would not consider it a blocker
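
Roughly, the verification steps above can be reproduced along the following lines. This is only a sketch: the artifact names are illustrative and may not match the exact files in the staging area, and the two WordCount runs assume the execution.runtime-mode option introduced in 1.12.

  # import the release KEYS, then check signature and checksum of the source release
  curl -O https://dist.apache.org/repos/dist/release/flink/KEYS
  gpg --import KEYS
  gpg --verify flink-1.12.0-src.tgz.asc flink-1.12.0-src.tgz
  sha512sum -c flink-1.12.0-src.tgz.sha512

  # run the streaming WordCount example once per execution mode
  ./bin/flink run -Dexecution.runtime-mode=STREAMING examples/streaming/WordCount.jar
  ./bin/flink run -Dexecution.runtime-mode=BATCH examples/streaming/WordCount.jar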

Let me know what you think about the licensing issue.

Best,
Dawid

[1]https://issues.apache.org/jira/browse/FLINK-20464
On 07/12/2020 08:07, Leonard Xu wrote:
+1 (non-binding)

- checked/verified signatures and hashes
- built from source code with scala 2.11 succeeded
- checked that there are no missing artifacts
- started a cluster, WebUI was accessible, submitted a wordcount job and ran succeeded, no suspicious log output
- tested using SQL Client to submit job and the query result is expected
- tested read/write from/to sql kafka/upsert-kafka connector in SQL Client 
- tested read/write/join hive table in SQL Client

Best,
Leonard Xu


On Dec 7, 2020, at 11:17, Guowei Ma [hidden email] wrote:

+1(non-binding)
- build from source
- build a docker image
- start a session from local k8s cluster
- submit a wordcount job in streaming mode.
- submit a wordcount job in batch mode.
Best,
Guowei


On Sat, Dec 5, 2020 at 3:13 PM Zhu Zhu [hidden email] wrote:

+1 (binding)

- verified signature and checksum
- built from source
- run testing jobs on yarn with manually triggered failures. checked logs
and WebUI of those jobs
 * DataStream job (parallelism=1000) with multiple disjoint pipelined
regions
 * DataSet job (parallelism=1000) with all edges blocking

Thanks,
Zhu

Till Rohrmann [hidden email] wrote on Fri, Dec 4, 2020 at 11:45 PM:

+1 (binding)

* Verified the checksums
* Ran RC on Minikube cluster
** Session mode
** Application mode
* Built Flink from sources

Cheers,
Till

On Fri, Dec 4, 2020 at 2:15 PM Wei Zhong [hidden email] wrote:

+1 (non-binding)

- verified checksums and signatures
- build Flink with Scala 2.11
- pip install pyflink on Windows python 3.7
- run a python job with udfs on Windows
- pyflink shell works well on local mode and remote mode

Best,
Wei

On Dec 4, 2020, at 17:21, Yang Wang [hidden email] wrote:

+1 (non-binding)

* Build from source
* Deploy Flink cluster in following deployments with HA
enabled(ZooKeeper
and K8s), including kill JobManager and check failover
* Native K8s Session
* Native K8s Application
* Yarn Session
* Yarn Per-Job
* Yarn Application
* Check webUI and logs in different deployments especially via
`kubectl
logs` in K8s

Best,
Yang

Xintong Song [hidden email] wrote on Fri, Dec 4, 2020 at 3:00 PM:

+1 (non-binding)

 - Verified checksums and signatures
 - No binaries found in source archive
 - Build from source
 - Tried a couple of example jobs in various deployment mode
    - Local
    - Standalone
    - Native Kubernetes Application
    - Native Kubernetes Session
    - Yarn Job
    - Yarn Session
 - Changing memory configurations, things work as expected
 - UI looks good
 - Logs look good



Thank you~

Xintong Song



On Thu, Dec 3, 2020 at 9:18 PM Rui Li [hidden email]
wrote:

+1 (non-binding)

Built from source and verified hive connector tests for different
hive
versions.
Setup a cluster to connect to a real hive warehouse and run some
queries
successfully.

On Thu, Dec 3, 2020 at 8:44 PM Xingbo Huang [hidden email]
wrote:

+1 (non-binding)

Checks:
1. verified checksums and signatures
2. build Flink with Scala 2.11
3. pip install pyflink in MacOS/CentOS under py35,py36,py37,py38
4. test Pandas UDAF/General UDAF/Python DataStream MapFunction
5. start standalone cluster and submit a python udf job.
6. verified NOTICE/LICENSE files of some regular modules

I observed that the NOTICE file of flink-sql-connector-hbase-2.2
lists
3
dependencies that are not bundled in:
commons-lang:commons-lang:2.6
org.apache.hbase:hbase-hadoop-compat:2.2.3
org.apache.hbase:hbase-hadoop2-compat:2.2.3

I guess listing more dependencies than are actually bundled, as long as they are Apache licensed,
shouldn't
be
a
blocker issue. I have opened a PR[1] to fix it.

[1] https://github.com/apache/flink/pull/14299

Best,
Xingbo

Robert Metzger [hidden email] wrote on Thu, Dec 3, 2020 at 5:36 PM:

There's now a pull request for the announcement blog post, please
help
checking it: https://github.com/apache/flink-web/pull/397

On Thu, Dec 3, 2020 at 9:03 AM Robert Metzger <
[hidden email]

wrote:

+1 (binding)


Checks:
- checksums seem correct
- source archive code compiles
- Compiled a test job against the staging repository
- launched a standalone cluster, ran some test jobs against it
- quickstart contains correct version
- regular jars contain correct NOTICE file
- Looked a bit over the output of
   git diff release-1.11.2...release-1.12 --  "**/pom.xml"



I noticed that at least one more jar file contains an invalid
LICENSE
file
in its root. This has already been the case with Flink 1.11,
and
from
the
context (apache flink jar, all the other license and notice
files
talk
about this being an Apache project) it should be clear that the
license
file is not meant for the whole jar file contents.
I will still extend the automated LicenseChecker to resolve
this,
but I
don't want to cancel the release because of this.



On Wed, Dec 2, 2020 at 11:19 AM Robert Metzger <
[hidden email]>
wrote:

Hi everyone,

We have resolved the licensing issue Chesnay found.

Please review and vote on the release candidate #3 for the
version
1.12.0, as follows:

[ ] +1, Approve the release
[ ] -1, Do not approve the release (please provide specific
comments)

The complete staging area is available for your review, which
includes:
* JIRA release notes [1a], and website release notes [1b]
* the official Apache source release and binary convenience
releases
to
be deployed to dist.apache.org [2], which are signed with the
key
with
fingerprint D9839159 [3],
* all artifacts to be deployed to the Maven Central Repository
[4],
* source code tag "release-1.12.0-rc3" [5]

We will soon publish the PR for the release announcement blog
post!
The vote will be open for at least 72 hours. It is adopted by
majority
approval, with at least 3 PMC affirmative votes.

Thanks,
Dian & Robert

[1a]
https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12348263
[1b] https://github.com/apache/flink/pull/14195
[2]
https://dist.apache.org/repos/dist/dev/flink/flink-1.12.0-rc3/
[3] https://dist.apache.org/repos/dist/release/flink/KEYS
[4]
https://repository.apache.org/content/repositories/orgapacheflink-1404
[5]
https://github.com/apache/flink/releases/tag/release-1.12.0-rc3

--
Best regards!
Rui Li

Re: [VOTE] Release 1.12.0, release candidate #3

Congxian Qiu
+1 (non-binding)

Checklist:
- checksum & gpg ok (see the sketch below)
- built from source, ok
- all pom versions point to 1.12.0
- ran some programs in a local env, ok, and no strange log output
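
Roughly, the checks above can be reproduced like this. It is only a sketch: the file names are illustrative, and the grep is a crude approximation since modules that inherit their version from the parent pom will also be listed.

  # signature and checksum of the source release
  gpg --verify flink-1.12.0-src.tgz.asc flink-1.12.0-src.tgz
  sha512sum -c flink-1.12.0-src.tgz.sha512

  # list any pom.xml that does not literally contain the 1.12.0 version tag
  grep -rL --include=pom.xml "<version>1.12.0</version>" .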

Best,
Congxian


Dawid Wysakowicz <[hidden email]> wrote on Mon, Dec 7, 2020 at 5:52 PM:

> +1 (binding)
>
> * Verified the checksums
> * Verified that the source archives do not contain any binaries
> * Built Flink from sources
> * Run a streaming WordCount example in BATCH and STREAM mode
> * Run a slightly heavier WordCount version in BATCH and STREAM mode
> * Verified licensing of Hbase connectors
> ** I found one issue that we do not list protobuf-java-util in flink-sql-connector-hbase-2.2 NOTICE file, which is pulled in through hbase-shaded-miscellaneous (there are a couple more dependencies we do not list from it, but they are Apache licensed)
> * Run some Table examples (I found two examples are not runnable[1]), but I would not consider it a blocker
>
> Let me know what you think about the licensing issue.
>
> Best,
> Dawid
>
> [1]https://issues.apache.org/jira/browse/FLINK-20464
>
> On 07/12/2020 08:07, Leonard Xu wrote:
>
> +1 (non-binding)
>
> - checked/verified signatures and hashes
> - built from source code with scala 2.11 succeeded
> - checked that there are no missing artifacts
> - started a cluster, WebUI was accessible, submitted a wordcount job and ran succeeded, no suspicious log output
> - tested using SQL Client to submit job and the query result is expected
> - tested read/write from/to sql kafka/upsert-kafka connector in SQL Client
> - tested read/write/join hive table in SQL Client
>
> Best,
> Leonard Xu
>
>
>
> On Dec 7, 2020, at 11:17, Guowei Ma <[hidden email]> wrote:
>
> +1(non-binding)
> - build from source
> - build a docker image
> - start a session from local k8s cluster
> - submit a wordcount job in streaming mode.
> - submit a wordcount job in batch mode.
> Best,
> Guowei
>
>
> On Sat, Dec 5, 2020 at 3:13 PM Zhu Zhu <[hidden email]> wrote:
>
>
> +1 (binding)
>
> - verified signature and checksum
> - built from source
> - run testing jobs on yarn with manually triggered failures. checked logs
> and WebUI of those jobs
>  * DataStream job (parallelism=1000) with multiple disjoint pipelined
> regions
>  * DataSet job (parallelism=1000) with all edges blocking
>
> Thanks,
> Zhu
>
> Till Rohrmann <[hidden email]> wrote on Fri, Dec 4, 2020 at 11:45 PM:
>
>
> +1 (binding)
>
> * Verified the checksums
> * Ran RC on Minikube cluster
> ** Session mode
> ** Application mode
> * Built Flink from sources
>
> Cheers,
> Till
>
> On Fri, Dec 4, 2020 at 2:15 PM Wei Zhong <[hidden email]> wrote:
>
>
> +1 (non-binding)
>
> - verified checksums and signatures
> - build Flink with Scala 2.11
> - pip install pyflink on Windows python 3.7
> - run a python job with udfs on Windows
> - pyflink shell works well on local mode and remote mode
>
> Best,
> Wei
>
>
> On Dec 4, 2020, at 17:21, Yang Wang <[hidden email]> wrote:
>
> +1 (non-binding)
>
> * Build from source
> * Deploy Flink cluster in following deployments with HA
>
> enabled(ZooKeeper
>
> and K8s), including kill JobManager and check failover
> * Native K8s Session
> * Native K8s Application
> * Yarn Session
> * Yarn Per-Job
> * Yarn Application
> * Check webUI and logs in different deployments especially via
>
> `kubectl
>
> logs` in K8s
>
> Best,
> Yang
>
> Xintong Song <[hidden email]> wrote on Fri, Dec 4, 2020 at 3:00 PM:
>
>
> +1 (non-binding)
>
>  - Verified checksums and signatures
>  - No binaries found in source archive
>  - Build from source
>  - Tried a couple of example jobs in various deployment mode
>     - Local
>     - Standalone
>     - Native Kubernetes Application
>     - Native Kubernetes Session
>     - Yarn Job
>     - Yarn Session
>  - Changing memory configurations, things work as expected
>  - UI looks good
>  - Logs look good
>
>
>
> Thank you~
>
> Xintong Song
>
>
>
> On Thu, Dec 3, 2020 at 9:18 PM Rui Li <[hidden email]>
>
> wrote:
>
> +1 (non-binding)
>
> Built from source and verified hive connector tests for different
>
> hive
>
> versions.
> Setup a cluster to connect to a real hive warehouse and run some
>
> queries
>
> successfully.
>
> On Thu, Dec 3, 2020 at 8:44 PM Xingbo Huang <[hidden email]>
>
> wrote:
>
> +1 (non-binding)
>
> Checks:
> 1. verified checksums and signatures
> 2. build Flink with Scala 2.11
> 3. pip install pyflink in MacOS/CentOS under py35,py36,py37,py38
> 4. test Pandas UDAF/General UDAF/Python DataStream MapFunction
> 5. start standalone cluster and submit a python udf job.
> 6. verified NOTICE/LICENSE files of some regular modules
>
> I observed that the NOTICE file of flink-sql-connector-hbase-2.2
>
> lists
>
> 3
>
> dependencies that are not bundled in:
> commons-lang:commons-lang:2.6
> org.apache.hbase:hbase-hadoop-compat:2.2.3
> org.apache.hbase:hbase-hadoop2-compat:2.2.3
>
> I guess listing more dependencies than are actually bundled, as long as they are Apache licensed,
>
> shouldn't
>
> be
>
> a
>
> blocker issue. I have opened a PR[1] to fix it.
>
> [1] https://github.com/apache/flink/pull/14299
>
> Best,
> Xingbo
>
> Robert Metzger <[hidden email]> wrote on Thu, Dec 3, 2020 at 5:36 PM:
>
>
> There's now a pull request for the announcement blog post, please
>
> help
>
> checking it: https://github.com/apache/flink-web/pull/397
>
> On Thu, Dec 3, 2020 at 9:03 AM Robert Metzger <
>
> [hidden email]
>
> wrote:
>
> +1 (binding)
>
>
> Checks:
> - checksums seem correct
> - source archive code compiles
> - Compiled a test job against the staging repository
> - launched a standalone cluster, ran some test jobs against it
> - quickstart contains correct version
> - regular jars contain correct NOTICE file
> - Looked a bit over the output of
>    git diff release-1.11.2...release-1.12 --  "**/pom.xml"
>
>
>
> I noticed that at least one more jar file contains an invalid
>
> LICENSE
>
> file
>
> in its root. This has already been the case with Flink 1.11,
>
> and
>
> from
>
> the
>
> context (apache flink jar, all the other license and notice
>
> files
>
> talk
>
> about this being an Apache project) it should be clear that the
>
> license
>
> file is not meant for the whole jar file contents.
> I will still extend the automated LicenseChecker to resolve
>
> this,
>
> but I
>
> don't want to cancel the release because of this.
>
>
>
> On Wed, Dec 2, 2020 at 11:19 AM Robert Metzger <
>
> [hidden email]>
>
> wrote:
>
>
> Hi everyone,
>
> We have resolved the licensing issue Chesnay found.
>
> Please review and vote on the release candidate #3 for the
>
> version
>
> 1.12.0, as follows:
>
> [ ] +1, Approve the release
> [ ] -1, Do not approve the release (please provide specific
>
> comments)
>
> The complete staging area is available for your review, which
>
> includes:
>
> * JIRA release notes [1a], and website release notes [1b]
> * the official Apache source release and binary convenience
>
> releases
>
> to
>
> be deployed to dist.apache.org [2], which are signed with the
>
> key
>
> with
>
> fingerprint D9839159 [3],
> * all artifacts to be deployed to the Maven Central Repository
>
> [4],
>
> * source code tag "release-1.12.0-rc3" [5]
>
> We will soon publish the PR for the release announcement blog
>
> post!
>
> The vote will be open for at least 72 hours. It is adopted by
>
> majority
>
> approval, with at least 3 PMC affirmative votes.
>
> Thanks,
> Dian & Robert
>
> [1a]
>
>
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12348263
>
> [1b] https://github.com/apache/flink/pull/14195
> [2]
>
> https://dist.apache.org/repos/dist/dev/flink/flink-1.12.0-rc3/
>
> [3] https://dist.apache.org/repos/dist/release/flink/KEYS
> [4]
>
>
> https://repository.apache.org/content/repositories/orgapacheflink-1404
>
> [5]
>
> https://github.com/apache/flink/releases/tag/release-1.12.0-rc3
>
> --
> Best regards!
> Rui Li
>
>
>

Re: [VOTE] Release 1.12.0, release candidate #3

Chesnay Schepler-3
In reply to this post by dwysakowicz
I've filed https://issues.apache.org/jira/browse/FLINK-20519 for the
hbase issue.

Since we still ship the protobuf license I don't think this is a blocker.
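
For reference, a rough way to cross-check this kind of NOTICE question (only a sketch; the jar name below is an example and may not match the exact artifact in the staging repository):

  JAR=flink-sql-connector-hbase-2.2_2.11-1.12.0.jar
  # print the NOTICE bundled inside the shaded jar
  unzip -p "$JAR" META-INF/NOTICE
  # check whether (possibly relocated) protobuf classes are actually bundled
  unzip -l "$JAR" | grep -i protobuf | head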

On 12/7/2020 10:52 AM, Dawid Wysakowicz wrote:

>
> +1 (binding)
>
> * Verified the checksums
> * Verified that the source archives do not contain any binaries
> * Built Flink from sources
> * Run a streaming WordCount example in BATCH and STREAM mode
> * Run a slightly heavier WordCount version in BATCH and STREAM mode
> * Verified licensing of Hbase connectors
> ** I found one issue that we do not list protobuf-java-util in
> flink-sql-connector-hbase-2.2 NOTICE file, which is pulled in through
> hbase-shaded-miscellaneous (there are a couple more dependencies we do
> not list from it, but they are Apache licensed)
> * Run some Table examples (I found two examples are not runnable[1]), but I would not consider it a blocker
>
> Let me know what you think about the licensing issue.
>
> Best,
> Dawid
>
> [1]https://issues.apache.org/jira/browse/FLINK-20464
> On 07/12/2020 08:07, Leonard Xu wrote:
>> +1 (non-binding)
>>
>> - checked/verified signatures and hashes
>> - built from source code with scala 2.11 succeeded
>> - checked that there are no missing artifacts
>> - started a cluster, WebUI was accessible, submitted a wordcount job and ran succeeded, no suspicious log output
>> - tested using SQL Client to submit job and the query result is expected
>> - tested read/write from/to sql kafka/upsert-kafka connector in SQL Client
>> - tested read/write/join hive table in SQL Client
>>
>> Best,
>> Leonard Xu
>>
>>
>>> On Dec 7, 2020, at 11:17, Guowei Ma<[hidden email]> wrote:
>>>
>>> +1(non-binding)
>>> - build from source
>>> - build a docker image
>>> - start a session from local k8s cluster
>>> - submit a wordcount job in streaming mode.
>>> - submit a wordcount job in batch mode.
>>> Best,
>>> Guowei
>>>
>>>
>>> On Sat, Dec 5, 2020 at 3:13 PM Zhu Zhu<[hidden email]>  wrote:
>>>
>>>> +1 (binding)
>>>>
>>>> - verified signature and checksum
>>>> - built from source
>>>> - run testing jobs on yarn with manually triggered failures. checked logs
>>>> and WebUI of those jobs
>>>>   * DataStream job (parallelism=1000) with multiple disjoint pipelined
>>>> regions
>>>>   * DataSet job (parallelism=1000) with all edges blocking
>>>>
>>>> Thanks,
>>>> Zhu
>>>>
>>>> Till Rohrmann<[hidden email]> wrote on Fri, Dec 4, 2020 at 11:45 PM:
>>>>
>>>>> +1 (binding)
>>>>>
>>>>> * Verified the checksums
>>>>> * Ran RC on Minikube cluster
>>>>> ** Session mode
>>>>> ** Application mode
>>>>> * Built Flink from sources
>>>>>
>>>>> Cheers,
>>>>> Till
>>>>>
>>>>> On Fri, Dec 4, 2020 at 2:15 PM Wei Zhong<[hidden email]>  wrote:
>>>>>
>>>>>> +1 (non-binding)
>>>>>>
>>>>>> - verified checksums and signatures
>>>>>> - build Flink with Scala 2.11
>>>>>> - pip install pyflink on Windows python 3.7
>>>>>> - run a python job with udfs on Windows
>>>>>> - pyflink shell works well on local mode and remote mode
>>>>>>
>>>>>> Best,
>>>>>> Wei
>>>>>>
>>>>>>> On Dec 4, 2020, at 17:21, Yang Wang<[hidden email]> wrote:
>>>>>>>
>>>>>>> +1 (non-binding)
>>>>>>>
>>>>>>> * Build from source
>>>>>>> * Deploy Flink cluster in following deployments with HA
>>>>> enabled(ZooKeeper
>>>>>>> and K8s), including kill JobManager and check failover
>>>>>>> * Native K8s Session
>>>>>>> * Native K8s Application
>>>>>>> * Yarn Session
>>>>>>> * Yarn Per-Job
>>>>>>> * Yarn Application
>>>>>>> * Check webUI and logs in different deployments especially via
>>>> `kubectl
>>>>>>> logs` in K8s
>>>>>>>
>>>>>>> Best,
>>>>>>> Yang
>>>>>>>
>>>>>>> Xintong Song<[hidden email]> wrote on Fri, Dec 4, 2020 at 3:00 PM:
>>>>>>>
>>>>>>>> +1 (non-binding)
>>>>>>>>
>>>>>>>>   - Verified checksums and signatures
>>>>>>>>   - No binaries found in source archive
>>>>>>>>   - Build from source
>>>>>>>>   - Tried a couple of example jobs in various deployment mode
>>>>>>>>      - Local
>>>>>>>>      - Standalone
>>>>>>>>      - Native Kubernetes Application
>>>>>>>>      - Native Kubernetes Session
>>>>>>>>      - Yarn Job
>>>>>>>>      - Yarn Session
>>>>>>>>   - Changing memory configurations, things work as expected
>>>>>>>>   - UI looks good
>>>>>>>>   - Logs look good
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> Thank you~
>>>>>>>>
>>>>>>>> Xintong Song
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Dec 3, 2020 at 9:18 PM Rui Li<[hidden email]>
>>>> wrote:
>>>>>>>>> +1 (non-binding)
>>>>>>>>>
>>>>>>>>> Built from source and verified hive connector tests for different
>>>>> hive
>>>>>>>>> versions.
>>>>>>>>> Setup a cluster to connect to a real hive warehouse and run some
>>>>>> queries
>>>>>>>>> successfully.
>>>>>>>>>
>>>>>>>>> On Thu, Dec 3, 2020 at 8:44 PM Xingbo Huang<[hidden email]>
>>>>>> wrote:
>>>>>>>>>> +1 (non-binding)
>>>>>>>>>>
>>>>>>>>>> Checks:
>>>>>>>>>> 1. verified checksums and signatures
>>>>>>>>>> 2. build Flink with Scala 2.11
>>>>>>>>>> 3. pip install pyflink in MacOS/CentOS under py35,py36,py37,py38
>>>>>>>>>> 4. test Pandas UDAF/General UDAF/Python DataStream MapFunction
>>>>>>>>>> 5. start standalone cluster and submit a python udf job.
>>>>>>>>>> 6. verified NOTICE/LICENSE files of some regular modules
>>>>>>>>>>
>>>>>>>>>> I observed that the NOTICE file of flink-sql-connector-hbase-2.2
>>>>> lists
>>>>>>>> 3
>>>>>>>>>> dependencies that are not bundled in:
>>>>>>>>>> commons-lang:commons-lang:2.6
>>>>>>>>>> org.apache.hbase:hbase-hadoop-compat:2.2.3
>>>>>>>>>> org.apache.hbase:hbase-hadoop2-compat:2.2.3
>>>>>>>>>>
>>>>>>>>>> I guess listing more dependencies than are actually bundled, as long as they are Apache licensed,
>>>>> shouldn't
>>>>>>>> be
>>>>>>>>> a
>>>>>>>>>> blocker issue. I have opened a PR[1] to fix it.
>>>>>>>>>>
>>>>>>>>>> [1]https://github.com/apache/flink/pull/14299
>>>>>>>>>>
>>>>>>>>>> Best,
>>>>>>>>>> Xingbo
>>>>>>>>>>
>>>>>>>>>> Robert Metzger<[hidden email]> wrote on Thu, Dec 3, 2020 at 5:36 PM:
>>>>>>>>>>
>>>>>>>>>>> There's now a pull request for the announcement blog post, please
>>>>>>>> help
>>>>>>>>>>> checking it:https://github.com/apache/flink-web/pull/397
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Dec 3, 2020 at 9:03 AM Robert Metzger <
>>>> [hidden email]
>>>>>>>>>> wrote:
>>>>>>>>>>>> +1 (binding)
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> Checks:
>>>>>>>>>>>> - checksums seem correct
>>>>>>>>>>>> - source archive code compiles
>>>>>>>>>>>> - Compiled a test job against the staging repository
>>>>>>>>>>>> - launched a standalone cluster, ran some test jobs against it
>>>>>>>>>>>> - quickstart contains correct version
>>>>>>>>>>>> - regular jars contain correct NOTICE file
>>>>>>>>>>>> - Looked a bit over the output of
>>>>>>>>>>>>     git diff release-1.11.2...release-1.12 --  "**/pom.xml"
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> I noticed that at least one more jar file contains an invalid
>>>>>>>> LICENSE
>>>>>>>>>>> file
>>>>>>>>>>>> in its root. This has already been the case with Flink 1.11,
>>>> and
>>>>>>>>> from
>>>>>>>>>>> the
>>>>>>>>>>>> context (apache flink jar, all the other license and notice
>>>> files
>>>>>>>>> talk
>>>>>>>>>>>> about this being an Apache project) it should be clear that the
>>>>>>>>> license
>>>>>>>>>>>> file is not meant for the whole jar file contents.
>>>>>>>>>>>> I will still extend the automated LicenseChecker to resolve
>>>> this,
>>>>>>>>> but I
>>>>>>>>>>>> don't want to cancel the release because of this.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Wed, Dec 2, 2020 at 11:19 AM Robert Metzger <
>>>>>>>> [hidden email]>
>>>>>>>>>>>> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Hi everyone,
>>>>>>>>>>>>>
>>>>>>>>>>>>> We have resolved the licensing issue Chesnay found.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Please review and vote on the release candidate #3 for the
>>>>> version
>>>>>>>>>>>>> 1.12.0, as follows:
>>>>>>>>>>>>>
>>>>>>>>>>>>> [ ] +1, Approve the release
>>>>>>>>>>>>> [ ] -1, Do not approve the release (please provide specific
>>>>>>>>> comments)
>>>>>>>>>>>>> The complete staging area is available for your review, which
>>>>>>>>>> includes:
>>>>>>>>>>>>> * JIRA release notes [1a], and website release notes [1b]
>>>>>>>>>>>>> * the official Apache source release and binary convenience
>>>>>>>> releases
>>>>>>>>>> to
>>>>>>>>>>>>> be deployed to dist.apache.org [2], which are signed with the
>>>>> key
>>>>>>>>>> with
>>>>>>>>>>>>> fingerprint D9839159 [3],
>>>>>>>>>>>>> * all artifacts to be deployed to the Maven Central Repository
>>>>>>>> [4],
>>>>>>>>>>>>> * source code tag "release-1.12.0-rc3" [5]
>>>>>>>>>>>>>
>>>>>>>>>>>>> We will soon publish the PR for the release announcement blog
>>>>>>>> post!
>>>>>>>>>>>>> The vote will be open for at least 72 hours. It is adopted by
>>>>>>>>> majority
>>>>>>>>>>>>> approval, with at least 3 PMC affirmative votes.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Thanks,
>>>>>>>>>>>>> Dian & Robert
>>>>>>>>>>>>>
>>>>>>>>>>>>> [1a]
>>>>>>>>>>>>>
>>>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12348263
>>>>>>>>>>>>> [1b]https://github.com/apache/flink/pull/14195
>>>>>>>>>>>>> [2]
>>>>>>>> https://dist.apache.org/repos/dist/dev/flink/flink-1.12.0-rc3/
>>>>>>>>>>>>> [3]https://dist.apache.org/repos/dist/release/flink/KEYS
>>>>>>>>>>>>> [4]
>>>>>>>>>>>>>
>>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1404
>>>>>>>>>>>>> [5]
>>>>>>>> https://github.com/apache/flink/releases/tag/release-1.12.0-rc3
>>>>>>>>> --
>>>>>>>>> Best regards!
>>>>>>>>> Rui Li
>>>>>>>>>


Re: [VOTE] Release 1.12.0, release candidate #3

Dian Fu-2
Thank you all for the votes.

Regarding the license issue reported by Dawid, Chesnay's point sounds reasonable to me. It should not be a blocker issue.

Since the voting time has passed, I will conclude the vote result in a separate thread.

> On Dec 7, 2020, at 9:45 PM, Chesnay Schepler <[hidden email]> wrote:
>
> I've filed https://issues.apache.org/jira/browse/FLINK-20519 for the hbase issue.
>
> Since we still ship the protobuf license I don't think this is a blocker.
>
> On 12/7/2020 10:52 AM, Dawid Wysakowicz wrote:
>>
>> +1 (binding)
>>
>> * Verified the checksums
>> * Verified that the source archives do not contain any binaries
>> * Built Flink from sources
>> * Run a streaming WordCount example in BATCH and STREAM mode
>> * Run a slightly heavier WordCount version in BATCH and STREAM mode
>> * Verified licensing of Hbase connectors
>> ** I found one issue that we do not list protobuf-java-util in flink-sql-connector-hbase-2.2 NOTICE file, which is pulled in through hbase-shaded-miscellaneous (there are a couple more dependencies we do not list from it, but they are Apache licensed)
>> * Run some Table examples (I found two examples are not runnable[1]), but I would not consider it a blocker
>>
>> Let me know what you think about the licensing issue.
>>
>> Best,
>> Dawid
>>
>> [1]https://issues.apache.org/jira/browse/FLINK-20464
>> On 07/12/2020 08:07, Leonard Xu wrote:
>>> +1 (non-binding)
>>>
>>> - checked/verified signatures and hashes
>>> - built from source code with scala 2.11 succeeded
>>> - checked that there are no missing artifacts
>>> - started a cluster, WebUI was accessible, submitted a wordcount job and ran succeeded, no suspicious log output
>>> - tested using SQL Client to submit job and the query result is expected
>>> - tested read/write from/to sql kafka/upsert-kafka connector in SQL Client
>>> - tested read/write/join hive table in SQL Client
>>>
>>> Best,
>>> Leonard Xu
>>>
>>>
>>>> On Dec 7, 2020, at 11:17, Guowei Ma<[hidden email]> wrote:
>>>>
>>>> +1(non-binding)
>>>> - build from source
>>>> - build a docker image
>>>> - start a session from local k8s cluster
>>>> - submit a wordcount job in streaming mode.
>>>> - submit a wordcount job in batch mode.
>>>> Best,
>>>> Guowei
>>>>
>>>>
>>>> On Sat, Dec 5, 2020 at 3:13 PM Zhu Zhu<[hidden email]>  wrote:
>>>>
>>>>> +1 (binding)
>>>>>
>>>>> - verified signature and checksum
>>>>> - built from source
>>>>> - run testing jobs on yarn with manually triggered failures. checked logs
>>>>> and WebUI of those jobs
>>>>>  * DataStream job (parallelism=1000) with multiple disjoint pipelined
>>>>> regions
>>>>>  * DataSet job (parallelism=1000) with all edges blocking
>>>>>
>>>>> Thanks,
>>>>> Zhu
>>>>>
>>>>> Till Rohrmann<[hidden email]> wrote on Fri, Dec 4, 2020 at 11:45 PM:
>>>>>
>>>>>> +1 (binding)
>>>>>>
>>>>>> * Verified the checksums
>>>>>> * Ran RC on Minikube cluster
>>>>>> ** Session mode
>>>>>> ** Application mode
>>>>>> * Built Flink from sources
>>>>>>
>>>>>> Cheers,
>>>>>> Till
>>>>>>
>>>>>> On Fri, Dec 4, 2020 at 2:15 PM Wei Zhong<[hidden email]>  wrote:
>>>>>>
>>>>>>> +1 (non-binding)
>>>>>>>
>>>>>>> - verified checksums and signatures
>>>>>>> - build Flink with Scala 2.11
>>>>>>> - pip install pyflink on Windows python 3.7
>>>>>>> - run a python job with udfs on Windows
>>>>>>> - pyflink shell works well on local mode and remote mode
>>>>>>>
>>>>>>> Best,
>>>>>>> Wei
>>>>>>>
>>>>>>>> On Dec 4, 2020, at 17:21, Yang Wang<[hidden email]> wrote:
>>>>>>>>
>>>>>>>> +1 (non-binding)
>>>>>>>>
>>>>>>>> * Build from source
>>>>>>>> * Deploy Flink cluster in following deployments with HA
>>>>>> enabled(ZooKeeper
>>>>>>>> and K8s), including kill JobManager and check failover
>>>>>>>> * Native K8s Session
>>>>>>>> * Native K8s Application
>>>>>>>> * Yarn Session
>>>>>>>> * Yarn Per-Job
>>>>>>>> * Yarn Application
>>>>>>>> * Check webUI and logs in different deployments especially via
>>>>> `kubectl
>>>>>>>> logs` in K8s
>>>>>>>>
>>>>>>>> Best,
>>>>>>>> Yang
>>>>>>>>
>>>>>>>> Xintong Song<[hidden email]> wrote on Fri, Dec 4, 2020 at 3:00 PM:
>>>>>>>>
>>>>>>>>> +1 (non-binding)
>>>>>>>>>
>>>>>>>>>  - Verified checksums and signatures
>>>>>>>>>  - No binaries found in source archive
>>>>>>>>>  - Build from source
>>>>>>>>>  - Tried a couple of example jobs in various deployment mode
>>>>>>>>>     - Local
>>>>>>>>>     - Standalone
>>>>>>>>>     - Native Kubernetes Application
>>>>>>>>>     - Native Kubernetes Session
>>>>>>>>>     - Yarn Job
>>>>>>>>>     - Yarn Session
>>>>>>>>>  - Changing memory configurations, things work as expected
>>>>>>>>>  - UI looks good
>>>>>>>>>  - Logs look good
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Thank you~
>>>>>>>>>
>>>>>>>>> Xintong Song
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Dec 3, 2020 at 9:18 PM Rui Li<[hidden email]>
>>>>> wrote:
>>>>>>>>>> +1 (non-binding)
>>>>>>>>>>
>>>>>>>>>> Built from source and verified hive connector tests for different
>>>>>> hive
>>>>>>>>>> versions.
>>>>>>>>>> Setup a cluster to connect to a real hive warehouse and run some
>>>>>>> queries
>>>>>>>>>> successfully.
>>>>>>>>>>
>>>>>>>>>> On Thu, Dec 3, 2020 at 8:44 PM Xingbo Huang<[hidden email]>
>>>>>>> wrote:
>>>>>>>>>>> +1 (non-binding)
>>>>>>>>>>>
>>>>>>>>>>> Checks:
>>>>>>>>>>> 1. verified checksums and signatures
>>>>>>>>>>> 2. build Flink with Scala 2.11
>>>>>>>>>>> 3. pip install pyflink in MacOS/CentOS under py35,py36,py37,py38
>>>>>>>>>>> 4. test Pandas UDAF/General UDAF/Python DataStream MapFunction
>>>>>>>>>>> 5. start standalone cluster and submit a python udf job.
>>>>>>>>>>> 6. verified NOTICE/LICENSE files of some regular modules
>>>>>>>>>>>
>>>>>>>>>>> I observed that the NOTICE file of flink-sql-connector-hbase-2.2
>>>>>> lists
>>>>>>>>> 3
>>>>>>>>>>> dependencies that are not bundled in:
>>>>>>>>>>> commons-lang:commons-lang:2.6
>>>>>>>>>>> org.apache.hbase:hbase-hadoop-compat:2.2.3
>>>>>>>>>>> org.apache.hbase:hbase-hadoop2-compat:2.2.3
>>>>>>>>>>>
>>>>>>>>>>> I guess listing more dependencies than are actually bundled, as long as they are Apache licensed,
>>>>>> shouldn't
>>>>>>>>> be
>>>>>>>>>> a
>>>>>>>>>>> blocker issue. I have opened a PR[1] to fix it.
>>>>>>>>>>>
>>>>>>>>>>> [1]https://github.com/apache/flink/pull/14299
>>>>>>>>>>>
>>>>>>>>>>> Best,
>>>>>>>>>>> Xingbo
>>>>>>>>>>>
>>>>>>>>>>> Robert Metzger<[hidden email]> wrote on Thu, Dec 3, 2020 at 5:36 PM:
>>>>>>>>>>>
>>>>>>>>>>>> There's now a pull request for the announcement blog post, please
>>>>>>>>> help
>>>>>>>>>>>> checking it:https://github.com/apache/flink-web/pull/397
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Dec 3, 2020 at 9:03 AM Robert Metzger <
>>>>> [hidden email]
>>>>>>>>>>> wrote:
>>>>>>>>>>>>> +1 (binding)
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> Checks:
>>>>>>>>>>>>> - checksums seem correct
>>>>>>>>>>>>> - source archive code compiles
>>>>>>>>>>>>> - Compiled a test job against the staging repository
>>>>>>>>>>>>> - launched a standalone cluster, ran some test jobs against it
>>>>>>>>>>>>> - quickstart contains correct version
>>>>>>>>>>>>> - regular jars contain correct NOTICE file
>>>>>>>>>>>>> - Looked a bit over the output of
>>>>>>>>>>>>>    git diff release-1.11.2...release-1.12 --  "**/pom.xml"
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> I noticed that at least one more jar file contains an invalid
>>>>>>>>> LICENSE
>>>>>>>>>>>> file
>>>>>>>>>>>>> in its root. This has already been the case with Flink 1.11,
>>>>> and
>>>>>>>>>> from
>>>>>>>>>>>> the
>>>>>>>>>>>>> context (apache flink jar, all the other license and notice
>>>>> files
>>>>>>>>>> talk
>>>>>>>>>>>>> about this being an Apache project) it should be clear that the
>>>>>>>>>> license
>>>>>>>>>>>>> file is not meant for the whole jar file contents.
>>>>>>>>>>>>> I will still extend the automated LicenseChecker to resolve
>>>>> this,
>>>>>>>>>> but I
>>>>>>>>>>>>> don't want to cancel the release because of this.
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Wed, Dec 2, 2020 at 11:19 AM Robert Metzger <
>>>>>>>>> [hidden email]>
>>>>>>>>>>>>> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Hi everyone,
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> We have resolved the licensing issue Chesnay found.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Please review and vote on the release candidate #3 for the
>>>>>> version
>>>>>>>>>>>>>> 1.12.0, as follows:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> [ ] +1, Approve the release
>>>>>>>>>>>>>> [ ] -1, Do not approve the release (please provide specific
>>>>>>>>>> comments)
>>>>>>>>>>>>>> The complete staging area is available for your review, which
>>>>>>>>>>> includes:
>>>>>>>>>>>>>> * JIRA release notes [1a], and website release notes [1b]
>>>>>>>>>>>>>> * the official Apache source release and binary convenience
>>>>>>>>> releases
>>>>>>>>>>> to
>>>>>>>>>>>>>> be deployed to dist.apache.org [2], which are signed with the
>>>>>> key
>>>>>>>>>>> with
>>>>>>>>>>>>>> fingerprint D9839159 [3],
>>>>>>>>>>>>>> * all artifacts to be deployed to the Maven Central Repository
>>>>>>>>> [4],
>>>>>>>>>>>>>> * source code tag "release-1.12.0-rc3" [5]
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> We will soon publish the PR for the release announcement blog
>>>>>>>>> post!
>>>>>>>>>>>>>> The vote will be open for at least 72 hours. It is adopted by
>>>>>>>>>> majority
>>>>>>>>>>>>>> approval, with at least 3 PMC affirmative votes.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Thanks,
>>>>>>>>>>>>>> Dian & Robert
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> [1a]
>>>>>>>>>>>>>>
>>>>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12348263
>>>>>>>>>>>>>> [1b]https://github.com/apache/flink/pull/14195
>>>>>>>>>>>>>> [2]
>>>>>>>>> https://dist.apache.org/repos/dist/dev/flink/flink-1.12.0-rc3/
>>>>>>>>>>>>>> [3]https://dist.apache.org/repos/dist/release/flink/KEYS
>>>>>>>>>>>>>> [4]
>>>>>>>>>>>>>>
>>>>>>> https://repository.apache.org/content/repositories/orgapacheflink-1404
>>>>>>>>>>>>>> [5]
>>>>>>>>> https://github.com/apache/flink/releases/tag/release-1.12.0-rc3
>>>>>>>>>> --
>>>>>>>>>> Best regards!
>>>>>>>>>> Rui Li
>>>>>>>>>>
>