Arjun Prakash created FLINK-15215:
-------------------------------------
Summary: Not able to provide a custom AWS credentials provider with flink-s3-fs-hadoop
Key: FLINK-15215
URL: https://issues.apache.org/jira/browse/FLINK-15215
Project: Flink
Issue Type: Bug
Components: FileSystems
Affects Versions: 1.9.0
Reporter: Arjun Prakash
I am using Flink 1.9.0 on EMR (emr-5.28) with a StreamingFileSink that writes to an S3 filesystem.
I want Flink to write to an S3 bucket owned by another AWS account, and I want to do that by assuming a role in that account. This can easily be accomplished with a custom credential provider (similar to [this EMRFS setup|https://aws.amazon.com/blogs/big-data/securely-analyze-data-from-another-aws-account-with-emrfs/]).
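For reference, here is a minimal sketch of the kind of provider I want to plug in, built on the AWS SDK v1 classes that s3a uses; the class name, role ARN, and session name below are placeholders:
{code:java}
package com.example;

import java.net.URI;

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.STSAssumeRoleSessionCredentialsProvider;
import org.apache.hadoop.conf.Configuration;

// Placeholder cross-account provider: vends credentials for a role in the bucket's account.
public class CrossAccountCredentialsProvider implements AWSCredentialsProvider {

    private final STSAssumeRoleSessionCredentialsProvider delegate;

    // s3a prefers a (URI, Configuration) constructor when instantiating provider classes.
    public CrossAccountCredentialsProvider(URI uri, Configuration conf) {
        this.delegate = new STSAssumeRoleSessionCredentialsProvider.Builder(
                "arn:aws:iam::111122223333:role/cross-account-s3-access", // placeholder role ARN
                "flink-s3-session")
                .build();
    }

    @Override
    public AWSCredentials getCredentials() {
        return delegate.getCredentials();
    }

    @Override
    public void refresh() {
        delegate.refresh();
    }
}
{code}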
As described [here|#pluggable-file-systems], I copied {{flink-s3-fs-hadoop-1.9.0.jar}} to the plugins directory. But the value of the {{fs.s3a.aws.credentials.provider}} configuration parameter gets shaded ([S3FileSystemFactory.java#L47|https://github.com/apache/flink/blob/master/flink-filesystems/flink-s3-fs-hadoop/src/main/java/org/apache/flink/fs/s3hadoop/S3FileSystemFactory.java#L47]), as do all the AWS SDK dependencies. So when I provided a custom credential provider, it complained that my class does not implement the correct interface ({{AWSCredentialsProvider}}).
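Concretely, the setup that fails looks like this (the provider class is the placeholder from the sketch above):
{code}
# flink-conf.yaml -- rejected, because the plugin's shaded s3a expects the
# relocated AWSCredentialsProvider interface, not the vanilla one my class implements
fs.s3a.aws.credentials.provider: com.example.CrossAccountCredentialsProvider
{code}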
The fix made in [FLINK-13044|https://issues.apache.org/jira/browse/FLINK-13044] lets users pick one of the built-in credential providers, like {{InstanceProfileCredentialsProvider}}, but it still does not help with providing a custom credential provider.
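For comparison, this built-in-provider configuration does work after FLINK-13044, since the class name gets mapped to its shaded equivalent:
{code}
# Works: built-in SDK providers are translated to their shaded names
fs.s3a.aws.credentials.provider: com.amazonaws.auth.InstanceProfileCredentialsProvider
{code}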
Related: [FLINK-13602|https://issues.apache.org/jira/browse/FLINK-13602]