HADOOP-19343: Manage hadoop-gcp Guava version directly in its pom.xml. #7883
Conversation
💔 -1 overall
This message was automatically generated.
```xml
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>33.1.0-jre</version>
</dependency>
```
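For context, a hedged sketch of how this per-module override might sit in the hadoop-gcp `pom.xml`. The enclosing `<dependencies>` block and the comments are illustrative assumptions, not part of the diff above:

```xml
<!-- Hypothetical placement inside hadoop-tools/hadoop-gcp/pom.xml.
     The version is pinned here rather than inherited from the parent,
     to satisfy the GCS SDK's newer Guava requirement. -->
<dependencies>
  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>33.1.0-jre</version>
  </dependency>
</dependencies>
```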
the https://github.com/google/guava/blob/v33.4.8/README.md says:
- APIs without
@Beta
will remain binary-compatible for the indefinite
future. (Previously, we sometimes removed such APIs after a deprecation
period. The last release to remove non-@Beta
APIs was Guava 21.0.) Even
@Deprecated
APIs will remain (again, unless they are@Beta
). We have no
plans to start removing things again, but officially, we're leaving our
options open in case of surprises (like, say, a serious security problem).
So Guava appears to guarantee backward compatibility since 22, except for `@Beta` APIs? Assuming the GCS client does not use any `@Beta` APIs, it seems safe and simple to always use the latest Guava across the whole Hadoop project.
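One coarse way to sanity-check that assumption is to scan the client jars for references to the `@Beta` annotation descriptor. The helper below is a hypothetical sketch, not part of this PR; it only does a byte-level search of each class file (the annotation descriptor string appears in the constant pool of any class that uses it):

```python
import zipfile

# Descriptor string that appears in a class's constant pool when it
# references Guava's @Beta annotation.
BETA_DESC = b"Lcom/google/common/annotations/Beta;"

def classes_referencing_beta(jar_path):
    """Return names of .class entries in the jar whose bytes mention the
    Guava @Beta annotation descriptor (coarse but effective)."""
    hits = []
    with zipfile.ZipFile(jar_path) as jar:
        for name in jar.namelist():
            if name.endswith(".class") and BETA_DESC in jar.read(name):
                hits.append(name)
    return hits
```

Running this over the GCS connector jar and its transitive dependencies would give a quick signal before committing to a project-wide upgrade.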
Interesting, maybe it's safe to do the upgrade then.
I propose that we move ahead with this PR on the feature branch, using a targeted approach just for the hadoop-gcp module. The decision to upgrade for the whole project is a separate discussion and deserves to be tracked in a dedicated bug/PR/release note.
@arunkumarchacko is helping with a little more testing before I commit this.
> The decision to upgrade for the whole project is a separate discussion and deserves to be tracked in a dedicated bug/PR/release note.

Makes sense.
Thanks to @arunkumarchacko for additional testing to confirm that this works!
@cnauroth LGTM. This PR can be merged from my perspective.
Closes #7883 Signed-off-by: Shilun Fan <[email protected]> Reviewed-by: Cheng Pan <[email protected]>
I merged this to the feature branch. Thank you for the reviews, @pan3793 and @slfan1989 .
Oops, I had to revert this. It works fine within the hadoop-gcp module, but the dependency convergence failures then just get pushed down to hadoop-tools-dist. (See below.) This wasn't caught in pre-commit, because at the time the distro wasn't including hadoop-gcp yet. (It was before #7877 went in.) We'll need to explore either upgrading the Guava version for the whole project or something else.
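If the whole-project upgrade route is taken instead, the conventional Maven fix for this kind of dependency convergence failure is to manage the version once in the parent POM. A hedged sketch, where the file location and the `guava.version` property are assumptions for illustration:

```xml
<!-- Hypothetical fragment for the parent hadoop-project/pom.xml.
     A single managed version lets every downstream module, including
     hadoop-tools-dist, converge on the same Guava. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>${guava.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```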
Description of PR
Instead of upgrading the Guava version for the whole project, manage the version specifically needed just in hadoop-gcp due to its GCS SDK dependencies.
How was this patch tested?
Full build with all GCS integration tests.
For code changes:
- Have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?