BYO bucket for v2 functions uploads with runfunctions enabled #8980
base: master
Conversation
Summary of Changes
Hello @inlined, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!
This pull request introduces a significant change to how Cloud Functions (v2) source code is uploaded, particularly when the 'runfunctions' experiment is enabled. The core idea is to shift from using Google Cloud Functions' API-generated upload URLs to a 'Bring Your Own' (BYO) Google Cloud Storage bucket approach. This custom bucket will be automatically managed, including a 1-day retention policy for uploaded source files to optimize costs. This change facilitates progressive testing of new deployment mechanisms and aims to provide more control over the source upload process.
Highlights
- Conditional Source Upload Logic: Introduces a new source code upload mechanism for Cloud Functions (v2) and 'run' functions, which is activated when the 'runfunctions' experiment flag is enabled. This allows for progressive testing of new deployment flows.
- Bring Your Own (BYO) Bucket for Source Uploads: When the 'runfunctions' experiment is active, the system will now use a dedicated, user-owned Google Cloud Storage bucket (`firebase-functions-src-<projectNumber>`) for uploading function source code, moving away from relying on the GCF API's generated upload URLs.
- Automated Cost Management: The newly created custom GCS buckets for source uploads are configured with a lifecycle rule to automatically delete objects after 1 day, which is the lowest available retention policy, helping to keep storage costs under control.
- Improved Testability: Refactored the `createReadStream` function to be exportable, which enhances the testability of the source upload process by allowing easier mocking in unit tests.
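The BYO-bucket highlight above can be sketched roughly as follows. The request shape mirrors the GCS buckets API's lifecycle configuration, but the helper name and exact fields here are illustrative assumptions, not the PR's actual code.

```typescript
// Illustrative sketch (not this PR's code) of the bucket metadata a BYO-bucket
// flow could send when upserting firebase-functions-src-<projectNumber>.
interface LifecycleRule {
  action: { type: "Delete" };
  condition: { age: number }; // days since object creation
}

function sourceBucketRequest(projectNumber: string): {
  name: string;
  lifecycle: { rule: LifecycleRule[] };
} {
  return {
    name: `firebase-functions-src-${projectNumber}`,
    lifecycle: {
      // Delete uploaded source archives after 1 day (the minimum age),
      // so storage costs stay bounded as deploys accumulate.
      rule: [{ action: { type: "Delete" }, condition: { age: 1 } }],
    },
  };
}
```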
Using Gemini Code Assist
The full guide for Gemini Code Assist can be found on our documentation page, here are some quick tips.
Invoking Gemini
You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e. a comment on the pull request page) using either `/gemini <command>` or `@gemini-code-assist <command>`. Below is a summary of the supported commands.
| Feature | Command | Description |
| --- | --- | --- |
| Code Review | `/gemini review` | Performs a code review for the current pull request in its current state. |
| Pull Request Summary | `/gemini summary` | Provides a summary of the current pull request in its current state. |
| Comment | `@gemini-code-assist` | Responds in comments when explicitly tagged, both in issue comments and review comments. |
| Help | `/gemini help` | Displays a list of available commands. |
Customization
To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a `.gemini/` folder in the base of the repository. Detailed instructions can be found here.
Limitations & Feedback
Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.
You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.
Footnotes
1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double-check it and use code with caution. ↩
Code Review
This pull request introduces a new source upload mechanism for v2 functions when the `runfunctions` experiment is enabled. Instead of using a GCF-generated signed URL, it now creates a dedicated GCS bucket for the project and uploads the source code there. This is a solid approach to decouple from GCF-specific APIs and provides more control. The implementation is well contained behind the experiment flag, and the new logic is thoroughly covered by unit tests. The related changes in `gcp/storage.ts`, such as the new `upsertBucket` function, are robust and improve reusability. Overall, this is a well-executed change.
src/deploy/functions/deploy.ts (Outdated)

```ts
// New behavior: BYO bucket since we're moving away from the GCF API.
// Using "func-" as the prefix to balance clarity and avoid name length limits.
// TODO: can wire project number through to ensure we fit in the 63 char limit.
```
@annajowang Here's the PR I told you about offline. After writing the initial version, I looked up code to answer Abhii's question and realized there's a lot of overlap. I consolidated the code (where FAH's was better than what I had!). While I was there I got nerd-sniped; I simplified some code, replaced Map with the canonical Record, and made some stuff more parallelized. /gemini review
Code Review
This pull request introduces a significant change for v2 functions under the `runfunctions` experiment, switching to a 'Bring Your Own' bucket for source uploads instead of using GCF's generated URLs. This includes a 1-day object retention policy for cost management. A new `upsertBucket` utility has been added to `gcp/storage.ts` to streamline bucket creation, and it's well-tested. Additionally, there's a widespread and beneficial refactoring across the App Hosting code, replacing `Map` objects with `Record<string, ...>` for better serializability and consistency. The use of `Promise.all` with `map` also improves performance by parallelizing operations.

I've found one critical issue in `src/deploy/apphosting/release.ts` where a filter operation doesn't modify the array as intended, which could lead to runtime errors. I've also pointed out a leftover TODO comment that should be removed. Overall, the changes are well-structured and the new functionality is thoroughly tested.
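As a quick illustration of why the `Map`-to-`Record` refactor mentioned above helps serializability (example code assumed, not taken from the PR): `JSON.stringify` silently drops a `Map`'s entries, while a plain `Record` round-trips as expected.

```typescript
// A Map's entries are lost by JSON.stringify, which serializes it as an
// empty object; a Record (plain object) serializes its entries normally.
const asMap = new Map<string, number>([["region", 1]]);
const asRecord: Record<string, number> = { region: 1 };

const mapJson = JSON.stringify(asMap);       // "{}"
const recordJson = JSON.stringify(asRecord); // '{"region":1}'
```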
Codecov Report

❌ Patch coverage

Additional details and impacted files:

```
@@            Coverage Diff            @@
##           master    #8980     +/-  ##
=========================================
- Coverage   50.93%   49.27%    -1.67%
=========================================
  Files         510      670      +160
  Lines       33396    38355     +4959
  Branches     6934     7649      +715
=========================================
+ Hits        17011    18900     +1889
- Misses      14886    17930     +3044
- Partials     1499     1525       +26
```

☔ View full report in Codecov by Sentry.
src/deploy/apphosting/deploy.ts (Outdated)

```ts
  file: zippedSourcePath,
  stream: fs.createReadStream(zippedSourcePath),
},
`firebaseapphosting-sources-${options.projectNumber}-${backendLocation.toLowerCase()}`,
```
This is the same long string as the prefix of the one below. Can this be a shared string const?
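The reviewer's suggestion could look roughly like this; the constant and helper names are hypothetical, not the PR's actual identifiers:

```typescript
// Hypothetical shared constant so the bucket-name prefix appears only once
// instead of being repeated at each call site.
const APPHOSTING_SOURCES_PREFIX = "firebaseapphosting-sources";

function apphostingSourceBucket(projectNumber: string, location: string): string {
  // Bucket names must be lowercase, so normalize the location segment.
  return `${APPHOSTING_SOURCES_PREFIX}-${projectNumber}-${location.toLowerCase()}`;
}
```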
src/gcp/storage.ts (Outdated)

```ts
const localAPIClient = new Client({ urlPrefix: url.origin, auth: false });
): Promise<{ generation: string | null }> {
const url = new URL(uploadUrl, storageOrigin());
const signedUrl = url.searchParams.has("GoogleAccessId");
```
nit: `isSignedUrl`
good for functions
```ts
condition: {
  age: 30,
},
await Promise.all(
```
nit: consider `.allSettled`?

(aside - I really dislike the boilerplate required to properly catch and surface errors when using `.allSettled` - wondering if you know of an easier way?)
`allSettled` would be new error handling not present in the previous code, so I'll continue a conversation on whether this needs to be a follow-up.
Given that this is deploy not release and has no changes, what behavior would we want to have if we have partial success?
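For reference, the `allSettled` boilerplate this thread alludes to usually looks like the sketch below (names assumed): every rejection has to be collected and re-thrown by hand, whereas `Promise.all` implicitly rejects on the first failure.

```typescript
// Sketch of aggregating partial failures with Promise.allSettled. The
// `uploads` parameter stands in for the per-function upload promises
// discussed above; it is an assumption, not the PR's actual signature.
async function uploadAll(uploads: Array<Promise<string>>): Promise<string[]> {
  const results = await Promise.allSettled(uploads);
  const failures = results.filter(
    (r): r is PromiseRejectedResult => r.status === "rejected",
  );
  if (failures.length) {
    // Surface every failure, not just the first one Promise.all would throw.
    throw new Error(failures.map((f) => String(f.reason)).join("; "));
  }
  return results.map((r) => (r as PromiseFulfilledResult<string>).value);
}
```

The trade-off the thread raises: with `Promise.all`, some uploads may still be in flight when the first rejection propagates, while `allSettled` waits for everything and can report all failures at the cost of this aggregation code.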
```ts
const uploadOpts = {
  file: source.functionsSourceV2!,
  stream: fs.createReadStream(source.functionsSourceV2!),
  stream: (exports as { createReadStream: typeof createReadStream }).createReadStream(
```
aside: dang why wouldn't `exports.createReadStream` work here??
This is what we do elsewhere so I did it here. It works but you don't get type enforcement.
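The mechanism behind this pattern can be shown with a small self-contained sketch (names here are assumptions, using a plain object in place of the CommonJS `exports` object): the property is looked up on the owning object at call time, so a test stub installed on that object takes effect, whereas calling a locally bound function would bypass the stub.

```typescript
// Minimal illustration of call-time property lookup, which is what makes
// sinon.stub(obj, "createReadStream")-style mocking work.
const uploader = {
  createReadStream(path: string): string {
    return `real:${path}`; // stand-in for fs.createReadStream
  },
  buildUploadOpts(path: string) {
    // Lookup through `uploader` (analogous to `exports` in the PR) happens
    // on every call, so a replaced property is what actually runs.
    return { file: path, stream: uploader.createReadStream(path) };
  },
};

// What a test's stub does, written out by hand:
const original = uploader.createReadStream;
uploader.createReadStream = (p: string) => `stubbed:${p}`;
const opts = uploader.buildUploadOpts("source.zip"); // uses the stub
uploader.createReadStream = original;                // restore
```

The `(exports as { createReadStream: typeof createReadStream })` cast in the diff serves only to restore the type checking that a bare `exports` lookup loses.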
```ts
await gcs.upload(uploadOpts, res.uploadUrl, undefined, true /* ignoreQuotaProject */);
return res.storageSource;

// New behavior: BYO bucket since we're moving away from the GCF API.
```
can we leave this comment out for now?
Will adjust to say while using the runfunctions experiment.
```ts
rule: [
  {
    action: { type: "Delete" },
    condition: { age: 1 }, // Delete objects after 1 day
```
is it yagni if we make this configurable?
(e.g. we occasionally get support request from users wanting to "get their source code")
Let's start with a safe default (not spending unbounded money as time progresses) and add configurability as a feature in the future
As a way to test things progressively, when the "runfunctions" experiment is enabled, always choose to use your own bucket rather than GCF's generated upload URL. To keep costs under control, objects are deleted after 1 day, which is the lowest available retention policy.

Verified to work with the runfunctions experiment both enabled and disabled.