Original author: Arsany Milad - Engineer @ Stackdrop
Retool's PDF component uses fetch() to load files. Browsers enforce CORS on fetch requests. If your S3 bucket isn't configured to allow requests from Retool's origin, the request gets blocked silently and the component renders nothing. Adding Retool's domain to the bucket's CORS configuration fixes it.
Why does a signed S3 URL load correctly in a browser tab but fail silently inside Retool's PDF component?
On a random Friday I found myself working on an invoicing feature: add expense info, upload files, save, done. But when I tried to open the uploaded PDF inside the app it just showed "PDF couldn't be loaded."
![PDF could not be loaded error in Retool]

The Retool console wasn't helpful, so I checked the browser console and found this:
```
Access to fetch at 'https://<s3-endpoint>/<path>/document.pdf?<signed-url-params>'
from origin 'https://app.retool.com' has been blocked by CORS policy:
No 'Access-Control-Allow-Origin' header is present on the requested resource.
```
The signed URL was valid — pasting it directly into the browser tab opened the PDF without issue. The problem was specific to the PDF component.
Here's why. When you paste a URL into the address bar, the browser performs a top-level navigation request, and CORS rules don't apply to navigation. When Retool's PDF component loads a file, it calls fetch() internally. A fetch() to another origin is a cross-origin request, and the browser enforces CORS on it. If the S3 bucket's CORS configuration doesn't include Retool's domain in AllowedOrigins, the browser blocks the response before the component receives anything.
Both the URL and the signature are valid. The bucket just hasn't been told that Retool's domain is allowed to make cross-origin requests to it.
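The rule the browser applies can be sketched as a small predicate. This is an illustration of the CORS check, not browser or Retool source code, and `cors_allows` is a hypothetical helper name:

```python
# Illustrative sketch of the decision a browser makes before exposing a
# cross-origin fetch() response to the caller.
def cors_allows(response_headers, request_origin):
    """Return True if the browser would hand this response to fetch()."""
    allowed = response_headers.get("Access-Control-Allow-Origin")
    # No header at all -> blocked. This is the S3 default when the bucket
    # has no CORS configuration.
    if allowed is None:
        return False
    # "*" allows any origin; otherwise the value must match exactly.
    return allowed == "*" or allowed == request_origin

# An unconfigured bucket sends no ACAO header, so the response is blocked:
cors_allows({}, "https://app.retool.com")  # False
# Once the bucket lists Retool's origin, the same response is allowed:
cors_allows({"Access-Control-Allow-Origin": "https://app.retool.com"},
            "https://app.retool.com")  # True
```

Note that the blocking happens client-side: S3 already sent the bytes, but the browser refuses to let the component read them.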
How do you configure S3 CORS to allow Retool's PDF component to load signed URLs?
This is a CORS configuration change on the S3 bucket. No IAM changes, no bucket policy edits, no changes to how you generate signed URLs.
Step 1: Open the bucket in AWS
- Log into the AWS Console
- Navigate to S3
- Select the bucket your Retool app is reading from
- Go to Permissions
- Scroll to CORS configuration
- Click Edit
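If you prefer to script this instead of clicking through the console, the same rules can be pushed with the AWS SDK. This is a minimal sketch assuming boto3, credentials with `s3:PutBucketCors`, and a placeholder bucket name; `apply_retool_cors` is my own helper name:

```python
# The same CORS rules as the console config, expressed for boto3's
# put_bucket_cors API (which wraps them in a "CORSRules" list).
RETOOL_CORS = {
    "CORSRules": [
        {
            "AllowedHeaders": ["*"],
            "AllowedMethods": ["GET", "HEAD"],
            "AllowedOrigins": ["https://app.retool.com"],
            "ExposeHeaders": ["ETag", "Content-Type", "Content-Disposition"],
            "MaxAgeSeconds": 3000,
        }
    ]
}

def apply_retool_cors(bucket_name):
    """Push the CORS rules to the bucket (requires s3:PutBucketCors)."""
    import boto3  # imported here so the config dict is usable without boto3
    s3 = boto3.client("s3")
    s3.put_bucket_cors(Bucket=bucket_name, CORSConfiguration=RETOOL_CORS)
```

Usage would be `apply_retool_cors("your-invoice-bucket")` with the bucket your Retool app reads from.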
Step 2: Add the CORS configuration
```json
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "HEAD"],
    "AllowedOrigins": [
      "https://app.retool.com"
    ],
    "ExposeHeaders": [
      "ETag",
      "Content-Type",
      "Content-Disposition"
    ],
    "MaxAgeSeconds": 3000
  }
]
```
If your Retool instance runs on a custom subdomain (e.g. https://yourcompany.retool.com), replace https://app.retool.com with your actual domain. Use only the specific domains you control — wildcards like *.retool.com cover every Retool-hosted app across all customers, not just yours.
What each field does:
| Field | Purpose |
|---|---|
| `AllowedOrigins` | Tells S3 which domains are permitted to make cross-origin requests |
| `AllowedMethods` | Restricts which HTTP methods those origins can use |
| `AllowedHeaders` | Permits the headers Retool includes in its fetch requests |
| `ExposeHeaders` | Makes file metadata (content type, disposition, ETag) readable by the browser |
| `MaxAgeSeconds` | Controls how long the browser caches the preflight response |
Note on PUT: The template above uses `GET` and `HEAD` only, which is enough for rendering PDFs. If your app also uploads files directly to S3 from the browser via presigned upload URLs (as in my case with invoice uploads), add `"PUT"` to `AllowedMethods`. If uploads go through your backend, you don't need it.
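With direct browser uploads, the only line that changes in the template above is the methods list; origins and headers stay the same:

```json
"AllowedMethods": ["GET", "HEAD", "PUT"]
```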
Step 3: Save and verify
Save the configuration. Hard-refresh Retool with Ctrl+Shift+R. Open DevTools → Network tab, reload the PDF component. The S3 response should now include:
```
Access-Control-Allow-Origin: https://app.retool.com
```
![PDF rendering correctly after CORS fix]

How should S3 CORS be configured when Retool has separate production and staging environments?
List each domain explicitly in AllowedOrigins:
```json
"AllowedOrigins": [
  "https://yourapp.retool.com",
  "https://yourapp-staging.retool.com"
]
```
Keep this list as narrow as possible. A wildcard ("*") on a private bucket allows cross-origin requests from any domain. Explicit origins are always the right call on a bucket serving private documents.
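If your CORS config lives in version control, that "keep it narrow" rule is easy to enforce automatically. A sketch of such a guard; the function name and the specific checks are my own, not part of any AWS or Retool API:

```python
# Illustrative lint for an AllowedOrigins list: flags wildcards and
# non-HTTPS origins so they can't slip into a private bucket's config.
def check_allowed_origins(origins):
    """Return a list of human-readable problems; empty means the list is OK."""
    problems = []
    for origin in origins:
        if origin == "*":
            problems.append("bare wildcard allows any domain")
        elif "*" in origin:
            # e.g. *.retool.com matches every Retool-hosted app, not just yours
            problems.append(f"{origin}: wildcard covers every matching domain")
        elif not origin.startswith("https://"):
            problems.append(f"{origin}: not an https origin")
    return problems
```

For example, `check_allowed_origins(["https://yourapp.retool.com"])` returns an empty list, while `["*"]` or `["https://*.retool.com"]` each produce a warning.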
When does an S3 CORS configuration for Retool need PUT or POST in AllowedMethods?
For rendering PDFs and images, GET and HEAD are sufficient. You only need PUT if the browser is uploading files directly to S3 via presigned upload URLs initiated from the frontend. If uploads go through your backend or a Retool resource connection, the browser never makes the upload request, so PUT isn't needed.
Only add methods you actively use. Each one you include extends the surface area of what cross-origin requests can do on that bucket.
What are the most common mistakes that cause S3 CORS issues to persist in Retool after applying a fix?
Treating the symptom as a signing issue. The signed URL works in the browser, so the signature is fine. CORS is a separate layer. The URL can be perfectly valid and still get blocked at the fetch level.
Using "*" as the allowed origin. This removes the origin restriction entirely. On a private bucket serving sensitive documents, explicit domains are the right call.
Using *.retool.com as the allowed origin. This wildcard covers every Retool-hosted app across all customers, not just yours. Always use your specific Retool domain.
Skipping the hard refresh. Browsers cache preflight responses for the duration set in MaxAgeSeconds. If you test immediately after saving without a hard refresh, the browser may still be acting on the cached response from before the fix. Ctrl+Shift+R clears this.
Editing the wrong bucket. If production and staging use different buckets, confirm which one your Retool app is reading from before making the change.
What is the workaround for loading private S3 files in Retool when the bucket CORS configuration cannot be modified?
Two options.
The first is to use Retool's built-in S3 resource to read the file server-side. Because the request originates from Retool's backend rather than the browser, CORS doesn't apply. The tradeoff is memory management — PDFs are large and you'll need to handle query cleanup after use to avoid memory issues.
The second is to proxy the file through your own backend: fetch the object server-side, encode it as base64, and return it to Retool as a data URI:
```
data:application/pdf;base64,...
```
Same principle — server-to-server requests aren't subject to browser CORS enforcement. The memory and latency cost scales with document size, which becomes significant with large files or high request volume. Modifying the bucket CORS configuration is the better long-term solution where access permits it.
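The encoding step of that proxy can be sketched with the standard library alone. The function name is illustrative, and a real endpoint would first fetch the object bytes from S3 server-side:

```python
import base64

def pdf_to_data_uri(pdf_bytes):
    """Encode raw PDF bytes as a data URI a frontend component can render."""
    encoded = base64.b64encode(pdf_bytes).decode("ascii")
    return f"data:application/pdf;base64,{encoded}"

# Every PDF file begins with the "%PDF-" magic bytes.
uri = pdf_to_data_uri(b"%PDF-1.7 ...")
assert uri.startswith("data:application/pdf;base64,")
```

Base64 inflates the payload by roughly a third, which is part of the memory and latency cost noted above.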
FAQ
Why does a signed S3 URL open correctly in a browser tab but fail to load inside Retool's PDF component?
When you open a signed URL directly in a browser tab, the browser performs a top-level navigation request. CORS rules don't apply to navigation. When Retool's PDF component loads the same URL, it uses fetch() internally. Fetch requests are cross-origin requests and the browser enforces CORS on them. If the S3 bucket's CORS configuration doesn't include Retool's domain in AllowedOrigins, the browser blocks the response before the component receives anything. The PDF component renders nothing and gives no visible error in the UI.
How do I fix a CORS error blocking a signed S3 URL from loading in the Retool PDF component?
Add your specific Retool domain to the CORS configuration on the S3 bucket serving the files. In the AWS Console, go to S3 → your bucket → Permissions → CORS configuration → Edit, and add https://app.retool.com (or your custom subdomain) to AllowedOrigins. Save the config and hard-refresh Retool with Ctrl+Shift+R. This is a bucket-level change — no IAM or bucket policy edits are required.
I updated the S3 CORS configuration but Retool's PDF component is still blocked. What should I check?
The most common cause is a cached preflight response. Browsers cache CORS preflight results for the duration set in MaxAgeSeconds. If you test without a hard refresh (Ctrl+Shift+R), the browser may still be acting on the cached response from before the fix. If a hard refresh doesn't resolve it, open DevTools → Network and confirm Access-Control-Allow-Origin is now present in the S3 response. If it's still missing, verify you edited the correct bucket — production and staging buckets are separate.
Does the S3 CORS fix for Retool's PDF component also apply to image components and other file types?
Any Retool component that loads files using fetch() can be blocked by missing CORS headers. The PDF component is the most common case because PDFs are typically served from private buckets via signed URLs. If you encounter the same symptom with another component type — the URL works in a browser but the component renders nothing — the fix is the same: add Retool's domain to AllowedOrigins on the S3 bucket serving those files.
Is there a workaround for loading private S3 files in Retool when the bucket CORS configuration cannot be modified?
Yes. You can use Retool's built-in S3 resource to fetch the file server-side, or proxy it through your own backend and return it as a base64 data URI (data:application/pdf;base64,...). Both approaches bypass browser CORS enforcement because the request originates from a server rather than the browser. The memory and latency cost scales with document size — modifying the bucket CORS configuration is the better long-term solution where access permits it.
Arsany Milad is a developer at Stackdrop, a Retool-certified agency building governed internal tools for mid-market and enterprise clients across EMEA.