How we used Rails Active Storage and got it working with Amazon S3.
If you are new to Active Storage, here are a couple of useful links:
- A great video introduction to Active Storage: File uploading with ActiveStorage in Rails 5.2
- A useful overview on creating your bucket on Amazon S3: Tutorial: How to use Amazon S3 and CloudFront CDN to serve images fast and cheap
For our use case we are uploading PDF documents directly to the cloud. We have no post-processing to do on the server, so we get a nice clean separation between the app and file storage.
Part 1: Create and configure your S3 bucket
1 Create your bucket:
In the Amazon S3 interface, we created our bucket with the same name as our site, e.g. “www.mySite.com”.
2 Grant access permissions:
We were already using AWS credentials to deploy and run on Elastic Beanstalk, so we initially tried reusing the same user, making sure it had the required permissions on the bucket: s3:ListBucket, s3:PutObject, s3:GetObject, and s3:DeleteObject.
Although “you can grant permission to an AWS user by the canonical user ID”, getting the canonical user ID can be elusive. Instead we first tried adding a policy on the bucket to allow our IAM user access, but we eventually created a dedicated ActiveStorage group with its own ActiveStorage IAM user, so that we could easily revoke or replace the credentials should we, say, commit them to GitHub.
In the process you are likely to use the AWS Policy Generator to write the policy, then copy and paste it into the Bucket policy editor and save. On saving we got this error:
Error
Action does not apply to any resource(s) in statement
We had to handle the “s3:ListBucket” action separately to allow the policy to save, as you can see below. We also added a wild card “*” at the end of the Resource value for actions that apply to the bucket contents. Here is how our ActiveStorage group policy looked after saving:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListAllMyBuckets"
      ],
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::www.mysite.org",
        "arn:aws:s3:::www.mysite.com"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:GetObject",
        "s3:GetObjectAcl",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::www.mysite.org/*",
        "arn:aws:s3:::www.mysite.com/*"
      ]
    }
  ]
}
3 Update CORS configuration:
Adding the CORS configuration will prevent or solve this ajax error when you attempt to directly upload from client to cloud:
Cross-Origin Request Blocked… (Reason: CORS header ‘Access-Control-Allow-Origin’ missing).
Under the bucket’s Properties / CORS Configuration, replace the existing configuration and save. For a quick test, use this wide-open configuration:
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>Authorization</AllowedHeader>
  </CORSRule>
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
- thanks to Tom Rossi for this.
Once you are done experimenting, tie it down to known origins, such as ‘https://{anything}.mysite.com’. Note the wildcard ‘*’ in the AllowedOrigin below:
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <AllowedOrigin>https://*.mysite.com</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>Authorization</AllowedHeader>
  </CORSRule>
  <CORSRule>
    <AllowedOrigin>https://*.mysite.com</AllowedOrigin>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
Part 2: Update your Rails app
1 Declare an S3 service in config/storage.yml:
This part took some experimentation to get right. Initially we assumed we could use environment variables as we do elsewhere, but S3 consistently returned 403 Forbidden no matter what we tried with Amazon S3 permissions. The following configuration didn’t work:
# config/storage.yml (this didn't work)
amazon:
  service: S3
  access_key_id: <%= ENV["AWS_ACCESS_KEY_ID"] %>
  secret_access_key: <%= ENV["AWS_SECRET_KEY"] %>
  region: us-west-1
  bucket: www.mySite.com
The Rails Active Storage documentation appears to suggest you can remove access_key_id, secret_access_key, and region from config/storage.yml, but if you do you will get an error like this in production:
/opt/rubies/ruby-2.4.2/lib/ruby/gems/2.4.0/gems/aws-partitions-1.71.0/lib/aws-partitions/endpoint_provider.rb:82:in
`block in partition_matching_region': Cannot load
`Rails.config.active_storage.service': (NoMethodError) undefined method
`match' for nil:NilClass
What did work was committing the actual keys for the ActiveStorage IAM user we had created in Amazon S3 just for this purpose. (We also precompiled the assets, though we are not sure whether that was significant.)
# config/storage.yml (this works)
amazon:
  service: S3
  access_key_id: QWERJKLQCMCNVTNDSIFEGJS
  secret_access_key: tis2CaJpYDRuw0CkpFY15d3f4u1VyGudZxhYHTyD
  region: us-west-1
  bucket: www.mySite.com
But you don’t want your unencrypted AWS credentials committed to source control! Instead you can encrypt them using the new Rails 5.2 encrypted credentials (which is a separate topic…).
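For reference, here is a minimal sketch of what config/storage.yml could look like once the keys live in the Rails 5.2 encrypted credentials file; it assumes you have added an aws section with access_key_id and secret_access_key via bin/rails credentials:edit:
# config/storage.yml (sketch: keys read from encrypted credentials)
amazon:
  service: S3
  access_key_id: <%= Rails.application.credentials.dig(:aws, :access_key_id) %>
  secret_access_key: <%= Rails.application.credentials.dig(:aws, :secret_access_key) %>
  region: us-west-1
  bucket: www.mySite.com
Whichever way you supply the keys, remember that each environment also needs to be pointed at this service, e.g. config.active_storage.service = :amazon in config/environments/production.rb.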
2 Update your model:
# app/models/document.rb
class Document < ApplicationRecord
  has_one_attached :object
end
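If you want to sanity-check the wiring before touching the views, here is a rough Rails console sketch. It assumes the Active Storage tables exist (i.e. you have run rails active_storage:install and rails db:migrate), that Document has no other required attributes, and that a PDF exists at the example path shown:
# Rails console sketch -- the file path and filename are just examples
document = Document.create!
document.object.attach(
  io: File.open("/tmp/example.pdf"),
  filename: "example.pdf",
  content_type: "application/pdf"
)
document.object.attached? # => true once the blob is stored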
3 Add the upload field in the ‘form’ view:
We are uploading PDFs directly from client to cloud, so the direct_upload option is set to true.
# app/views/documents/_form.html.erb
<%= f.file_field :object,
      multiple: false,
      direct_upload: true,
      class: "btn btn-success" %>
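For context, the field above sits inside an ordinary Rails form builder; here is a minimal sketch of the surrounding form, assuming the controller sets @document:
# app/views/documents/_form.html.erb (sketch of the surrounding form)
<%= form_with model: @document, local: true do |f| %>
  <%= f.file_field :object,
        multiple: false,
        direct_upload: true,
        class: "btn btn-success" %>
  <%= f.submit "Upload" %>
<% end %>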
4 Ensure the activestorage JavaScript is included in the asset pipeline; ours was already:
# app/assets/javascripts/application.js
...
//= require activestorage
...
5 Display the download link in the ‘Show’ view
# app/views/documents/show.html.erb
<%= link_to "Document",
      rails_blob_path(document.object, disposition: "attachment") %>
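Since document.object may not have anything attached yet, it can be worth guarding the link; a small sketch:
# app/views/documents/show.html.erb (sketch with a guard)
<% if document.object.attached? %>
  <%= link_to "Document",
        rails_blob_path(document.object, disposition: "attachment") %>
<% else %>
  <p>No document uploaded yet.</p>
<% end %>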
6 (Optional) Add a progress bar to your upload form
File uploads can take a while with nothing apparently happening. Consider building a better user experience by simply adding ‘direct_uploads.js’ and ‘direct_uploads.css’, available from here: JavaScript events. They work as-is with no change required to your view, and of course you can modify them as you like.
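If you are on the classic asset pipeline like us, one way to wire those snippets in (assuming you save them as app/assets/javascripts/direct_uploads.js and app/assets/stylesheets/direct_uploads.css) is to require them from your manifests:
# app/assets/javascripts/application.js
...
//= require direct_uploads
...

# app/assets/stylesheets/application.css (inside the /* ... */ header)
...
 *= require direct_uploads
...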
To support the experimental ES6 syntax in that JS we passed the harmony: true option to Uglifier in our production.rb configuration:
# config/environments/production.rb
...
require 'uglifier'
config.assets.js_compressor = Uglifier.new(harmony: true)
...
To conclude
With just these few steps it is possible to upload files directly from the client to S3 on submitting the form, and to download them from a link in the view.
It was not necessary to make the S3 bucket publicly accessible to achieve this.