
Hosting your website using S3 is fairly common practice on AWS. It’s cheap, requires no servers, and even supports client-side web apps with frameworks like React. To make things easier, we’ll set up a deployment pipeline to handle updates.

How Does This Work?

For an S3 site, you can push updates by running aws s3 sync, or by uploading new objects to the bucket by hand. That works, but it's tedious and error-prone, and it's easy to automate.
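For reference, the manual approach looks something like this; a minimal sketch, assuming your production build lands in ./build and your bucket is named my-site-bucket (both placeholders):

```shell
# Build the site locally, then mirror the build folder to the bucket.
# --delete removes objects from the bucket that no longer exist locally,
# so stale assets don't linger after a redeploy.
npm run build
aws s3 sync ./build s3://my-site-bucket --delete
```

The pipeline below ends up running essentially this same sync, just automatically on every commit.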

Using AWS’s CodePipeline CI/CD service, you can configure a pipeline to listen for changes in your source control. Whenever a change is detected, CodePipeline will send the source over to CodeBuild, which will handle building the project. For web apps, this most notably includes running npm run build, which will package up project assets into a production build. The build is sent directly to S3, updating the application.

Note that if your project doesn’t require any sort of building with npm, and you just want to sync your Git repo to S3, the setup is even simpler. Simply configure the source stage to connect to your source control, skip the build stage, then choose “AWS S3” for the deployment step. Enter your bucket details, and the contents of your repo will be synced to the bucket on every update.

Setting Up the Pipeline

From the CodePipeline Console, click “Create New Pipeline,” give it a name, and choose to create a new service role.

create new pipeline

For the source stage, select your Git repository. If you’re using AWS’s own CodeCommit, you can select the repo and branch. Otherwise, you’ll have to connect your GitHub or Bitbucket account. GitLab is not supported.

source stage

For the build stage, select “CodeBuild,” and create a new build project.

create build project

This will open up a dialog window, which will autoconfigure the source for the new CodeBuild project to use CodePipeline. You’ll have to configure your environment here, select an OS, and choose a runtime version. You’ll also want to create a new service role, which will have to be modified to be able to access your deployment bucket.

configure build environment

For the build configuration, choose to use a buildspec file.

buildspec file

A buildspec is a YAML file that defines the commands CodeBuild will run, and its contents will vary depending on your application. For this example, we’ll assume you’re building your JavaScript application with npm. Paste the following into a new file called buildspec.yml, placed at the root of your repository.

version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 10
    commands:
      - npm i npm@latest -g
  pre_build:
    commands:
      - npm install
  build:
    commands:
      - npm run build
  post_build:
    commands:
      - aws s3 sync ./build s3://bucket-name

This buildspec actually takes care of deployment to S3 as well: during the post_build phase, CodeBuild runs aws s3 sync to sync the ./build folder with the specified bucket.

Click create on the build project, which will take you back to CodePipeline. Click “Next,” and choose to skip the deployment stage, since the buildspec already handles deployment. Create the pipeline.

For the first run, the build stage will fail, because the CodeBuild service role doesn’t have access to S3. Head over to the IAM Management Console, find the CodeBuild role under “Roles,” and attach a new policy that allows access to S3. Ideally, you should scope this policy to the deployment bucket only.
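A scoped-down policy might look like the following; a sketch, assuming the bucket is named my-site-bucket (a placeholder). The first statement lets CodeBuild list the bucket (which aws s3 sync needs to compare contents), and the second lets it read, write, and delete the objects inside it:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListDeployBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-site-bucket"
    },
    {
      "Sid": "ReadWriteDeployObjects",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::my-site-bucket/*"
    }
  ]
}
```

Note that the bucket itself and the objects in it use different ARNs, which is why the two statements are separate.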

add policy to role

Head back over to CodePipeline, and click “Release Change” on your pipeline to manually trigger a pipeline update. If nothing is wrong with your Buildspec, you should see the pipeline succeed.

pipeline succeeded

If you make a commit to Git, a new build will be spun up, and S3 will be updated with the build artifacts. If there are any errors in the build, CodePipeline will stop before updating.

If your pipeline is failing, you can click on “CodeBuild” in the sidebar to view the logs for the most recent build, which will help you track down errors.
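You can also pull the same information from the CLI; a sketch, assuming your build project is named my-site-build (a placeholder, as is the build ID below):

```shell
# List recent build IDs for the project, newest first.
aws codebuild list-builds-for-project --project-name my-site-build

# Fetch details for a specific build, including its log location and
# the status of each phase, to pinpoint where it failed.
aws codebuild batch-get-builds --ids my-site-build:example-build-id
```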

Anthony Heddings
Anthony Heddings is the resident cloud engineer for LifeSavvy Media, a technical writer, programmer, and an expert at Amazon's AWS platform. He's written hundreds of articles for How-To Geek and CloudSavvy IT that have been read millions of times.