Nicolas Villanueva


AWS

As mentioned in the previous blog post, I set out a goal to build a static site using only Amazon Web Services products. For the most part, I followed Amazon's blog post about hosting a static website in AWS. Here's a breakdown of all the products I've used for continuous deployment of a static website:

Route 53 - First, I needed a domain name. Using AWS Route 53 was really easy, and I was able to find this domain AND have it hosted in a few minutes. This process was infinitely easier than using GoDaddy or NameCheap.

S3 - AWS S3 (Simple Storage Service) is just a service that stores and serves data. Since I'm only building a static website, this is their recommended solution for serving the content of your webpage out to the world. I created a total of 3 buckets:

  1. Hosting static content of nvillanueva.com
  2. Redirecting traffic of www.nvillanueva.com to S3 Bucket above
  3. Storing code from the build (I'll get into more detail on this later)
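A rough sketch of how these three buckets could be created with the AWS CLI (the artifact bucket name and region here are assumptions; adapt them to your own setup):

```shell
# 1. Bucket hosting the static content, with website hosting enabled
aws s3api create-bucket --bucket nvillanueva.com --region us-east-1
aws s3 website s3://nvillanueva.com --index-document index.html --error-document error.html

# 2. www bucket configured to redirect all requests to the apex domain
aws s3api create-bucket --bucket www.nvillanueva.com --region us-east-1
aws s3api put-bucket-website --bucket www.nvillanueva.com \
  --website-configuration '{"RedirectAllRequestsTo": {"HostName": "nvillanueva.com"}}'

# 3. Bucket holding the build artifacts
aws s3api create-bucket --bucket nvillanueva-build-artifacts --region us-east-1
```

Note that outside us-east-1, create-bucket also needs a --create-bucket-configuration LocationConstraint flag.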

Code Commit - I was surprised to see such a robust offering of services from AWS, but none more so than Code Commit. It's a code repository, similar to GitHub or BitBucket. It has the basics of branching, commit tracking, and triggers. The functionality is basic, but it serves the need for a file versioning platform.
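Working with Code Commit looks much like any other Git host; a minimal sketch (the repository name and region are assumptions):

```shell
# Create the repository (one-time)
aws codecommit create-repository --repository-name nvillanueva.com

# Use the AWS credential helper for HTTPS auth, then clone and push as usual
git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true
git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/nvillanueva.com
cd nvillanueva.com
git add . && git commit -m "Initial commit" && git push
```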

Code Build - In AWS CodeBuild, you can define a build process to execute from a buildspec.yml file in the root folder of your Code Commit repository. This requires a 'directory' for all build artifacts to be published to, and that's the 3rd S3 bucket from earlier. The all-important final build step pushes artifacts to the 1st S3 bucket, hosting nvillanueva.com. Example buildspec.yml file for a yourdomain.com S3 bucket:

```yaml
version: 0.1
phases:
  install:
    commands:
      - echo Install Phase
      - aws configure set s3.signature_version s3v4
  pre_build:
    commands:
      - echo Pre_Build Phase
  build:
    commands:
      - echo Build started on `date`
  post_build:
    commands:
      - echo Build completed on `date`
      - echo Sync over contents of src/ folder to bucket
      - aws s3 sync src s3://yourdomain.com --delete
artifacts:
  files:
    - 'src/*.*'
  discard-paths: no
```

Code Pipeline - With Code Pipeline, I was able to define the Code Commit repo as a 'source' and have it constantly check the main branch for any updates I've made. If anything has changed, the pipeline will automatically kick off the build mentioned above.
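Under the hood, a pipeline like this can be expressed as a JSON definition and created with aws codepipeline create-pipeline. A trimmed sketch, where the pipeline name, role ARN, artifact bucket, and CodeBuild project name are all placeholder assumptions:

```json
{
  "pipeline": {
    "name": "static-site-pipeline",
    "roleArn": "arn:aws:iam::123456789012:role/CodePipelineServiceRole",
    "artifactStore": { "type": "S3", "location": "nvillanueva-build-artifacts" },
    "stages": [
      {
        "name": "Source",
        "actions": [{
          "name": "Source",
          "actionTypeId": { "category": "Source", "owner": "AWS", "provider": "CodeCommit", "version": "1" },
          "configuration": { "RepositoryName": "nvillanueva.com", "BranchName": "main" },
          "outputArtifacts": [{ "name": "SourceOutput" }]
        }]
      },
      {
        "name": "Build",
        "actions": [{
          "name": "Build",
          "actionTypeId": { "category": "Build", "owner": "AWS", "provider": "CodeBuild", "version": "1" },
          "configuration": { "ProjectName": "static-site-build" },
          "inputArtifacts": [{ "name": "SourceOutput" }]
        }]
      }
    ]
  }
}
```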

Overall, I would say this experience was easier than I intended and the documentation from Amazon is GREAT! If you're up for a challenge, I'd say this isn't a bad place to start.

Best,
nv

Update (May 8, 2023): After learning about a new command in the S3 CLI, I've updated the build step from aws s3 cp ... --recursive to aws s3 sync ... --delete.
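The practical difference between the two, as I understand it: cp --recursive only uploads files, while sync --delete also removes objects from the bucket that no longer exist locally, so deleted pages actually disappear from the site. The paths here are illustrative:

```shell
# Old approach: uploads everything, but files removed from src/ linger in the bucket
aws s3 cp src s3://yourdomain.com --recursive

# New approach: uploads only changed files and deletes bucket objects with no local counterpart
aws s3 sync src s3://yourdomain.com --delete
```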
