Amazon AWS can be amazing and interesting to deal with at times. When you host a static website using S3 and Route 53 with server access logging enabled, S3 writes a log file for nearly every request the service receives. To purge these files and avoid accumulating tens of thousands or even millions of useless objects, run this command from your local CLI:
# Delete all objects whose names start with "2018" from the bucket:
aws s3 rm s3://[your-aws-bucket-name]/ --recursive --exclude '*' --include '2018*'
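If you're nervous about a recursive delete (and you should be), the same command accepts a `--dryrun` flag that lists what would be removed without touching anything. The bucket name below is a placeholder:

```shell
# Preview the deletion first -- nothing is actually removed with --dryrun
aws s3 rm s3://your-aws-bucket-name/ --recursive --exclude '*' --include '2018*' --dryrun

# When the listing looks right, run it for real
aws s3 rm s3://your-aws-bucket-name/ --recursive --exclude '*' --include '2018*'
```

Note the order of the filters matters: `--exclude '*'` drops everything first, then `--include '2018*'` adds back only the matching objects.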
I know that I've raved many times about the power of Amazon AWS, and this article will be no different. Amazon S3 provides a nearly free option to host static websites. S3 is, as Amazon describes it, scalable storage in the cloud. What this means is that S3 can essentially serve as a CDN (content delivery network). I use it to host all kinds of assets: website images, high-volume email campaigns, even promotional advertisements.
For my developer friends, S3 is a solution that can significantly reduce, if not altogether eliminate, the cost of having a place to put websites before they're completed, so your clients can see them on the internet.
For one of my attorney clients, I edited their layout locally, created a folder in my existing S3 environment and enabled the option to make the content a static website. The image below shows you exactly how to do this.
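If you prefer the command line to the console, the same setup can be sketched with the AWS CLI. The bucket name and local folder below are placeholders, and you still need a bucket policy that allows public reads for the site to be visible:

```shell
# Upload the local draft site into a folder (prefix) in an existing bucket
aws s3 sync ./client-site/ s3://my-staging-bucket/client-site/

# Enable static website hosting on the bucket
aws s3 website s3://my-staging-bucket/ \
    --index-document index.html --error-document error.html
```

The draft is then reachable at the bucket's website endpoint (the exact hostname depends on your region), which is the URL you send to the client.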
Voila! You have a valid website to email to your clients for approval. And it's effectively free, as long as the site doesn't attract much traffic (bandwidth, measured in gigabytes, is what you pay for).
Amazon AWS is one of the most powerful and popular cloud tools available. I've been thoroughly impressed with its scalability and its ease of use, once it's established. I strongly suggest speaking with or consulting someone with previous knowledge of the toolset; otherwise you can end up with a bill for several hundred or even several thousand dollars for services you weren't aware were running.
While it's cumbersome at first, one of the easiest tools to use is Amazon S3. It essentially provides you the capabilities to have your own CDN (Content Delivery Network). In this space, you can store website images, HTML, and various other non-executable file types as easily as you can upload an image to Facebook. The one caveat I have about this awesome tool is that it absolutely requires you to set up and administer CORS rules for each bucket or folder. With CORS, you control which sites can retrieve your files from the browser (for example, yourwebsite.com is the only origin allowed to request these images). Without this setup, it's very easy for your images to end up cached in Google, and you pay the bill for site crawls, users sharing an image, and so on.
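As a rough sketch of what that CORS setup looks like from the CLI (the bucket name and domain here are placeholders; the JSON follows the `s3api` rule format):

```shell
# cors.json -- allow only yourwebsite.com to fetch objects from the browser
cat > cors.json <<'EOF'
{
  "CORSRules": [
    {
      "AllowedOrigins": ["https://yourwebsite.com"],
      "AllowedMethods": ["GET"],
      "AllowedHeaders": ["*"],
      "MaxAgeSeconds": 3000
    }
  ]
}
EOF

# Apply the rules to the bucket
aws s3api put-bucket-cors --bucket my-staging-bucket \
    --cors-configuration file://cors.json
```

Worth knowing: CORS is enforced by browsers, so it won't stop crawlers or direct downloads on its own; pairing it with a restrictive bucket policy covers those cases too.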
Otherwise, it's a great tool, and as an Enterprise Consultant, I haven't seen an easier-to-use option to date.