Automatically Deploying Website from Git to AWS S3

I am a big fan of Amazon AWS – this blog has been running on it for a few years now. Since moving to AWS S3 (for storage) and CloudFront (as a Content Delivery Network) to host static websites, such as my homepage, I have been trying to work out how to get them to deploy automatically when I update the Git repository I use to manage the source code. I looked into it in some detail last year and concluded that AWS CodePipeline would get me close, but would require a workaround as it did not support deploying to S3. In the end I decided that a custom AWS Lambda function was needed.

Lambda is a service that hosts your code, ready to run when triggered, without you needing to run a server. You are only billed for the time your code is running (above a free threshold), so it is perfect for small, infrequent jobs, such as deploying changes to a website or even using it with Alexa for home automation. It seemed like an interesting area to explore and gain some knowledge, but I think I went in at the deep end, trying to develop a complex function using an unfamiliar language (Node.js) on an unfamiliar platform. Then other tasks popped up and it fell by the wayside.

Then earlier this year I saw an announcement from AWS that CodePipeline would now support deploying to S3 and thought my problem had been solved, although I must admit I was a bit disappointed not to have the challenge of coding it myself. Fast forward a few months and I had the opportunity to set up the CodePipeline, which was very easy. However, it only supported copying the code from the Git repository to the S3 bucket; it did not refresh CloudFront, so my problem remained unsolved.

The CodePipeline did allow for an extra step to be added at the end of the process, which could be a Lambda function, so I went off in search of a Lambda function to trigger an invalidation on CloudFront when an S3 bucket has been updated. The first result I found was a blog post by Miguel Ángel Nieto, which explained the process well, but was designed to work for one S3 bucket and one CloudFront distribution. As I have multiple websites, I wanted a solution that I could deploy once and use for all of them, so my search continued. Next I came across a blog post by Yago Nobre, which looked to do exactly what I needed, except that I could not get the source code to work. I tried debugging it for a while, but was not making much progress. It did give me an understanding of how to link a bucket to a CloudFront distribution, trigger the Lambda function from the bucket, and use the Boto3 AWS SDK for Python to extract the bucket name and matching CloudFront distribution from the triggering bucket – all the things that were lacking from the first blog post and sample code. Fortunately both were written in Python, using the Boto3 AWS SDK, so I was able to start work on merging them.

I was not terribly familiar with the Python language, to the point of having to search for how to write comments in the code, but I saw it as a good learning experience. What I actually found harder than the new-to-me language was coding in the Lambda Management Console, which I had to do because both the inputs and outputs for the function were other AWS services, meaning I could not develop locally on my Mac. Discovering the CloudWatch logs console did make things easier, as I could use the print() function to check the values of variables at various stages of the function running and work out where problems were. The comprehensive AWS documentation, particularly the Python Code Samples for S3, was also helpful. Another slight difficulty was the short delay between the bucket being updated and the Lambda function triggering; it was only a few minutes, but enough to add some confusion to the process.

Eventually I got to a point where adding or removing a file in an S3 bucket would trigger an invalidation in the correct CloudFront distribution. In the end I did not need to link it to the end of the CodePipeline, as the Lambda function is triggered by the update to the S3 bucket (which itself is done by CodePipeline). All that was left to do was to tidy up the code, write some documentation, and share it on Github for anyone to use or modify. I have kept this post more about the background to this project; the code and instructions to use it are all on Github.
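To give a flavour of what the function does, here is a stripped-down sketch of the approach rather than the exact code from the repository. It assumes the CloudFront origin domain name starts with the bucket name (which is the case for standard S3 origins):

```python
import time
import boto3


def lambda_handler(event, context):
    # Name of the bucket that triggered the event
    bucket = event['Records'][0]['s3']['bucket']['name']

    cloudfront = boto3.client('cloudfront')

    # Find the distribution whose origin points at the triggering bucket
    distribution_id = None
    for page in cloudfront.get_paginator('list_distributions').paginate():
        for dist in page['DistributionList'].get('Items', []):
            for origin in dist['Origins']['Items']:
                if origin['DomainName'].startswith(bucket + '.s3'):
                    distribution_id = dist['Id']

    if distribution_id is None:
        # print() output ends up in the CloudWatch logs
        print('No CloudFront distribution found for bucket ' + bucket)
        return

    # Invalidate everything so CloudFront pulls the updated files from S3
    cloudfront.create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch={
            'Paths': {'Quantity': 1, 'Items': ['/*']},
            'CallerReference': str(time.time()),
        },
    )
```

The function's IAM role needs permission to list distributions and create invalidations, and the bucket needs an event notification set up to invoke the function.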

This code probably only saves a few minutes each time I update one of my websites, and may take a number of years to cancel out the time I spent working on it (even more if I factor in the time spent on the original version prior to the CodePipeline to S3 announcement), but I find coding so much more rewarding when I am solving an actual problem. I also feel like I have levelled up as a geek by publishing my first repository on Github. Now, with this little project out of the way, I can start work on a new server and WordPress theme for this blog, which was one of my goals for 2019.

Apple Watch Series 3 – Long Term Review

Jen bought me an Apple Watch for my birthday a few years ago. I have been meaning to write a short review for a while now, and as today is the fourth anniversary of the original model being launched, I thought it was a good day to publish it! I always prefer reading these long term reviews to the usual short preview as a product is launched. I’m not a professional technology reviewer, just a geek with a blog, so for a really detailed look check out DC Rainmaker’s review.

My watch is a non-cellular 42mm Apple Watch Series 3 in space grey, and it came with a grey sport band. When Jen took me to the Apple Store to refine the Apple Watch hints I’d dropped, I couldn’t get on with the sport band at all, so I told Jen I wasn’t fussed between the black or grey, as I planned to replace it straight away. However, once I had the watch I quickly got used to the strap and probably would have preferred the black sport band. I have since bought a black sport loop – which has become my main strap, unless I am swimming, out in the rain or dealing with Henry, who is sicking up a lot of milk at the moment.

I had considered the cellular versions of the watch, but I didn’t think it would be worth the extra cost, both the purchase price and the £5 per month service charge. I also prefer the look of the watch without the red dot on the crown, which signifies the cellular versions; the Apple Watch 4 solution of just a red ring looks a lot neater. It is just a shame that the sport loop wasn’t available with the basic watch, only the cellular version, although again this has been remedied with the new version – kudos to Apple for sorting these niggles.

When the original Apple Watch was announced, I wasn’t interested in it at all. I had (and still have) a couple of nice automatic watches and a Casio G-Shock for when a more rugged watch was needed. Even though I considered myself more of a geek than a watch guy, I couldn’t see myself wearing an Apple Watch rather than my other watches, although I did appreciate some of the details and nods to traditional watches on the Apple Watch.

Fast forward a few years: Owen had been born, Jen was looking to get her fitness back, and Apple had added GPS to the Series 2 Apple Watch, making it a much better prospect as a fitness watch. In addition to the fitness features, I could see that having iPhone notifications on her wrist would be handy whilst wrangling a now wriggling Owen. So I took a flyer and bought Jen an Apple Watch Series 2 for her birthday. Much like when I’d bought her an iPad a few years before, it quickly became an essential device. This was very apparent when Jen forgot her watch charger when we went to Croyde and we had to ask her parents to bring it down when they joined us.

Shortly after Apple announced the Series 3 Apple Watch, now with a barometric altimeter, I was noticing some strange height results on Strava – things like gaining more altitude on short local rides than when I’d been slogging uphill on longer rides at trail centres. This, combined with seeing how useful Jen was finding her watch, made me reconsider my view, so I started dropping hints for my birthday.

The fitness features, especially Strava, were my main reason for wanting an Apple Watch and I can safely say that my expectations were blown away! I would have been happy just using it with Strava to record my bike rides, but it is the off-the-bike fitness tracking where it excels. The “three rings” concept really encourages you to hit three different fitness goals each day – stand for at least a minute in twelve different hours of the day, do at least thirty minutes of exercise, and burn a predetermined number of calories (400 for me) by moving around. These daily goals are backed up with awards for things like hitting goals on consecutive days, or doubling the move calorie target. These targets are especially addictive; on more than one occasion I have found myself doing press ups before bed to continue a move streak, or getting up and going for a walk when the Watch reminds me that I’ve been sitting down for too long. I have, however, noticed over the last six months or so that it has become a lot easier to hit my 400 calorie target – my Apple Watch wearing friends have also experienced this. I like to think we are getting fitter, or moving around more, but I expect that someone at Apple has modified the code.

I also use my Watch to track my sleep; it mostly confirms what I already knew – I’m a deep sleeper, but could do with going to bed a wee bit earlier. I also like the “Breathe” feature, although it always seems to prompt me to breathe at the worst moment. I don’t know what, if any, logic is behind these alerts.

The Watch includes a heart rate sensor, which has opened up a whole new load of data for me, especially during bike rides. On the other hand, too much data can be a bad thing! On a few occasions I have woken up to an alert on my Watch telling me it detected an abnormally high heart rate whilst I was asleep. This has led to various medical checks, none of which have found anything, so either there is a problem with the heart rate sensor on my Watch, or I have a rare and very occasional heart problem. I ordered a Wahoo Tickr heart rate monitor, which uses a chest strap, to help me rule out any problems with the Watch, but of course the issue has not recurred. I now use the Tickr paired to my Watch to monitor heart rate on longer bike rides, as chest straps are meant to be more accurate than the optical sensors used on the Watch.

Aside from fitness tracking, I also use my Watch to preview notifications from my iPhone. I find it much easier to glance at my wrist to see a snippet of information, rather than taking my iPhone out of my pocket. Notifications from Apple apps, such as iMessage or email, work great; you can usually see what the content of the message is and give a brief response. However, third party apps are a bit more hit and miss. For basic Siri tasks, such as setting a timer, it is much easier to use the Watch. I also find it useful on the bike, where I would usually need to remove my gloves to use my iPhone; I can send messages or even make and receive phone calls using Siri whilst riding along! And Apple Pay – I doubt I will ever tire of being able to pay for things with my Watch.

The way the Watch and the iPhone hand off notifications to each other works seamlessly, which is actually frustrating for me as an owner of multiple Apple devices – if my Watch and iPhone can work that closely together why do I still get so many duplicated alerts on my Macs? Hopefully this is something Apple will work on in the future.

The only other problem I have with the Apple Watch is that I hardly ever wear my other watches these days. The Apple Watch integrates with my life so well that my mechanical watches rarely get worn. Sometimes I wonder if the stand goal is really to make sure that you are wearing your Apple Watch for at least twelve hours a day, rather than any other watch… I occasionally force myself to wear my mechanical watches, usually on special occasions, and still love the amazing detail in the mechanisms, but I have been caught out trying to pay for shopping with them. The watch that has suffered the most is my G-Shock 5600; it used to be my daily watch, the only watch I would take when travelling, but it is neither as useful as the Apple Watch, nor as special as my mechanical watches. As I was writing this blog I took it out of my watch box and realised the battery was showing “low”; in the years I wore it, it was always on “high”. Fortunately a few days on the windowsill recharged the battery for another few years.

On the subject of charging, when I first got the Apple Watch I charged it overnight, every night; if I forgot, I could still get two days’ use from one charge. These days I charge the Watch while I am getting changed or having a shower – as it is only a small battery, it does not take long to charge at all.

To conclude, out of all the gadgets I have owned, the Apple Watch fitted into my life and made itself an essential item quicker than anything else. If I broke or lost it I would replace it without a doubt. It also makes me wonder what will happen to the luxury watch industry. I am usually a big fan of heritage and simplicity, but am now rarely found without my Apple Watch on my wrist.

Google Authenticator – How to Backup for Moving to a New Device

Recently I’ve had to start using two factor authentication (2FA), both for my AWS account and my Bitcoin wallets. There seemed to be two main options for apps to handle this: Google Authenticator and Authy. Initially Authy looked like a good bet, as it can sync across multiple devices, including smart watches, but it turns out this convenience means the security is weakened – to the point that Coinbase advised users not to use it! Google Authenticator goes the other way: it is extremely secure, but if you lose or reset your device the settings, and potentially access to your accounts, are lost.

The only way to avoid this situation is to make a backup of your access codes at the time you add them to Authenticator. You can do this either by writing down the seed key or by taking a screenshot of the QR code. It is not advisable to keep these backups with your phone or readily accessible on an online computer, as they are effectively keys to your accounts. I prefer to print off a couple of copies, write – with a pen – which account each QR code is for, and file them away separately. I also keep another copy on an encrypted memory stick. If you are using 2FA to access an online account and have not backed up your access codes – you should do it now!!!

When you get a new device, or wipe your existing device, it is just a case of re-scanning the QR code into Google Authenticator from your backup. You can test your backups by scanning them into Authenticator again, either on your existing device or a separate one – they will give the same six digit code as the original. To check that nothing was tied to my iPhone, I also installed Authenticator on my old iPhone and was able to log into my AWS account. AWS is ideal for testing 2FA, as you can create a dummy account with 2FA enabled, without running the risk of losing access to your main account.
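Under the hood these apps implement the standard TOTP algorithm, so anything given the same seed key will generate the same six digit codes. If you want to convince yourself that a backed-up seed really is enough, a couple of lines of Python using the third-party pyotp library will show the same code as Authenticator (the seed below is just an example, not a real one):

```python
import pyotp

# Example base32 seed key, the same format as the one you backed up
seed = "JBSWY3DPEHPK3PXP"

# Prints the current six digit code; scanning the same seed into Google
# Authenticator would show the same code at the same moment
print(pyotp.TOTP(seed).now())
```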

Saved by the Backup

In my last post I explained my backup routine for WordPress. I wasn’t planning on testing it out so soon, but it has just saved my bacon! The plan was to spend an hour or so tweaking the blog to make it faster, using the WP Super Cache plugin and Amazon CloudFront; however, something went badly wrong! The alarm bells should have started to ring when I noticed that most tutorials about using Amazon CloudFront with WordPress referred to W3 Total Cache, but I preferred the look of WP Super Cache and fancied a challenge…

I was loosely following this guide, but somehow managed to take my website offline, probably by sending requests into a DNS black hole. The problem was that this meant I couldn’t get back onto my website to turn the caching off again. At this point I should also add that I couldn’t test this phase on my development server, as CloudFront needed to pull data from the blog, which meant deploying on the live site.

I could still SSH into the server, so I used the WP Super Cache uninstall instructions for “if all else fails and your site is broken”. However, that didn’t help. At this point I was getting a little more panicked, but was very glad of my new backup strategy and that I’d had the foresight to make a backup just before I started fiddling with the blog. I feared the worst: that I would have to reinstall WordPress from scratch and reload my data. Reading this troubleshooting guide confirmed my fears.

Reinstalling WordPress isn’t the end of the world – I have done it a number of times – but for some reason I have been having a lot of permission issues on my web server; maybe I had taken security a bit too far. This meant that I couldn’t get my FTP client to upload my backup data. I ended up revisiting the AWS WordPress installation guide and also this blog post to find the correct settings and set them via SSH. At least I’ve had a lot of command line practice this evening!
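I won’t repeat the guides here – the actual fix was just a handful of commands run over SSH – but the gist is that the WordPress directories and files need a sensible owner and permissions before uploads will work. Purely as an illustration (the web root and owner below are assumptions for a typical install, not necessarily my exact setup), the equivalent in Python looks something like this:

```python
import os
import shutil

# Assumed web root and web server user for a typical install; adjust to suit.
# Needs to be run with enough privileges to change ownership (e.g. as root).
WEB_ROOT = "/var/www/html"
OWNER = "apache"

for dirpath, dirnames, filenames in os.walk(WEB_ROOT):
    # Directories: rwxr-xr-x
    shutil.chown(dirpath, user=OWNER, group=OWNER)
    os.chmod(dirpath, 0o755)
    for name in filenames:
        path = os.path.join(dirpath, name)
        # Files: rw-r--r--
        shutil.chown(path, user=OWNER, group=OWNER)
        os.chmod(path, 0o644)
```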

Even with the permissions fixed, I couldn’t use the restore tool in UpdraftPlus (possibly due to restrictions I have added on AWS?), but I was able to upload the data via FTP and get the blog up and running again. I still haven’t got the caching/CDN set up, but I think I’ll take the easy route now and hopefully not need to test my backups again.

WordPress Backups Using UpdraftPlus and Amazon S3

I had a bit of a disaster the other day: I went to link to a blog post from a few months ago and it wasn’t there! I remember writing it, and knew it had been published, because I remembered some of the comments from when it appeared on my Facebook profile. I then remembered that there had been some funny goings-on with the WordPress Mac app – I’d had a duplicate post and deleted it manually. However, it now seems that the duplicate had also been deleted.

Of course, it was at this point that I realised my latest backup was from a couple of months before the post, and I couldn’t recover it from anywhere. I was particularly annoyed at myself because I have a thorough backup routine for my Macs, and especially my photography work, yet virtually nothing for my blog. However, it was the kick up the backside I needed to sort out a decent backup routine!

Given that I was the weak link when it came to backing up my blog, I wanted something automatic that would run regularly and email me when it had completed. As with most things WordPress, there seemed to be loads of plugins available, most of them paid services. In my research I’d read good things about UpdraftPlus, so I was pleased to find their free option, which is more than powerful enough for a small blog like mine.

To see if UpdraftPlus lived up to the hype, I downloaded it onto my WordPress development environment (Chassis running on my iMac) and had a play. Looking at the list of remote storage services, Amazon S3 was the obvious choice, as I already use Amazon Web Services to host my blog. Knowing the basics of cyber security, I only wanted UpdraftPlus to have minimal access to AWS, but I got myself lost in a maze of IAM, S3 buckets, users, groups and permissions. I was on the right track, but this post on the UpdraftPlus blog told me exactly what I needed to do. The IAM Policy Simulator on AWS was also a huge help in making sure my policies were both written and applied correctly. I went for the maximum security option, which also gave me a chance to delve into the workings of S3, setting up rules to archive and then delete the data after set periods of time.
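The archive-then-delete part is handled by S3 lifecycle rules. For anyone who prefers code to clicking around the console, the same kind of rule can also be set with Boto3 – a rough sketch, where the bucket name and day counts are placeholders rather than my actual settings:

```python
import boto3

s3 = boto3.client("s3")

# Placeholder bucket name and retention periods
s3.put_bucket_lifecycle_configuration(
    Bucket="my-blog-backups",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire-backups",
                "Filter": {"Prefix": ""},  # apply to the whole bucket
                "Status": "Enabled",
                # Move backups to Glacier after 30 days...
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                # ...then delete them entirely after a year
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```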

Once deployed and tested on my development environment, it only took a matter of minutes to get it working on my live blog, giving me regular, automated backups. Now the only task left to do is to rewrite the post that got lost…

A trip to the pub

This is my first go at making a video, so I thought I’d break myself in gently with a time lapse. Capturing the images was easy: I set the GoPro camera to take a photo every second, stuck it on the windscreen and drove to the pub (via the scenic route)! Winter in Warwickshire isn’t the most glamorous or exciting of locations, but I got a new toy for Christmas and I wanted to use it!

The real challenge started when I got back from the pub with 2,500 images on the memory card. I had three options when it came to software, so I tried them all:

  • Lightroom – My photo editing software of choice and well within my comfort zone: I could import, back up and add my metadata to the images with two clicks, then process one image and sync the settings to the rest. What I couldn’t do without adding plug-ins was compile them into a video at 30 frames per second; this is something I need to investigate further.
  • GoPro CineForm Studio – I’m always a bit wary of bundled software, but after a few teething problems (importing a folder full of images works, importing 2,000 individual images doesn’t) I was able to get it to stitch the images together and edit the resulting video file, although I didn’t find it very intuitive.
  • iMovie – Apple always seem to say how good Macs are for creative projects such as video, so their software was worth a look, although seemingly, to get the still images into iMovie they had to be imported into iPhoto first. This integration is great, but only if you plan on using both; having said that, iPhoto saved my bacon when I accidentally formatted the micro SD card in my camera, meaning I didn’t lose the first picture I took with the GoPro. Using iMovie I wasn’t able to stitch the images together faster than 10fps, with 30fps being what I needed, so I gave up on it for creating time lapses. When it comes to working with multiple video files, though, iMovie seems to be the best application I have available, although I’ll need to upgrade it to export in 1080p high resolution.

In the end I used Lightroom to process the images and crop them to the 16:9 widescreen aspect ratio, then GoPro CineForm Studio to combine them into a time lapse and compress it for uploading to YouTube. I can see video, and especially time lapses, being a big thing for me in 2013 – it’s certainly got my creative juices flowing, so watch this space.

The Best Camera…

Is the one that’s with you.

Not only is it a book/website by Chase Jarvis, one of my favourite photographers, it is a great way of thinking!

As part of his project, Chase and his team have created an iPhone app, which makes post processing and sharing photos from the iPhone really easy, combining two of my passions.

Most of my images taken with my phone are posted on my Twitter feed, but can also be found on my part of the Best Camera website.

One of my favourite arty shots, taken with my iPhone at the Trafford Centre – I love the simplicity of the black and white conversion:

Fountain iPhone photo