Super Power Words

Now that Henry has started school properly, he is learning to read, and each week he gets sent home with three “super power words” to learn. Back in 2020, when Owen was in reception, he had the same words to learn, except we were teaching him at home. At the time, I thought it would be cool to have an iPad app for Owen to practise on, but I did not have the skills to knock one up quickly, nor the time to learn. Fast forward three years: Henry is learning to read, I am working as a software developer full time, and this is exactly the sort of thing I can knock up in an evening after the boys have gone to bed. So I did – as soon as Henry came home with his first set of super power words!

I used Create React App to bootstrap a basic application, as Next.js would be overkill for something so simple. Then it was just a case of adding a function to display a random word, and a click handler to call it. Initially I hardcoded the words to get to a “minimum viable product”; testing quickly showed I also needed some logic to prevent the same word being shown twice in a row. The plan was always to share it on my GitHub, so I changed it to read a comma-separated list of words from environment variables, and while I was at it added the option to change the heading, to make the app usable for other people.
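
The selection logic really is only a handful of lines. The app itself lives in a React component, but the idea – read a comma-separated word list from an environment variable and pick a random word that is not the one just shown – is the same in any language. Here is a rough sketch of it in Python; the variable and function names are illustrative, not the actual code from the repository:

```python
import os
import random

# Comma-separated word list, e.g. WORDS="I,go,the" (the variable name here is illustrative)
words = [w.strip() for w in os.environ.get("WORDS", "I,go,the").split(",")]

def next_word(previous=None):
    """Pick a random word, avoiding showing the same word twice in a row."""
    choices = [w for w in words if w != previous] or words
    return random.choice(choices)

# Each tap on the screen asks for a new word, remembering the last one shown
last = None
for _ in range(5):
    last = next_word(last)
    print(last)
```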

The “user acceptance testing” (showing it to Henry) identified that the words needed to be all in lowercase, except “I”, and that the “a” character was not displayed in the single-storey form he is taught at school, so I figured that teaching a four-year-old to read is one of the few valid use cases for the “Comic Sans MS” font and switched to that.

To get the app onto Henry’s iPad (he does not like laptops), I used AWS Amplify, which combines a basic CI/CD pipeline with serverless hosting – a cheap way to host simple web apps. I was also able to hook it up to Henry’s domain name. Builds are triggered by pushing changes to the main branch of the project’s GitHub repository. When Henry gets a new set of words, it is just a case of updating the environment variables in the Amplify web console and re-running the latest build.

If you want to try for yourself, you can check out (or fork!) the repository, create a .env file, set your environment variables and run “npm start”. Or simply have a look at www.henrycraik.co.uk.

AWS Certified Developer – Associate

One of my goals for 2023, and in fact 2022, was to pass the AWS Certified Developer – Associate exam. I am pleased to say that, after a lot of studying, I have achieved that!

For those that do not know, AWS is Amazon Web Services – Amazon’s cloud computing platform, which essentially makes all of the tools that Amazon have developed to run their online store available to other organisations, from Apple down to individual developers and bloggers like me. I have been using AWS to run this blog, and a few of my other sites, for a few years. I have also been using it a lot at work, as we are an AWS Partner. The AWS Certified Developer – Associate certification is the next level up from the AWS Certified Cloud Practitioner qualification I achieved in 2020.

After a few goes at studying for the exam around work, and some disappointing results in mock exams, I joined an AWS instructor-led accelerator program – an intensive five-week course covering the content and exam strategy. I ended up taking a bit more than the five weeks to get through the content, but felt that it really helped. Unlike last time, when I took the exam remotely at home, I went to the local test centre in Coventry, which was a lot less stressful than taking the exam at home. It was also an excuse to get out on the bike, and to treat myself to a celebratory ice cream afterwards!

Twitter Bot

Writing a Twitter bot has been one of those projects that I have wanted to do for years, but I had not had an idea for what the bot should do. Then I was sent a link to onewayroadtobeer.com and I thought I could do a countdown to what I am most looking forward to after lockdown ends – riding bikes with my friends!

The next step of the plan was easy: as an AWS Certified Cloud Practitioner, I knew that Lambda was the right environment for running a small task once a day. In my previous Lambda projects, such as Automatically Deploying Website from Git to AWS S3, I have used the Python programming language, so I opted for that again. From there it did not take me long to find Dylan Castillo’s excellent tutorial and GitHub repository for a Python Twitter bot on AWS Lambda. If anything it was too helpful, but it did force me to try writing the Python code on my Mac, rather than directly in the Lambda console on AWS, as I had done previously. This made it much easier to test and debug my changes to the code.

The changes were pretty minimal: instead of pulling the tweet from a file, the get_tweet function compares the current date to the restriction-lifting dates defined in the government’s “Roadmap out of lockdown”, which I have hard-coded for now. Hopefully the goalposts will not be moved too much! After a small tweak to change “in 1 day we will be able…” to “tomorrow we will be able…”, the bot was ready to deploy. So far it has been tweeting out its daily message at 9:00 each morning, giving me a sense of pride whenever I see it in my Twitter feed, as well as building up the excitement for being able to ride with my friends – only 14 days to go!!!
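
In case it is useful to anyone building something similar, the shape of that logic is roughly as follows. This is a hedged sketch rather than the exact code from my repository – the dates and wording below are placeholders, not the real roadmap schedule:

```python
from datetime import date

# Hard-coded roadmap milestones (placeholder dates and wording, not the real schedule)
MILESTONES = [
    (date(2021, 3, 29), "ride bikes with friends"),
    (date(2021, 4, 12), "have a pint in a pub garden"),
]

def get_tweet(today=None):
    """Build the daily countdown message for the next milestone."""
    today = today or date.today()
    for milestone_date, activity in MILESTONES:
        days_left = (milestone_date - today).days
        if days_left > 1:
            return f"In {days_left} days we will be able to {activity}!"
        if days_left == 1:
            return f"Tomorrow we will be able to {activity}!"
        if days_left == 0:
            return f"Today we can {activity}!"
    return "The roadmap is complete - see you out on the bikes!"

def lambda_handler(event, context):
    # The real function posts the message via the Twitter API;
    # here it is just returned so the logic can be tested locally.
    return {"tweet": get_tweet()}
```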

You can follow the bot at @untilweride and, if you are not already following me, my organic tweets are at @lewiscraik. If you want to deploy a bot of your own, my project is on GitHub.

AWS Certified Cloud Practitioner

One of my goals for 2020 was to become an AWS Certified Cloud Practitioner – and after a stressful online exam yesterday, I am pleased to say that I have achieved that!

For those that do not know, AWS is Amazon Web Services – Amazon’s cloud computing platform, which essentially makes all of the tools that Amazon have developed to run their online store available to other organisations, from Apple down to individual developers and bloggers like me. I have been using AWS to run this blog, and a few of my other sites, for a few years, and decided that it was about time to formalise all the skills I have learned along the way. Whilst completing the online training I also picked up a few new things that I could apply to my website, or use to improve how I am using the tools.

The exam itself was really strict. I completed it online from home, although usually it would be possible to do it at a local testing centre. I was monitored the whole time through my webcam and microphone, with the exam being terminated if anyone else came into the room. I also had to have a totally clear desk – given that my usually crowded desk is currently doing double duty, with my work PC alongside my iMac, I opted to take the exam on my laptop at the dining table.

As much as I like the idea of online exams, I do not think they are ready for the mainstream: after I had passed all of the entry requirements – showing my passport, desk space and so on – the application locked up, just as I should have been starting the exam. If AWS, probably the biggest cloud computing company in the world, cannot get it right, I cannot see it being rolled out for GCSEs/A Levels! After giving it a decent amount of time to recover, I ended up having to force a shutdown on my laptop and eventually managed to get back into the exam and start it. I found the exam hard, but finished it well within the time limit, and was told I had provisionally passed, with official confirmation arriving a day later – certainly an improvement over previous exams I have taken.

Automatically Deploying Website from Git to AWS S3

I am a big fan of Amazon AWS – this blog has been running on it for a few years now. Since moving to AWS S3 (for storage) and CloudFront (as a Content Delivery Network) to host static websites, such as my homepage, I have been trying to work out how to get them to deploy automatically when I update the Git repository I use to manage the source code. I looked into it in some detail last year and concluded that AWS CodePipeline would get me close, but would require a workaround as it did not support deploying to S3. In the end I decided that a custom AWS Lambda function was needed.

Lambda is a service that hosts your code, ready to run whenever it is triggered, without you needing to run a server. You are only billed for the time your code is actually running (above a free threshold), so it is perfect for small, infrequent jobs, such as deploying changes to a website or even using it with Alexa for home automation. It seemed like an interesting area to explore and gain some knowledge, but I think I went in at the deep end, trying to develop a complex function in an unfamiliar language (Node.js) on an unfamiliar platform. Then other tasks popped up and it fell by the wayside.

Then earlier this year I saw an announcement from AWS that CodePipeline would now support deploying to S3, and thought my problem had been solved – although I must admit I was a bit disappointed not to have the challenge of coding it myself. Fast forward a few months and I had the opportunity to set up the CodePipeline, which was very easy. However, it only supported copying the code from the Git repository to the S3 bucket. It did not refresh CloudFront, so my problem remained unsolved.

The CodePipeline did allow for an extra step to be added at the end of the process, which could be a Lambda function, so I went off in search of a Lambda function to trigger an invalidation on CloudFront when an S3 bucket has been updated. The first result I found was a blog post by Miguel Ángel Nieto, which explained the process well, but was designed to work for one S3 bucket and one CloudFront distribution. As I have multiple websites, I wanted a solution that I could deploy once and use for all of them, so my search continued. Next I came across a blog post by Yago Nobre, which looked to do exactly what I needed – except that I could not get the source code to work. I tried debugging it for a while, but was not making much progress. It did give me an understanding of how to link a bucket to a CloudFront distribution, trigger the Lambda function from the bucket, and use the Boto3 AWS SDK for Python to take the bucket name from the trigger and find the matching CloudFront distribution – all the things that were lacking from the first blog post and sample code. Fortunately both were written in Python, using the Boto3 AWS SDK, so I was able to start work on merging them.

I was not terribly familiar with the Python language, to the point of having to search for how to write comments in the code, but I saw it as a good learning experience. What I actually found harder than the new-to-me language was coding in the Lambda Management Console, which I had to do because both the inputs and outputs of the function were other AWS services, meaning I could not develop locally on my Mac. Discovering the CloudWatch logs console did make things easier, as I could use the print() function to check the values of variables at various stages of the function running and work out where the problems were. The comprehensive AWS documentation, particularly the Python Code Samples for S3, was also helpful. Another slight difficulty was the short delay between the bucket being updated and the Lambda function triggering; it was only a few minutes, but enough to add some confusion to the process.

Eventually I got to a point where adding or removing a file in an S3 bucket would trigger an invalidation in the correct CloudFront distribution. In the end I did not need to link it to the end of the CodePipeline, as the Lambda function is triggered by the update to the S3 bucket (which itself is done by CodePipeline). All that was left to do was to tidy up the code, write some documentation, and share it on GitHub for anyone to use or modify. I have kept this post about the background to the project; the code, and instructions for using it, are all on GitHub.
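
For anyone curious what that looks like before heading over to GitHub, the gist of the function is below. This is a simplified sketch of the approach rather than the exact code in the repository:

```python
import time
import boto3

cloudfront = boto3.client("cloudfront")

def get_distribution_id(bucket_name):
    """Find the CloudFront distribution whose origin is the bucket that triggered us."""
    paginator = cloudfront.get_paginator("list_distributions")
    for page in paginator.paginate():
        for dist in page["DistributionList"].get("Items", []):
            for origin in dist["Origins"]["Items"]:
                if origin["DomainName"].startswith(bucket_name + "."):
                    return dist["Id"]
    return None

def lambda_handler(event, context):
    # The function is triggered by an S3 event, so the bucket name comes from the event
    bucket = event["Records"][0]["s3"]["bucket"]["name"]
    distribution_id = get_distribution_id(bucket)
    if distribution_id is None:
        print(f"No CloudFront distribution found for bucket {bucket}")
        return

    # Invalidate everything, so CloudFront fetches the fresh files from S3
    cloudfront.create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch={
            "Paths": {"Quantity": 1, "Items": ["/*"]},
            "CallerReference": str(time.time()),
        },
    )
    print(f"Created invalidation for distribution {distribution_id}")
```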

This code probably only saves a few minutes each time I update one of my websites, and may take a number of years to cancel out the time I spent working on it – even more if I factor in the time spent on the original version prior to the CodePipeline-to-S3 announcement – but I find coding so much more rewarding when I am solving an actual problem. I also feel like I have levelled up as a geek by publishing my first repository on GitHub. Now, with this little project out of the way, I can start work on a new server and WordPress theme for this blog, which was one of my goals for 2019.

Google Authenticator – How to Backup for Moving to a New Device

Recently I’ve had to start using two factor authentication (2FA), both for my AWS account and for Bitcoin wallets. There seemed to be two main options for apps to handle this: Google Authenticator and Authy. Initially Authy looked like a good bet – it can sync across multiple devices, including smartwatches – but it turns out this convenience weakens the security, to the point that Coinbase advised users not to use it! Google Authenticator goes the other way: it is extremely secure, but if you lose or reset your device, the settings – and potentially access to your accounts – are lost.

The only way to avoid this situation is to make a backup of your access codes at the time you add them to Authenticator. You can do this either by writing down the seed key or by taking a screenshot of the QR code. It is not advisable to keep these backups with your phone, or readily accessible on an online computer, as they are effectively keys to your accounts. I prefer to print off a couple of copies, write on them – with a pen – which account the QR code is for, and file them away separately. I also keep another copy on an encrypted memory stick. If you are using 2FA to access an online account and have not backed up your access codes – you should do it now!!!

When you get a new device, or wipe your existing device, it is just a case of re-scanning the QR code into Google Authenticator from your backup. You can test your backups by scanning them into Authenticator again, either on your existing device or a separate one – they will give the same six-digit code as the original. To check that nothing was tied to my iPhone, I also installed Authenticator on my old iPhone and was able to log into my AWS account. AWS is ideal for testing 2FA, as you can create a dummy account with 2FA enabled without running the risk of losing access to your main account.
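
If you want to reassure yourself that the codes really are derived from the seed key alone, and not tied to a device, you can generate them from a seed directly – for example with the pyotp library in Python (the seed below is a made-up example, not one of mine!):

```python
import pyotp  # pip install pyotp

# The base32 seed key from your backup (this one is a made-up example)
seed = "JBSWY3DPEHPK3PXP"

totp = pyotp.TOTP(seed)
print(totp.now())  # Matches what Google Authenticator shows for the same seed right now
```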

WordPress Backups Using UpdraftPlus and Amazon S3

I had a bit of a disaster the other day – I went to link to a blog post from a few months ago and it wasn’t there! I remember writing it, and knew it had been published, because I remembered some of the comments from when it appeared on my Facebook profile. Then I remembered that there had been some funny goings-on with the WordPress Mac app: I’d had a duplicate post and deleted it manually, but now it seems that both copies had been deleted.

Of course, it was at this point that I realised my latest backup was from a couple of months before the post, so I couldn’t recover it from anywhere. I was particularly annoyed at myself because I have a thorough backup routine for my Macs, and especially my photography work, yet virtually nothing for my blog. However, it was the kick up the backside I needed to sort out a decent backup routine!

Given that I was the weak link when it came to backing up my blog, I wanted something automatic that would run regularly and email me when it had completed. As with most things WordPress, there seemed to be loads of plugins available, most of them paid services. In my research I’d read good things about UpdraftPlus, so I was pleased to find their free option, which is more than powerful enough for a small blog like mine.

To see if UpdraftPlus lived up to the hype, I downloaded it onto my WordPress development environment (Chassis running on my iMac) and had a play. Looking at the list of remote storage services, Amazon S3 was the obvious choice, as I already use Amazon Web Services to host my blog. Knowing the basics of cyber security, I only wanted UpdraftPlus to have minimal access to AWS, but I got myself lost in a maze of IAM, S3 buckets, users, groups and permissions. I was on the right track, but this post on the UpdraftPlus blog told me exactly what I needed to do. The IAM Policy Simulator on AWS was also a huge help in making sure my policies were both written and applied correctly. I went for the maximum security option, which also gave me a chance to delve into the workings of S3, setting up lifecycle rules to archive and then delete the data after set periods of time.
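
For reference, the kind of minimal-access policy involved looks roughly like the sketch below, written out as a Python dictionary so it can be printed and pasted into the IAM console. The bucket name is a placeholder, and this is only an illustration – the UpdraftPlus post linked above has the exact policy they recommend:

```python
import json

# Placeholder bucket ARN - substitute the bucket UpdraftPlus will write backups to
BACKUP_BUCKET_ARN = "arn:aws:s3:::my-blog-backups"

# A rough least-privilege policy: the backup user can list the bucket and
# read, write and delete objects in it, and nothing else.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": [BACKUP_BUCKET_ARN],
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": [BACKUP_BUCKET_ARN + "/*"],
        },
    ],
}

# Paste the JSON output into the IAM console (or check it in the Policy Simulator)
print(json.dumps(policy, indent=2))
```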

Once deployed and tested on my development environment, it only took a matter of minutes to get it working on my live blog, giving me regular, automated backups. Now the only task left to do is to rewrite the post that got lost…