Tiernan's Comms Closet

Geek, Programmer, Photographer, Network Engineer…

AS204994, Own IP Space and Anycast

So, if you are reading this page, it is being delivered with the magic of Anycast… Well, technically, it was before too, since I used Cloudflare, and it still is because of Cloudflare, but now it is also served via my own ASN (AS204994), some servers in different locations, and some magic, which I will explain a bit of in this post.

This all started late last year when I got my hands on an ASN and a /48 block of IPv6 addresses. I had been reading stuff about BGP, routing, etc., and decided to go all in. It was quite cheap with the help of HostUS: all in, it was about $50 for the year. As part of the process, I needed 2 upstream providers to say they would accept my announcement. They were Hurricane Electric, through their Tunnel Broker service, and Vultr, using a few of their VPSs.

After I got my space and ASN, I started to announce the v6 addresses over Vultr and Hurricane Electric, and all was good. I had 2 Vultr servers: one in London, UK, and one in New Jersey, USA. I had my home machine announce to HE, and then also linked it to both Vultr servers using ZeroTier. All worked well, but due to some family issues, I never got around to putting it into production… till now.
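
For anyone curious what the announcement side looks like, here is a rough sketch of a bird6 (BIRD 1.x) session for one of the Vultr boxes. The ASN is mine, but the neighbour address, peer AS, password and the 2001:db8::/48 prefix are all placeholders — Vultr give you the real session details in their panel, and the HE tunnel setup is a bit different again.

# /etc/bird/bird6.conf (sketch only; router id, kernel and device protocols omitted)
protocol static announce {
    route 2001:db8::/48 reject;   # placeholder prefix; just something to originate, the anycast address on lo handles the real traffic
}

protocol bgp vultr {
    local as 204994;
    neighbor 2001:db8:ffff::1 as 64515;   # placeholder neighbour/AS; use the details Vultr give you
    multihop 2;
    password "changeme";                  # placeholder
    import none;
    export where proto = "announce";
}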

Those 3 servers now share an IPv6 address on the loopback interface. When you (well, Cloudflare) ask for that IP, the closest (network-wise) box announcing it responds, and the Nginx server on that box sends back the contents of the site. This site is hosted on each box, since it's fully static, but both AS204994 and TiernanOToole.net are hosted in Ghost, so Dublin (my machine in the house) serves them, and both Lon1 and Nyc1 do proxying. So, most requests from the US are hitting the box in NYC, and the ones in Europe hit either Dub1 or Lon1. I have some tweaks to do on which servers will be running where, and may add more, but currently it's working well.
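
The per-box setup for the shared address is tiny. A minimal sketch, assuming a Linux host with iproute2, and using a documentation address in place of the real one out of my /48:

# run as root on each node; the same anycast address goes on the loopback of every box (placeholder address)
ip -6 addr add 2001:db8:a57:a57::1/128 dev lo
ip -6 addr show dev lo        # check it took
# Nginx then just listens on that address, e.g. listen [2001:db8:a57:a57::1]:80;
# and BGP on each box announces a prefix covering it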

So, how do you figure out which server responded? Simple. Open the dev tools in your browser, go to the Network tab, refresh, and look at the response headers for anything on this domain. You should see something like below.
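
If you would rather do it from a terminal, curl shows the same headers. The URL and the X-Served-By header below are just examples — the header name depends on what the Nginx config on each node actually adds:

# -s = quiet, -I = headers only; swap in this site's URL
curl -sI https://www.example.com/ | grep -iE 'server|x-served-by'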

Over the next while, I will be updating tiernanotoole.net with more details on how this works, and more stuff will end up on AS204994.net too. If anyone notices any weird and wonderful issues, shout. If you have more questions, shout.

Blogging on an iPad Pro

So, a few months back I bought myself an iPad Pro. I got a 10.5" with 64GB of storage and the Smart Keyboard. Since then, I have mostly been using it for playing around: watching YouTube, Netflix, surfing on the couch, etc. But I started to wonder how “Pro” this was… so I went and did some testing, and in the end nearly all of this post was written on it…

First, the good stuff:

  • Microsoft’s Remote Desktop Connection works perfectly on the iPad Pro. I have RDPed into machines (with the help of ZeroTier).
  • Panic’s Prompt works well too… again, with ZeroTier, I can SSH into boxes and remotely manage them. Handy for checking on Docker instances…
  • Panic also have Coda for iOS. It’s a very nice (if somewhat expensive at $25) editor for the iPad. This post is being written in it now.
  • For Git stuff, I am using an app called Working Copy. It’s free, to an extent, but if you need to do stuff like push changes, which is kind of important, then you need to pay a fee.
  • Coda and Working Copy work together with some magic built into Working Copy. It can act as a WebDAV server, which Coda can then connect to. You open, edit, change and create docs, and Working Copy keeps track. Then you swap over to Working Copy and check in. You need to have both apps on the screen at the same time (the docking feature works well for that) since iOS seems to kill some background tasks.
  • Unrelated to blogging, but I have also tried editing photos using Lightroom, and so far so good. I have used the Apple SD Card adapter to download 50MP photos (upwards of 60MB each) from my Canon 5Ds quickly enough, add them to Lightroom, make some changes and send them to Twitter and Facebook (not Instagram just yet…), and it works well. I have managed to hook it up to my Gnarbox too.

Bad Stuff?

  • The keyboard takes a bit of getting used to. The stupid “Globe” button to swap keyboards (from English to Emoji) is in the place you would expect to find Ctrl, and Ctrl, Option and Cmd (remember, this is a “Mac” style keyboard) are all shifted one place over… I would have preferred if they moved it somewhere else, or removed it altogether…
  • A mouse would be very handy! I have tried pairing a Bluetooth mouse to it, with no luck… It would be especially useful for editing documents and code, since the touchscreen is “handy” some of the time, but not all the time…

So, there you have it. Blogging on an iPad. Would I give up my daily drivers, my Surface Book and GodBoxV2, and just use an iPad? Hell no… For basic stuff, it works well. Basic photo editing, blogging, surfing, etc., yes. But there is a reason my workstation has 16 processor cores and 160GB RAM: I need it. I have multiple copies of Visual Studio running, SQL Server, multiple VMs running different tasks, multiple web browsers, multiple monitors, etc. The iPad can do a good chunk of stuff, but not the major stuff… not yet. Don’t get me wrong: Word, Excel, PowerPoint, Outlook, all the major office tools work grand. But Visual Studio? SQL Management Studio? Just not there yet…

So, what did I not do on the iPad and end up doing on the PC? Well, so far, nothing… Using Coda and Working Copy, I wrote the text, previewed it and checked it into GitHub. Then Prompt was used to build it on my Docker box, check the output into the static site repo and push, which then publishes it. Unless you see an update below, all went as planned and everything was done on an iPad…

New Backup Plans

So, a few weeks back, CrashPlan, one of my chosen backup services, decided to kill their consumer backup plans, which made me rethink my backup plan for the house…

Note: This is how I am backing up files, and it may or may not work for you. Some of this is already in “production” as of today, but other parts are still planned… Any questions, comments, etc., leave them in the comments.

So, first, as mentioned above, CrashPlan’s consumer backup options are now dead. They are giving you the option of either upgrading to their Small Business plan for free till the end of your current contract plus 2 months, or moving to Carbonite. For me, I just moved to their Small Business option, since it’s free and meant that most of my backups just migrated over… I did not, however, look too much into Carbonite.

Just checking their site now, I would have to go for a minimum of the Plus plan, and that only allows me to back up 1 machine… I have 2 (currently) that get backed up… and probably more at a later date… And, given the Plus plan is $75 per year, that’s $150 for the 2 current machines, and an extra $75 per machine after that… Granted, they are saying it’s unlimited storage, but the extra fee per machine puts it out of contention…

CrashPlan’s Small Business plan is $10 per machine per month… So, my 2 machines would cost $240 per year to back up. They are giving me the next few months for free, and then a 75% discount for the first year, but we will see what happens when renewal comes along…

So, where does that leave me? Well, it’s looking like I will be using a mix of Backblaze and Hubic. The current idea is as follows:

  • GodBoxV2 and the Mac Mini both have Drobos plugged into them: the GodBoxV2 has a Drobo 5D, and the mini has a Drobo mini. There is also a Drobo FS on the network. I know they are not backup solutions, but it’s better than having stuff on single drives…

  • For the Windows boxes, I am backing up files to the cloud using Duplicati. These are backed up to both Hubic and Backblaze B2. Photos and important files are backed up to both, and stuff I want backed up, but not in 2 places, goes to Hubic only.

  • For my Mac mini, I use a mix of Duplicati and Arq Backup. Arq got B2 support recently, so it’s very handy for that.

  • For some of my Linux boxes, I use Duplicity. You may need to walk through the steps I have here to get it working with Hubic, and it also works with B2; there is a rough sketch of a B2 run below.
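
As a rough idea of what those Duplicity runs look like against B2 (the source path, account ID, application key and bucket name are all placeholders; the Hubic side needs the extra setup steps linked above):

# full backup of the home folder straight to a B2 bucket (placeholder credentials and bucket)
duplicity full /home/tiernan b2://ACCOUNT_ID:APPLICATION_KEY@my-backup-bucket/linuxbox
# and check what is in the backup chain afterwards
duplicity collection-status b2://ACCOUNT_ID:APPLICATION_KEY@my-backup-bucket/linuxbox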

Some further notes:

  • Hubic charge about €5 a month for 10TB of storage. If you refer people, that can go up. I have maxed mine out at 12.5TB.

  • Hubic is limited to 10Mb/s for uploads and downloads. I never got anywhere close to that speed with CrashPlan (I’m based in Europe, they are in the US, so speed of light and what have you… it maxed out most of the time at 4ish). That being said, even though it’s “limited” to 10Mb/s, I have seen higher… might be 10 per thread…

  • I did, for a while, use the Backblaze app and service. I just didn’t renew because of the move to B2. If you don’t want to mess with stuff, their app works great. And if you use this referral link you get a free month… Also, it’s FAST AF! It actually managed to max out my upload connection to their server! Won’t complain about that!

  • B2 is cheap: 0.5c per GB stored… So, it works out at about $5 per TB per month, not counting the usual bandwidth fees…

  • Duplicati and Duplicity both have options to send emails on success or failure, or the like. I recommend turning on the failure emails at a minimum… Also, on headless boxes, like my Linux box, I recommend checking the backups every week…

So, there we have it. Any questions?

Testing Forestry

So, as you probably know, this site is built with Jekyll. Jekyll is a Static Site Generator, basically taking a load of text files as input (see the source repo for this site on GitHub here) and generating a load of HTML (the static HTML is hosted on GitHub here, which auto-publishes to Azure App Service).

In previous posts, I have talked about using the likes of Visual Studio Code and Markdown Monster to build the site. Well, a few days back, I found Forestry.io. It’s a web application which, in my case, is linked to my GitHub repo (the Jekyll source one) and allows me to make changes to the code easily. Because the way I build my site is a little different, I manually build the site and push to the destination GitHub project, but they have features allowing you to push directly to SFTP or FTP servers, GitHub, or some other options.

The interface is nice and easy to use, and you can use drafts, etc. Mind you, even with drafts, because files are written into a public repo, they are not fully private… I suppose I could just make the source repo private… Anyway, they are free for single-user sites (like this one), or they have paid plans for teams (say, a business blog with more than 1 user updating it, for example).

VSCode and Markdown Monster with Powershell

A few years back, I created a post showing you how to add an alias to PowerShell to easily start Sublime Text from a PowerShell command line. This worked well, but this is 2017 (that post is from 2012!) and my daily text editor has changed. I have moved to Visual Studio Code for most of my daily work. It works well 95% of the time. I still use Visual Studio Pro for C# development, but for quick fixes and work on, say, Go, or smaller edits, Code is great. For blogging, on the other hand, I am trying out Markdown Monster, but Code still has some nice features. We will see how the tests go.

Anyway, to update that post, and to be able to use VS Code and Markdown Monster from your PowerShell command line, I added the following:

Set-Alias code "C:\Program Files (x86)\Microsoft VS Code\Code.exe"
Set-Alias mm "C:\Program Files (x86)\Markdown Monster\MarkdownMonster.exe"

You can either run this each time you open PowerShell (a bit of a pain) or you can add it to your Microsoft.PowerShell_profile.ps1, which lives in your Documents\WindowsPowerShell folder. If you don't have one, just create the file, add that piece of text, and the next time you open PowerShell, you are good to go.

In my case, I am actually in the Visual Studio Code Insiders group, so my alias is:

Set-Alias code "C:\Program Files (x86)\Microsoft VS Code Insiders\Code - Insiders.exe"

When these are set (and you have restarted your PowerShell windows), you can now run the following commands:

code filename.txt
code folder
mm filename.txt

VSCode will open either files or folders, but it seems Markdown Monster only opens files.

ZeroTier and Minio Follow-up

In a previous post, I talked about setting up a distributed S3-like data storage system using Minio and ZeroTier. Well, this week, the ZeroTier guys tweeted about it.

A few people then started asking questions and looking for a follow-up, so here it is…

First, a quick recap. I had 4 machines, all running Linux. Three of them were in one time zone (GMT+1) and one was in another (GMT). Looking at the Distributed Minio Quickstart Guide again, there is a mention of times needing to be in sync… which is probably why this did not work as planned… And by “not work as planned”, I mean that Minio would crash, or not be responsive, or not write data where it should have… which was a pain. Looking at the documentation again, they also mention that Windows support is “experimental”, which means, hopefully, some day it will be not so experimental and might just work… Given that most of my machines in the house are Windows boxes, that would be a nice feature.
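
If you want to rule out clock drift between nodes, a quick way to check and enable NTP sync on systemd-based Linux boxes is something like the following (other distros have their own NTP tooling):

timedatectl status                # shows the current time and whether NTP sync is active
sudo timedatectl set-ntp true     # enable sync via systemd-timesyncd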

Now, what about ZeroTier, given they posted it to their Twitter? Well, it worked. It did the interconnect stuff well, and, given the bandwidth limitations of a home broadband connection, it was still quite fast.

So, the question is, how fast? Well, on my Surface Book on a WiFi connection in the house, behind a Meraki MX64 firewall, connecting to the GodBoxV2 over FTP through ZeroTier, I get the following result:

The same download over FTP directly (no ZeroTier) does the following:

So, direct FTP is faster… in this instance by about 70%, but over the course of the download it did slow down (I’ve seen it hit 12 at one stage), and because it’s over WiFi, those numbers are a bit wonky…
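
FTP over WiFi is not the cleanest benchmark, so if you want to reproduce this sort of comparison, something like iperf3 against the ZeroTier address and then the LAN address of the same box gives a more direct number (assuming iperf3 is installed on both ends; the addresses below are placeholders):

# on the GodBoxV2
iperf3 -s
# on the Surface Book: once over the ZeroTier IP, once over the LAN IP
iperf3 -c 10.147.17.10
iperf3 -c 192.168.1.10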

I did get one last screenshot:

As you can see, the ZeroTier network adapter is showing 77.3Mbps, but the main network adapter is showing 80.8Mbps. There would be other traffic in there too, but if we assume nothing but ZeroTier traffic is being sent, that 3.5Mbps difference works out at roughly 4.5%, so call it about a 5% overhead.

So, to wrap up: Minio and its distributed storage system over ZeroTier need more testing. Ideally, all hosts need to be in the same time zone, or at least have the same time… I will try to work on that soon. As for ZeroTier? I am extremely happy with them. It’s fast, easy to set up, and easy to configure. What more could you ask for? Oh, and it’s free, unless you need a pro account!

Business Class Broadband… finally here….

So, after many (MANY) years messing with dual cable modems, struggling to get them working together, to get websites to even let me in, having to use hacks and kludges to get it to work at all… I have given up. It has been a struggle getting two modems working properly. Load balancing kind of works… but it’s messy at best. Some sites kick you out every now and again because your IP changes. Some sites won’t let you log in at all… Mind you, some sites work grand and don’t ask questions…

And the whole idea of multiple modems, to allow you to download things faster, doesn’t work for everything… Anything you download in the browser is single-threaded, so it’s limited to one modem… You can use download accelerators, and they do work, but it’s an extra step, and some sites don’t work with them either (MSDN, for example).

So, I have given up, bitten the bullet, and moved to business class broadband from Virgin Media. It’s actually cheaper than the two residential lines I had, but it is also slower than the two combined: previously, it was two 360/36Mb/s lines; now I am on a single 400/40Mb/s connection. That being said, there are definite advantages:

  • Static IPs pretty much as standard, with the option of either one or five (no in between!). Guess which one I went with? It’s technically a /29 range, which gives eight addresses, six of them usable after the network and broadcast addresses; the first usable IP is given to the modem, which acts as the gateway, so I end up with five usable.
  • Proper business class SLA. Any issues, someone who knows what they are talking about can help.
  • Phone lines on a separate modem. So, I got phone lines with them, and they give you a separate modem for those lines, so as not to interfere with the internet. That modem has no internet connection and is just for calls. They are also working on a VoIP/SIP offering, which is something I am interested in.
  • Guaranteed speed! They guarantee a minimum speed to the modem at all times. Business customers have priority on the network, which is nice. And, in testing so far, I am getting the advertised speed most of the time. I needed to download a Windows 10 ISO from MSDN yesterday, and it came in at between 45 and 48MBytes/s!

So, I have only had it installed a week, and so far, so good. I have one IP given to my pfSense box, and the rest given to a VyOS VM. The plan is to use the VyOS box for all network traffic, but first I need to do some testing and learning… Expect some posts on this soon!
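
For what it’s worth, handing one of the static IPs to the VyOS VM is only a couple of lines of config. A sketch, with documentation addresses standing in for the real /29 and the Virgin modem as the gateway:

configure
set interfaces ethernet eth0 address '203.0.113.3/29'
set protocols static route 0.0.0.0/0 next-hop '203.0.113.1'
commit
save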

Distributed S3 data storage using Minio (and ZeroTier)

So, something I have been looking into recently is distributed storage, and, more specifically, how to use the storage in my many, many machines to protect data and also increase my usable space… There are a few projects on the market that do this (Ceph, NooBaa and Gluster all spring to mind) but some are more painful to set up than others… which brings me nicely to Minio. Minio is a 20ish MB executable you download from their site, mark as executable (on Linux or Mac boxes) and run… and you have yourself an S3-compatible storage server… Simples!
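
To give you an idea of how little is involved, this is roughly the single-server setup on a Linux box (the download URL tends to move around, so check their site for the current one):

# grab the binary, mark it executable and point it at a folder to serve
wget https://dl.minio.io/server/minio/release/linux-amd64/minio
chmod +x minio
./minio server /data/minio-export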

“But wait!” I hear you scream. “That’s not distributed!” Well, yes… but it can be! Their Distributed Quick Start Guide, which is where I started with this, allows you to run a distributed copy of your data. I will let their documentation explain more, but this is what I did:

  • Download the Minio server (a single executable file) on a minimum of 4 machines.
  • On each machine, run a command like the following:
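
A rough sketch of what that command looked like at the time (the hostnames are placeholders, and the syntax may have changed in newer Minio releases):

# same keys and the same export path on all 4 boxes
export MINIO_ACCESS_KEY=accesskey
export MINIO_SECRET_KEY=secretkey
./minio server http://server1/foldertoexport http://server2/foldertoexport http://server3/foldertoexport http://server4/foldertoexport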

replacing accesskey and secretkey with your keys (check the Minio documentation on generating these) and foldertoexport with, well, the folder you want to export!

For me, I have 4 servers currently clustered: 2 are with Online.net (one in Paris, one in Amsterdam), 1 is with OVH (France, somewhere) and one is in Dublin (the GodBoxV2, currently). They are all interconnected using ZeroTier (I will explain that later) and so far, so good… I have only run some basic tests, but with this setup I could lose 2 machines and still have my data… Not bad for free! I will run some speed tests soon.

Docker Jekyll and Mr ngrok

See what I did with the title?! Anyway, in my last post, I explained how I was building this site with Docker running on Windows 10 with the Anniversary Update. Today, I am going to show you how to host it using Nginx and ngrok.

So, first, you should know what Nginx is at this stage… If not, check out their site. Next, ngrok is basically a way of tunnelling your localhost to the web. So, how do we tie the whole lot together and serve your site to the internet? Well, this is what I have so far:

First, build your site in Jekyll. For me, the command is:

docker run --rm -v "$(pwd):/src" -w /src ruby sh -c 'bundle install --path vendor/bundle && bundle exec jekyll build -s www.tiernanotoole.ie/ -d www.tiernanotoole.ie/_site/'

Next, run an Nginx server with that output folder:

docker run --name tiernanotoolenginx -v "$(pwd)/www.tiernanotoole.ie/_site/:/usr/share/nginx/html:ro" -d -p 8881:80 nginx

The Docker container is called tiernanotoolenginx, since I could have multiple ones running, and port 8881 is mapped to port 80 on that container, though technically that mapping might not be needed, due to the next command:

docker run --rm -it --link tiernanotoolenginx wernight/ngrok ngrok http tiernanotoolenginx:80

Essentially, what we are doing here is running ngrok and pointing it at port 80 on the Nginx container… You will see I did not point it at 8881, since we are talking to the container directly… it might be different if you were not…

When that command runs, you get a screen telling you the URL of your site, along with some basic stats. Your site is now hosted publicly, via an ngrok tunnel! You could run that container as a daemon and leave it running, but for me, I wanted to do some minor testing, so I can kill it when I want…

So, all is good with the world!

Building Jekyll sites with Docker on Windows

As some of you probably know (or can see from the footer of the site), this site is built with Jekyll. Jekyll is a static website builder, written in Ruby, and is a bit of a pain to build on Windows. Earlier this year, I wrote up a post explaining how to use Jekyll on Windows using Bash on Ubuntu on Windows… It was a bit complicated and, well, worked a few times, but was not too successful… So, where do we go next? Well, Docker to the rescue!

I am running the Windows 10 Anniversary edition, which has container and Docker support. Using the repo for this site and its scripts (specifically build-tiernanotooleie and geekphotographer.com), I can build the site with Docker on my local Windows machine and upload the sites as required (I host on NFSN and upload via rsync). The Docker image I build from is a Linux image, so I need a Linux container running (and the Docker tooling). I also use Bash on Ubuntu on Windows to upload using rsync. All is going well so far…
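
The upload side is just rsync from Bash on Ubuntu on Windows. Something along these lines (the username, host and remote path are placeholders; NFSN give you the real SSH details for your site):

# push the generated _site folder up to the host, deleting anything removed locally
rsync -avz --delete www.tiernanotoole.ie/_site/ username@your-nfsn-host:/home/public/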