Tiernan's Comms Closet

Geek, Programmer, Photographer, network engineer…

Bulk updating Tasmota Devices over MQTT #100daysofhomelab

I have a load of these Smart Plugs from GoSund around the house (currently around 11, but more are still in boxes). The handy part is that they can be re-programmed using Tuya Convert, and with the following config you get power usage readings and an on/off switch. I have mine hooked up to an MQTT server, and with the MQTT integration in Home Assistant, I get all the details about power usage and can control each device I need to (hence the 11 of them!).

But MQTT can be used for more than monitoring: I can send commands to the devices. Given that all of them are on a locked-down network, with access only to the NTP server, internal DNS and the MQTT box, I needed to figure out how to get OTA updates to them. Luckily, you can change the OTA update URL on the web interface and download from a local endpoint… But, I am a Lazy Git, so I needed to figure out an automated way. Enter MQTT again.

First, log in to your Tasmota device, go to Configuration, then MQTT, and enter your MQTT host. Also, get your topic name while you are there and keep it handy.

Hit save and wait a few seconds for it to update. You need to watch the MQTT messages going through. I am using MQTT Explorer to see all messages.

For me, the tele topic has all my devices listed, plus stats around power, status, etc.
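If you would rather watch from a terminal than MQTT Explorer, the same MQTT CLI I mention below can subscribe to the tele topics. Something like this, where the host and port are placeholders for your own broker:

mqtt sub -t "tele/#" -h <mqtt host> -p <mqtt port>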

On the Tasmota site, they have documentation on sending commands over MQTT. I then installed the MQTT CLI on my Mac (so I could automate this later) and ran the following commands:

mqtt pub -t cmnd/tasmota_<deviceid>/OtaUrl -m "<internal url hosting tasmota updates>/tasmota.bin" -h <mqtt host> -p <mqtt port>

mqtt pub -t cmnd/tasmota_<deviceid>/Upgrade -m "1" -h <mqtt host> -p <mqtt port>

Update <deviceid>, URLs, host and ports as required. For the internal URL, I just have a small copy of Nginx running in Docker, serving a folder with copies of the latest OTA files from the Tasmota release page. I just downloaded all the files and put them in the folder shared by Nginx. I need to automate that a bit better… maybe next time there is a new release?
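For reference, the Nginx side is nothing fancy. A minimal sketch of how something like it can be run (the folder path, container name and port here are my own choices, not anything Tasmota requires):

docker run -d --name tasmota-ota \
  -p 8080:80 \
  -v /srv/tasmota-firmware:/usr/share/nginx/html:ro \
  nginx:alpine

The OtaUrl sent above then just points at that host and port, e.g. http://<internal host>:8080/tasmota.bin.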

The first mqtt command tells the device where the latest OTA file is; the second kicks off the update. If your devices are not segregated, you can just leave the existing OtaUrl there and kick off the Upgrade task on its own… I wasn’t that brave…
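Since the whole point is bulk updating, a small loop over the device IDs saves running the two commands by hand for each plug. A rough sketch, where the device IDs, host, port and URL are all placeholders for your own values:

for ID in ABC123 DEF456 GHI789; do
  # point each device at the local OTA file, then kick off its upgrade
  mqtt pub -t "cmnd/tasmota_${ID}/OtaUrl" -m "http://<internal host>:8080/tasmota.bin" -h <mqtt host> -p <mqtt port>
  mqtt pub -t "cmnd/tasmota_${ID}/Upgrade" -m "1" -h <mqtt host> -p <mqtt port>
done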

Within a couple of seconds, you will start to see messages showing up in MQTT Explorer. After a couple of minutes, all devices will have been upgraded and rebooted (no power down, luckily) and all is good!

DNSControl and Github Actions #100daysofhomelab

I am participating in the #100daysofhomelab challenge and have been posting a lot on Twitter as @tiernano, but some posts and tasks I am doing will require longer-form write-ups. So, some updates will include either videos (which will be published on my YouTube channel) or blog posts, which will go here. This is the first of the blog posts.

DNSControl is a tool written by the Stack Overflow lads (back when they called themselves Stack Exchange). It is designed to update DNS records and can work with both DNS providers and registrars. I use it to update records in Cloudflare and Route53, but many other providers are available. I wrote an article a while back about how to create reverse DNS records for IP space with Route53 and DNSControl; most of it is still relevant, and the main documentation site for DNSControl has a lot of useful tips.

Up till this morning, if I wanted to update a record, I checked out the DNS records from my private Github repo, made the change, and ran the DNSControl commands on my machine (check to syntax-check the file, preview to show what will change at the provider level, and push to make the changes). But I wanted to have some automation for this. So, enter Github Actions.
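For context, these are the three commands in question, run from the folder holding the dnsconfig.js and creds.json files (those file names are DNSControl's defaults):

dnscontrol check      # syntax-check dnsconfig.js, no network calls
dnscontrol preview    # compare the config against what each provider currently serves
dnscontrol push       # apply any differences at the providers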

I did a bit of digging and found a Github Action from koenrh called dnscontrol-action. The docs on this are quite simple to go through, so I created 2 action files for my repo: preview and push. A Gist for preview is below:

and the one for push is as follows:

The important parts are as follows:

In both preview and push, the check command does a syntax check of your DNS config file. Then preview will check the providers to see if any records need an update. When push runs, it will make the changes.

All my required secrets are set in the Github repo as secrets, so when the action runs, it pulls the required keys out and puts them into environment variables. I use name.com as a registrar for some domains (though most have now moved to Cloudflare, and some, like my .ie domains, are with Blacknight, who are not supported by DNSControl). Cloudflare is used by the majority of my domains, and Route53 is used for 2 domains currently. There are around 53 domains currently managed by this, and the plan is to add more. I also plan on getting some more automation around checking configs and sending alerts if anything changes.
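The secret names themselves are just whatever the workflow's env block references; with the GitHub CLI, loading them into the repo looks roughly like this (the names below are my own choices, not anything DNSControl mandates):

gh secret set CLOUDFLARE_API_TOKEN
gh secret set R53_ACCESS_KEY_ID
gh secret set R53_SECRET_ACCESS_KEY
gh secret set NAMECOM_API_TOKEN
# each command prompts for the secret value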

So, enough of “how it works”; let’s see it working!

Right. Let’s update my zt.tiernanotoole.net domain, which is used for Zerotier IPs internal to my network. It’s been a while since I did this, so most of the records will be removed and a few added… First, I create a new branch called zt-update and check it out in VSCode, make my changes, then git commit and git push to the branch.

At this stage, the actions have NOT run, since this is neither checked in to master nor a PR against master.

I go into the create PR section, and I can see the changes I have made. In my case, I removed a load of unused records and added a few extras:

I now create my PR and wait for the checks to complete:

Within a short time, I get an alert that all checks have passed, and I can find the results of the changes in the build output (it was meant to add a note to the PR with the details, but I might be missing something in my config…)

Also, not sure why it is redacting part of my name here…

I check the rest of the list, and other than the deletes and creates in Route53 for this domain, there are no other changes. So, being happy with that, I click Merge Pull Request, the code is checked into master, and the DNSControl push command runs:

If I now go into Route53, I can see the records on the site:

Happy days! Next challenges to fix:

  • fix the PR to include the output of check and preview
  • only run a check and push on the master branch, and no need to run preview again…
  • run preview once a week and send alerts if anything has changed

Till next time, good luck!

Unifi Network Update 7.1.61

A few weeks back, Ubiquiti released a pre-release update for the Unifi Network Controller, version 7.1.61. It got installed on my UDM and I noticed a few interesting bits that you might find handy… First, you will need to be signed up for Unifi Early Access before you can download or even read the release notes, but this is just a quick update based on my findings so far.

The first thing to note: You can see the list of devices connected to switches on the Overview Tab. I can’t remember exactly when that was added, but I think it’s new…

Under the ports tab, you now have a ports insight option:

Clicking this gives you:

You can also select multiple ports and make changes at a bulk level:

You can also see a bit more info about each port:

Teleport VPN is also now added. This makes giving someone access to your network a LOT easier than usual. They will need the WifiMan software on Android, iOS or Mac to join. Not sure what happens on a Windows machine… Maybe it’s coming soon? To use it, just generate a new link and send it to your user. Not sure how to remove them afterwards (if you want to give them temp access for example…)

Final Interesting part, and something I have been waiting for for a while, under Traffic Management, you can now create custom traffic rules:

You can set it based on destination Domain Name, IP or even the full internet:

And you can set the Source to be All Devices, group of devices (network) or individual (or multiple targeted) devices.

Finally, you can set the output internet connection.

If you have multiple internet connections and one has better speeds for stuff like Netflix, or you want to send bulk data over a different link, you can do that using this feature. Very cool stuff.

So, still testing, but looking good so far.

Raspberry Pi in a car, part 2

For the last few weeks, I have been running a Raspberry Pi in my car, along with a small UPS and a Wifi Access point, allowing me to download videos from my dash cam and back them up to my NAS in the house. But I have had some teething issues, and I am currently thinking my way through some fixes…

  • First, the Pi is connected to both the network in the car (via ethernet) and the network in the house (via Wifi). It seems that when the car is parked outside, sometimes the Pi can’t talk to the internet, and sometimes it can’t talk to the dashcam… It’s a routing issue, and it’s starting to annoy me…
  • I thought the onboard Wifi on the Pi was a little weak… it wasn’t getting much more than about 2-3Mbytes/s (16-24MBit/s) when downloading from the Pi to the House. Given the Pi was serving content from an SSD (not the internal MicroSD) I would have hoped for faster. I tried swapping in an external Wifi dongle with an aerial, but the same kind of speed… must be having issues getting through the metal and glass in the car, plus the metal, glass and brick in the house…
  • I started running out of disk space on the SSD on the Pi after about 3 or 4 weeks of video… so, I needed to tweak the command for the download script to only keep 14 days on the Pi (something like the one-liner after this list). Resilio Sync, the app I use to sync back to the house, has a “keep deleted files in an archive” folder option, so when the Pi does delete the files, they are still stored on the Pi… I would like to find a way of automating that…
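For the 14-day clean-up itself, something along these lines does the job (the path is a placeholder for wherever the downloads land on your setup):

find /mnt/ssd/dashcam -type f -mtime +14 -delete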

While trying to figure out how to fix part 1, I came up with an idea: I have an older Mikrotik RB951G that can be powered via a 12V adapter in the car. I am going to use that, along with a Huawei 4G dongle, as the internet connection. The onboard Wifi will be in client mode, so when it’s near the house it will connect to the main network and send traffic through that to the internet (or the internal NAS), and when away, it will use the LTE modem. The Wifi dongle on the Raspberry Pi will then be used as a Wifi AP.

Anything in the car that needs Wifi will connect to the Pi, which will act as a bridge to the Mikrotik. When the script needs to download files from the dashcam, it should have a direct connection to it, which will (hopefully) be faster… and the Pi gets its internet through the Mikrotik. The Pi has both Tailscale and Zerotier on it for remote management, and the Mikrotik can be configured to use Wireguard to connect back to the house directly if required.

I have some of this working on a bench in the house, but it will be a while before I manage to get this running fully… Hopefully, I will have some more stuff sorted this weekend…

Running a Raspberry Pi in a car and backing up dashcam footage

A few months back (well, November 2020) I wrote about connecting to my car with Zerotier. In this post, I mentioned using a TP-Link router running OpenWRT and a Huawei LTE dongle to connect to the internet, which allowed me to then connect to my Blackvue Dashcam and watch remotely… But it had some issues I wanted to fix:

  • The Huawei Wingle was a little slower on 4G than I would have hoped…
  • When the power in the car went out, everything stopped working immediately (the 12V sockets in the car only run for about 20 min after the engine shuts off)
  • It did not connect to the WiFi in the house when parked
  • No option for backing up Video…

So, I went digging to find some alternatives… and I realized I had a load of them floating around the house: the Raspberry Pi. Specifically, the 4GB Pi 4. I got my hands on a Pi UPS HAT, a couple of 18650 cells and an SSD expansion board with a 512GB (overkill, I know) SSD. I also got a BlackVue Power Magic Battery, B112, which powers the dash cam (a BlackVue DR750S-2CH). It has 2 USB ports, which allows me to run both the Pi and the new Wifi router, a Netgear Nighthawk M1.

When the car starts, it powers the BlackVue battery via the 12V socket in the boot of the car (trunk for my American friends). Cables run from there to the front of the car, where the front camera is (there is also a rear-facing camera in the boot… more cables!). This then also starts the Pi and starts charging the 2 18650 batteries. Finally, well, at the same time really, the Nighthawk starts running too. Because the battery in the Nighthawk was running hot, it has been removed.

The Pi is hooked to the Nighthawk via ethernet, and the Wifi is set to connect to the house when it sees it. The BlackVue uses the Wifi from the Nighthawk for its internet requirements. When the Pi boots, it connects to Zerotier for management via SSH or VNC (I use VNC to remote into the box and watch the live video when the car is parked or when someone else is driving).
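Getting the Pi onto Zerotier is just the standard install-and-join; a minimal sketch (the network ID is a placeholder for your own):

curl -s https://install.zerotier.com | sudo bash
sudo zerotier-cli join <network id>
# then approve the new member on my.zerotier.com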

There is also a Python script, scheduled to run every 15 min, that downloads the videos from the dashcam, along with any GPS and other info. The folder these files are downloaded to is on the SSD and is shared with my machine at home via Resilio Sync. To make sure I don’t burn through my LTE allowance, the machine at home is set to only download what I choose. So, if the car is somewhere else, I can download specific files when I want, or when it’s at home, I can download full days if required.
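The schedule itself is just cron on the Pi; roughly like this, where the script name and paths are placeholders rather than my actual layout:

*/15 * * * * /usr/bin/python3 /home/pi/dashcam/download_videos.py >> /home/pi/dashcam/sync.log 2>&1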

It’s been running for a few weeks now, and so far, so good. I haven’t had to do any clean up of the SSD, yet, but I would guess that eventually, I will need to look into that… With the 4G connection and Zerotier, I can then connect to my car and watch the live video whenever it is online, and whenever it is driving, within 15 min it will start downloading videos. I could, in theory, do a LOT more with the Pi in the car… Some ideas that come to mind:

  • Turn WIFI off on the Nighthawk and use the Pi as a Router, probably adding a second WIFI adapter to get better range… This could then have PiHole running on it for monitoring DNS traffic…
  • Since I have access to the GPS files in (somewhat) real-time, use it to map the car in somewhat real-time. Though, I do this already using Ruhavik and a Teltonika FMC-001.
  • Connecting to the car’s OBDII port (On-Board Diagnostics) and getting data from the car… Technically, again, the FMC001 does most of this, but in theory, it could be replaced with something else…

Keep an eye on the blog for future possible projects with this… Not sure where this project will get me, but we will figure it out at some stage… Leave a comment if you have questions!

Ubiquiti UDM Pro Fail over to Speedify

So, this has been a blog post in the making for a while now but never got around to fully writing it up, so here goes nothing…

I run a UDM Pro in the house. It has 2 WAN links: a 1Gb link and a 10Gb link. I also run AS204994, my own ASN, with its own transit and peering connections, mostly in Europe. There is a VM in the house which acts as the connection to AS204994, giving me a full connection to the Internet through my own ASN. More details on my AS204994 blog are here.

That connection is hooked up to the 10Gb link on the UDM Pro, which is listed as the primary internet link. Details on how this works were covered in this video on YouTube:

In the video above, I was using OpenMPTCPRouter to connect to the internet, but since it’s been causing some issues lately, I decided to try something else.

The new setup is an Intel Nuc (i3 with 32GB RAM and 2x512GB SSDs… VERY OVERKILL for the job at hand) running Ubuntu Linux. It has a USB Hub with 3 USB Ports and an Ethernet port connected, giving me 2 Ethernet ports on the box in total. 2 of the USB Ports are connected to USB 4G Modems from Huawei and the external ethernet port is directly connected to my cable modem.

USB Hub with 1 Huawei Modem and connection to second

Both modems and the ethernet port are connected to the NUC with full internet connections (the Huawei boxes hand out NATed IPs, but the cable modem gets a full public IP), and then Speedify takes those 3 connections and does some bonding magic. Speedify is a handy little VPN service that does connection bonding. You can use it to make sure your internet is rock solid using multiple links, keep streams stable, etc. It can bond Wifi links, LTE modems, cable modems, DSL and so on; anything that can connect can be bonded. The only issue I have with it, compared to OpenMPTCPRouter, is that you don’t control the upstream server…

Speedify is set in shared mode, so the internal port on the NUC shares the internet connection. This is hooked to the 1Gb WAN port on the UDM Pro, which is set for failover only (currently the only option on a UDM Pro), so if my AS204994 link goes down (VM reboots, VM host dies, cable modem connection goes out, etc.) I still have a connection. If the cable goes out, it will use just the 4G links, but if everything is running, I get all 3 connections.

Connecting to my car over ZeroTier

I use ZeroTier on my network for a good few things, including internal network peering between BGP VMs, management of machines, and now, connecting to my car over LTE. This is one of those posts that sounds silly, but is very handy! First, the parts list:

  • Car…
  • 3G/4G/5G modem of some sort. I am using a Huawei Wingle… Can be used without the Router below, but I wanted Zerotier, so I have it in modem only mode…
  • A router that supports Zerotier. I am using a modified TP-Link TL-WR703N upgraded to 16MB ROM and 64MB RAM. This is required for newer OpenWRT builds
  • a dashcam that connects over Wifi. I am using a BlackVue DR750S-2CH
  • Latest ROOter software from Of Modems and Men
  • Patience…

After installing the latest copy of ROOter on the TP-Link (or router of your choice) and getting the modem configured correctly (this took a while), you need to install the Zerotier software through the dashboard. Once installed, I joined my Zerotier network using the CLI (SSH into the router) and then approved it through the my.zerotier.com dashboard. Once it’s approved and connected, you can go to the router’s Zerotier IP and reach it directly. From here, you can either set up a route in Zerotier to point at the internal network behind the router or, in my case, set up an SSH tunnel to the dashcam. I found the IP given to the dashcam and used SSH forwarding to get to it. Finally, I used the URLs from Digital-Nebula’s hackview repo to get at the different endpoints. I use this to download stuff like GPS logs, emergency videos, etc. I have to clean up some scripts for this and plan to upload them at some stage.
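The SSH forwarding itself is just a local port forward through the router; something like this, where the dashcam IP and the router’s Zerotier address are examples rather than my real addressing:

ssh -L 8080:192.168.8.105:80 root@<router zerotier ip>
# the dashcam's HTTP endpoints are then reachable at http://localhost:8080/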

If anyone has any questions, leave a comment!

Backups, Backups, Backups!

I have posted about backups a few times on this site in recent years, and it’s still something I make tweaks to every now and again. The latest setup is probably over the top, but I will give you a walk-through of it, and some of it could be useful to some of you.

I have a couple of different machines and storage devices running that need backups. Some need daily backups, some could get away with weekly. The list is as follows:

  • GodBoxV1 (2X4 Core Xeon, 82GB RAM, Fedora, 512GB Boot SSD, 5x4TB HDD in ZFS RAIDZ1)
  • GodBoxV3 (2×20 Core Xeon, 192GB RAM, Ubuntu, 2x512GB NVMe SSD in RAID 0 for boot, 4x512GB NVMe SSD ZFS stripe for FAST storage, 8x8TB HDD in ZFS RAIDZ2 for bulk storage)
  • Docker Box (VM, runs a LOT of different containers on the network)
  • Synology DS1817+ (8x8TB HDD in SMR with 48TB usable, 2x10GB + 4x1Gb NICs)
  • QNAP TS-932X (5X8TB HDD in RAID 6 along with 4X512GB SSDs in RAID 5, 2X10Gb NICs)


GodBoxV2 and the 4 C6100 boxes are running Windows Server 2019, and I have 4 new C6220s which, when in production, may run either Server 2019 or VMware ESXi. More on this in a future post. GodBoxV1 and V3 are being backed up with Borg/Borgmatic; the Server 2019 boxes are running Hyper-V, and their VMs are not backed up on a nightly basis yet, but that is planned for the future…

Borgmatic is basically a very nice and handy wrapper for Borg itself. It lets you configure a YAML file with what you want to back up, what you want to exclude, where you want it backed up to (multiple locations if required) and details on retention, etc. It can also send notifications when a backup completes or fails. I have 3 main machines which are backed up using Borgmatic, but will probably add more at some stage. These three back up to 3 different locations: local ZFS storage in the house (currently on GodBoxV1), RSync.NET and Hetzner’s Storage Box. [Note: Hetzner has 2 types of storage: Storage Box and Storage Share. Storage Share seems to be NextCloud and does not have BorgBackup installed; Storage Box can be used with BorgBackup though.]

[Note: RSync.net have an offer for Borg Storage: 1.5c per Gig. So, 100Gb a year costs only $18. On their signup page, if you enter referral code 2019-09-13_05-27-04, I get some extra storage for backups on my end, and you can help me continue writing random stuff here!]

Nightly, Borgmatic runs and backs up everything important on GodBoxV1, V3 and the Docker box to all three locations. Then, on GodBoxV3, we back up some larger files (photos, video and other large data from my cameras) to Hetzner. I also plan on setting up a backup of those larger files to either the Synology or QNAP boxes at some stage. The reason the large files are only backed up to one location is size: they currently weigh in at around 300GB, give or take, and I currently have around 200GB of usable space with RSync.NET.
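The nightly run is nothing more than a cron entry calling borgmatic; an /etc/cron.d entry, roughly like this (the time and log path are my choices; borgmatic reads its YAML config from /etc/borgmatic/config.yaml by default):

0 3 * * * root /usr/local/bin/borgmatic --verbosity 1 >> /var/log/borgmatic.log 2>&1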

On a nightly basis, the Synology runs backups to Backblaze B2, Wasabi and Hetzner using Hyper Backup. Finally, on a weekly basis, some folders on the Synology are backed up to AWS Glacier.

This gives me a fairly good set of backup options, but there are some tweaks I want to make:

  • Important VMs on the Hyper-V Cluster should be backed up. Daily backup to local storage (QNAP, Synology, ZFS) and one weekly backup external (Hetzner, B2, RSync.net)
  • Large media files backed up to a second location, either local or remote.
  • Intel Nuc, Home Laptop and Mac Mini should also be backed up. 99% of the time they use storage from the ZFS pool or the NAS devices, but they still have local storage.
  • Look into backing up iPhones, Android Phones, iPads, etc, to local storage also. I do use PhotoSync to copy photos from my iPhone to the ZFS storage, which is backed up, but having something to backup the rest of the data, other than iCloud, would be handy.

So, that’s my 2020 backup plan. Any comments, questions, etc., shout in the comments section.

Domain Joining a machine over VPN and Password Resets/Changes with Azure AD

With the whole work-from-home thing probably becoming more and more normal in the years to come (I can count on 2 hands how many times I have physically been in my main office in the last 7 months), there are a couple of certainties that people will come up against. One is passwords expiring and needing to be changed, another is password resets being required, and finally there are laptops or desktops needing to be domain joined or connected to the domain before they can be fully provisioned. As the (currently only) IT guy in our office, I have had to deal with these first hand, and decided to write this post to help both my fellow employees and possibly other IT admins stuck with this challenge.

So, as the IT person, there are a couple of assumptions:

  • You have on premises AD
  • You have Azure AD (P1 and above seems to be required if users are synced from on-prem AD; the Free tier allows just cloud-only users).
  • Azure AD Sync installed and enabled

If all of the above are set, you will need to follow the steps to enable Azure Active Directory Self-Service Password Reset. I have enabled this on our domain. Next, you need to get your users to set up their secondary authentication methods as a backup. All our users have a 2FA requirement, so most of them had that already; new users need to go through that setup. Finally, if a user needs to change or reset their password, they can do so through https://aka.ms/sspr. If all is done well, that reduces the number of support calls I (and you) get.

Now, the next task: domain joining over VPN. This is a bit more “fun” to play with.

First, you need a VPN connection. We use Meraki gear with Active Directory for RADIUS auth. I won’t go into too much detail on setting that part up, but the script we use to build the VPN connections for users is below. This will probably be different for different VPNs, but it is our starting point.

Lines you need to change are 8, 9, 10 and 47. Line 39 can also be modified to switch from split tunneling (only sending traffic to internal subnets) to full tunneling (all traffic over the VPN). If you have multiple internal subnets, line 49 can be copied for each additional one.

The most important part we need, though, is line 34. The -AllUserConnection switch makes the connection available to all users on the machine, but also on the start screen. This is important.

So, with all that in place, you will need to connect to the VPN.

You should now be able to join the domain as if you were on your local network.

Enter domain details and change the name of the machine if required
When asked, enter your domain username and password
You will be welcomed to the domain
and then asked to reboot

Reboot your machine as usual, and when it boots, you should see a new option on the login screen.

VPN login option

Click this icon and, if you only have one VPN connection, the screen below will show up. If you have more than one, you will be given a list of options to use.

Login to VPN at the login screen

Enter your domain credentials. Since our AD and VPN use the same credentials, it will automatically log you in as well.

Machine is now domain joined and logged in, and in my case, finishing setup

So, there you have it. How to domain join a machine outside the network. Now, in reality, Azure Active Directory and Intune would probably be the better option, but that’s future work…

Apple event October 2020

[NOTE] This post was done entirely on an iPhone XS Max and an iPad Pro. Photos were taken on the iPhone, some edited on the iPhone, some on the iPad. I have edited some text on the iPad with the keyboard, but if I missed anything, apologies; it was all written mostly live. I will add extra links to places like Engadget, etc., below.

HomePod mini. $99, available 16 November. The intercom feature sounds good… When they mentioned the list of extra services, Spotify was very much missing… [NOTE] I missed some stuff on this because I was in a late meeting… It does look cool though.

iPhones. 5G available. 5G ultra wideband: 4Gb/s down and 250Mb/s up in ideal conditions. mmWave support. Low latency support. But that’s normal for 5G. Verizon is expanding their network to 60 cities by year end for ultra wideband and all cities for normal 5G. And it’s available on ALL models, not just the high end. Very handy. Rumours had suggested it would be limited to either the high end, or that mmWave would be available only on the Pro.

iPhone 12. First one announced. 5G support. New design; looks very iPhone 4-like. Bigger camera bump with 2 cameras. 6.1-inch display with a smaller border. Super Retina XDR display: 2 million to 1 contrast ratio, 460ppi, Dolby Vision, HDR10 and HLG support too, 1200 nits. Ceramic Shield on the screen to increase toughness; tougher than any smartphone screen.

Most 5G bands in any smartphone. Even the iOS core is modified to make 5G faster; when lower speeds will do, it can drop to LTE. It has been tested and gets up to 3.5Gb/s in the best conditions, 4Gb/s down on mmWave in the best conditions, and 1Gb/s in normal conditions.

A14 Bionic. 5nm process. 11.8 billion transistors. 6-core CPU, 4-core GPU. The Neural Engine goes from 8 to 16 cores and does 11 trillion operations per second.

Gaming stuff. Something called League of Legends. I’m not a gamer, so… Hmm… [I took this time to try to upload photos for this post…]

Camera looks very cool. Larger aperture for better low-light photos. Video looks cool too…

MagSafe for iPhone. Qi charging with magnets. 15W charger. NFC support too… New cases and a wallet, and the charger has a magnet. Apple has a duo charger for both iPhone and Watch. Belkin has a car dock and a multi-device charger too. I like the sound of the car dock, and a duo charger for iPhone and Apple Watch could be useful…

Recycling stuff. Lots of important stuff here… but very big words for trying to type live. They are removing chargers and headphones from the box. Smaller box, which means they can get more on a shipping pallet, which reduces CO2. And by removing the headphones and charger, they can save 2 million metric tonnes of CO2, the equivalent of taking 450k cars off the road. A USB-C to Lightning cable is included in the box.

iPhone 12 mini. Same spec as the full 12, just smaller.

The 12 mini starts at $699; the 12 non-mini is $799. More details on availability later in this post.

“There is simply nothing like iPhone 12”… Think that’s about to change now…

Pro line. They… Multiple… 12 pro. Still reminds me of the 4…

The Pro camera also looks very cool. The 12 Pro Max has a better camera.

ProRAW option: raw with some processing, available later in the year. Works on all 4 cameras. The flexibility of raw with Apple’s computational photography. Edit photos in the Photos app or in other professional apps. Wonder when Lightroom gets it.

Pro video. HDR shooting. Dolby Vision HDR recording in camera too. And the internet just went missing… Give me a sec…

Shoots the HDR video at 4K 60fps, and it can be edited on the phone… Nice.

LiDAR scanner. Interesting for AR objects. It was in the iPad Pro already. It can see in the dark too… 6x faster autofocus.

To finish up, a quick Gallery of the photos taken.