On AliExpress, I found this “Podofo” 10.1 inch LCD monitor for under EUR 40. It seems to tick all the boxes:
The BBC power supply should be able to power this monitor using a simple adapter cable.
The monitor was packaged well, arriving in a cardboard box containing all the cables, a power supply brick, a remote control, and an adjustable monitor stand. The remote control battery (a CR2025 button cell) is not included due to shipping constraints.
The monitor stand is attached to the display using a bolt and a square nut. You slide a square nut into the slot on the back of the monitor, and attach the stand to that nut. It worked, but I felt I had to over-tighten the bolt to get the monitor firmly attached.
With the BBC Micro configured for B/W output on the BNC socket, the monitor had trouble staying in sync: every 10 seconds or so, it would briefly drop out and re-sync.
I fixed this by installing the S39 jumper in the BBC Micro (between the BNC connector and the RF modulator). This way, the BNC socket provides colour video output.
With colour video, the monitor has no trouble at all staying in sync. Image quality is more than sufficient; I can use RGB (with RGBtoHDMI) in case I need the absolute best output quality.
The RGBtoHDMI adapter is an amazing Open Source project; it samples the “old” video output of your computer and converts it into super crisp HDMI output for your modern TV or monitor.
The “12-bit” board came with an adapter for the Acorn computer RGB output, as well as a professionally produced case to hold the adapter.
I used a Raspberry Pi Zero v1.3. After soldering on the header pins, I inserted it into the case. The MicroSD card will be installed later.
Next, connect the RGB adapter cable to the RGBtoHDMI board:
And install the RGBtoHDMI board on top of the Pi Zero:
Finished product:
The RGBtoHDMI software runs on “bare metal”, so there is no need for an operating system.
Go to the latest release, scroll down to “Assets” and download the ZIP-file (I’m running RGBtoHDMI_20230517_eb620884.zip).
Unpack the ZIP-file and copy all files to a freshly FAT32-formatted MicroSD card. Install the card into the RGBtoHDMI adapter - and you’re done.
NOTE: The RGBtoHDMI adapter is powered from the RGB connector on your computer - do not connect a USB power supply!
Connect the RGB connector to the BBC, connect the RGBtoHDMI adapter to your monitor (using a mini-HDMI to HDMI cable). Switch your monitor to HDMI input, and power on your Acorn Electron or BBC (Master). The adapter should power up (green LEDs), and after a brief wait you should see a menu overlay.
I used automatic calibration, and the result was great. Super crisp, pixel-perfect HDMI output from my BBC micro. Here’s a photo of the LCD monitor:
I used a Tuya TS011F_plug_3 Zigbee wallplug (available under various brand names). It only supports polling for energy consumption, but that is fine for this purpose. The bike charger plugs into this wallplug.
To start charging, connect the battery to the charger and press the “on/off” switch on the wallplug. Once Home Assistant determines that the battery is fully charged, it switches off the wallplug. It also sends a notification to our iPhones / Apple Watches informing us that charging is complete, with the actual amount of energy consumed.
The automation is triggered when the wallplug changes from “off” to “on”. It also triggers when the power stays below 5 Watts for over 10 minutes, which indicates that the battery is charged. You may need to play with these numbers for your specific charger and battery.
At the start of charging, the wallplug energy counter is stored in “sensor.var_schuur_acculader_kwh” (I use the Variables+History integration, but you could also create Helpers to store this value)
When charging is done, the wallplug is switched off. Consumed energy is calculated and sent as part of the notification message.
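The consumed energy is simply the difference between the wallplug's kWh counter at the end and the stored value from the start, converted to Wh. A minimal Python sketch of the same calculation (the function name is mine, for illustration only):

~~~
def consumed_wh(start_kwh: float, end_kwh: float) -> int:
    """Difference between two energy counter readings, converted from kWh to Wh."""
    return round((end_kwh - start_kwh) * 1000)

# Example: charging started at 12.345 kWh and finished at 12.731 kWh
print(consumed_wh(12.345, 12.731))  # prints 386
~~~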
The Home Assistant automation YAML looks like this:
~~~
alias: Shed - Battery charger
description: Bike battery charger in the shed
trigger:
  - platform: state
    entity_id:
      - switch.schuur_acculader
    from: "off"
    to: "on"
    id: schuur_acculader_start
  - platform: numeric_state
    entity_id: sensor.schuur_acculader_power
    for:
      hours: 0
      minutes: 10
      seconds: 0
    below: 5
    id: schuur_acculader_finish
condition: []
action:
  - choose:
      - conditions:
          - condition: trigger
            id: schuur_acculader_start
        sequence:
          - service: variable.update_sensor
            data:
              replace_attributes: false
              value: "{{ states('sensor.schuur_acculader_energy') }}"
            target:
              entity_id: sensor.var_schuur_acculader_kwh
          - service: notify.all_iphones
            data:
              title: Bike battery
              message: >-
                Battery charger switched on, start value {{
                states('sensor.var_schuur_acculader_kwh') }} kWh.
      - conditions:
          - condition: trigger
            id: schuur_acculader_finish
          - condition: state
            entity_id: switch.schuur_acculader
            state: "on"
        sequence:
          - type: turn_off
            device_id: 4b2c9083c0ffee1d3fffa35ddeadbeef
            entity_id: switch.schuur_acculader
            domain: switch
          - service: notify.all_iphones
            data:
              title: Bike battery charged
              message: >-
                The bike battery was charged with {{ ( ( 1000*
                states('sensor.schuur_acculader_energy') | float ) - ( 1000*
                states('sensor.var_schuur_acculader_kwh') | float ) ) | round()
                }} Wh, the charger has been switched off.
mode: single
~~~
Log in to your Synology NAS using SSH, become root (‘sudo -i’) and run:
~~~
insmod /lib/modules/tun.ko
~~~
On DSM >= 7.1, it should now automatically load on every boot. You can verify that the kernel module is loaded:
~~~
lsmod |grep tun
~~~
Add VPN credentials to your /volume1/docker/.env file:
~~~
# PrivateInternetAccess
PIA_USER="your_pia_username"
PIA_PASS="your_pia_password"
~~~
Add the Gluetun container to your docker-compose.yaml services:
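As a sketch of what such a service entry could look like, assuming PrivateInternetAccess via OpenVPN (the server region and volume path are my own placeholders; check the option names against the gluetun documentation):

~~~
  gluetun:
    image: qmcgaw/gluetun
    container_name: gluetun
    cap_add:
      - NET_ADMIN
    devices:
      - /dev/net/tun:/dev/net/tun
    environment:
      - VPN_SERVICE_PROVIDER=private internet access
      - OPENVPN_USER=${PIA_USER}
      - OPENVPN_PASSWORD=${PIA_PASS}
      - SERVER_REGIONS=Netherlands
      - TZ=${TZ}
    volumes:
      - ${DOCKERDIR}/appdata/gluetun:/gluetun
    restart: unless-stopped
~~~

The `${PIA_USER}` and `${PIA_PASS}` values come from the .env file created earlier.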
Test your setup:
~~~
root@nas# cd /volume1/docker
root@nas# source .env
root@nas# docker-compose -f docker-compose.yaml up
~~~
If all is well, your VPN tunnel is now up and running. The next step is to add a service that uses this tunnel.
Add qbittorrent to your docker-compose.yaml - note that you need to copy the ‘ports:’ entry to the gluetun container!
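The key detail is `network_mode: "service:gluetun"`, which routes all of qBittorrent's traffic through the VPN container; because the two containers then share one network namespace, the WebUI port mapping must be declared on the gluetun container instead. A sketch (image, port number and volume paths are assumptions to adapt to your setup):

~~~
  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent:latest
    container_name: qbittorrent
    network_mode: "service:gluetun"   # all traffic goes through the VPN container
    environment:
      - PUID=${PUID}
      - PGID=${PGID}
      - TZ=${TZ}
      - WEBUI_PORT=8090
    volumes:
      - ${DOCKERDIR}/appdata/qbittorrent/config:/config
      - /volume1/downloads:/downloads
    depends_on:
      - gluetun
    restart: unless-stopped
~~~

With this layout, add `- "8090:8090"` under the gluetun container's `ports:` section to reach the web interface.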
Test your setup:
~~~
root@nas# cd /volume1/docker
root@nas# source .env
root@nas# docker-compose -f docker-compose.yaml up
root@nas# docker exec -ti qbittorrent /bin/bash
curl ifconfig.io
~~~
The ‘curl’ command should show the VPN exit node IP address, not your own IP address.
When accessing the qBittorrent web interface, you will probably only see ‘unauthorized’. To fix this error, stop the container and add the following to your /volume1/docker/appdata/qbittorrent/config/qBittorrent/qBittorrent.conf:
~~~
WebUI\HostHeaderValidation=false
~~~
The default login is “admin”, password “adminadmin”. Please change this ;-)
Follow the instructions on www.smarthomebeginner.com, summarized here:
Log in to your Synology NAS web interface, and install the “Docker” package
Log in to your Synology NAS using SSH, become root (‘sudo -i’) and run:
~~~
root@nas# cd /var/packages/Docker/target/usr/bin
root@nas# mv docker-compose docker-compose.ORIG
root@nas# curl -L https://github.com/docker/compose/releases/download/v2.18.1/docker-compose-`uname -s`-`uname -m` -o docker-compose
root@nas# chmod 755 docker-compose
root@nas# docker-compose version
Docker Compose version v2.18.1
~~~
To check for newer versions, browse to the docker-compose releases page. Then, substitute “v2.18.1” in the ‘curl’ command above with the appropriate version number.
Initially, I will not use a reverse proxy. Ports 80 and 443 may already be in use by the DiskStation software, so when I add a reverse proxy later, I will use ports 8080 and 8443 instead of changing the Synology default configuration.
I will use ‘bridge’ networking, which means that the Docker containers share the IP address of the Synology NAS. You must therefore select a different port number for each application.
Create an environment file for Docker, similar to this:
~~~
root@nas# cat /volume1/docker/.env
PUID=1000
PGID=1000
TZ="Europe/Amsterdam"
DOCKERDIR="/volume1/docker"
~~~
Create a docker-compose.yaml file similar to this (minimal example, just runs the Portainer service on port 9000/tcp):
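A minimal sketch of such a file (the Portainer image tag and volume paths are reasonable defaults, not taken from the original setup):

~~~
version: "3.7"
services:
  portainer:
    image: portainer/portainer-ce:latest
    container_name: portainer
    ports:
      - "9000:9000"
    environment:
      - TZ=${TZ}
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ${DOCKERDIR}/appdata/portainer/data:/data
    restart: unless-stopped
~~~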
Test your setup:
~~~
root@nas# cd /volume1/docker
root@nas# source .env
root@nas# docker-compose -f docker-compose.yaml up
~~~
After a lot of searching I found a custom PCB to replace the CZ-TACG1. It was developed by the fine people at espthings.io; I managed to buy some boards from what appears to be their last batch. If you cannot get hold of one, you can build your own interface using an ESP32 board and a 3.3V to 5V level shifter.
The software is based on the Open Source ‘esphome-panasonic-ac’ project by GitHub user Dominik. The project contains instructions on how to build and connect your own hardware interface.
The ESPHome configuration YAML looks like this:
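My exact file is not reproduced here, but based on the upstream project's documentation it could look roughly like this (assuming the repository is DomiStyle/esphome-panasonic-ac; the board, UART pins and options should be verified against the project README):

~~~
esphome:
  name: panasonic-ac

esp32:
  board: esp32dev

wifi:
  ssid: !secret wifi_ssid
  password: !secret wifi_password

api:

external_components:
  - source: github://DomiStyle/esphome-panasonic-ac
    components: [panasonic_ac]

uart:
  id: ac_uart
  tx_pin: GPIO17
  rx_pin: GPIO16
  baud_rate: 9600
  parity: EVEN

climate:
  - platform: panasonic_ac
    type: cnt          # CN-CNT port, as used by the CZ-TACG1
    name: Panasonic AC
~~~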
In Home Assistant, a custom dashboard can be built using the following YAML:
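As an illustration, a simple dashboard card for the air conditioner could be defined like this (the entity id is an assumption; it depends on the name configured in ESPHome):

~~~
type: vertical-stack
cards:
  - type: thermostat
    entity: climate.panasonic_ac
  - type: entities
    entities:
      - climate.panasonic_ac
~~~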
I printed a nice case using the enclosure model files (.ZIP, local copy) provided by ESPthings.
The project uses a 5V LJ18A3-8Z/BX Proximity sensor from AliExpress. Since I already had some ESP32 boards, that’s what I used.
The ESP32 runs on 3.3 Volt. The sensor needs 5 Volts, so in theory a level shifter is required. Even though the Espressif CEO claims that the chip is 5V tolerant, I still added a protection circuit to the proximity sensor, limiting the input voltage to safe values.
For testing, I mounted the components on a breadboard:
And 3D-printed a sensor mount:
The ESPHome configuration YAML looks like this:
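My configuration file is not included here; a minimal sketch that exposes the proximity sensor as a binary sensor could look like this (the GPIO pin and names are assumptions):

~~~
esphome:
  name: proximity-sensor

esp32:
  board: esp32dev

wifi:
  ssid: !secret wifi_ssid
  password: !secret wifi_password

api:

ota:
  - platform: esphome

binary_sensor:
  - platform: gpio
    name: "Proximity detected"
    pin:
      number: GPIO27
      mode: INPUT_PULLUP
      inverted: true   # NPN sensor pulls the pin low on detection
~~~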
To enable WiFi support, the printer needs to be running firmware version 4.4.0 or above - so I downloaded and installed the new printer firmware before attempting the hardware install. This took a couple of minutes since the boot loader is also upgraded.
Installing the ESP-01S module was uneventful - the excellent “How to set up Wi-Fi” guide on the Prusa3D website lists all required steps. Just take care when removing the power switch to prevent damage to the enclosure. The ESP-01S module fits - barely.
After powering on, the printer flashes the firmware on the WiFi module and asks you to supply the WiFi credentials (in cleartext!) in a text file on a USB stick. A couple of seconds after I provided the file, the printer joined the wireless network.
Now, go to the Network menu on the printer to find the IP address. Also, make a note of the Prusa Link API key - this is needed to access the webpage.
You can now browse to the listed IP address from your PC (something like ‘http://192.168.1.123’). Enter the API key and you’re all set.
You may want to assign a static IP address (for example, in your router) so you can always find your printer at the same address.
The standard website functionality is quite limited, but at least you can drag and drop GCODE files to the printer. To improve the experience, create a new “Physical Printer” in PrusaSlicer.
Voila: no more shuttling USB-sticks around, just send the GCODE to the printer via WiFi! (there’s a button in PrusaSlicer for that)
As of today, I only host statically generated websites, built using Jekyll. This makes maintenance a lot easier and removes most of the security threats.
I exported my website using the Wordpress to Jekyll Exporter plugin (developed on Github), then reviewed and updated all the blog posts.
There was a lot of cruft in the form of image thumbnails dating back to the days when the site ran the Serendipity CMS; cleaning those up and replacing stale embedded video links with updated ones took a lot of time.
I use the minimal-mistakes theme.
jekyll-feed
The jekyll-feed plugin creates an Atom/RSS feed for your site at /feed.xml.
jekyll-target-blank
By default, hyperlinks do not open in a new browser tab. The jekyll-target-blank plugin (Github link) for Jekyll fixes this.
jekyll-titles-from-headings
The jekyll-titles-from-headings plugin pulls the page title from the first Markdown heading when none is specified. This should not be needed if you always specify titles. I used this mostly while testing during the Wordpress migration.
Any attempt to upgrade these systems to a newer Windows 10 release fails; the screen just goes black and the system hangs, with no diagnostics whatsoever. After many frustrating attempts I decided to do a “clean” install.
I extracted the Product Key from the running OS using the free ProduKey utility. Then, I downloaded the Windows 10 Home installation media and created a bootable USB stick.
This time, the installer threw an error before hanging: “clock watchdog timeout”. Google finally came up with a few relevant links (tenforums.com and hardforum.com) that pointed me to the WiFi add-on card.
I never used WiFi on these PCs, so I pulled the card out and, lo and behold, the Windows 10 installation finally succeeded without problems!