Posted in Uncategorized

Mastodon is shaping up to be a real Twitter alternative contender, especially in the tech sphere, so I wanted to try my hand at running my own instance in Docker. The experience of getting it up and running turned out to be not the most streamlined, but hopefully this guide will make things a little easier for others in the future.

For reference, I’ve used this guide to install a local dev copy on my MacBook Pro as well as on a production Ubuntu server.

Read More »

Posted in Uncategorized

As the proud owner of a Swann NVR-7090 4-camera security system, I’ve always had a passionate distaste for the SwannView Plus app. Not only is the app itself terribly made, with a non-standard, unintuitive, tooltip-free UI, but it’s also not even mentioned on the official Swann website outside of some random support threads. There is a Windows version that hasn’t been updated in years, and an OSX version that is 32-bit only and doesn’t work on modern versions of OSX. This set of cameras does not work with any of the other SwannView apps (why are there so many of them? They all do the same job!). Argh! The one light at the end of the tunnel is that this system can be streamed over the RTSP protocol, allowing access with more modern apps such as VLC. There are a few things you’ll need to do to get everything working, and I’ve done my best to describe every step in detail below.
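
Just to show where this ends up: once RTSP is enabled, opening a camera stream from VLC is a one-liner. The address below is purely a placeholder; the IP, credentials, port and stream path all depend on your NVR’s settings, which I go through below.

# Hypothetical example only: substitute your NVR's IP, credentials and stream path
vlc "rtsp://admin:password@192.168.1.50:554/ch01/0"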

Read More »

Posted in Uncategorized

Step 1: Create a self-signed root certificate

First, let’s create a self-signed root certificate:

openssl req -x509 -nodes -new -sha256 -days 390 -newkey rsa:2048 -keyout "RootCA.key" -out "RootCA.pem" -subj "/C=de/CN=localhost.local"
openssl x509 -outform pem -in "RootCA.pem" -out "RootCA.crt"

The parameter -days 390 sets the number of days this certificate is valid. Starting on September 1st (2020), SSL/TLS certificates cannot be issued for longer than 13 months (397 days).

https://stackoverflow.com/a/65239775

If this time is too long, you will receive a NET::ERR_CERT_VALIDITY_TOO_LONG error. In the command above, the value was set to 390 days, which works for me.
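
To double-check how long the root certificate is actually valid, you can print its start and expiry dates with openssl:

# Show the notBefore/notAfter dates of the root certificate
openssl x509 -in RootCA.pem -noout -dates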

Step 2: Define domains and subdomains that should be included in the certificate

For this, just create a text file named vhosts_domains.ext and insert the following contents:

authorityKeyIdentifier=keyid,issuer
basicConstraints=CA:FALSE
keyUsage = digitalSignature, nonRepudiation, keyEncipherment, dataEncipherment
subjectAltName = @alt_names
[alt_names]
DNS.1 = localhost
DNS.2 = *.mixable.blog.local
DNS.3 = mixable.blog.local

This example includes subdomains for a local development environment for the domain mixable.blog.local and all subdomains like www.mixable.blog.local or apps.mixable.blog.local.

If you plan to use a more general certificate, e.g. one that includes all subdomains under *.*.blog.local, this will not work: the subjectAltName definition only supports ‘first level’ wildcard subdomains. It would be great, because it would save a lot of additional setup, but unfortunately it is not supported.

Step 3: Create the certificate

Now let’s create the certificate:

openssl req -new -nodes -newkey rsa:2048 -keyout localhost.key -out localhost.csr -subj "/C=de/ST=State/L=City/O=Organization/CN=localhost.local"
openssl x509 -req -sha256 -days 1024 -in localhost.csr -CA RootCA.pem -CAkey RootCA.key -CAcreateserial -extfile vhosts_domains.ext -out localhost.crt

Calling the two commands above will create the localhost certificate that includes all the provided domains and subdomains. Your folder should now contain RootCA.key, RootCA.pem, RootCA.crt, RootCA.srl, vhosts_domains.ext, localhost.key, localhost.csr and localhost.crt.
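
To confirm that all of the domains actually made it into the certificate, you can inspect its Subject Alternative Name section:

# List the domains included in the certificate
openssl x509 -in localhost.crt -noout -text | grep -A 1 "Subject Alternative Name"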

Step 4: Make the certificate available for Apache

Depending on your system, copy all those files into the configuration folder of your Apache installation. In my case, the installation was done with the help of brew, so the local path is:

/usr/local/etc/httpd/cert/

In the end, it’s not important where those files are located, because we now add this path to the vhost definitions. For this, open your vhosts file and reference the crt and key files as follows:

# mixable.blog.local
<VirtualHost *:80>
    ServerAdmin webmaster@example.com
    DocumentRoot "/Users/mathias/Sites/mixable.blog.local"
    ServerName mixable.blog.local
    ServerAlias mixable.blog.local
    ErrorLog "/usr/local/var/log/httpd/localhost-error.log"
    CustomLog "/usr/local/var/log/httpd/localhost-access.log" common
</VirtualHost>
<VirtualHost *:443>
    DocumentRoot "/Users/mathias/Sites/mixable.blog.local"
    ServerName mixable.blog.local
    SSLEngine on
    SSLCertificateFile "/usr/local/etc/httpd/cert/localhost.crt"
    SSLCertificateKeyFile "/usr/local/etc/httpd/cert/localhost.key"
</VirtualHost>

If you have additional vhost definitions, you can add the <VirtualHost *:443> block for every server name entry, using the correct paths for SSLCertificateFile and SSLCertificateKeyFile.

After changing the vhost settings, you need to restart your Apache server!
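
For an Apache installed via brew (as above), the restart looks something like this; if your Apache runs some other way, apachectl does the same job:

# Restart the brew-managed Apache service
brew services restart httpd
# Or, for a non-brew installation:
sudo apachectl restart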

Step 5: Add the certificates to macOS

When opening a local website, the certificate should be used, but you might still see a NET::ERR_CERT_INVALID error. This is because modern browsers/systems do not trust self-signed certificates by default. To overcome this issue, we have to add the created certificates to the macOS Keychain Access. For this, open the *.crt files in Keychain Access:

So that they are known by macOS:

And finally, update the trust settings of each certificate to “Always trust”:

You should now be able to use a secure connection between your browser and your local server:
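
If you prefer the terminal over clicking through Keychain Access, the same result can be achieved with macOS’s built-in security tool (a sketch; you’ll be asked for your admin password):

# Add the root certificate to the system keychain and mark it as trusted
sudo security add-trusted-cert -d -r trustRoot -k /Library/Keychains/System.keychain RootCA.crt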

Step 6: Additional fixes

The steps above might already work for Chrome and Safari. If you have problems with Firefox, open its settings, go to Privacy & Security, and import the root certificate file RootCA.crt under the Certificates section so that Firefox knows about your certificate.

This post was copied pretty much verbatim from Mathias Lipowski’s Create certificate for localhost domains on macOS.

Read More »

Posted in Uncategorized

With Elden Ring saving every step you take, and stories of hackers destroying save files with various exploits, it’s important to keep a healthy backup schedule. For that reason I’ve whipped up an improved version of my Subnautica backup script for Elden Ring.

This script will check if Elden Ring is running and, if so, will zip the save folder to whatever folder the script is located in.

@echo off

:loop

rem Save the current datetime to a var https://stackoverflow.com/a/203116
For /f "tokens=2-4 delims=/ " %%a in ('date /t') do (set mydate=%%c_%%a_%%b)
For /f "tokens=1-2 delims=/:" %%a in ("%TIME%") do (set mytime=%%a_%%b)
set mydatetime=%mydate%__%mytime%

rem Set up a file name for our new backup
SET stamp=backup-%mydatetime%.zip

rem Determine if Elden Ring is running (find sets ERRORLEVEL to 0 when the process is found)
tasklist /fi "ImageName eq eldenring.exe" /fo csv 2>NUL | find /I "eldenring.exe">NUL

rem If Elden Ring is running...
if "%ERRORLEVEL%"=="0" (
	rem Back up with Windows 10 'tar.exe' https://superuser.com/a/1473257
	tar.exe -a -cf "%stamp%" "%USERPROFILE%\AppData\Roaming\EldenRing" 2> NUL

	echo [%TIME%] Backed up to %stamp%
) else (
	echo [%TIME%] Not running...
)

rem sleep for 30 minutes https://serverfault.com/a/432323
timeout /t 1800 /nobreak

goto loop

The latest version of this code is available over on my GitHub page. It requires no external tools to run, just Windows 10 or newer.

To run the script, simply save the above code into a file named something like backup.bat and double-click it. Run it whenever you’re playing and close it when you’re done.
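
To see what’s inside a backup, or to restore one, the same built-in tar.exe can be used. The filename below is just an example; note that tar stores the paths relative to the drive root, so extracting with -C C:\ puts the EldenRing folder back where it came from (overwriting the current saves).

rem List the contents of a backup
tar.exe -tf "backup-2022_03_20__21_30.zip"

rem Restore a backup over the current save folder
tar.exe -xf "backup-2022_03_20__21_30.zip" -C C:\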

Read More »

Posted in Uncategorized

After an unfortunate incident involving a crash and a lost save file, I decided to write a batch script to automatically back up my Subnautica: Below Zero saves every 30 minutes. You’ll need 7-Zip and PowerShell installed.

This script will check if Subnautica: Below Zero is running and, if so, will zip the save folder to whatever folder the script is located in.

@echo off

:loop

rem Get HH:MM:SS for log prefix
FOR /F "tokens=* USEBACKQ" %%F IN (`powershell get-date -format ^"{HH:mm:ss}^"`) DO (
	SET logtime=%%F
)

rem Get current datetime https://stackoverflow.com/a/2854857
FOR /F "tokens=* USEBACKQ" %%F IN (`powershell get-date -format ^"{yyyy_MM_dd_HH_mm_ss}^"`) DO (
	SET mydatetime=%%F
)

rem Set up a file name for our new backup (done before the check below so the
rem variables expand correctly without delayed expansion)
SET stamp=slot0000-%mydatetime%.zip

rem Determine if subnautica is running
tasklist /fi "ImageName eq SubnauticaZero.exe" /fo csv 2>NUL | find /I "SubnauticaZero.exe">NUL

rem If subnautica is running (find sets ERRORLEVEL to 0 when the process is found)...
if "%ERRORLEVEL%"=="0" (
	rem Back up with 7zip
	"C:\Program Files\7-Zip\7z.exe" a -tzip "%stamp%" "%USERPROFILE%\AppData\LocalLow\Unknown Worlds\Subnautica Below Zero\SubnauticaZero\SavedGames\slot0000" > NUL

	echo [%logtime%] Backed up to %stamp%
) else (
	echo [%logtime%] Not running...
)

rem sleep for 30 minutes https://stackoverflow.com/a/16803409
powershell -command "Start-Sleep -s 1800"

goto loop

To run the script, simply save the above code into a file named something like backup.bat and double-click it. Run it whenever you’re playing and close it when you’re done.

Read More »

Posted in Uncategorized

For something that should be simple, importing a model from VRoid Studio into Unreal Engine 4 is, as of the time of writing, extremely convoluted and time-consuming, even with the great tools that have been created to speed up the process.

Below I’ll attempt to walk through every step I took to get things looking and working correctly as well as documenting my struggles along the way. Our basic workflow will be VRoid -> Blender -> Unreal.

I should note that, as of the time of writing, the latest versions of the software I’ll be using are as follows:

Read More »

Posted in Uncategorized

For the longest time now I’ve been having issues with certbot not being able to create a certificate for my domain, returning the error:

Attempting to renew cert (mydomain.com) from /home/ubuntu/.certbot/config/renewal/mydomain.com.conf produced an unexpected error: Failed authorization procedure. mydomain.com (http-01): urn:ietf:params:acme:error:unauthorized :: The client lacks sufficient authorization :: Invalid response from https://mydomain.com/.well-known/acme-challenge/-jYlHtpK6x6LZ8B4KjHeY7RgchNFPoouXADS_XQtowc [2606:4700:3035::681c:1e6e]: "<!DOCTYPE html>\n<!--[if lt IE 7]> <html class=\"no-js ie6 oldie\" lang=\"en-US\"> <html class=\"no-js ". Skipping.

I think the reason for this is that I’m using Full (Strict) encryption mode in the CloudFlare dashboard, which requires a valid SSL certificate to be present when communicating between my web server and CloudFlare.

Full (strict) SSL/TLS encryption mode with CloudFlare

The solution is, instead of using certbot’s default authentication method, to make use of the certbot-dns-cloudflare plugin, which handles the Let’s Encrypt challenge through DNS. This works by automatically creating and deleting our CloudFlare DNS TXT record for us during the certbot renew. Let’s set this up now.
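
Roughly, the end result looks like the sketch below. The paths, domain and token are placeholders (you can also authenticate with your global API key instead of a scoped token); the full walkthrough follows.

# Install the plugin (it needs to match how certbot itself was installed)
pip install certbot-dns-cloudflare

# Store the CloudFlare credentials somewhere only you can read them
mkdir -p ~/.secrets
echo 'dns_cloudflare_api_token = your-api-token-here' > ~/.secrets/cloudflare.ini
chmod 600 ~/.secrets/cloudflare.ini

# Request the certificate using the DNS-01 challenge instead of HTTP
certbot certonly --dns-cloudflare \
  --dns-cloudflare-credentials ~/.secrets/cloudflare.ini \
  -d mydomain.com -d www.mydomain.com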

Read More »

Posted in Uncategorized

With the official How to Transfer Data Between microSD Cards for Use on Nintendo Switch documentation being for Windows only, and the Reddit thread on the topic not coming up with anything that works on the latest firmware, I thought I’d write up a quick post on how I moved from a 128GB to a 512GB card successfully using OS X.

  1. Turn off your Switch by holding the power button for 3 seconds and selecting the relevant option.
  2. Remove your old SD card from your Switch
  3. Insert it into your Mac
  4. Run the following in your terminal
    
    mkdir ~/Desktop/sdcard
    cp -r /Volumes/Untitled/Nintendo ~/Desktop/sdcard
  5. Insert your new SD card into your Switch and turn it on
  6. If an error message about your SD card not being readable pops up, close it
  7. Go to Settings – System – Formatting Options – Format microSD Card and format your card
  8. Once the files have finished copying on your Mac eject your old SD card and store it away for safe keeping
  9. Turn your Switch off, take your new SD Card out and insert it into your Mac
  10. Run the following in your terminal (an optional way to verify the copy is shown after these steps)
    
    cp -r ~/Desktop/sdcard/Nintendo/* /Volumes/Untitled/Nintendo
  11. Once the files have finished copying eject your new SD Card, insert it into your Switch and turn your Switch on
  12. If there is no error message, you’re all done!
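
As the optional check mentioned in step 10: before ejecting the new card, you can verify that everything copied across correctly. This assumes the new card also mounts as /Volumes/Untitled; no output means the two copies match.

# Compare the backup on the Desktop against the new card
diff -rq ~/Desktop/sdcard/Nintendo /Volumes/Untitled/Nintendo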

Read More »

Posted in Linux, Uncategorized

I have a bunch of sites in /var/www and need individual user logins with access to their respective sites. In this tutorial I’ll go over how to create a user, chroot jail them and allow access to specific folders (in our case web directories).

For reference I’m using a standard LAMP server on Ubuntu:

sudo apt-get install -y tasksel
sudo tasksel install lamp-server

but this tutorial will work for any web server configuration.

 

1. Create User, Assign Web Group

# Create the user setting group to www-data
sudo useradd -Ng www-data myuser
sudo passwd myuser
 
# Restrict login to SFTP only
sudo groupadd sftp-only
sudo usermod myuser -G sftp-only

 

2. Create their web directory and provide access

With the new user created, make a directory matching their website’s name and mount the real website folder to it:

# Create chroot directory and set permissions
mkdir -p /home/myuser/mysite.com/html
chmod 755 /home/myuser/mysite.com/html
 
# Mount the destination directory at the directory we just created
mount --bind /var/www/mysite.com/html /home/myuser/mysite.com/html
 
# Add the above command to /etc/rc.local to mount it on boot
nano /etc/rc.local
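
Inside /etc/rc.local the entry is simply the mount command from above, placed before the final exit 0 line. Alternatively (an equivalent approach, not part of the original steps), a bind entry in /etc/fstab survives reboots without touching rc.local:

# /etc/fstab
/var/www/mysite.com/html  /home/myuser/mysite.com/html  none  bind  0  0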

 

3. Restrict the user to SFTP Only

We only want to allow SFTP access for this user. First, open /etc/passwd and make sure the end of myuser’s line has /bin/false, like so:

tail -n1 /etc/passwd
# myuser:x:1001:33::/home/myuser:/bin/false
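
If the shell is something else (e.g. /bin/bash), you can change it like so:

# Set the user's login shell to /bin/false
sudo usermod -s /bin/false myuser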

Now edit /etc/ssh/sshd_config to restrict myuser to SFTP only:

Match User myuser
  ChrootDirectory /home/myuser
  ForceCommand internal-sftp
  AllowTcpForwarding no
  X11Forwarding no

Restart the SSHD service:

sudo service sshd restart

Now when you try to SSH in with this user you’ll get the error:

This service allows sftp connections only.
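
Connecting over SFTP instead should drop the user straight into their chrooted home directory, something like:

# Connect as the restricted user (replace the hostname with your server)
sftp myuser@example.com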

 

That’s it! They should now be able to SFTP in and will only have a mysite.com directory with access to their web files.

 

Further Reading

mihai.ile’s post on Stack Overflow – How can I chroot sftp-only SSH users into their homes?

Read More »