Ansible script to self-host a bunch of services on your VPS in one go.
An Ansible playbook that sets up:
- Static website server (served through HTTP and Onion land).
- Open source, GDPR-compliant analytics @ `umami.domain`.
- Nextcloud instance @ `cloud.domain`.
- Vaultwarden instance @ `vault.domain`.
- SearxNG instance @ `searx.domain`.
- Gitea instance @ `git.domain`.
- Regular unattended backups and updates for these services.
- HTTPS all the things.
- Regular unattended SSL certs renewal.
- Hardened NGINX reverse proxy.
- Hardened SSH setup.
- Firewall and Fail2ban.
- Regular unattended system updates.
- A bunch of useful CLI tools.
- Your dotfiles set up and ready to go (using GNU-Stow).
- Up-to-date Neovim install (if an nvim config is found in your dotfiles).
The script requires a Debian-based VPS. The cheapest, most basic VPS you can find should do.
The machine where this script is run should have root SSH access to the VPS.
Root SSH connections will be blocked as part of the setup, and a dedicated sudo user will be used instead.
You need a valid domain name and your DNS records should be properly set up for your root domain as well as for (at least) the above-mentioned subdomains.
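Before running anything, you can sanity-check those DNS records with `dig`. The subdomain names below are the defaults listed above; swap in your own domain:

```sh
# Each query should print your VPS's IP address.
for sub in "" umami. cloud. vault. searx. git.; do
  dig +short "${sub}your.domain.com" A
done
```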
- Clone this repo.
- Copy `.env-sample.yml` to `.env.yml` and fill in your config.
- Run `./init.sh`.
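Concretely, once the repo is cloned, that boils down to (the keys to fill in are whatever `.env-sample.yml` defines):

```sh
cp .env-sample.yml .env.yml
"${EDITOR:-vi}" .env.yml   # fill in your domain, remote user, passwords, etc.
./init.sh
```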
You can use the `--tags` flag to run only some of the roles:

```sh
./init.sh --tags="harden,nextcloud,searx"
```

You can check the available tags in the `run.yml` file.
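If you have Ansible installed on your local machine, you can also list every available tag straight from the playbook (assuming `run.yml` is the entry point, as referenced above):

```sh
ansible-playbook run.yml --list-tags
```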
After the main playbook is done, you should find the Nextcloud, Gitea, SearxNG and Umami instances under their respective subdomains.
There should be a custom admin account already set up for Nextcloud and Gitea, as well as the default Umami admin user.
Have a look around and make yourself at home!
Public signups are disabled by default for Vaultwarden to improve security. You'll have to visit `vault.[your.domain.com]/admin` first, enter the `vaultwarden_password` defined in your `.env.yml` file, and manually allow your desired email address to sign up. This behavior can be changed, and more info can be found here.
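For reference, Vaultwarden itself controls signup behavior through environment variables on its container. The names below come from the upstream Vaultwarden documentation; how this playbook wires them into its docker-compose setup may differ, so treat this as a sketch:

```sh
# Upstream Vaultwarden settings, normally set in the container's
# environment (names per the Vaultwarden docs; this playbook's
# wiring may differ):
SIGNUPS_ALLOWED=false                  # this setup's default: no public signups
SIGNUPS_DOMAINS_WHITELIST=example.com  # optionally allow signups from a whole domain
ADMIN_TOKEN=changeme                   # the secret behind vault.[your.domain.com]/admin
```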
You'll find a lousy website under your root domain. It is stored in `/home/[REMOTE_USER]/website/` and you can modify it at any time using `scp` or `rsync` to upload your static website, blog or whatever else.

```sh
rsync --recursive --compress --partial --progress --times local_website/* [REMOTE_USER]@[your.domain.com]:~/website
```
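Or, with `scp` instead:

```sh
scp -r local_website/* [REMOTE_USER]@[your.domain.com]:~/website/
```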
The default docker-compose installation process provided in the Umami docs is followed. You can log in to `umami.domain` following the official instructions.

In case you prefer something simpler, a lightly modified version of this script is available in `/home/[REMOTE_USER]/analytics.rb`. Run it as follows to extract useful information from your server:

```sh
sudo cat /var/log/nginx/access.log | ~/analytics.rb
```
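To include rotated logs as well (logrotate gzips old ones on Debian), `zcat -f` reads plain and compressed files alike; running the glob under sudo lets it expand even if the log directory is restricted:

```sh
sudo sh -c 'zcat -f /var/log/nginx/access.log*' | ~/analytics.rb
```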
Both the OS and the individual services are updated on a monthly basis. Backups for the services are done weekly and are stored by default under `/home/[REMOTE_USER]/backups`. You can download them to your local machine with something like:

```sh
rsync --recursive --compress --partial --progress --times --rsync-path="sudo rsync" [REMOTE_USER]@[your.domain.com]:~/backups local_backup_dir
```
Having your private spot on the internet shouldn't be a luxury.