4.2.3. Community Server

This is a recipe to set up a community server that runs not only GNU Health HIS but all related services from this project.

It uses a shared web server and a shared database server, but distinct application servers:

[Figure: community_server.png]

Community server based on different Incus containers. The directed arrows represent connection requests; the corresponding responses travel back along the same connections.

The controller in the bottom right connects to all instances using SSH and holds the Certificate Authority (CA) if necessary.

The Incus base can be created with a shell script (this requires Incus to be installed and set up):

$ bash inventories/test/generate_community_server_incus.sh
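
If the script finished without errors, all containers should be up. As a quick check (assuming the script names the containers after the services, as the host_vars files below suggest), list them:

$ incus list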

Afterwards, open a shell in the controller and set a password for the debian user:

$ incus shell controller
# passwd debian
    -> choose, set and store this password
# su debian
$ cd
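
Before proceeding, it can help to verify that the controller actually reaches the other instances over SSH. This sketch assumes the generate script already distributed the SSH keys and that the instances use the .incus DNS suffix, as in the inventory below:

$ ssh debian@postgresql.incus hostname   # hostname pattern assumed from the inventory files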

Next, install the requirements as described in the first productive example.
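
The exact package list depends on that example, but on a Debian-based controller it typically boils down to something like this (treat this as a sketch, not the authoritative list):

$ sudo apt update
$ sudo apt install --yes ansible git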

Then continue inside /opt/ansible/gnuhealth-ansible on the controller by handling the passwords:

$ for instance in "postgresql" "gnuhealth" "orthanc" "thalamus" "dhis2" "nginx"; do mv ~/${instance}.pw inventories/test/community_server/host_vars/${instance}.incus.yml; done
$ sudo chown $USER:ansible inventories/test/community_server/host_vars/*.yml
$ echo "postgresql_pw: $(tr -dc A-Za-z0-9 </dev/urandom | head -c 20; echo)" >> inventories/test/community_server/group_vars/all/vault.yml
$ echo "secretpassword" > passwordfile
$ for yml in inventories/test/community_server/host_vars/*.yml; do ansible-vault encrypt "$yml" --vault-password-file passwordfile; done
$ ansible-vault encrypt inventories/test/community_server/group_vars/all/vault.yml --vault-password-file passwordfile
$ rm passwordfile
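
To confirm that the vault files are really encrypted and that the chosen password decrypts them again, a quick check could be:

$ head -n 1 inventories/test/community_server/group_vars/all/vault.yml
$ ansible-vault view inventories/test/community_server/group_vars/all/vault.yml --ask-vault-pass

The first command should print the $ANSIBLE_VAULT header instead of the plain text password.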

The passwords in ~/${instance}.pw were created by the shell script generate_community_server_incus.sh.

Another password, for PostgreSQL usage, is generated directly on the command line.

But secretpassword is a placeholder that should be replaced by a password that you choose and store somewhere else. It is used to encrypt all the other passwords with Ansible Vault.
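
Should that vault password ever need to be changed later, the already encrypted files can be rekeyed instead of re-created, for example:

$ ansible-vault rekey inventories/test/community_server/group_vars/all/vault.yml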

Now start by testing connectivity:

$ ansible -m command -a "apt update" -i inventories/test/community_server all --become --ask-vault-pass
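
A lighter check that only verifies SSH access and Python on the targets, without touching the package index, is the ping module:

$ ansible -m ping -i inventories/test/community_server all --ask-vault-pass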

And finally run the playbook:

$ ansible-playbook playbooks/community_server.yml -i inventories/test/community_server/ -e ansible_user=debian --ask-vault-pass
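
To preview changes or to re-run only a single instance after a failure, the usual ansible-playbook options can be combined with the command above. Note that check mode may not work for every task, and nginx.incus is an assumed host name following the inventory naming:

$ ansible-playbook playbooks/community_server.yml -i inventories/test/community_server/ -e ansible_user=debian --ask-vault-pass --check
$ ansible-playbook playbooks/community_server.yml -i inventories/test/community_server/ -e ansible_user=debian --ask-vault-pass --limit nginx.incus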

If the services should be accessible from outside, create proxy devices - for example like this:

$ incus config device add nginx port443 proxy listen=tcp:0.0.0.0:9443 connect=tcp:127.0.0.1:443
$ incus config device add nginx port444 proxy listen=tcp:0.0.0.0:9444 connect=tcp:127.0.0.1:444
$ incus config device add nginx port445 proxy listen=tcp:0.0.0.0:9445 connect=tcp:127.0.0.1:445
$ incus config device add nginx port446 proxy listen=tcp:0.0.0.0:9446 connect=tcp:127.0.0.1:446
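
The proxy devices can be inspected or removed again on the nginx container:

$ incus config device show nginx
$ incus config device remove nginx port443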

Warning

This was tested in an Incus network with IPv6 disabled - otherwise PostgreSQL-related tasks might fail, as the HBA rules are currently based on IPv4 addresses.
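
As a sketch, IPv6 can be disabled on the bridge the containers are attached to (incusbr0 is the assumed default bridge name; adjust it to your setup):

$ incus network set incusbr0 ipv6.address none   # incusbr0 is an assumption, check with: incus network list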