Configuring Synapse Pro for small hosts
New to Synapse Pro for small hosts? We recommend starting with the overview documentation for a high-level introduction.
The sections below cover configuring a Synapse Pro Shards cluster and its homeserver tenants.
Setting up a Synapse Pro Shards cluster
Synapse Pro Shards can be installed with Helm using `helm install synapse-shards -n <namespace> oci://registry.element.io/synapse-shards`.
See the Synapse Pro Shards Helm chart README (`synapse-shards`) for more information.
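As a sketch, the install command can be combined with a custom values file; the namespace and values file name below are illustrative placeholders:

```shell
# Illustrative install: "synapse-shards" namespace and
# "shards-values.yaml" are placeholders for your own setup.
helm install synapse-shards \
  -n synapse-shards --create-namespace \
  -f shards-values.yaml \
  oci://registry.element.io/synapse-shards
```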
Tuning the Synapse Pro Shards controller behaviour
The controller distributes tenants across shards, provisioning new shards only when existing ones are saturated. It also moves tenants to new shards to run Synapse version upgrades. Finally, when tenants are deleted, it consolidates tenants to reduce the number of running shards, targeting an average tenants-per-shard load.
The following three values are available when setting up the chart:
```yaml
## Maximum shards managed by this Synapse Pro Shards controller
maxShardsPerSynapseImageTag: 5
## Homeserver tenants per shard
maxTenantsPerShard: 25
## Average shards load redistribution threshold
## Will trigger a shard redistribution when the average load of all shards goes below this value
avgShardsLoadRedistributionThreshold: 15
```
- A shard is created only when a new shard is needed.
- All homeserver tenants are distributed across existing shards. Homeserver tenants are added to existing shards until shards are full according to `maxTenantsPerShard`.
- When shards are full, a new shard is created to host the homeserver tenant if `maxShardsPerSynapseImageTag` has not been reached. If the maximum number of shards has been reached, the controller will retry until a spot becomes available in the shards. We recommend setting an alert for when the shards are full; see Watching tenants metrics for how to get your shards controller metrics.
- When the average number of tenants per shard goes below `avgShardsLoadRedistributionThreshold`, tenants from the shards with the lowest load are drained to reduce the number of running shards.
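As a worked example of the consolidation behaviour, using the default values above:

```yaml
maxTenantsPerShard: 25
avgShardsLoadRedistributionThreshold: 15
# With 60 tenants, the controller fills shards up to 25 tenants each,
# so 3 shards run (25 + 25 + 10) with an average load of 20.
# If 20 tenants are later deleted, the average drops to 40 / 3 ≈ 13.3,
# below the threshold of 15, so the controller drains the least-loaded
# shard and consolidates down to 2 shards of 20 tenants each.
```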
Setting up homeserver tenants with ESS Pro
Managing Synapse homeserver tenants
To deploy Synapse as a tenant in a Synapse Pro Shards cluster, you need to provide the following values:
```yaml
synapse:
  asTenantHook:
    enabled: true
    clusterId: synapse-shards # the name of the release of the synapse-shards chart used above
    namespace: <shards namespace> # the synapse-shards cluster namespace
```
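A hypothetical deployment sketch, assuming each tenant is a separate release of the ESS Pro chart; the chart location, release name, and namespace below are placeholders, not confirmed values:

```shell
# Hypothetical example: <ess-pro-chart>, "my-tenant", and "tenants"
# are placeholders. The asTenantHook values above are assumed to be
# in tenant-values.yaml.
helm install my-tenant <ess-pro-chart> \
  -n tenants --create-namespace \
  -f tenant-values.yaml
```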
Migrating a small host tenant to a dedicated Synapse deployment
If a homeserver has outgrown the "small host" category and needs to be scaled to a
dedicated process with workers, the migration is straightforward. Simply set the
`synapse.asTenantHook.enabled` value to false so that the homeserver is deployed and
managed by the normal matrix-stack chart.
Note: Multi-tenancy is still in active development and is released as a preview feature. For now, when doing so, you will have to delete the generated secret named `<release name>-synapse-tenant` to disable the configured homeserver tenant.
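The cleanup step above can be done with kubectl; the release name and namespace are placeholders for your deployment:

```shell
# Delete the generated tenant secret after disabling the hook.
# <release name> and <namespace> are placeholders.
kubectl delete secret "<release name>-synapse-tenant" -n <namespace>
```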
Advanced
Watching tenants metrics
The chart automatically deploys the required ServiceMonitor if the CRDs are installed
in the cluster.
The metrics prefix is synapse_shards_controller_.
Please reach out to Element to get the Grafana dashboard.
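As a sketch of the full-shards alert recommended earlier, a PrometheusRule could look like the following. The metric names are hypothetical (only the `synapse_shards_controller_` prefix is documented here); substitute the actual metrics exposed by your controller and your configured limits:

```yaml
apiVersion: monitoring.coreos.com/v1
kind: PrometheusRule
metadata:
  name: synapse-shards-alerts
spec:
  groups:
    - name: synapse-shards
      rules:
        - alert: SynapseShardsFull
          # Hypothetical metric names: only the synapse_shards_controller_
          # prefix is documented. Fires when the maximum number of shards
          # is running and all shards are at maxTenantsPerShard capacity
          # (using the example values 5 and 25 from above).
          expr: >
            synapse_shards_controller_shards_total >= 5
            and synapse_shards_controller_tenants_total >= 5 * 25
          for: 15m
          labels:
            severity: warning
```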
Backups
Shards are managed by the Synapse Pro Shards controller. You should back up the Helm chart values used to deploy the Synapse Pro Shards controller in Git, following GitOps best practices. Using the Helm chart also creates ConfigMaps and Secrets that you should back up.
Homeserver tenants are regular Synapse servers and should be backed up using the standard backup procedures:
- When deploying tenants with ESS Pro, our standard backup recommendations apply.
- When using your own deployment system, you should follow the official Synapse documentation for backups.
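If the deployed values are not already tracked in Git, `helm get values` can recover them as a starting point; the release name matches the install command above and the namespace is a placeholder:

```shell
# Export the values currently deployed for the synapse-shards release
# so they can be committed to Git.
helm get values synapse-shards -n <namespace> > synapse-shards-values.yaml
```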
Media storage
Because of tenant mobility, media cannot be stored on local storage. Instead, we package multi_synapse with the s3_storage_provider module. This module should be used by each homeserver tenant to store its media in a distinct bucket.
See the S3 storage section in the Synapse docs for more information about configuring S3 storage.
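A homeserver configuration sketch for the module, based on the s3_storage_provider README; the bucket, region, and credentials below are placeholders, with one distinct bucket per tenant as described above:

```yaml
media_storage_providers:
  - module: s3_storage_provider.S3StorageProviderBackend
    store_local: true
    store_remote: true
    store_synchronous: true
    config:
      # One distinct bucket per homeserver tenant (placeholder name).
      bucket: <tenant-media-bucket>
      region_name: <region>
      # Credentials may also be supplied via the environment or an
      # instance profile instead of being set here.
      access_key_id: <access key>
      secret_access_key: <secret key>
```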