How I deployed N8N from scratch

Rin Nguyen
6 min read · Oct 10, 2024


a sample workflow

For various reasons, my team decided to self-host N8N instead of using N8N Cloud. I started out on an AWS EC2 t3.medium instance with no prior server experience and learned a lot along the way. This article covers how I did the setup; if you have any feedback, please do not hesitate to comment.

I — The Setup Loop

1. Use Docker Compose to deploy

Starting out, I used Command Prompt to interact with the server, but I then found it much more convenient to connect via Visual Studio (VS) Code. I just needed to add the host info (host name, identity file path, user) to the config file in the .ssh folder. With the convenience of VS Code, I also added Git to my deployment folder to back up my config files.
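For reference, the SSH config entry looks roughly like this; the host alias, address, user, and key path are placeholders, not my actual values:

```shell
# Append a host entry so VS Code's Remote-SSH (and plain ssh) can connect
# by alias. All values below are placeholders -- substitute your own.
mkdir -p ~/.ssh
cat >> ~/.ssh/config <<'EOF'
Host n8n-server
    HostName ec2-xx-xx-xx-xx.compute.amazonaws.com
    User ubuntu
    IdentityFile ~/.ssh/n8n-key.pem
EOF
```

After this, `ssh n8n-server` (or picking the host in VS Code) connects without retyping the details.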

Initially I was using a plain Dockerfile, but when I found out I also needed a database container, I switched to Docker Compose as it is much easier to work with. The two containers connect via a network, which I defined explicitly, though the default network Compose creates would also work.

I allocated port 5432 to Postgres, 5678 to N8N, and 80 & 443 to Nginx. To whitelist access to the server and to the app in the browser, I used a security group on the EC2 instance.
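Putting the pieces together, a docker-compose.yml for this kind of setup could look roughly like the following sketch. The image tags, credentials, and service names are illustrative assumptions, not my exact files:

```shell
# Write a minimal two-service Compose file: Postgres for storage, N8N for
# the app, joined by an explicit network. Credentials are placeholders.
cat > docker-compose.yml <<'EOF'
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: n8n
      POSTGRES_PASSWORD: change-me
      POSTGRES_DB: n8n
    volumes:
      - db_storage:/var/lib/postgresql/data
    networks:
      - n8n-net
  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"
    environment:
      DB_TYPE: postgresdb
      DB_POSTGRESDB_HOST: postgres
      DB_POSTGRESDB_USER: n8n
      DB_POSTGRESDB_PASSWORD: change-me
    depends_on:
      - postgres
    networks:
      - n8n-net
volumes:
  db_storage:
networks:
  n8n-net:
EOF
```

With this in place, `docker compose up -d` brings both containers up, and N8N reaches the database by the service name `postgres` on the shared network.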

2. Use a subdomain to run the app over HTTPS

When I deployed the N8N app on localhost, everything ran smoothly. But on the server, after creating the admin account, I saw the error ‘Init Problem: There was a problem loading init data: Unauthorized. Please advise’, and I was unable to use the app.

Later, I stumbled on a post saying I should use a domain/subdomain since I was connecting via HTTPS. If I had continued with HTTP, I could have used the solution in that post, but I wanted secure connections, so I chose a subdomain. The subdomain didn’t cost extra since the domain incurs costs anyway, so our team was fine with it.

I needed an SSL certificate for the app, as it ensures safe data transfer. I mounted the SSL key files into the N8N container so that it could access them. A certificate is like an ID, so it must be renewed from time to time; I got mine for free using Certbot with Let’s Encrypt and needed to renew it every 90 days.
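Obtaining the initial certificate is a one-liner with Certbot; standalone mode needs port 80 free, and the subdomain below is a placeholder:

```shell
# Request a certificate for the subdomain (placeholder name). The guard
# just skips the call on machines where certbot is not installed.
if command -v certbot >/dev/null 2>&1; then
  sudo certbot certonly --standalone -d n8n.example.com
  # The key files then live under /etc/letsencrypt/live/n8n.example.com/
fi
```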

Automating the certificate renewal is pretty simple. I just had to make sure all the necessary commands ran alongside the renewal, using the pre and post hooks (in .sh files). In my case, the tasks were stopping and restarting Nginx, and spinning the Docker containers down and back up. I made sure Docker could always access the newly renewed certificate; otherwise the browser would warn that the connection is NOT secure!
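The hooks I describe can be sketched as two small scripts; Certbot picks them up automatically if you place them under /etc/letsencrypt/renewal-hooks/pre and /post. The Compose project path is a placeholder:

```shell
# Pre-hook: free ports 80/443 so the renewal challenge can bind them.
cat > pre-hook.sh <<'EOF'
#!/bin/sh
systemctl stop nginx
EOF

# Post-hook: bring Nginx back and recreate the containers so they pick
# up the renewed certificate. /opt/n8n is a placeholder project path.
cat > post-hook.sh <<'EOF'
#!/bin/sh
systemctl start nginx
cd /opt/n8n && docker compose down && docker compose up -d
EOF

chmod +x pre-hook.sh post-hook.sh
```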

3. Use Nginx to improve performance and prevent search engine indexing

This happened while the app was still being tested and optimized. One fine day I checked the N8N container log and saw bots crawling my site. Since the app was internal and did not need external attention, I wanted to block that, and found Nginx.

Reading further, I learned that Nginx also handles reverse proxying, load balancing, and caching, and its HTTPS server capabilities help maximize site performance and stability. So it was well worth adding.
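A server block covering both jobs, reverse-proxying to N8N and discouraging crawlers, could look roughly like this; the domain and certificate paths are placeholders:

```shell
# Write an Nginx site config: terminate TLS, proxy to N8N on 5678, and
# tell search engines not to index. Names/paths are placeholders.
cat > n8n.conf <<'EOF'
server {
    listen 443 ssl;
    server_name n8n.example.com;
    ssl_certificate     /etc/letsencrypt/live/n8n.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/n8n.example.com/privkey.pem;

    # Discourage search-engine indexing
    add_header X-Robots-Tag "noindex, nofollow" always;
    location = /robots.txt {
        return 200 "User-agent: *\nDisallow: /\n";
    }

    location / {
        proxy_pass http://localhost:5678;
        proxy_set_header Host $host;
        # N8N's editor uses websockets
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
EOF
```

Note that well-behaved crawlers respect robots.txt and the X-Robots-Tag header, but abusive bots need harder measures (IP allow-lists or auth), which the EC2 security group already provides here.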

4. Database setup and access

As per the N8N guide, I added a database (Postgres) container for N8N app data storage. It is very easy to inspect the database with pgAdmin; the connection details are the server address (the instance URL), port 5432, and the credentials you set up.

For persistent data storage, I made sure to mount the volume (under the postgres service’s volumes) to the right path: db_storage:/var/lib/postgresql/data. If this is not done right, the data ends up in the wrong place and can be lost when the containers go down.

5. Backup and restoration

Backup is one important thing you should not skip. At first, I intended to do it only on the server, backing up daily (via a cron job) and overwriting the backup file on each run. But I worried that if disaster struck and I could not check within a day, valuable data could be lost.

the script to backup data

Meanwhile, storing one file per day would take up a lot of space on the server, and I would have to delete old files occasionally. So I chose the easy way: S3! Taking advantage of its versioning feature, I could store the backup at minimal cost and always roll back to historical versions.
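A backup script along these lines, dumping only the important tables and overwriting a single S3 object so bucket versioning keeps the history, could look like this sketch. The table list, bucket name, and credentials are assumptions:

```shell
# Write the nightly backup script: pg_dump selected tables from the
# postgres container, then push to a versioned S3 bucket (placeholders).
cat > backup.sh <<'EOF'
#!/bin/sh
set -e
docker compose exec -T postgres pg_dump -U n8n -d n8n \
  -t workflow_entity -t credentials_entity -t workflow_tags \
  > /tmp/n8n_backup.sql
# Same key every run: S3 versioning keeps the older copies around.
aws s3 cp /tmp/n8n_backup.sql s3://my-n8n-backups/n8n_backup.sql
EOF
chmod +x backup.sh
# Schedule daily via cron, e.g.: 0 2 * * * /opt/n8n/backup.sh
```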

One last, equally important thing is restoring the database from the backup file. I found a command to do the restoration, but the data was not showing up in the UI. It turned out there is one important table named workflow_tags, which determines whether existing workflows are shown in the UI.

I only backed up the important tables, since tables like executions need not be backed up. If you cannot run a script to restore workflows, you can copy the flow details into a JSON file and upload it via the UI.
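The restoration side can be sketched the same way: pull the dump from S3 and feed it to psql inside the container. Bucket, user, and database names are placeholders:

```shell
# Write a restore script that loads the latest backup object into the
# running postgres container. All names are placeholders.
cat > restore.sh <<'EOF'
#!/bin/sh
set -e
aws s3 cp s3://my-n8n-backups/n8n_backup.sql /tmp/n8n_backup.sql
docker compose exec -T postgres psql -U n8n -d n8n < /tmp/n8n_backup.sql
EOF
chmod +x restore.sh
```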

how to import workflow from file

II — Some N8N Nodes

1. Snowflake

The Snowflake node in N8N does not support SSO, so I had to use a service account. To secure the account, I did some whitelisting (via a network policy). The good news is that you only need to whitelist the server (instance) IP, since that is where the queries to Snowflake are sent from.
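The network policy itself is two Snowflake SQL statements; the policy name, service user, and IP below are placeholders, to be run in a Snowflake worksheet by an admin:

```shell
# Save the whitelisting statements to run against Snowflake later.
# Policy/user names and the IP are placeholders.
cat > network_policy.sql <<'EOF'
CREATE NETWORK POLICY n8n_policy ALLOWED_IP_LIST = ('203.0.113.10');
ALTER USER n8n_service SET NETWORK_POLICY = n8n_policy;
EOF
```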

configuration to create Snowflake connection

2. Teams

We wanted N8N to send messages to a Teams chat/channel, but that required creating an Azure app. When I finally managed to create the app, it turned out to be unusable because N8N requires a lot of access to the Teams account. For security, I had to abandon that approach and ended up using Power Automate as an intermediary.

In N8N I created an HTTP Request node that makes a POST request to the Power Automate flow, which then creates the message in Teams.
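From the command line, the equivalent of that HTTP Request node is a POST of a JSON body to the flow’s trigger URL. The payload fields depend on the trigger schema you define in Power Automate, so everything below is a placeholder:

```shell
# Hypothetical payload matching a "When an HTTP request is received"
# trigger; field names are whatever your flow's schema defines.
cat > teams_payload.json <<'EOF'
{
  "title": "n8n alert",
  "message": "Workflow finished successfully"
}
EOF

# Only fire the request if a trigger URL has been provided.
if [ -n "${FLOW_TRIGGER_URL:-}" ]; then
  curl -X POST "$FLOW_TRIGGER_URL" \
    -H 'Content-Type: application/json' \
    -d @teams_payload.json
fi
```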

the flow in PowerAutomate to create Teams message

3. Freshdesk

The Freshdesk node works fine, but it does not support composing emails to customers. That is fine, because I believe the email formatting simply cannot be done beautifully there. So we used it to create tickets, and downstream in Freshdesk we created Automations that trigger emails based on the ticket.

Another problem is that the Freshdesk node cannot show custom ticket fields, so I could not populate them. My workaround was to use an HTTP Request node and call the Freshdesk endpoint to create tickets. Finding the custom fields’ real IDs was a bit tricky, so you should first create a simple ticket with no required fields; then check the payload, and you should find the custom fields whose names start with cf_.

And if your input data does not match the predefined values in a Freshdesk field, you can add a Code node before the HTTP Request node to do the mapping.
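For illustration, the ticket-creation call the HTTP Request node makes maps to the Freshdesk v2 tickets endpoint; the domain, API key, and custom field name below are placeholders:

```shell
# Hypothetical ticket payload; cf_order_id stands in for whatever cf_
# fields your helpdesk actually defines. status 2 = Open, priority 1 = Low.
cat > ticket.json <<'EOF'
{
  "subject": "Order issue",
  "description": "Created from n8n",
  "email": "customer@example.com",
  "status": 2,
  "priority": 1,
  "custom_fields": { "cf_order_id": "12345" }
}
EOF

# Only call the API when a key is configured (domain is a placeholder).
if [ -n "${FRESHDESK_API_KEY:-}" ]; then
  curl -u "$FRESHDESK_API_KEY:X" -X POST \
    'https://yourcompany.freshdesk.com/api/v2/tickets' \
    -H 'Content-Type: application/json' \
    -d @ticket.json
fi
```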

a sample workflow to create ticket in Freshdesk with optional custom fields

Written by Rin Nguyen

A Data Engineer sharing her learning stories ... You can find me at https://www.linkedin.com/in/rin-nguyen-34761415b/.
