Staging TRIRIGA


We hit a certain problem in TRIRIGA: fetching and passing on data from our system to any external system, as fast as possible. The problem is, TRIRIGA is slow. Getting data for, let's say, some 500-600 buildings from a fairly busy system would take as long as 30 seconds. Best case scenario, if the cache was all warmed up, maybe 5-6 seconds. But that isn't fast enough, is it? At least not to our customers.

Now, we did work around this with a lot of query optimisation and pre-filtering. We brought the response time down to something that was barely acceptable, but acceptable nevertheless.

However, there was one solution for the rate at which these fetch APIs worked that was discarded: a staging table that would hold the data from TRIRIGA, which the external system would look up over a REST API instead of reaching into TRIRIGA. The staging table would be updated regularly with data from TRIRIGA, and it could live with the poor rate of receiving 100 records a second. But like I mentioned earlier, this solution was discarded for a couple of reasons.

  1. This involved a tech stack that the customers weren't ready to expand to (at that time)
  2. The solution did not have a reliable use-case for real time applications - getting reservations, for example

And both the reasons make complete sense.

That didn't mean I couldn't try it though! I like databases and I really needed a reason to learn how MongoDB worked, so I started building the whole thing locally.

I took help from YouTube.

MongoDB

Unfortunately, I did not need a lot of learning to set this up. I didn't even need to set up a collection or a schema, because all of it was going to be done from the backend of the web server, which would be built using Node.js.

I just had to download it, set the environment variables, and make sure mongod.exe was running.

Node.js

This was actually the mountain I had not prepared to climb. Honestly, I don't have a lot of experience in JavaScript. Everything was trial and error.

npm init first to initialise a project. This creates a "package.json" file with the defaults.

Then I installed my dependencies using npm i express mongoose.

Express acts as the web framework and Mongoose is the connector between MongoDB and the web server. Next is installing the development dependencies using npm i --save-dev dotenv nodemon, so they do not get installed in a production environment. nodemon restarts the server automatically whenever code changes are made.

In the package.json, change the "test" name and value in the "scripts" object to "devStart": "nodemon server.js". Create server.js, .env, and .gitignore files in the same folder.
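For reference, and only as a sketch (the rest of package.json keeps the npm init defaults), the scripts block ends up looking like this, so npm run devStart starts the dev server:

```json
"scripts": {
  "devStart": "nodemon server.js"
}
```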

In server.js, we require express and mongoose, and also connect to MongoDB. In my case, I connected to the localhost URL, which was something like mongodb://0.0.0.0/people. It says people because I wanted to create a people collection. One important issue I faced: when I set the localhost URL to 127.0.0.1 or just localhost, it did not work; I had to set it to 0.0.0.0. It is better to set this URL as an environment variable and define it in the .env file.
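Here's a minimal sketch of how the top of server.js can look. I'm assuming the connection string sits in a DATABASE_URL variable in .env; that variable name is just my choice, not anything mandated by Mongoose:

```javascript
// server.js
require('dotenv').config()                 // load variables from .env

const express = require('express')
const mongoose = require('mongoose')

const app = express()

// .env contains something like: DATABASE_URL=mongodb://0.0.0.0/people
mongoose.connect(process.env.DATABASE_URL)

const db = mongoose.connection
db.on('error', (error) => console.error(error))
db.once('open', () => console.log('Connected to MongoDB'))

app.use(express.json())                    // parse JSON request bodies
```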

Now define a router variable, preferably named after the collection or the type of data that has to be fetched - const peopleRouter = require('./routes/people') in my case. Then mount the router on a path, i.e., app.use('/people', peopleRouter); this means when I hit the API at localhost/people, the request is routed to that router.
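Continuing the same sketch of server.js, the router gets pulled in and mounted, and the server starts listening (port 3000 is just what I used locally):

```javascript
// server.js (continued)
const peopleRouter = require('./routes/people')
app.use('/people', peopleRouter)           // e.g. GET http://localhost:3000/people

app.listen(3000, () => console.log('Server started on port 3000'))
```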

Create a "people.js" under a "routes" folder and in the people.js, add module.exports = router. This should fix all the errors that appear at this point.

The people.js file under routes is where we define all the operations for the REST API. This is pretty straightforward.
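As a sketch of what those operations can look like, here are a get-all and a create route, added between the router creation and the export. The People model comes from the next step, and the name/building fields are hypothetical placeholders for whatever you actually pull out of TRIRIGA:

```javascript
// routes/people.js (between the router creation and module.exports)
const People = require('../models/people')

// Get all people
router.get('/', async (req, res) => {
  try {
    const people = await People.find()
    res.json(people)
  } catch (err) {
    res.status(500).json({ message: err.message })
  }
})

// Create one person
router.post('/', async (req, res) => {
  const person = new People({
    name: req.body.name,           // hypothetical fields - match your own schema
    building: req.body.building
  })
  try {
    const newPerson = await person.save()
    res.status(201).json(newPerson)
  } catch (err) {
    res.status(400).json({ message: err.message })
  }
})
```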

Before the routes are complete and tested, the schema for the Mongo collection has to be defined. Create another people.js, this time under a "models" folder. Create a new schema here; this should also be fairly easy. And, at last, export it: module.exports = mongoose.model('People', peopleSchema). The parameters are the model name (which Mongoose maps to the collection on MongoDB) and the schema variable defined earlier in the same file.
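A minimal sketch of models/people.js, again with placeholder fields:

```javascript
// models/people.js
const mongoose = require('mongoose')

const peopleSchema = new mongoose.Schema({
  name: {
    type: String,
    required: true
  },
  building: {
    type: String
  },
  createdDate: {
    type: Date,
    default: Date.now
  }
})

module.exports = mongoose.model('People', peopleSchema)
```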

I think that's all the server needs. Once it's up and running, you should be able to feed data into the API and view it in MongoDB. All the error messages per the schema can be added to the routes/people.js file.
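To sanity-check it before wiring up Node-RED, you can POST a record from any client. Here's a throwaway sketch using Node's built-in fetch (Node 18+), assuming the server is on localhost:3000 and the same placeholder fields as above:

```javascript
// post-test.js - quick manual check against the running server
(async () => {
  const res = await fetch('http://localhost:3000/people', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name: 'Test Person', building: 'HQ' })
  })
  console.log(res.status, await res.json())
})()
```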

We now have two entities that are not connected: the web server with MongoDB, and TRIRIGA.

We connect them using the final piece, which is Node-RED.

Node-RED

Although I quickly understood how difficult the backend stuff was going to be, the Node-RED piece legit took me the longest. I had to build a flow that would invoke the test version of the existing APIs we built on TRIRIGA for the customer and then POST that data to MongoDB using the web server we had just built.

Sounds simple, but building the flow so that the JSON response from TRIRIGA was structured in a way the REST APIs we had just built would accept was a mammoth of a task. Here's how simple the flow looks on the outside:

[Screenshot: the Node-RED flow]
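Most of that structuring happened in function nodes sitting between the HTTP request node that calls TRIRIGA and the one that POSTs to the web server. As an illustrative sketch only (the TRIRIGA field names below are made up, not the real response shape), a function node that reshapes one record looks something like this:

```javascript
// Node-RED function node: map a TRIRIGA record to the web server's schema
// NOTE: triPeopleName / triBuildingTX are placeholder field names, not the real API response
const record = msg.payload;

msg.payload = {
  name: record.triPeopleName,
  building: record.triBuildingTX
};
msg.headers = { 'Content-Type': 'application/json' };

return msg;
```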

Once that was done and I "injected" it - bam! Thousands of records were created in MongoDB. It was an amazing sight the first time it happened. After I tested for a while, it didn't seem to fail.


Now, to address the clients' concerns -

This involved a tech stack that the customers weren't ready to expand to (at that time)

This would still be a concern. But if I/we prove that the performance improvement is worth the investment into a new tech stack, I guess the clients would agree? 🤷

The solution did not have a reliable use-case for real time applications - getting reservations, for example

If the performance of this entire flow is faster than the existing real-time APIs, then it would make sense to implement this on a larger scale. But the problem is proving this with a proof of concept, and unfortunately, development bandwidth would be a problem. I'm sure I can upskill enough to handle this as a day-to-day activity, but only after it's agreed to, of course.

It really was an interesting journey though; hopefully I get to work on it officially.

My explanation might need a ton of work, so here's the GitHub repo and the Node-RED flow for reference.

Reach out to me at hey@abishekvenkat.com for any queries!