How To Improve Your API, Part 1
Database Indexing (written in Go)

Hafizhabdurrachman
6 min read · Mar 31, 2021

Are you working at a company as a back-end engineer, freelancing, or a student with a client project? If so, this article may be helpful for you.

This article explains one thing we can improve in our API. Please note that the API referred to in this article is an API that connects directly to a database (DB) to fetch data. If your API only processes input without touching a DB, this article will not be directly relevant for you, but hopefully it can still broaden your knowledge 😃.

In this part, we will try to improve our API by indexing the database. This will improve execution time whenever we read from the DB.

Let me give you a brief analogy. Imagine you are asked to find a word in two kinds of dictionaries: dictionary A, where the words are not sorted alphabetically (they are mixed up at random), and dictionary B, which is like a normal dictionary (sorted alphabetically). Which one takes more of your time? I guess most of you will answer dictionary A, because you have to read the words one by one until you find the match.

A SELECT query on a table works the same way. If you build a dictionary app and fetch a definition by word with a query like this

select * from table_dictionary where word = 'word1'

the database searches the table row by row until it finds a match for word1 in the word column. It takes little time if there is only a small amount of data, or if word1 happens to sit in the first row, but what if your app has already stored more than a hundred thousand words and word1 is not in the first row, which is very likely? Then this row-by-row search takes much longer.

An index on a table is like the alphabetical shortcut in dictionary B. An index is a data structure that contains a set of keys and references to the actual rows in the table, so the database can jump to the matching rows instead of scanning every row from the beginning (a full scan).
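
If you want to see this for yourself in PostgreSQL, EXPLAIN shows which plan the planner chooses. A rough sketch, reusing the dictionary example above (the index name here is only for illustration): without an index on word you will typically get a Seq Scan, and with one an Index Scan.

EXPLAIN SELECT * FROM table_dictionary WHERE word = 'word1';
-- typically a Seq Scan: the whole table is read row by row

CREATE INDEX table_dictionary_word_index ON table_dictionary (word);

EXPLAIN SELECT * FROM table_dictionary WHERE word = 'word1';
-- now typically an Index Scan (or Bitmap Index Scan) using the new index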

Got it? Good 👍, so let's continue to a real case. We will compare how different the execution time is with and without indexing. Before we go to the next section, I assume you already have Go and PostgreSQL installed locally; if you haven't installed them yet, see how to install golang and how to install postgresql.

Preparation

OK, let's continue. We will build a simple app that returns a user's detail data, combining the user profile, user family, and user transportation for a given user id request parameter.

First, we have to create three tables: user profile, user family, and user transportation.

CREATE TABLE public.user_profile (
    id BIGSERIAL PRIMARY KEY,
    name TEXT DEFAULT '',
    address TEXT DEFAULT '',
    gender VARCHAR(16) DEFAULT ''
);
CREATE TABLE public.user_family (
    id BIGSERIAL PRIMARY KEY,
    user_id BIGINT NOT NULL,
    name TEXT DEFAULT '',
    relation TEXT DEFAULT ''
);
CREATE TABLE public.user_transportation (
    id BIGSERIAL PRIMARY KEY,
    user_id BIGINT NOT NULL,
    name TEXT DEFAULT '',
    type TEXT DEFAULT '',
    colour TEXT DEFAULT ''
);

You can change the fields however you like, but in this example I will use those fields. I have already created 2288 users, where every user has 2 to 5 family members and 1 to 5 transportations.
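
The dataset itself is in the repository, but if you just want something similar to play with locally, a generate_series insert is one quick way to fake it (a sketch; the values below are made up and only cover the user_profile table):

-- Generate 2288 placeholder users; the real dataset in the repository differs.
INSERT INTO public.user_profile (name, address, gender)
SELECT 'user ' || i, 'address ' || i, 'unknown'
FROM generate_series(1, 2288) AS i;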

After that, let's continue to the code. I will separate the code into 3 files:

1. Entity

This file contains all of the structs that we need for this API; a sketch of it is shown below.
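
The original snippet here is an embedded gist; as a stand-in, this is a minimal sketch of what the entity file could look like, with struct and field names derived from the tables above (the exact names and json tags in the repository may differ):

package main

// UserProfile maps a row of the user_profile table.
type UserProfile struct {
	ID      int64  `json:"id"`
	Name    string `json:"name"`
	Address string `json:"address"`
	Gender  string `json:"gender"`
}

// UserFamily maps a row of the user_family table.
type UserFamily struct {
	ID       int64  `json:"id"`
	UserID   int64  `json:"user_id"`
	Name     string `json:"name"`
	Relation string `json:"relation"`
}

// UserTransportation maps a row of the user_transportation table.
type UserTransportation struct {
	ID     int64  `json:"id"`
	UserID int64  `json:"user_id"`
	Name   string `json:"name"`
	Type   string `json:"type"`
	Colour string `json:"colour"`
}

// UserDetail is the combined response returned by the API.
type UserDetail struct {
	Profile        UserProfile          `json:"profile"`
	Family         []UserFamily         `json:"family"`
	Transportation []UserTransportation `json:"transportation"`
}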

2. Database

In this file, we create all the functions related to getting data from the DB. Each function only needs 2 parameters: the first is the user id, which we get from the request parameter, and the second is the DB connection. After we get the data from the DB, we map it into the structs created in the previous file and return it. In the snippet I only copy the GetUserProfile function (a sketch is shown below); you can write the equivalents for user family and user transportation in the same way, or you can see all the code on my github (the github link will be added at the end of the article).
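
Again, the real snippet is embedded from github; below is a rough sketch of how GetUserProfile could be written with database/sql, matching the two-parameter shape described above (the exact signature and error handling are my assumption, not the repository code):

package main

import "database/sql"

// GetUserProfile fetches a single row from user_profile by id and
// scans it into the UserProfile struct defined in the entity file.
func GetUserProfile(userID int64, db *sql.DB) (UserProfile, error) {
	var profile UserProfile

	row := db.QueryRow(
		`SELECT id, name, address, gender FROM user_profile WHERE id = $1`,
		userID,
	)
	if err := row.Scan(&profile.ID, &profile.Name, &profile.Address, &profile.Gender); err != nil {
		return profile, err
	}
	return profile, nil
}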

3. Main

This is the heart of the app. This file includes:

  • A const for the DB connection, filled with the connection string
  • A handler function that fetches the data from the DB
  • A time tracker for logging the execution time of every function

In the snippet, the GetUserDetail function just calls GetUserProfile (a sketch is shown below); you can write the equivalents for user family and user transportation in the same way, or you can see all the code on my github (the github link will be added at the end of the article).
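
As above, the real main file is in the repository; this is a simplified sketch of its shape: a connection-string const, a handler on /user/{user_id} that (like the snippet) only calls GetUserProfile, and a small time tracker. The connection string, the lib/pq driver, and the helper names are placeholders and assumptions:

package main

import (
	"database/sql"
	"encoding/json"
	"log"
	"net/http"
	"strconv"
	"strings"
	"time"

	_ "github.com/lib/pq" // PostgreSQL driver; the repository may use a different one
)

// connString is a placeholder; adjust it to your local PostgreSQL setup.
const connString = "postgres://user:password@127.0.0.1:5432/yourdb?sslmode=disable"

// trackTime logs how long a function took; call it with defer at the top
// of the function you want to measure.
func trackTime(start time.Time, name string) {
	log.Printf("%s took %s", name, time.Since(start))
}

func main() {
	db, err := sql.Open("postgres", connString)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	http.HandleFunc("/user/", func(w http.ResponseWriter, r *http.Request) {
		defer trackTime(time.Now(), "GetUserDetail handler")

		// Parse {user_id} from the /user/{user_id} path.
		idStr := strings.TrimPrefix(r.URL.Path, "/user/")
		userID, err := strconv.ParseInt(idStr, 10, 64)
		if err != nil {
			http.Error(w, "invalid user id", http.StatusBadRequest)
			return
		}

		profile, err := GetUserProfile(userID, db)
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}

		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(profile)
	})

	log.Println("listening on :8080")
	log.Fatal(http.ListenAndServe(":8080", nil))
}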

The last step is to prepare the indexing queries.

CREATE INDEX CONCURRENTLY user_profile_id_index
ON user_profile (id);
CREATE INDEX CONCURRENTLY user_family_user_id_index
ON user_family (user_id);
CREATE INDEX CONCURRENTLY user_transportation_user_id_index
ON user_transportation (user_id);

For indexing, I use CREATE INDEX CONCURRENTLY instead of a standard CREATE INDEX because it does not lock out write operations on the table; for details, you can read the postgresql create index concurrently docs. But it is optional and depends on your use case.

Note: to compare execution time with and without indexes, you can add the indexes first and then drop them, or vice versa.
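
For example, to go back to the "no index" case after you have created the indexes above, you can drop them again:

DROP INDEX CONCURRENTLY user_profile_id_index;
DROP INDEX CONCURRENTLY user_family_user_id_index;
DROP INDEX CONCURRENTLY user_transportation_user_id_index;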

Testing

For testing, just run the main file with go run main.go in a terminal. After the app has started, open this link in your browser:

http://127.0.0.1:8080/user/{user_id}

{user_id} can be any number from 1 to 2288 (because we only created 2288 users), and the endpoint will return the user's detail data.

You can also do this with an API testing tool such as Postman or Insomnia, or just curl from your terminal (if you have curl installed).
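
For example, with curl (42 is just an arbitrary user id):

curl http://127.0.0.1:8080/user/42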

Note: before testing, I had already completed the code to cover user family and user transportation as well.

Comparison Result

1. Execution time before adding indexes

[screenshot: print log before using the index]

2. Execution time after adding indexes

[screenshot: print log after using the index]

Summary

From the comparison above, on average, we can conclude that indexing improves the API's execution time by reducing the time spent reading from the database, but it depends on how much data you have. If you only have a hundred rows, you may not see a significant difference, but with more than a hundred thousand or a million rows, the difference becomes significant.

Eventually, it depends on your use case. If your project only has a hundred rows, you don't need indexing, because an index adds memory overhead without significantly improving your API. But as your project grows, database indexing will be a great choice to improve your services and get better execution times.

You can see and clone all of the code and data on my github, test it yourself, and edit it however you want. See you in the next part. Thank you! 😄
