Byron Ang
Web Developer
Apar Technologies
About Me
I am a self-taught developer passionate about technology and coding.
I have built up my technical knowledge through self-study using material from various online sources like MIT OpenCourseWare, Udemy, YouTube and official documentation.
If I do not have experience in the language, framework, or architecture you need, I know how to quickly learn it on my own.
I also know how to leverage Large Language Models (LLMs) and AI coding assistants like ChatGPT and Copilot to increase productivity and deliver tickets faster.
• Elements is built on .NET MVC
• It has an MSSQL database connected using Entity Framework
• It uses S3 as file storage (user uploads, etc)
• It uses KendoUI and jQuery with its cshtml pages
• It is run on IIS on AWS EC2
• If enhancements are large, we use micro-frontends
• Example 1: widget components. If we need a new component with complex logic, we create a React project that compiles to a JS file embedded in the cshtml page
• Example 2: microsites for entirely new pages. We embed the React bundle's JS file in its own cshtml page. If the page requires many APIs or its own DB, we create microservices on EKS
• Portal is built on .NET MVC
• It has a Dynamics 365 CRM database
• It uses Sharepoint for storage
• It uses KendoUI and jQuery with its cshtml pages
• It is run on IIS on AWS EC2
• It also uses React for micro-frontends as needed
• There are plans to migrate away from the CRM database to reduce costs (S$120k annually)
• GMS 1.0 uses Dynamics 365 CRM for its UI and DB
• It uses Sharepoint for storage
• It runs on AWS EC2
• Dynamics 365 CRM is an all-in-one low/no-code solution
• Views, Forms, Workflows and Entities are created and accessed using the CRM UI
• Functionality has been extended using C# plugins
• Additional resources such as HTML pages, HTML components and JavaScript files are also imported
• HTML pages: extra pages or components to link to or embed within CRM pages
• JavaScript files: these provide additional interactivity, validation, etc
• Work is in progress to migrate GMS away from Microsoft Dynamics, saving S$650k annually in licensing. Everything is done in-house, saving an estimated S$477k in vendor fees
• Workflows are recreated with microsites (React) and microservices (.NET Core) on EKS
Formbuilder is an in-house package we maintain, built on React and React Hook Form.
We use it to build UIs; the final product is compiled to JS for embedding in other pages.
It takes in an array of configuration objects that look like this:
[
  {
    type: "CustomNavBar",
    name: "mainNavBar",
    customCallbackHandler: () => {
      /* Custom code here */
    }
  }
]
Each object represents a React component.
Formbuilder loops through the array and renders each object in order: the "type" property determines which component to render, and the remaining properties are passed as props to that component.
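The dispatch loop described above can be sketched as follows. The registry, component names and render signatures here are illustrative assumptions, not Formbuilder's actual API; real Formbuilder renders React components, while plain strings keep this sketch self-contained:

```typescript
// Hypothetical sketch of a Formbuilder-style dispatch loop.
type FieldConfig = { type: string; [prop: string]: unknown };

// Registry mapping a "type" string to a render function (names are made up).
const registry: Record<string, (props: Record<string, unknown>) => string> = {
  CustomNavBar: (props) => `<nav id="${props.name}"></nav>`,
  TextInput: (props) => `<input name="${props.name}" />`,
};

// Loop over the config array in order; "type" selects the component and the
// remaining properties are forwarded as props.
function renderForm(config: FieldConfig[]): string[] {
  return config.map(({ type, ...props }) => {
    const render = registry[type];
    if (!render) throw new Error(`Unknown component type: ${type}`);
    return render(props);
  });
}
```

For example, `renderForm([{ type: "CustomNavBar", name: "mainNavBar" }])` yields one rendered navbar entry.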
• Tech Stack: older codebases used Java with Spring MVC and the JSP template engine. Old backend microservices used Spring Boot. Some newer microsites/services used frameworks like React, Vue and Hapi.
There are also hybrid codebases that pair Java Spring MVC and JSP templates with a modern UI framework like React; these SPAs are compiled to JS and embedded in the JSP.
• Message Centre Migration project.
The requirement was to recreate the existing Message Centre using NodeJS instead of Python.
I documented the architecture and behaviour of the original application.
I performed R&D and planned for the new application.
I set up a new repository and local environment for development.
I wrote, tested and finally deployed the entire application to production.
A longer writeup of the project follows.
The original application had this structure: messages were stored in an array inside an account document object in DynamoDB. Each function of the application (adding, editing, deleting, etc.) was written in Python, hosted as an individual AWS Lambda and exposed as an API. Old and deleted messages were stored in an S3 bucket.
The migration was needed because the old application used deprecated versions of Python and the aws-sdk, and some of its functions were buggy and non-functional. It was decided to rewrite the application in NodeJS (TypeScript).
For the database, I changed the structure of the schema. Since DynamoDB does not support advanced querying of nested items, retrieving an individual message meant looping through a returned document. Additionally, each document is limited to 400KB of data, so adding more messages would eventually raise an unhandled exception.
I changed the schema so that each document is an individual message. Retrieving a message is now a quick lookup, and the number of messages is no longer capped by the document size limit.
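The reshaping can be illustrated as below. The attribute names and the partition/sort key choice are assumptions for illustration; the actual schema in the service may differ:

```typescript
// Old layout: one account document holding every message in a nested array.
interface OldAccountDoc {
  accountId: string;
  messages: { messageId: string; body: string }[];
}

// New layout: one DynamoDB item per message, keyed so a single message can be
// fetched directly instead of scanning an array (key names are assumed).
interface MessageItem {
  accountId: string; // partition key (assumed)
  messageId: string; // sort key (assumed)
  body: string;
}

// Flatten an old account document into per-message items, as the migration did.
function explode(doc: OldAccountDoc): MessageItem[] {
  return doc.messages.map((m) => ({ accountId: doc.accountId, ...m }));
}
```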
For the application logic, I combined the functions into a single microservice/lambda using
Scandium Entrypoint and HapiJS webserver. Each function has
a route, and unit tests were added.
Next, I noticed that each message was encrypted using AES in the insecure ECB mode (AES-256-ECB). To enhance security, I changed the encryption to AES-256-GCM.
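The move to GCM can be sketched with Node's built-in crypto module. Key handling is simplified here; in the real service the key lives in AWS KMS:

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

// AES-256-GCM sketch: unlike ECB, GCM uses a random IV per message and an
// authentication tag, so identical plaintexts encrypt differently and
// tampering is detected on decrypt.
const KEY = randomBytes(32); // illustrative only; real key comes from KMS

function encrypt(plaintext: string): { iv: string; tag: string; data: string } {
  const iv = randomBytes(12); // 96-bit IV, the recommended size for GCM
  const cipher = createCipheriv("aes-256-gcm", KEY, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return {
    iv: iv.toString("base64"),
    tag: cipher.getAuthTag().toString("base64"),
    data: data.toString("base64"),
  };
}

function decrypt(msg: { iv: string; tag: string; data: string }): string {
  const decipher = createDecipheriv("aes-256-gcm", KEY, Buffer.from(msg.iv, "base64"));
  decipher.setAuthTag(Buffer.from(msg.tag, "base64")); // verified at final()
  return Buffer.concat([
    decipher.update(Buffer.from(msg.data, "base64")),
    decipher.final(),
  ]).toString("utf8");
}
```

Storing the IV and auth tag alongside each ciphertext is what makes per-message random IVs workable.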
Next, I noticed that the cron-job Lambda that automatically deletes messages older than three months and archives them to S3 did not work. Upon analysis, I discovered that the original developers had not accounted for AWS's 15-minute limit on Lambda execution: the Lambda was being terminated before any work was done.
Due to red tape, I could only use AWS Lambda. To resolve this, I used DynamoDB pagination and stored the last evaluated key in a dummy document within DynamoDB to preserve state across sequential executions.
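The checkpointing pattern looks roughly like this, with the table and checkpoint item stubbed as in-memory objects; the real code persists DynamoDB's `LastEvaluatedKey` in a dedicated item and resumes from it on the next scheduled run:

```typescript
// In-memory stand-ins for the DynamoDB table and the checkpoint document.
type Key = number | undefined;

const table = Array.from({ length: 10 }, (_, i) => `msg-${i}`);
const checkpoint: { lastKey: Key } = { lastKey: undefined }; // "dummy document"
const processed: string[] = [];

// One "invocation": handle a fixed budget of pages (standing in for the
// 15-minute limit), then persist where we stopped so the next run resumes.
function invoke(pagesPerRun: number, pageSize: number): void {
  let key: Key = checkpoint.lastKey;
  for (let i = 0; i < pagesPerRun; i++) {
    const start = key ?? 0;
    const page = table.slice(start, start + pageSize); // paginated "scan"
    if (page.length === 0) return; // table fully processed
    processed.push(...page);
    key = start + page.length;
    checkpoint.lastKey = key; // survives the invocation boundary
  }
}
```

Calling `invoke` repeatedly walks the whole table even though no single call is allowed to finish it.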
Next, I had to migrate messages from the old table to the new one: each document had to be reshaped and re-encrypted. Again, I did not have access to ETL workflows and could only use AWS Lambda, so I wrote a cron job to process each message sequentially.
However, performance would have been slow. To speed it up, I used multi-threading via NodeJS' worker_threads together with DynamoDB parallel-scan segments. Each additional thread and segment provided large speed improvements (6 threads increased performance by 500%).
To facilitate local development, I hosted local versions of
AWS DynamoDB, Key Management Service, S3 using Docker
images.
Next, I performed tests to determine how long the data migration would take. In production the database was 150GB, so I created a test database and filled it with dummy messages up to 150GB. I split the table into 12 scan segments. Using two separate cron-job Lambdas, each running 6 threads (an AWS Lambda provides at most 6 vCPUs), the table was completely processed in just 1 day.
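The fan-out can be sketched as a segment assignment: 12 scan segments split across 2 Lambdas with 6 worker threads each. This only computes the plan a coordinator would hand to its worker_threads; the actual thread wiring and Scan calls are omitted:

```typescript
// Assign DynamoDB parallel-scan segments to (lambda, thread) pairs. With
// Scan's Segment/TotalSegments parameters, each worker reads a disjoint
// slice of the table.
interface Assignment {
  lambda: number;
  thread: number;
  segment: number; // passed as the Segment parameter of the Scan call
}

function assignSegments(lambdas: number, threadsPerLambda: number): Assignment[] {
  const totalSegments = lambdas * threadsPerLambda; // also TotalSegments
  const out: Assignment[] = [];
  for (let segment = 0; segment < totalSegments; segment++) {
    out.push({
      lambda: Math.floor(segment / threadsPerLambda),
      thread: segment % threadsPerLambda,
      segment,
    });
  }
  return out;
}
```

`assignSegments(2, 6)` yields 12 disjoint segments, one per thread across the two Lambdas.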
• Traded electricity produced by Keppel Merlimau Cogen
on the Singapore Open Electricity Market (OEM).
• Imported renewable electricity from
regional multilateral power trade (LTMS-PIP, Lao
PDR-Thailand-Malaysia-Singapore Power Integration
Project)
• Used Visual Basic for Applications
(VBA) to automate workflows.
• Created VBA Macros to send regular update emails to
management on upstream supply, market conditions, and daily
trading operations.
• Created VBA Macros to calculate bids to the OEM based on
market conditions.
• Performed shift duty for gas & electricity market
operations.
• Coordinated closely with power plant on plant dispatch and
related matters.
• Coordinated closely with different business functions to
derive value within the Power & Renewable line of
business.
• Provided analytical support to business managers on
commercial and regulatory matters.
• Liaised and collaborated with regulators and industry
stakeholders.
Nanyang Technological University (NTU) | August 2018 – May
2022
Bachelor of Engineering – Chemical and Biomolecular Engineering
Recipient of NTU's ‘Nanyang Scholarship’ award
Victoria Junior College (VJC) | February 2014 – December
2015
Graduated with 7 Distinctions in the GCE ‘A’ Levels (90 Rank
Points)
Victoria School (VS) | February 2009 – December 2013
Graduated with 8 Distinctions in the GCE ‘O’ Levels (L1R5 of
6)
| Telephone | +65 9152 9980 |
| Email | byronang@hotmail.com |
| Github | https://github.com/KeyLimePie7 |