
Top Rated Freelancer at Upwork

Feb 2021 - Present


As a Top Rated freelancer on Upwork for the past two years, I have successfully completed over 90 projects and have received a 5-star rating on 90% of them. While I am unable to list all of my projects, I am proud to highlight a few of my most notable achievements.

A mobile application that runs on Android and iOS devices. This application is designed to connect cleaners and customers. I will share more details about this project in the future.


Project Overview

This project involves scraping Amazon products with Playwright, storing the data in Google Sheets, and sending specific products to the CDON API.

Flow

  1. Read the URL and country from the Google Sheet.
  2. Scrape product details from Amazon based on the URL and country.
  3. Save images to Shopify via the GraphQL API.
  4. Save the product details back to the Google Sheet.
  5. Repeat the process for all products in the Google Sheet.
  6. If the Platform column contains the value CDON, send the product to the CDON API.
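
A minimal sketch of the core loop, assuming a Google Sheet with URL, Country, and Platform columns; the sheet name, column positions, and the Amazon selector are illustrative, not the production values:

    # Illustrative sketch only: sheet name, column layout, and selectors are assumptions.
    import gspread
    from playwright.sync_api import sync_playwright

    gc = gspread.service_account(filename="service_account.json")  # hypothetical credentials file
    sheet = gc.open("Products").sheet1                              # hypothetical sheet name

    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()

        for i, row in enumerate(sheet.get_all_records(), start=2):  # row 1 holds the headers
            page.goto(row["URL"])
            title = (page.text_content("#productTitle") or "").strip()  # selector is illustrative
            sheet.update_cell(i, 4, title)  # write the title back to column D

            if row.get("Platform") == "CDON":
                pass  # send the product to the CDON API here (endpoint not shown in the source)

        browser.close()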

Challenges

  • Handling CAPTCHAs
    • Used proxy rotation, but it was expensive.
    • Implemented the Google Gemini API to solve image CAPTCHAs (see the sketch after this list).
    • If a CAPTCHA is detected:
      1. Download the image.
      2. Send it to the Gemini API.
      3. Receive the text solution from the API.
      4. Enter the solution in the CAPTCHA field.
    • If the CAPTCHA solution fails, repeat the process.
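
A minimal sketch of that CAPTCHA step, assuming the google-generativeai SDK and an already-downloaded CAPTCHA image; the model name, prompt, and field selector are illustrative:

    # Illustrative sketch: model name, prompt, and field selector are assumptions.
    import google.generativeai as genai
    from PIL import Image

    genai.configure(api_key="YOUR_API_KEY")
    model = genai.GenerativeModel("gemini-1.5-flash")

    def solve_captcha(image_path: str) -> str:
        """Send the CAPTCHA image to Gemini and return the text it reads."""
        image = Image.open(image_path)
        response = model.generate_content(
            [image, "Read the characters in this CAPTCHA image. Reply with the characters only."]
        )
        return response.text.strip()

    # On failure, the scraper simply downloads a fresh CAPTCHA and repeats the call.
    # solution = solve_captcha("captcha.png")
    # page.fill("#captchacharacters", solution)  # the field selector here is illustrative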

Lessons Learned

  • Learned about the Shopify GraphQL API.
  • Learned about the Google Gemini API.

Contribution from Usman

Usman made a significant contribution to the project by:

  • Image Quality Improvement: Helped obtain high-quality images from the Amazon product pages.

This contribution was crucial in enhancing the project’s overall quality and accuracy.


Purpose

The purpose of this project was to integrate GitHub authentication to restrict site access to authorized team members. The goal was to ensure that only users who belong to a specific team (e.g., team1) can access the corresponding subdomain (e.g., team1.lablnet.com) after authentication. The solution needed to be scalable to accommodate any number of teams and subdomains.

Solution

  • To achieve this, I developed two Cloudflare Workers:

    1. GitHub Authentication Redirect: Handles GitHub authentication and redirects users to the appropriate subdomain.
    2. Team Verification: Verifies team membership based on the subdomain and grants or denies access to the site.
  • To automate subdomain creation and mapping to the Cloudflare Pages branch, I wrote a GitHub Action that uses the Cloudflare API to create CNAME records (see the sketch after this list).
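
A minimal sketch of the CNAME-creation step the GitHub Action performs, using the Cloudflare DNS records endpoint; the zone ID, API token, and Pages target are placeholders:

    # Illustrative sketch: zone ID, API token, and the Pages target are placeholders.
    import os
    import requests

    CF_API = "https://api.cloudflare.com/client/v4"
    ZONE_ID = os.environ["CF_ZONE_ID"]
    HEADERS = {"Authorization": f"Bearer {os.environ['CF_API_TOKEN']}"}

    def create_team_subdomain(team: str) -> None:
        """Create a CNAME like team1.lablnet.com pointing at the Cloudflare Pages branch alias."""
        payload = {
            "type": "CNAME",
            "name": f"{team}.lablnet.com",
            "content": f"{team}.my-pages-project.pages.dev",  # hypothetical Pages branch alias
            "proxied": True,
        }
        resp = requests.post(f"{CF_API}/zones/{ZONE_ID}/dns_records", headers=HEADERS, json=payload)
        resp.raise_for_status()

    create_team_subdomain("team1")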

Challenges

  • Handling GitHub authentication redirects for multiple subdomains, since wildcard callback URLs (e.g., *.lablnet.com) are not supported.
  • Ensuring that the solution is scalable and can accommodate any number of teams and subdomains.

Lessons Learned

  • The value of leveraging Cloudflare Workers for custom authentication and routing logic

A mobile application that runs on Android and iOS devices. This application is designed to manage parking lots.

Features

Admin

  1. Admin can add parking slots.
  2. Admin can add addons.
  3. Admin can add terminal locations.

Employee

The admin also has all of these features.

  1. Employee can see the parking history.
  2. Employee can see the shuttle service requests.
  3. Employee can chat with the user.
  4. Employee can download the invoice.
  5. Employee can manage the parking lot orders.
  6. Employee can invoice the parking lot orders.

User

  1. User can register and login.
  2. User can book a parking slot.
  3. User can see the parking history.
  4. User can request shuttle service.
  5. User can download the invoice.
  6. User can chat with the admin/team.

Lessons Learned

I learned how to use Firebase callable functions and Firebase Storage.


Purpose

The purpose of this PHP script was to sync Google Contacts with a MySQL database. The script needed to fetch contacts from a Google account and store them in a MySQL database. It also needed to update the database with any changes made to the Google contacts, and vice versa.

Solution

  • I developed a PHP script that authenticated with Google using OAuth 2.0 and fetched contacts from the Google account via the Google People API.
  • The script then connected to a MySQL database and stored the contacts in the database.
  • To ensure that the database was updated with any changes made to the Google contacts, I implemented a mechanism that compares the contacts in the database with the contacts fetched from Google and updates the database accordingly (see the sketch after this list).
  • The script uses a cron job to run the sync process at regular intervals.
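
The script itself was PHP; purely as an illustration of the same flow, here is a Python-style sketch of fetching contacts via the People API and upserting them into MySQL. The table and column names are assumptions:

    # Illustrative sketch in Python (the actual project was a PHP script);
    # the table and column names are assumptions.
    import mysql.connector
    from googleapiclient.discovery import build

    def sync_contacts(creds, db_config):
        people = build("people", "v1", credentials=creds)
        result = people.people().connections().list(
            resourceName="people/me",
            personFields="names,emailAddresses",
            pageSize=200,
        ).execute()

        db = mysql.connector.connect(**db_config)
        cur = db.cursor()
        for person in result.get("connections", []):
            name = person.get("names", [{}])[0].get("displayName", "")
            email = person.get("emailAddresses", [{}])[0].get("value", "")
            # Upsert so repeated cron runs update existing rows instead of duplicating them.
            cur.execute(
                "INSERT INTO contacts (resource_name, name, email) VALUES (%s, %s, %s) "
                "ON DUPLICATE KEY UPDATE name = VALUES(name), email = VALUES(email)",
                (person["resourceName"], name, email),
            )
        db.commit()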

Challenges

  • Two-way sync between Google Contacts and the MySQL database

Lessons Learned

  • I learned how to use the Google People API to fetch contacts from a Google account.

Purpose

The purpose of this project is to develop a basic Proof of Concept (POC) for software that extracts vehicle registration, mileage, and listing price from Autotrader UK car adverts, generates a CAP (Car Auction Prices) value based on the extracted data, and stores the data in a database.

The client paused the project because the CAP calculation provider was charging a lot of money for the data. The client is looking for a new CAP calculation provider.

Solution

  • Web Scraping: Extract vehicle registration, mileage, and listing price from Autotrader UK car adverts using web scraping techniques.
    • As the registration number is not available as text on the Autotrader website, we read it from the photo of the car.
    • For that, we leverage the Gemini Vision API to extract the registration number from the image (see the sketch after this list).
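
A minimal sketch of that registration step; downloading the advert photo and the plate-format sanity check are my own additions for illustration, the source only states that Gemini Vision reads the plate:

    # Illustrative sketch: image handling and the plate-format check are illustrative additions.
    import re
    from io import BytesIO

    import requests
    from PIL import Image
    import google.generativeai as genai

    genai.configure(api_key="YOUR_API_KEY")
    model = genai.GenerativeModel("gemini-1.5-flash")

    def registration_from_image(image_url: str) -> str | None:
        image = Image.open(BytesIO(requests.get(image_url, timeout=30).content))
        response = model.generate_content(
            [image, "Read the UK number plate in this photo. Reply with the registration only."]
        )
        reg = response.text.strip().upper().replace(" ", "")
        # Sanity-check against the current UK format (e.g. AB12CDE);
        # older registration formats would need additional rules.
        return reg if re.fullmatch(r"[A-Z]{2}\d{2}[A-Z]{3}", reg) else None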

Challenges

  • Handling variations in Autotrader UK car advert formats and data extraction

Lessons Learned

  • The need for a scalable and efficient database solution for storing and retrieving large amounts of data.

This project evolved through several iterations, each marked by specific enhancements and the introduction of new features. Below is a detailed account of these versions, highlighting the key concepts, changes, and technical implementations.

Version 1 (Feb 1, 2024 - Feb 2, 2024)

Key concepts in Version 1

In the initial version, I developed a GitHub Action workflow designed to activate on every push to the repository and each pull request. The primary objectives were to execute coverage and linter checks for a Rust project, capture these outputs, and automatically comment on the pull request with the results. This workflow was successfully implemented and rigorously tested within the repository.

Version 2 (Feb 4, 2024 - Feb 10, 2024)

Key Changes in Version 2
Significant advancements were made in this iteration:

  • A new GitHub Action workflow, ai-comment.yml, was created to run on every push and pull request, enhancing the project’s automation and integration capabilities.

  • A key task was to aggregate all Rust files following a specified schema, excluding any that matched defined patterns (‘test’, ‘schema’). This was achieved through a straightforward shell script, which efficiently processed and prepared these files for further analysis.

    • Note: I am sharing these files because they are on a public repo.

    • I’ve done this with a simple shell script below:

      rust_file_aggregator.sh
      #!/bin/bash

      # Initialize the output file
      echo "" > rust.md

      # Find all the Rust files in the current directory and its
      # subdirectories, and iterate over them
      find . -name "*.rs" | while read -r file; do
          # Skip files whose names match the exclusion patterns
          # ['test', 'schema']
          if [[ $file != *test* && $file != *schema* ]]; then
              # Print the file name
              echo "Processing file: $file"
              # Append the file name to the output file
              echo "### FILE: $(basename "$file")" >> rust.md
              # Append the file content to the output file
              cat "$file" >> rust.md
          fi
      done
      
    • These files were then transmitted to a mock API endpoint (which I created) built with AWS Lambda and DynamoDB, demonstrating a practical application of serverless technologies in automating code review processes.

  • Additionally, a webhook workflow was set up to trigger upon receiving webhook events, integrated with two AWS Lambda functions: API-CRON for dynamic API simulation and Webhook for repository data management (see the sketch after this list).

    • API-CRON: part of the mock API, used to randomly simulate API behavior.
    • Webhook: reads data from DynamoDB and adds a comment to the PR.
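
A minimal sketch of what the Webhook Lambda does, assuming a DynamoDB table keyed by PR number and a GitHub token in the environment; the table name, key schema, and event shape are placeholders:

    # Illustrative sketch: table name, key schema, event shape, and env vars are placeholders.
    import os

    import boto3
    import requests

    table = boto3.resource("dynamodb").Table("mock-api-results")  # hypothetical table name

    def handler(event, context):
        repo = event["repo"]            # e.g. "owner/name"
        pr_number = event["pr_number"]

        # Read the latest mock-API result stored for this PR.
        item = table.get_item(Key={"pr_number": pr_number}).get("Item", {})
        body = item.get("comment", "No results available yet.")

        # Post the comment on the pull request via the GitHub REST API.
        resp = requests.post(
            f"https://api.github.com/repos/{repo}/issues/{pr_number}/comments",
            headers={"Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}"},
            json={"body": body},
        )
        resp.raise_for_status()
        return {"statusCode": 200}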

Version 3 (Feb 12, 2024 - Feb 18, 2024)

Key Changes in Version 3

This phase marked a significant shift in the project’s direction, with the introduction of a GitHub App and Webhook, both hosted on AWS Lambda, showcasing a complex, integrated development environment:

  • The flow of the GitHub App was as follows:
    • The user installs the GitHub App on their repository.
    • The GitHub App webhook is triggered by a GitHub event.
    • The GitHub App sends the data to AWS Lambda.
    • The AWS Lambda processes the data, saves it to DynamoDB, and commits the required files and secrets to the repository.
      • ai-comment.yml and rust_file_aggregator.sh are committed to the repository.
      • The mock API URL is saved to the repository secrets.
  • The Version 2 Lambdas were reused as-is, with the following update:
    • The Webhook Lambda was updated to update the comment on the PR with the data from the mock API.
  • A new Lambda, GithubAppWebhook, was added to handle the GitHub App webhook event (see the sketch after this list).
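
A minimal sketch of how the GithubAppWebhook Lambda can verify GitHub’s webhook signature before processing an event; the secret name and routing are placeholders, while the X-Hub-Signature-256 scheme is GitHub’s documented one:

    # Illustrative sketch: the secret name and routing are placeholders.
    import hashlib
    import hmac
    import json
    import os

    def handler(event, context):
        body = event["body"]
        signature = event["headers"].get("x-hub-signature-256", "")

        # GitHub signs the raw body with the webhook secret using HMAC SHA-256.
        expected = "sha256=" + hmac.new(
            os.environ["WEBHOOK_SECRET"].encode(), body.encode(), hashlib.sha256
        ).hexdigest()
        if not hmac.compare_digest(expected, signature):
            return {"statusCode": 401, "body": "invalid signature"}

        payload = json.loads(body)
        # On installation / pull request events: save to DynamoDB, commit ai-comment.yml and
        # rust_file_aggregator.sh, and store the mock API URL as a repository secret (not shown).
        return {"statusCode": 200, "body": "ok"}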

Version 4 (Feb 21, 2024 - March 1, 2024)

Key Changes in Version 4

This phase marked an even more significant shift in the project’s direction, with the introduction of a GitHub App to manage the PR. Instead of committing ai-comment.yml and rust_file_aggregator.sh to the repository, the GitHub App now handles the PR and comments on it with the data from the mock API.

  • The flow of the GitHub App was as follows:
    • The app now listens for PR events and comments on the PR with the data from the mock API.
    • The GitHub App clones the user’s repo whenever a PR is created or synchronized.
      • The app performs the required steps that previously ran in the repository’s own workflow.
  • The Lambda functions were converted to a FastAPI application hosted on AWS EC2 (see the sketch after this list).
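
A minimal sketch of the FastAPI shape the Lambdas were converted to; the route name and payload handling are assumptions:

    # Illustrative sketch: route name and payload handling are assumptions.
    from fastapi import FastAPI, Header, Request

    app = FastAPI()

    @app.post("/webhook")
    async def github_webhook(request: Request, x_github_event: str = Header(default="")):
        payload = await request.json()
        if x_github_event == "pull_request" and payload.get("action") in {"opened", "synchronize"}:
            # Clone the user's repo, aggregate the Rust files, call the mock API,
            # then comment on the PR (steps omitted here).
            return {"status": "processing"}
        return {"status": "ignored"}

    # On EC2 this would run behind something like: uvicorn main:app --host 0.0.0.0 --port 8000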

Lessons Learned

  • How to create a GitHub App and seamlessly integrate a webhook with it, hosted on AWS Lambda.

I am currently working on a project called Collaboration App. Collaboration App is a web application that allows users to collaborate with each other. I am developing this project using Vue, TailwindCSS, and Firebase. The features of this project are as follows:

  • The admin will invite users to join the app.
  • The admin can create other admins, users, and rooms, and assign users to rooms.
  • Users can join a room and chat with other users in the same room.
    • Users can send text messages.
    • Users can send images.
    • Users can send files.
  • The admin can create a Zoom meeting or schedule one in advance.
  • Users can join the Zoom meeting from the app.
  • The Zoom meeting should run inside the app via the Zoom SDK.

Lessons Learned

  • I learned how to integrate the Zoom SDK into a web application.
  • I learned how to use Firebase Functions to create and join Zoom meetings (see the sketch below).
  • I learned how to use Firebase Storage to store images and files.
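
The meetings were created from Firebase Functions; purely as an illustration (shown in Python rather than the Functions runtime, and assuming a Zoom OAuth access token has already been obtained), the create-meeting call looks roughly like this:

    # Illustrative sketch: the project used Firebase Functions; shown in Python here.
    # Assumes a valid Zoom OAuth access token has already been obtained.
    import requests

    def create_zoom_meeting(access_token: str, topic: str, start_time_iso: str) -> str:
        """Create a scheduled Zoom meeting and return its join URL."""
        resp = requests.post(
            "https://api.zoom.us/v2/users/me/meetings",
            headers={"Authorization": f"Bearer {access_token}"},
            json={
                "topic": topic,
                "type": 2,                     # scheduled meeting
                "start_time": start_time_iso,  # e.g. "2024-05-01T10:00:00Z"
                "duration": 60,
            },
        )
        resp.raise_for_status()
        return resp.json()["join_url"]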

Purpose

The primary purpose of this CRM is to automate the process of receiving, validating, and distributing customer leads to various lead buyers based on pre-defined criteria and schedules. It aims to streamline lead management, ensure efficient lead distribution, and maintain a clear record of transactions and interactions without actual financial transactions within the system.

Features

  1. Receives submissions and performs basic validation and formatting.
  2. Automatically sends validated leads to the appropriate buyer’s CRM using their API key.
  3. Visually tracks the cost of leads against lead buyers’ balances without handling real transactions.
  4. Distributes leads based on a predefined ratio and schedule, ensuring fair and efficient distribution among buyers.
  5. Identifies repeat customers and routes their data to the original lead buyer at no extra cost.
  6. Provides separate interfaces for different user roles, with two-factor authentication for lead buyers and comprehensive management options for admins.

Challenges

  • Ensuring seamless integration with various external CRMs owned by lead buyers, each potentially having different API specifications.
  • Developing a fair and efficient algorithm for lead distribution that can handle varying schedules and ratios among lead buyers (a simplified sketch follows this list).
  • I was able to overcome these challenges by:
    • Implementing a robust error handling system to identify and fix issues promptly.
    • Using a combination of Firebase Cloud Functions and Cloud Scheduler to automate lead distribution.
    • Using Firebase Cloud Firestore to store and retrieve data efficiently.
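
A simplified sketch of the ratio-based distribution; the data shapes here are assumptions, and the production logic ran in Firebase Cloud Functions triggered by Cloud Scheduler rather than as plain Python:

    # Simplified sketch: buyer ratios and the lead shape are assumptions.
    from itertools import cycle

    def build_rotation(buyers: dict) -> cycle:
        """Expand {buyer: ratio} into a repeating rotation, e.g. {'A': 2, 'B': 1} -> A, A, B, ..."""
        return cycle([name for name, ratio in buyers.items() for _ in range(ratio)])

    def distribute(leads: list, buyers: dict, original_buyer: dict) -> list:
        rotation = build_rotation(buyers)
        assignments = []
        for lead in leads:
            # Repeat customers are routed back to their original buyer at no extra cost.
            if lead["email"] in original_buyer:
                assignments.append((lead["email"], original_buyer[lead["email"]], 0.0))
            else:
                assignments.append((lead["email"], next(rotation), lead["price"]))
        return assignments

    # Example: buyers A and B with a 2:1 ratio, one new lead and one repeat customer.
    print(distribute(
        [{"email": "new@example.com", "price": 25.0}, {"email": "old@example.com", "price": 25.0}],
        {"A": 2, "B": 1},
        {"old@example.com": "B"},
    ))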

Lessons Learned

  • Learned how to implement 2FA.

Purpose

The primary objective is to develop a chatbot website that allows users to interact with a bot for various actions, including retrieving and storing information. This system aims to facilitate user-bot interactions in a structured and efficient manner.

Features

  1. Utilizes OpenAI’s APIs for natural language processing.
  2. Stores user messages along with the conversation history.
  3. Uses the OpenAI moderation API to filter out inappropriate content.
    • Blocks conversations that contain inappropriate content.
  4. Admin Dashboard with impersonation feature.

Challenges

Prompt engineering is a new field that is still in its infancy. As such, there were many challenges during the development of this project. Some of the major ones are listed below:

  • Natural Language Processing: The chatbot needs to understand and respond to user queries.
    • Queries can be phrased like “I want to vote on rule 2” or “I want to vote on rule 2 as neutral”.
    • The bot needs to understand the intent of the user and respond accordingly.
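
A minimal sketch of the moderation gate, assuming the official openai Python client; conversation storage and the blocking action are simplified:

    # Illustrative sketch: uses the official openai client; storage and blocking are simplified.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def is_allowed(message: str) -> bool:
        """Return False when the moderation endpoint flags the message."""
        result = client.moderations.create(input=message)
        return not result.results[0].flagged

    if is_allowed("I want to vote on rule 2"):
        pass  # continue: store the message with the conversation history and reply via the chat API
    else:
        pass  # block the conversation, as described above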

Lessons Learned

  • Learned how to use OpenAI’s APIs for natural language processing and prompt engineering.

This project entailed the development of an application where users, upon signing up, are navigated to a global view. In this view, users have the ability to sort patients and select a patient to view more detailed information, which includes various charts. The web application also incorporates an admin panel. This panel enables the admin to create, delete, and edit user profiles. The application is multilingual, offering support for French, English, and German. Additionally, the application is integrated with Google Analytics and Google Tag Manager for comprehensive data tracking and management.


The project involved creating a web-based Student Information System. The system was designed to manage student information, including registration details, course enrollment, and payment processing. The system had a user-friendly interface, and it provided a range of features to help users efficiently manage student data. These features included the ability to view and update student information, register students for courses, process payments, and generate reports. Overall, the system was a valuable tool for managing student information at educational institutions.


The project involved creating an alert system that would send notifications via SMS and email when stock prices were updated. This required web scraping techniques to gather the necessary data from various online sources. I was able to successfully implement the alert system, and it was able to provide timely notifications to the specified recipients. This project allowed me to showcase my web scraping skills, and I was pleased with the successful completion of this task.


Consulting with a client in Germany on the use of Python, Flask, and Firebase was an impressive and exciting experience. The project involved debugging issues and providing guidance on how to identify and solve future issues. I recommended using a hypothesis-driven approach, in which we develop a hypothesis about the cause of the problem and then use a divide-and-conquer strategy to systematically address each step. This approach proved to be effective, and the client was able to successfully resolve the issues with the project. Overall, it was a rewarding experience to be able to provide valuable guidance to the client and help them overcome challenges with their project.


ProtonCash is an ERP CRM system that is used to manage legal cases and grant funds as loans. Weareappointments is another CRM system that is used to manage leads. These two systems are integrated to provide a comprehensive solution for managing legal cases and identifying potential clients. ProtonCash is specifically designed to handle the financial aspects of legal cases, including the granting of loans and the tracking of payments. Weareappointments, on the other hand, is focused on managing the leads generated by the legal firm, including the tracking of client interactions and the scheduling of appointments. Together, these two systems provide a powerful tool for managing legal cases and client relationships.


One of my recent projects involved the development of a subscription software service for a client. The service offered customers the ability to choose from five products for their subscription, and utilized CheddarGetter for payment management, customer email communications, and other necessary tasks. However, the client was in the process of migrating to Stripe.com’s subscription billing service and transitioning their code from ASP.NET to PHP. I was tasked with implementing these changes, as well as creating a new database table to store the customer’s subscription information and a simple admin panel to manage the subscriptions. I found this project to be engaging and was able to complete it within seven days, with the remaining time spent on testing by the client. This project allowed me to showcase my skills in web development and database management.