About Me

A budding programmer and technology enthusiast, fascinated by constantly evolving technology.

Academic qualifications

Bachelor of Engineering in Information Technology (BE in IT) from Xavier's Institute of Engineering, Mumbai (2009-2013).

Skills

  • Languages: Self-taught in Python, PHP, Ruby, and JavaScript.
  • Tools: Experience with Vim, Bash, Zsh, Visual Basic, and ASP through coursework.
  • Environments: Comfortable with Linux-based systems.
  • Knowledge of Agile (iterative) development.
  • Experience in designing rich, interactive client interfaces with AngularJS, JavaScript, and jQuery.
  • Experience with version control systems such as Git.

Domain Expertise

  • Online Retail Services.

Work Experience

  • Ugam Solutions (Goregaon, Mumbai) from July 2013 to September 2014
    • Programmer in the Market Research and Analytics team.
    • Work involved scripting in Dimensions (an IBM tool for survey programming).
  • Ugam Solutions, Retail Tech team from September 2014 to July 2015
    • Product Engineer in the BIGM (Retail Tech Team).

    • Work involved developing web-based applications for the online retail market.

    • Projects worked on:

      1. Data Extraction web application: A web application through which crawled data is processed and the required attributes are extracted (see the sketch after the technology list below).

      Technologies used:

      • Flask, natural language processing, and Python modules for processing the data.
      • Flask, Apache, AngularJS, jQuery, Ajax, HTML, and CSS for the web interface.
      • Flask and Apache for the web services.
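A minimal sketch of the kind of Flask extraction endpoint described above. The route, request shape, and attribute rules here are illustrative assumptions, not the actual project code; the real pipeline relied on NLP and other Python modules.

import re

from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route("/extract", methods=["POST"])
def extract():
    # Accept a piece of crawled text and return a few extracted attributes.
    text = request.get_json(force=True).get("text", "")

    # Toy extraction rules; the real application used NLP-based processing.
    price = re.search(r"\$\s*(\d+(?:\.\d{2})?)", text)
    brand = re.search(r"\bby\s+([A-Z][\w-]+)", text)

    return jsonify({
        "price": float(price.group(1)) if price else None,
        "brand": brand.group(1) if brand else None,
    })


if __name__ == "__main__":
    app.run()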
  • Stylabs Pvt. Ltd. from August 2015 to present
    • Working as a Senior Backend Developer.

    • Job responsibilities include:
      1. Designing, building, debugging, and maintaining web interfaces.
      2. Writing scripts to automate and monitor processes.
      3. Optimizing current systems while continuously evaluating emerging technologies.
      4. Documenting current and future configuration processes and policies.
    • Projects worked on:

      1. Crawling interface: A web interface through which online retail stores can be crawled and the required data extracted (see the sketch after the technology list below).

      Technologies used:

      • Scrapy and Django for crawling and processing data.
      • RabbitMQ and Redis for distributing and managing URLs.
      • Django, jQuery, HTML, and CSS for the web UI.
      • Django, Apache, and Passenger for the web services.
      • MongoDB for storage.
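A minimal sketch of the URL-distribution pattern described above: the web interface pushes store URLs onto a Redis list, and a Scrapy spider pops and crawls them. The queue key, spider name, and CSS selectors are illustrative assumptions, not the actual project code (the production setup also used RabbitMQ, Django, and MongoDB).

import redis
import scrapy


class RetailSpider(scrapy.Spider):
    name = "retail"

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.queue = redis.Redis()      # assumes a local Redis instance
        self.queue_key = "crawl:urls"   # hypothetical queue name

    def start_requests(self):
        # Pop URLs that the web interface queued for crawling.
        while True:
            url = self.queue.lpop(self.queue_key)
            if url is None:
                break
            yield scrapy.Request(url.decode(), callback=self.parse)

    def parse(self, response):
        # Selectors are placeholders; real ones depend on the target store.
        yield {
            "url": response.url,
            "title": response.css("h1::text").get(),
            "price": response.css(".price::text").get(),
        }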

      2. Product Matching interface: An interface for matching crawled products across websites.

      3. Chrome extension for crawling: A Chrome extension that crawls curated products from a product website (see the sketch after the technology list below).

      Technologies used:

      • Django/Python modules for processing the data and queuing it for crawling.
      • Django, Ajax, jQuery, HTML, and CSS for the web UI.
      • Django, Apache, and Passenger for the web services.
      • MongoDB and MySQL as databases.
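A minimal sketch of the queuing step described above: a Django view that receives curated product URLs from the Chrome extension and queues them for the crawler. The endpoint name, payload shape, and use of a Redis list here are assumptions for illustration only.

import json

import redis
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt

queue = redis.Redis()                   # assumes a local Redis instance


@csrf_exempt
def queue_products(request):
    # The extension POSTs a JSON body like {"urls": ["https://...", ...]}.
    payload = json.loads(request.body or b"{}")
    urls = payload.get("urls", [])

    for url in urls:
        queue.lpush("crawl:urls", url)  # hypothetical queue name

    return JsonResponse({"queued": len(urls)})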