An early-career programmer and technology enthusiast, fascinated by constantly evolving technology.
Academic Qualifications
Bachelor of Engineering in Information Technology (BE in IT) from Xavier's Institute of Engineering, Mumbai (2009-2013).
Skills
Languages:
- Self-taught in Python, PHP, Ruby, and JavaScript.
- Experience with Vim, Bash, Zsh, Visual Basic, and ASP gained through coursework.
- Environments: Comfortable with Linux-based systems.
- Knowledge of Agile (iterative) development.
- Experience designing rich, interactive client interfaces with AngularJS, JavaScript, and jQuery.
- Experience with version control systems such as Git.
Domain Expertise
- Online Retail Services.
Work Experience
- Ugam Solutions (Goregaon, Mumbai) from July 2013 to September 2014
- Programmer in the Market Research and Analytics team.
- Work involved scripting in Dimensions (an IBM tool for survey programming).
- Ugam Solutions, Retail Tech team from September 2014 to July 2015
- Product Engineer in the BIGM (Retail Tech) team.
- Work involved developing web-based applications for the online retail market.
- Projects worked on:
1. Data Extraction web application: A web application through which crawled data is processed and the required attributes extracted. Technologies used:
- Flask, natural language processing, and Python modules for processing the data.
- Flask, Apache, AngularJS, jQuery, Ajax, HTML, and CSS for the web interface.
- Flask and Apache for web services.
- Stylabs Pvt. Ltd. from August 2015 to present
- Working as a Senior Backend Developer.
- Job responsibilities include:
- Designing, building, debugging, and maintaining web interfaces.
- Writing scripts to automate and monitor processes.
- Optimizing current systems while continuously evaluating emerging technologies.
- Documenting current and future configuration processes and policies.
- Projects worked on:
1. Crawling interface: A web interface through which online retail stores can be crawled and the required data extracted.
Technologies used:
- Scrapy and Django for crawling and processing data.
- RabbitMQ and Redis for distributing and managing URLs.
- Django, jQuery, HTML, and CSS for the web UI.
- Django, Apache, and Passenger for the web services.
- MongoDB for storage.
2. Product Matching interface: An interface for matching crawled products across websites.
3. Chrome extension for crawling: A Chrome extension that crawls curated products from the product website.
Technologies used:
- Django/Python modules for processing the data and queuing it for crawling.
- Django, Ajax, jQuery, HTML, and CSS for the web UI.
- Django, Apache, and Passenger for the web services.
- MongoDB and MySQL as databases.