Overview
- TX, USA
- Created a daily aggregated report for the client to support decision-making and the analysis of market pattern trends
- Built an internal visualization platform for clients to view historical data, compare issuers, and access analytics for different bonds and markets
- The model collects and merges daily data from market providers and applies various cleaning techniques to eliminate bad data points
- The model merges the daily data with historical data and applies various quantitative algorithms to determine the best fit for the day
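A minimal sketch of the clean-and-merge step described above, using pandas; the column names (`issuer`, `date`, `price`) and cleaning rules are illustrative assumptions, not the production logic:

```python
import pandas as pd


def clean_and_merge(history: pd.DataFrame, daily: pd.DataFrame) -> pd.DataFrame:
    """Drop bad points from the daily feed, then merge it into history."""
    # Basic cleaning: drop rows with missing or non-positive prices
    daily = daily.dropna(subset=["price"])
    daily = daily[daily["price"] > 0]

    # Append daily rows to history; if a (issuer, date) pair already
    # exists, keep the latest version
    merged = pd.concat([history, daily], ignore_index=True)
    merged = merged.drop_duplicates(subset=["issuer", "date"], keep="last")
    return merged.sort_values(["issuer", "date"]).reset_index(drop=True)


history = pd.DataFrame(
    {"issuer": ["A"], "date": ["2023-01-01"], "price": [99.5]}
)
daily = pd.DataFrame(
    {"issuer": ["A", "A"], "date": ["2023-01-02", "2023-01-03"],
     "price": [100.1, None]}
)
print(clean_and_merge(history, daily))
```

The row with a missing price is dropped before the merge, so only valid daily points reach the historical store.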
- Captured daily changes for each market and generated a daily email alert to help the client make better investment decisions
- Built the model on the Azure platform using Python and Spark for model development, with Dash by Plotly for visualizations
- Built REST APIs to easily add new analytics or issuers into the model
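A minimal sketch of such an API using Flask; the endpoint path, payload fields, and in-memory registry are hypothetical stand-ins, not the actual production interface:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
ISSUERS = {}  # in-memory stand-in for the model's issuer registry


@app.route("/issuers", methods=["POST"])
def add_issuer():
    """Register a new issuer so the model picks it up on its next run."""
    payload = request.get_json()
    ISSUERS[payload["name"]] = payload
    return jsonify({"status": "created", "issuer": payload["name"]}), 201


@app.route("/issuers", methods=["GET"])
def list_issuers():
    """Return the names of all registered issuers."""
    return jsonify(sorted(ISSUERS))


if __name__ == "__main__":
    app.run(debug=True)
```

New issuers can then be added with a single POST request rather than a code change, which is the point of exposing the model through REST endpoints.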
- Automated workflows that were previously initiated manually, using Python scripts and Unix shell scripting
- Created, activated, and programmed in Anaconda environments
- Worked on predictive analytics use cases using Python
- Cleaned and processed third-party spending data into manipulable, consistently formatted deliverables using Excel macros and Python libraries such as NumPy, SQLAlchemy, and Matplotlib
- Used pandas to structure data in time-series and tabular formats for manipulation and retrieval
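An illustration of the time-series handling above, using sample records rather than the actual dataset; the `date` and `spend` field names are assumptions:

```python
import pandas as pd

# Raw records with a gap on 2023-01-02
records = [
    {"date": "2023-01-01", "spend": 120.0},
    {"date": "2023-01-03", "spend": 80.0},
]

df = pd.DataFrame(records)
df["date"] = pd.to_datetime(df["date"])

# Index by date and reindex to a daily frequency; the missing
# day appears as NaN, making gaps explicit for downstream analysis
series = df.set_index("date")["spend"].asfreq("D")
print(series)
```

Once the data carries a `DatetimeIndex`, resampling, rolling windows, and date-range slicing all become one-liners.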
- Helped migrate data from the old server to the Jira database (matching fields), using Python scripts to transfer and verify the information
- Analyzed and formatted data using machine learning algorithms with Python's scikit-learn
- Experience with Python, Jupyter, and the scientific computing stack (NumPy, SciPy, pandas, and Matplotlib)
- Troubleshot, fixed, and deployed many Python bug fixes for the two main applications that served as a primary data source for both customers and the internal customer service team
- Wrote Python scripts to parse JSON documents and load the data into a database
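A self-contained sketch of that parse-and-load pattern using only the standard library; SQLite stands in for the target database, and the table and field names are hypothetical:

```python
import json
import sqlite3

# Parse a JSON document (here an inline sample) into Python dicts
raw = '[{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]'
records = json.loads(raw)

# Load the records into a database table using named placeholders
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO items (id, name) VALUES (:id, :name)", records)
conn.commit()

rows = conn.execute("SELECT id, name FROM items ORDER BY id").fetchall()
print(rows)  # [(1, 'alpha'), (2, 'beta')]
```

Named placeholders let `executemany` consume the parsed dicts directly, so the JSON keys map to columns without manual unpacking.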
- Generated various graphical capacity-planning reports using Python packages such as NumPy and Matplotlib
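A minimal sketch of such a report with synthetic capacity data; the usage figures, capacity limit, and file name are illustrative assumptions:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical daily usage trending toward a fixed capacity limit
days = np.arange(1, 31)
usage = 50 + 1.5 * days + np.random.default_rng(0).normal(0, 3, days.size)

fig, ax = plt.subplots()
ax.plot(days, usage, label="usage (GB)")
ax.axhline(100, color="red", linestyle="--", label="capacity limit")
ax.set_xlabel("day of month")
ax.set_ylabel("GB")
ax.set_title("Capacity planning report")
ax.legend()
fig.savefig("capacity_report.png")
```

Saving to a file under the `Agg` backend makes the same script usable from a cron job or batch scheduler, with no interactive session required.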
- Analyzed generated logs and predicted/forecast the next occurrence of events using various Python libraries
- Created Autosys batch processes to fully automate the model's selection of the latest, best-fitting bond for each market
- Created a framework using Plotly, Dash, and Flask for visualizing trends and understanding patterns in each market from historical data
- Used Python APIs to extract daily data from multiple vendors
- Used Spark and Spark SQL for data integration and manipulation; worked on a POC to create a Docker image on Azure for running the model
- Environment: Python, PySpark, Spark SQL, Plotly, Dash, Flask, Postman, Microsoft Azure, Autosys, Docker