Data Analytics and Business Intelligence
Senior Analytics Leader with 15+ years in Banking and Insurance. I secured $6M in NPV savings through Fintech cost-benefit analysis and reclaimed 3 FTE capacity by automating high-risk manual applications. I retired 48 legacy jobs to streamline workflows while ensuring 100% accuracy in RI-B, RC-C, and RC-O regulatory reporting. By building executive Tableau dashboards and partnering cross-functionally, I translate complex risk data into actionable strategy and regional productivity.
Bellevue University | In Progress
U.P. Technical University, India
SAS Base 9 Certified | ABA Certification in Deposit Compliance
Duration: Sep 2022 - Present
Directed the streamlining and management of critical regulatory and Board of Directors reporting, ensuring 100% adherence to accuracy standards and timely submission for compliance. Managed business requirements and testing for large-scale data projects, ensuring all decision system changes adhered to strict change control procedures and corporate governance. Engineered a sophisticated risk monitoring framework using all-lines data to establish automated KRI thresholds, directly identifying million-dollar borrower exposures. Pioneered a strategic automation initiative that retired high-risk user-defined applications (UDAs), slashing operational risk and reclaiming 3 FTE in annual team capacity.
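The automated KRI threshold monitoring described above can be sketched in a few lines. This is an illustrative Python sketch, not the original implementation (which the rest of this profile suggests was SAS-based); the field names, the $1M limit, and the sample data are all assumptions.

```python
# Illustrative KRI threshold check: aggregate all-lines exposure per borrower
# and flag any borrower whose total breaches an assumed limit.
from collections import defaultdict

KRI_EXPOSURE_LIMIT = 1_000_000  # assumed single-borrower exposure threshold

def flag_exposures(records, limit=KRI_EXPOSURE_LIMIT):
    """Sum exposure across all lines per borrower; return breaches."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["borrower_id"]] += rec["exposure"]
    return {b: amt for b, amt in totals.items() if amt >= limit}

# Hypothetical data: B001's combined exposure crosses the assumed limit.
sample = [
    {"borrower_id": "B001", "exposure": 600_000.0},
    {"borrower_id": "B001", "exposure": 450_000.0},
    {"borrower_id": "B002", "exposure": 200_000.0},
]
print(flag_exposures(sample))  # only B001 is flagged
```

In practice a check like this would run on a schedule against the warehouse, with the flagged output feeding the scorecard or alerting layer.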
Duration: Jan 2017 - Sep 2022 | Role: SAS Consultant and Lead
Extracted raw data from sources including data marts, Netezza, Oracle, DB2, and mainframes. Planned and tested General-Purpose Marts against the Bank Data Warehouse in Snowflake and Oracle. Prepared data and brought it into a standard format before analysis. Provided production support, fixing failed jobs and ensuring SLAs were met. Resolved incidents raised by users. Fine-tuned jobs to increase system productivity. Managed weekend activities supporting backup of production systems. Documented all processes created and their technical execution flows. Gathered new requirements for different modules and planned enhancement releases.
Duration: Apr 2016 - Dec 2016 | Team Size: 10 | Role: Technical Lead
Interacted with clients to ensure smooth execution of development (enhancements), maintenance, and production support work. Designed, developed, and implemented SAS reports and security. Troubleshot performance issues and optimized SAS code in production. Performed SAS FMS, SAS IFS, and SAS DI administration tasks. Migrated SAS code across the Dev, Test, and Production environments. Monitored SAS server health, space, and performance. Resolved SAS platform and client issues. Handled user creation and access control in the SAS metadata server. Analyzed issues and provided fixes and resolutions. Created change requests for production code deployment and ensured user issues and incidents were resolved within SLA.
Duration: Oct 2015 - Apr 2016 | Team Size: 12 | Role: Technical Lead
CDH (Controlling Data Warehouse for GTB GY, CB&S GY, and PBC JV GMC) was developed to meet diversified Management Information needs. Responsibility shifted from Finance to GTO (an IT-managed application), requiring compliance with IT policies. Owned the wider organizational implications of any business change. Defined the business vision for the project with the client and monitored progress against it. Contributed to key requirements, design, and review sessions. Ensured collaboration across stakeholder business areas. Prepared summary reports and interacted with business users for requirement gathering. Owned overall sprint and iteration outcomes. Conducted project risk identification and mitigation planning. Managed the team and assigned tasks to offshore team members.
Duration: Sep 2012 - Oct 2015 | Team Size: 5
The Shared Business Intelligence system is part of Business Insurance BI, providing services for developing simple-to-complex reports and maintaining data warehouses. Interacted with business users for requirement gathering. Validated data being sent to analysts using SAS. Prepared summary reports. Conducted data-reconciliation reviews with portfolio teams. Tested new logic in SAS and provided sign-off to the IT team on its implementation. Developed SAS macros and procedures for program optimization. Maintained existing code and supported reporting functions with ad-hoc reporting capabilities.
Duration: Sep 2012 - Oct 2015 | Role: Sr. SAS Developer
Travelers Insurance, a 150-year-old firm among the world's top non-life insurers, provides coverage across property & liability, personal & commercial lines, and reinsurance. Developed and deployed automated KPI/KRI scorecards and "Large Loss" reporting to isolate claims outliers. Validated data being sent to analysts using SAS. Tested new logic in SAS and provided sign-off to the IT team on its implementation. Applied mathematical and statistical techniques to complex data mining, directly influencing customer risk scores. Developed SAS macros and procedures for program optimization. Maintained existing code and supported reporting functions.
While equity markets maintain a veneer of optimism, a synthesis of credit and geopolitical data suggests we are entering a period of significant structural fragility. For risk professionals in banking and insurance, three specific signals demand immediate attention:
High-yield corporate bonds have recently broken above their long-term accumulated congestion zones. This technical breach suggests that the market is finally repricing for a “higher-for-longer” reality. When junk yields escape these consolidation zones, it often precedes a sharp tightening of corporate credit, putting immense pressure on firms that relied on low-cost debt for P&L optimization.
The auto subprime sector is currently acting as the "canary in the coal mine." Delinquency rates have surged past 2008-era peaks, driven by inflated vehicle valuations and high interest rates. From a CX strategy perspective, this represents a total breakdown in the Customer Effort Score (CES); consumers are reaching a breaking point that will likely spill over into broader retail banking products.
Ongoing global conflicts continue to act as a persistent inflationary floor. Beyond the humanitarian cost, the “War Premium” complicates central bank efforts to ease rates. This volatility necessitates more rigorous strategic hypothesis testing and NPV modeling to ensure capital is protected against sudden supply chain shocks.
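The NPV modeling mentioned here can be sketched in a few lines. This is an illustrative Python sketch; the cash flows, the 8% discount rate, and the 30% supply-shock haircut are hypothetical figures, not data from the analysis above.

```python
# Minimal NPV sketch: compare a baseline cash-flow projection against a
# supply-shock scenario. All figures below are hypothetical.

def npv(rate, cashflows):
    """Net present value of cashflows[t] received at the end of year t+1."""
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cashflows))

baseline = [500_000, 550_000, 600_000]    # projected annual cash flows
shocked = [cf * 0.7 for cf in baseline]   # 30% haircut from a supply shock
rate = 0.08                               # assumed discount rate

base_npv = npv(rate, baseline)
shock_npv = npv(rate, shocked)
print(f"Baseline NPV: {base_npv:,.0f}  Shocked NPV: {shock_npv:,.0f}")
```

Running both scenarios side by side makes the capital-at-risk under a shock explicit, which is the point of the hypothesis-testing discipline described above.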
Strategic Takeaway: We are moving from a regime of “cheap growth” to one of “rigorous defense.” Success in the current climate is defined by the ability to transition from reactive reporting to predictive risk frameworks that identify these anomalies before they manifest as realized losses on the balance sheet.
In this post, we'll explore the fundamentals of ETL pipeline optimization and how to achieve a 40% runtime reduction.
ETL (Extract, Transform, Load) is the backbone of modern data engineering. Let’s break down the optimization strategies:
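As one illustrative example of such a strategy (not necessarily those covered in the original post), streaming rows through the transform/load stages in fixed-size batches avoids materializing the whole extract in memory. The schema and batch size below are assumptions.

```python
# Illustrative ETL optimization sketch: process rows in fixed-size batches,
# letting the transform step also filter (return None to drop a row).

def etl_chunked(rows, transform, load, chunk_size=10_000):
    """Apply transform to each row and hand results to load in batches."""
    batch = []
    for row in rows:
        out = transform(row)
        if out is not None:        # None means the row was filtered out
            batch.append(out)
        if len(batch) >= chunk_size:
            load(batch)
            batch = []             # start a fresh batch; load owns the old one
    if batch:                      # flush the final partial batch
        load(batch)

# Hypothetical usage: keep only positive amounts, loading two rows at a time.
sink = []
rows = [{"amt": 1}, {"amt": -1}, {"amt": 2}, {"amt": 3}]
etl_chunked(rows, lambda r: r if r["amt"] > 0 else None,
            lambda b: sink.extend(b), chunk_size=2)
print(sink)  # the three positive-amount rows
```

Batching keeps peak memory flat regardless of extract size and lets the load step use bulk inserts, which is typically where the runtime savings come from.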
By implementing these strategies at Citigroup, we achieved:
Effective ETL optimization requires:
Stay tuned for more insights on data engineering best practices!