You're an expert in creating AI engineer, product developer, data science, advanced analytics, machine learning and consulting resumes and CVs. I'm looking for roles focused on senior AI engineering, product development, consulting, problem-solving, machine learning, advanced analytics, large language models, agentic AI and client engagement. Use high-level technical terminology like Medallion Architecture, OneLake, RAG (Retrieval-Augmented Generation), Agentic Workflows, Claude Skills and MLOps. Create an ATS-style CV and also a single-slide CV in PowerPoint. Please give it the spin of an experienced AI engineer.
Please use the information below to generate a good CV for me, paraphrased to align with an AI engineer role. Pick out the most important, recent and relevant details, and build the best possible CV with only the most impactful points. Please hide any client information, PII and sensitive information. Make the CV impactful and emphasize numbers wherever possible. Thanks.
Mayank Mahawar is a dedicated analytics professional with extensive hands-on experience delivering analytical value by building advanced machine learning and deep learning models across the insurance value chain. His professional interests include implementing and researching novel algorithms, developing automated data pipelines, operationalizing ML workflows, continuous improvement, and designing efficient data science solutions that map business opportunities to value with stakeholders.
My educational qualifications are:
- I hold a B.Tech in Mechanical Engineering from IIT Delhi with a GPA of 7.618/10; the 4-year degree was completed in May 2019.
- I completed my Senior Secondary schooling at Macro Vision Academy, Burhanpur, Madhya Pradesh with 92.6% in 2015.
- I completed my High School at DAV Public School, Neeljay, Maharashtra with a CGPA of 9.8/10 in 2013.
As for work experience, I'll list the most recent roles first and older ones later:
- From Jun 2025 to present, I'm leading a team of 2, working collaboratively alongside other teams to develop products and solutions that improve the claims workflow:
  - Motor Fraud Detection: developed an end-to-end fraud model in Microsoft Fabric, using Synapse Data Science, Lakehouses and MLflow for experiment tracking.
  - PI Propensity & Severity Modelling: built models to predict the likelihood and financial severity of Personal Injury claims, using Fabric Notebooks and Semantic Link to connect ML outputs to Power BI for business stakeholders.
  - Unstructured Data Analysis: performed deep-dive analysis on claim notes and medical document reviews using Fabric's integration with Azure OpenAI.
  - Document Fraud: implemented automated checks for document tampering/anomalies within the Fabric pipeline.
  - Agentic AI Exploration: researched and prototyped agentic AI workflows (e.g., using LangGraph or Semantic Kernel inside Fabric) to automate the claims adjuster reasoning process, specifically how agents can autonomously triage medical reports and flag inconsistencies. The exploration also covered human-in-the-loop agentic workflows: fully autonomous AI is rare in insurance due to regulation, so I built agents that assist a human adjuster, which keeps the work production-ready.
  - Built agentic systems for smart entity resolution across multiple external data sources, designed workflows to speed up claims reviews using explainable AI (Shapley values via SHAP and Shapash), and delivered impactful, deeper-level reporting with continuous stakeholder engagement through code reviews, playbacks and insight reviews.
  - Piloted claim note summarisation using open-source LLMs via Hugging Face Inference as a capability demonstration, and ensured the proof of concept was PII-safe using Presidio.
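The human-in-the-loop pattern above can be sketched as a routing step: the agent extracts findings, but any flag escalates the claim to a human adjuster instead of an autonomous decision. This is a minimal illustrative sketch; the function and field names (`triage_medical_report`, `date_inconsistency`, etc.) are hypothetical, not the actual production schema.

```python
from dataclasses import dataclass, field

@dataclass
class TriageResult:
    claim_id: str
    flags: list = field(default_factory=list)
    route: str = "auto"  # "auto" or "human_review"

def triage_medical_report(claim_id, findings):
    """Route a claim based on agent-extracted findings.

    `findings` is a dict produced by an upstream LLM extraction step,
    e.g. {"injury_severity": "minor", "date_inconsistency": True}.
    Anything flagged is escalated to a human adjuster rather than
    decided autonomously.
    """
    result = TriageResult(claim_id=claim_id)
    if findings.get("date_inconsistency"):
        result.flags.append("treatment dates conflict with accident date")
    if findings.get("injury_severity") == "severe":
        result.flags.append("severe injury: reserve review needed")
    if findings.get("provider_not_on_panel"):
        result.flags.append("unrecognised medical provider")
    # Human-in-the-loop rule: any flag means a person makes the call.
    if result.flags:
        result.route = "human_review"
    return result
```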
As a team, we delivered around 350K GBP of benefits in the testing phase alone for Phase I, at an ROI of 4-5x for the initial phase, with more benefits to come as adoption grows with SME confidence. One of the main challenges was pseudo-anonymisation of PII with backwards compatibility between SAS (on the data platform) and Python (in MS Fabric on the analytics platform); because the analytics platform was still in development, we had to ensure it was always PII-safe. This was achieved using SAS macros to anonymise PII via deterministic encryption (AES/RSA) with a salt. I also did LLM-based reporting rationalisation, reducing the number of reports and improving the efficacy of management insights and business reporting while retaining the ability to produce on-demand ad-hoc reports for specialised needs. For example, when we saw an increase in TPD spend in 2024-2025 compared to previous years, we factored in COVID effects, inflation, auto-CPI, UK repair inflation and other risk factors to give a holistic view of the drivers, so that management and operations could act on it in a prescriptive manner.
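The key property of the pseudo-anonymisation described above is determinism: the same PII value must always map to the same token so pseudonymised keys stay joinable across SAS and Python. A minimal stdlib sketch of that property, using a keyed HMAC instead of the AES/RSA macros the project actually used (the salt value and `pseudonymise` name are placeholders):

```python
import hmac
import hashlib

# Placeholder secret; in practice this would be held in a key vault,
# never in source control, and shared by both SAS and Python sides.
SECRET_SALT = b"rotate-me-outside-source-control"

def pseudonymise(value: str) -> str:
    """Deterministically map a PII value to a stable token.

    Same input -> same token, so pseudonymised keys remain joinable
    across platforms, while the raw value never leaves the source system.
    Inputs are normalised first so trivial formatting differences
    (case, surrounding spaces) do not break joins.
    """
    normalised = value.strip().lower().encode("utf-8")
    digest = hmac.new(SECRET_SALT, normalised, hashlib.sha256).hexdigest()
    return f"PII_{digest[:16]}"
```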
- From Feb 2025 to Jun 2025, we focused on diagnostics for a major UK personal lines insurer on all fronts, from architecture design to enabling analytics, AI and new-age LLM-powered agentic systems. The task was an intricate assessment of the current state and proposal of interventions to improve the overall claims system, with a unique prioritisation approach to identify high-value use cases while circumventing the challenges of the existing tech stack of SAS and SFTP data transfers. The main goal was to enable AI and analytics despite the insurer's ongoing transition to cloud systems and an IT overhaul related to entity separation. We managed these challenges and timelines, lining up projects in a flexible and robust manner to improve the workflow, ensure ample work, and transition smoothly into new-age agentic systems. This involved AI engineering on Microsoft Fabric and product development knowledge to manage the work.
- In Jan 2025, I focused on using GenAI to create synthetic data to accelerate product development, including the Claims IQ product, a claims lifecycle management project.
- From August 2024 to December 2024, I worked on a data science project with Brit Insurance UK to develop a triaging app that helps underwriters triage better via a multi-layered, drill-down traffic-light scoring system. The system comprises a rules engine, LLM retrieval-augmented generation, and analytics integrated into the frontend. My main focus was working with data scientists, MLOps and engineers to weave underwriter and business requirements into the application. The first phase, the rules engine, is now in production, and I am currently working on LLM-based underwriter notes and proactive questionnaire generation.
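The multi-layered traffic-light idea above can be sketched as a small rules engine: each layer gets its own red/amber/green rating from thresholds, and the overall light is the worst layer, preserving the drill-down. The layer names and thresholds here are invented for illustration, not the real underwriting rules.

```python
def score_submission(metrics, thresholds):
    """Multi-layered traffic-light scoring.

    `metrics` maps layer name -> risk value; `thresholds` maps layer
    name -> (amber_cutoff, red_cutoff). Each layer is rated
    independently, and the overall light is the worst layer, so an
    underwriter can drill down from the headline to the driver.
    """
    ORDER = {"green": 0, "amber": 1, "red": 2}
    layers = {}
    for layer, value in metrics.items():
        amber, red = thresholds[layer]
        layers[layer] = "red" if value >= red else "amber" if value >= amber else "green"
    overall = max(layers.values(), key=ORDER.get)
    return {"overall": overall, "layers": layers}
```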
- In September 2024, I pitched analytics and data science solutions to Howden Broker: submission triage, a bordereau management solution, cross-sell analytics and management, plus Databricks and Azure AI capabilities. The pitch focused on a partnered, phased approach with a comprehensive opportunity diagnostic and market analysis.
- During Jun 2024 to Jul 2024, I worked with Hiscox to deliver an in-house submission triaging product called Xtrakto, which uses GenAI-based data extraction and summarisation coupled with classification modelling for lead prioritisation and decline; it is currently in production. I was also involved in pitching Tesco on a request for proposal covering claims data, analytics and AI strategy. I pitched Santander on the EXL GenAI modernisation accelerator, which uses GenAI to convert SAS SQL to Spark, and covered the roadmap, effective lineage capture, service-as-a-product vs support and licensing models, and connections to other activities like data migration to Delta Lake. I also pitched Highmark Health on our in-house GenAI-based data lineage and profiling tool, which can intricately interrogate different data sources, and pitched AIB on extracting Teradata SQL lineage and increasing its visibility using our data lineage tool.
- During Sep 2023 to May 2024, I worked on GenAI-based multi-lingual code translation using LLMs and data lineage. This found significant use in accelerating legacy code translation and data and platform modernisation efforts. A good example is moving banking SAS code to Databricks, where code can be optimised to use Databricks SQL and PySpark, with higher-order functions implemented in Python and endpoints served via MLflow. The effort won a 200K GBP project for a 3-month POC, with >2M GBP of enterprise-wide licensing projects over 3 years. This was made possible by LoRA and QLoRA finetuning of LLMs with careful prompt engineering, accompanied by significant market research, analysis and a proven value-delivery use case. It is sold as a product as opposed to a factory model, and has been observed to improve code translation FTE efficiency by 50%. The GenAI implementation is as much integration as it is development.
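The core idea behind the LoRA finetuning mentioned above is that instead of updating a full weight matrix W, you train a low-rank pair (A, B) and form the effective weight as W + (alpha/r)·BA. A minimal NumPy sketch of that arithmetic (the function name is illustrative; real finetuning would use a library such as PEFT):

```python
import numpy as np

def lora_effective_weight(W, A, B, alpha):
    """LoRA effective weight.

    W is the frozen base weight (d_out x d_in). Instead of finetuning
    W, train a low-rank update B (d_out x r) @ A (r x d_in) with
    r << min(d_out, d_in); at inference the effective weight is
    W + (alpha / r) * B @ A, so the base model is untouched.
    """
    r = A.shape[0]
    return W + (alpha / r) * (B @ A)
```

With rank r = 2 on an 8x16 layer, the trainable parameters drop from 128 to 48, which is the efficiency that made finetuning several task-specific translation adapters practical.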
- I'm actively involved in ideating GenAI opportunities and solutions in the insurance, banking, broking and fintech domains. Opportunities identified include regulatory reporting, cross-sell, broker assist, data audit, pricing optimisation and a fraud framework, evaluated using a business model canvas covering market testing, revenue potential, competitive edge, ROI and cost estimates. This included a pitch to CVS and McKesson for a GenAI-based data audit tool.
- I'm also active in shaping the insurance vertical strategy for market growth for EXL UK, with opportunities spanning analytics, data, right-shoring and other solutioning.
- From Jan 2024 to Mar 2024, I designed a product bundle for comprehensive data solutions involving data enrichment, cross-sell, motor next-best-action, etc. The design was done in Figma, followed by API and model pipeline design and partial implementation. The aim was to increase revenue using a product-as-a-service model.
- In Dec 2023, I worked in London, UK for Brit Insurance to optimise the outward reinsurance process by identifying opportunities for automation and careful management and optimisation. This identified 75.3% turnaround time savings for contract processing, resulting in improved efficiency and customer satisfaction.
- In Nov 2023, I worked on a COBOL migration tool to help banks manage legacy code. We analysed around 10 million lines of COBOL code using network analysis, clustering and network algorithms to identify code families and isolate auxiliary dependencies, in order to propose a better migration and maintenance strategy for the LifePro tool and Prudential Bank.
- In October 2023, I developed novel entity matching algorithms using open-source LLMs for report rationalisation, identifying different KPIs across data sources and linking them to reduce the number of reports from 354 to 170, a 52% reduction.
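The matching skeleton behind that rationalisation can be sketched in a few lines: compare every KPI name in one report inventory against another and keep pairs above a similarity threshold. Here I swap in simple token-overlap (Jaccard) similarity purely for illustration; the actual work used LLM-based matching, and the names below are invented examples.

```python
def jaccard(a: str, b: str) -> float:
    """Token-overlap similarity between two KPI names (0..1)."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def match_kpis(kpis_a, kpis_b, threshold=0.5):
    """Link KPI names across two report inventories.

    Each KPI in `kpis_a` is paired with its most similar KPI in
    `kpis_b`; pairs above the threshold are treated as the same
    underlying metric, so their reports can be consolidated.
    """
    matches = []
    for a in kpis_a:
        best = max(kpis_b, key=lambda b: jaccard(a, b), default=None)
        if best is not None and jaccard(a, best) >= threshold:
            matches.append((a, best))
    return matches
```

Replacing `jaccard` with an embedding-based cosine similarity from an LLM is what lets semantically equivalent but differently worded KPIs ("claims ratio" vs "loss ratio") link up.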
- From April 2023 to Sep 2023, I worked for Hiscox on new business automation; before that, in Feb 2023, I worked on product development.
- In Jan 2023, I did data science enablement, model management and platform architecture design for Canopius.
- From Jul 2022 to Dec 2022, I worked on the Ageas image analytics project, with an estimated £427,870 top-line and £1,101,166 bottom-line impact through pricing adjustment and demand elasticity curve analysis.
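For context on the elasticity analysis mentioned above, the standard arc (midpoint) formula compares the percentage change in quantity demanded against the percentage change in price; this is a generic textbook sketch, not the project's actual model.

```python
def arc_elasticity(q0, q1, p0, p1):
    """Arc (midpoint) price elasticity of demand.

    E = (dQ / Q_avg) / (dP / P_avg), using midpoints so the result is
    the same whichever direction the price moves. |E| > 1 means demand
    is elastic: raising price loses more volume than it gains in margin.
    """
    dq = (q1 - q0) / ((q0 + q1) / 2)
    dp = (p1 - p0) / ((p0 + p1) / 2)
    return dq / dp
```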
- Analysed FCA Consumer Duty regulation and adherence, with supporting market research.
- Wrote a motor fraud white paper, published as insurance thought leadership.
- Worked on call analytics on AWS for the AA, deriving insights from unstructured data via speech and text analytics.
- From Jan 2022 to Jul 2022, I worked on segmentation image models to analyse property satellite imagery and identify property features for data augmentation and, eventually, live pricing adjustments.
- Before that, I worked on image models using morphological operations to identify roof features like roof type, staining, ponding, etc.
- Leading UK Personal Lines Insurer, Motor Fraud Propensity and Third-Party Assist Model: Developed and deployed end-to-end fraud capture pipelines on Databricks with integrated NLP and external databases, leading to an estimated £700K p.a. savings and 38% uplift. Implemented a novel test-and-learn methodology with continuous SME feedback to evaluate 20+ external databases for usability in the fraud model, reducing false positives by 70%. Performed real-time analytics on third-party claim differentials, aiding better offers and TP capture and leading to lower indemnity spend and overall better customer satisfaction.
- Leading Fortune 500 US Personal Lines Insurer, Continuous Household Risk Model: Developed multiple segmentation and classification deep learning models to identify roof type, shape, stains, tarp, etc. from satellite imagery of houses for a continuous risk score. Combined Mask R-CNN segmentation with an ensemble of EfficientNet-B7 classification models to create a hybrid model with a significant uplift to house risk models. Created a shadow-detection pipeline that removes shadows cast by trees and structures, giving a significant lift to the slight-stain detection, ponding and missing-shingle models.
- Major UK Personal Lines Insurer, Data & Analytics Diagnostics: Performed a digital diagnostic and assessment of the insurer's current data and analytics landscape, identifying gaps and performing a cost-benefit analysis and ROI calculation. Identified critical areas for analytical intervention to reduce indemnity spend in the claims function across 4 LOBs, with an expected benefit of £3-4M p.a. plus process optimisation.
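The hybrid segmentation-plus-ensemble pattern above can be sketched simply: average the probability vectors from the classifier ensemble, then use the segmentation mask to weight by how much roof is actually visible. This is a toy sketch of the combination step only; the gating scheme and function name are illustrative assumptions, not the production model.

```python
import numpy as np

def ensemble_roof_scores(prob_vectors, seg_mask):
    """Combine an ensemble of classifier outputs with a segmentation mask.

    `prob_vectors` is a list of per-model class-probability arrays
    (e.g. one per EfficientNet in the ensemble); `seg_mask` is a binary
    roof mask from the segmentation model. The ensemble is averaged,
    then down-weighted when little roof is visible in the tile.
    """
    avg = np.mean(np.stack(prob_vectors), axis=0)  # average across models
    coverage = float(seg_mask.mean())              # fraction of roof pixels
    return avg * coverage                          # gate by visible roof
```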
- In 2019, I developed a product risk-adjusted customer cross-sell model for a top commercial Property and Casualty insurer in the UK. Ingested data from Dun & Bradstreet and multiple other relational data sources using a combination of SQL Server and pyODBC API calls, with routine data quality reporting. Developed automated ETL processes with sequential transforms such as Weight of Evidence, Information Value and auto-encoders for data cleaning and feature reduction. Delivered LightGBM models using Python and SQL to identify cross-sell opportunities, and worked with the client to increase value through optimised marketing that factors in risk appetite. Implemented an optimisation framework using hyperopt and skopt, reducing modelling time by ~5x, resulting in increased batch frequency and faster overall turnaround time. Formulated a geography market-sizing analysis to identify high-worth cross-sell opportunities of $120M in commercial insurance products across UKI, Europe and Australia.
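The Weight of Evidence transform mentioned above encodes each bin of a feature as ln(%good / %bad), and summing (%good − %bad)·WoE over bins gives the Information Value used for feature reduction. A minimal sketch (real pipelines, e.g. OptBinning, also handle zero-count bins with smoothing, which this omits):

```python
import math

def weight_of_evidence(goods, bads):
    """WoE per bin and total Information Value.

    `goods` and `bads` are per-bin counts of good/bad outcomes.
    WoE_i = ln(%good_i / %bad_i); IV = sum((%good_i - %bad_i) * WoE_i).
    Assumes every bin has non-zero counts (no smoothing applied).
    """
    tg, tb = sum(goods), sum(bads)
    woe, iv = [], 0.0
    for g, b in zip(goods, bads):
        pg, pb = g / tg, b / tb
        w = math.log(pg / pb)
        woe.append(w)
        iv += (pg - pb) * w
    return woe, iv
```

A feature whose bins separate goods from bads strongly gets a high IV and survives feature reduction; one with near-zero IV carries no signal and is dropped.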
Key skills: Classification, Regression, Hyper-parameter Tuning, Image Processing, NLP, Transfer Learning, Segmentation, Recommendation Systems, Ensembles, GLM, LGBM, Clustering, Feature Reduction
Technical skills: Azure, Spark, SageMaker, Python, SQL, MATLAB
Packages: SHAP, LIME, LightGBM, Hyperopt, TensorFlow, OpenCV, spaCy, NLTK, OptBinning, MLflow, SciPy, Pandas, Requests, Prophet