Over 10 years of software development life cycle experience in system analysis, design, development, implementation, and testing of data warehouse and data integration applications in domains such as banking, pharmaceuticals, and insurance.
Over 10 years of professional IT experience as a data engineer specializing in database development, ETL development, data modeling, report development, and big data technologies.
Proficient in data integration and data warehousing using Informatica PowerCenter, AWS Glue, SQL Server Integration Services (SSIS).
Chartered Financial Analyst with expertise in financial risk management, banking, financial services, and IT services.
Skilled in designing business intelligence solutions with Microsoft SQL Server, utilizing MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS), and SQL Server Analysis Services (SSAS).
Extensive experience with Informatica PowerCenter and Informatica Data Quality (IDQ) for ETL processes, including extraction, transformation, and cleansing of data from various sources.
Proficient in Amazon Web Services (AWS) cloud services such as EC2, S3, RDS, EMR, VPC, IAM, and Redshift, as well as the Snowflake cloud data warehouse.
Strong expertise in relational database systems, including Oracle, MS SQL Server, Teradata, MS Access, and DB2, along with proficiency in SQL, PL/SQL, and SQL PLUS.
Hands-on experience with the Snowflake cloud data warehouse on AWS and S3 buckets for integrating data from multiple source systems.
Extensive experience in integrating Informatica Data Quality (IDQ) with Informatica PowerCenter.
Proficient in data mining solutions and generating data visualizations using Tableau, Power BI, and Alteryx.
Well-versed in the Cloudera ecosystem, including HDFS, Hive, Sqoop, HBase, Kafka, and Spark.
Utilized Flume, Kafka, and Spark Streaming to ingest real-time data into HDFS.
Experience with AWS Data Pipeline for configuring data loads from S3 into Redshift.
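The S3-to-Redshift loads mentioned above are typically driven by Redshift's COPY command. A minimal sketch of building such a statement in Python follows; the table name, S3 path, and IAM role ARN are placeholders, not values from this résumé.

```python
def build_copy_sql(table: str, s3_path: str, iam_role: str, fmt: str = "CSV") -> str:
    """Build a Redshift COPY statement that bulk-loads data from S3.

    The statement is executed against Redshift via any SQL client;
    IAM_ROLE grants Redshift permission to read the S3 objects.
    """
    return (
        f"COPY {table} FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        f"FORMAT AS {fmt};"
    )


# Hypothetical example values for illustration only.
sql = build_copy_sql(
    table="sales_staging",
    s3_path="s3://example-bucket/sales/2023/",
    iam_role="arn:aws:iam::123456789012:role/RedshiftLoadRole",
)
```

In practice a scheduler (AWS Data Pipeline, as in the bullet above, or Glue/Airflow) runs this statement on a cadence after new files land in S3.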
Proficient in migrating data from Teradata to Snowflake on AWS using Python and BI tools such as Alteryx.
Skilled in building and architecting data pipelines, end-to-end ETL, and ELT processes for data ingestion and transformation.
Experience in moving data between GCP and Azure using Azure Data Factory.
Developed Python scripts for parsing flat files, CSV, XML, and JSON files and loading data into a data warehouse.
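The multi-format parsing described in the bullet above can be sketched with Python's standard library: each format is normalized into a list of row dictionaries ready for a warehouse load. Function names and the sample record layout are illustrative assumptions, not the author's actual scripts.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET


def parse_csv(text: str) -> list[dict]:
    # DictReader uses the header row as column names.
    return list(csv.DictReader(io.StringIO(text)))


def parse_json(text: str) -> list[dict]:
    # A top-level JSON array maps directly to rows;
    # a single object becomes a one-row list.
    data = json.loads(text)
    return data if isinstance(data, list) else [data]


def parse_xml(text: str, record_tag: str) -> list[dict]:
    # Each element named `record_tag` is one row;
    # its child elements become column values.
    root = ET.fromstring(text)
    return [
        {child.tag: child.text for child in record}
        for record in root.iter(record_tag)
    ]
```

Flat (fixed-width) files would be handled similarly with slicing on known column offsets; the normalized rows are then bulk-inserted into staging tables.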
Developed automated migration scripts using Unix shell scripting, Python, Oracle/Teradata SQL, and Teradata macros.
kchanikya1998@gmail.com
(205) 552-2779