About the position
We are seeking a highly experienced Data Architect with strong expertise in Azure, .NET/C#, modern data warehousing, and distributed data processing. The ideal candidate will be responsible for designing scalable data architectures, data models, and end-to-end data pipelines supporting enterprise analytics, reporting, and data-driven applications.
Key Responsibilities
- Design and architect end-to-end data platforms leveraging Azure cloud services, big data frameworks, and modern data warehouse technologies.
- Develop enterprise-grade data models using 3NF, star schema, and snowflake structures.
- Define data architecture standards, best practices, and governance guidelines.
- Build scalable ETL/ELT pipelines using tools like Azure Data Factory (ADF), Informatica, Talend, dbt, and cloud-native services.
- Partner with engineering teams to develop data ingestion, transformation, and processing workflows in C#, .NET, Python, and SQL.
- Implement distributed data processing frameworks leveraging Spark, Databricks, Kafka, Hadoop, or Delta Lake.
- Design and optimize cloud-based modern data warehousing solutions (e.g., Snowflake, Databricks, Synapse, BigQuery).
- Ensure data quality, security, lineage, and metadata management across all data layers.
- Collaborate with business stakeholders, data engineers, solution architects, and analytics teams to ensure scalable and accurate data delivery.
- Evaluate and recommend new technologies, tools, and patterns to enhance the data ecosystem.
Requirements
Cloud & Programming
- Strong experience with Azure data and storage services (Azure Data Lake, Synapse, Databricks, Azure SQL, ADF).
- Programming expertise in C#, .NET, Python, and SQL.
Data Modeling
- Expertise in:
  - 3rd Normal Form (3NF)
  - Star schema
  - Snowflake schema
- Hands-on experience designing conceptual, logical, and physical data models.
- ETL/ELT & Data Integration
- ADF
- Informatica
- Talend
- dbt
- Cloud-native transformation frameworks (Databricks SQL/PySpark)
Modern Data Warehousing & Big Data
- Experience with modern data warehouse platforms:
  - Snowflake
  - Databricks (Delta Lake)
  - Azure Synapse Analytics
  - Google BigQuery
- Exposure to big data ecosystems:
  - Spark
  - Kafka
  - Delta Lake
  - Hadoop/HDFS
Data Processing & Architecture
- Experience with batch/streaming data pipelines.
- Knowledge of data governance, cataloging, and master data management concepts.
- Understanding of distributed computing principles and performance tuning.
Preferred Qualifications
- Certifications in Azure Data Engineering/Architecture (e.g., DP-203, AZ-305).
- Experience working with enterprise-scale data platforms in BFSI (banking, financial services, and insurance) or other regulated industries.
- Exposure to BI/reporting tools such as Power BI or Tableau.
Education
- Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field.
Desired Skills:
- Azure Data Lake
- Databricks
- Azure SQL
- ADF
- C#
- .NET
- Python