About the position
Our client is seeking an experienced Data Architect with strong expertise in modern cloud data platforms, including Azure Databricks, AWS, Snowflake, Azure Synapse, and Azure Data Factory (ADF). The ideal candidate will have a solid background in presales, solutioning, POC (Proof of Concept) development, and direct client interaction.
This role requires a blend of technical depth, architectural thinking, communication skills, and customer-facing experience to support both delivery and business development initiatives.
Key Responsibilities
Data Architecture & Engineering
- Design and implement enterprise-grade data architectures across Azure, AWS, and hybrid environments.
- Develop scalable data pipelines and ETL/ELT solutions using Azure Databricks, ADF, Synapse, and Snowflake.
- Define data models, data flow diagrams, and reference architectures.
- Ensure best practices for data governance, data quality, and metadata management.
Cloud Platform Expertise
- Architect end-to-end data solutions on Azure (Synapse, ADF, Databricks), AWS (Redshift, Glue, S3, Lambda), and Snowflake.
- Optimise data storage, compute, and cost across cloud platforms.
- Evaluate cloud-native tools and recommend fit-for-purpose solutions.
Presales & Solutioning
- Work closely with sales teams to support presales activities, including solutioning, estimation, and proposal building.
- Conduct technical workshops, discovery sessions, and requirement analysis with prospective clients.
- Prepare and deliver compelling technical presentations, architecture diagrams, and solution roadmaps.
POCs & Innovation
- Lead and execute Proofs of Concept (POCs) to demonstrate solution feasibility.
- Build prototypes using Databricks, Snowflake, Synapse, or AWS components to validate business use cases.
- Stay updated with new features and innovations across cloud platforms.
Client Interaction & Stakeholder Management
- Engage directly with clients to understand business needs and translate them into technical architectures.
- Act as a trusted advisor to customers on data strategy, cloud adoption, and platform modernisation.
- Collaborate with cross-functional teams, including delivery, product, business analysts, and leadership.
Required Qualifications
- 10+ years of experience in data engineering, data architecture, or cloud data platforms.
- Hands-on expertise in:
  - Azure Databricks (Python/Spark/Delta Lake)
  - Azure Synapse Analytics
  - Azure Data Factory (ADF)
  - Snowflake (SQL, Snowpipe, Streams & Tasks)
  - AWS Data Stack (S3, Redshift, Glue, Lambda, EMR)
- Strong understanding of data warehousing, data lakes, lakehouse architecture, and ETL/ELT methodologies.
- Excellent SQL, Spark, and Python skills.
- Experience in presales, RFP responses, client workshops, and technical solutioning.
- Strong communication and presentation skills.
Preferred Qualifications
- Certifications in Azure / AWS / Snowflake.
- Experience leading large-scale cloud migration or modernisation projects.
- Knowledge of data governance tools (e.g., Purview, Collibra) is a plus.
Desired Skills
- Systems Analysis
- Complex Problem Solving
- Programming/Configuration
- Critical Thinking
- Time Management