Senior Azure Data Consultant

Engineering · Remote

Job description

Who We Are

Since the start, we've focused on building a collaborative environment with remote, geographically distributed teams within the United States. Trility serves clients across the United States and around the globe. While Trility is headquartered in Des Moines, Iowa, we support remote work and flexible hours.

We look for team members with the grit necessary to forge paths where none previously exist, to get back up when circumstances knock them down, to adapt to the changing needs of the client even when it is uncomfortable, and to deliver on our commitments. You respect and value people, recognize that over-communication is barely good enough, thrive on solving complex problems, have a passion for building teams, and know that delivering what a client actually values is more important than your own predispositions. You understand value propositions, love delivering value, and take pride in learning the expansive and ever-changing businesses of our clients. You are self-motivated and work relentlessly to become more today than you were yesterday.

What You Will Do

Trility Consulting is actively seeking a Senior Azure Data Consultant who is an expert in the field and a powerhouse on the keyboard. In this role, you will partner with Trility clients, taking them from “napkin to production.” You will listen to their challenges with empathy while using your unique problem-solving skills to discover the right approach and deliver value throughout the engagement.

This is a remote position with 1099 and W2 options.

  • 5+ years building and working with data services in Microsoft Azure
  • Significant experience with Azure Data Factory, Databricks, Azure Synapse, Microsoft Fabric, or similar technologies
  • Design, analyze, and document data flows, integrations, models, and transformations
  • A holistic approach to data architecture and design, including data quality, data governance, and performance optimization
  • Design and implement robust enterprise data models that support large-scale data processing and analytics while ensuring data integrity, scalability, and efficient data flow across business systems
  • Lead discussions and provide oversight on data model design across multiple departments, teams, and subject areas
  • Significant experience building end-to-end data pipelines in Spark, leveraging the Medallion architecture and engineering best practices (see the first sketch following this list)
  • A strong grounding in software engineering best practices
  • Assist with defining DataOps standards and testing frameworks for development, testing, and production deployments (see the second sketch following this list)
  • Experience working in an agile software environment
  • Employ best practices in information security for cloud-based data environments
  • Use resource provisioning tools such as Terraform, Bicep, or Azure Resource Manager (ARM)
  • Familiar with modern source code management and review processes, such as Git and pull requests, along with SaaS-based services such as GitHub, GitLab, and Atlassian Bitbucket
  • Use scripting tools such as the Azure CLI, PowerShell, Bash, Python, and sed/awk
  • Use build and release tools such as Azure DevOps, Jenkins, and GitHub Actions
  • Excellent written and verbal communication skills – comfortable presenting technical information to non-technical stakeholders
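
Two brief illustrative sketches, referenced in the list above.

First, a minimal PySpark example of the Medallion-style bronze-to-silver refinement named in the pipeline bullet. This is a sketch of the general pattern only, not a Trility deliverable; the lake paths and column names are hypothetical, and it assumes a Spark session with Delta Lake support:

    # Minimal Medallion-style bronze -> silver refinement (illustrative only).
    # Assumes Delta Lake is available; all paths and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

    # Bronze: raw events landed as-is from the source system.
    bronze = spark.read.format("delta").load("/lake/bronze/orders")

    # Silver: deduplicated, typed, validated records ready for analytics.
    silver = (
        bronze
        .dropDuplicates(["order_id"])                        # drop replayed events
        .withColumn("order_ts", F.to_timestamp("order_ts"))  # enforce types
        .filter(F.col("order_total") >= 0)                   # basic quality gate
    )

    silver.write.format("delta").mode("overwrite").save("/lake/silver/orders")

A gold layer of business-level aggregates would follow the same read-refine-write pattern.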
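
Second, for the DataOps bullet, a sketch of the kind of automated check a pipeline testing framework might codify. The refine_orders transformation and its schema are hypothetical stand-ins:

    # Illustrative pipeline unit test (pytest style); refine_orders is a
    # hypothetical stand-in for a bronze -> silver transformation.
    import pytest
    from pyspark.sql import SparkSession

    @pytest.fixture(scope="session")
    def spark():
        return SparkSession.builder.master("local[1]").appName("dataops-test").getOrCreate()

    def refine_orders(df):
        # Deduplicate and apply a basic quality gate, as in the sketch above.
        return df.dropDuplicates(["order_id"]).filter("order_total >= 0")

    def test_refine_orders_drops_duplicates_and_bad_rows(spark):
        raw = spark.createDataFrame(
            [(1, 10.0), (1, 10.0), (2, -5.0)],  # a duplicate id and a negative total
            ["order_id", "order_total"],
        )
        out = refine_orders(raw)
        assert out.count() == 1                 # only the valid, unique row survives
        assert out.first()["order_id"] == 1

Checks like this run in the CI stage of build and release tools like those named above, gating promotion to testing and production environments.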