Senior Data Engineer ~ Remote within the United States
Responsibilities:
- Data Architecture and Integration: Integrate diverse data sources by designing and implementing scalable, high-performance data pipelines and infrastructure.
- Build ETL processes using a variety of methods to facilitate efficient movement of data between systems.
- Incorporate best practices to ensure data pipelines are robust and include error handling and alerting.
- Data Modeling: Design data models that turn complex organizational data into easy-to-use data structures, ensuring seamless data flow for operational, analytics, and reporting needs.
- Use dbt or Snowflake-native capabilities to transform data and ensure data is accurate and up to date.
- Optimize data structures and queries to ensure efficient data storage and retrieval.
- Data Quality: Implement and enforce data quality standards to ensure the accuracy and reliability of educational data used in academic assessments globally.
- Collaboration: Collaborate with cross-functional teams, including analysts and business stakeholders, to understand unique data requirements for data processing or analytics needs.
- Security: Implement and maintain data security measures to protect sensitive information, ensuring compliance with international data privacy regulations.
- Snowflake Expertise: Apply deep Snowflake expertise for effective data warehousing, transformations, data sharing, and structured data management in a cloud-based environment.
- Documentation: Create and maintain comprehensive documentation for data engineering processes, systems, and workflows.
- Team Leadership: Provide technical leadership and mentorship to junior members of the data engineering team, fostering a collaborative environment dedicated to excellence in handling educational data.
- Tableau: Use Tableau to deliver meaningful, actionable data insights through impactful visualizations.
Qualifications:
- Education: Bachelor's or Master's degree in Computer Science (they’d love this!), Information Technology, or a related field, with a strong emphasis on data-related technologies.
- Experience: Significant experience (usually 5+ years) in data engineering roles, with a focus on designing and implementing scalable data solutions.
- Programming Skills: Proficiency in programming languages such as Python, Java, or Scala.
- Database Knowledge: Expertise in working with both SQL and NoSQL databases, with hands-on experience in database design and optimization. Proficiency in Snowflake is required.
- Machine Learning: Knowledge of machine learning concepts and an interest in growing in the area.
- Cloud Platforms: Experience with cloud platforms like Azure (it's what they use!), AWS, or Google Cloud Platform, and proficiency in leveraging cloud-based services.
- Version Control: Knowledge of version control systems like Git or Azure DevOps for managing the codebase.
- Problem-Solving Skills: Strong analytical and problem-solving skills to address complex data engineering challenges specific to educational data.
- Communication: Excellent communication skills to effectively collaborate with cross-functional teams and non-technical stakeholders.
- Adaptability: Ability to adapt to evolving technologies and industry best practices in the field of data engineering.
Nice to Have:
- Experience using TIBCO EBX
- Experience using Boomi or other low-code ETL tools
- Experience using Azure for data engineering tasks, e.g., Function Apps and other Azure services that can be used to build data pipelines