Description:
Responsibilities:
- Design, develop, and maintain scalable ETL pipelines to support both analytical and operational business needs.
- Work with large datasets using Spark, Databricks, and Python/Scala/Java for data transformation and processing.
- Develop and optimize data models and architectures in Snowflake and other relational databases.
- Implement data orchestration and workflow automation using Airflow or similar tools.
- Collaborate with cross-functional teams to ensure data reliability, consistency
Oct 8, 2025
From: dice.com