Description:
Data Engineer

Skills: Airflow, Spark, Databricks, Snowflake, Delta Lake, Kafka
Onsite: Glendale, CA
Full Time

Key Responsibilities:
Data Pipeline Development: Design, implement, and maintain scalable and efficient data pipelines using Apache Airflow for orchestrating workflows.
Big Data Processing: Build and optimize data workflows utilizing Apache Spark for large-scale data processing, ensuring high performance and reliability.
Cloud Data Platforms: Work with Databricks to leverage a unified analytics platform.
Apr 2, 2025
from:
dice.com