Description:
Position: Data Engineer with Google Cloud Platform (12+ years)
Location: Onsite, Sunnyvale, CA (need locals)
Mandatory Skills: Spark, Python, Scala, Google Cloud Platform, Airflow, DAG, ETL, PySpark

Job Description:
- Design, develop, and automate data processing workflows using Airflow, PySpark, and Dataproc on Google Cloud Platform.
- Develop ETL (Extract, Transform, Load) processes that handle diverse data sources and formats.
- Manage and provision Google Cloud Platform resources, including Dataproc clusters, serve
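The core of the role is orchestrating PySpark ETL jobs on Dataproc with Airflow. As an illustrative sketch only (not part of the posting), the minimal DAG below submits a PySpark job to an existing Dataproc cluster via the Google provider's DataprocSubmitJobOperator. The project ID, region, cluster name, and GCS script path are hypothetical placeholders, and Airflow 2.4+ with the apache-airflow-providers-google package is assumed.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

# Hypothetical project, region, and cluster names used purely for illustration.
PROJECT_ID = "my-gcp-project"
REGION = "us-central1"
CLUSTER_NAME = "etl-dataproc-cluster"

# Dataproc PySpark job spec pointing at an ETL script staged in GCS (placeholder path).
PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/etl_job.py"},
}

with DAG(
    dag_id="daily_etl_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submit the PySpark ETL job to the existing Dataproc cluster.
    run_etl = DataprocSubmitJobOperator(
        task_id="run_pyspark_etl",
        job=PYSPARK_JOB,
        region=REGION,
        project_id=PROJECT_ID,
    )
```

In practice such a DAG would typically also create and tear down the Dataproc cluster (or target an autoscaling one) and chain additional tasks for validation and loading, but those details depend on the specific pipeline.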
Aug 25, 2025; from: dice.com