AWS Data Engineer

Position Overview:

The NETSOL Cloud Services department is looking for an experienced big data engineer with a solid grasp of data engineering, data lake, ETL, and data warehouse (DWH) concepts. You will build data solutions using state-of-the-art technologies to acquire, ingest, and transform big data and publish it into the DWH, following best practices at each ETL stage.

Responsibilities:

  • Design and develop data applications on AWS using big data technologies to ingest, process, and analyze large disparate datasets.
  • Build robust data pipelines in the cloud using Airflow, Spark/EMR, Kinesis/Kafka, AWS Glue, Lambda, or other AWS technologies.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using SQL and AWS big data technologies.

Requirements:

  • 1–2 years of experience delivering data lake and data warehousing projects on AWS
  • ETL experience and understanding
  • SQL (Joins, Window Functions, CTEs)
  • EMR / Spark
  • Redshift
  • ETL tools – Airflow, AWS Glue, etc.
  • Python / Scala
  • Understanding of dimensional modelling and fact & dimension tables
  • AWS certifications
    • AWS Certified Machine Learning – Specialty

Nice to Have:

  • Snowflake
  • Kinesis/Kafka
  • Databricks
  • dbt
  • AI/ML

Careers@netsoltech.com

Job Details

  • Software & Web Development
  • Innovation Lab
  • Permanent
  • Lahore
  • 2
  • Experienced Professional
  • 1 year to 3 years
  • March 25, 2024
  • April 30, 2024