AWS Data Engineer

About NETSOL: NETSOL Technologies is at the forefront of providing sophisticated software and services for asset financing and leasing. Our solutions serve the automotive, equipment, banking, and lending industries through a comprehensive suite of IT services, including software development, AWS consulting, and more.

About the role:

The NETSOL Cloud Services department is looking for an experienced big data engineer with a solid grasp of data engineering, data lake, ETL, and data warehousing (DWH) concepts. You will build data solutions using state-of-the-art technologies to acquire, ingest, and transform big data and publish it into the DWH, following best practices at each ETL stage.

Key Responsibilities:

  • Design and develop data applications on AWS using big data technologies to ingest, process, and analyze large disparate datasets.
  • Build robust data pipelines in the cloud using Airflow, Spark/EMR, Kinesis/Kafka, AWS Glue, Lambda, or other AWS technologies (see the pipeline sketch after this list).
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using SQL and AWS big data technologies.
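
For illustration only, here is a minimal sketch of the kind of pipeline this role builds, written against Airflow 2.x. The DAG, task, and step names are hypothetical placeholders; in practice the transform step would typically submit a Spark/EMR or Glue job rather than run inline in Python.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Placeholder: land raw source files in an S3 landing zone.
        print("Extracting raw data ...")


    def transform(**context):
        # Placeholder: clean and conform the data, e.g. by triggering a Spark/EMR or Glue job.
        print("Transforming data ...")


    def load(**context):
        # Placeholder: publish curated data into the warehouse (e.g. Redshift).
        print("Loading curated data into the DWH ...")


    with DAG(
        dag_id="example_etl_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task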

Requirements:

  • 1-2 years of experience delivering data lake and data warehousing projects on AWS
  • Hands-on ETL experience and a solid understanding of ETL concepts
  • SQL: joins, window functions, CTEs (see the Spark SQL sketch after this list)
  • EMR / Spark
  • Redshift
  • ETL tools – Airflow, AWS Glue, etc.
  • Python / Scala
  • Understanding of dimensional modelling (fact and dimension tables)
  • AWS certifications, e.g.:
    • AWS Certified Machine Learning – Specialty
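
A small sketch of the SQL concepts listed above (joins, window functions, CTEs), run through Spark SQL as it might be on EMR. The table and column names are hypothetical, purely for illustration.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("sql-concepts-sketch").getOrCreate()

    # Hypothetical source tables; in a real pipeline these would come from the lake/DWH.
    spark.createDataFrame(
        [(1, "EU"), (2, "US")], ["customer_id", "region"]
    ).createOrReplaceTempView("dim_customer")

    spark.createDataFrame(
        [(1, "2024-03-01", 120.0), (1, "2024-03-05", 80.0), (2, "2024-03-02", 200.0)],
        ["customer_id", "order_date", "amount"],
    ).createOrReplaceTempView("fact_orders")

    # Latest order per customer: a CTE, a join, and a window function in one query.
    latest_orders = spark.sql("""
        WITH ranked AS (                              -- CTE
            SELECT f.customer_id,
                   d.region,
                   f.order_date,
                   f.amount,
                   ROW_NUMBER() OVER (                -- window function
                       PARTITION BY f.customer_id
                       ORDER BY f.order_date DESC
                   ) AS rn
            FROM fact_orders f
            JOIN dim_customer d                       -- join
              ON f.customer_id = d.customer_id
        )
        SELECT customer_id, region, order_date, amount
        FROM ranked
        WHERE rn = 1
    """)
    latest_orders.show()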

Nice to Have:

  • Snowflake
  • Kinesis/Kafka
  • Databricks
  • DBT
  • AI/ML


Why NETSOL? Join our innovative team that values creativity and collaboration. We provide our employees with opportunities for professional growth in a supportive and forward-thinking environment.

Careers@netsoltech.com

Job Details

  • Software & Web Development
  • Innovation Lab
  • Permanent
  • Lahore
  • 4
  • Experienced Professional
  • 1 year to 3 years
  • March 25, 2024
  • May 31, 2024