Job Description
Join a leading fintech company as a Data Engineer, where you’ll design and optimize data pipelines, ETL processes, and data architecture for high-scale applications.
Key Responsibilities:
✅ Build, manage, and optimize ETL data pipelines
✅ Work with structured and unstructured data from various sources
✅ Ensure data security, compliance, and integrity
✅ Collaborate with data scientists, engineers, and business teams
Requirements:
🔹 4+ years of experience in data engineering, Python, and SQL
🔹 Experience with big data technologies (Spark, Hadoop, Kafka)
🔹 Strong knowledge of cloud platforms (AWS, GCP, Azure)
🔹 Expertise in data modeling and data warehousing (Snowflake, Redshift, BigQuery)
Preferred Skills:
✔️ Experience with Apache Airflow, Kubernetes, or Terraform
✔️ Exposure to machine learning and AI-driven analytics
✔️ Understanding of DevOps practices for data engineering
📝 If you have not heard from us within 2 weeks, please consider your application unsuccessful.