Data Engineer II

Nium
Full-time
On-site
Mumbai

Nium is the global infrastructure company powering real-time cross-border payments. Founded to deliver the payments infrastructure of tomorrow, today, we are building a programmable, borderless, and compliant money-movement layer that powers transactions between people, businesses, and intelligent systems — enabling banks, fintechs, payroll providers, travel platforms, marketplaces, and other global enterprises to move money instantly, anywhere in the world. 


Co-headquartered in San Francisco and Singapore, with offices in 14 markets and team members across 20+ countries, we take pride in a culture anchored in Keeping It Simple, Making It Better, and Winning Together. 2025 was the strongest year in our 10-year history, with record revenue, record transaction volumes, and EBITDA profitability — and we are now entering one of the most dynamic chapters in our journey. We believe the best work happens face-to-face, and we operate a hybrid model with three in-office days per week to strengthen collaboration, alignment, and innovation. 


We move over $50B annually across a network that spans 190+ countries, 100 currencies, and 100 real-time corridors. We power fast payouts to accounts, wallets, and cards; enable local collections in 35 markets; and support card issuance in 34 countries — all backed by licenses across 40+ markets. 


With over $300M raised to date, Nium offers ambitious builders the opportunity to shape the future of global money movement — at scale. 


The Role


At Nium, we’re revolutionising the global payments industry — and data sits at the core of everything we do. We’re looking for a passionate and skilled Data Engineer to join our India-based Data Engineering team.


This is more than a purely technical role. You’ll help shape the future of our data platform as we migrate from on-premises systems to the AWS cloud. You’ll work hands-on with modern tools such as Airflow, Redshift, dbt, and Kafka, collaborating closely with product, engineering, analytics, and business teams to deliver high-quality, scalable data solutions that power real-world payments and enable data-driven decision-making.


You’ll have the opportunity to contribute to high-impact projects, solve complex technical problems, and work on a platform that supports millions of users globally. If you’re motivated by learning, collaboration, and building resilient data platforms at scale, this role offers a strong opportunity for growth.



Key Responsibilities
  • Build and optimise scalable, maintainable, and high-performance data pipelines and workflows.
  • Ingest, transform, and deliver high-volume data from a range of structured and unstructured sources, including MySQL databases and real-time Kafka streams.
  • Design and maintain performant data models in Redshift, and optimise SQL queries to support analytics and reporting workloads.
  • Contribute to our cloud migration journey and help evolve the data architecture with new ideas and improvements.
  • Collaborate with cross-functional, globally distributed teams to translate business requirements into robust technical solutions.
  • Embed data quality, reliability, observability, and security as first-class principles across the data platform.
  • Support and enhance data solutions operating in a mission-critical payments environment.
  • Continuously learn, share knowledge, and help raise standards across data governance, quality, and security practices.


Requirements
  • 2–3 years of professional experience in Data Engineering or a related role.
  • Strong SQL expertise, including query optimisation, complex transformations, and data modelling.
  • Solid Python skills for data engineering and pipeline development.
  • Hands-on experience with AWS services such as Redshift, Lambda, Glue, and S3, as well as Airflow for orchestration.
  • Familiarity with modern transformation and modelling practices using tools such as dbt.
  • A collaborative mindset, strong problem-solving skills, critical thinking, and a willingness to take ownership and initiative.


Nice to Have
  • Experience contributing to large-scale cloud migration initiatives.
  • Knowledge of real-time data streaming using Kafka.
  • Exposure to CI/CD pipelines and infrastructure-as-code for data platforms.
  • Interest in exploring AI/ML use cases within modern data ecosystems.
  • Understanding of the payments domain or fintech data challenges.



What we offer at Nium  

 

We Value Performance: Through competitive salaries, performance bonuses, sales commissions, equity for specific roles, and recognition programs, we ensure that all our employees are well rewarded and incentivised for their hard work.


We Care for Our Employees: The wellness of Nium’ers is our #1 priority. We offer medical coverage along with a 24/7 employee assistance program and generous vacation programs, including our year-end shutdown. We also provide a flexible hybrid working environment (3 days per week in the office).


We Upskill Ourselves: We are curious and always want to learn more, with a focus on upskilling ourselves. We provide role-specific training, internal workshops, and a learning stipend.


We Celebrate Together: We recognize that work is also about creating great relationships with each other. We celebrate together with company-wide social events, team bonding activities, happy hours, team offsites, and much more!  


We Thrive with Diversity: Nium is truly a global company, with more than 33 nationalities represented across 18+ countries and 10+ office locations. As an equal opportunity employer, we are committed to providing a safe and welcoming environment for everyone.