Senior Data Platform Engineer
G2
Software Engineering
Location: Gurugram, Haryana, India
Employment Type: Full time
Department: Digital Markets
About G2 - The Company
G2 is the world's largest and most trusted software marketplace. When you join G2, you’re joining the industry’s leading team that helps businesses reach their peak potential by powering decisions and strategies with trusted insights from real software users.
Now, we have joined forces with Capterra, Software Advice, and GetApp to create the largest source of online data and software insights to fuel intelligent buying in the age of AI. With 200M+ combined annual visitors and 6M verified reviews, we are now the centralized place that enables software buyers to make better and faster decisions with confidence.
And we are just getting started! We are setting out to transform the global B2B software industry and become the most trusted data foundation for buyers and sellers of software for the age of AI.
Does that sound exciting to you? Come join us as we try to reach our next PEAK!
About G2 - Our People
At G2, everything we are and do is grounded in our PEAK values (Performance + Entrepreneurship + Authenticity + Kindness). Working at G2 means you are part of a values-driven, growing global community that climbs PEAKs together. We cheer for each other’s successes, learn from our mistakes, and support and lean on one another during challenging times. With ambition and entrepreneurial spirit, we push each other to take on challenging work, which helps us all grow and learn.
You will be part of a global, diverse team of smart, dedicated, and kind individuals - each with unique talents, aspirations, and life experiences. At the heart of our community and culture are our people-led ERGs, which celebrate and highlight the diverse identities of our global team. As an organization, we are intentional about our DEI and philanthropic work (like our G2 Gives program) because it encourages us all to be better people.
About The Role
The Data Platform Engineer enables data practitioners (data engineers, business analysts, and data scientists) to self-serve on robust, scalable platform infrastructure by designing and building the right tools and reusable data platform frameworks.
In This Role, You Will:
Design, develop, and maintain the storage, processing, orchestration, cataloging and governance components of a scalable and secure data platform.
Build the tools, libraries, and services that allow other teams to own and manage their own pipelines and workflows independently.
Provide self-service infrastructure (e.g., templates, SDKs, CI/CD patterns, DBT macros) to support repeatable and consistent data engineering practices.
Implement and manage data platform components: orchestration frameworks, data catalog, access control layers, and metadata systems.
Collaborate with stakeholders to define SLAs, monitoring, and observability across the data stack.
Champion infrastructure as code, automation, and standardization across the platform.
Ensure data security, compliance, and cost efficiency across environments.
Mentor and guide other data platform associates on solution design, code reviews, and best-practice adoption.
Minimum Qualifications:
We realize applying for jobs can feel daunting at times. Even if you don’t check all the boxes in the job description, we encourage you to apply anyway.
6+ years of experience in Data/Infrastructure Engineering, with at least 3 years focused on building self-service platforms.
Proficiency in Python, SQL and experience building reusable templates and frameworks.
Deep understanding of cloud data resources in AWS (S3, EKS, Glue, Athena, etc.).
Expertise in building and supporting solutions on Snowflake.
Experience setting up integrations or connecting tools (e.g., Snowflake to Power BI, Snowflake to AWS services).
Hands-on experience with orchestration frameworks (Airflow, Prefect, etc.).
Experience with distributed systems and data processing frameworks (e.g., Apache Spark).
Comfortable building and debugging CI/CD, infrastructure as code (Terraform), and GitOps practices.
Familiarity with Kubernetes, Docker, and container-based deployment models.
Demonstrated capability in security & governance: RBAC, masking, SSO (Okta), secrets management and audit logging.
What Can Help Your Application Stand Out:
Experience productionising data science workloads (MLOps).
Active contributions to open-source data projects.
Experience implementing Apache Iceberg for open-table formats.
Experience implementing Kafka, Kinesis, or Flink for streaming architectures.
Exposure to observability tools (Datadog, Splunk, etc.).
Our Commitment to Inclusivity and Diversity
At G2, we are committed to creating an inclusive and diverse environment where people of every background can thrive and feel welcome. We consider applicants without regard to race, color, creed, religion, national origin, genetic information, gender identity or expression, sexual orientation, pregnancy, age, or marital, veteran, or physical or mental disability status. Learn more about our commitments here.
--
For job applicants in California, the United Kingdom, and the European Union, please review this applicant privacy notice before applying to this job.
How We Use AI Technology in Our Hiring Process
G2 incorporates AI-powered technology to enhance our candidate evaluation process. These tools may assist with initial application screening, skills assessment analysis, and identifying candidates whose qualifications align with specific role requirements. While AI technology supports our recruitment workflow, all final hiring decisions remain under human oversight and judgment.
Your Choice Matters: If you would prefer that your application be reviewed without AI assistance, you can opt out by entering your email address in the email entry field at the bottom of the Automated Processing Legal Notice. Choosing to opt out will not disadvantage your application in any way—we will ensure your materials receive a thorough manual review by our hiring team.
For additional details about how we handle your information throughout the application process, please review G2's Applicant Privacy Notice.