Job Description
Responsibilities:
- Build real-time data ingestion pipelines with Apache Kafka to capture user behavior and transactional events.
- Design Flink-based streaming applications for complex event processing (CEP), sessionization, and near-real-time analytics.
- Integrate Flink with Kafka, object storage (S3), and NoSQL stores for enriched data processing.
- Tune Kafka and Flink for high throughput and low latency.
- Work in an agile environment, collaborating with DevOps and data science teams on production-ready deployments.
Core Tech: Apache Kafka, Apache Flink, Java/Scala, Docker/Kubernetes
Education: Bachelor's degree in Computer Science, Software Engineering, Information Systems, or a related field
Experience: 5–10+ years overall, including at least 2–3 years focused on Apache Flink and Kafka
Industry Fit: Experience in high-volume transactional systems or data-driven products
Soft Skills: Proactive, analytical, strong communicator, agile mindset
How to Apply: