Lead Java/Big Data Software Developer
- Develop and deploy highly available, fault-tolerant software that drives improvements to the features, reliability, performance, and efficiency of the Genesys Cloud Analytics platform.
- Actively review code, mentor, and provide peer feedback.
- Collaborate with engineering teams to identify and resolve pain points and to promote best practices.
- Partner with various teams to transform concepts into requirements and requirements into services and tools.
- Engineer efficient, adaptable, and scalable architecture for all stages of the data lifecycle (ingest, streaming, structured and unstructured storage, search, aggregation) in support of a variety of data applications.
- Build abstractions and reusable developer tooling that let other engineers quickly build self-service streaming and batch pipelines.
- Build, deploy, maintain, and automate large global deployments in AWS.
- Troubleshoot production issues and drive them to resolution.
This may be the perfect job for you if:
- You have a strong engineering background with ability to design software systems from the ground up.
- You have expertise in Java. Python and other object-oriented languages are a plus.
- You have experience in web-scale data and large-scale distributed systems, ideally on cloud infrastructure.
- You have a product mindset. You are energized by building things that will be heavily used.
- You are open to mentoring and collaborating with junior members of the team.
- You are adaptable and open to exploring new technologies and prototyping solutions at a reasonable cadence.
- You have engineered scalable software using big data technologies (e.g., Hadoop, Spark, Hive, Presto, Elasticsearch).
- You have experience building data pipelines (real-time or batch) over large, complex datasets.
- You have worked on and understand messaging/queueing/stream processing systems.
- You design not just to solve the problem at hand, but with maintainability, testability, monitorability, and automation as top concerns.
Technologies we use and practices we hold dear:
- The right tool for the job over we-always-did-it-this-way.
- We pick the language and frameworks best suited for specific problems.
- Packer and Ansible for immutable machine images.
- AWS for cloud infrastructure.
- Automation for everything: CI/CD, testing, scaling, healing.
- Hadoop, Hive, and Spark for batch processing.
- Airflow for orchestration.
- DynamoDB, Elasticsearch, Presto, and S3 for query and storage.