- Hybrid Model
- Long-Term Contract
- Daily Rate
A leading global e-commerce company is seeking several experienced Senior Software Engineers to join their Data Platforms Team. This is a fantastic opportunity to work on large-scale distributed data systems that drive business-critical operations.
About the Team and the Role
The Hadoop Team is central to the company's big data infrastructure, delivering scalable, robust solutions for data processing and storage that drive data-informed decision-making and improve the user experience. The team owns and optimizes the Hadoop-based platforms that underpin key innovation initiatives across the organization.
Key Responsibilities
- System Optimization & Scalability:
- Lead the enhancement of Hadoop-based systems to ensure high availability, fault tolerance, and scalability. Your work will be essential in maintaining reliable performance and preparing the platform for future growth.
- High-Impact Project Delivery:
- Drive key projects focused on improving data processing efficiency and system performance. Your efforts will directly impact the organization's ability to harness large-scale data for insights.
- Innovative Tool Development:
- Design and develop tools that streamline data access, management, and user interaction, improving operational efficiency and experience for both internal and external stakeholders.
- System Integration:
- Ensure seamless integration of Hadoop with other platforms and tools, enabling cross-functional teams to leverage data effectively with minimal manual intervention and supporting strategic decision-making.
Required Skills & Experience
- Bachelor’s degree in Computer Science, Information Technology, or related discipline.
- Proven expertise in one or more of the following programming languages: Java, Scala, Python.
- Solid understanding of data structures, algorithms, and performance optimization techniques.
- Hands-on experience with Linux/Unix environments, including shell scripting and core system operations.
- Strong problem-solving and analytical skills, particularly in the context of distributed systems and large-scale data platforms.
Desirable (Nice to Have)
- Experience with big data frameworks such as Hive, Spark, or HBase.
- Contributions to open-source projects in the Hadoop or data engineering ecosystem.