Senior Data Infrastructure Engineer
This role is based in Athens, Greece.
Connecting networks, clouds and businesses, Console Connect by PCCW Global is dedicated to helping organisations overcome the barriers and complexity of connecting to the cloud. Our goal is to provide businesses with on-demand, dedicated connectivity into cloud service providers and partners around the globe, making access to business-critical applications simple, predictable and ultra-secure.
Console Connect by PCCW Global is the world’s first global software-defined interconnection platform, born out of the belief that business connectivity should be simpler and more accessible for all. Console Connect enables users to efficiently manage their private connections via a user-friendly interface, regardless of their level of technical expertise.
Backed by PCCW Global, one of the world’s leading telecommunications groups with a Tier 1 global IP network, Console Connect is completely scalable and offers maximum resilience and reliability, leaving you confident and secure in your cloud connections.
Role Overview
The ideal candidate blends a deep understanding of data infrastructure with strong software development and operational skills, reflecting the broad scope of the role. Candidates who combine technical expertise, practical experience, and solid soft skills are best suited for the position.
Key Responsibilities
- Python Microservices Development: Design and develop Python-based microservices to support data operations. This involves creating scalable and efficient services for data processing, transformation, and integration, ensuring they align with the overall data infrastructure.
- Design and Development of Data Infrastructure: Build and maintain robust data infrastructure, including databases, data warehouses, and big data platforms, ensuring they are optimized for performance and scalability.
- Data Streaming and Processing Technologies: Implement and manage data streaming platforms like Apache Kafka for real-time data processing, and ensure seamless integration with other components of the data infrastructure.
- Collaboration with Cross-functional Teams: Work closely with data scientists, software engineers, and other stakeholders to understand data and operational requirements, ensuring that the infrastructure and microservices developed meet these needs effectively.
- Continuous Integration and Continuous Deployment (CI/CD): Develop and maintain CI/CD pipelines for the automated testing and deployment of microservices and data infrastructure components. This includes integrating code quality checks, automated testing frameworks, and deployment strategies to streamline development and release processes.
- System Maintenance, Monitoring, and Troubleshooting: Oversee regular maintenance of the data infrastructure and microservices, monitor system performance, and quickly address any operational issues.
- Security and Data Compliance: Ensure that all aspects of the data infrastructure and microservices adhere to necessary security standards and compliance regulations, particularly in handling sensitive data.
- Innovation and Best Practices: Stay updated with the latest trends and best practices in data infrastructure, microservices architecture, and CI/CD processes. Continuously explore and adopt new technologies and tools to enhance the data ecosystem.
- Documentation and Effective Communication: Maintain clear documentation for system architectures, configurations, and operational procedures. Effectively communicate technical concepts and updates to both technical and non-technical team members.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field; significant relevant experience may be considered in lieu of formal education.
- Experience with Data Infrastructure: Proven experience in designing, implementing, and maintaining data infrastructure such as databases, data warehouses, and big data platforms.
- Proficiency in Python: Strong skills in Python programming, particularly in developing microservices and data processing scripts.
- Expertise in Data Streaming and Processing: Hands-on experience with data streaming technologies like Apache Kafka and familiarity with OLAP databases.
- Knowledge of CI/CD Tools: Experience with continuous integration and continuous deployment tools and methodologies. Familiarity with automation tools like Jenkins, GitLab CI, or similar.
- Experience with Containerization and Orchestration: Proficiency in using Docker, Kubernetes, or similar containerization and orchestration tools.
- Microservices Architecture: Understanding of microservices architecture principles and experience in developing and deploying microservices.
- Familiarity with Data Management Tools: Experience with data management, transformation, and reporting tools such as Airbyte, Apache Airflow, and others.
- Database Knowledge: Proficiency in SQL and experience with relational databases such as MySQL and analytical (OLAP) databases such as Apache Druid, as well as an understanding of database replication technologies.
- API Development: Experience developing REST APIs and working with API management tools such as Tyk.
- Problem-Solving Skills: Strong analytical and problem-solving abilities, with a keen attention to detail.
- Open Source Contributions: Experience contributing to open source projects or familiarity with open source community engagement is a plus.
- Agile Methodologies: Familiarity with Agile software development practices, including the use of tools like JIRA and Git.
- Communication Skills: Excellent verbal and written communication skills, crucial for collaborating in a remote-first environment and documenting technical decisions.
- Team Collaboration: Ability to work effectively in a team, perform and lead peer reviews, and coordinate with cross-functional teams.
- Adaptability and Continuous Learning: Eagerness to learn new technologies and adapt to changing technological landscapes.