Intellectica is recruiting a Senior AI & Data Engineer on behalf of a leading technology services group headquartered in Athens, Greece; the role operates under a hybrid working model. The selected candidate will join a high-performing Data & AI team and play a pivotal role in designing, developing, and deploying cutting-edge AI solutions, contributing to transformative initiatives that enhance business performance, unlock new value streams, and drive innovation across diverse industries.
Key activities and responsibilities of this role include:
- Leading the end-to-end design and delivery of scalable data and machine learning solutions, aligning technical architecture with business objectives and strategic priorities
- Collaborating globally with data engineers, architects, and business stakeholders to define robust data architecture and modeling requirements
- Developing and optimizing ETL processes to integrate data from multiple sources, ensuring high data quality and reliability
- Adhering to software engineering best practices to deliver maintainable, scalable, and resilient data solutions
- Researching, designing, building, maintaining, and continuously improving machine learning systems
- Communicating complex technical solutions clearly to senior management and diverse stakeholders
- Mentoring junior team members, providing guidance to ensure high-quality deliverables
- Contributing to sales activities through expertise in data and platform architecture
- Staying current with industry trends and contributing to internal initiatives, R&D, and business development projects
- Performing any other tasks that may be required from time to time
Professional experience & qualifications of a successful candidate:
- Bachelor’s degree in Computer Science, Engineering, or a related field; a Master’s degree or PhD will be considered an asset
- Minimum of 3 years' experience in data engineering, with a strong emphasis on cloud environments (AWS, GCP, Azure, or cloud-native platforms)
- Proven expertise in designing and managing scalable, end-to-end data pipelines (e.g., Azure Data Factory, Airflow, dbt) across batch, micro-batch, and streaming architectures (e.g., Kafka, Databricks Structured Streaming, StreamSets)
- Strong experience with Big Data platforms (Hadoop, Databricks, Hive, Kafka, Apache Iceberg, Microsoft Fabric), Data Warehouses (Teradata, Snowflake, BigQuery), and Lakehouse architectures (Delta Lake, Apache Hudi), along with solid knowledge of data governance, quality, security, and metadata management (e.g., Unity Catalog, Apache Atlas, Informatica)
- Proficiency in SQL, Python, and PySpark, writing scalable and maintainable code using object-oriented principles, with practical experience in REST API integration and Linux-based environments
- Experience implementing DevOps and MLOps practices, including Git workflows, CI/CD pipelines (Azure DevOps, Jenkins, GitHub Actions), containerization and orchestration (Docker, Kubernetes), MLflow, and service mesh/API gateway architectures (e.g., Istio)
- Familiarity with Agile and Waterfall delivery methodologies
- Industry certifications (AWS, Azure, GCP, CNCF/Kubernetes, Databricks) are considered an asset; demonstrable hands-on expertise is equally valued
- Fluency in Greek with professional-level English communication skills
Core competencies of a successful candidate:
- Strong creative problem-solving skills, with a focus on addressing real-world business challenges
- Comfortable working within a global, highly collaborative team environment
- Demonstrated leadership skills, setting clear priorities and driving effective execution
- Proactive, dependable, and results-driven professional mindset
- Willingness and flexibility to travel occasionally as required
