Talan – Positive Innovation
Talan is an international consulting group specializing in innovation and business transformation through technology. With over 7,200 consultants in 21 countries and a turnover of €850M, we are committed to delivering impactful, future-ready solutions.
Talan at a Glance
Headquartered in Paris and operating globally, Talan combines technology, innovation, and empowerment to deliver measurable results for our clients. Over the past 22 years, we’ve built a strong presence in the IT and consulting landscape, and we’re on track to reach €1 billion in revenue this year.
Our Core Areas of Expertise
Data & Technologies: We design and implement large-scale, end-to-end architecture and data solutions, including data integration, data science, visualization, Big Data, AI, and Generative AI.
Cloud & Application Services: We integrate leading platforms such as SAP, Salesforce, Oracle, Microsoft, AWS, and IBM Maximo, helping clients transition to the cloud and improve operational efficiency.
Management & Innovation Consulting: We lead business and digital transformation initiatives through project and change management best practices (PM, PMO, Agile, Scrum, Product Ownership), and support domains such as Supply Chain, Cybersecurity, and ESG/Low-Carbon strategies.
We work with major global clients across diverse sectors, including Transport & Logistics, Financial Services, Energy & Utilities, Retail, and Media & Telecommunications.
The Role: Technical Expert – Data & AI
The Technical Expert – Data & AI provides technical leadership and deep expertise in data and artificial intelligence to design, build, optimize, and support SCIB's data and AI platforms, pipelines, and architectures. The role acts as a technical reference within the CIO organization, ensuring that data and AI initiatives are robust, scalable, secure, and aligned with business strategy and the group's data governance framework.
Key Responsibilities
Platform leadership, build, and maintenance:
Define standards, best practices, and frameworks for the construction, deployment, and maintenance of data and AI infrastructures (on-premises, cloud, or hybrid).
ML/AI model implementation and operations:
Participate in the implementation and deployment of machine learning, deep learning, and AI models, ensuring integrity, performance, monitoring, and ongoing maintenance.
Technical validation and tool evaluation:
Evaluate new technologies, tools, and platforms (open source, commercial, cloud) to optimize the data/AI ecosystem; lead proofs of concept (PoCs) and benchmarking activities.
Architecture definition and solution design support:
Support the design of data and AI technical solutions — including data pipelines, data lakes/warehouses, streaming, MLOps, and analytics platforms — to meet strategic objectives.
Integration and deployment:
Coordinate the integration of data and AI solutions with other corporate systems, ensuring compatibility, security, and efficiency.
Governance, security, and compliance:
Ensure solutions comply with data governance, security, privacy, and regulatory standards (e.g., data regulations, internal policies).
Technical support and mentoring:
Act as a technical point of reference by resolving questions from architects, data engineers, and data scientists; guiding and reviewing technical designs; and promoting best practices and standards.
Cross-functional collaboration:
Work closely with business, analytics, cybersecurity, infrastructure, and compliance teams to ensure alignment between business needs, regulations, and technical capabilities.
Documentation and technical communication:
Document architectures, processes, and technical decisions; communicate risks, decisions, and project status to the CIO and relevant stakeholders.
Continuous innovation:
Stay current with emerging trends in AI, big data, MLOps, automation, cost optimization, and performance. Promote piloting of new technologies when appropriate.
Strong Technical Knowledge
Programming languages:
Python, SQL. Preferred: Scala, Java, R.
Platforms / Big Data / Cloud:
Experience with cloud services (AWS, Azure, GCP) or hybrid/on-prem environments. Knowledge of tools such as Spark, Databricks, Hadoop, Kafka, Airflow, Trino, etc.
Databases and modeling:
Relational and dimensional modeling; NoSQL databases (document, key-value, columnar); data lakes and data warehouses.
Machine Learning / AI / MLOps:
ML/AI frameworks, training and inference pipelines, orchestration, containers (Docker, Kubernetes), deployment, monitoring, and model versioning.
Infrastructure & automation:
Infrastructure as Code (IaC), containers, orchestration, CI/CD pipelines, and automation tools.
Security and compliance:
Knowledge of data governance regulations, data protection, encryption, access control, and security best practices.
What do we offer you?
If you are passionate about data, development & tech, we want to meet you!