304 results for Data Pipelines jobs in Diegem within a 30 km radius

Senior Data Engineer
Smals
Brussel
Partially remote
• Troubleshooting issues and performance bottlenecks to optimise the data pipelines - As a Senior Data Engineer you are part of the data team responsible for implementing the shared data platform at a client that is a major player in social security for the domains of unemployment, career interruption and time credit. You combine an analytical approach with technical expertise to design, build and maintain transformation pipelines. You are actively involved in the ingestion phase of the data lifecycle, from all kinds of source systems into the Data Platform (which uses the medallion architecture), and in processing the data in the first two zones of the lakehouse. Together with the rest of the team you take care of the orchestration so that data is accurate, accessible and aligned with the ...
• Troubleshooting issues and performance bottlenecks to optimise the data pipelines - As a Data Engineer you are part of the data team responsible for implementing the shared data platform at a client that is a major player in social security for the domains of unemployment, career interruption and time credit. You combine an analytical approach with technical expertise to design, build and maintain transformation pipelines. You are actively involved in the ingestion phase of the data lifecycle, from all kinds of source systems into the Data Platform (which uses the medallion architecture), and in processing the data in the first two zones of the lakehouse. Together with the rest of the team you take care of the orchestration so that data is accurate, accessible and aligned with the needs of the organisation.
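
Purely as an illustration of the ingestion pattern described in the listing above (a medallion-style lakehouse with bronze and silver zones), here is a minimal PySpark sketch, not code from the employer. It assumes a Databricks-style environment where Delta is available, and every path, table and column name is hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land the raw source extract as-is, adding only ingestion metadata.
raw = spark.read.format("json").load("/mnt/landing/unemployment_claims/")  # hypothetical source path
(raw.withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta").mode("append")
    .saveAsTable("bronze.unemployment_claims"))                            # hypothetical table name

# Silver: deduplicate and type the bronze data for downstream consumers.
bronze = spark.table("bronze.unemployment_claims")
silver = (bronze.dropDuplicates(["claim_id"])                              # hypothetical business key
                .withColumn("claim_date", F.to_date("claim_date")))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.unemployment_claims")
```
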
Do you want to build scalable data pipelines, work with modern cloud technology and make an impact in real-life data projects? * Designing, building and maintaining data pipelines in Azure environments - Are you a driven Data Engineer with experience in Apache Spark, Databricks or Microsoft Fabric? Then you might be the data engineer we are looking for! * Collaborating with experts within the LACO team, where as a team player you can share and enrich your passion for data
Experience with Data Engineering & data pipelines - We complement these partnerships with in-depth Cyber, Data & AI, Software development and multi-cloud expertise. As Lead Data Architect, you will be at the heart of our customers' data strategy. You'll define how it should be structured and help them make decisions that are data-driven. Alongside Rik, our Head of Data & Analytics, you will sync with the customer and define what can be done with the data they have and how it can be used to tackle different business challenges. You will constantly be supported by the Data Tribe, which is full of fellow engineers with a ... * Have know-how of data architecture and data infrastructure * Pipeline development experience with Python, Golang or Java * Experience in Google Cloud or other cloud environments & data processing solutions
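
As a hedged illustration of what "pipeline development with Python" on Google Cloud can look like in practice (not code from the listing), here is a minimal batch load from Cloud Storage into BigQuery using the google-cloud-bigquery client. The project, bucket and table names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-landing-bucket/orders/*.csv",   # hypothetical source files
    "example-project.analytics.orders",           # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(f"Loaded {load_job.output_rows} rows")
```
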

Senior Data Scientist - ML Engineer
Devoteam
Machelen
Easy Apply
Build and maintain data pipelines for data ingestion, processing, and transformation - We complement these partnerships with in-depth Cyber, Data & AI, Software development and multi-cloud expertise. We are seeking a highly motivated and experienced Senior Data Scientist - ML Engineer to join our growing Data & AI team (50 * Contribute to the development of Devoteam's data * 5+ years of experience in a data science or machine learning role * Experience with big data technologies such as Hadoop, Spark, or Hive * Work with innovative and impactful Cloud, Data & AI projects for industry-leading clients
You will have the opportunity to work with a variety of clients across different industries, collaborating with both internal teams and client stakeholders to deliver high-quality data pipelines and enable data-driven decision-making. * Data Pipeline Engineering: Design, develop, and deploy robust and scalable data pipelines using Databricks, incorporating data extraction from diverse sources (databases, APIs, streaming platforms), transformation and cleansing using Spark, and loading into target systems (data lakes, data warehouses, etc.). * Databricks Ecosystem Expertise: Utilize the full capabilities of the Databricks platform, including Databricks SQL, Delta Lake, Databricks Runtime, and Databricks Workflows, to orchestrate complex data workflows and ensure data quality and pipeline reliability. * Data Quality Assurance: Implement rigorous data quality checks and validation ...
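
The data quality checks mentioned above are often implemented as a validation step inside the pipeline itself. A minimal sketch, assuming a Databricks/Spark environment and a hypothetical silver.orders table with order_id and amount columns:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.read.table("silver.orders")  # hypothetical Delta table

# Collect simple validation metrics in a single pass over the data.
checks = orders.agg(
    F.count("*").alias("row_count"),
    F.sum(F.col("order_id").isNull().cast("int")).alias("null_order_ids"),
    F.sum((F.col("amount") < 0).cast("int")).alias("negative_amounts"),
).first()

# Fail the pipeline task if any rule is violated, so the workflow stops early.
if checks.null_order_ids > 0 or checks.negative_amounts > 0:
    raise ValueError(f"Data quality check failed: {checks.asDict()}")
```
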
Support the DataCamp Data Platform, which is deployed on BigQuery, with data pipeline jobs managed by Airflow. DataCamp's mission is to empower everyone with the data and AI skills essential for 21st-century success. By providing practical, engaging learning experiences, DataCamp equips learners and organizations of all sizes to harness the power of data and AI. As a trusted partner to over 17 million learners and 6,000+ companies, including 80% of the Fortune 1000, DataCamp is leading the charge in addressing the critical data and AI skills shortage. DataCamp's infrastructure team, which is part of the Platform Engineering department, is a T-shaped cross-functional team that looks after CI/CD pipelines, cloud infrastructure (deployed on AWS), logging, monitoring and security. The infrastructure team also looks after the data platform (deployed on GCP) as we have ...
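
To illustrate the "data pipeline jobs managed by Airflow" pattern described above (an assumption-laden sketch, not DataCamp's actual code), here is a minimal DAG with a single BigQuery job. It assumes a recent Airflow 2.x install with the apache-airflow-providers-google package; the DAG id, dataset and query are made up.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical daily job that refreshes a reporting table on BigQuery.
with DAG(
    dag_id="refresh_learning_stats",   # hypothetical DAG name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    refresh = BigQueryInsertJobOperator(
        task_id="refresh_stats_table",
        configuration={
            "query": {
                "query": "CREATE OR REPLACE TABLE analytics.daily_stats AS "
                         "SELECT user_id, COUNT(*) AS events "
                         "FROM analytics.raw_events GROUP BY user_id",
                "useLegacySql": False,
            }
        },
    )
```
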
Experience with Data Engineering & data pipelines - We complement these partnerships with in-depth Cyber, Data & AI, Software development and multi-cloud expertise. As Lead Data Engineer, you will be at the heart of our customers' data strategy. You'll define how it should be structured and help them make decisions that are data-driven. Alongside Sven, our Head of Data & Analytics, you will sync with the customer and define what can be done with the data they have and how it can be used to tackle different business challenges. You will constantly be supported by the Data Tribe, which is full of fellow engineers with a ... * Have know-how of data architecture and data infrastructure * Pipeline development experience with Python, Golang or Java * Experience in Google Cloud or other cloud environments & data processing solutions * Interview with Sven, our Head of Data & Analytics: You'll have ...
As a Senior Google Data Engineer, you will be responsible for the end-to-end design, implementation, and optimization of robust, scalable data pipelines and solutions primarily on Google Cloud Platform (GCP). * 5+ years of experience in Data Engineering with a focus on building and maintaining large-scale, production-grade data pipelines. We complement these partnerships with in-depth Cyber, Data & AI, Software development and multi-cloud expertise. Important to know: if you are not a Google Data Engineer but an AWS Data Engineer or Azure Data Engineer and you really would like to learn and grow into a Google Data Engineer role, no problem, we can - You will translate complex business requirements into technical blueprints and lead the hands-on delivery of the data infrastructure. Alongside Rik, our Head of Data & Analytics, you will sync with the customer and serve as the ...
Design & Architecture: Architect, build, and maintain scalable data pipelines, data models, and Lakehouse architectures using OneLake, Data Factory, Synapse Data Engineering/Warehouse and Databricks. We complement these partnerships with in-depth Cyber, Data & AI, Software development and multi-cloud expertise. Lead the Data Revolution. Drive Client Success with Data & AI. Your Mission: Turning Data Insights into Impact - As a Microsoft Data Solution & Presales Architect, you are more than a technical leader: you are a visionary and evangelist. We are seeking a highly experienced Microsoft Data & AI Solution & Pre-sales Architect to not only master the Microsoft Fabric platform and related services but also to act as a and in client engagements. You will transform complex business challenges into actionable, predictive solutions, enabling data-driven decisions for our industry-leading clients.
Design, build, and maintain scalable data pipelines and data models within Microsoft Fabric, utilizing components like OneLake, Data Factory and Data Engineering. We complement these partnerships with in-depth Cyber, Data & AI, Software development and multi-cloud expertise. You will play a crucial role in transforming raw data into actionable insights, enabling data-driven decision-making across the organization. * Develop and optimize data warehousing solutions using Data Warehouse in Fabric. * Implement and manage robust data governance, security, and compliance best practices within the Fabric environment, potentially leveraging tools like Microsoft Purview. * Leverage Data Science and Real-Time Analytics capabilities within Fabric to deliver advanced analytics and predictive insights. * Utilize Data Activator to monitor data and trigger actions based on defined patterns and conditions.
Our SaaS platform leverages advanced data technology and semantic search to deliver high-quality leads, empowering companies to grow their sales pipelines efficiently. • Experience with data scraping and pipeline orchestration
Build and maintain data platforms and pipelines using Azure, AWS or Google Cloud, Databricks, and Python. We complement these partnerships with in-depth Cyber, Data & AI, Software development and multi-cloud expertise. Are you passionate about data-driven platforms, machine learning models, and innovative applications? As a Data Scientist/Engineer at Devoteam, you'll play a hands-on role in delivering impactful solutions across various clients and industries. * Platform Setup: Design and implement robust data platforms using Azure, AWS or Google Cloud and Databricks. * Data Management: Gather, prepare, and manage data for analysis. * Act as a strategic to client Data Science teams, helping them navigate their challenges and aspirations. * Inspire clients with new data-driven opportunities. * Bring best practices in Git, DevOps, and CI/CD to enhance client data platforms.
To process the large amounts of data we receive daily, we built internal tools around a flexible cloud-based data processing pipeline which includes several AI models. We are looking for a talented Full Stack Software Engineer to help us build web applications for management, visualization and analysis of drone-acquired data. Our web platform provides fast and easy access to massive amounts of geospatial data. Live volume measurements, 3D visualizations and data aggregation are just examples of tools we build to allow our customers to make informed decisions.

No match yet? There are 288 more jobs that could match your search

As a Senior Data Analyst you are part of the data team responsible for implementing the shared data platform at a client that is a major player in social security for the domains of unemployment, career interruption and time credit. You define the conceptual data models based on the business requirements, and together with the others within the team (analysts, ...) you make sure these data solutions are aligned with the needs of the organisation. As a bridge between the data team and the business stakeholders, you gather and analyse the needs, translate the business requirements into technical specifications, and actively contribute to smooth communication and alignment within a transversal organisation. You play an active role in exploring data to produce in-depth analyses, dashboards and decision-ready insights.
As a Data Engineer Python you are responsible for designing, developing and maintaining our data infrastructure, mainly using Python to automate and optimise our data processing workflows. • Collaborating with others to build scalable data storage solutions. • You have a bachelor's or master's degree in Computer Science, Data Engineering, Information Systems or a related field, and ideally solid professional experience in the data field within a Data Warehouse and/or Big Data environment. - Advanced Python expertise with in-depth knowledge of data engineering libraries and frameworks. - Extensive experience in Data Management methodologies and best practices. - A strong background in Data Storage solutions, including both relational and modern data architectures.
As a Senior Data Analytics Engineer you are part of Information Delivery Services (IDS), the data team responsible for implementing the shared data platform at a client that is a major player in social security for the domains of unemployment, career interruption and time credit. You combine an analytical approach with technical expertise to design, build and maintain scalable data models and transformation pipelines. You are actively involved in every phase of the data lifecycle, from data analysis and modeling to workflow orchestration and performance optimization, to ensure that data is accurate, accessible and aligned with the needs of the organisation. • Setting up orchestration of data workflows with Dagster for reliable and timely delivery.
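
As a purely illustrative sketch of the Dagster-based orchestration mentioned in this listing (not code from the employer), the snippet below defines two hypothetical assets and a daily schedule; the asset names, sample data and cron expression are made up.

```python
import dagster as dg
import pandas as pd

# Hypothetical assets: a raw extract and a cleaned model built from it.
@dg.asset
def raw_claims() -> pd.DataFrame:
    # In a real pipeline this would read from a source system.
    return pd.DataFrame({"claim_id": [1, 2, 2], "amount": [100.0, 250.0, 250.0]})

@dg.asset
def cleaned_claims(raw_claims: pd.DataFrame) -> pd.DataFrame:
    # Deduplicate before handing the data to downstream consumers.
    return raw_claims.drop_duplicates(subset=["claim_id"])

# A daily run of all assets keeps the delivery reliable and timely.
daily_job = dg.define_asset_job("daily_refresh_job", selection=dg.AssetSelection.all())

defs = dg.Definitions(
    assets=[raw_claims, cleaned_claims],
    schedules=[dg.ScheduleDefinition(job=daily_job, cron_schedule="0 6 * * *")],
)
```
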
This makes use of Azure Integration Services such as Logic Apps, Azure Functions, Event Hub, Data Factory, … * You have experience with MS Azure integration (AIS services), (Logic Apps and Service Bus), Azure Function development, Agile organisation (Azure DevOps, Jira, Git, …), API/web traffic analysis, and cloud management via pipelines/Bicep/Terraform.
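
For illustration only (not from the listing), here is a minimal Python Azure Function sketch using the v2 programming model to consume Event Hub messages; the hub name and the connection app setting are hypothetical assumptions.

```python
import logging

import azure.functions as func

app = func.FunctionApp()

# Hypothetical Event Hub trigger: the hub name and the app setting holding the
# connection string are assumptions, not values from the listing.
@app.event_hub_message_trigger(
    arg_name="event",
    event_hub_name="integration-events",
    connection="EVENTHUB_CONNECTION",
)
def handle_event(event: func.EventHubEvent) -> None:
    # Log the raw payload; a real integration would transform and route it.
    logging.info("Received message: %s", event.get_body().decode("utf-8"))
```
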
Being heavily driven by the modern technology landscape covering Cloud transformation, AI solutions, Data Analytics and Cybersecurity, it is time to grow their footprint in the Microsoft stack * Solution Selling: Understand customer needs and effectively position Microsoft Cloud (Azure, M365 and Power Platform), Data/AI and Security - Qualify leads and progress them through the sales pipeline. * Accurate Forecasting: Maintain accurate sales forecasts and provide regular updates on pipeline and deal progress * Proven Track Record: 5 years of proven experience in selling IT solutions, with a strong focus on Microsoft Cloud, Data & AI and Security
Experience with data warehousing, ETL/ELT pipelines, real-time data streaming technologies, data mesh architectures and data mesh organisations is a plus - We complement these partnerships with in-depth Cyber, Data & AI, Software development and multi-cloud expertise. Are you a Data & AI Product Owner passionate about shaping the future of data-driven solutions? Do you thrive in a consulting environment, guiding clients on their data journey across multiple cloud platforms? As a Data & AI Product Owner, you'll be a key member of our Data Strategy team. You'll work closely with clients to understand their business challenges and translate them into a compelling vision and capabilities roadmap for data and AI products. You'll be the linchpin connecting business needs with technical solutions, ensuring that our data initiatives deliver real, tangible value.
Experience in topics like deep learning, natural language processing, database design, and data and machine learning pipelines. DataCamp's mission is to empower everyone with the data and AI skills essential for 21st-century success. By providing practical, engaging learning experiences, DataCamp equips learners and organizations of all sizes to harness the power of data and AI. As a trusted partner to over 17 million learners and 6,000+ companies, including 80% of the Fortune 1000, DataCamp is leading the charge in addressing the critical data and AI skills shortage. As a Curriculum Manager - Data Science and Data Engineering, you will directly contribute to DataCamp's mission to democratize data education by helping build the best platform to learn data and AI skills. In this role, you'll work closely with instructors and internal across curriculum, engineering, product, and marketing ...
We complement these partnerships with in-depth Cyber, Data & AI and Software development. As a certified Google Cloud Managed Services partner, we develop cutting-edge practical solutions using AI, Machine Learning, Big Data, and Google Workspace. You will be part of a team that designs and builds scalable business solutions using the latest Google technologies, serving as a strategic partner to enterprises by driving innovation across multiple domains: this includes modernizing complex legacy systems and migrating critical workloads to the cloud, implementing advanced AI/ML capabilities (including GenAI and Vertex AI) to extract new business value from data, engineering large-scale data platforms (like BigQuery) for deep analytics, and ultimately building future-proof, secure, and resilient cloud-native application architectures.


Frequently Asked Questions

How many Data Pipelines jobs are there in Diegem?
There are 304 Data Pipelines jobs within 20 miles of Diegem available on StepStone right now.

What other similar jobs are there to Data Pipelines jobs in Diegem?
As well as Data Pipelines jobs in Diegem, you can find IT, Communication and Business jobs, amongst many others.

Which skills are beneficial for Data Pipelines jobs in Diegem?
The following skills are useful for Data Pipelines jobs in Diegem: Communication, Business, Management, Design, Problem Management.

How many temporary and part-time Data Pipelines jobs are available in Diegem?
There are 0 temporary Data Pipelines jobs and 7 part-time Data Pipelines jobs available in Diegem at the moment.