About DHL IT Services and the AI & Analytics team:
At DHL IT Services, we design, build and run IT solutions for the whole of DPDHL globally. The AI & Analytics team builds and runs solutions that extract far more value from our data. We help our business colleagues all over the world with machine learning algorithms, predictive models and visualizations. We manage more than 46 AI & Big Data applications, with 3,000 active users in 87 countries and up to 100,000,000 daily transactions.
Integrating AI & Big Data into business processes to compete in a data-driven world requires state-of-the-art technology. Our infrastructure, hosted on-premise and in the cloud (Azure and GCP), includes MapR, Airflow, Spark, Kafka, Jupyter, Kubeflow, Jenkins, GitHub, Tableau, Power BI, Synapse (Analytics), Databricks and other interesting tools.
We like to do everything in an Agile/DevOps way. No more throwing the “problem code” over to support, and no silos. Our teams are fully product-oriented, with end-to-end responsibility for the success of our products.
Who are we looking for?
Currently, we are looking for a Senior Data Engineer. In this role, you will have the opportunity to design and develop solutions, contribute to the roadmap of our Big Data architecture, and provide mentorship and feedback to more junior team members. We are looking for someone to help us manage the petabytes of data we have and turn them into something valuable.
Does that sound a bit like you? Let’s talk! Even if you don’t tick all the boxes below, we’d love to hear from you. Our new department is rapidly growing and we’re looking for people with a can-do mindset to join us on our digitalization journey. Thank you for considering DHL as the next step in your career – we really do believe we can make a difference together!
What will you need?
• University Degree in Computer Science, Information Systems, Business Administration, or related field
• 3+ years of experience in a Data Engineer role
• Strong analytical skills related to working with structured, semi-structured and unstructured datasets
• Hands-on experience implementing Data Lake/Big Data projects on on-premise or cloud platforms (preferably Azure)
• Hands-on experience with Docker, Kubernetes and related ecosystem tooling, both on-prem and in public clouds
• Hands-on experience with public clouds (preferred: GCP, Azure), with a specific focus on using them for Data Lakes
• Experience working with Big Data technologies – e.g. Spark, Kafka, HDFS, Hive, Hadoop distributions (Cloudera or MapR)
• Experience with streaming platforms/frameworks such as Kafka, Spark-Streaming, Flink
• Good programming skills (Java/Scala/Python)
• Advanced SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases
• Proven experience in building and optimizing Big Data pipelines, architecture and data sets
• Experience in performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
• Experience in building processes supporting data transformation, data structures, metadata, dependency and workload management
• A successful history of manipulating, processing and extracting value from large, disconnected datasets
You should have:
• Certifications in some of the core technologies
• The ability to collaborate across different teams/geographies/stakeholders/levels of seniority
• Customer focus with an eye on continuous improvement
• An energetic, enthusiastic and results-oriented personality
• The ability to coach other team members - you must be a team player!
• A strong will to overcome the complexities involved in developing and supporting data pipelines
• English – Fluent spoken and written (C1 level)
What do we offer?
• A great team of IT professionals and opportunities for technical development, including unlimited access to Coursera
• An extra week of holiday (25 days/year)
• 6 sick days per year
• Company car / car allowance
• Full salary compensation for up to 10 days of absence due to illness per calendar year
• Lunch vouchers fully covered by DHL
• Cafeteria benefit system by EDENRED
• On-going professional and technical training and certifications
• Multicultural environment in modern offices
• Multisport card
• Employee Referral Program
• Company sponsorship of various sports and social clubs
• Smart casual dress code
For more details contact email@example.com. #digitalplatforms