Big Data Architect
About DHL IT Services and the AI & Analytics team
At DHL IT Services, we are designing, building and running IT solutions for the whole DPDHL globally.
The AI & Analytics team builds and runs solutions to get much more value out of our data. We help our business colleagues all over the world with machine learning algorithms, predictive models and visualizations. We manage more than 46 AI & Big Data applications with 3,000 active users across 87 countries and up to 100,000,000 daily transactions.
Integrating AI & Big Data into business processes to compete in a data-driven world requires state-of-the-art technology. Our infrastructure, hosted on-prem and in the cloud (Azure and GCP), includes MapR, Airflow, Spark, Kafka, Jupyter, Kubeflow, Jenkins, GitHub, Tableau, Power BI, Synapse Analytics, Databricks and other tools.
We like to do everything in an Agile/DevOps way. No more throwing the “problem code” over the wall to support, no silos. Our teams are fully product-oriented, with end-to-end responsibility for the success of our product.
Who do we look for?
Currently, we are looking for a Senior Big Data Architect. In this role, you will have the opportunity to design and develop solutions, contribute to the roadmaps of our Big Data architectures, and provide mentorship and feedback to more junior team members. We are looking for someone to help us manage the petabytes of data we have and turn them into value.
Does that sound a bit like you? Let’s talk! Even if you don’t tick all the boxes below, we’d love to hear from you; our new department is rapidly growing and we’re looking for many people with the can-do mindset to join us on our digitalization journey. Thank you for considering DHL as the next step in your career – we do believe we can make a difference together! #digitalplatforms
What will you need?
- University Degree in Computer Science, Information Systems, Business Administration, or related field
- 3+ years of experience in the role of Big Data Architect or related
- 3+ years of experience in Hadoop technologies: Hive, Spark, Kafka, HBase, HDFS
- Experience with development in Java, Python, or both
- Knowledge of Docker, OpenShift and Kubernetes; experience connecting these technologies to Hadoop/MapR is a plus
- Knowledge of modeling languages and tools (UML, MS Visio)
- Experience with Jira and Confluence
- Technology-agnostic mindset with a proven ability to identify and learn new systems, languages and frameworks
- Experience with cloud Big Data solutions – preferably Azure Data Lake, Analytics and other related services: ADLS Gen2, Azure Functions, Azure Databricks, Azure Data Factory, Event Hubs, Stream Analytics, Data Explorer, Synapse Analytics/SQL DW, AKS, API Management
- Solid understanding of delivery methodologies (Scrum, Waterfall) and the ability to lead teams in implementing solutions according to the design/architecture
Ideally, you should also have:
- Certifications in some of the core technologies
- Knowledge of visualization tools (Tableau, PowerBI)
- Experience in creating CI/CD pipelines
- Knowledge of Machine Learning principles and tools
- Knowledge of AutoML tools (e.g. DataRobot)
- Certification in agile (PMI-ACP, Scrum Alliance)
- Experience with Azure or GCP
- Experience with Ansible scripts
- Overview of architecture trends, with an eye on market/technical conditions and future direction
- Experience defining new architectures and the ability to drive an independent project from an architectural standpoint
- Experience leading customer workshop sessions to educate customers on the latest technology trends and best practices.
- Ability to author, edit and present technical documents
- English – fluent spoken and written (C1 level)
Facts and figures
DHL INFORMATION SERVICES (INDIA) LLP
“Deutsche Post DHL offers me security, the chance to develop myself further and the opportunity to work in almost every country in the world.”
Current employee – Senior Consultant in Bonn