We are now looking for a Senior Software Engineer with 6 to 10 years of overall experience who is passionate about building analytical solutions on our new Data Platform. You will help the H&M group get the most out of our data assets, with ultimate accountability for the business and customer value they deliver.
To do this, you will be responsible for onboarding existing data platforms and solutions to our future self-service data platform, which is based on a Data Mesh architecture; the role requires you to be a natural leader in exploring new analytical designs. As a senior software engineer, you are good at coaching others, for example by leading programming sessions and by presenting and aligning solutions to complex problems, and you have a broad perspective on engineering practices for building scalable and robust platforms.
You are very familiar with modern engineering practices, have a DevOps mindset, and are comfortable automating software delivery. When it comes to the platform itself, we expect you to be familiar with multiple cloud platforms, monitoring, CI/CD and Infrastructure-as-Code, and to have knowledge of secure coding practices.
You will work closely with the Product Area leadership and other engineers to ensure overall alignment and key results. We believe you have a strong agile and DevOps mindset as well as a deep understanding of, and passion for, software development. You are smart, hungry and humble, always striving to improve and learn, and you believe in a learning organization.
As a Senior Software Engineer you have great leadership skills and believe in a non-hierarchical culture of collaboration, transparency, safety, and trust.
Qualifications:
- Hands-on experience with Google Cloud Platform technologies such as Pub/Sub, Cloud Functions, Dataflow, Google Cloud Storage, Bigtable, BigQuery, Datalab and Data Studio.
- Good understanding of Google Cloud Composer/Airflow
- Experience with Java, Python, SQL and/or Go
- Experience with messaging and stream-processing technologies such as Kafka, Google Pub/Sub or Storm.
- Experience building data infrastructure in the cloud (e.g., AWS, Google Cloud, Azure)
- Practical experience with Python and SQL including the ability to write, analyze, and debug processes supporting data transformation, data structures, metadata, dependency, and workload management
- Ability to build reliable data pipelines to clean, aggregate, and transform large volumes of data from multiple sources.
- Ability to build effective data API endpoints to support data applications across the company
- Hands-on experience deploying analytical models to solve business problems
- Knowledge of containerization and container orchestration technologies such as Google Kubernetes Engine (GKE).
- Ability to create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapted to different levels of business and technical stakeholders.
Requirements:
- Several years of experience as a software developer, with strong abilities in modern engineering practices
- Advanced knowledge in Infrastructure as Code and cloud-based platforms
- Competence with different agile frameworks and supporting tools
- Great collaboration and communication skills in English
- Strong problem-solving skills: able to find solutions in complex situations and bring them to life through code