
Junior Data Engineer - e-Xperience Program Associate
Allegro
Hybrid
Junior Data Engineer role at Allegro involves developing ETL/ELT pipelines, working with Google Cloud tools, and ensuring data quality. Requires Python, SQL knowledge, and English proficiency.
Keywords
ETL
ELT
Python
SQL
Google Cloud
BigQuery
Cloud Composer
DataOps
software engineering
data analysis
Your responsibilities
- You will develop and maintain ETL/ELT data pipelines processing massive datasets from the Allegro platform.
- You will expand your expertise in the Google Cloud ecosystem, working with tools like BigQuery and Cloud Composer (Airflow).
- You will be part of a cross-functional team, collaborating closely with Data Scientists and Analysts to provide high-quality, structured data for ML models and business reporting.
- You will verify data quality and implement basic monitoring solutions to ensure the reliability of our data warehouse.
- You will optimize existing data processes, write clean code, and learn best practices in DataOps and software engineering.
- You will be part of the Infrastructure department, learning how we manage and analyze vast amounts of technical data across our data centers and the public cloud.
- You will assist in building a seamless data process flow, cooperating with various teams to understand existing metrics and company targets.
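The data-quality and monitoring responsibility above can be sketched as a minimal batch check. The field names, thresholds, and sample records here are purely illustrative assumptions, not Allegro's actual pipeline; in practice such checks would run inside an orchestrator like Cloud Composer (Airflow) before data lands in the warehouse.

```python
# Illustrative data-quality check for a batch of records before loading
# them into a warehouse table. Field names and thresholds are hypothetical.

def null_rate(rows, field):
    """Fraction of rows in which `field` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(field) is None)
    return missing / len(rows)

def check_batch(rows, required_fields, max_null_rate=0.01):
    """Return human-readable violations for a batch of dict records."""
    violations = []
    for field in required_fields:
        rate = null_rate(rows, field)
        if rate > max_null_rate:
            violations.append(
                f"{field}: null rate {rate:.1%} exceeds {max_null_rate:.1%}"
            )
    return violations

# Hypothetical sample batch: one record is missing a price.
batch = [
    {"order_id": 1, "price": 9.99},
    {"order_id": 2, "price": None},
    {"order_id": 3, "price": 4.50},
]
print(check_batch(batch, ["order_id", "price"]))
```

A real pipeline would typically fail or quarantine the batch when violations are returned, and export the rates as monitoring metrics.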
Our requirements
- You are familiar with Python (or Java/Scala) and know how to write clean, maintainable code.
- You know the basics of SQL and want to learn how to apply it in practice to processing large-scale technical datasets.
- You enjoy analysis and can apply critical thinking to identify trends and patterns, treating complex data as a puzzle waiting to be solved.
- You are eager to continuously develop your skills and expand your knowledge in a highly technical infrastructure environment, tackling the challenges of large-scale datasets at one of Europe's leading e-commerce platforms.
- You know English at a B2+ level.
What we offer
- Flexible working hours in the hybrid model (4/1) - working hours start between 7:00 a.m. and 10:00 a.m.
- The opportunity to learn, work on exciting challenges, collaborate with amazing people and have an unforgettable adventure
- Mentorship and support from your buddy throughout your entire program
- Additionally, you will be part of a supportive, inclusive culture that fosters personal growth, career development, and the building of meaningful connections with colleagues
- A wide selection of fringe benefits in a cafeteria plan - you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)
- The necessary tools for work
- Working in a team you can always count on - our team includes top-class specialists and experts to learn from
- Hackathons/Open days, workshops, guilds, meetups and internal knowledge sharing
- Internal learning platform (including training courses on work organization, means of communication, motivation to work and various technologies and subject-matter issues)
Benefits
#goodtobehere means that:
- You will join a team you can count on - we work with top-class specialists for whom sharing knowledge and experience is part of their DNA.
- You will love our level of autonomy in team organization, the space for continuous development, and the opportunity to try new things. You get to choose which technology solves the problem and you are responsible for what you create.
- You will value our Developer Experience and the full platform of tools and technologies that make creating software easier. We rely on an internal ecosystem based on self-service and widely used tools such as Kubernetes, Docker, Consul, GitHub, and GitHub Actions. Thanks to this, you can contribute to Allegro from your very first days on the job.
- You will be equipped with modern AI tools to automate repetitive tasks, allowing you to focus on developing new services and refining existing ones (also leveraging AI support).
- You will create solutions that will be used (and loved!) by your friends, family and millions of our customers.
- You will meet the Allegro Scale, which starts with over 1000 microservices, an open-source data bus (Hermes) with 300K+ rps, a Service Mesh with 1M+ rps, tens of petabytes of data, and production-used machine learning.
- You will become part of Allegro Tech - we speak at industry conferences, cooperate with tech communities, run our own blog (for over 10 years!), record podcasts, lead guilds, and organize our own internal conference, the Allegro Tech Meeting. We create solutions we love (and are allowed) to talk about!
- Send us your CV and… see you at Allegro!
| Published | 23 days ago |
| Expires | in 7 days |
| Work mode | Hybrid |