Who is hiring:
HAYS
Position:
Data Engineer
Location:
Warszawa
mazowieckie
Job description provided by the employer:
Data Engineer
Warszawa
NR REF.: 1194024
Hays IT Contracting is a cooperation based on B2B rules. We connect IT specialists with the most interesting technology projects on the market. Join the group of 500 satisfied Contractors working for Hays' clients!
For our Client we are currently looking for Candidates for the position of:
Data Engineer
Location: hybrid / Warsaw
Branch: project in the IT sector
Job type: B2B (with Hays Poland)
Length: first contract for 6 months B2B, then full cooperation with the client
Rate: 120-160 PLN/h net on B2B
Start date: ASAP, max 1 month notice period
Working hours: flexible
Remote work: yes, with 1 day a week in the office (office in Warsaw)
Methodology: Scrum
Tech stack: Python, PySpark, Databricks, ETL, Power BI, Azure, SQL
Project: project for a large client in the IT sector
What will you do:
- Develop Python scripts using pandas and numpy to clean, transform, and validate large datasets
- Integrate Spark jobs into ETL pipelines running on Databricks
- Set up and configure Databricks workspaces, clusters, and job schedules
- Design and implement ETL workflows using tools like Azure Data Factory or custom Python scripts
- Use Azure Data Lake, Blob Storage, and Azure SQL for data storage and processing
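A typical clean/transform/validate step from the task list above might look like the following minimal pandas/numpy sketch. The dataset, column names, and validation rules are purely hypothetical and stand in for whatever the client's actual pipeline processes:

```python
# Illustrative sketch of a pandas/numpy clean-transform-validate step.
# Columns ("customer", "amount") and rules are hypothetical examples.
import numpy as np
import pandas as pd

def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Clean and validate a raw orders dataset."""
    out = df.copy()
    # Normalize text fields: trim whitespace, unify case.
    out["customer"] = out["customer"].str.strip().str.upper()
    # Coerce amounts to numeric; unparsable entries become NaN.
    out["amount"] = pd.to_numeric(out["amount"], errors="coerce")
    # Validate: drop rows with missing or non-positive amounts.
    out = out[out["amount"].notna() & (out["amount"] > 0)]
    # Derive a log-scaled feature with numpy.
    out["log_amount"] = np.log(out["amount"])
    return out.reset_index(drop=True)

raw = pd.DataFrame({
    "customer": ["  acme ", "globex", "initech"],
    "amount": ["100.5", "oops", "-3"],
})
cleaned = clean_orders(raw)
print(cleaned)
```

In a Databricks ETL job, a step like this would usually be rewritten against the PySpark DataFrame API (or pandas API on Spark) so it scales beyond a single machine, but the clean-validate-derive structure stays the same.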
What will you get:
- Long-term cooperation with the client, implementing projects for the largest players in the banking, insurance, telco, and other sectors
- Standard benefits - preferential rates for LuxMed and Multisport packages
- When you choose to work via Hays, you also get the opportunity to work for many of Hays' other leading clients in the future
What we expect from you:
- Experience: 3+ years
- Expertise in Python – advanced knowledge of Python, including libraries like pandas, numpy, etc., and data processing tools
- Experience in PySpark – hands-on experience with Apache Spark and its integration with large-scale data processing systems
- Experience in Databricks - setting up workspaces, clusters, managing permissions, Unity Catalog
- Experience with ETL and data pipelines – designing, implementing, and optimizing ETL processes in cloud or on-premises environments
- Experience with cloud platforms - Azure
- Experience in Power BI - building and deploying reports, dashboards and datasets
- Basic understanding of the DAX language
- SQL proficiency – solid experience working with relational databases (e.g., PostgreSQL, MySQL, SQL Server)
- Understanding of data architecture – knowledge of building data infrastructure and integrating multiple data sources across the organization
- Analytical skills – ability to solve complex problems and optimize data processing workflows
- Collaboration and communication skills – ability to work effectively in a cross-functional team and communicate with various stakeholders
- Very good command of spoken and written English (min. B2)
What would be a plus:
- Higher education in computer science, mathematics, or a related field