Big Data Engineer

Praha
Development
Automotive
Financial Services
Technology: AWS, Azure, Cloud, Data warehouse, Java, Kafka, Linux, Python, SQL
Contract type: Full-time, Contract
Seniority: Medior, Senior

Job description

Trask is a company with a turnover exceeding one billion and a team of hundreds of experts who have been introducing the latest technologies and innovations to the Czech and European markets for over a quarter of a century.

We boast the largest integration department in Central Europe, specializing in intelligent automation, cloud solutions, and API economy. Leveraging artificial intelligence, we optimize production processes in automotive plants, digitize companies, and implement paperless loan systems. This is just a glimpse into our diverse portfolio. Join us to discover more – we are currently seeking a Big Data Engineer.


  • Manage data for clients in real-time, providing materials for regular reporting
  • Create and implement solutions such as Data Warehouse, Data Lake, Data Lakehouse, and Operational Data Store
  • Prepare data sources for integration using various ETL tools and APIs
  • Generate reports and dashboards for managerial decision-making
  • Visualize data

Required qualifications

  • Knowledge of SQL and data formats such as JSON, XML, CSV, and AVRO
  • Experience with at least one cloud platform (Azure, AWS), including access rights setup, security, and data transfers; infrastructure as code is a plus
  • Experience with at least one data platform (Cloudera, Snowflake, Databricks, Dremio)
  • Experience with different data transfer modes (stream, batch, event-based)
  • Proficiency in at least one programming language (Java, Scala, or Python)
  • Version control (Git) and CI/CD
  • Linux (shell scripting, administration experience)
  • Experience with data integration tools is an advantage
  • Familiarity with a streaming platform (Kafka, Flink) and reporting tools (Grafana, Power BI, Tableau) is a plus
  • English proficiency at a communicative level

What you may encounter with us:

  • Kafka, Flink, NiFi
  • Airflow, Spark, Talend
  • Databricks, Snowflake, dbt
  • Git
  • AWS, Azure
  • Cloudera, HBase, PostgreSQL
  • Swagger

What we offer

  • Customized working hours: Flexible scheduling tailored to your individual needs.
  • Vacation and time off: Guaranteed 5 weeks of vacation and 3 sick days.
  • Flexible work environment: Ability to work from home, company offices, or directly at client sites.
  • Education: Training, conference attendance, e-learning programs, and language courses.
  • On-site refreshments: Diverse selection including cookies, fruit, coffee, and hot chocolate.
  • Seamless mobile communication: 50 GB of mobile data + unlimited company-paid calling. Discounted O2 Family tariffs for family members.
  • Choice of benefits: MultiSport card, pension insurance contribution, discounts (Alza, pharmacies, tickets, and other experiences).
  • Additional employee perks: Access to employee loans, discounts with business partners, opportunities for volunteer work, and participation in team-building activities. 


We design and deliver cutting-edge IT projects for enterprise innovators. Any size. End to end.

Your Future Manager: Tomis Martin, Director
Department: Data Warehouses | Data Science & Customer Intelligence

Interested in this position?

Apply now

Our 5-stage recruitment process

Send us your CV or LinkedIn profile, and by the next business day, you’ll hear from us. Glide through the first interview and your future manager will be keen to meet you for a follow-up chat. Impress there, and an offer will be on the table before you know it. Yes, it's as straightforward as it sounds.

If your role is technical, we like to ensure a perfect fit with a simple test task. Consider it a sneak peek into the exciting challenges you'll tackle with us.

1. Submit your resume or LinkedIn profile, so we can meet you.
2. Begin our conversation with a call or e-mail.
3. First interview to discuss your potential and role.
4. Second interview with a test task.
5. If everything clicks, we give you a job offer.
