Data Engineer, DTG

Job summary
Permanent contract
Prague
Salary: Not specified
No remote work
Skills & expertise
Java, Kubernetes, Scala, Terraform (+5 more)

Pure Storage

Job description

SHOULD YOU ACCEPT THIS CHALLENGE…

You will be responsible for the design, development, and expansion of data assets that support business intelligence, advanced analytics, and reporting. You will also develop, maintain, and support the enterprise data warehouse system, corresponding data marts, and data interfaces for use in reporting, analytics, and various applications. You will work closely with business users as well as engineers from various teams to ensure their data scenarios are implemented using best practices.

This role offers a unique opportunity to merge technical expertise with creative problem-solving, making a significant impact on how our company harnesses the power of data. You'll be instrumental in developing the data infrastructure necessary to manage data at scale.

 

IN THIS ROLE YOU WILL...

  • Collaborate with engineers, product managers, and data scientists to understand data needs, representing key data insights in a meaningful way.
  • Communicate with internal team members and stakeholders in other time zones to clarify requirements and report on progress.
  • Develop solutions to manage various data pipelines and transformations.
  • Independently design, build, and launch new data extraction, transformation, and loading processes in production, mentoring others on writing efficient queries.
  • Support data governance and metadata management.
  • Data safety and a keen understanding of the data management life cycle are critical for this role.
  • Crisp communication and the ability to influence decisions are also important.
  • Program management capabilities will be needed to deliver successfully in this role.

WHAT YOU’LL NEED TO BRING TO THIS ROLE...

  • 3+ years of experience with SQL, ETL, data modelling, and at least one programming language (e.g., Python, Java, C#, or Scala).
  • Hands-on experience with data quality management and analysis, data transformations, and related capabilities.
  • Experience with data pipeline and workflow management tools such as Airflow or Prefect (see the illustrative sketch after this list).
  • Nice to have: Experience with stream-processing systems such as Spark Streaming or Kafka.
  • Nice to have: Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets.
  • Nice to have: Infrastructure management (AWS, Kubernetes, Terraform); DataOps principles; MLOps and preparing ML-ready datasets.
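
For a concrete sense of the workflow tooling mentioned above, here is a minimal, illustrative Airflow sketch of a daily extract-transform-load DAG. It is only a sketch under assumed names: the DAG, task, and table names are hypothetical and do not describe Pure Storage's actual pipelines.

# Illustrative only: a minimal Airflow DAG sketching a daily ETL flow.
# All DAG, task, and table names below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source system (placeholder data).
    return [{"order_id": 1, "amount": 120.0}]


def transform(**context):
    # Enrich the extracted records with a derived column.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "amount_with_vat": row["amount"] * 1.21} for row in rows]


def load(**context):
    # Load the transformed rows into a (hypothetical) reporting table.
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loading {len(rows)} rows into analytics.orders_mart")


with DAG(
    dag_id="orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task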

We are primarily an in-office environment; you will therefore be expected to work from the Prague office in compliance with Pure’s policies, unless you are on PTO, work travel, or other approved leave.

Salary ranges are determined based on role, level and location. For positions open to candidates in multiple geographical locations, the base salary range is reflective of the labor market across the applicable locations. 

This role may be eligible for incentive pay and/or equity. 

And because we understand the value of bringing your full and best self to work, we offer a variety of perks to manage a healthy balance, including flexible time off, wellness resources, and company-sponsored team events - check out purebenefits.com for more information. 

#LI-ONSITE
