
Data Engineer

Capital Federal, Buenos Aires, Argentina

Technology, Systems and Telecommunications / Technology / Systems

Full-time
Remote

Posted 2 months ago

Job description

We started by building an app to make saving and investing in cryptocurrencies easy and fast, and we became one of the leading platforms in the region. With a presence in Argentina, Peru, and Colombia, we are looking to expand regionally and into a variety of investment assets (stocks, bonds, ETFs, and more).

Join us! We are looking for engineers with different levels of expertise to be part of our amazing team.

The role

We are looking for our next Data Engineer to design, build, and maintain data pipelines; maintain and optimize the data infrastructure needed to extract, transform, and load data from a wide variety of sources; automate data workflows such as ingestion, aggregation, and ETL or ELT processing; and create, maintain, and deploy data products for the analytics and data science teams on cloud platforms.

We enjoy working in an environment where you can freely share your ideas and rely on the collaboration of all teams. We trust each other to speak openly and care about maintaining long-term relationships.

What you’ll do:

  • Design, build, and maintain data pipelines.
  • Maintain and optimize the necessary data infrastructure for accurate extraction, transformation, and loading of data from a wide variety of data sources.
  • Automate data workflows, such as data ingestion, aggregation, and ETL or ELT processing (see the sketch after this list).
  • Turn raw data in data warehouses into consumable datasets for technical and non-technical stakeholders.
  • Create, maintain, and deploy data products for analytics and data science teams on cloud platforms, preferably GCP and/or AWS.
  • Develop systems and architecture that support the different stages of the Machine Learning flow.
  • Ensure data integrity by promoting a data-driven culture across all squads.
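
For illustration only, here is a minimal sketch of the kind of extract/transform/load flow described above, written with just the Python standard library and a local SQLite file standing in for a real warehouse. The file names, table name, and schema are assumptions made for the example, not part of our actual stack.

import csv
import sqlite3
from pathlib import Path

# Illustrative stand-ins only: in practice the source would be an API or a
# production database, and the target a cloud warehouse rather than SQLite.
RAW_CSV = Path("raw_transactions.csv")   # hypothetical raw export
WAREHOUSE = Path("warehouse.db")         # local file standing in for a warehouse


def extract(path: Path) -> list[dict]:
    """Read raw rows from the source file."""
    with path.open(newline="") as fh:
        return list(csv.DictReader(fh))


def transform(rows: list[dict]) -> list[tuple]:
    """Coerce types and drop rows that fail basic validation."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append((row["user_id"], row["asset"], float(row["amount"])))
        except (KeyError, ValueError):
            continue  # a real pipeline would route bad rows to a dead-letter table
    return cleaned


def load(rows: list[tuple], db: Path) -> None:
    """Write the cleaned rows into a table analysts can query directly."""
    with sqlite3.connect(db) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS transactions "
            "(user_id TEXT, asset TEXT, amount REAL)"
        )
        conn.executemany("INSERT INTO transactions VALUES (?, ?, ?)", rows)


if __name__ == "__main__":
    load(transform(extract(RAW_CSV)), WAREHOUSE)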

Requirements

What you’ve done:

  • Knowledge of a Cloud platform, preferably GCP or AWS (in that order of preference).
  • Intermediate/advanced knowledge of Python.
  • Intermediate/advanced knowledge of SQL.
  • Experience managing and deploying an orchestrator, such as Dagster, Apache Airflow, or Prefect (a minimal sketch follows this list).
  • Excellent teamwork skills. Be humble and collaborative. A true Team Player!
  • Ability to listen to your stakeholders and translate their needs into requirements, executing them with your team.
  • Proactive and responsible work ethic.
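
As a rough illustration of the orchestration experience we have in mind, below is a minimal Apache Airflow DAG that schedules a daily extract/transform/load sequence. It is a sketch only: the dag_id, schedule, and task bodies are placeholders, and the same idea translates directly to Dagster or Prefect.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw data from the source system")


def transform():
    print("clean and aggregate the raw data")


def load():
    print("write the consumable dataset to the warehouse")


# dag_id, schedule, and the task bodies are placeholders for the example;
# `schedule` assumes Airflow 2.4+ (older 2.x versions use `schedule_interval`).
with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # run extract, then transform, then load
    extract_task >> transform_task >> load_task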

Nice to have:

  • Knowledge of DBT.
  • Focus on impact and a consistent track record of delivering results for users and the business.
  • Ability to think big and develop initiatives with real and measurable impact.

Benefits

  • Salaries in USD
  • 3 weeks of PTO per year
  • An awesome office in Palermo and also work-from-home friendly
  • Extra PTO week for holidays
  • Flex Friday
  • Work with the best tools and hardware in the industry
  • Excellent health insurance, with coverage for your family group
  • Free day for your birthday

Details

Minimum education level: University (in progress)
