Our client is a software artisan passionate about helping companies build great solutions. With an agile process built on top of the best engineering practices, their team is composed of full-stack developers and architects who are well versed in the latest technologies and love what they do! They believe that transparent, honest, and fluent communication, both remote and on-site, is a key factor in the success of any project.
What we are looking for:
The ideal candidate is a seasoned engineer oriented toward the Data & Analytics ecosystem (ADF, Databricks, ADLS, Synapse, Python, etc.), with broad experience in enterprise infrastructure, both on-premise and cloud, and in working on agile teams that leverage the tools that enable success, such as CI/CD pipelines, while following best practices. We are currently evaluating professionals with mixed experience and seniority levels, and will consider mid-level, senior, and lead candidates.
Responsibilities:
- Develop ETL pipelines in Python and Azure Data Factory (Azure is preferred, but we also consider AWS experience).
- Software engineering and systems integration through REST APIs and other standard interfaces.
- Work with a team of professional engineers to automate processes, build and deploy infrastructure as code, and manage the architecture of multicloud systems.
- Participate in agile ceremonies, weekly demos, and the like.
- Communicate your daily commitments.
Requirements:
- Upper-intermediate or advanced English (spoken and written).
- 3+ years of relevant work experience.
- Proficient with Python, Java or .NET C#.
- Proficient with SQL.
- Experience with Agile ceremonies.
- Passionate about good engineering practices and testing.
- Ability to organize, prioritize and communicate daily/weekly goals.
- You like to learn, are curious and humble, and get things done.
- Passion for working in a customer-facing setting.
Nice to have:
- Proficient with ETL products (Spark, Databricks, Snowflake, Azure Data Factory, etc.)
- Proficient with Azure Data Factory.
- Proficient with Databricks/Snowflake and PySpark.
- Proficient developing DevOps/CI/CD pipelines.
- Proficient with Azure DevOps YAML Pipelines.
- Proficient with Azure cloud services: ARM templates, API management, App Service, VMs, AKS, ACR, Gateways.
- Terraform, Docker & Kubernetes.
Place of work: Remote from Argentina. Those living in Buenos Aires are expected to come to the office 1-2 times a month (Vicente López, Buenos Aires).
Contract type: Contractor. Full time (9:00 to 18:00, Argentina time).
Interview process:
- HR and technical questionnaire (60 min)
- Technical interview with live coding (90 min)
Benefits:
- Payment in USD
- Argentine holidays
- 2 weeks of vacation
- Annual training
- Performance bonus
Publication date: 23-10-2023