Description
Our client is on the lookout for a dedicated Data Engineer to drive the development of data processing and transformation workflows. The ideal candidate should possess an analytical mindset, strong technical expertise, and a passion for solving complex data challenges. This role involves working closely with cross-functional teams to ensure data integrity and availability, contributing significantly to the core of the company's data-driven initiatives.
The position requires advanced Python skills, including expertise in object-oriented programming, as well as experience or a strong interest in AI development. The ideal candidate should have a deep understanding of data engineering principles while also being open to leveraging AI-driven solutions to optimize data workflows and infrastructure.
Responsibilities:
- Design, build, and maintain scalable data pipelines and ETL processes to handle large volumes of data efficiently.
- Implement and manage robust data architectures that adhere to best practices for data modeling, security, and governance.
- Work alongside data scientists to create and manage machine learning and data processing frameworks.
- Optimize data retrieval, ensuring high performance and reliability across data systems.
- Utilize cloud platforms (AWS) to develop data storage and processing solutions.
- Monitor data operations to troubleshoot and resolve issues effectively and prevent future occurrences.
- Collaborate with cross-functional teams to integrate various data sources and ensure seamless data workflow.
Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Upper-intermediate or advanced English level (written and spoken).
- 7+ years of experience in data engineering roles.
- Strong Python proficiency, with solid scripting and automation skills.
- Experience with Big Data technologies such as Apache Spark, Hadoop, or similar frameworks.
- Familiarity with data modeling concepts and tools (e.g., ERD, UML).
- Hands-on experience with ETL tools and data integration techniques.
- Solid understanding of relational (SQL Server, PostgreSQL) and NoSQL databases (MongoDB, Cassandra).
- Strong experience with cloud technologies, including AWS, Terraform, Looker, and Snowflake.
- Experience with version control systems such as Git.
Preferred Qualifications:
- Experience with data mining and machine learning algorithms.
- Knowledge of containerization technologies such as Docker and orchestration platforms like Kubernetes.
- Background in financial or travel industries is a plus.
- Strong problem-solving skills and ability to implement solutions independently.
- Familiarity with Agile methodologies and team collaboration tools.
Place of work: Remote from LatAm.
Contract Type: Full-time contractor, long-term. Working hours aligned with EST and MST time zones.
Selection process:
1) Technical Interview (60')
2) HR Interview (30')
3) Technical Interview with client (60')
Payment in USD
Benefits
- 10 PTO days (8 + Christmas and New Year)
#DataEngineer #Python #AWS #ETL #BigData #EnglishFluency #ITJobs #SoftwareDevelopment #CareerGrowth
Publication Date: 10-03-2025