
Data Engineer w/ Snowflake Experience

We are looking for an English-speaking Data Engineer with Snowflake experience to collaborate closely with an international client on a top-tier, unique product challenge! We are in the early stages of a Snowflake cloud data warehouse implementation. As such, you’ll be responsible for general ETL development, implementing new solutions, and working with modern warehousing and integration technologies. This role requires an in-depth understanding of cloud data integration tools and cloud data warehousing (Snowflake experience is critical). Most importantly, we need a strong engineer with the ability to lead and deliver, helping to drive tangible and effective results. We’re growing fast, so this could be the perfect opportunity for someone looking to magnify their impact!

As a Data Engineer, you’ll:

    Work in Snowflake to...
  • Read, extract, transform, stage and load data to selected tools and frameworks as required and requested.
  • Perform tasks such as writing scripts, web scraping, calling APIs, and writing SQL queries.
  • Work closely with the engineering team to integrate your work into production systems.
  • Process unstructured data into a form suitable for analysis.
  • Integrate up-and-coming data management and software engineering technologies into existing data structures.
  • Develop set processes for data mining, data modeling, and data production.
  • Create custom software components and analytics applications.
  • Research new uses for existing data.
  • Employ an array of technological languages and tools to connect systems together.
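To give a flavor of the day-to-day work above, here is a minimal, hypothetical Python sketch of one such task: turning unstructured JSON records into structured rows ready for staging and bulk loading into a warehouse. The field names and schema are invented for illustration only.

```python
import csv
import io
import json

def transform_records(raw_lines):
    """Turn unstructured JSON lines into structured rows for staging.

    Malformed or incomplete records are skipped rather than loaded.
    """
    rows = []
    for line in raw_lines:
        try:
            record = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip malformed input
        if "id" not in record or "event" not in record:
            continue  # skip records missing required fields
        rows.append({
            "id": record["id"],
            "event": record["event"],
            # flatten an optional nested field into a plain column
            "country": record.get("user", {}).get("country", "unknown"),
        })
    return rows

def to_staging_csv(rows):
    """Serialize structured rows as CSV, a format a warehouse
    bulk-load command (e.g. Snowflake's COPY INTO) can ingest."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "event", "country"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

raw = [
    '{"id": 1, "event": "signup", "user": {"country": "CO"}}',
    'not json at all',
    '{"id": 2, "event": "login"}',
]
print(to_staging_csv(transform_records(raw)))
```

In a real pipeline the resulting file would be uploaded to a stage and loaded with a warehouse command rather than printed, but the extract-transform-stage shape is the same.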

What you’ll need to succeed:  

  • Experience building Data Pipelines from the ground up
  • Experience in some of the following tools: AWS, Snowflake (required), Apache Spark, EMR, Hadoop, Kafka, KStream, Data as a service (API)
  • Proven experience with Kafka - Producer, Consumer, and Streams API
  • Knowledge in KStream, KSQL, Schema registry
  • Proficiency in transferring unstructured data into structured data
  • Ability to build and maintain optimal data pipelines
  • Some experience in the following is preferred, but not required: AWS, Python, Scala, SQL, Kafka/Kinesis, Spark
  • Familiarity with CI/CD processes

As part of our team, you’ll:   

  • Have access to a broad spectrum of learning possibilities because of our ability to connect data engineering to the whole development lifecycle.
  • Bring your expertise to all areas of the team and the company, not work in a siloed area.
  • Work on a constantly evolving service with people that want to shape and implement that vision, creating an open set of challenges.
  • Collaborate with international clients to define processes, cultural shifts, and necessary technologies in order to maximize business value.
  • Work in an agile environment, implementing and following engineering best practices in different projects.
  • Constantly learn from very talented people and participate in courses and training activities; you’ll be joining a global company of more than 4,500 colleagues.
  • Positively impact the communities where we operate through a variety of social impact programs.

PSL operates at the intersection of drive, quality, and innovation, providing advanced software engineering services at the speed of nearshore. With more than 600 professionals operating out of offices in Medellín, Bogotá, and Cali, Colombia, PSL is part of Perficient’s optimized global delivery approach. We put people first, which means giving our teams the space and the resources they need to discover and innovate, supporting them in their development endeavors and career growth, and providing unique and top-tier challenges collaborating with international clients. We’re growing fast and we’re looking for ambitious, collaborative problem solvers who want to help us drive that global growth!