Python Data Engineer at ClickIT Smart Technologies

Closed job - No longer receiving applicants

This job is posted on behalf of ClickIT Smart Technologies.
ClickIT: DevOps and Software Development is a Cloud and DevOps agency that has developed cloud-based solutions for almost 10 years for customers from all around the world. Our core competencies are Financial Services, Healthcare, MarTech, Ecommerce, and Big Data & Analytics, and our experience spans startups and mid-to-large enterprises. We are AWS and GCP certified partners with experience helping more than 200 product- and service-centric companies based in the US with their cloud migration and DevOps initiatives. This position is to be part of our team.
ClickIT is looking for a Data + Python champion to join our software team and help one of our customers create complex integrations for consumer sites and browser applications. Our customer offers a privacy-safe consumer search marketplace and requires the support of proven data engineers who can innovate and boost performance on current and new integrations.

Job responsibilities

  • Develop and maintain ad-tech related data systems and system components using Python/Java/Scala and other software technologies:
      • ETL/ELT pipelines
      • Data-driven applications
      • Client integration automation components (Google, Amazon, LiveRamp)
  • Work with the team on the technical implementation of Data Engineering projects.
  • Work closely with QA, DevOps, and other engineering teams to integrate, test and release different system components (all stages of SDLC).

Job requirements

  • BS in Engineering, Computer Science or related discipline
  • Must have 3+ years of strong hands-on development experience using Python
  • Experience developing, delivering and maintaining scalable data aggregation pipelines
  • Experience with relational SQL and NoSQL databases
  • Experience with AWS/GCP/Azure
  • Experience working in a Linux environment / shell scripting
  • Experience with data pipeline and workflow management tools like Airflow, Luigi, Azkaban
  • Ability to collaborate with multiple teams and understand testing and deployment methodologies
  • Good understanding of design patterns and effective use of data structures
  • Experience building software in an automated, continuous integration and delivery fashion
  • Experience working in an agile software development team using Jira
  • Excellent documentation, communication, and troubleshooting skills

Desirable skills
  • Experience with big data and streaming technologies like Apache Kafka and Apache Spark, and data warehouses (Redshift/Snowflake/BigQuery, etc.)
  • Familiarity with Docker, CI/CD and Microservices
  • Java/ Scala development experience
  • Digital advertising industry experience

Benefits
Fully Remote
Wellness Benefit
Referral Program
Learning support, including paid online courses, training, and certifications for your professional growth

Fully remote: You can work from anywhere in the world.
Flexible hours: Flexible schedule and freedom to attend to family needs or personal errands.

Best Place to Code: This company is a Best Place to Code.

Best Place to Code acknowledges the companies that strive to offer the best possible workplace to software and technology employees.

