PearlSoft Technologies

ETL/DATA ENGINEER

Experience: 4+ years

Location: Infopark, Cochin

Job Description

Key Responsibilities 

  • Design, develop, and maintain ETL pipelines to extract, transform, and load data from various sources into data warehouses or data lakes. 
  • Work closely with business and analytics teams to understand data requirements and translate them into scalable data solutions. 
  • Optimize existing ETL workflows for performance, scalability, and reliability. 
  • Manage data integrations from APIs, flat files, cloud storage, and third-party systems. 
  • Build automated data validation, monitoring, and error-handling mechanisms. 
  • Ensure data quality, consistency, and integrity across platforms. 
  • Troubleshoot and resolve data loading, processing, and pipeline failures. 
  • Collaborate with database administrators, BI developers, and data scientists to support data initiatives. 
  • Create and maintain comprehensive technical documentation for ETL processes, data flows, and data models. 

Required Skills 

  • 4+ years of experience in ETL development or data engineering. 
  • Strong expertise in SQL, including complex joins, window functions, and performance tuning. 
  • Hands-on experience with ETL tools such as SSIS, Talend, Azure Data Factory, or similar platforms. 
  • Solid understanding of data warehousing concepts, including star schemas, slowly changing dimensions (SCDs), normalization, and fact/dimension modeling. 
  • Knowledge of cloud data technologies on Azure, AWS, or GCP. 
  • Experience working with relational and NoSQL databases (e.g., SQL Server, Oracle, Snowflake, MongoDB). 
  • Familiarity with Python or PySpark for data transformation and automation. 
  • Understanding of version control systems (Git) and CI/CD pipelines. 
  • Hands-on experience with API integrations and working with JSON/XML formats. 

Preferred Qualifications 

  • Knowledge of data modeling tools, metadata management, and data governance practices. 
  • Experience with big data frameworks such as Spark or Hadoop. 
  • Exposure to monitoring and logging tools such as CloudWatch, Azure Monitor, or Grafana. 
  • Familiarity with BI tools such as Power BI, Tableau, or QuickSight.