Find The Perfect Job

Filter: Data Engineer
Showing 1-3 of 3 jobs

Data Engineer
Stargate
  • 2 - 3 yrs
  • 36,000 - 40,000 / month
  • Amroli
  • GCP, AWS, Python, Java, SQL
  • Full Time
  • Graduate
  • Work From Office

    Job Description:

    The Opportunity:

    Flywire is seeking a motivated and skilled Data Engineer to contribute to the development and optimisation of our data platforms and pipelines. In this role, you will work closely with the data engineering, analytics engineering, and business intelligence teams to build, maintain, and enhance the infrastructure that supports Flywire’s data needs. You will have the opportunity to work with modern data technologies in a cloud-based environment while contributing to projects that have a direct impact on the business.

    Key Responsibilities:

    • Assist in the design, development, and maintenance of scalable and efficient data pipelines and ETL/ELT processes.
    • Contribute to the optimisation of existing data workflows and applications for performance and resource efficiency.
    • Write, test, and deploy data transformation jobs using tools like dbt.
    • Work with streaming data frameworks and infrastructure to process real-time and near real-time data.
    • Support the development and maintenance of data models within our cloud data warehouse (BigQuery).
    • Collaborate with data scientists, BI developers, analytics engineers, and other data engineers to understand data requirements and deliver reliable data solutions.
    • Implement data quality checks and monitoring to ensure data accuracy and reliability.
    • Participate in code reviews and contribute to the team's engineering standards and best practices.
    • Troubleshoot and resolve issues related to data pipelines and data infrastructure.
    • Learn and apply new technologies and techniques in the data engineering field.

    Data Engineer
    Zenith Leap Solution Pvt. Ltd
  • 6 - 7 yrs
  • Not Mentioned
  • Bengaluru
  • Proficiency in Azure Databricks, Azure Data Factory, SQL, Azure DevOps, GitLab, and data visualization tools like Power BI
  • Full Time
  • Graduate
  • Work From Office

    Job Description:

    Egis

    Job Summary: The Data Engineer will be responsible for leveraging the data platform to create data products for the business. This role involves the development of data products, data pipelines, data transformation, data cleansing, data normalization, and deployment and support for various functions within EGIS.

    Technologies: Azure Data Factory, Databricks, SQL, Azure DevOps, GitLab, Power BI

    Key Responsibilities:

    • Design and implement Extract, Transform, Load (ETL) processes to move data from various sources (on-premises, cloud, and third-party APIs) to Azure data platforms.
    • Integrate data from diverse sources into Azure-based systems like Azure Data Lake and Azure SQL Database.
    • Use Azure Data Factory or Databricks orchestration tools to automate and schedule data pipelines and job workflows.
    • Design efficient data models (e.g., star or snowflake schema) for use in analytical applications and reporting systems.
    • Optimize SQL queries and scripts for performance, ensuring low-latency and efficient data processing.
    • Continuously monitor the health of pipelines, jobs, and infrastructure, ensuring they are running efficiently and securely.
    • Assist in building dashboards and reporting solutions using tools like Power BI, ensuring data is made available in a user-friendly format.
    • Address and resolve data issues, including failures in ETL pipelines, system performance problems, and data inconsistencies.
    • Ensure that all data engineering processes comply with industry security standards and best practices, including encryption, access controls, and data masking.

    Data Engineer
    Zenith Leap Solution Pvt. Ltd
  • 0 - 5 yrs
  • Not Mentioned
  • Mangalore
  • Databases, Programming, Cloud computing
  • Full Time
  • Graduate
  • Work From Office

    Job Description:

    Coresight Research

    We are seeking a talented and experienced Data Engineer to join the Coresight team. As a Data Engineer, you will play a crucial role in designing, developing, and maintaining the data infrastructure and systems that enable efficient and accurate data collection, storage, processing, and analysis. You will collaborate closely with data scientists, analysts, and other stakeholders to understand their requirements and implement robust solutions that address their data needs.

    Responsibilities:-
    • Data Pipeline Development: Design, develop, and maintain scalable and efficient data pipelines for collecting, processing, and storing large volumes of structured and unstructured data from various sources. Ensure data quality, reliability, and integrity throughout the pipeline.
    • Data Warehouse and Storage Management: Architect, implement, and manage data warehousing and storage solutions, including data lakes, data marts, and relational databases. Optimize data storage and retrieval mechanisms to support high-performance analytics and reporting.
    • Data Transformation and ETL: Transform raw data into usable formats by implementing Extract, Transform, Load (ETL) processes. Cleanse, filter, and aggregate data to ensure consistency and accuracy. Develop and maintain ETL workflows and schedules.
    • Data Modeling: Design and implement data models that support efficient data retrieval, analysis, and reporting. Collaborate with data scientists and analysts to understand their modeling requirements and provide them with structured datasets for analysis.
    • Web scraping: Design and deploy web scraping solutions to collect structured and unstructured data from websites, APIs, and other online repositories. Optimize web scraping processes by implementing efficient scraping strategies, managing API rate limits, handling dynamic content, and overcoming anti-scraping measures. Collaborate with cross-functional teams to identify relevant data sources, define data requirements, and establish scraping methodologies to acquire data in a reliable and automated manner.
    • Data Governance and Security: Establish data governance processes, policies, and standards to ensure data privacy, security, and compliance. Implement appropriate access controls and data protection measures. Monitor and address data quality issues.

    Requirements:
    • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
    • Proven experience as a Data Engineer or in a similar role, with a focus on data pipeline development, data warehousing, and ETL processes.
    • Strong programming skills in languages such as Python and SQL.
    • Proficiency in working with databases (SQL and NoSQL) and data warehousing technologies (e.g., Snowflake).
    • Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data-related services (e.g., S3, EC2, BigQuery).
    • Experience with data integration and ETL tools (e.g., Apache Airflow, Informatica) is a plus.
    • Solid understanding of data modeling concepts and database design principles.
    • Strong problem-solving skills and the ability to analyze complex data-related issues.

    Powered by XEAM Ventures Private Limited