Senior Data Engineer

Full Time
Brooklyn, NY
Job description
Job Title: Senior Data Engineer

Job Location: Brooklyn, NY (remote is an option, with onsite visits for critical meetings as needed)

Job Type: 12 Month Contract

Work Shift: Normal business hours, Monday-Friday, 35 hours/week (not including the mandatory unpaid meal break after 6 hours of work).

Pay Rate: $170/hr

The Senior Data Engineer will assist the Application Engineering department in building robust, secure, and modern data pipelines to ingest, process, and transform MyCity application data using Informatica Intelligent Cloud Services or a comparable ETL/ELT tool, employing modern data movement strategies and methodologies. The data will be stored in an Azure Data Lake and eventually loaded into a cloud-based data store such as Snowflake. An analytics and reporting solution will then be built on top of it using cloud-based Google Looker, Microsoft Power BI, or a similar tool.

Using the Cloud OTI Data Platform, the Senior Data Engineer will build highly available, robust data pipelines and reporting solutions following industry best practices and adhering to OTI security guidelines. This includes modifying and running the automated CI/CD pipelines used for releasing code, and modifying and executing Terraform modules for deploying infrastructure components, working collaboratively under the direction of OTI data engineering management and leads. The resource will be a person of integrity who is dependable and fully focused on delivering optimal solutions with little to no maintenance and operations overhead.

Responsibilities

  • Be experienced in Data Engineering best practices, technologies, tools and processes.
  • Bring sound knowledge of Data Warehouses and LakeHouse concepts and practical implementation experience.
  • Build a framework of repeatable solutions and playbooks enabling efficient and predictable data pipelines.
  • Have hands-on development experience implementing an agile, cloud-centric data warehousing and reporting platform with team members of various experience levels.
  • Interact with clients, both technical and non-technical stakeholders.
  • Manage relationships with end users. Interact with them regularly to gather feedback, listen to their issues and concerns, and recommend solutions.
  • Meet critical deadlines and deliver in short sprints.
  • Ensure successful delivery of new reports and dashboards as needed.
  • Maintain and curate data documentation including Architectural Decision Records (ADR), how-to guides, data lineage and ownership using Azure DevOps or similar tool.
  • Monitor and tune query performance to ensure cost optimization.
  • Participate in joint application development sessions with co-engineers and end users and be willing to brainstorm.
  • Complete technical documentation and be willing to transfer knowledge as needed.
MANDATORY SKILLS/EXPERIENCE

Note: Candidates who do not have the mandatory skills will not be considered

  • 12+ years developing Data Pipelines / Flows using ETL/ELT tools and technologies.
  • 10+ years of strong SQL fluency (query optimization, windowing functions, aggregation, etc.).
  • 5+ years building complex Analytics and Reporting solutions.
  • 3+ years of experience with a cloud data lake/warehouse solution (Snowflake, Redshift, BigQuery, etc.).
  • Hands-on experience with data integration tools such as Informatica Intelligent Cloud Services, Informatica PowerCenter, SSIS, or a similar tool.
  • Extensive experience developing production grade, large scale data solutions.
  • Experience performing conceptual, logical, and physical data modeling using data modeling tools in complex, large-scale environments.
  • Experience working with Microsoft Azure cloud computing platform and services.
  • Experience managing data orchestration at scale using tools such as Airflow, Prefect, or Dagster.
  • Experience with traditional RDBMS platforms (Oracle and SQL Server).
  • Experience working with version control systems (e.g., Git).
  • Good understanding of CI/CD principles.
  • Experience developing dashboards and reports in applications such as Oracle Analytics Server (OAS), Microsoft Power BI, or Google Looker.
DESIRABLE SKILLS/EXPERIENCE:
  • Experience using Azure services such as Blob Storage, Data Lake Storage, Databricks, Data Factory, and security services.
  • Programming experience with Python or Java.
  • Experience with Azure monitoring services.
  • Microsoft Certified: Azure Solutions Architect Expert, a SnowPro certification, or a similar credential.


