We are seeking an experienced professional who, in addition to the required mathematical and statistical expertise, possesses the natural curiosity and creative mind to ask questions, connect the dots, and uncover hidden opportunities, with the ultimate goal of realizing the data's full potential.
Roles and Responsibilities:
Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
Provide forward-thinking solutions in the data engineering and analytics space.
Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
Triage issues to find gaps in existing pipelines and fix them.
Work with the business to understand reporting-layer needs and develop data models to fulfill them.
Help junior team members resolve issues and technical challenges.
Drive technical discussions with client architects and team members.
Orchestrate data pipelines via the Airflow scheduler.
Skills and Qualifications:
Bachelor's and/or master's degree in Computer Science, or equivalent experience.
3+ years of experience in Data & Analytics, with good communication and presentation skills.
At least 2 years' experience with Databricks implementations, including two large-scale end-to-end data warehouse implementations.
Must be a Databricks Certified Architect.
Proficiency in SQL and experience with scripting languages (e.g., Python, Spark, PySpark) for data manipulation and automation.
Solid understanding of cloud platforms (AWS, Azure, GCP) and their integration with Databricks.
Familiarity with data governance and data management practices. Exposure to data sharing, Unity Catalog, dbt, replication tools, and performance tuning is an added advantage.
About Tredence:
Tredence focuses on last-mile delivery of powerful insights into profitable actions by uniting its strengths in business analytics, data science, and software engineering. The largest companies across industries are engaging with us and deploying their prediction and optimization solutions at scale. Headquartered in the San Francisco Bay Area, we serve clients in the US, Canada, Europe, and Southeast Asia. Tredence is an equal opportunity employer. We celebrate and support diversity and are committed to creating an inclusive environment for all employees.
Visit our website for more details: https://www.tredence.com/