Senior Data Engineer
Brivo Systems LLC
Brivo is an Equal Opportunity/Affirmative Action Employer - Minorities/Females/Protected Veterans/Disabled
About the Role
As the Senior Data Engineer, you will work with modern cloud tooling (Kubernetes, serverless stacks, Snowflake, Airflow, RDS, Kafka, EMR with Python/Spark) in a dual-cloud (AWS and Azure) deployment across multiple agile development teams. In both an engineering and an administrative capacity, you will implement and evangelize best practices for data management, maintaining and optimizing the core datasets and pipelines that drive Brivo's vision and advance the organization's data strategy.
You will be a key member of the team responsible for driving the company's Data Strategy through ownership of the underlying data platform supporting the company's SaaS IoT Security products, with a strong focus on maintaining reliability and resilience. In this role you will also be expected to:
- Create, manage, monitor and maintain data pipelines across multiple applications and environments.
- Proactively monitor, tune, and report on the performance of the platform: its databases, tooling, and the infrastructure on which they run.
- Build expert-level knowledge of the applications and their underlying data, and provide support (24/7 team operating model) for production database environments to ensure the highest standards of availability, resilience, integrity, security, and performance required by our business systems.
- Collaborate with Development and QA teams to provide best practices, guidance, and insight into data management and operations, and to support the development, testing, tuning, and deployment of applications.
- Strategize and implement the next generation of the data platform with a focus on building analytical datasets to facilitate BI and ML applications.
- Improve existing processes by identifying opportunities for automation and implementing it wherever possible.
- Manage, maintain, and monitor disaster recovery strategies and security controls in accordance with company policies, procedures, and processes.
- Design, develop, and maintain appropriate documentation for the data platform and its processes and procedures.
About You
We are seeking a self-starting, ambitious, collaborative, and seasoned data professional with 4+ years of experience developing solutions that leverage: RDBMS (preferably Postgres); NoSQL databases such as DynamoDB or Cassandra; data pipelines in Python and Spark; cloud warehousing tools such as Snowflake; and data streaming technologies such as Kafka and Kinesis. You should have deep knowledge of and passion for data management and data engineering best practices, with a particular focus on reliability, resilience, and scalability, and a drive to understand systems holistically.
- Deep understanding of RDBMS technologies, specifically as they pertain to monitoring and tuning performance, including the use of logging and monitoring tools such as CloudWatch, Datadog, ELK, or Splunk.
- An informed perspective on the challenges of scaling big data systems to support multi-national implementations.
- Experience with data warehousing concepts including dimensional and fact modeling and strategies for building data lakes.
- Familiarity with leveraging the core AWS building blocks (e.g., RDS, Lambda, EMR, S3, IAM) to properly build and secure data for SaaS applications.
- Experience building and orchestrating data/ETL pipelines, preferably leveraging a mix of tooling including Airflow, Python, Spark, and data replication tools.
- Ability to analyze, diagnose, and tune database and query performance at all associated layers (database, network, server, disk).
- Working experience managing database schema versioning with automated tools such as MyBatis, Flyway, or Liquibase.
- Knowledge of development best practices
- If you read this far, add the word "pepper" to your resume when you submit.
- 3+ years of development experience writing, maintaining, and documenting code in shell, Python, or other scripting languages, with an understanding of development best practices, programming languages (esp. Java, JavaScript, Python, or PHP), and their associated ORM engines (e.g., Hibernate, SQLAlchemy, Mongoose).
- Knowledge of or experience with DataOps is a plus!