
Data Engineer

The Value Maximizer
California, United States
Full-time
Tools:
Spark
Kafka
Snowflake
Databricks
Applications go directly to the hiring team

Full Description

This role focuses on enabling front-office, advisor, and trading operations through low-latency data pipelines, scalable architectures, and governed data platforms. You will work closely with trading desks, portfolio management, and digital platforms to deliver reliable, compliant, and high-throughput data solutions.

Key Responsibilities

Trading Data Platform Engineering

* Design and build real-time and batch data pipelines supporting trading workflows (orders, executions, positions, market data)

* Develop low-latency data processing systems for near real-time decisioning

* Build scalable data architectures for high-volume transaction data

* Enable event-driven architectures using streaming platforms (Kafka, Kinesis)

Wealth Management & Trading Integration

* Integrate with trading platforms (OMS/EMS), portfolio systems, and advisor platforms

* Support use cases such as:

* Trade lifecycle tracking (order → execution → settlement)

* Portfolio performance and analytics

* Advisor dashboards and client reporting

* Ensure data consistency across front-, middle-, and back-office systems

Data Engineering & Architecture

* Build and manage data lakes / lakehouse architectures (Delta Lake, Iceberg, etc.)

* Develop ETL/ELT pipelines using modern frameworks

* Design data models optimized for trading and analytics workloads

* Implement API-driven data access layers for downstream consumption

Performance, Scalability & Reliability

* Optimize pipelines for low latency, high throughput, and fault tolerance

* Implement data quality, reconciliation, and observability frameworks

* Ensure high availability and disaster recovery for critical trading data systems

Governance, Risk & Compliance

* Implement data governance, lineage, and auditability

* Ensure compliance with regulatory requirements (SEC, FINRA, etc.)

* Enable data security, entitlements, and access controls

* Support trade surveillance and reporting requirements

Collaboration & Delivery

* Partner with trading desks, product teams, and architects to translate requirements into scalable data solutions

* Work closely with AI/analytics teams to enable downstream insights and models

* Mentor junior engineers and contribute to data engineering best practices

Required Qualifications

* 7-12+ years of experience in data engineering or backend engineering

* Strong expertise in:

* Python / Scala / Java

* SQL and distributed data processing (Spark, Flink, etc.)

* Hands-on experience with:

* Streaming platforms (Kafka, Kinesis, Pulsar)

* Data lake / warehouse technologies (Snowflake, Databricks, Redshift)

* Experience building real-time or near real-time data pipelines

* Strong understanding of data modeling and large-scale distributed systems

Preferred Qualifications

* Experience in Wealth Management or Capital Markets trading systems

* Familiarity with OMS/EMS platforms (e.g., Charles River Development, Aladdin, FIS)

* Knowledge of market data (equities, fixed income, derivatives) and trade lifecycle / post-trade processing

* Experience with cloud-native data platforms (AWS, Azure, GCP)

* Exposure to real-time analytics and risk systems
