Data Engineer

Triple Whale

Software Engineering, Data Science
Israel
Posted on Dec 27, 2022
By submitting, I acknowledge Triple Whale's Job Candidate Privacy Notice
What Do We Do?
Triple Whale centralizes and simplifies all the metrics used by Shopify sellers, enabling them to save more time and make more money. We help e-commerce businesses accurately track and forecast all kinds of metrics, including changes in profits and operational metrics. Our goal is to infuse AI into e-commerce, empowering shops to have an accurate picture of their business and make optimal choices that boost their bottom line. Launched in mid-2021, Triple Whale already has over 11,000 clients, top-tier investors, and growing hubs in Columbus, North Carolina, and Israel.
Triple Whale is looking for talented Data Engineers to join our growing team. This is a hybrid role, with 3 days a week in-office required (Jerusalem or Bnei Brak offices).
Our Data Engineers build and operate the data systems that deliver value to end users and internal users alike: expanding and optimizing our data pipelines and data services, ensuring data integrity, and driving a data-driven culture. You will also support our software developers, database architects, data analysts, and data scientists on data initiatives, and ensure that optimal data delivery architecture is applied consistently across ongoing projects. This is an amazing opportunity to get into Triple Whale on the ground floor and have a direct hand in designing the data architecture that supports our first generation of products and data initiatives.
Our ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. You must be self-directed, comfortable supporting the data needs of multiple teams, systems, and products, and at ease in a fast-paced, often pivoting environment.
Responsibilities
* Build and maintain our data repositories with timely, high-quality data
* Build and maintain data pipelines from internal databases and SaaS applications
* Create and maintain architecture and systems documentation
* Write maintainable, performant code
* Apply the DevOps, DataOps, and FinOps philosophies in everything you do
* Collaborate with Data Analysts and Data Scientists to drive efficiencies for their work
* Collaborate with other functions to ensure data needs are addressed
* Constantly search for automation opportunities
* Constantly improve product quality, security, and performance
* Continually keep up with advancements in data engineering practices
Qualifications
* At least 3 years of professional experience building and maintaining production data systems in cloud environments like GCP
* Professional experience with JavaScript and/or another modern programming language
* Deep, demonstrable understanding of SQL and analytical data warehouses
* Experience with NoSQL databases (e.g., Elasticsearch, MongoDB, Firestore, Bigtable)
* Hands-on experience with data pipeline tools (e.g., Dataflow, Airflow, dbt)
* Strong data modeling skills
* Experience with MLOps (an advantage)
* Familiarity with agile software development methodologies
* Ability to work 3 days a week in-office (Jerusalem or Bnei Brak)
Our Values
We Are Customer Obsessed: From our mission to every detailed project, everything we do is designed to create a positive impact for our customers.
We Move (Very!) Quickly: The speed at which we work, iterate, and deliver value is our most competitive advantage.
We Are Trustworthy: Candor, directness, and honest communication help us learn, grow, and improve so we can win together.
We Are Curious: We extend beyond our comfort zone and ask questions that guide us towards new, creative, and bold paths.
We Act Like A Mensch: We act with honor, integrity and empathy, and have deep respect for our customers and each other.