Marketo is truly at the forefront of the rapidly evolving and competitive landscape of Marketing Technology and experiencing hyper-growth. We have the most inspired, customer-obsessed people supercharging a nation of empowered marketers with inspiration, education and an unrivaled engagement platform. Our people are fueled with a passion for innovation, competition, and a relentless commitment to making the Marketer successful. That fire is what makes Marketo an amazing place to work. Headquartered in San Mateo, CA, Marketo serves customers all over the world and has offices in Denver, CO; Portland, OR; Atlanta, GA; Seattle, WA; Dublin, Ireland; London, UK; Tel Aviv; Sydney, Australia; and Tokyo, Japan.
You’ll lead data pipeline development for Analytics at Marketo, which processes petabytes of data and serves thousands of customers.
You will be the key person developing all backend data pipelines, both batch and streaming, and will grow into the go-to engineer for all of Analytics’ backend needs.
We are looking for a data geek with a technical background in the analytics space and a strong focus on delivery; a passionate data engineer with expertise in developing data pipelines to support data warehouses, data lakes, and business intelligence in a SaaS environment. Experience building pipelines that support AI/ML products is a big plus.
If you’re passionate about innovative technologies in the data analytics, business intelligence, and artificial intelligence domains (data lakes, big data, Spark, Scala, Kafka, Dataproc, Airflow, Snowflake, Looker, TensorFlow) and thrive on delivering tightly scheduled roadmaps, Marketo is the place for you!
- Develop data pipelines, both batch and streaming, to support data warehouse, business intelligence, and AI/ML products.
- Contribute to multiple projects at the same time and deliver on schedule in an Agile environment.
- Work closely with architects, data scientists, engineers, and product managers to understand product and engineering requirements and ensure delivery.
- Learn new technologies and technical solutions in support of the Analytics and Artificial Intelligence product roadmap.
- Mentor junior team members and, at times, drive deliverables to completion.
- 5+ years of experience (including big data, distributed data processing, business intelligence and data warehouse) in a SaaS environment.
- 3+ years of experience building data pipelines (both batch and streaming).
- 3+ years of experience working in an Agile/Scrum setup.
- Experience developing distributed applications and APIs in a SaaS environment.
- Great communication skills and the ability to lead junior team members.
- Solid experience with SQL and NoSQL database technologies.
- Proficiency in Java/Scala, Spark, and shell scripting.
- Experience building pipelines on either the GCP or AWS stack.
- Understanding of the GCP stack, including Cloud SQL, Dataproc, Spanner, BigQuery, Dataflow, and TensorFlow.
- Experience building traditional data warehouses using ELT/ETL tools.
- Exposure to UI technologies: Node.js, React, Redux.
- Experience delivering analytics product offerings on a multi-tenant architecture.
- Experience with Looker and Snowflake.
- Experience building data lakes.
- Experience with ETL frameworks, data warehouse, and BI products (e.g., Talend, Pentaho BI).
Marketo is an equal opportunity employer.