Website: CobbleWeb
Data Engineer
The golden thread linking each step in our process is data. Without it, we cannot help our clients make informed decisions about their target audience, marketing channels, product features, and much more.
That's where you come in. We are looking for an experienced Data Engineer who will help us create and manage appropriate metrics models for our clients' marketplace projects. That includes collaborating with our Business Analyst to identify the right metrics for each project, then collecting, managing, and converting raw data into useful information.
Our ideal candidate understands that the metrics models we build during the Discovery phase of each project go beyond describing what users are doing; they aim to uncover the fundamental reasons why. Your mission is to help our clients understand their business in a way that constantly evolves their thinking and their products towards their ultimate vision.
Your metrics models will support our growth-hacking efforts by finding the best ways to acquire, activate, retain, and convert our clients' user bases. You will use the Pirate Metrics (AARRR) model to measure and analyse our clients' websites and mobile apps, helping us adjust whatever is necessary to improve performance. You are comfortable building and managing data pipelines for technical metrics (tracking whether the product is working as expected and quickly identifying technical problems), as well as UX/UI metrics that help us increase audience engagement.
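As a rough, hypothetical illustration of the Pirate Metrics (AARRR) funnel mentioned above, a minimal sketch in Python might compute stage-to-stage conversion rates; the stage counts here are invented, not real client data:

```python
# Pirate Metrics (AARRR) stages, in funnel order.
AARRR_STAGES = ["acquisition", "activation", "retention", "referral", "revenue"]

def funnel_conversion(counts):
    """Return stage-to-stage conversion rates for an AARRR funnel.

    `counts` maps each stage name to the number of users who reached it.
    """
    rates = {}
    for prev, cur in zip(AARRR_STAGES, AARRR_STAGES[1:]):
        rates[cur] = counts[cur] / counts[prev] if counts[prev] else 0.0
    return rates

# Illustrative counts only.
counts = {"acquisition": 1000, "activation": 400, "retention": 180,
          "referral": 45, "revenue": 30}
print(funnel_conversion(counts))
# → {'activation': 0.4, 'retention': 0.45, 'referral': 0.25, 'revenue': 0.666...}
```

A drop in one of these rates points at the stage where the client's product or marketing needs adjustment.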
Current projects that you can expect to work on include Nestify, a fast-growing property management platform. We have been asked to implement performance tracking for their employees (via admin and employee dashboards) and identify new business opportunities (cities to focus on, optimal pricing, etc.).
You will also help us build CobbleWeb's internal communication system and knowledge base, known as Umy. This set of internal tools will support our globally distributed company structure.
Job Responsibilities
Design, deliver and continuously test data pipelines that will aggregate data into reports.
Collaborate with the team to create innovative proofs-of-concept, pilot projects, minimum viable products, and business cases.
Transform data into valuable insights that inform business decisions, making use of our internal data platforms and applying appropriate analytical techniques.
Help us to understand our users and serve them better through data, conversations, and active research to hear from them directly.
Engineer reliable data pipelines for sourcing, processing, distributing, and storing data in different ways, using data platform infrastructure effectively.
Produce and automate delivery of key metrics and KPIs to the business. In some cases, this will mean simply making data available and in others it will constitute developing full reports for end users.
Monitor usage of data platforms and work with clients to deprecate reports and data sets that are not needed and create a continuous improvement model for the data.
Work with clients to understand data issues, tracing back data lineage and helping the business put appropriate data cleansing and quality processes in place.
Work with stakeholders to define and establish data quality rules, definitions and strategies in line with business strategies and goals.
Monitor and set standards for data quality.
Prioritise data issues.
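The responsibilities above revolve around pipelines that cleanse raw data and aggregate it into reports. As a hedged sketch only (field names and quality rules are invented for illustration), such a transform-and-aggregate step might look like:

```python
from collections import defaultdict

def transform(raw_events):
    """Cleanse and standardise raw rows: drop records missing a metric name,
    normalise casing, and coerce values to float."""
    for event in raw_events:
        metric = (event.get("metric") or "").strip().lower()
        if not metric:
            continue  # example data-quality rule: skip unusable rows
        yield metric, float(event.get("value", 0))

def load_report(raw_events):
    """Aggregate cleansed events into a {metric: total} report."""
    report = defaultdict(float)
    for metric, value in transform(raw_events):
        report[metric] += value
    return dict(report)

events = [{"metric": "Signups", "value": 3},
          {"metric": "signups", "value": 2},
          {"metric": "", "value": 9}]
print(load_report(events))  # → {'signups': 5.0}
```

In practice these steps would run inside an orchestrated pipeline rather than a single script, but the shape (extract, cleanse, aggregate, deliver) is the same.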
Job Requirements
Expert with Python (5+ years' experience)
Experience with SQL and NoSQL (5+ years' experience)
Experience with database technologies such as relational, NoSQL, MPP, vector, and columnar databases (3+ years' experience)
Experience with AWS (3+ years' experience)
A comprehensive understanding of cloud data warehousing and data transformation (extract, transform, load) processes and supporting technologies such as Airbyte, dbt, Dagster, AWS S3, EMR, data lakehouses, and other analytics tools.
Experience in manipulating data through cleansing, parsing, standardising, etc., especially in relation to improving data quality and integrity.
Proven ability to design Data Models and ETL pipelines that meet business requirements in the most efficient manner.
You have designed and deployed data pipelines and ETL systems for data at scale.
Previous experience in meeting the visualisation, reporting and analytics needs of key business functions through the development of presentation and data models
Experienced in defining and developing data sets, models and cubes.
Knowledge of the emerging technologies that support Business Intelligence, Analytics and Data.
You have a curious level-headed approach to problem-solving, with a fine eye for detail and the ability to look at the wider business context to spot opportunities for improvement.
Passionate about data and unlocking data for the masses.
BSc or MS in Computer Science or related technical fields. Equivalent work experience will also be considered.
To apply for this job, please visit www.cobbleweb.co.uk.