Data Engineer
British Gas Trading Ltd
London, UK
3d ago

We need a forward-thinking, innovative engineer with experience in handling big data applications for our New Energy Platform (NEP).

The purpose of the Data Engineer role is to evolve a Data Platform, based on open-source and AWS components, to service the New Energy Platform.

The platform will be NEP’s end-to-end data solution: ingesting data from energy meters and other sources; validating and storing data; aggregating information; providing a platform on which to run data-science algorithms; and delivering information back to products, third parties and other systems. You will also work on the data lake for business and partnerships, with the aim of democratising data within the business.

    We take a software engineering approach to data engineering in that we build data applications as well as data pipelines.
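
    As an illustration of what a small data application building block might look like, here is a minimal sketch in Python of ingesting and validating a single meter reading before it is stored. It is purely illustrative: the record fields, checks and names are assumptions made for this example, not details of the actual NEP platform.

    from dataclasses import dataclass
    from datetime import datetime, timezone


    # Purely illustrative sketch: a tiny ingest-and-validate step for a meter
    # reading. Field names and validation rules are assumptions, not NEP specifics.
    @dataclass(frozen=True)
    class MeterReading:
        meter_id: str
        timestamp: datetime
        kwh: float


    def validate(reading: MeterReading) -> MeterReading:
        """Reject obviously malformed readings before they are stored."""
        if not reading.meter_id:
            raise ValueError("missing meter_id")
        if reading.kwh < 0:
            raise ValueError(f"negative consumption: {reading.kwh}")
        if reading.timestamp > datetime.now(timezone.utc):
            raise ValueError("reading is timestamped in the future")
        return reading


    if __name__ == "__main__":
        # Example payload as it might arrive from an upstream source (hypothetical).
        raw = {"meter_id": "MTR-001", "timestamp": "2021-06-01T00:30:00+00:00", "kwh": 0.42}
        reading = validate(
            MeterReading(
                meter_id=raw["meter_id"],
                timestamp=datetime.fromisoformat(raw["timestamp"]),
                kwh=float(raw["kwh"]),
            )
        )
        print(reading)  # in a real pipeline this would be stored or forwarded downstream

    In practice a step like this would typically run inside an ingestion service or a scheduled pipeline task rather than as a standalone script.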

    We can be flexible on location / working from home.

    The role

    You will report to the Head of Data Platforms and work closely with data scientists, data analysts and DevOps engineers, ensuring that the business receives the solutions it needs.

  • Write production-quality code, including extensive test coverage.
  • Design data systems that will scale to large numbers of users.
  • Peer-review other engineers’ code to ensure quality.
  • Deploy services to staging and production environments.
  • Support production services, including participation in a support rotation with others in the team.
  • Drive best practices across teams and products for data-driven product development and delivery.
  • Use GitHub when required to enable code re-use across the team.
  • Work in an Agile environment, tracking your tasks using ADO.
  • Work alongside other members of the Data Team on large projects to meet deadlines and requirements set by our stakeholders.

    The person

  • Experience of working as a developer in a cross-functional team.
  • Significant experience in the software engineering field, preferably within a data discipline.
  • Good knowledge of Python and concurrent programming.
  • A quick learner, eager to learn new things and experiment with new technologies.
  • Willing to learn data-science algorithms and produce code to implement them at scale.
  • Experience working with the following technologies: AWS (especially Redshift, S3, Kinesis and Lambda), Spark, Kubernetes and Apache Airflow (non-essential but nice to have).
  • Familiarity with deployment and container technologies, including Jenkins, Docker and Serverless.
  • Interest in real-time and distributed systems.

    PLEASE APPLY ONLINE by hitting the 'Apply' button.

    Applications will ONLY be accepted via the 'Apply' button.

    This role is being handled by the Centrica recruitment team and NO agency contact is required.

    Recommended Skills

    Docker, Kubernetes, Curiosity, Jenkins, Algorithms, Big Data
