The role is to support the design, development and implementation of the Holovis Data Platform. This will involve utilising distributed storage and processing systems, including but not limited to:
- Cloudera CDH Hadoop distribution
- Impala and Kudu
- Kafka-based streaming data processing
- Sqoop/StreamSets for batch and real-time data pipelines
- Apache Spark
The Data Platform will provide the foundation for the BI, Analytics and ML/AI services and products Holovis offers to our clients. The Data Platform must therefore be secure, scalable, flexible and system-portable.
You will be responsible for:
- The design, development and implementation of a secure, flexible, scalable and system-portable Data Platform that can support the provision of data storage and processing, BI reporting, Analytics, and ML/AI products and services.
- The development of innovative and compelling data engineering solutions relevant to Holovis' core expertise across Entertainment, Enterprise and Defence, including but not limited to:
- Virtual Reality
- Augmented/Mixed Reality
- AV, projection and interactive visuals
- Application & game development
- Immersive, experiential turnkey solutions
- Working with the Data Innovation Manager, clients and colleagues on data-related projects, including requirements capture, architectural design, documentation, and data integrations and migrations where appropriate.
You will need:
- Demonstrable expertise in the design, development and implementation of enterprise-scale 'big data' platforms, preferably utilising the Cloudera CDH Hadoop distribution.
- Extensive experience in the installation and design of Hadoop Clusters (CDH).
- Expertise in designing hardware specifications and estimating storage/processing capacity from demand.
- Good general data management and modelling experience, confident in your understanding of the appropriate concepts relevant to database design and administration.
- Extensive application-level programming skills (e.g. Java), not just scripting languages.
- Installation, maintenance and administration of distributed data platforms (Hadoop ecosystem: Cloudera/Hortonworks distributions).
- Linux administration and shell scripting.
- IT infrastructure design and data pipeline architecture expertise.
- Strong T-SQL skills and a good understanding of data management principles.
- Network configuration and webserver/HTTP knowledge.
- Database Administration, including distributed/cluster-based systems such as Hadoop.
- Database Development, including tables, procedures, indexes, etc.
You must have the right to live, work and drive in the UK.
We will consider applicants who would prefer to work in any of our London, Lutterworth or Manchester offices.
Send your CV to firstname.lastname@example.org