The role supports the design, development and implementation of the Holovis Data Platform. This will involve utilising distributed storage and processing systems, including:
- Cloudera CDH Hadoop distribution
- Impala and Kudu
- Kafka-based stream processing
- Sqoop/StreamSets for batch and real-time data pipelines
- Apache Spark
The Data Platform will provide the foundation for the BI, Analytics and ML/AI services and products Holovis provides to its clients. The Data Platform must therefore be secure, scalable, flexible and portable.
You will work on:
- The design, development and implementation of a secure, flexible, scalable and portable Data Platform that supports data storage and processing, BI reporting, Analytics and ML/AI products and services.
- The development of innovative and compelling data engineering solutions relevant to Holovis' core expertise across Entertainment, Enterprise and Defence, including:
- Virtual Reality
- Augmented/Mixed Reality
- AV, Projections and interactive visuals
- Application & Game Development
- Immersive, experiential turnkey solutions
- Completing requirements capture, architectural design work, documentation, data integrations and migrations where appropriate.
- Attaining subject matter expert status on relevant data systems and data sources within Holovis and our target markets/disciplines.
- Creating custom solutions and IP that support the provision of data storage and processing, BI reporting, Analytics and ML/AI products and services.
- Administering, maintaining and tuning the performance of Holovis Data Platform implementations.
- Creating and maintaining a Holovis data and analytics technology roadmap.
- Capturing technical and performance requirements.
- Supporting the organisation in Proof of Concept, R&D and Pre-sales activities through provision of demonstration products and peripheral, instrument and/or technology integrations.
- Creating a standard data schema interface within Holovis e.g. XML/AVRO/JSON.
- Producing and maintaining documentation relevant to the Data Platform.
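As an illustration of the standard data schema interface mentioned above, a schema can be expressed as an Avro record definition. This is a minimal sketch only; the record name, namespace and fields are hypothetical, not an existing Holovis schema:

```json
{
  "type": "record",
  "name": "SensorReading",
  "namespace": "com.holovis.data",
  "doc": "Hypothetical example of a standardised event schema",
  "fields": [
    {"name": "deviceId", "type": "string"},
    {"name": "timestamp", "type": "long", "doc": "Epoch milliseconds"},
    {"name": "value", "type": "double"},
    {"name": "unit", "type": ["null", "string"], "default": null}
  ]
}
```

Agreeing a contract like this once lets batch (Sqoop/StreamSets) and streaming (Kafka) pipelines share the same record structure.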
You will need:
- Bachelor’s or advanced degree in a relevant discipline, e.g. Computer Science
- Demonstrable expertise in the design, development and implementation of enterprise scale ‘big data’ data platforms, preferably utilising the Cloudera CDH Hadoop distribution
- Experience in the installation and design of Hadoop Clusters (CDH)
- Expertise in the design of hardware specifications and demand-based storage/processing estimation
- Good general data management and modelling experience, confident in your understanding of the appropriate concepts relevant to database design and administration
- Installation, maintenance and administration of distributed Data Platforms (Hadoop ecosystem; Cloudera/Hortonworks distributions)
- Linux administration & shell scripting
- IT infrastructure design and data pipeline architecture expertise
- T-SQL skills and a good understanding of data management principles
- Programming skills (Java, Python)
- Network configuration and webserver/HTTP knowledge
To apply, send your CV to email@example.com and show us something interesting that you’ve done.
When you apply for a role at Holovis you will be sending us personal data. Please check our Job Applicant Privacy Notice before agreeing to send us your data.
Job Category: Current Jobs
Duration of employment: Permanent