System Administrator with a taste for Hadoop
Please note that the last day to apply has passed.
Etraveli Group is looking for a Hadoop Cluster and System Administrator to strengthen the admin capabilities of our Bisam team.
Etraveli Group is one of the world's leading flight-centric online travel agencies (OTAs). The core business is airline tickets and travel-related services, targeted towards high volumes of price-sensitive consumers, as well as B2B services within travel. Expertise in travel e-commerce and economies of scale, together with an impressive proprietary IT platform, gives us a scalable and profitable business model.
The core of the technical platform, the Internet Booking Engine (IBE), serves price-comparison sites and customers with attractive travel options retrieved from centralised flight marketplaces. Every minute, 120 million travel options are automatically assessed, gathered, and re-priced. Intense data analysis, price optimisation, and configuration automation are crucial in order to keep our margins high. Etraveli Group's stock of air-search, order, and tactical competition-analysis data is impressive and still growing rapidly in both dimensions and volume.
Bisam (Business Intelligence, Statistical Analysis and Modelling) is our cross-functional data science and data engineering team. The team is located in Gothenburg but works closely with corresponding teams in Athens, Stockholm and Uppsala. The roles in the team span developers, analysts, database managers, and information engineers. The offered position belongs to the Development department, which is managed from Gothenburg.
The work as Hadoop Cluster and System Administrator will consist primarily of designing, implementing and administering on-premise cluster infrastructure and services on an ongoing basis.
You will work closely with a team of data engineers and data scientists, but also with system administrators and DBAs. This means you will act as the administrator with main responsibility for the team and its systems, and serve as a bridge to other teams within the field.
The position requires a solid technical background, or the ability to demonstrate equivalent skills within the field. Strong communication skills are required. Our team is small but strong, so you should be prepared to roll up your sleeves and dig in wherever there is a need, whether that is internally, attending to some random system configuration, or externally, with requirement management or information engineering.
All in all, times will be great. You will be working in a highly creative environment, surrounded by awesome coworkers, challenging problems, and demanding, rewarding customers.
Key responsibilities:
• Deployment, configuration and operation of Hadoop components and services
• Monitoring of health, connectivity and security of production and non-production Hadoop environments (a minimal example of this kind of check follows the list)
• Planning and execution of maintenance and cluster upgrades with minimal downtime
• Capacity planning and scaling
• Backup and recovery operations
• Performance monitoring and fine-tuning
• Handling and resolution of incidents, escalation of issues to vendors
• Use of analytics to improve service, availability, and customer interaction
• Documentation and knowledge sharing
• Cooperation with infrastructure, operations, business intelligence and development teams
• Management of integration with Active Directory/Kerberos and provision of support for user and group management
• Assistance to data engineers in ETL transformations and pipeline setup and management
• Management and optimization of disk space for handling data
• Installation of patches and upgrading of software as and when needed
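To give a flavour of the monitoring responsibility above, here is a minimal sketch in Python that polls the NameNode's standard JMX endpoint and flags dead DataNodes or high HDFS usage. It is an illustrative example only, not Etraveli's actual tooling: the hostname namenode.example.internal, the port 9870 (older CDH releases expose 50070) and the 85% capacity threshold are placeholders you would adapt to your own cluster.

import json
import sys
import urllib.request

# Placeholder endpoint: the FSNamesystemState bean published by the NameNode's JMX servlet.
NAMENODE_JMX = (
    "http://namenode.example.internal:9870/jmx"
    "?qry=Hadoop:service=NameNode,name=FSNamesystemState"
)
CAPACITY_ALERT_PCT = 85.0  # illustrative threshold: warn when HDFS is more than 85% full


def fetch_fsnamesystem_state(url: str) -> dict:
    """Return the FSNamesystemState bean as a dict."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)["beans"][0]


def main() -> int:
    state = fetch_fsnamesystem_state(NAMENODE_JMX)
    used_pct = 100.0 * state["CapacityUsed"] / state["CapacityTotal"]

    problems = []
    if state.get("NumDeadDataNodes", 0) > 0:
        problems.append(f"{state['NumDeadDataNodes']} dead DataNode(s)")
    if used_pct > CAPACITY_ALERT_PCT:
        problems.append(f"HDFS {used_pct:.1f}% full")

    if problems:
        print("ALERT: " + "; ".join(problems))
        return 1
    print(f"OK: HDFS healthy, {used_pct:.1f}% used")
    return 0


if __name__ == "__main__":
    sys.exit(main())

In practice a check like this would be scheduled (cron or a monitoring agent) and wired into the team's alerting, but the sketch shows the kind of day-to-day scripting the role involves.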
Required qualifications:
• Passion for Big Data
• Experience with Cloudera CDH or similar
• Experience working with Hadoop and the following services: HDFS, MapReduce, Impala, Spark, Flume, Hue, Hive, Sentry, Oozie, Zookeeper, Sqoop, Kafka, YARN
• Relational and NoSQL database systems
• Linux (CentOS)
• Shell scripting
• Linux performance tuning
Preferred qualifications:
• Virtualization (KVM, oVirt, VMware vSphere)
• Networking (TCP/IP)
• Active Directory
• Kerberos
• TLS/SSL
We also offer
A flexible work schedule and work in a diverse environment, with offices in both Sweden and Greece. The position is located in Gothenburg, ten minutes on foot from the Central Station. The offices are modern and spacious, with plenty of natural light, plus a pinball machine and a ping-pong table for leisure. Breakfast is served every morning in the kitchen.
How to apply
Please send your CV and a cover letter by clicking "Apply For This Job."
Candidates will be selected for interviews on a rolling basis throughout the application window, which lasts until 23/6. Apply today, since the position may be filled before the final date.
Any questions about the job? Contact
kay.wittig@etraveligroup.com.
Duration, working hours
Full-time. Employment period by agreement.
Publication date: 2019-05-23. Compensation: Salary by agreement.
How to apply: The last day to apply is 2019-11-09.
Click on this link to submit your application. Contact: Marielle Hallberg,
marielle.hallberg@etraveli.com. Company: Etraveli Group
Address: Etraveli Group
Kungsgatan 34
411 19 Göteborg
Office address: Kungsgatan 34, Göteborg
Job number: 4805808