Company: Vallum Associates
Location: Leeds
Closing Date: 02/11/2024
Hours: Full Time
Type: Permanent
Job Requirements / Description
Role: Databricks Tech SME
Rate: £650/day (inside IR35)
Location: Leeds (3 days a week)
Contract: 6 months with extension
Have you got what it takes to succeed? The following information should be read carefully by all candidates.
Role:
As a Databricks Tech SME, you can contribute your skills as we harness the power of technology to help our clients improve the health and well-being of the members they serve — a community’s most vulnerable. Connect your passion with purpose, teaming with people who thrive on finding innovative solutions. Here are the details on this position.
Mandatory Skills:
Deploy, support, and administer Databricks in the cloud, including but not limited to:
Memory Management
Handle scheduling queues & messages
Performance Management
Supervise cluster health checks
Regular backup & recovery
Node commissioning & decommissioning
Code deployments
Server log analysis
Manage tenants including workspace creation, user management, cloud resources, and account usage monitoring
Advise on architecture and scaling for Databricks environments
Collect user requirements and set up data storage, notebooks, and compute to fulfil business use cases
Serve as a point of contact for technical issues in the environment
Monitor usage, report on cost management, and participate in cost planning
What we're looking for:
3+ years’ experience in data administration, from setting up the Databricks environment to successfully administering it
Proven experience as the Databricks account owner, managing workspaces, audit logs, and high-level usage monitoring
Experience managing cluster and job configuration options
Experience setting up Databricks on GCP or AWS
Proven ability to meet established service levels, availability, performance, data privacy, and security guidelines is necessary
Ability to install and troubleshoot Databricks and integrate it with Identity and Access Management (IAM)
Good knowledge of GCP and Kubernetes, with a focus on ensuring HSBC controls are adhered to as part of the dev deployment. Able to create a blueprint for production, including the billing/support model, security, naming conventions, etc.
Terraform knowledge for provisioning Databricks infrastructure as code
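To illustrate the kind of cluster configuration management the role involves, here is a minimal sketch that composes a create-cluster payload following the field names of the public Databricks Clusters REST API (2.0). The function name and all values are illustrative, not part of any Databricks SDK; in practice the payload would be posted to the workspace API or managed through Terraform.

```python
# Sketch: building a JSON-serializable cluster spec for the Databricks
# Clusters API (POST /api/2.0/clusters/create). Field names follow the
# public REST API; the helper and its argument values are illustrative.

def build_cluster_spec(name, spark_version, node_type,
                       min_workers=1, max_workers=4,
                       autoterminate_min=30):
    """Return a cluster spec dict with autoscaling and auto-termination."""
    return {
        "cluster_name": name,
        "spark_version": spark_version,
        "node_type_id": node_type,
        # Autoscaling keeps a POC cheap while allowing bursts.
        "autoscale": {"min_workers": min_workers,
                      "max_workers": max_workers},
        # Auto-termination limits idle-cluster cost.
        "autotermination_minutes": autoterminate_min,
    }

spec = build_cluster_spec("poc-etl", "13.3.x-scala2.12", "n2-standard-4")
print(spec["autoscale"])  # {'min_workers': 1, 'max_workers': 4}
```

A production blueprint would typically pin such specs in Terraform rather than ad-hoc API calls, so that naming conventions, tags, and security controls are enforced consistently.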
Write-up required:
How to set up Databricks in the cloud
What considerations should be taken on migrating workloads to Databricks?
Have you set up Databricks control planes?
Describe the considerations involved in setting up Databricks for a POC in a dev environment as opposed to a production environment