A Spark-aware scaler powered by ML models that profile previous job runs.
A runtime tool that adjusts cluster resources on the fly, ensuring optimal performance without over- or under-provisioning.
Maximizes stability and reduces costs.
Continuously monitors and adapts to the evolving needs of running jobs.
Dynamic Cluster Configuration
Constantly monitors the usage patterns of your evolving workloads.
Adjusts configurations dynamically based on predicted data volumes.
By evaluating the likelihood of spot instance interruptions across different availability zones, our ML-powered fleets allocate the optimal resources in real time.
Tunes Spark configurations to match the selected profile for peak performance and lowest cost.
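The spot-allocation idea described above can be sketched as follows. This is an illustrative toy, not Zipher's actual model: the scoring formula, zone names, prices, and interruption probabilities are all invented for the example. The intuition is that a slightly pricier zone can still be the cheapest overall once likely interruptions (and the reruns they cause) are priced in.

```python
# Hypothetical sketch: pick the availability zone whose spot price,
# weighted by predicted interruption risk, gives the lowest expected cost.
# All names and numbers below are illustrative.

def expected_cost(spot_price: float, interruption_prob: float) -> float:
    """Penalize zones where interruptions would force costly reruns."""
    return spot_price / (1.0 - interruption_prob)

def pick_zone(zones: dict) -> str:
    """zones maps AZ name -> (spot_price_per_hour, predicted_interruption_prob)."""
    return min(zones, key=lambda az: expected_cost(*zones[az]))

zones = {
    "us-east-1a": (0.12, 0.05),   # moderate price, stable
    "us-east-1b": (0.10, 0.30),   # cheapest on paper, but risky
    "us-east-1c": (0.15, 0.02),   # most stable, priciest
}
print(pick_zone(zones))  # -> "us-east-1a": best price/stability trade-off
```

Here the nominally cheapest zone (us-east-1b) loses because its high interruption probability inflates its expected cost.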
Tailored Orchestration
Schedule each job at the optimal time to minimize costs and maximize stability, while automatically sharing compute resources across workloads.
Set a flexible SLA and define the desired completion time for a specific workload.
Zipher evaluates the entire DAG and the dependencies between workloads.
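One way to picture SLA-driven scheduling is as a latest-safe-start calculation: given the desired completion time and a runtime estimate from previous runs, start the job as late as safely possible so earlier workloads can share the compute. This sketch is a simplification under assumed inputs, not Zipher's implementation; the buffer factor is an invented parameter.

```python
# Illustrative sketch of SLA-aware scheduling: back off from the deadline
# by the estimated runtime plus headroom for run-to-run variance.
from datetime import datetime, timedelta

def latest_safe_start(deadline: datetime,
                      est_runtime: timedelta,
                      buffer_factor: float = 1.25) -> datetime:
    """Leave headroom (default 25%) over the historical runtime estimate."""
    return deadline - est_runtime * buffer_factor

deadline = datetime(2024, 6, 1, 8, 0)              # report needed by 08:00
start = latest_safe_start(deadline, timedelta(hours=2))
print(start)  # 2024-06-01 05:30:00 (2h estimate + 30min buffer)
```

A real orchestrator would also walk the DAG so each upstream dependency's deadline is derived from its downstream consumers, but the per-job arithmetic stays the same.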
Cost Visibility
Gain insight into spending across all your Databricks workloads with a detailed analysis of cost and resource distribution.
Analyze expenses over various timeframes, distinguishing between Databricks and cloud provider costs and breaking them down by parameters such as worker type and Photon usage.
Quickly identify your primary cost drivers and promptly detect and address any cost spikes.
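Spike detection of the kind described above can be sketched with a simple trailing-average rule: flag any day whose spend exceeds the recent average by a set factor. The window, threshold, and cost figures are made up for illustration and are not Zipher's detection logic.

```python
# Minimal sketch: flag days costing more than `factor` x the trailing mean.
def find_spikes(daily_costs: list, window: int = 7,
                factor: float = 2.0) -> list:
    """Return indices of days whose cost exceeds factor * trailing mean."""
    spikes = []
    for i in range(window, len(daily_costs)):
        trailing_mean = sum(daily_costs[i - window:i]) / window
        if daily_costs[i] > factor * trailing_mean:
            spikes.append(i)
    return spikes

costs = [100, 105, 98, 110, 102, 99, 104, 310, 101, 103]
print(find_spikes(costs))  # [7] -- the $310 day stands out
```

Production systems typically use more robust anomaly models, but the shape of the problem (compare today's spend to a recent baseline) is the same.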
Notifications
Receive notifications of cost anomalies in your Databricks account via Slack, Teams, and email, so you can always keep track of your infrastructure.
Stay informed with regular spending summaries to monitor your expenses and ensure they align with your contract.
Get direct alerts to specific teams, channels, and users for efficient communication.
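Per-team routing like this often boils down to a mapping from anomaly category to recipients. The sketch below is a hypothetical illustration only; the category names, channels, and fallback behavior are invented for the example.

```python
# Hypothetical alert-routing table: anomaly category -> channels/recipients.
ROUTES = {
    "cost_spike":     ["#data-platform-alerts", "finops@example.com"],
    "budget_summary": ["#finops-weekly"],
    "sla_miss":       ["#oncall-data-eng"],
}

def recipients(category: str) -> list:
    """Fall back to a default channel for unrecognized anomaly types."""
    return ROUTES.get(category, ["#data-platform-alerts"])

print(recipients("cost_spike"))  # ['#data-platform-alerts', 'finops@example.com']
```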
Integrations
Connects seamlessly with your entire Databricks account.
Supports integration with all leading cloud service providers, including AWS, Azure, and Google Cloud.
Works harmoniously with your data orchestration and transformation tools, such as Airflow, Azure Data Factory, and dbt.
Syncs with your Infrastructure-as-Code (IaC) solutions like Terraform or with your custom automation scripts.