




Splunk App for Data Science and Deep Learning

Splunk Cloud
Splunk Built
The Splunk App for Data Science and Deep Learning (DSDL), formerly known as the Deep Learning Toolkit (DLTK), lets you integrate advanced custom machine learning and deep learning systems with the Splunk platform. DSDL extends the Splunk Machine Learning Toolkit (MLTK) with prebuilt Docker containers for TensorFlow, PyTorch, and a collection of data science, NLP, and classical machine learning libraries. Using predefined workflows for rapid development with Jupyter Lab notebooks, DSDL enables you to build, test, and operationalise your custom models with Splunk. You can leverage GPUs for compute-intensive training tasks and flexibly deploy models on CPU- or GPU-enabled containers. The app ships with various examples that showcase deep learning and machine learning algorithms for classification, regression, forecasting, clustering, graph analytics, and NLP. Learn from these examples to tackle your advanced data science use cases in IT operations, security, application development, IoT, business analytics, and beyond.

Splunk App for Data Science and Deep Learning (DSDL)

Documentation for DSDL - formerly known as the Deep Learning Toolkit (DLTK) - is available at https://docs.splunk.com/Documentation/DSDL. Feel free to open issues, report bugs, or raise feature requests at https://github.com/splunk/splunk-mltk-container-docker. You can also post your questions and engage with answers on community.splunk.com. Thanks for your collaboration!

Prerequisites

Quick start guide

  • Ensure the Splunk Machine Learning Toolkit is installed and configured properly for your Splunk deployment.
  • Restart your Splunk instance after installing the Machine Learning Toolkit and the Deep Learning Toolkit App for Splunk.
  • You need an internet-connected Docker environment with permissions to pull the prebuilt MLTK container images from Docker Hub and start containers. If you are running Docker in an air-gapped environment, read the description in the app.
  • Set up the Deep Learning Toolkit App for Splunk by connecting it to your Docker environment on the setup page in the app.
  • Start a development container with the Golden Image CPU from the container management dashboard and run one of the examples to verify that the Deep Learning Toolkit app works, for example:
    -- Neural Network Classifier Example for TensorFlow
    -- Logistic Regression Classifier Example for PyTorch

Build your own containers

Extend the app with custom MLTK containers: if you want to rebuild the existing MLTK container images or build your own custom images, navigate to https://github.com/splunk/splunk-mltk-container-docker

Further information, recent blog posts, and additional resources

List of available examples (44)

  • Binary neural network classifier built on Keras and TensorFlow
  • Logistic regression using PyTorch
  • Multi class neural network classifier using PyTorch with GPU
  • Multi class neural network classifier using PyTorch for DGA
  • Gradient boosting model with Spark's MLLib applied to the DGA dataset
  • XGBoost classifier with SHAP explainability
  • Automated machine learning with auto-sklearn
  • Linear regression using the TensorFlow™ estimator class
  • Regression using the TensorFlow™ Deep Neural Network (DNN) estimator class
  • XGBoost regressor
  • Support vector regressor with grid search
  • Multivariate LSTM regressor
  • Forecasting time series using TensorFlow (CNN)
  • Forecasting time series using TensorFlow (LSTM)
  • Forecasting time series using the Prophet library
  • Basic auto encoder using TensorFlow™
  • Distributed algorithm execution with DASK for KMeans
  • Clustering with UMAP and DBSCAN
  • Example of Host Clustering using UMAP on JA3 Signatures
  • Named Entity Recognition using spaCy for NLP tasks
  • Named Entity Recognition using spaCy Ginza (Japanese)
  • Sentiment Analysis using spaCy
  • Transformer based Text Classification using LLMs
  • Transformer based Text Summarization using LLMs
  • Transformer based Text Similarity Scoring using LLMs
  • Transformer based Zero Shot Classification using LLMs
  • Hashing Encoder with Similarity Matching using a Vector DB
  • Graph centrality algorithms using NetworkX
  • Graph community detection with Rapids (GPU accelerated)
  • Graph Data Science notebook for Graphistry
  • Causal Inference with causalnex
  • Frequent itemset mining with Spark FP Growth
  • Recommender system with Spark Collaborative Filtering
  • Rapids UMAP (GPU accelerated)
  • Process Mining with PM4Py
  • Time series analysis with STUMPY
  • Changepoint Detection
  • Anomaly Detection with Robust Random Cut Forest
  • Hidden Markov Models (HMM)
  • Online Learning Anomaly Detection
  • Time series decomposition (STL)
  • Anomaly Detection with PyOD Using Empirical Cumulative Distribution Functions (ECOD)
  • Correlation Matrix and Pair Plot
  • Spark Pi Hello World example

FAQ

Q: When I launch a container the first time, I cannot access Jupyter Lab.
A: The selected container image is downloaded automatically from Docker Hub in the background the first time you launch a container. Depending on your network, this can take a while, as the image sizes range from 2-12 GB. Please allow some time for the images to be pulled initially. You can check which Docker images are available locally by running docker images on your CLI.

Q: When I run DLTK 3.5 (or later) locally, my browser shows an insecure-connection warning.
A: From DLTK 3.5 onwards, the container images use HTTPS by default, with self-signed certificates for Jupyter Lab and the data-transfer-related API endpoints. Many browsers show "insecure connection" warnings, and some allow you to suppress them for the localhost connections used during development. For production use, please work with your administrators to secure your setup: build containers with your own certificates as needed, or use a more advanced container environment setup.

Q: The example dashboards show no results or throw errors.
A: First, ensure that the right container image is downloaded and up and running for the specific example (e.g. TensorFlow examples require a TensorFlow container). Second, verify that the associated notebook code exists in Jupyter Lab and explicitly save the notebook again (hit the save button). Saving automatically generates a Python module (located in the /app/model folder in Jupyter) that is needed to run the examples and populate the dashboards. Lastly, check that MLTK's app permissions are set to global so that DLTK can use the lookup files required by most of the examples.
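To illustrate, here is a minimal sketch of what such a generated module can look like. The init/fit/apply function names follow the DSDL notebook template; the toy logic and the field name "value" are purely illustrative assumptions, not the app's actual code.

```python
import pandas as pd

# Illustrative sketch of DSDL notebook stage functions. When the notebook
# is saved in Jupyter Lab, cells like these are exported into a Python
# module under /app/model. Names per the DSDL template; logic is a toy.

def init(df, param):
    # Called to create a fresh model object before training.
    model = {"mean": None}
    return model

def fit(model, df, param):
    # "Train": memorize the mean of a numeric field (hypothetical field name).
    model["mean"] = df["value"].mean()
    return {"message": "model trained"}

def apply(model, df, param):
    # Score: flag rows above the learned mean.
    df = df.copy()
    df["above_mean"] = (df["value"] > model["mean"]).astype(int)
    return df
```

If the exported module is missing or stale, the example dashboards have nothing to call, which is why re-saving the notebook fixes many "no results" cases.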

Q: Containers are suddenly stopped about one minute after starting.
A: Most likely you have two or more DLTK apps installed and configured to use the same container environment. DLTK 3.x versions include a scheduled search (MLTK Container Sync) that synchronizes running containers and their associated models with the DLTK app. If more than one DLTK app is running, this can cause synchronization collisions, and containers get stopped as a result. When using DLTK 3.x, connect each DLTK app in a 1:1 relationship with your Docker and/or Kubernetes environment.

Q: After a version update I see the error "unable to read JSON response" when running a DLTK related search.
A: This may indicate that part of the local DLTK configuration is out of sync. Users have reported resolving the issue by opening the setup dashboard with the existing settings and clicking "test and save" again to reconfirm the configuration.

Q: Where are my notebooks stored in the docker environment?
A: By default, two Docker volumes are automatically mounted for persistence in your Docker environment. Those volumes are named "mltk-container-app" and "mltk-container-notebooks". You can verify this by running docker volume ls on your CLI. Important note: from DLTK version 3.1 onwards there is a new default volume called "mltk-container-data" - see the migration notes below.

Q: What is the password for Jupyter Lab?
A: Please have a look at the Model Development Guide page in the Deep Learning Toolkit app. We recommend you set your own strong password.
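As a general, DSDL-agnostic aside, Python's standard secrets module is one way to generate a strong random password; the helper name below is hypothetical.

```python
import secrets

def make_password(n_bytes: int = 32) -> str:
    # 32 bytes of entropy yields a ~43-character URL-safe token,
    # which is plenty for a Jupyter Lab password.
    return secrets.token_urlsafe(n_bytes)
```

Each call returns a fresh, cryptographically strong token suitable for pasting into the password setup dialog.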

Q: When I use conda or pip install in a running container, the installed libraries cannot be accessed.
A: For security reasons, the default user in a DLTK container is not root, so libraries are installed under the user dltk, which can cause issues when importing them in Jupyter. We recommend using the build scripts to modify your Dockerfile and build your own container image. This way the libraries are installed persistently and you don't need to reinstall them after restarting the container.

Notebooks Migration note for change to 3.1 and later versions

Due to the addition of Kubernetes, the way volumes behave has changed. From version 3.1 on, when using the golden image, the container directory /srv is the default location, and notebooks and app code live there. In earlier versions, two Docker volumes were mounted into /srv/app and /srv/notebooks; from version 3.1 on, these are mapped into a backup folder in Jupyter. To migrate, simply copy your notebooks from the backup folder back into the notebooks and app folders if those are empty.
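The migration step above can be sketched as a small script. The paths and the "only copy if the target is empty" rule mirror the advice here; the function name and directory arguments are illustrative, so adjust them to your container layout (e.g. the backup folder and /srv/notebooks).

```python
import shutil
from pathlib import Path

def restore_from_backup(backup_dir: str, target_dir: str) -> int:
    """Copy all files from a backup folder into target_dir, but only if
    target_dir is empty (mirroring the 3.1 migration note).
    Returns the number of files copied."""
    backup, target = Path(backup_dir), Path(target_dir)
    target.mkdir(parents=True, exist_ok=True)
    if any(target.iterdir()):
        return 0  # target not empty: nothing to migrate
    copied = 0
    for src in backup.rglob("*"):
        if src.is_file():
            dest = target / src.relative_to(backup)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # preserves timestamps
            copied += 1
    return copied
```

Running it once per volume (notebooks and app) restores the pre-3.1 content without overwriting anything you have already created.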

Release Notes

Version 5.1.2
May 8, 2024
  • Bug fixes in several Jupyter notebooks
  • Update and rebuild of all containers
  • Consolidation of examples
  • Deprecation of examples and notebooks:
    -- Automated Machine Learning
    -- Linear Regressor
    -- Deep Neural Network Regressor
    -- Causal Inference
    -- Online Learning Anomaly Detection
    -- Named Entity Recognition and Extraction for Japanese
Version 5.0.0
Oct. 25, 2022
  • Splunk supported
  • UI and menu restructure
  • Documentation added
  • Jupyter Lab password change in setup
  • Changed app setup behavior
  • Refresh of all prebuilt container images
  • Docker based UI guided image builder
  • Neural network designer
  • Example of Host Clustering using UMAP on JA3 Signatures - thanks Josh Cowling!
  • Various dashboard and examples fixes
  • Improved logging
Version 3.9.0
June 13, 2022
  • Refresh of all images
  • New ECOD anomaly detection example
  • Additional certificate handling options
  • Interactive Splunk search bar in Jupyter Lab Notebooks
  • Functionality to send data from containers to Splunk HEC
  • Bug fix in SHAP dashboard
  • Added Ingress option for Kubernetes clusters (thanks to Marcus Schiesser!)
Version 3.8.0
March 3, 2022
  • Integration with Splunk Observability
  • Benchmarks for different dataset sizes
  • Online learning anomaly detection algorithm
  • Container operations dashboard
  • Update of all images
  • Various bugfixes
Version 3.7.0
Nov. 4, 2021
  • Updated DLTK to work with MLTK 5.3 and remain backward compatible with 5.x
  • Option for Custom SSL Certificate in Setup UI
  • Simple Example for Hidden Markov Models
  • Minor other fixes
Version 3.6.0
July 14, 2021
  • Bug fixes, updates and performance improvements
  • Operational overview dashboard
  • Drill-down links on the container management dashboard
  • New examples:
    -- Automated Machine Learning with auto-sklearn
    -- Robust Random Cut Forest for anomaly detection
    -- Time series decomposition for seasonality and trend (STL)
    -- Sentiment Analysis with spaCy
  • Image update: Golden Image CPU 3.6
Version 3.5.0
Feb. 22, 2021

Updated container images for:
- Golden Image CPU
- Golden Image GPU
- Rapids 0.17
- Spark 3.0.1

With improvements for:
- Model management with MLflow
- Integrated GIT version control in Jupyter Lab
- HTTPS for API and Jupyter endpoints

Newly added algorithm examples:
- Matrix profiles with STUMPY
- Changepoint Detection
- Multivariate LSTM Regressor

Version 3.4.0
Dec. 21, 2020

Added algorithm examples for:
- Support Vector Regressor with grid search in scikit-learn
- Causal inference
- Process Mining with PM4Py
Updated Golden Image GPU to version 3.4.1
Other minor fixes

Version 3.3.0
Oct. 6, 2020
  • Added XGBoost Classifier with SHAP for Explainability
  • Added XGBoost Regressor
  • Bug fixes in setup UI
Version 3.2.0
June 22, 2020

Graphics:
- Background graphics in content overview page
- Docker + Kubernetes status green highlight in setup dashboard
- Content UI icons refresh

New docker images for Spark and Rapids:
- Spark Image (Experimental)
- Rapids Image (Experimental)

Content updates:
- Correlation Matrix and seaborn plot embedding
- DGA datashader example
- Spark GradientBoosting (non distributed, local client only)
- Spark Hello World / Barebone / Pi
- Spark FP Growth
- Spark ALS Recommender System
- Rapids Graph example
- Rapids UMAP example

Other:
- passwords.conf: added missing Kubernetes field
- Search head cluster config replication (Thank you Martin!)
- Return dataframe of arbitrary/changed shapes

Version 3.1.1
May 11, 2020
  • Setup options for Kubernetes and Openshift environment
  • Refresh of the container image ("golden image") with added Jupyter Lab Extensions for integrated Tensorboard and DASK management
  • New example for forecasting with Prophet
  • New example for distributed machine learning with DASK
  • New example for graph related algorithms with NetworkX
  • New examples for device-agnostic PyTorch CPU/GPU
  • New example for Japanese language NLP library Ginza
  • Fix of splunk_server=local in | rest calls
  • Several UI updates on dashboards
  • Bugfix with auth_mode in sync handler
  • Several other bug fixes
Version 3.0.0
Nov. 29, 2019

This version is only compatible with Splunk 8.0+ and MLTK 5.0+
If you run on Splunk 7.x and MLTK 4.x please select, download and install Deep Learning Toolkit version 2.3.0

Version 2.3.0
Oct. 14, 2019

This version is compatible with Splunk 7.x and MLTK 4.x only.
If you run on Splunk 8.0+ and MLTK 5.0+ please select, download and install Deep Learning Toolkit version 3.0.0



