https://creativecommons.org/publicdomain/zero/1.0/
This dataset contains the Python wheel files for the PyTorch Geometric (PyG) external libraries (to install PyG itself, just pip install torch_geometric). PyTorch Geometric is the PyTorch implementation used to build graph neural networks. For details, please refer to torch_geometric.
Note: these libraries are not required to install PyG. I compiled the wheel files because they take a long time to install. If you want to use a specific version, please refer to this notebook.
If you need to install l5kit with no internet, you can attach this dataset and use pip to install from it.
!pip install --no-index -f /kaggle/input/kaggle-l5kit-110 pip==20.2.3
!pip install --no-index -f /kaggle/input/kaggle-l5kit-110 l5kit
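With `--no-index --find-links`, pip skips PyPI entirely and resolves packages by matching wheel filenames in the given local directory. A minimal sketch of that matching step, using hypothetical filenames (pip's real resolver also checks versions and compatibility tags):

```python
def find_candidate_wheels(filenames, package):
    """Return the wheel filenames whose distribution name matches `package`.

    In a wheel filename the distribution name comes before the first dash,
    with any dashes in the project name replaced by underscores (PEP 427).
    """
    wanted = package.lower().replace("-", "_")
    return [
        fn for fn in filenames
        if fn.endswith(".whl") and fn.split("-", 1)[0].lower() == wanted
    ]

# Hypothetical contents of a --find-links directory:
wheel_dir = [
    "l5kit-1.1.0-py3-none-any.whl",
    "pip-20.2.3-py2.py3-none-any.whl",
    "numpy-1.19.2-cp37-cp37m-manylinux1_x86_64.whl",
]
print(find_candidate_wheels(wheel_dir, "l5kit"))
# → ['l5kit-1.1.0-py3-none-any.whl']
```

This is also why the dataset only needs to be a flat folder of `.whl` files: pip treats it like a tiny local package index.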
Package to install AutoGluon 0.7.0 without internet for Kaggle competitions.
Add the dataset and run the following command in a notebook:
!pip install --no-index --find-links /kaggle/input/autogluon-0-7-0-offline-setup-wheel/Autogluon_0_7_0_setup/Autogluon_0_7_0 autogluon
Reference: https://www.kaggle.com/code/anandsiva/autogluon-full-offline-setup
Install staintools 2.1.2 offline in Kaggle notebooks. The wheels were built in a Kaggle notebook in August 2022.
Just add this dataset and run:
!pip install /kaggle/input/staintools-offline/spams-2.6.5.4-cp37-cp37m-linux_x86_64.whl
!pip install /kaggle/input/staintools-offline/staintools-2.1.2-py3-none-any.whl
Latest polars sources. This dataset can be attached when you are working in an offline notebook.
I'll try to keep this up-to-date, but if there is a newer version, please do tell me and I'll update it.
Current version: 1.31.0
To install in your notebook:
!pip install -q --no-index --find-links /kaggle/input/offline-polars polars
License: MIT
MIT License: https://opensource.org/licenses/MIT
License information was derived automatically
This dataset contains .whl files for installing langchain on Kaggle. Compatible with Python 3.10
Examples to use (without Internet connection):
!pip install langchain --no-index --find-links=file:///kaggle/input/langchain-whl-llm/
!pip install -U /kaggle/input/langchain-whl-llm/bitsandbytes-0.43.1-py3-none-manylinux_2_24_x86_64.whl -qq
This dataset was created by Vijayabhaskar J
!pip install --no-index ../input/cellposewheels/cellpose-0.7.2-py3-none-any.whl --find-links=../input/cellposewheels
Thanks to @slawekbiel for pioneering in his Inference307 Kernel.
The license for each wheel is available within their respective package.
This dataset provides the dicomsdl package version 0.109.2 for use offline.
pip install /kaggle/input/dicomsdl--0-109-2/dicomsdl-0.109.2-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl
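The long wheel filename above encodes PEP 427 compatibility tags: `cp310-cp310` means the wheel is built for CPython 3.10's ABI, and the `manylinux` tags name the supported platforms, so it installs only on a matching interpreter. A sketch of splitting a filename into those fields (a hypothetical helper; pip does this via the `packaging` library):

```python
def parse_wheel_filename(fn):
    """Split a wheel filename into its PEP 427 fields:
    {name}-{version}[-{build}]-{python tag}-{abi tag}-{platform tag}.whl
    """
    stem = fn[: -len(".whl")]
    parts = stem.split("-")
    return {
        "name": parts[0],
        "version": parts[1],
        "python": parts[-3],    # e.g. cp310 = CPython 3.10
        "abi": parts[-2],
        "platform": parts[-1],  # compound platform tags are joined with '.'
    }

info = parse_wheel_filename(
    "dicomsdl-0.109.2-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl"
)
print(info["python"])
# → cp310
```

If the notebook's Python version differs from the wheel's python tag, pip will refuse the file, which is why these offline datasets pin a specific interpreter version.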
Apache License, v2.0: https://www.apache.org/licenses/LICENSE-2.0
License information was derived automatically
All of the most recent releases from Hugging Face. Updated daily!
This also includes the latest source versions from GitHub. These have the "dev" tag in the filename.
Install them offline in your notebook by using the following command (changing the path to whichever dataset you want):
!pip install --no-deps --no-index /kaggle/input/hf-libraries/bitsandbytes/bitsandbytes-0.43.0-py3-none-manylinux_2_24_x86_64.whl
To use datasets, you will also need to install the pyarrow_hotfix package.
!pip install --no-deps --no-index /kaggle/input/hf-libraries/datasets/pyarrow_hotfix-0.6-py3-none-any.whl
Every so often, I will delete the oldest versions of bitsandbytes because those files are relatively large (100 MB each).
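Since older versions get pruned, a notebook that wants "whichever bitsandbytes wheel is currently in the folder" has to pick the highest version itself. A naive sketch that compares dotted version numbers numerically, on hypothetical filenames (it assumes plain X.Y.Z versions; pip itself uses packaging.version for full PEP 440 handling):

```python
def newest_wheel(filenames):
    """Return the wheel filename with the highest dotted version number.

    Naive sketch: only handles purely numeric X.Y.Z versions; pre-release
    and "dev" suffixes would need packaging.version to compare correctly.
    """
    def version_key(fn):
        ver = fn.split("-")[1]  # version is the second dash-separated field
        return tuple(int(p) for p in ver.split(".") if p.isdigit())
    return max(filenames, key=version_key)

# Hypothetical snapshot of the dataset folder:
wheels = [
    "bitsandbytes-0.41.3-py3-none-any.whl",
    "bitsandbytes-0.43.0-py3-none-manylinux_2_24_x86_64.whl",
    "bitsandbytes-0.9.1-py3-none-any.whl",
]
print(newest_wheel(wheels))
# → bitsandbytes-0.43.0-py3-none-manylinux_2_24_x86_64.whl
```

Note that comparing tuples of ints is what makes 0.43.0 beat 0.9.1; a plain string sort would get that wrong.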
Install files for Detecto and pascal-voc-writer. Thanks to the authors for great tools.
Project links:
* Detecto
* pascal-voc-writer
License: MIT License (MIT)
Usage:
!pip install /kaggle/input/ultralytics-whl/ultralytics-8.0.139-py3-none-any.whl
GitHub: https://github.com/ultralytics/ultralytics
PyPi: https://pypi.org/project/ultralytics/
!pip install ../input/pythonpackageaudiometadata-0111/pytzdata-2020.1-py2.py3-none-any.whl
!pip install ../input/pythonpackageaudiometadata-0111/pendulum-2.1.2-cp37-cp37m-manylinux1_x86_64.whl
!pip install ../input/pythonpackageaudiometadata-0111/bidict-0.21.2-py2.py3-none-any.whl
!pip install ../input/pythonpackageaudiometadata-0111/pprintpp-0.4.0-py2.py3-none-any.whl
!pip install ../input/pythonpackageaudiometadata-0111/tbm_utils-2.6.0-py3-none-any.whl
!pip install ../input/pythonpackageaudiometadata-0111/bitstruct-8.11.0/bitstruct-8.11.0
!pip install ../input/pythonpackageaudiometadata-0111/audio_metadata-0.11.1-py3-none-any.whl
This dataset contains the Keras Self-Attention Python package and its dependencies, for installation in a notebook without internet access.
Check the example notebook for the story: https://www.kaggle.com/donkeys/pip-install-package-from-dataset-no-internet
This dataset was created by Đàm Trọng Tuyên
!pip install ../input/pytorchvideo015/av-9.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
!pip install ../input/pytorchvideo015/iopath-0.1.10-py3-none-any.whl
!pip install ../input/pytorchvideo015/fvcore-0.1.5.post20220512-py3-none-any.whl
!pip install ../input/pytorchvideo015/parameterized-0.8.1-py2.py3-none-any.whl
!pip install ../input/pytorchvideo015/pytorchvideo-0.1.5-py3-none-any.whl
For use in code competitions with no internet. Attach this dataset via "Add Data", then install with:
!mkdir -p /tmp/pip/cache/
!cp ../input/hdbscan0828whl/hdbscan-0.8.28-cp37-cp37m-linux_x86_64.whl /tmp/pip/cache/
!pip install --no-index --find-links /tmp/pip/cache/ hdbscan
Latest version 0.8.28. With the older https://www.kaggle.com/datasets/something4kag/hdbscan0827-whl, import hdbscan no longer worked in notebooks, and the package is no longer available in the Docker image. Made it work!
https://github.com/scikit-learn-contrib/hdbscan
scikit-learn-contrib/hdbscan is licensed under the BSD 3-Clause "New" or "Revised" License
A permissive license similar to the BSD 2-Clause License, but with a 3rd clause that prohibits others from using the name of the copyright holder or its contributors to promote derived products without written consent.
This dataset was created by tensor choko
This dataset was created by minwo3o
This is a Python package for manipulating 2-dimensional tabular data structures (aka data frames). It is close in spirit to pandas or SFrame; however we put specific emphasis on speed and big data support. As the name suggests, the package is closely related to R's data.table and attempts to mimic its core algorithms and API.
The wheel file for installing datatable v0.11.0
!pip install ../input/python-datatable/datatable-0.11.0-cp37-cp37m-manylinux2010_x86_64.whl > /dev/null
import datatable as dt
data = dt.fread("filename").to_pandas()
https://github.com/h2oai/datatable
https://datatable.readthedocs.io/en/latest/index.html