4 datasets found
  1. SPD LTDS Appendix 2 Transformer Data (Table 2) - Dataset - Datopian CKAN instance

    • demo.dev.datopian.com
    Updated May 27, 2025
    Cite
    (2025). SPD LTDS Appendix 2 Transformer Data (Table 2) - Dataset - Datopian CKAN instance [Dataset]. https://demo.dev.datopian.com/dataset/sp-energy-networks--spd-ltds-appendix-2-transformer-data-table-2
    Description

    The "SPD LTDS Appendix 2 Transformer Data (Table 2)" dataset provides the equipment parameter data for each group of transformers on the SP Distribution (SPD) system. The insertion of N/K in the table indicates where a parameter has not been verified.The column titled "Reverse Rating" shows the reverse power flow capability of the tap changer.Click here to access our full Long Term Development Statements for both SP Distribution (SPD) & SP Manweb (SPM). For additional information on column definitions, please click on the Dataset schema link below.Note: A fully formatted copy of this appendix can be downloaded from the Export tab under Alternative exports.Download dataset metadata (JSON)If you wish to provide feedback at a dataset or row level, please click on the “Feedback” tab above. Data Triage:As part of our commitment to enhancing the transparency, and accessibility of the data we share, we publish the results of our Data Triage process.Our Data Triage documentation includes our Risk Assessments; detailing any controls we have implemented to prevent exposure of sensitive information. Click here to access the Data Triage documentation for the Long Term Development Statement dataset.To access our full suite of Data Triage documentation, visit the SP Energy Networks Data & Information page.

  2. Dataset for "Towards Signed Distance Function based Metamaterial Design: Neural Operator Transformer for Forward Prediction and Diffusion Model for Inverse Design"

    • zenodo.org
    zip
    Updated Apr 24, 2025
    Cite
    Qibang Liu; Qibang Liu (2025). Dataset for "Towards Signed Distance Function based Metamaterial Design: Neural Operator Transformer for Forward Prediction and Diffusion Model for Inverse Design" [Dataset]. http://doi.org/10.5281/zenodo.15121966
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Qibang Liu; Qibang Liu
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The dataset for "Towards Signed Distance Function based Metamaterial Design: Neural Operator Transformer for Forward Prediction and Diffusion Model for Inverse Design" includes the geometries, corresponding stress-strain curves, Mises stress fields, displacement fields, signed distance functions, boundary point clouds, mesh nodes, and mesh cell connectivity.
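
    Since the record is hosted on Zenodo, the zip archive can normally also be retrieved through Zenodo's public records API. A minimal sketch follows; the record id is taken from the DOI above, and the files/links layout is the usual Zenodo API response shape, assumed here rather than confirmed by the listing.

    import requests

    # Record id 15121966 comes from the DOI 10.5281/zenodo.15121966
    record = requests.get("https://zenodo.org/api/records/15121966", timeout=30).json()

    for f in record.get("files", []):
        name = f["key"]           # file name within the record
        url = f["links"]["self"]  # direct download link
        with requests.get(url, stream=True, timeout=60) as r:
            r.raise_for_status()
            with open(name, "wb") as out:
                for chunk in r.iter_content(chunk_size=1 << 20):
                    out.write(chunk)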

  3. Data_Sheet_1_Self-attention in vision transformers performs perceptual grouping, not attention.PDF

    • frontiersin.figshare.com
    pdf
    Updated Jun 29, 2023
    Cite
    Paria Mehrani; John K. Tsotsos (2023). Data_Sheet_1_Self-attention in vision transformers performs perceptual grouping, not attention.PDF [Dataset]. http://doi.org/10.3389/fcomp.2023.1178450.s001
    Dataset provided by
    Frontiers
    Authors
    Paria Mehrani; John K. Tsotsos
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Recently, a considerable number of studies in computer vision involve deep neural architectures called vision transformers. Visual processing in these models incorporates computational models that are claimed to implement attention mechanisms. Despite an increasing body of work that attempts to understand the role of attention mechanisms in vision transformers, their effect is largely unknown. Here, we asked if the attention mechanisms in vision transformers exhibit similar effects as those known in human visual attention. To answer this question, we revisited the attention formulation in these models and found that despite the name, computationally, these models perform a special class of relaxation labeling with similarity grouping effects. Additionally, whereas modern experimental findings reveal that human visual attention involves both feed-forward and feedback mechanisms, the purely feed-forward architecture of vision transformers suggests that attention in these models cannot have the same effects as those known in humans. To quantify these observations, we evaluated grouping performance in a family of vision transformers. Our results suggest that self-attention modules group figures in the stimuli based on similarity of visual features such as color. Also, in a singleton detection experiment as an instance of salient object detection, we studied if these models exhibit similar effects as those of feed-forward visual salience mechanisms thought to be utilized in human visual attention. We found that generally, the transformer-based attention modules assign more salience either to distractors or the ground, the opposite of both human and computational salience. Together, our study suggests that the mechanisms in vision transformers perform perceptual organization based on feature similarity and not attention.

  4. PDE dataset

    • kaggle.com
    Updated Jun 27, 2021
    Cite
    Shuhao Cao (2021). PDE dataset [Dataset]. https://www.kaggle.com/scaomath/pde-dataset/discussion
    Available download formats
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset provided by
    Kaggle (http://kaggle.com/)
    Authors
    Shuhao Cao
    Description

    Context

    Tasks

    This dataset contains the Burgers' equation and Darcy flow benchmarks featured in several operator learning tasks involving parametric partial differential equations: for \(a\in \mathcal{A}\)

    \[L_a (u) = f \quad \text{ in } \Omega \subset\mathbb{R}^m,\]

    with appropriate boundary or initial conditions. The tasks are

    • Solving the equation above for a class of initial conditions \(f = u(\cdot, 0)\).
    • Finding the mapping from a varying parameter \(a\) to the solution \(u\).
    • The inverse coefficient identification mapping: \(u+\text{noise} \mapsto a\).

    Data

    The datasets are provided as MATLAB .mat files. They can be loaded, and converted to torch.Tensor format, using the snippets below (a conversion sketch follows the Burgers snippet). The first index runs over samples; the remaining dimensions contain the spatial discretizations of the inputs and targets. data_path is specified by the user.

    Burgers

    PDE for \(u \in H^1((0,2\pi)) \cap C^0(\mathbb{S}^1)\): \[ \frac{\partial u}{\partial t} + u \frac{\partial u}{\partial x} = \nu \frac{\partial ^{2}u}{\partial x^{2}} \quad \text{ for } (x,t)\in (0,2\pi) \times (0,1],\] with initial condition \(u(x, 0) = u_0(x)\) for \(x\in (0,2\pi)\).

    from scipy.io import loadmat  # SciPy reads the MATLAB .mat files

    data = loadmat(data_path)
    x_data = data['a']  # input: sampled initial conditions
    y_data = data['u']  # target: corresponding solutions
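
    A minimal sketch of the torch.Tensor conversion mentioned above, assuming x_data and y_data are 2-D arrays of shape (n_samples, n_grid_points) as described:

    import torch

    # Convert the NumPy arrays returned by loadmat to float32 tensors;
    # the first dimension indexes samples, the rest the spatial grid.
    x = torch.from_numpy(x_data).float()
    y = torch.from_numpy(y_data).float()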
    

    Darcy flow

    In a forward problem, a is the input and u is the solution (target). In an inverse problem, u is the input and a is the target.

    PDE: for coefficient \(a\in L^{\infty}(\Omega)\) and solution \(u \in H^1_0(\Omega)\), \[- \nabla \cdot( a(x) \nabla u ) = 1\quad \text{in }\; (0,1)^2,\] with a zero Dirichlet boundary condition.

    from scipy.io import loadmat

    data = loadmat(data_path)
    a = data['coeff']  # coefficient field (input in the forward problem)
    u = data['sol']    # solution field (target in the forward problem)
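
    For the inverse coefficient-identification task listed above (\(u+\text{noise} \mapsto a\)), a minimal sketch of forming the noisy input/target pair; the 5% relative Gaussian noise level is an illustrative assumption, not a value prescribed by the dataset.

    import numpy as np

    # Inverse problem: noisy solution field as input, coefficient field as target.
    noise_level = 0.05  # hypothetical noise level, for illustration only
    u_noisy = u + noise_level * np.abs(u).mean() * np.random.randn(*u.shape)

    x_inverse = u_noisy  # input
    y_inverse = a        # target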
    

    License

    The dataset is owned by Zongyi Li and the license is MIT: https://github.com/zongyi-li/fourier_neural_operator

    Bibliography

    @misc{li2020fourier,
       title={Fourier Neural Operator for Parametric Partial Differential Equations}, 
       author={Zongyi Li and Nikola Kovachki and Kamyar Azizzadenesheli and Burigede Liu and Kaushik Bhattacharya and Andrew Stuart and Anima Anandkumar},
       year={2020},
       eprint={2010.08895},
       archivePrefix={arXiv},
       primaryClass={cs.LG}
    }
    
    @misc{li2020neural,
       title={Neural Operator: Graph Kernel Network for Partial Differential Equations}, 
       author={Zongyi Li and Nikola Kovachki and Kamyar Azizzadenesheli and Burigede Liu and Kaushik Bhattacharya and Andrew Stuart and Anima Anandkumar},
       year={2020},
       eprint={2003.03485},
       archivePrefix={arXiv},
       primaryClass={cs.LG}
    }
    
    @misc{cao2021,
       title={Choose a Transformer: Fourier or Galerkin}, 
       author={Shuhao Cao},
       year={2021},
       eprint={2105.14995},
       archivePrefix={arXiv},
       primaryClass={cs.LG}
    }
    
