Merge pull request #1334 from dandi/bf-codespell
codespell: tuneup of config and some new typos detected fixes
yarikoptic committed Oct 16, 2023
2 parents e375663 + 58b61f7 commit a0199fc
Showing 4 changed files with 5 additions and 5 deletions.
2 changes: 1 addition & 1 deletion dandi/tests/data/metadata/dandimeta_migration.new.json
@@ -3,7 +3,7 @@
"schemaKey": "Dandiset",
"schemaVersion": "0.4.0",
"name": "A NWB-based dataset and processing pipeline of human single-neuron activity during a declarative memory task",
-"description": "A challenge for data sharing in systems neuroscience is the multitude of different data formats used. Neurodata Without Borders: Neurophysiology 2.0 (NWB:N) has emerged as a standardized data format for the storage of cellular-level data together with meta-data, stimulus information, and behavior. A key next step to facilitate NWB:N adoption is to provide easy to use processing pipelines to import/export data from/to NWB:N. Here, we present a NWB-formatted dataset of 1863 single neurons recorded from the medial temporal lobes of 59 human subjects undergoing intracranial monitoring while they performed a recognition memory task. We provide code to analyze and export/import stimuli, behavior, and electrophysiological recordings to/from NWB in both MATLAB and Python. The data files are NWB:N compliant, which affords interoperability between programming languages and operating systems. This combined data and code release is a case study for how to utilize NWB:N for human single-neuron recordings and enables easy re-use of this hard-to-obtain data for both teaching and research on the mechanisms of human memory.",
+"description": "A challenge for data sharing in systems neuroscience is the multitude of different data formats used. Neurodata Without Borders: Neurophysiology 2.0 (NWB:N) has emerged as a standardized data format for the storage of cellular-level data together with meta-data, stimulus information, and behavior. A key next step to facilitate NWB:N adoption is to provide easy to use processing pipelines to import/export data from/to NWB:N. Here, we present a NWB-formatted dataset of 1863 single neurons recorded from the medial temporal lobes of 59 human subjects undergoing intracranial monitoring while they performed a recognition memory task. We provide code to analyze and export/import stimuli, behavior, and electrophysiological recordings to/from NWB in both MATLAB and Python. The data files are NWB:N compliant, which affords interoperability between programming languages and operating systems. This combined data and code release is a case study for how to utilize NWB:N for human single-neuron recordings and enables easy reuse of this hard-to-obtain data for both teaching and research on the mechanisms of human memory.",
"contributor": [
{
"schemaKey": "Person",
2 changes: 1 addition & 1 deletion docs/design/python-api-1.md
@@ -118,7 +118,7 @@ Designs for an improved Python API
* The basic methods simply upload/download everything, blocking until completion, and return either nothing or a summary of everything that was uploaded/downloaded
* These methods have `show_progress=True` options for whether to display progress output using pyout or to remain silent
* The upload methods return an `Asset` or collection of `Asset`s. This can be implemented by having the final value yielded by the `iter_` upload function contain an `"asset"` field.
-* Each method also has an iterator variant (named with an "`iter_`" preffix) that can be used to iterate over values containing progress information compatible with pyout
+* Each method also has an iterator variant (named with an "`iter_`" prefix) that can be used to iterate over values containing progress information compatible with pyout
* These methods do not output anything (aside perhaps from logging)

* An `UploadProgressDict` is a `dict` containing some number of the following keys:
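The design excerpt above pairs each blocking upload method with an `iter_` variant whose final yielded value carries an `"asset"` field. A minimal sketch of that shape, with hypothetical names and payloads (the real API in `docs/design/python-api-1.md` may differ):

```python
from typing import Any, Dict, Iterator, List


def iter_upload(paths: List[str]) -> Iterator[Dict[str, Any]]:
    """Hypothetical iterator variant: yields pyout-compatible progress
    dicts; the final yielded value contains an "asset" field."""
    for i, path in enumerate(paths, start=1):
        yield {"path": path, "status": "uploading",
               "done%": 100 * i // len(paths)}
    yield {"status": "done", "asset": {"path": paths[-1]}}


def upload(paths: List[str], show_progress: bool = True) -> Dict[str, Any]:
    """Blocking variant built on the iterator, as the design describes:
    consumes all progress values and returns the asset from the last one."""
    last: Dict[str, Any] = {}
    for last in iter_upload(paths):
        if show_progress:
            print(last)  # stand-in for pyout rendering
    return last["asset"]
```

The point of the split is that the blocking method is a thin consumer of the iterator, so both variants share one upload code path.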
4 changes: 2 additions & 2 deletions setup.cfg
@@ -127,10 +127,10 @@ tag_prefix =
parentdir_prefix =

[codespell]
-skip = dandi/_version.py,dandi/due.py,versioneer.py
+skip = _version.py,due.py,versioneer.py,*.vcr.yaml,venv,venvs
# Don't warn about "[l]ist" in the abbrev_prompt() docstring:
# TE is present in the BIDS schema
-ignore-regex = (\[\w\]\w+|TE)
+ignore-regex = (\[\w\]\w+|TE|ignore "bu" strings)
exclude-file = .codespellignore

[mypy]
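A quick sanity check (not part of the repository) of what the updated `ignore-regex` matches: codespell skips any text matched by this pattern, so bracketed abbreviations like `[l]ist`, the BIDS `TE` entity, and the literal marker added in this commit are all exempted. The regex is copied verbatim from the `setup.cfg` hunk above:

```python
import re

# The updated ignore-regex from [codespell] in setup.cfg, verbatim:
ignore = re.compile(r'(\[\w\]\w+|TE|ignore "bu" strings)')

# "[l]ist" as used in the abbrev_prompt() docstring is matched (skipped):
assert ignore.search("choose [l]ist or [d]elete")
# "TE" (present in the BIDS schema) is matched:
assert ignore.search("the TE entity")
# the literal marker string is matched:
assert ignore.search('ignore "bu" strings')
```

Note the pattern is case-sensitive, so lowercase "te" inside ordinary words is still spell-checked.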
2 changes: 1 addition & 1 deletion tools/update-assets-on-server
@@ -3,7 +3,7 @@
Composed by Satra (with only little changes by yoh).
Initially based on code in dandisets' backups2datalad.py code for updating
-as a part of that script but it was intefering with the updates to datalad thus
+as a part of that script but it was interfering with the updates to datalad thus
extracted into a separate script.
"""

