Workbench:StatisticsCommands

From Van Essen Lab

Revision as of 23:37, 30 August 2013

Commands

Are we replacing metric with Cifti Scalar?

TSC: no, processing cifti in nontrivial ways generally has to operate on each structure independently, which is far more convenient as metric files than trying to deal directly with the mappings.

NOTE: -cifti-convert is a HACK to get cifti data into cifti-unaware applications, like MATLAB. -cifti-create-dense-timeseries is currently the main way to create cifti from standard non-cifti data files.
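
For reference, a rough sketch of both workflows, with placeholder file names (the exact options have changed over time, so check each operation's built-in help):

 # round-trip cifti through a plain NIFTI so cifti-unaware tools (e.g. MATLAB) can read and write it
 wb_command -cifti-convert -to-nifti input.dtseries.nii fakenifti.nii
 wb_command -cifti-convert -from-nifti fakenifti_edited.nii input.dtseries.nii output.dtseries.nii
 # assemble a dense timeseries from standard (non-cifti) metric files
 wb_command -cifti-create-dense-timeseries output.dtseries.nii -left-metric left.func.gii -right-metric right.func.gii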

caret_command -file-convert

e.g., if you want to convert to/from VTK; wb_import doesn't like some .gii files, and I haven't worked out why

If there is a file that is failing to convert, let us know.

wb_import is part of the Caret5 source. It may be possible to move it to the Workbench source, but that may be a lot of work.

caret_command -metric-information

We should probably have either a "file information" command that works for any file and whose output is file-dependent, or a "map information" command that works for any mappable file (anything selectable as an overlay). For files that are mapped with a palette, the output should be very similar to the existing command, since all of these files make descriptive statistics available. For files mapped with a label table, we may want to output the map indices and names, and optionally the contents of the label table (keys/names).


Filename: Glasser_PilotIII.L.T1DividedByT2_RemovedOutliers_s5.20k_fs_LR.func.gii
Number of Nodes: 20252
Number of Columns: 1
Column      Minimum      Maximum           Mean     Sample Dev     % Positive     % Negative   Column Name
    1            0.000           1.842                 1.225          0.401         91.472          0.000         Masked Myelin Map

TSC: I think having one command work on all file types could be fine, since it doesn't output a file.

caret_command -show-scene

There is a -show-scene command in wb_command. We should compare the output to see if any needed functionality is missing.
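
For comparison, the current invocation is roughly as follows (arguments are from memory and the file names are placeholders; check -show-scene's help):

 # render scene number 1 from a scene file to a 1024x768 image
 wb_command -show-scene scenes.scene 1 capture.png 1024 768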

caret_command -surface-border-projection

A projection algorithm is in Workbench and can be used here.

caret_command -surface-border-unprojection

An unprojection algorithm is in Workbench and can be used here.

caret_command -surface-border-draw-around-roi

Probably can incorporate the algorithm from Caret5.

caret_command -surface-cell-projection

Cells are not in Workbench but are the same as foci.

caret_command -surface-cell-unprojection

Cells are not in Workbench but are the same as foci.

caret_command -surface-foci-projection

We do not have foci files in Workbench, just foci projection files. wb_command only imports foci projection files but could be updated to handle foci files. An alternative is to convert foci to foci projections in Caret5.

Foci projection is in Workbench.

caret_command -surface-foci-unprojection

Foci unprojection is in Workbench.

caret_command -surface-folding-measures

Probably can copy from Caret5 but will need some updating.

caret_command -surface-identify-sulci

This is a very large, complex operation.

TSC: this is mainly needed for depth, which we are going to reengineer anyway

caret_command -surface-region-of-interest-selection

This is a big command, but it may be possible to use "wb_command -cifti-math" instead.
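
As a sketch of the math-based approach at the metric level (the threshold and file names are placeholders), an ROI can be built by thresholding, since comparison operators return 1 or 0:

 # nodes where the input value exceeds 2 become 1, all others 0
 wb_command -metric-math '(x > 2)' roi.func.gii -var x sulc.func.gii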

caret_command -surface-to-segmentation-volume

This is possible with wb_command -create-signed-distance-volume followed by -volume-math 'x<0'; it could also be provided as a separate command.
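
A minimal sketch of that two-step recipe (file names are placeholders):

 # signed distance from the surface, sampled in the space of a reference volume (negative inside)
 wb_command -create-signed-distance-volume midthickness.surf.gii refspace.nii.gz signed_dist.nii.gz
 # voxels inside the surface become 1, all others 0
 wb_command -volume-math 'x < 0' segmentation.nii.gz -var x signed_dist.nii.gz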

caret_command -surface-topology-disconnect-nodes

Probably can copy from Caret5 but will need some updating.

caret_command -volume-create

Is there a desire to create a volume by specifying dimensions, spacing, and origin? By the name of a stereotaxic space? Or both?

TSC: it should be quite easy - I would opt not to provide any standard spaces, since it is much more likely you will already have a reference volume for those.

caret_command -volume-set-origin

Create a command that allows the user to update a volume file's attributes (origin/spacing).

caret_command -volume-set-spacing

See above.

TSC: these should be rolled into one command that just sets the entire sform
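
Purely to illustrate what "the entire sform" covers (the command name and argument layout below are hypothetical, not an existing operation): the sform is the full 3x4 voxel-to-millimeter affine, which subsumes origin, spacing, and orientation in one place.

 # hypothetical invocation: set all 12 sform coefficients (three rows of x, y, z, offset)
 wb_command -volume-set-sform in.nii.gz out.nii.gz  2 0 0 -90  0 2 0 -126  0 0 2 -72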

caret_command -volume-histogram

This should probably be generalized to work with any mappable data file (displayed as overlay) that maps with a palette.

TSC: what would the output be?

caret_command -volume-information

See -metric-information near the top.

caret_command -volume-information-nifti

See -metric-information near the top.

TSC: redundant, all volumes are nifti.

caret_command -metric-composite

Use wb_command -metric-merge (similar exist for cifti and volume, and soon for label).
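
A sketch of the merge-based replacement (file names and column selection are placeholders):

 # concatenate columns from several metric files into one output metric; -column selects a single column
 wb_command -metric-merge combined.func.gii -metric subj1.func.gii -metric subj2.func.gii -column 1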

caret_command -metric-composite-identified-columns

Use wb_command -metric-merge (similar exist for cifti and volume, and soon for label).

caret_command -metric-math-postfix

Use wb_command -metric-math (similar exist for cifti and volume).
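
Note that -metric-math takes ordinary infix expressions rather than postfix; a sketch with placeholder names:

 # per-node arithmetic on two metrics using an infix expression
 wb_command -metric-math '(a + b) / 2' mean.func.gii -var a first.func.gii -var b second.func.gii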

caret_command -metric-set-column-name

Use wb_command -set-map-names.
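
A sketch of the replacement (the map index and name are placeholders; the file is modified in place):

 # rename map (column) 1
 wb_command -set-map-names myelin.func.gii -map 1 'Masked Myelin Map'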

caret_command -metric-set-column-to-scalar

Use wb_command -metric-math (similar exist for cifti and volume).
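
A sketch with -metric-math (the constant and file names are placeholders; a -var is still needed so the output gets the right number of nodes):

 # fill the output with a constant; the input metric only supplies the node count
 wb_command -metric-math '(x * 0) + 7' constant.func.gii -var x template.func.gii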

Statistics Commands

Will this be a new command-line program "wb_stats", or will the commands be added to "wb_command"?

All statistics commands will operate on CIFTI Scalar files containing surface-based data. Will there be one structure per file or multiple structures (left/right)? Is there ANY POSSIBILITY that these statistics commands will need to operate on CIFTI Scalar files containing volume data, or on volume files?

Inferential Linear Statistics Commands

For the linear statistics commands we could reformulate and solve each test using a General Linear Model (Caret:Documentation:StatisticsGLM). However, it will be simpler just to use the standard algorithm for each statistical test (and probably best if reviewers of papers ask). See Caret:Documentation:Statistics.
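
As a one-formula illustration of the tradeoff (standard textbook notation, not from this page): the one-sample t statistic has the closed form

 t = \frac{\bar{x} - \mu_0}{s / \sqrt{n}}

while the GLM route fits y = X\beta + \varepsilon with X a column of ones and tests \hat{\beta} / \mathrm{SE}(\hat{\beta}); since \hat{\beta} = \bar{x} and \mathrm{SE}(\hat{\beta}) = s/\sqrt{n}, the two give the same statistic when \mu_0 = 0.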

If possible, parallel processing with OpenMP should be used. Since these commands often operate on "vectors" of data, GPU processing (OpenCL) may be possible, but that decision should be deferred.

All of these commands are in either Caret5 or Caret6, so implementation should be fairly straightforward, and the previous implementations can be used to validate the operation of the new commands.

caret6_stats -inferential-anova-one-way

caret6_stats -inferential-anova-one-way-coordinate-difference

caret6_stats -inferential-interhemispheric

caret6_stats -inferential-t-test-one-sample

caret6_stats -inferential-t-test-paired

caret6_stats -inferential-t-test-two-sample

Significance Testing Statistics Commands

caret6_stats -significance-cluster-threshold MAYBE

caret6_stats -significance-threshold-free
