# Caret:Documentation:Statistics:TFCE Implementation

### From Van Essen Lab


## Revision as of 20:26, 13 August 2009

# TFCE Implementation for Surface Metrics

## High Level Explanation

The core concept of our implementation was to grow the clusters from the top (highest values) down, thereby avoiding the need to do many expensive threshold operations. Instead, each cluster is separately broken into "slices" which are computed one at a time, and added to all nodes within a cluster. When clusters merge, the slices are aligned and all slices from that point on are added to all members of the new, merged cluster, as shown in the image below.
[[Image:TFCE_slices.jpg]]

At first, the clusters have separate slices computed regularly (shown as yellow and blue), but just as they merge, a new slice is computed for each of them separately, so that the previous threshold for each of them is the same. All following slices (green) are then computed for the merged cluster.
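The contribution of a single slice can be written down directly. As a minimal sketch, the function below integrates the standard TFCE integrand, extent^E * h^H, across one slice; the defaults E = 0.5 and H = 2 are the conventional TFCE parameters (Smith & Nichols) and are an assumption here, not necessarily the constants Caret uses.

```python
def slice_contribution(extent, h_bottom, h_top, E=0.5, H=2.0):
    """Approximate the integral of extent^E * h^H over [h_bottom, h_top],
    treating the cluster extent as constant within the slice and using
    the midpoint of the slice for the h^H factor.

    NOTE: E and H defaults are the conventional TFCE parameters, assumed
    here for illustration; Caret's actual constants may differ."""
    dh = h_top - h_bottom
    h_mid = 0.5 * (h_bottom + h_top)  # midpoint rule for the height term
    return (extent ** E) * (h_mid ** H) * dh
```

When two clusters merge at height h, each first gets a slice ending exactly at h (so both integrals are aligned), and every later call uses the combined extent.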

## Programmer's Perspective

Our implementation of TFCE for surface metrics starts by adding all positive-value nodes to a max heap, ensuring that whenever we remove a node, no as-yet-unseen node has a greater value. Nodes are removed one at a time; by examining whether any of a node's neighbors already belong to a cluster, we decide whether the node joins an existing cluster or starts a new one. Each growing cluster tracks its own previously computed threshold, marking the last value at which it had a slice computed and added to the values of its members. If a node is adjacent to more than one cluster, each of those clusters immediately has a new slice computed down to the current node's value before the clusters are merged into a single larger cluster. This way, the cluster area at the "top" and "bottom" of each slice never has a large discontinuity, which could cause inconsistency in the numerical approximation of the integral; the justification is that the clusters do not truly merge until below the current threshold.
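The heap-driven growth and merge logic above can be sketched as follows. This is a simplified illustration, not Caret's actual code: the delta-h cutoffs described in the next paragraph are omitted (every touching cluster gets a slice at every node value), and the E and H defaults are the conventional TFCE parameters, assumed for illustration.

```python
import heapq

def tfce_top_down(values, neighbors, E=0.5, H=2.0):
    """Sketch of top-down TFCE cluster growth.

    values:    dict mapping node id -> statistic (only positives are used)
    neighbors: dict mapping node id -> list of adjacent node ids
    Returns a dict of per-node TFCE scores."""
    # heapq is a min-heap, so negate values to pop the largest first
    heap = [(-v, n) for n, v in values.items() if v > 0]
    heapq.heapify(heap)
    cluster_of = {}   # node -> cluster id
    clusters = {}     # cluster id -> {'members': set, 'last_h': float}
    scores = {n: 0.0 for n in values}
    next_id = 0

    def add_slice(cid, h_bottom):
        # Integrate extent^E * h^H over [h_bottom, last_h] for one cluster,
        # treating its extent as constant within the slice (midpoint rule).
        c = clusters[cid]
        dh = c['last_h'] - h_bottom
        if dh > 0:
            h_mid = 0.5 * (c['last_h'] + h_bottom)
            contrib = (len(c['members']) ** E) * (h_mid ** H) * dh
            for n in c['members']:
                scores[n] += contrib
            c['last_h'] = h_bottom

    while heap:
        neg_v, node = heapq.heappop(heap)
        h = -neg_v
        touching = sorted({cluster_of[nb] for nb in neighbors.get(node, ())
                           if nb in cluster_of})
        # Align the slices of every adjacent cluster at the current value...
        for cid in touching:
            add_slice(cid, h)
        if not touching:                       # start a new cluster
            cid, next_id = next_id, next_id + 1
            clusters[cid] = {'members': set(), 'last_h': h}
        else:                                  # ...then merge them into one
            cid = touching[0]
            for other in touching[1:]:
                for n in clusters[other]['members']:
                    cluster_of[n] = cid
                clusters[cid]['members'] |= clusters[other]['members']
                del clusters[other]
        clusters[cid]['members'].add(node)
        cluster_of[node] = cid

    for cid in clusters:                       # final slice down to zero
        add_slice(cid, 0.0)
    return scores
```

With E = 0 and H = 0 every slice simply contributes its height to each member, so a node's score reduces to its own value, which makes the bookkeeping easy to check by hand.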

After all neighbors are examined, if the current node has a different value than the next node in the heap and was adjacent to only one cluster, the change in height, delta h, is compared against two cutoffs: a minimum absolute step (0.01 in our case) and a minimum step as a ratio of the previous threshold (0.001 in our case). If delta h is smaller than either cutoff, the slice is not computed and the previous threshold is left unchanged. The ratio cutoff ensures numerical stability with floating-point arithmetic, since the precision of a floating-point value is approximately proportional to its magnitude. The absolute cutoff ensures that as values approach zero, where they matter less, the shrinking ratio cutoff does not prevent the computation from finishing swiftly.

If the next node in the heap has the same value as the current node, the current node's cluster is added to a list of clusters that have grown since their last slice was computed, and the next node is processed. Once the value of the current node differs from that of the next node, every cluster in the list is checked against the delta h cutoffs described above, possibly recomputed, and the list is cleared. Additionally, when all nodes of the specified sign have been added to clusters, each cluster has its final slice (down to zero) computed. The entire process is then repeated for the negative values by flipping them positive, running the computation, and flipping the results back to negative.
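The two delta-h cutoffs amount to a single predicate: a slice is worth computing only when the drop exceeds both the absolute minimum step and the given fraction of the previous threshold. A minimal sketch, using the constants 0.01 and 0.001 stated above:

```python
def should_compute_slice(prev_h, current_h, min_step=0.01, min_ratio=0.001):
    """Return True only when the drop from prev_h to current_h is large
    enough to warrant a new slice: delta h must meet both the absolute
    minimum step and the given fraction of the previous threshold.
    The defaults are the constants quoted in the text."""
    delta_h = prev_h - current_h
    return delta_h >= min_step and delta_h >= min_ratio * prev_h
```

Near zero the ratio term becomes negligible and the absolute step dominates, so the algorithm takes at most on the order of prev_h / 0.01 further slices; at large values the ratio term dominates, keeping each step large relative to the floating-point precision available at that magnitude.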