3 results for Reliability (Statistics) at Duke University


Abstract:

Measuring the entorhinal cortex (ERC) is challenging because its lateral border is difficult to discriminate from the perirhinal cortex. From a sample of 39 nondemented older adults who completed volumetric image scans and verbal memory indices, we examined reliability and validity concerns for three ERC protocols with different lateral boundary guidelines (i.e., Goncharova, Dickerson, Stoub, & deToledo-Morrell, 2001; Honeycutt et al., 1998; Insausti et al., 1998). We used three novice raters to assess inter-rater reliability on a subset of scans (216 total ERCs), with the entire dataset measured by one rater with strong intra-rater reliability on each technique (234 total ERCs). We found moderate to strong inter-rater reliability for the two techniques with consistent ERC lateral boundary endpoints (Goncharova, Honeycutt), and negligible to moderate reliability for the technique requiring consideration of collateral sulcal depth (Insausti). Left ERC and story memory associations were moderate and positive for the two techniques designed to exclude the perirhinal cortex (Insausti, Goncharova), with the Insausti technique continuing to explain 10% of memory score variance after additionally controlling for depression symptom severity. Right ERC-story memory associations were nonexistent after excluding an outlier. Researchers are encouraged to consider the challenges of rater training for ERC techniques and how lateral boundary endpoints may impact structure-function associations.
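The inter- and intra-rater reliability reported above is commonly quantified with an intraclass correlation coefficient. As an illustrative sketch only (not the authors' actual analysis; the function name and data are invented here), ICC(2,1) for an n-subjects-by-k-raters matrix can be computed from the two-way ANOVA mean squares:

```python
import numpy as np

def icc2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    x is an (n subjects, k raters) array of measurements.
    """
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-subject means
    col_means = x.mean(axis=0)   # per-rater means
    ss_total = ((x - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()   # between-subjects SS
    ss_cols = n * ((col_means - grand) ** 2).sum()   # between-raters SS
    ss_err = ss_total - ss_rows - ss_cols            # residual SS
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Perfect agreement between two raters yields ICC = 1.0.
perfect = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
print(icc2_1(perfect))  # → 1.0
```

Values near 1 indicate strong agreement; disagreement between raters pulls the coefficient toward (or below) zero.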

Abstract:

Nolan and Temple Lang argue that “the ability to express statistical computations is an essential skill.” A key related capacity is the ability to conduct and present data analysis in a way that another person can understand and replicate. The copy-and-paste workflow that is an artifact of antiquated user-interface design makes reproducibility of statistical analysis more difficult, especially as data become increasingly complex and statistical methods become increasingly sophisticated. R Markdown is a new technology that makes creating fully reproducible statistical analysis simple and painless. It provides a solution suitable not only for cutting-edge research, but also for use in an introductory statistics course. We present experiential and statistical evidence that R Markdown can be used effectively in introductory statistics courses, and discuss its role in the rapidly changing world of statistical computation.
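To illustrate the workflow the abstract describes (the title, filename, and chunk contents below are invented for this sketch), a minimal R Markdown source file interleaves prose with executable R code:

````markdown
---
title: "Reproducible Analysis Example"
output: html_document
---

The sample mean reported below is computed directly from the
data at render time, so the number in the report cannot drift
out of sync with the analysis.

```{r summary-stats}
x <- c(4.2, 5.1, 3.8, 4.9)
mean(x)
```
````

Rendering the file (e.g., with `rmarkdown::render("report.Rmd")`) re-runs every code chunk and regenerates the document, replacing the error-prone copy-and-paste step the abstract criticizes.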

Abstract:

Slowly compressed single crystals, bulk metallic glasses (BMGs), rocks, granular materials, and the earth all deform via intermittent slips or "quakes". We find that although these systems span 12 decades in length scale, they all show the same scaling behavior for their slip size distributions and other statistical properties. Remarkably, the size distributions follow the same power law multiplied by the same exponential cutoff. The cutoff grows with applied force for materials spanning length scales from nanometers to kilometers. The tunability of the cutoff with stress reflects "tuned critical" behavior, rather than self-organized criticality (SOC), which would imply stress-independence. A simple mean field model for avalanches of slipping weak spots explains the agreement across scales. It predicts the observed slip-size distributions and the observed stress-dependent cutoff function. The results enable extrapolations from one scale to another, and from one force to another, across different materials and structures, from nanocrystals to earthquakes.
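The scaling form described above, a power law with a stress-dependent exponential cutoff, can be written schematically (symbols chosen here for illustration; in mean-field models of slip avalanches the exponent is typically 3/2) as:

```latex
P(S) \propto S^{-\kappa}\, e^{-S/S_{\max}(F)},
\qquad \kappa \approx \tfrac{3}{2} \text{ (mean field)},
\qquad S_{\max}(F) \text{ increasing in applied force } F .
```

The "tuned critical" claim is that \(S_{\max}\) depends on the applied stress \(F\); under self-organized criticality, \(S_{\max}\) would instead be stress-independent.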