27 results for downloading of data


Abstract:

Background: SPARCLE is a cross-sectional survey in nine European regions examining the relationship between the environment of children with cerebral palsy and their participation and quality of life. The objective of this report is to assess data quality, in particular heterogeneity between regions, family and item non-response, and potential for bias. Methods: 1,174 children aged 8–12 years were selected from eight population-based registers of children with cerebral palsy; one further centre recruited 75 children from multiple sources. Families were visited by trained researchers who administered psychometric questionnaires. Logistic regression was used to assess factors related to family non-response and to self-completion of questionnaires by children. Results: 431/1,174 (37%) families identified from registers did not respond: 146 (12%) were not traced; of the 1,028 traced families, 250 (24%) declined to participate and 35 (3%) were not approached. Families whose disabled children could walk unaided were more likely to decline to participate. 818 children entered the study, of whom 500 (61%) self-reported their quality of life; children with low IQ, seizures or inability to walk were less likely to self-report. There was substantial heterogeneity between regions in response rates and socio-demographic characteristics of families, but not in the age or gender of children. Item non-response was 2% for children and ranged from 0.4% to 5% for questionnaires completed by parents. Conclusion: While the proportion of untraced families was higher than in similar surveys, the refusal rate was comparable. To reduce bias, all analyses should allow for region, walking ability, age and socio-demographic characteristics. The 75 children in the region without a population-based register are unlikely to introduce bias.
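As an illustrative aside, the sketch below shows how a non-response analysis of this kind can be set up as a logistic regression of a response indicator on family and child characteristics. The data and variable names (walks_unaided, age) are simulated and hypothetical, not SPARCLE's.

```python
# Hypothetical sketch of a family non-response analysis: logistic
# regression of a response indicator on child characteristics.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1174
walks_unaided = rng.integers(0, 2, n)   # 1 = child can walk unaided (hypothetical coding)
age = rng.integers(8, 13, n)            # ages 8-12 years
# Simulate higher refusal among families whose child walks unaided,
# as reported in the abstract.
p_respond = 1 / (1 + np.exp(-(0.8 - 0.6 * walks_unaided)))
responded = rng.binomial(1, p_respond)

X = sm.add_constant(np.column_stack([walks_unaided, age]))
fit = sm.Logit(responded, X).fit(disp=0)
print(np.exp(fit.params))               # odds ratios for intercept, walking, age
```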

Abstract:

Dynamic power consumption is strongly dependent on interconnect, so clever mapping of digital signal processing algorithms to parallelised realisations with data locality is vital. This is a particular problem for fast algorithm implementations, where designers have typically sacrificed circuit structure for efficiency in software implementation. This study outlines an approach for reducing the dynamic power consumption of a class of fast algorithms by minimising the index space separation; this allows the generation of field programmable gate array (FPGA) implementations with reduced power consumption. It is shown how a 50% reduction in relative index space separation yields power reductions of 36% and 37% over a Cooley-Tukey Fast Fourier Transform (FFT)-based solution, based on actual power measurements for a Xilinx Virtex-II FPGA implementation and circuit-level measurements for a Xilinx Virtex-5 implementation respectively. The authors show the generality of the approach by applying it to a number of other fast algorithms, namely the discrete cosine, discrete Hartley and Walsh-Hadamard transforms.
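The paper's precise index space separation metric is not reproduced here, so the toy sketch below substitutes a simple surrogate: the total |i - j| line distance summed over all butterfly edges of a radix-2 decimation-in-frequency Cooley-Tukey FFT. It only illustrates how a candidate mapping's interconnect locality might be scored before comparing alternatives.

```python
# Toy surrogate metric (not the authors' exact formulation): sum of
# |i - j| over the butterfly edges of a radix-2 DIF FFT on n points.
# Lower totals indicate more local interconnect.
def fft_separation(n):
    """Total index separation of a radix-2 Cooley-Tukey FFT dataflow."""
    total, stages = 0, n.bit_length() - 1
    for s in range(stages):
        half = n >> (s + 1)
        for group in range(0, n, n >> s):
            for k in range(half):
                i, j = group + k, group + k + half
                # Each butterfly reads lines i and j and writes both back,
                # contributing two edges of length |i - j|.
                total += 2 * (j - i)
    return total

print(fft_separation(16))  # score one mapping; compare against reordered variants
```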

Abstract:

The purpose of this study is to survey the use of networks and network-based methods in systems biology. It starts with an introduction to graph theory and basic measures for quantifying structural properties of networks. The authors then present important network classes and gene networks, as well as methods for their analysis. In the last part of the study, the authors review approaches aimed at analysing the functional organisation of gene networks and the use of networks in medicine. In addition, the authors advocate networks as a systematic approach to general problems in systems biology, because networks can assume multiple roles that are highly beneficial in connecting experimental data with a functional interpretation in biological terms.
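As a small concrete example of the basic structural measures such a survey covers, the sketch below computes node degree and the local clustering coefficient on a toy undirected graph; the graph and helper names are illustrative, not taken from the study.

```python
# Two basic structural measures on a toy undirected graph stored as an
# adjacency dict: node degree and the local clustering coefficient.
from itertools import combinations

graph = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B"},
    "D": {"B"},
}

def degree(g, v):
    return len(g[v])

def clustering(g, v):
    """Fraction of v's neighbour pairs that are themselves linked."""
    nbrs = g[v]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for u, w in combinations(nbrs, 2) if w in g[u])
    return 2 * links / (len(nbrs) * (len(nbrs) - 1))

for v in graph:
    print(v, degree(graph, v), round(clustering(graph, v), 2))
```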

Abstract:

The use of bit-level systolic array circuits as building blocks in the construction of larger word-level systolic systems is investigated. It is shown that the overall structure and detailed timing of such systems may be derived quite simply using the dependence graph and cut-set procedure developed by S. Y. Kung (1988). This provides an attractive and intuitive approach to the bit-level design of many VLSI signal processing components. The technique can be applied to ripple-through and partly pipelined circuits as well as fully systolic designs. It therefore provides a means of examining the relative tradeoff between levels of pipelining, chip area, power consumption, and throughput rate within a given VLSI design.
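A minimal sketch of the cut-set delay-transfer rule that underlies this procedure, using our own dictionary encoding of an edge-weighted dependence graph: k delays may be added to every edge crossing a chosen cut in one direction and removed from every edge crossing it in the other, leaving input-output behaviour unchanged provided no edge delay becomes negative.

```python
# Cut-set delay transfer on a dependence graph. Edges are (src, dst): delay;
# part_s is the node set on one side of the cut.
def cut_set_retime(edges, part_s, k):
    out = dict(edges)
    for (u, v), d in edges.items():
        if u in part_s and v not in part_s:     # crosses the cut forwards
            out[(u, v)] = d + k
        elif u not in part_s and v in part_s:   # crosses the cut backwards
            out[(u, v)] = d - k
    assert all(d >= 0 for d in out.values()), "negative delay: cut invalid for this k"
    return out

edges = {("a", "b"): 0, ("b", "c"): 1, ("c", "a"): 2}
print(cut_set_retime(edges, {"a"}, 1))  # pipelines the a->b edge
```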

Abstract:

Data obtained with any research tool must be reproducible, a concept referred to as reliability. Three techniques are often used to evaluate the reliability of tools producing continuous data in aging research: intraclass correlation coefficients (ICC), Pearson correlations, and paired t tests. These are often construed as equivalent when applied to reliability. This is not correct, and it may lead researchers to select instruments based on statistics that may not reflect actual reliability. The purpose of this paper is to compare the reliability estimates produced by these three techniques and to determine the preferable technique. A hypothetical dataset was produced to evaluate the reliability estimates obtained with the ICC, Pearson correlation, and paired t test in three different situations. For each situation, two sets of 20 observations were created to simulate an intrarater or inter-rater paradigm, based on 20 participants with two observations per participant. The situations were designed to demonstrate good agreement, systematic bias, or substantial random measurement error. In the situation demonstrating good agreement, all three techniques supported the conclusion that the data were reliable. In the situation demonstrating systematic bias, the ICC and t test suggested the data were not reliable, whereas the Pearson correlation suggested high reliability despite the systematic discrepancy. In the situation representing substantial random measurement error, where low reliability was expected, the ICC and Pearson coefficient accurately reflected this, while the t test suggested the data were reliable. The ICC is therefore the preferred technique for measuring reliability. Although there are some limitations associated with its use, they can be overcome.
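The comparison can be rerun on simulated data along the lines described above. The sketch below assumes a one-way random-effects ICC(1,1), since the exact ICC form is not specified here, and generates the three situations with arbitrary offsets and noise levels.

```python
# Compare ICC, Pearson r, and paired t test under good agreement,
# systematic bias, and substantial random measurement error.
import numpy as np
from scipy import stats

def icc_1_1(x, y):
    """One-way random-effects ICC for two observations per participant."""
    data = np.column_stack([x, y])
    n, k = data.shape
    msb = k * ((data.mean(axis=1) - data.mean()) ** 2).sum() / (n - 1)
    msw = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

rng = np.random.default_rng(1)
true = rng.normal(50, 10, 20)                             # 20 participants
noise = lambda: rng.normal(0, 1, 20)
cases = {
    "agreement": (true + noise(), true + noise()),
    "bias":      (true + noise(), true + 15 + noise()),   # systematic offset
    "random":    (true + noise(), rng.normal(50, 10, 20)),# unrelated remeasure
}
for name, (a, b) in cases.items():
    t, p = stats.ttest_rel(a, b)
    print(f"{name:9s} ICC={icc_1_1(a, b):5.2f} "
          f"r={stats.pearsonr(a, b)[0]:5.2f} paired-t p={p:.3f}")
```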

Abstract:

In many applications in applied statistics, researchers reduce the complexity of a data set by combining a group of variables into a single measure using factor analysis or an index number. We argue that such compression loses information if the data actually have high dimensionality. We advocate the use of a non-parametric estimator commonly used in physics (the Takens estimator) to estimate the correlation dimension of the data prior to compression. The advantage of this approach over traditional linear data compression approaches is that the data do not have to be linearized. Applying our ideas to the United Nations Human Development Index, we find that the four variables used in its construction have dimension three and that the index therefore loses information.
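A minimal implementation of the Takens estimator in its standard maximum-likelihood form: for the m pairwise inter-point distances d_i that fall below a chosen cutoff r0, the correlation dimension estimate is D = -m / sum(ln(d_i / r0)).

```python
# Takens estimator of correlation dimension from pairwise distances.
import numpy as np

def takens_dimension(points, r0):
    """ML estimate D = -m / sum(log(d_i / r0)) over distances d_i < r0."""
    diff = points[:, None, :] - points[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))[np.triu_indices(len(points), k=1)]
    d = d[(d > 0) & (d < r0)]          # keep distinct pairs below the cutoff
    return -len(d) / np.log(d / r0).sum()

rng = np.random.default_rng(2)
cube = rng.random((1000, 3))           # points filling the unit cube, true dimension 3
print(takens_dimension(cube, 0.2))     # should come out near 3
```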

Abstract:

High-quality data from appropriate archives are needed for the continuing improvement of radiocarbon calibration curves. We discuss here the basic assumptions behind 14C dating that necessitate calibration, and the relative strengths and weaknesses of the archives from which calibration data are obtained. We also highlight the procedures, problems and uncertainties involved in determining atmospheric and surface-ocean 14C/12C ratios in these archives, including a discussion of the various methods used to derive an independent absolute timescale and its uncertainty. The types of data required for the current IntCal database and calibration curve model are tabulated, with examples.
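For a concrete sense of why calibration is needed: a conventional radiocarbon age is computed, by convention, with the Libby mean life of 8033 yr and under the assumption of a constant atmospheric 14C/12C ratio, which the archives discussed above show to be violated. A minimal sketch:

```python
# Conventional radiocarbon age from the measured fraction modern F14C.
import math

LIBBY_MEAN_LIFE = 8033.0  # years: half-life 5568 yr / ln(2), by convention

def conventional_age(f14c):
    """Conventional 14C age (yr BP); calibration curves correct this age."""
    return -LIBBY_MEAN_LIFE * math.log(f14c)

print(conventional_age(0.5))  # ~5568 yr BP, one conventional half-life
```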