78 results for Futures Studies methods
Abstract:
Modern methods of spawning new technological motifs are not appropriate when the goal is to realize artificial life as an actual real-world entity unto itself (Pattee 1995; Brooks 2006; Chalmers 1995). Many fundamental aspects of such a machine are absent from common methods, which generally lack methodologies of construction. In this paper we combine classical and modern studies in an attempt to realize an artificial life form from first principles. A model of an algorithm is introduced, its methodology of construction is presented, and the fundamental source from which it sprang is discussed.
Abstract:
Procedures for routine analysis of soil phosphorus (P) have been used for assessment of P status, distribution and P losses from cultivated mineral soils. No similar studies have been carried out on wetland peat soils. The objective was to compare the extraction efficiency of ammonium lactate (P-AL), sodium bicarbonate (P-Olsen) and double calcium lactate (P-DCaL), and the P distribution in the soil profile of wetland peat soils. For this purpose, 34 samples of the 0-30, 30-60 and 60-90 cm layers were collected from peat soils in Germany, Israel, Poland, Slovenia, Sweden and the United Kingdom and analysed for P. Mean soil pH (CaCl2, 0.01 M) was 5.84, 5.51 and 5.47 in the 0-30, 30-60 and 60-90 cm layers, respectively. P-DCaL was consistently about half the magnitude of either P-AL or P-Olsen. The efficiency of P extraction increased in the order P-DCaL < P-AL ≤ P-Olsen, with corresponding means (mg kg⁻¹) for all soils (34 samples) of 15.32, 33.49 and 34.27 in 0-30 cm; 8.87, 17.30 and 21.46 in 30-60 cm; and 5.69, 14.00 and 21.40 in 60-90 cm. The means decreased with depth. When soils were examined for each country separately, P-Olsen was relatively evenly distributed in the German, UK and Slovenian soils. P-Olsen was linearly correlated (r = 0.594, P = 0.0002) with pH, and the three P tests (except P-Olsen vs P-DCaL) were significantly correlated with each other (P = 0.0178 to 0.0001). The strongest correlation (r = 0.617, P = 0.0001) was recorded for P-AL vs P-DCaL, and the two methods were inter-convertible using a regression equation: P-AL = -22.593 + 5.353 pH + 1.423 P-DCaL, R² = 0.550.
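As an illustration of the reported inter-conversion, the sketch below applies the regression given in the abstract (P-AL = -22.593 + 5.353 pH + 1.423 P-DCaL, R² = 0.550); the function name is illustrative, and the example inputs simply reuse the reported 0-30 cm means rather than any individual sample from the study.

# Minimal sketch: converting double calcium lactate P (P-DCaL) to an
# ammonium lactate P (P-AL) estimate using the regression reported in the
# abstract. Function and variable names are illustrative, not from the paper.

def estimate_p_al(ph: float, p_dcal: float) -> float:
    """Estimate P-AL (mg/kg) from soil pH and P-DCaL (mg/kg)."""
    return -22.593 + 5.353 * ph + 1.423 * p_dcal

# Example: the reported 0-30 cm mean pH (5.84) and mean P-DCaL (15.32 mg/kg).
print(round(estimate_p_al(ph=5.84, p_dcal=15.32), 2))  # rough P-AL estimate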
Abstract:
This highly topical monograph presents a fascinating analysis of how children in their first year of high school feel about school, its place in their lives and its role in their futures. The theoretical context of the study is the focus in educational studies on children's voice and children's active role in education, together with the focus in the sociology of childhood on children as active constructors of their lives and on childhood as a subject of serious study. The importance of young people's life plans and the alignment between education and ambitions was recognized in the Sloan Foundation study of American teenagers. In many Western societies there is concern that children from less advantaged social backgrounds have limited aspirations and are disproportionately unlikely to go to university. This book is highly relevant to understanding the nature of children's engagement with education, the choices and constraints they experience and the reasons some young people fail to take advantage of educational opportunities. "Continuum Studies in Educational Research" (CSER) is a major new series in the field of educational research. Written by experts and scholars for experts and scholars, this ground-breaking series focuses on research in the areas of comparative education, history, lifelong learning, philosophy, policy, post-compulsory education, psychology and sociology. Based on cutting-edge research and written with lucidity and passion, the CSER series showcases only those books that really matter in education - studies that are major and that will be remembered for having made a difference.
Abstract:
This paper investigates the applications of capture–recapture methods to human populations. Capture–recapture methods are commonly used in estimating the size of wildlife populations but can also be used in epidemiology and the social sciences, for estimating the prevalence of a particular disease or the size of the homeless population in a certain area. Here we focus on estimating the prevalence of infectious diseases. Several estimators of population size are considered: the Lincoln–Petersen estimator and its modified version, the Chapman estimator, Chao's lower bound estimator, Zelterman's estimator, McKendrick's moment estimator and the maximum likelihood estimator. In order to evaluate these estimators, they are applied to real, three-source capture–recapture data. By conditioning on each of the sources of the three-source data, we have been able to compare the estimators with the true value that they are estimating. The Chapman and Chao estimators were compared in terms of their relative bias. A variance formula derived through conditioning is suggested for Chao's estimator, and normal 95% confidence intervals are calculated for this and the Chapman estimator. We then compare the coverage of the respective confidence intervals. Furthermore, a simulation study is included to compare Chao's and Chapman's estimators. Results indicate that Chao's estimator is less biased than Chapman's estimator unless both sources are independent. Chao's estimator also has the smaller mean squared error. Finally, the implications and limitations of the above methods are discussed, with suggestions for further development.
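For a rough sense of how the simpler two-source estimators behave, here is a minimal Python sketch of the Lincoln–Petersen estimator and Chapman's bias-corrected version; the counts are hypothetical and are not taken from the paper's three-source data.

# Minimal sketch of two of the estimators named in the abstract.

def lincoln_petersen(n1: int, n2: int, m: int) -> float:
    """n1, n2: cases found by each source; m: cases found by both."""
    return n1 * n2 / m

def chapman(n1: int, n2: int, m: int) -> float:
    """Bias-corrected variant; remains finite even when m = 0."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical two-source surveillance counts for an infectious disease.
n1, n2, m = 150, 120, 45
print(lincoln_petersen(n1, n2, m))  # 400.0
print(chapman(n1, n2, m))           # about 396.2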
Abstract:
The paper considers meta-analysis of diagnostic studies that use a continuous score for classification of study participants into healthy or diseased groups. Classification is often done on the basis of a threshold or cut-off value, which might vary between studies. Consequently, conventional meta-analysis methodology focusing solely on separate analysis of sensitivity and specificity might be confounded by a potentially unknown variation of the cut-off value. To cope with this phenomenon, it is suggested to use instead an overall estimate of the misclassification error, previously suggested and used as Youden's index; furthermore, it is argued that this index is less prone to between-study variation of cut-off values. A simple Mantel–Haenszel estimator is suggested as a summary measure of the overall misclassification error, which adjusts for a potential study effect. The measure of the misclassification error based on Youden's index is advantageous in that it easily allows an extension to a likelihood approach, which is then able to cope with unobserved heterogeneity via a nonparametric mixture model. All methods are illustrated with an example of a diagnostic meta-analysis of duplex Doppler ultrasound, with angiography as the standard, for stroke prevention.
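As a minimal illustration of the summary measure discussed, the following Python sketch computes Youden's index (sensitivity + specificity - 1) for a single, hypothetical 2x2 diagnostic table; it does not reproduce the Mantel–Haenszel pooling or the mixture-model extension described in the abstract.

def youden_index(tp: int, fn: int, fp: int, tn: int) -> float:
    """Youden's index J = sensitivity + specificity - 1 for one study."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity + specificity - 1

# Hypothetical study: 90/100 diseased and 80/100 healthy classified correctly.
print(youden_index(tp=90, fn=10, fp=20, tn=80))  # 0.7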
Abstract:
It is generally accepted that genetics may be an important factor in explaining the variation between patients' responses to certain drugs. However, identification and confirmation of the responsible genetic variants is proving to be a challenge in many cases. A number of difficulties that may be encountered in pursuit of these variants, such as non-replication of a true effect, population structure and selection bias, can be mitigated or at least reduced by appropriate statistical methodology. Another major statistical challenge facing pharmacogenetic studies is trying to detect possibly small polygenic effects using large volumes of genetic data, while controlling the number of false positive signals. Here we review statistical design and analysis options available for investigations of genetic resistance to anti-epileptic drugs.
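As one standard device for limiting false positive signals when scanning many variants, the sketch below applies the Benjamini-Hochberg false-discovery-rate procedure to hypothetical p-values; it is a generic illustration, not the specific methodology reviewed in the paper.

import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean array: True where the null is rejected at FDR alpha."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    thresholds = np.arange(1, len(p) + 1) * alpha / len(p)  # step-up cut-offs
    passed = p[order] <= thresholds
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    reject = np.zeros(len(p), dtype=bool)
    reject[order[:k]] = True                                 # reject k smallest
    return reject

# Hypothetical p-values from six variant tests.
print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.20, 0.74]))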
Abstract:
Objectives: To conduct a detailed evaluation, with meta-analyses, of the published evidence on milk and dairy consumption and the incidence of vascular diseases and diabetes. Also to summarise the evidence on milk and dairy consumption and cancer reported by the World Cancer Research Fund, and then to consider the relevance of milk and dairy consumption to survival in the UK, a typical Western community. Finally, published evidence on relationships with whole milk and fat-reduced milks was examined. Methods: Prospective cohort studies of vascular disease and diabetes with baseline data on milk or dairy consumption and a relevant disease outcome were identified by searching MEDLINE and reference lists in the relevant published reports. Meta-analyses of relationships in these reports were conducted. The likely effect of milk and dairy consumption on survival was then considered, taking into account the results of published overviews of relationships of these foods with cancer. Results: From meta-analysis of 15 studies, the relative risk of stroke and/or heart disease in subjects with high milk or dairy consumption was 0.84 (95% CI 0.76, 0.93) and 0.79 (0.75, 0.82) respectively, relative to the risk in those with low consumption. Four studies reported incident diabetes as an outcome, and the relative risk in the subjects with the highest intake of milk or dairy foods was 0.92 (0.86, 0.97). Conclusions: Set against the proportion of total deaths attributable to life-threatening diseases in the UK (vascular disease, diabetes and cancer), the results of the meta-analyses provide evidence of an overall survival advantage from the consumption of milk and dairy foods.
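For readers unfamiliar with how pooled relative risks of this kind are formed, the following Python sketch shows a generic fixed-effect, inverse-variance meta-analysis on the log scale; the study estimates are hypothetical and the paper's actual meta-analytic model is not reproduced here.

import numpy as np

def pooled_rr(rrs, ci_lows, ci_highs):
    """Pool study relative risks (with 95% CIs) on the log scale by inverse variance."""
    log_rr = np.log(rrs)
    se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * 1.96)   # SE from CI width
    w = 1.0 / se**2                                          # inverse-variance weights
    est = np.sum(w * log_rr) / np.sum(w)
    est_se = np.sqrt(1.0 / np.sum(w))
    ci = np.exp([est - 1.96 * est_se, est + 1.96 * est_se])
    return np.exp(est), ci

# Three hypothetical cohort studies of high vs low dairy consumption.
rr, ci = pooled_rr([0.80, 0.90, 0.85], [0.70, 0.78, 0.72], [0.91, 1.04, 1.00])
print(round(rr, 2), np.round(ci, 2))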
Abstract:
The soil fauna is often a neglected group in many large-scale studies of farmland biodiversity due to difficulties in extracting organisms efficiently from the soil. This study assesses the relative efficiency of the simple and cheap sampling method of handsorting against Berlese-Tullgren funnel and Winkler apparatus extraction. Soil cores were taken from grassy arable field margins and wheat fields in Cambridgeshire, UK, and the efficiencies of the three methods in assessing the abundances and species densities of soil macroinvertebrates were compared. Handsorting in most cases was as efficient at extracting the majority of the soil macrofauna as the Berlese-Tullgren funnel and Winkler bag methods, although it underestimated the species densities of the woodlice and adult beetles. There were no obvious biases among the three methods for the particular vegetation types sampled and no significant differences in the size distributions of the earthworms and beetles. Proportionally fewer damaged earthworms were recorded in larger (25 x 25 cm) soil cores when compared with smaller ones (15 x 15 cm). Handsorting has many benefits, including targeted extraction, minimum disturbance to the habitat and shorter sampling periods, and may be the most appropriate method for studies of farmland biodiversity when a high number of soil cores need to be sampled. © 2008 Elsevier Masson SAS. All rights reserved.
Abstract:
In a sequential clinical trial, accrual of data on patients often continues after the stopping criterion for the study has been met. This is termed “overrunning.” Overrunning occurs mainly when the primary response from each patient is measured after some extended observation period. The objective of this article is to compare two methods of allowing for overrunning. In particular, simulation studies are reported that assess the two procedures in terms of how well they maintain the intended type I error rate. The effect on power resulting from the incorporation of “overrunning data” using the two procedures is evaluated.
Abstract:
We consider the comparison of two formulations in terms of average bioequivalence using the 2 × 2 cross-over design. In a bioequivalence study, the primary outcome is a pharmacokinetic measure, such as the area under the plasma concentration by time curve, which is usually assumed to have a lognormal distribution. The criterion typically used for claiming bioequivalence is that the 90% confidence interval for the ratio of the means should lie within the interval (0.80, 1.25), or equivalently the 90% confidence interval for the difference in the means on the natural log scale should be within the interval (-0.2231, 0.2231). We compare the gold standard method for calculation of the sample size based on the non-central t distribution with those based on the central t and normal distributions. In practice, the differences between the various approaches are likely to be small. Further approximations to the power function are sometimes used to simplify the calculations. These approximations should be used with caution, because the sample size required for a desirable level of power might be under- or overestimated compared with the gold standard method. However, in some situations the approximate methods produce very similar sample sizes to the gold standard method. Copyright © 2005 John Wiley & Sons, Ltd.
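A minimal Python sketch of the kind of calculation described, i.e. power for the two one-sided tests in a 2 × 2 cross-over evaluated with the non-central t distribution, is given below; the within-subject SD (sigma_w on the log scale), the assumed true ratio and the target power are illustrative assumptions, not values from the paper.

import numpy as np
from scipy import stats

def tost_power(n_total, sigma_w, true_ratio=0.95, alpha=0.05,
               limits=(0.80, 1.25)):
    """Approximate TOST power for a 2x2 cross-over via the non-central t."""
    se = sigma_w * np.sqrt(2.0 / n_total)        # SE of the log-scale difference
    df = n_total - 2
    tcrit = stats.t.ppf(1 - alpha, df)
    theta = np.log(true_ratio)
    ncp_lower = (theta - np.log(limits[0])) / se
    ncp_upper = (theta - np.log(limits[1])) / se
    return (stats.nct.sf(tcrit, df, ncp_lower)
            + stats.nct.cdf(-tcrit, df, ncp_upper) - 1)

# Smallest even total sample size reaching 80% power under these assumptions.
n = 4
while tost_power(n, sigma_w=0.25) < 0.80:
    n += 2
print(n, round(tost_power(n, sigma_w=0.25), 3))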
Abstract:
Background: Molecular tools may help to uncover closely related and still diverging species from a wide variety of taxa and provide insight into the mechanisms, pace and geography of marine speciation. There is some controversy over the phylogeography and speciation modes of species-groups with an Eastern Atlantic-Western Indian Ocean distribution, with previous studies suggesting that older events (Miocene) and/or more recent (Pleistocene) oceanographic processes could have influenced the phylogeny of marine taxa. The spiny lobster genus Palinurus allows for testing among speciation hypotheses, since it has a particular distribution with two groups of three species each in the Northeastern Atlantic (P. elephas, P. mauritanicus and P. charlestoni) and the Southeastern Atlantic and Southwestern Indian Oceans (P. gilchristi, P. delagoae and P. barbarae). In the present study, we obtain a more complete understanding of the phylogenetic relationships among these species through a combined dataset with both nuclear and mitochondrial markers, by testing alternative hypotheses on both the mutation rate and tree topology under the recently developed approximate Bayesian computation (ABC) methods. Results: Our analyses support a North-to-South speciation pattern in Palinurus, with all the South African species forming a monophyletic clade nested within the Northern Hemisphere species. Coalescent-based ABC methods allowed us to reject the previously proposed hypothesis of a Middle Miocene speciation event related to the closure of the Tethyan Seaway. Instead, divergence times obtained for Palinurus species using the combined mtDNA-microsatellite dataset and standard mutation rates for mtDNA agree with known glaciation-related processes occurring during the last 2 my. Conclusion: The Palinurus speciation pattern is a typical example of a series of rapid speciation events occurring within a group, with very short branches separating different species. Our results support the hypothesis that recent climate change-related oceanographic processes have influenced the phylogeny of marine taxa, with most Palinurus species originating during the last two million years. The present study highlights the value of new coalescent-based statistical methods such as ABC for testing different speciation hypotheses using molecular data.
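To illustrate the general ABC principle the abstract refers to, the following toy Python sketch uses ABC rejection sampling to approximate a posterior for a normal mean; it is purely didactic and does not reproduce the coalescent models, summary statistics or priors used in the study.

import numpy as np

rng = np.random.default_rng(0)
observed = rng.normal(loc=2.0, scale=1.0, size=50)     # stand-in "observed" data
obs_summary = observed.mean()                           # summary statistic

accepted = []
for _ in range(20000):
    theta = rng.uniform(-5, 5)                          # draw parameter from prior
    simulated = rng.normal(loc=theta, scale=1.0, size=50)
    if abs(simulated.mean() - obs_summary) < 0.1:       # keep if within tolerance
        accepted.append(theta)

# Accepted draws approximate the posterior for the mean.
print(len(accepted), np.mean(accepted))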