903 results for Tests for Continuous Lifetime Data
                                
                                
Abstract:
BACKGROUND: In vitro aggregating brain cell cultures containing all types of brain cells have been shown to be useful for neurotoxicological investigations. The cultures are used for the detection of nervous system-specific effects of compounds by measuring multiple endpoints, including changes in enzyme activities. Concentration-dependent neurotoxicity is determined at several time points. METHODS: A Markov model was set up to describe the dynamics of brain cell populations exposed to potentially neurotoxic compounds. Brain cells were assumed to be either in a healthy or a stressed state, with only stressed cells being susceptible to cell death. Cells could switch between these states or die, with concentration-dependent transition rates. Since cell numbers were not directly measurable, intracellular lactate dehydrogenase (LDH) activity was used as a surrogate. Assuming that changes in cell numbers are proportional to changes in intracellular LDH activity, stochastic enzyme activity models were derived. Maximum likelihood and least squares regression techniques were applied to estimate the transition rates. Likelihood ratio tests were performed to test hypotheses about the transition rates. Simulation studies were used to investigate the performance of the transition rate estimators and to analyze the error rates of the likelihood ratio tests. The stochastic time-concentration activity model was applied to intracellular LDH activity measurements after 7 and 14 days of continuous exposure to propofol. The model describes transitions from healthy to stressed cells and from stressed cells to death. RESULTS: The model predicted that propofol would affect stressed cells more than healthy cells. Increasing the propofol concentration from 10 to 100 μM reduced the mean waiting time for transition to the stressed state by 50%, from 14 to 7 days, whereas the mean time to cellular death decreased more dramatically, from 2.7 days to 6.5 hours. CONCLUSION: The proposed stochastic modeling approach can be used to discriminate between different biological hypotheses regarding the effect of a compound on the transition rates. The effects of different compounds on the transition rate estimates can be quantitatively compared. Data can be extrapolated to late measurement time points to investigate whether costly and time-consuming long-term experiments could be eliminated.
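Not part of the abstract above: a minimal sketch of the kind of three-state model it describes (healthy -> stressed -> dead, with exponential waiting times whose rates depend on concentration). The rate functions below are hypothetical placeholders, back-calculated only so that their reciprocals roughly match the mean times quoted in RESULTS; the study's actual parameterization and estimates may differ.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear rate functions (per day), chosen so that 1/rate matches the
# mean times quoted in RESULTS under an exponential-waiting-time assumption.
def k_hs(conc_uM):                        # healthy -> stressed
    return 0.0635 + 0.00079 * conc_uM     # mean wait ~14 d at 10 uM, ~7 d at 100 uM

def k_sd(conc_uM):                        # stressed -> dead
    return 0.001 + 0.0369 * conc_uM       # mean wait ~2.7 d at 10 uM, ~6.5 h at 100 uM

def simulate_culture(n_cells, conc_uM, t_end=14.0):
    """Simulate each cell's healthy -> stressed -> dead path up to t_end days."""
    counts = {"healthy": 0, "stressed": 0, "dead": 0}
    for _ in range(n_cells):
        t, state = 0.0, "healthy"
        while t < t_end and state != "dead":
            rate = k_hs(conc_uM) if state == "healthy" else k_sd(conc_uM)
            t += rng.exponential(1.0 / rate)      # exponential waiting time in the state
            if t < t_end:
                state = "stressed" if state == "healthy" else "dead"
        counts[state] += 1
    return counts

for conc in (10.0, 100.0):
    print(f"{conc:5.0f} uM:", simulate_culture(5000, conc),
          f"mean H->S wait {1.0 / k_hs(conc):.1f} d, mean S->death wait {1.0 / k_sd(conc):.2f} d")

With forward-only transitions and exponential waiting times, the mean waiting time in a state is simply the reciprocal of its exit rate, which is how concentration-dependent mean times such as those quoted in RESULTS can be read off fitted rates.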
                                
Abstract:
The European Surveillance of Congenital Anomalies (EUROCAT) network of population-based congenital anomaly registries is an important source of epidemiologic information on congenital anomalies in Europe, covering live births, fetal deaths from 20 weeks gestation, and terminations of pregnancy for fetal anomaly. EUROCAT's policy is to strive for high-quality data while ensuring consistency and transparency across all member registries. A set of 30 data quality indicators (DQIs) was developed to assess five key elements of data quality: completeness of case ascertainment, accuracy of diagnosis, completeness of information on EUROCAT variables, timeliness of data transmission, and availability of population denominator information. This article describes each of the individual DQIs and presents the output for each registry, as well as the EUROCAT (unweighted) average, for 29 full member registries for 2004-2008. This information is also available on the EUROCAT website for previous years. The EUROCAT DQIs allow registries to evaluate their performance in relation to other registries and allow appropriate interpretations to be made of the data collected. The DQIs provide direction for improving data collection and ascertainment, and they allow annual assessment for monitoring continuous improvement. The DQIs are constantly reviewed and refined to best document registry procedures and processes regarding data collection, to ensure the appropriateness of the DQIs, and to ensure transparency so that the data collected can make a substantial and useful contribution to epidemiologic research on congenital anomalies.
                                
Abstract:
We explore in depth the validity of a recently proposed scaling law for earthquake inter-event time distributions in the case of Southern California, using the waveform cross-correlation catalog of Shearer et al. Two statistical tests are used: on the one hand, the standard two-sample Kolmogorov-Smirnov test is in agreement with the scaling of the distributions. On the other hand, the one-sample Kolmogorov-Smirnov statistic, complemented with Monte Carlo simulation of the inter-event times as done by Clauset et al., supports the validity of the gamma distribution as a simple model of the scaling function appearing in the scaling law, for rescaled inter-event times above 0.01, except for the largest data set (magnitude greater than 2). A discussion of these results is provided.
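Purely as illustration (not the authors' code), the two procedures named above can be sketched as follows: a two-sample Kolmogorov-Smirnov test between rescaled inter-event-time samples, and a one-sample Kolmogorov-Smirnov statistic against a fitted gamma model with a Monte Carlo p-value in the spirit of Clauset et al. The synthetic data and gamma parameters are hypothetical.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical rescaled inter-event times for two magnitude thresholds.
sample_a = rng.gamma(shape=0.7, scale=1.0 / 0.7, size=2000)
sample_b = rng.gamma(shape=0.7, scale=1.0 / 0.7, size=1500)

# 1) Two-sample KS test: do the two rescaled distributions coincide?
print(stats.ks_2samp(sample_a, sample_b))

# 2) One-sample KS statistic against a fitted gamma, with a Monte Carlo p-value
#    (the gamma is refitted on each synthetic sample, in the spirit of Clauset et al.).
shape, loc, scale = stats.gamma.fit(sample_a, floc=0)
d_obs = stats.kstest(sample_a, stats.gamma(shape, loc, scale).cdf).statistic

n_mc, exceed = 200, 0
for _ in range(n_mc):
    synth = stats.gamma.rvs(shape, loc, scale, size=sample_a.size, random_state=rng)
    s, l, sc = stats.gamma.fit(synth, floc=0)
    if stats.kstest(synth, stats.gamma(s, l, sc).cdf).statistic >= d_obs:
        exceed += 1
print("Monte Carlo p-value:", exceed / n_mc)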
                                
Abstract:
Analyzing the relationship between the baseline value and subsequent change of a continuous variable is a frequent matter of inquiry in cohort studies. These analyses are surprisingly complex, particularly if only two waves of data are available. It is unclear for non-biostatisticians where the complexity of this analysis lies and which statistical method is adequate. With the help of simulated longitudinal data of body mass index in children, we review statistical methods for the analysis of the association between the baseline value and subsequent change, assuming linear growth with time. Key issues in such analyses are mathematical coupling, measurement error, variability of change between individuals, and regression to the mean. Ideally, it is better to rely on multiple repeated measurements at different times, and a linear random effects model is a standard approach if more than two waves of data are available. If only two waves of data are available, our simulations show that Blomqvist's method, which consists of adjusting the estimated regression coefficient of observed change on baseline value for the measurement error variance, provides accurate estimates. The adequacy of the methods to assess the relationship between the baseline value and subsequent change depends on the number of data waves, the availability of information on measurement error, and the variability of change between individuals.
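For illustration only (not the article's code): a minimal two-wave simulation with one commonly cited form of Blomqvist's correction, assuming the measurement error variance is known and equal at both waves. All variable names and values are hypothetical.

import numpy as np

rng = np.random.default_rng(2)
n, beta_true, err_sd = 5000, -0.3, 1.0

x_true = rng.normal(20.0, 3.0, n)                          # true baseline (e.g. BMI), hypothetical
change_true = beta_true * (x_true - x_true.mean()) + rng.normal(0, 0.5, n)

x_obs = x_true + rng.normal(0, err_sd, n)                  # wave 1 with measurement error
y_obs = x_true + change_true + rng.normal(0, err_sd, n)    # wave 2 with measurement error
d_obs = y_obs - x_obs                                      # observed change

# Naive slope of observed change on observed baseline (biased by measurement error).
b_naive = np.cov(d_obs, x_obs)[0, 1] / np.var(x_obs, ddof=1)

# Blomqvist-type correction with known error variance sigma2_e:
#   beta_hat = (b_naive * Var(X_obs) + sigma2_e) / (Var(X_obs) - sigma2_e)
sigma2_e = err_sd ** 2
var_x = np.var(x_obs, ddof=1)
b_corrected = (b_naive * var_x + sigma2_e) / (var_x - sigma2_e)

print(f"naive: {b_naive:.3f}  corrected: {b_corrected:.3f}  true: {beta_true}")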
                                
Abstract:
Specific IgG and IgM responses to soluble egg antigen (SEA) and keyhole limpet haemocyanin (KLH) were measured by ELISA in patients with acute and chronic schistosomiasis. The tests based upon IgM and IgG antibody responses to KLH presented the best diagnostic discrimination and can be used, in conjunction with clinical and epidemiological data, for the differential diagnosis of acute schistosomiasis.
                                
Abstract:
A boy with a right congenital hemiparesis due to a left pre-natal middle cerebral artery infarct developed focal epilepsy at 33 months and then an insidious and subsequently more rapid, massive cognitive and behavioural regression with a frontal syndrome between the ages of 4 and 5 years, with continuous spike-waves during sleep (CSWS) on the EEG. Both the epilepsy and the CSWS were immediately suppressed by hemispherotomy at the age of 5 years and 4 months. A behavioural-cognitive follow-up prior to hemispherotomy, a per-operative EEG and corticography, and serial post-operative neuropsychological assessments were performed until the age of 11 years. The spread of the epileptic activity to the "healthy" frontal region was the cause of the reversible frontal syndrome. A later gradual long-term but incomplete cognitive recovery, with moderate mental disability, was documented. This outcome is probably explained by another facet of the epilepsy, namely the structural effects of prolonged epileptic discharges in rapidly developing cerebral networks which are, at the same time, undergoing the reorganization imposed by a unilateral early hemispheric lesion. Group studies on the outcome of children before and after hemispherectomy using only single IQ measures, pre- and post-operatively, may miss particular epileptic cognitive dysfunctions, as they are likely to be different from case to case. Such detailed and rarely available complementary clinical and EEG data obtained in a single case at different time periods in relation to the epilepsy, including per-operative electrophysiological findings, may help to understand the different cognitive deficits and recovery profiles and the limits of full cognitive recovery.
                                
Abstract:
The objective of this project is to study the extent of plagiarism in the submissions of the continuous assessment tests (Proves d'Avaluació Continuada) and practical assignments of UOC students, as well as to study the different means of preventing it.
                                
Abstract:
It is well known that dichotomizing continuous data has the effect of decreasing statistical power when the goal is to test for a statistical association between two variables. Modern researchers, however, focus not only on statistical significance but also on an estimation of the "effect size" (i.e., the strength of association between the variables) to judge whether a significant association is also clinically relevant. In this article, we are interested in the consequences of dichotomizing continuous data on the value of an effect size in some classical settings. It turns out that the conclusions will not be the same whether using a correlation or an odds ratio to summarize the strength of association between the variables: whereas the value of a correlation is typically decreased by a factor of pi/2 after each dichotomization, the value of an odds ratio is at the same time raised to the power of 2. From a descriptive statistical point of view, it is thus not clear whether dichotomizing continuous data leads to a decrease or to an increase in the effect size, as illustrated using a data set to investigate the relationship between motor and intellectual functions in children and adolescents.
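As an illustration (using simulated data, not the article's data set): dichotomizing the variables of a bivariate normal sample at their medians and comparing the correlation at each step, together with the odds ratio of the resulting 2x2 table. The chosen correlation and sample size are arbitrary.

import numpy as np

rng = np.random.default_rng(3)
rho, n = 0.3, 200_000
x, y = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n).T

x_bin = (x > np.median(x)).astype(int)      # dichotomize at the median
y_bin = (y > np.median(y)).astype(int)

r_xy  = np.corrcoef(x, y)[0, 1]             # both variables continuous
r_xyb = np.corrcoef(x, y_bin)[0, 1]         # one variable dichotomized
r_bb  = np.corrcoef(x_bin, y_bin)[0, 1]     # both variables dichotomized

# 2x2 table of the dichotomized variables and its odds ratio.
table = np.array([[np.sum((x_bin == i) & (y_bin == j)) for j in (0, 1)] for i in (0, 1)])
odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])

print(f"r continuous: {r_xy:.3f}  one dichotomized: {r_xyb:.3f}  both dichotomized: {r_bb:.3f}")
print(f"odds ratio from the 2x2 table: {odds_ratio:.2f}")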
                                
Abstract:
BACKGROUND: A growing body of literature indicates that adolescents with chronic conditions are as likely, or more likely, to engage in risky behaviours than their healthy peers. The objective of this research was to assess whether adolescents with chronic illness in Catalonia differ from their healthy peers in risk-taking behaviour. METHODS: Data were drawn from the Catalonia Adolescent Health database, a survey including a random school-based sample of 6952 young people aged 14-19 years. The index group (IG) included 665 adolescents (450 females) reporting several chronic conditions. The comparison group (CG) comprised 6287 healthy adolescents (3306 females). Personal, family and school-related variables were analysed to ensure comparability between groups. Sexual behaviour, drug use (tobacco, alcohol, cannabis, cocaine and synthetic drugs) and perception of drug use among peers and in school were compared. Analysis was carried out separately by gender. Chi-square, Fisher's exact and Student's t-tests were used to compare categorical and continuous variables. RESULTS: The prevalence of chronic conditions was 9.6%, with females showing a higher prevalence than males. The IG showed similar or higher rates of sexual intercourse and risky sexual behaviour. For most studied drugs, IG males reported slightly lower rates of use than CG males, while IG females showed higher rates for every drug studied. No differences were found in the perceptions of drug use among peers or in their school. CONCLUSIONS: Similar to previous research, chronically ill adolescents in our sample are as likely, or more likely, to engage in risky behaviours than their healthy counterparts and should receive the same anticipatory guidance.
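For illustration only (not the study's analysis): the tests named in METHODS applied to made-up group data, with a chi-square or Fisher's exact test for a categorical outcome and Student's t-test for a continuous one. All counts and values are hypothetical.

import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Categorical outcome (e.g. reported use of one drug): 2x2 table of [yes, no] counts.
table = np.array([[120, 545],      # index group (IG)
                  [900, 5387]])    # comparison group (CG)
chi2, p_chi2, dof, _ = stats.chi2_contingency(table)
odds_ratio, p_fisher = stats.fisher_exact(table)    # preferred when expected counts are small

# Continuous outcome (e.g. a score), compared between groups with Student's t-test.
ig_score = rng.normal(50, 10, 665)
cg_score = rng.normal(48, 10, 6287)
t_stat, p_t = stats.ttest_ind(ig_score, cg_score)

print(f"chi-square p={p_chi2:.3f}, Fisher p={p_fisher:.3f}, t-test p={p_t:.3f}")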
                                
Abstract:
Synthesis report: This thesis studied in detail the case of a child with congenital hemiplegia due to an extensive prenatal infarct who developed a particular form of epilepsy, the syndrome of continuous spike-waves during sleep (CSWS), associated with massive mental regression. The characteristics of this deterioration pointed to a frontal-type dysfunction. Epilepsy surgery (hemispherotomy) not only cured the epilepsy but also led to a rapid behavioural and cognitive recovery, followed by a slower resumption of development, with a final level of moderate intellectual disability at the age of 11 years. The interest of this study lies in the fact that the child could be followed prospectively between the ages of 4.5 and 11 years with electroencephalographic (EEG) recordings as well as serial neuropsychological tests and behavioural questionnaires, making it possible to compare the pre-, peri- and post-operative periods, which is rarely feasible. A surface EEG recording could even be obtained during the operation on the non-lesioned hemisphere, documenting the cessation of the generalized epileptiform discharges as soon as the intervention was completed. The hypothesis we wished to demonstrate is that the behavioural and cognitive regression shown by the child, after a period of almost normal early development (language delay), was epileptic in nature: we explain it by the propagation of abnormal electrical activity from the lesion in the left hemisphere to the preserved regions, in particular the bilateral frontal regions. The hemispherotomy allowed a rapid recovery by disconnecting the lesioned, epileptogenic left hemisphere from the healthy hemisphere, which was thus able to take over the most important cognitive functions. The slower progress thereafter and the lack of catch-up beyond a level of moderate intellectual disability are more difficult to explain: we postulate an effect of the epilepsy on the development of the neuronal networks of the initially non-lesioned hemisphere, networks that are both at an early stage of their maturation and undergoing reorganization as a result of the prenatal lesion. The literature on cognitive deficits before and after hemispherotomy has mainly been concerned with language and its possible recovery. To our knowledge, our study is the first to document the reversibility of a mental deterioration with the characteristics of a frontal syndrome after hemispherotomy. Epilepsy surgery offered here a unique opportunity to document the role of epileptic activity in cognitive regression since, by abruptly interrupting the propagation of the abnormal electrical activity, it allowed the dynamics of development before and after the intervention to be compared. Relating the multiple clinical and EEG examinations performed in a single child over several years yielded important information for understanding the cognitive and behavioural disorders associated with refractory focal epilepsies. ABSTRACT: A boy with a right congenital hemiparesis due to a left pre-natal middle cerebral artery infarct developed focal epilepsy at 33 months and then an insidious and subsequently more rapid, massive cognitive and behavioural regression with a frontal syndrome between the ages of 4 and 5 years, with continuous spike-waves during sleep (CSWS) on the EEG. Both the epilepsy and the CSWS were immediately suppressed by hemispherotomy at the age of 5 years and 4 months. A behavioural-cognitive follow-up prior to hemispherotomy, a per-operative EEG and corticography, and serial post-operative neuropsychological assessments were performed until the age of 11 years. The spread of the epileptic activity to the "healthy" frontal region was the cause of the reversible frontal syndrome. A later gradual long-term but incomplete cognitive recovery, with moderate mental disability, was documented. This outcome is probably explained by another facet of the epilepsy, namely the structural effects of prolonged epileptic discharges in rapidly developing cerebral networks which are, at the same time, undergoing the reorganization imposed by a unilateral early hemispheric lesion. Group studies on the outcome of children before and after hemispherectomy using only single IQ measures, pre- and post-operatively, may miss particular epileptic cognitive dysfunctions, as they are likely to be different from case to case. Such detailed and rarely available complementary clinical and EEG data obtained in a single case at different time periods in relation to the epilepsy, including per-operative electrophysiological findings, may help to understand the different cognitive deficits and recovery profiles and the limits of full cognitive recovery.
                                
Abstract:
Summary: Forests are key ecosystems of the earth and are associated with a large range of functions. Many of these functions are beneficial to humans and are referred to as ecosystem services. Sustainable development requires that all relevant ecosystem services are quantified, managed and monitored equally. Natural resource management therefore targets the services associated with ecosystems. The main hypothesis of this thesis is that the spatial and temporal domains of relevant services do not correspond to a discrete forest ecosystem. As a consequence, the services are not quantified, managed and monitored in an equal and sustainable manner. The aims of the thesis were therefore to test this hypothesis, establish an improved conceptual approach and provide spatial applications for the relevant land cover and structure variables. The study was carried out in western Switzerland and was based primarily on data from a countrywide landscape inventory. This inventory is part of the third Swiss national forest inventory and assesses continuous landscape variables based on a regular sampling of true colour aerial imagery. In addition, land cover variables were derived from Landsat 5 TM passive sensor data, and land structure variables from active sensor data from a small-footprint laser scanning system. The results confirmed the main hypothesis, as relevant services did not scale well with the forest ecosystem. Instead, a new conceptual approach for sustainable management of natural resources was described. This concept quantifies the services as a continuous function of the landscape, rather than a discrete function of the forest ecosystem. The explanatory landscape variables are therefore called continuous fields, and the forest becomes a dependent and function-driven management unit. Continuous field mapping methods were established for land cover and structure variables. In conclusion, the discrete forest ecosystem is an adequate planning and management unit. However, monitoring the state of and trends in the sustainability of services requires them to be quantified as a continuous function of the landscape. Sustainable natural resource management iteratively combines the ecosystem and gradient approaches. Résumé: Forests are key ecosystems of the earth and are credited with a large number of functions. Many of these functions are beneficial to humans and are called ecosystem services. Sustainable development requires that these ecosystem services all be quantified, managed and monitored equally. Natural resource management therefore targets the services attributed to ecosystems. The main hypothesis of this thesis is that the spatial and temporal domains of the services attributed to the forest do not correspond to a discrete ecosystem. Consequently, the services are not quantified, managed and monitored in an equivalent and sustainable manner. The aims of the thesis were to test this hypothesis, to establish a new conceptual approach to natural resource management and to prepare spatial applications for the appropriate landscape and structure variables. The study was carried out in western Switzerland, mainly on the basis of a landscape inventory at the national level. This inventory is part of the third Swiss national forest inventory and measures landscape variables continuously, based on a regular sampling of colour aerial photographs. In addition, land cover variables were derived from data of a Landsat 5 TM passive sensor, along with structural variables derived from laser scanning, an active sensor. The results confirm the main hypothesis, since the scale of the services does not correspond to that of the forest ecosystem. Instead, a new approach was developed for the sustainable management of natural resources. This concept represents the services as a continuous function of the landscape, rather than a discrete function of the forest ecosystem. Consequently, the explanatory landscape variables are called continuous fields and the forest becomes a dependent unit, defined by the main function of the landscape. Corresponding methods for land cover and structure were developed. In conclusion, the discrete forest ecosystem is an adequate unit for planning and management. In contrast, monitoring the sustainability of the state and its trends requires that the services be quantified as a continuous function of the landscape. Sustainable natural resource management therefore iteratively combines the ecosystem approach with that of the gradient.
                                
Abstract:
The geometry and connectivity of fractures exert a strong influence on the flow and transport properties of fracture networks. We present a novel approach to stochastically generate three-dimensional discrete networks of connected fractures that are conditioned to hydrological and geophysical data. A hierarchical rejection sampling algorithm is used to draw realizations from the posterior probability density function at different conditioning levels. The method is applied to a well-studied granitic formation using data acquired within two boreholes located 6 m apart. The prior models include 27 fractures with their geometry (position and orientation) bounded by information derived from single-hole ground-penetrating radar (GPR) data acquired during saline tracer tests and optical televiewer logs. Eleven cross-hole hydraulic connections between fractures in neighboring boreholes and the order in which the tracer arrives at different fractures are used for conditioning. Furthermore, the networks are conditioned to the observed relative hydraulic importance of the different hydraulic connections by numerically simulating the flow response. Among the conditioning data considered, constraints on the relative flow contributions were the most effective in determining the variability among the network realizations. Nevertheless, we find that the posterior model space is strongly determined by the imposed prior bounds. Strong prior bounds were derived from GPR measurements and helped to make the approach computationally feasible. We analyze a set of 230 posterior realizations that reproduce all data given their uncertainties assuming the same uniform transmissivity in all fractures. The posterior models provide valuable statistics on length scales and density of connected fractures, as well as their connectivity. In an additional analysis, effective transmissivity estimates of the posterior realizations indicate a strong influence of the DFN structure, in that it induces large variations of equivalent transmissivities between realizations. The transmissivity estimates agree well with previous estimates at the site based on pumping, flowmeter and temperature data.
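As a schematic illustration (not the study's implementation), hierarchical rejection sampling of the kind described can be sketched as drawing candidate networks from the prior and accepting only those that pass successive conditioning levels; every helper function, bound and threshold below is a hypothetical placeholder.

import numpy as np

rng = np.random.default_rng(5)

def draw_prior_network():
    """Hypothetical prior: 27 fractures with depths and dips inside GPR/televiewer-style bounds."""
    return {"z": rng.uniform(0.0, 50.0, 27),       # fracture depths along the boreholes (m)
            "dip": rng.uniform(30.0, 80.0, 27)}    # dip angles (degrees)

def honours_connections(net):
    """Conditioning level 1 (hypothetical check): required cross-hole connections exist."""
    return np.diff(np.sort(net["z"])).min() < 1.0

def honours_arrival_order(net):
    """Conditioning level 2 (hypothetical check): simulated tracer arrival order matches data."""
    return int(net["z"].argmin()) < 14

def honours_flow_contributions(net, tol=0.2):
    """Conditioning level 3 (hypothetical check): relative flow contributions within tolerance."""
    return abs(np.sin(np.radians(net["dip"])).mean() - 0.8) < tol

posterior = []
while len(posterior) < 230:                        # target number of posterior realizations
    candidate = draw_prior_network()
    if (honours_connections(candidate)             # cheap checks first, costly (flow) check last
            and honours_arrival_order(candidate)
            and honours_flow_contributions(candidate)):
        posterior.append(candidate)

print("accepted realizations:", len(posterior))

Ordering the conditioning levels from cheapest to most expensive, and keeping the prior bounds tight, is what makes such a rejection scheme computationally feasible, consistent with the abstract's remark that strong GPR-derived prior bounds were needed.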
                                
Abstract:
Self-organizing maps (Kohonen 1997) are a type of artificial neural network developed to explore patterns in high-dimensional multivariate data. The conventional version of the algorithm involves the use of the Euclidean metric in the process of adaptation of the model vectors, thus rendering, in theory, the whole methodology incompatible with non-Euclidean geometries. In this contribution we explore the two main aspects of the problem: (1) whether the conventional approach using the Euclidean metric can yield valid results with compositional data, and (2) whether a modification of the conventional approach, replacing vectorial sum and scalar multiplication by the canonical operators in the simplex (i.e. perturbation and powering), can converge to an adequate solution. Preliminary tests showed that both methodologies can be used on compositional data. However, the modified version of the algorithm performs worse than the conventional version, in particular when the data are pathological. Moreover, the conventional approach converges faster to a solution when the data are "well-behaved". Key words: Self-Organizing Map; Artificial Neural Networks; Compositional Data
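To make the two canonical simplex operators concrete (this is not the contribution's code): perturbation and powering, and a bare-bones SOM-style update that uses them in place of the usual Euclidean update of a model vector. Function names are hypothetical.

import numpy as np

def closure(x):
    """Rescale a positive vector so its parts sum to 1 (a composition)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def perturb(x, y):
    """Perturbation: the simplex analogue of vector addition."""
    return closure(np.asarray(x, dtype=float) * np.asarray(y, dtype=float))

def power(x, alpha):
    """Powering: the simplex analogue of scalar multiplication."""
    return closure(np.asarray(x, dtype=float) ** alpha)

def simplex_som_update(model, sample, lr):
    """Move a model composition towards a data composition by a fraction lr,
    using perturbation/powering instead of the Euclidean update m <- m + lr * (x - m)."""
    difference = perturb(sample, power(model, -1.0))   # x (-) m in simplex terms
    return perturb(model, power(difference, lr))

m = closure([0.2, 0.3, 0.5])
x = closure([0.6, 0.2, 0.2])
print(simplex_som_update(m, x, lr=0.5))   # model composition nudged towards the sample

With lr = 1 the update returns the sample itself, mirroring the behaviour of the standard Euclidean rule, which is the property that makes the substitution a like-for-like replacement.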
                                
Abstract:
To evaluate whether an activity monitor based on body acceleration measurement can accurately assess the energy cost of human locomotion, 12 subjects walked a combination of three different speeds (preferred speed +/- 1 km/h) and seven slopes (-15 to +15% in steps of 5%) on a treadmill. Body accelerations were recorded using a triaxial accelerometer attached to the lower back. The mean of the integral of the vector magnitude (norm) of the accelerations (mIAN) was calculated. VO2 was measured using continuous indirect calorimetry. When the results were analysed separately for each incline, mIAN was correlated with VO2 (average r = 0.87, p<0.001, n = 36). VO2 was not significantly correlated with mIAN when the data were analysed globally (n = 252). Large relative errors occurred when predicted VO2 (estimated from level-walking data) was compared with measured VO2 for different inclines (-53% at +15% incline, to +55% at -15% incline). It is concluded that without an external measurement of the slope, the standard method of analysis of body accelerations cannot accurately predict the energy cost of uphill or downhill walking.
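As an illustration only (not the study's processing pipeline): one way to compute a mean integrated acceleration norm of the kind described, from a synthetic triaxial recording. The sampling rate, epoch length and signal are hypothetical.

import numpy as np

rng = np.random.default_rng(6)
fs = 50                                    # sampling rate (Hz), hypothetical
duration_s = 120                           # a 2-minute walking bout
t = np.arange(duration_s * fs) / fs

# Synthetic body accelerations (m/s^2) for the three axes, gravity removed.
acc = np.stack([0.8 * np.sin(2 * np.pi * 2.0 * t),
                0.5 * np.sin(2 * np.pi * 2.0 * t + 1.0),
                1.2 * np.sin(2 * np.pi * 1.0 * t)]) + rng.normal(0, 0.1, (3, t.size))

norm = np.linalg.norm(acc, axis=0)         # vector magnitude of the three axes

# Integrate the norm over fixed epochs, then average across epochs.
epoch_s = 10
samples_per_epoch = epoch_s * fs
epochs = norm[: (norm.size // samples_per_epoch) * samples_per_epoch]
epochs = epochs.reshape(-1, samples_per_epoch)
integrals = np.trapz(epochs, dx=1.0 / fs, axis=1)   # integral of the norm per epoch
mian = integrals.mean()

print(f"mean integrated acceleration norm: {mian:.2f} (m/s^2 * s per {epoch_s}-s epoch)")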
 
                    