140 results for dominance data
Abstract:
CONTEXT: Several genetic risk scores to identify asymptomatic subjects at high risk of developing type 2 diabetes mellitus (T2DM) have been proposed, but it is unclear whether they add extra information to risk scores based on clinical and biological data. OBJECTIVE: The objective of the study was to assess the extra clinical value of genetic risk scores in predicting the occurrence of T2DM. DESIGN: This was a prospective study, with a mean follow-up time of 5 yr. SETTING AND SUBJECTS: The study included 2824 nondiabetic participants (1548 women, 52 ± 10 yr). MAIN OUTCOME MEASURE: Six genetic risk scores for T2DM were tested. Four were derived from the literature and two were created combining all (n = 24) or shared (n = 9) single-nucleotide polymorphisms of the previous scores. A previously validated clinic + biological risk score for T2DM was used as reference. RESULTS: Two hundred seven participants (7.3%) developed T2DM during follow-up. On bivariate analysis, no differences were found for all but one genetic score between nondiabetic and diabetic participants. After adjusting for the validated clinic + biological risk score, none of the genetic scores improved discrimination, as assessed by changes in the area under the receiver-operating characteristic curve (range -0.4 to -0.1%), sensitivity (-2.9 to -1.0%), specificity (0.0-0.1%), and positive (-6.6 to +0.7%) and negative (-0.2 to 0.0%) predictive values. Similarly, no improvement in T2DM risk prediction was found: net reclassification index ranging from -5.3 to -1.6% and nonsignificant (P ≥ 0.49) integrated discrimination improvement. CONCLUSIONS: In this study, adding genetic information to a previously validated clinic + biological score does not seem to improve the prediction of T2DM.
Abstract:
This guide presents Data Envelopment Analysis (DEA), a method for evaluating performance. It is intended for managers of public organizations who are not familiar with mathematical optimization, in other words operations research. The use of mathematics is therefore kept to a minimum. The guide is strongly practice-oriented: it enables decision-makers to carry out their own efficiency analyses and to interpret the results easily. DEA is an analytical and decision-support tool in the following areas: by computing an efficiency score, it indicates whether an organization has room for improvement; by setting target values, it indicates by how much inputs must be reduced and outputs increased for an organization to become efficient; by identifying the type of returns to scale, it indicates whether an organization should increase or instead reduce its size to minimize its average production cost; by identifying reference peers, it designates which organizations have the best practices worth analyzing.
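As a toy illustration (not taken from the guide itself), the efficiency score mentioned above can be sketched as a small linear program: the input-oriented CCR model under constant returns to scale, solved here with `scipy.optimize.linprog` on made-up data for three units.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: one input and one output per unit (columns = units A, B, C).
X = np.array([[2.0, 4.0, 3.0]])   # inputs consumed
Y = np.array([[2.0, 2.0, 3.0]])   # outputs produced
n = X.shape[1]

def ccr_efficiency(k):
    """Input-oriented CCR efficiency score of unit k (constant returns to scale)."""
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.zeros(1 + n)
    c[0] = 1.0
    # Inputs:  sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.hstack([-X[:, [k]], X])
    # Outputs: -sum_j lambda_j * y_rj <= -y_rk  (i.e. reach at least unit k's output)
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(X.shape[0]), -Y[:, k]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.fun

scores = [ccr_efficiency(k) for k in range(n)]
print([round(s, 2) for s in scores])
```

A score of 1 means the unit lies on the efficient frontier; a score below 1 (here unit B, which produces the same output as A from twice the input) is the factor by which its inputs could be shrunk while keeping its output, matching the "target values" interpretation in the guide.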
Abstract:
Natural selection is typically exerted at some specific life stages. If natural selection takes place before a trait can be measured, using conventional models can cause wrong inference about population parameters. When the missing data process relates to the trait of interest, a valid inference requires explicit modeling of the missing process. We propose a joint modeling approach, a shared parameter model, to account for nonrandom missing data. It consists of an animal model for the phenotypic data and a logistic model for the missing process, linked by the additive genetic effects. A Bayesian approach is taken and inference is made using integrated nested Laplace approximations. From a simulation study we find that wrongly assuming that missing data are missing at random can result in severely biased estimates of additive genetic variance. Using real data from a wild population of Swiss barn owls Tyto alba, our model indicates that the missing individuals would display large black spots; and we conclude that genes affecting this trait are already under selection before it is expressed. Our model is a tool to correctly estimate the magnitude of both natural selection and additive genetic variance.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale for the purpose of improving predictions of groundwater flow and solute transport. However, extending corresponding approaches to the regional scale still represents one of the major challenges in the domain of hydrogeophysics. To address this problem, we have developed a regional-scale data integration methodology based on a two-step Bayesian sequential simulation approach. Our objective is to generate high-resolution stochastic realizations of the regional-scale hydraulic conductivity field in the common case where there exist spatially exhaustive but poorly resolved measurements of a related geophysical parameter, as well as highly resolved but spatially sparse collocated measurements of this geophysical parameter and the hydraulic conductivity. To integrate this multi-scale, multi-parameter database, we first link the low- and high-resolution geophysical data via a stochastic downscaling procedure. This is followed by relating the downscaled geophysical data to the high-resolution hydraulic conductivity distribution. After outlining the general methodology of the approach, we demonstrate its application to a realistic synthetic example where we consider as data high-resolution measurements of the hydraulic and electrical conductivities at a small number of borehole locations, as well as spatially exhaustive, low-resolution estimates of the electrical conductivity obtained from surface-based electrical resistivity tomography. The different stochastic realizations of the hydraulic conductivity field obtained using our procedure are validated by comparing their solute transport behaviour with that of the underlying "true" hydraulic conductivity field.
We find that, even in the presence of strong subsurface heterogeneity, our proposed procedure allows for the generation of faithful representations of the regional-scale hydraulic conductivity structure and reliable predictions of solute transport over long, regional-scale distances.
Abstract:
Distribution of socio-economic features in urban space is an important source of information for land and transportation planning. The metropolization phenomenon has changed the distribution of types of professions in space and has given birth to different spatial patterns that the urban planner must know in order to plan a sustainable city. Such distributions can be discovered by statistical and learning algorithms through different methods. In this paper, an unsupervised classification method and a cluster detection method are discussed and applied to analyze the socio-economic structure of Switzerland. The unsupervised classification method, based on Ward's classification and self-organized maps, is used to classify the municipalities of the country and makes it possible to reduce highly dimensional input information when interpreting the socio-economic landscape. The cluster detection method, the spatial scan statistic, is used in a more specific manner in order to detect hot spots of certain types of service activities. The method is applied to distribution services in the agglomeration of Lausanne. Results show the emergence of new centralities and can be analyzed in both transportation and social terms.
Abstract:
Animal toxins are of interest to a wide range of scientists, due to their numerous applications in pharmacology, neurology, hematology, medicine, and drug research. This, and to a lesser extent the development of new, high-performing tools in transcriptomics and proteomics, has led to an increase in toxin discovery. In this context, providing publicly available data on animal toxins has become essential. The UniProtKB/Swiss-Prot Tox-Prot program (http://www.uniprot.org/program/Toxins) plays a crucial role by providing such an access to venom protein sequences and functions from all venomous species. This program has up to now curated more than 5000 venom proteins to the high-quality standards of UniProtKB/Swiss-Prot (release 2012_02). Proteins targeted by these toxins are also available in the knowledgebase. This paper describes in detail the type of information provided by UniProtKB/Swiss-Prot for toxins, as well as the structured format of the knowledgebase.
Abstract:
The advent and application of high-resolution array-based comparative genome hybridization (array CGH) has led to the detection of large numbers of copy number variants (CNVs) in patients with developmental delay and/or multiple congenital anomalies as well as in healthy individuals. The notion that CNVs are also abundantly present in the normal population challenges the interpretation of the clinical significance of detected CNVs in patients. In this review we will illustrate a general clinical workflow based on our own experience that can be used in routine diagnostics for the interpretation of CNVs.
Abstract:
Background: Many studies have found considerable variations in the resource intensity of physical therapy episodes. Although they have identified several patient- and provider-related factors, few studies have examined their relative explanatory power. We sought to quantify the contribution of patients and providers to these differences and examine how effective Swiss regulations are (nine-session ceiling per prescription and bonus for first treatments). Methods: Our sample consisted of 87,866 first physical therapy episodes performed by 3,365 physiotherapists based on referrals by 6,131 physicians. We modeled the number of visits per episode using a multilevel log linear regression with crossed random effects for physiotherapists and physicians and with fixed effects for cantons. The three-level explanatory variables were patient, physiotherapist and physician characteristics. Results: The median number of sessions was nine (interquartile range 6-13). Physical therapy use increased with age, female sex, higher health care costs, lower deductibles, surgery and specific conditions. Use rose with the share of nine-session episodes among physiotherapists or physicians, but fell with the share of new treatments. Geographical area had no influence. Most of the variance was explained at the patient level, but the available factors explained only 4% thereof. Physiotherapists and physicians explained only 6% and 5% respectively of the variance, although the available factors explained most of this variance. Regulations were the most powerful factors. Conclusion: Against the backdrop of abundant physical therapy supply, Swiss financial regulations did not restrict utilization. Given that patient-related factors explained most of the variance, this group should be subject to closer scrutiny. Moreover, further research is needed on the determinants of patient demand.
Abstract:
A high-resolution micropalaeontological study, combined with geochemical and sedimentological analyses, was performed on the Tiefengraben, Schlossgraben and Eiberg sections (Austrian Alps) in order to characterize sea-surface carbonate production during the end-Triassic crisis. At the end-Rhaetian, the dominant calcareous nannofossil Prinsiosphaera triassica shows a decrease in abundance and size, and this is correlated with an increase in delta O-18 and a gradual decline in delta C-13(carb) values. Simultaneously, benthic foraminiferal assemblages show a decrease in diversity and abundance of calcareous taxa and a dominance of infaunal agglutinated taxa. The smaller size of calcareous nannofossils disturbed the vertical export balance of the biological carbon pump towards the sea-bottom, resulting in changes in feeding strategies within the benthic foraminiferal assemblages from deposit feeders to detritus feeders and bacterial scavengers. These micropalaeontological data combined with geochemical proxies suggest that changes in seawater chemistry and/or cooling episodes might have occurred in the latest Triassic, leading to a marked decrease of carbonate production. This in turn culminated in the quasi-absence of calcareous nannofossils and benthic foraminifers in the latest Triassic. The aftermath (latest Triassic to earliest Jurassic) was characterised by abundance peaks of "disaster" epifaunal agglutinated foraminifera Trochammina on the sea-floor. Central Atlantic Magmatic Province (CAMP) paroxysmal activity, superimposed on a major worldwide regressive phase, is assumed to be responsible for a deterioration in marine palaeoenvironments. CAMP sulfuric emissions might have been the trigger for cooling episodes and seawater acidification leading to disturbance of the surface carbonate production at the very end-Triassic.
Abstract:
Pantomimes of object use require accurate representations of movements and a selection of the most task-relevant gestures. Prominent models of praxis, corroborated by functional neuroimaging studies, predict a critical role for left parietal cortices in pantomime and advance that these areas store representations of tool use. In contrast, lesion data points to the involvement of left inferior frontal areas, suggesting that defective selection of movement features is the cause of pantomime errors. We conducted a large-scale voxel-based lesion-symptom mapping analysis with configural/spatial (CS) and body-part-as-object (BPO) pantomime errors of 150 left and right brain-damaged patients. Our results confirm the left hemisphere dominance in pantomime. Both types of error were associated with damage to left inferior frontal regions in tumor and stroke patients. While CS pantomime errors were associated with left temporoparietal lesions in both stroke and tumor patients, these errors appeared less associated with parietal areas in stroke than in tumor patients and less associated with temporal areas in tumor than stroke patients. BPO errors were associated with left inferior frontal lesions in both tumor and stroke patients. Collectively, our results reveal a left intrahemispheric dissociation for various aspects of pantomime, but with an unspecific role for inferior frontal regions.
Abstract:
On 9 October 1963 a catastrophic landslide suddenly occurred on the southern slope of the Vaiont dam reservoir. A mass of approximately 270 million m3 collapsed into the reservoir, generating a wave that overtopped the dam and hit the town of Longarone and other villages nearby. Several investigations and interpretations of the slope collapse have been carried out during the last 45 years; however, a comprehensive explanation of both the triggering and the dynamics of the phenomenon has yet to be provided. In order to re-evaluate the currently existing information on the slide, an electronic bibliographic database and an ESRI geodatabase have been developed. The chronology of the collected documentation showed that most of the studies re-evaluating the failure mechanisms were conducted in the last decade, as a consequence of knowledge, methods and techniques recently acquired. The current contents of the geodatabase will improve definition of the structural setting that influenced the slide and led to the propagation of the displaced rock mass. The objectives, structure and contents of the e-bibliography and geodatabase are indicated, together with a brief description of the possible use of the alphanumeric and spatial contents of the databases.
Abstract:
High precision U-Pb zircon and Ar-40/Ar-39 mica geochronological data on metagranodiorites, metagranites and mica schists from north and central Evia island (Greece) are presented in this study. U-Pb zircon ages range from 308 to 1912 Ma, and indicate a prolonged magmatic activity in the Late Carboniferous. Proterozoic ages represent inherited cores within younger crystals. Muscovite Ar-40/Ar-39 plateau ages of 288 to 297 Ma are interpreted as cooling ages of the magmatic bodies and metamorphic host rocks in upper greenschist to epidote-amphibolite metamorphic conditions. The multistage magmatism had a duration between 308 and 319 Ma, but some older intrusions, as well as metamorphic events, cannot be excluded. Geochemical analyses and zircon typology indicate calc-alkaline affinities for the granites of central Evia and alkaline to calc-alkaline characteristics for the metagranodiorites from the northern part of the island. The new data point towards the SE continuation, in Evia and the Cyclades, of a Variscan continental crust already recognised in northern Greece (Pelagonian basement). The Late Carboniferous magmatism is viewed as a result of northward subduction of the Paleotethys under the Eurasian margin.
Abstract:
Validated in vitro methods for skin corrosion and irritation were adopted by the OECD and by the European Union during the last decade. In the EU, Switzerland and countries adopting the EU legislation, these assays may allow the full replacement of animal testing for identifying and classifying compounds as skin corrosives, skin irritants, and non-irritants. In order to develop harmonised recommendations on the use of in vitro data for regulatory assessment purposes within the European framework, a workshop was organized by the Swiss Federal Office of Public Health together with ECVAM and the BfR. It comprised stakeholders from various European countries involved in the process from in vitro testing to the regulatory assessment of in vitro data. Discussions addressed the following questions: (1) the information requirements considered useful for regulatory assessment; (2) the applicability of in vitro skin corrosion data to assign the corrosive subcategories as implemented by the EU Classification, Labelling and Packaging Regulation; (3) the applicability of testing strategies for determining skin corrosion and irritation hazards; and (4) the applicability of the adopted in vitro assays to test mixtures, preparations and dilutions. Overall, a number of agreements and recommendations were achieved in order to clarify and facilitate the assessment and use of in vitro data from regulatory accepted methods, and ultimately help regulators and scientists faced with the new in vitro approaches to evaluate skin irritation and corrosion hazards and risks without animal data.
Abstract:
It is well known that dichotomizing continuous data has the effect of decreasing statistical power when the goal is to test for a statistical association between two variables. Modern researchers, however, are focusing not only on statistical significance but also on an estimation of the "effect size" (i.e., the strength of association between the variables) to judge whether a significant association is also clinically relevant. In this article, we are interested in the consequences of dichotomizing continuous data on the value of an effect size in some classical settings. It turns out that the conclusions will not be the same whether using a correlation or an odds ratio to summarize the strength of association between the variables: whereas the value of a correlation is typically decreased by a factor of pi/2 after each dichotomization, the value of an odds ratio is at the same time raised to the power 2. From a descriptive statistical point of view, it is thus not clear whether dichotomizing continuous data leads to a decrease or to an increase in the effect size, as illustrated using a data set investigating the relationship between motor and intellectual functions in children and adolescents.
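The opposite behaviour of the two effect-size measures is easy to reproduce in a quick simulation (a sketch with made-up numbers, not the data set used in the article): draw a bivariate normal sample with a known correlation, median-split both variables, and compare the correlation of the dichotomized data with the odds ratio of the resulting 2x2 table.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.6, 200_000

# Bivariate normal sample with correlation rho.
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

r_cont = np.corrcoef(x, y)[0, 1]      # correlation on the raw (continuous) data

# Dichotomize both variables at their medians.
a, b = x > np.median(x), y > np.median(y)
phi = np.corrcoef(a, b)[0, 1]         # correlation (phi coefficient) after dichotomization

# Odds ratio from the same 2x2 table.
n11 = np.sum(a & b)
n00 = np.sum(~a & ~b)
n10 = np.sum(a & ~b)
n01 = np.sum(~a & b)
odds_ratio = (n11 * n00) / (n10 * n01)

print(round(r_cont, 2), round(phi, 2), round(odds_ratio, 2))
```

With these settings the correlation shrinks markedly after the double median split while the odds ratio of the dichotomized table stays large, illustrating why the two summaries can tell different stories about the same association.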