30 results for concept analysis


Relevance: 30.00%
Publisher:
Abstract:

Genome-wide association studies (GWAS) are designed to identify single-nucleotide polymorphisms (SNPs) associated with a complex trait. Strategies based on the gene-list enrichment concept are currently applied for the functional analysis of GWAS: a significant overrepresentation of candidate genes associated with a biological pathway is used as a proxy to infer overrepresentation of candidate SNPs in that pathway. Here we show that this inference is not always valid and introduce the program SNP2GO, which implements a new method to properly test for the overrepresentation of candidate SNPs in biological pathways.
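The gene-list enrichment logic that this abstract critiques can be made concrete. Below is a minimal sketch of a standard overrepresentation test (a one-sided hypergeometric test, the usual machinery behind gene-set enrichment); it is illustrative only, with made-up numbers, and is not the SNP-level method implemented by SNP2GO.

```python
from math import comb

def hypergeom_sf(x, N, K, n):
    """Enrichment p-value P(X >= x) when drawing n genes from a
    universe of N genes, of which K belong to the pathway."""
    total = comb(N, n)
    # math.comb returns 0 when the second argument exceeds the first,
    # so impossible terms vanish automatically.
    return sum(comb(K, k) * comb(N - K, n - k)
               for k in range(x, min(K, n) + 1)) / total

# Hypothetical example: universe of 400 genes, pathway of 40 genes,
# 50 candidate genes of which 12 fall in the pathway (expected ~5).
p = hypergeom_sf(12, 400, 40, 50)
```

A small p here says the candidate *genes* are enriched; the abstract's point is that this does not automatically mean the candidate *SNPs* are.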

Relevance: 30.00%
Publisher:
Abstract:

This study proposes a new concept for upscaling local information on failure surfaces derived from geophysical data, in order to extend the spatial information and quickly estimate the magnitude and intensity of a landslide. A new vision of seismic interpretation on landslides is also demonstrated by taking into account basic geomorphic information with a numerical method based on the Sloping Local Base Level (SLBL). The SLBL is a generalization of the base level defined in geomorphology applied to landslides, and allows the calculation of the potential geometry of the landslide failure surface. This approach was applied to a large-scale landslide formed mainly in gypsum and situated in a former glacial valley along the Rhône in the western European Alps. Previous studies identified two sliding surfaces that may continue below the level of the valley. In this study, seismic refraction-reflection surveys were carried out to verify the existence of these failure surfaces. The analysis of the seismic data provides a four-layer model in which three velocity layers (<1000 m/s, 1500 m/s and 3000 m/s) are interpreted as the mobilized mass at different levels of weathering and compaction. The highest-velocity layer (>4000 m/s), with a maximum depth of ~58 m, is interpreted as the stable anhydrite bedrock. Two failure surfaces were interpreted from the seismic surveys: an upper one and a much deeper one (respectively 25 and 50 m deep). The upper failure surface depth deduced from geophysics differs slightly from the results obtained using the SLBL, and the deeper failure surface depth calculated with the SLBL method is underestimated in comparison with the geophysical interpretations. Optimal results were therefore obtained by including the seismic data in the SLBL calculations according to the geomorphic limits of the landslide (maximal volume of mobilized mass = 7.5 × 10^6 m^3).
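The SLBL principle can be illustrated in one dimension. The sketch below assumes the simple iterative rule often used to describe SLBL, where each interior node of the topographic profile is lowered to the mean of its neighbours minus a tolerance and never raised; the actual implementation used in the study may differ.

```python
def slbl_1d(z, tol, max_iter=10000):
    """Iterate a 1-D SLBL: lower each interior node toward the mean of
    its two neighbours minus a tolerance, until nothing changes.
    The converged surface approximates a potential failure surface;
    z[i] - s[i] is then the local thickness of the mobilized mass."""
    s = list(z)
    for _ in range(max_iter):
        changed = False
        for i in range(1, len(s) - 1):
            target = 0.5 * (s[i - 1] + s[i + 1]) - tol
            if target < s[i]:
                s[i] = target
                changed = True
        if not changed:
            break
    return s

# Toy flat profile with a tolerance of 1 m per node spacing:
surface = slbl_1d([0.0, 0.0, 0.0, 0.0, 0.0], 1.0)
```

The endpoints act as fixed geomorphic limits; the tolerance controls the curvature, and hence the depth, of the computed failure surface.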

Relevance: 30.00%
Publisher:
Abstract:

The focus of my PhD research was the concept of modularity. In the last 15 years, modularity has become a classic term in different fields of biology. On the conceptual level, a module is a set of interacting elements that remain mostly independent from the elements outside of the module. I used modular analysis techniques to study gene expression evolution in vertebrates. In particular, I identified "natural" modules of gene expression in mouse and human, and I showed that the expression of organ-specific and system-specific genes tends to be conserved between vertebrates as distant as mammals and fishes. Also with a modular approach, I studied patterns of developmental constraints on transcriptome evolution. I showed that neither of the two commonly accepted models of the evolution of embryonic development ("evo-devo") is exclusively valid. In particular, I found that the conservation of the sequences of regulatory regions is highest during mid-development of zebrafish, which supports the "hourglass model". In contrast, events of gene duplication and new gene introduction are rarest in early development, which supports the "early conservation model". In addition to the biological insights on transcriptome evolution, I have also discussed in detail the advantages of modular approaches in large-scale data analysis. Moreover, I re-analyzed several studies (published in high-ranking journals) and showed that their conclusions do not hold up under detailed analysis. This demonstrates that complex analysis of high-throughput data requires cooperation between biologists, bioinformaticians, and statisticians.
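The conceptual definition above (interacting elements, mostly independent of the outside) can be operationalized for expression data as groups of co-expressed genes. The following is a toy illustration, not the thesis's actual method: genes whose expression profiles correlate above a threshold are linked, and modules are the connected components of that graph.

```python
def pearson(x, y):
    """Pearson correlation of two equal-length profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def gene_modules(expr, r_min=0.8):
    """expr: {gene: [expression values]}. Genes with correlation
    >= r_min are linked; modules are the connected components."""
    genes = list(expr)
    adj = {g: set() for g in genes}
    for i, g in enumerate(genes):
        for h in genes[i + 1:]:
            if pearson(expr[g], expr[h]) >= r_min:
                adj[g].add(h)
                adj[h].add(g)
    modules, seen = [], set()
    for g in genes:
        if g in seen:
            continue
        stack, comp = [g], set()
        while stack:          # depth-first traversal of one component
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v] - comp)
        seen |= comp
        modules.append(comp)
    return modules

# Hypothetical profiles: g1/g2 co-expressed, g3/g4 co-expressed.
expr = {'g1': [1, 2, 3, 4], 'g2': [2, 4, 6, 8],
        'g3': [4, 3, 2, 1], 'g4': [8, 6, 4, 2]}
mods = gene_modules(expr)
```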

Relevance: 30.00%
Publisher:
Abstract:

The aim of this work is to present a new concept, called on-line desorption of dried blood spots (on-line DBS), allowing the direct analysis of a dried blood spot coupled to a liquid chromatography-mass spectrometry (LC/MS) device. The system is based on a stainless-steel cell which receives a blood sample (10 microL) previously spotted on a filter paper. The cell is then integrated into the LC/MS system, where the analytes are desorbed out of the paper towards a column-switching system ensuring the purification and separation of the compounds before their detection on a single quadrupole MS coupled to an atmospheric pressure chemical ionisation (APCI) source. With this procedure no sample pretreatment is necessary, even though the analysis is performed on whole blood. To demonstrate the applicability of the concept, saquinavir, imipramine, and verapamil were chosen. Despite the use of a small sampling volume and a single quadrupole detector, on-line DBS allowed the analysis of these three compounds over their therapeutic concentration ranges, from 50 to 500 ng/mL for imipramine and verapamil and from 100 to 1000 ng/mL for saquinavir. Moreover, the method showed good repeatability, with relative standard deviations (RSD) lower than 15% at two concentration levels (low and high). Response functions were found to be linear over the therapeutic range of each compound and were used to determine the concentrations of real patient samples for saquinavir. Comparison of the values found with those of a validated method used routinely in a reference laboratory showed a good correlation between the two methods. Moreover, good selectivity was observed, ensuring that no endogenous or chemical components interfered with the quantitation of the analytes. This work demonstrates the feasibility and applicability of the on-line DBS procedure for bioanalysis.
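The linear response functions used for quantitation amount to an ordinary least-squares calibration line. A generic sketch with hypothetical numbers, not the study's actual calibration data:

```python
def fit_line(x, y):
    """Ordinary least-squares calibration: returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def back_calculate(signal, slope, intercept):
    """Concentration read back from an observed detector signal."""
    return (signal - intercept) / slope

# Hypothetical calibration: spiked concentrations (ng/mL) vs. peak areas.
conc = [50, 100, 200, 300, 500]
area = [105, 210, 420, 630, 1050]
slope, intercept = fit_line(conc, area)
```

A patient-sample signal is then converted to a concentration with `back_calculate`, exactly as a linear response function is used in routine quantitation.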

Relevance: 30.00%
Publisher:
Abstract:

OBJECTIVE: Standard cardiopulmonary bypass (CPB) circuits with their large surface area and volume contribute to postoperative systemic inflammatory reaction and hemodilution. In order to minimize these problems a new approach has been developed resulting in a single disposable, compact arterio-venous loop, which has integral kinetic-assist pumping, oxygenating, air removal, and gross filtration capabilities (CardioVention Inc., Santa Clara, CA, USA). The impact of this system on gas exchange capacity, blood elements and hemolysis is compared to that of a conventional circuit in a model of prolonged perfusion. METHODS: Twelve calves (mean body weight: 72.2 ± 3.7 kg) were placed on cardiopulmonary bypass for 6 h with a flow of 5 L/min, and randomly assigned to the CardioVention system (n=6) or a standard CPB circuit (n=6). A standard battery of blood samples was taken before and throughout bypass. Analysis of variance was used for comparison. RESULTS: The hematocrit remained stable throughout the experiment in the CardioVention group, whereas it dropped in the standard group in the early phase of perfusion. When normalized for prebypass values, the two profiles differed significantly (P<0.01). Both O2 and CO2 transfers were significantly improved in the CardioVention group (P=0.04 and P<0.001, respectively). There was a slightly higher pressure drop in the CardioVention group, but no single value exceeded 112 mmHg. No hemolysis could be detected in either group, with all free plasma Hb values below 15 mg/l. Thrombocyte count, when corrected by hematocrit and normalized by prebypass values, exhibited a greater drop in the standard group (P=0.03). CONCLUSION: The CardioVention system, with its concept of limited priming volume and exposed foreign surface area, improves gas exchange, probably because of the absence of detectable hemodilution, and appears to limit the decrease in thrombocyte count, which may be ascribed to the reduced surface.
Despite the volume and surface constraints, no hemolysis could be detected throughout the 6 h full-flow perfusion period.
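The normalizations described (values expressed relative to the prebypass baseline, platelet counts corrected for hemodilution) amount to simple arithmetic. A sketch with hypothetical numbers, not data from the study:

```python
def normalize_to_baseline(values, baseline):
    """Express serial measurements relative to the prebypass value."""
    return [v / baseline for v in values]

def hct_corrected(values, hematocrits):
    """Correct counts for hemodilution by dividing by the
    hematocrit expressed as a fraction."""
    return [v / h for v, h in zip(values, hematocrits)]

# Hypothetical platelet counts (x10^9/L) and hematocrit fractions
# at baseline and at two time points on bypass:
platelets = [250, 200, 190]
hct = [0.40, 0.32, 0.30]
corrected = hct_corrected(platelets, hct)
relative = normalize_to_baseline(corrected, corrected[0])
```

After correction, a drop in raw platelet count that merely tracks hemodilution disappears, which is the point of the adjustment.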

Relevance: 30.00%
Publisher:
Abstract:

ABSTRACT: BACKGROUND: Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences if used for a binary classification of subjects into a group who should and a group who should not be treated. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. METHODS: We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and a receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis to be applied in the context of case-control studies. RESULTS: We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative to the benefit achieved by a model, being invariant with respect to the coding of the outcome and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Finally, we show that the application of decision curve analysis to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, is achieved with a few modifications to the original calculation procedure. CONCLUSIONS: We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application.
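The quantities involved can be written down directly. A minimal sketch using the standard decision-curve definitions at a threshold probability p_t; the overall net benefit is shown here as the sum of the two components, which is one natural way to combine them, and the paper's exact formulation may differ.

```python
def net_benefit_treated(tp, fp, n, pt):
    """Net benefit of treating model-positives at threshold pt."""
    return tp / n - fp / n * pt / (1 - pt)

def net_benefit_untreated(tn, fn, n, pt):
    """Net benefit of withholding treatment from model-negatives."""
    return tn / n - fn / n * (1 - pt) / pt

def overall_net_benefit(tp, fp, tn, fn, pt):
    """Combine both components (one possible 'overall' definition)."""
    n = tp + fp + tn + fn
    return (net_benefit_treated(tp, fp, n, pt)
            + net_benefit_untreated(tn, fn, n, pt))

# Hypothetical 2x2 counts at pt = 0.2:
onb = overall_net_benefit(tp=80, fp=20, tn=880, fn=20, pt=0.2)
```

Sweeping pt and plotting these quantities against the "treat all" and "treat none" strategies yields the decision curve itself.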

Relevance: 30.00%
Publisher:
Abstract:

The measurement of fat balance (fat input minus fat output) involves the accurate estimation of both metabolizable fat intake and total fat oxidation. This is possible mostly under laboratory conditions and not yet in free-living conditions. In the latter situation, net fat retention/mobilization can be estimated based on precise and accurate sequential body composition measurements. In case of positive balance, lipids stored in adipose tissue can originate from dietary (exogenous) lipids or from nonlipid precursors, mainly from carbohydrates (CHOs) but also from ethanol, through a process known as de novo lipogenesis (DNL). Basic equations are provided in this review to facilitate the interpretation of the different subcomponents of fat balance (endogenous vs exogenous) under different nutritional circumstances. One difficulty is methodological: total DNL is difficult to measure quantitatively in man; for example, indirect calorimetry only tracks net DNL, not total DNL. Although the numerous factors (mostly exogenous) influencing DNL have been studied, in particular the effect of CHO overfeeding, there is little information on the rate of DNL in habitual conditions of life, that is, large day-to-day fluctuations of CHO intakes, different types of CHO ingested with different glycemic indexes, alcohol combined with excess CHO intakes, etc. Three issues, which are still controversial today, will be addressed: (1) Is the increase of fat mass induced by CHO overfeeding explained by DNL only, or by decreased endogenous fat oxidation, or both? (2) Is DNL different in overweight and obese individuals as compared to their lean counterparts? (3) Does DNL occur both in the liver and in adipose tissue? Recent studies have demonstrated that acute CHO overfeeding influences adipose tissue lipogenic gene expression and that CHO may stimulate DNL in skeletal muscles, at least in vitro. 
The role of DNL and its importance in health and disease remain to be further clarified, in particular the putative effect of DNL on the control of energy intake and energy expenditure, as well as the occurrence of DNL in other tissues (such as in myocytes) in addition to hepatocytes and adipocytes.
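The bookkeeping behind the review's basic fat-balance equations can be sketched as follows; the decomposition into dietary fat and DNL-derived fat is written here in its simplest form, as an illustration rather than the review's exact equations.

```python
def fat_balance_g(fat_intake_g, fat_oxidation_g, net_dnl_g=0.0):
    """Net fat storage (g/day): metabolizable fat intake minus total
    fat oxidation, plus net de novo lipogenesis from non-lipid
    precursors (mainly carbohydrate, possibly ethanol)."""
    return fat_intake_g - fat_oxidation_g + net_dnl_g

# Hypothetical day: 100 g dietary fat, 90 g fat oxidized, and 5 g of
# net DNL driven by a carbohydrate surplus -> 15 g of fat stored.
stored = fat_balance_g(100, 90, 5)
```

Note the review's caveat applies here too: indirect calorimetry constrains only the *net* DNL term, not total DNL.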

Relevance: 30.00%
Publisher:
Abstract:

The concept of energy gap(s) is useful for understanding the consequences of a small daily, weekly, or monthly positive energy balance and the inconspicuous shift in weight gain ultimately leading to overweight and obesity. The energy gap is a dynamic concept: an initial positive energy gap incurred via an increase in energy intake (or a decrease in physical activity) is not constant, may fade out with time if the initial conditions are maintained, and depends on the 'efficiency' with which the readjustment of the energy imbalance occurs over time. The metabolic response to an energy imbalance and the magnitude of the energy gap(s) can be estimated by at least two methods: (i) assessment by longitudinal overfeeding studies, imposing (by design) an initial positive energy imbalance; (ii) retrospective assessment based on epidemiological surveys, whereby the accumulated endogenous energy storage per unit of time is calculated from the change in body weight and body composition. In order to illustrate the difficulty of accurately assessing an energy gap, we have used, as an illustrative example, a recent epidemiological study which tracked changes in total energy intake (estimated from gross food availability) and body weight over three decades in the US, combined with total energy expenditure predicted from body weight using doubly labelled water data. At the population level, the study attempted to assess the cause of the energy gap, purported to be entirely due to increased food intake. Based on an estimate of the change in energy intake judged to be more reliable (i.e. from the same study population), together with calculations of simple energetic indices, our analysis suggests that conclusions about the fundamental causes of obesity development in a population (excess intake vs. low physical activity, or both) are clouded by a high level of uncertainty.
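The retrospective assessment (method ii) can be sketched numerically. The energy densities below (~39.5 MJ/kg for fat mass, ~7.6 MJ/kg for fat-free mass) are commonly used textbook values, assumed here purely for illustration:

```python
def energy_gap_mj_per_day(delta_fat_kg, delta_ffm_kg, days,
                          e_fat=39.5, e_ffm=7.6):
    """Retrospective energy gap: endogenous energy stored per day,
    from changes in fat mass and fat-free mass over a period.
    e_fat and e_ffm are assumed energy densities in MJ/kg."""
    return (delta_fat_kg * e_fat + delta_ffm_kg * e_ffm) / days

# Hypothetical example: a gain of 3.65 kg of fat mass over one year.
gap = energy_gap_mj_per_day(3.65, 0.0, 365)
```

A fat gain of a few kilograms per year thus corresponds to an average stored-energy gap of only a few hundred kJ/day, which is why small, sustained imbalances are so hard to detect yet sufficient to drive weight gain.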

Relevance: 30.00%
Publisher:
Abstract:

We propose a multivariate approach to the study of geographic species distribution which does not require absence data. Building on Hutchinson's concept of the ecological niche, this factor analysis compares, in the multidimensional space of ecological variables, the distribution of the localities where the focal species was observed to a reference set describing the whole study area. The first factor extracted maximizes the marginality of the focal species, defined as the ecological distance between the species optimum and the mean habitat within the reference area. The other factors maximize the specialization of this focal species, defined as the ratio of the ecological variance of the mean habitat to that observed for the focal species. Eigenvectors and eigenvalues are readily interpreted and can be used to build habitat-suitability maps. This approach is recommended in situations where absence data are not available (many data banks), unreliable (most cryptic or rare species), or meaningless (invaders). We provide an illustration and validation of the method for the Alpine ibex, a species reintroduced in Switzerland which presumably has not yet recolonized its entire range.
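The two quantities at the heart of this factor analysis can be illustrated on a single ecological variable. A toy sketch with made-up values; the 1.96 scaling is a convention sometimes used so marginality falls near [0, 1], and the full method works in the multidimensional factor space rather than variable by variable:

```python
def mean_sd(vals):
    """Population mean and standard deviation."""
    n = len(vals)
    m = sum(vals) / n
    sd = (sum((v - m) ** 2 for v in vals) / n) ** 0.5
    return m, sd

def marginality(species_vals, global_vals):
    """Distance between the species optimum and the mean habitat,
    in units of the global standard deviation."""
    ms, _ = mean_sd(species_vals)
    mg, sg = mean_sd(global_vals)
    return abs(ms - mg) / (1.96 * sg)

def specialization(species_vals, global_vals):
    """Ratio of habitat variability available to variability used
    (expressed as a ratio of standard deviations)."""
    _, ss = mean_sd(species_vals)
    _, sg = mean_sd(global_vals)
    return sg / ss

# Hypothetical elevation values: whole study area vs. observed localities.
area = [0, 0, 4, 4]
localities = [4, 4, 6, 6]
M = marginality(localities, area)
S = specialization(localities, area)
```

A species occupying habitat far from the regional mean (high M) over a narrow slice of the available range (high S) has a restricted niche, which is exactly what the extracted factors quantify.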

Relevance: 30.00%
Publisher:
Abstract:

The purpose of this PhD thesis is to investigate a semantic relation present in the connection of sentences (more specifically, propositional units). This relation, which we refer to as contrast, includes the traditional categories of adversatives - predominantly represented by the connector but in English and pero in Modern Spanish - and concessives, prototypically verbalised through although / aunque. The aim is to describe, analyse and - as far as possible - explain the emergence and evolution of different syntactic schemes marking contrast during the first three centuries of Spanish (also referred to as Castilian) as a literary language, i.e., from the 13th to the 15th century. The starting point of this question is a commonplace in syntax, whereby the semantic and syntactic complexity of clause linkage correlates with the degree of textual elaboration. In historical linguistics, i.e., applied to the phylogeny of a language, this is commonly referred to as the parataxis hypothesis. A crucial part of the thesis is dedicated to the definition of contrast as a semantic relation. Although the label contrast has been used in this sense, mainly in functional grammar and text linguistics, mainstream grammaticography and linguistics remain attached to the traditional categories of adversatives and concessives. In opposition to this traditional view, we present our own model of contrast, based on a pragma-semantic description proposed for the analysis of adversatives by Oswald Ducrot and subsequently adopted by Ekkehard König for the analysis of concessives.
We refine and further develop this model in order for it to accommodate all instances of contrast in Spanish, not just the prototypical ones, arguing that the relationship between adversatives and concessives is a marked opposition, i.e., that the higher degree of semantic and syntactic integration of concessives restricts some possible readings that the adversatives may have, but that this difference is almost systematically neutralised by contextual factors, thus justifying the assumption of contrast as a comprehensive onomasiological category. This theoretical focus is completed by a state-of-the-question overview attempting to account for all relevant forms in which contrast is expressed in Medieval Spanish, with the aid of lexicographic and grammaticographical sources, and by a corpus study investigating the textual functions of contrast in nine Medieval Spanish texts: Cantar de Mio Cid, Libro de Alexandre, Milagros de Nuestra Señora, Estoria de España, Primera Partida, Lapidario, Libro de buen amor, Conde Lucanor, and Corbacho. This corpus is analysed using quantitative and qualitative tools, and the study is accompanied by a series of methodological remarks on how to investigate a pragma-semantic category in historical linguistics. The corpus study shows that the parataxis hypothesis is not confirmed statistically, although a qualitative analysis shows that the use of subordination does increase over time in some particular contexts.

Relevance: 30.00%
Publisher:
Abstract:

BACKGROUND: The aims of the study were to evaluate the prevalence of acute coronary syndrome (ACS) among patients presenting with atypical chest pain who are evaluated for acute aortic syndrome (AAS) or pulmonary embolism (PE) with computed tomographic angiography (CTA), and to discuss the rationale for the use of a triple rule-out (TRO) protocol for triaging these patients. METHODS: This study is a retrospective analysis of patients presenting with atypical chest pain and evaluated with thoracic CTA for suspicion of AAS/PE. Two physicians reviewed patient files for demographic characteristics, initial CT diagnosis and final clinical diagnosis. Patients were classified according to the CTA finding into AAS, PE and other diagnoses, and according to the final clinical diagnosis into AAS, PE, ACS and other diagnoses. RESULTS: Four hundred and sixty-seven patients were evaluated: 396 (84.8%) for clinical suspicion of PE and 71 (15.2%) for suspicion of AAS. The prevalence of ACS and AAS was low among the PE patients: 5.5% and 0.5% respectively (P = 0.0001), while the prevalence of ACS and PE was 18.3% and 5.6% among AAS patients (P = 0.14 and P = 0.34 respectively). CONCLUSION: The prevalence of ACS and AAS among patients clinically suspected of having PE is limited, while the prevalence of ACS and PE among patients clinically suspected of having AAS is substantial. Accordingly, patients with suspected PE could be evaluated with a dedicated PE CTA, while those with suspected AAS should still be triaged using a TRO protocol.

Relevance: 30.00%
Publisher:
Abstract:

The aim of the present study was to elicit how patients with delusions with religious content conceptualized or experienced their spirituality and religiousness. Sixty-two patients with present or past religious delusions underwent semistructured interviews, which were analyzed using the three coding steps described in grounded theory. Three major themes were found in religious delusions: "spiritual identity," "meaning of illness," and "spiritual figures." One higher-order concept was found: "structure of beliefs." We identified dynamics that put these personal beliefs into constant reconstruction through interaction with the world and others (i.e., open dynamics) and, conversely, structural dynamics that created a complete rupture with the surrounding world and others (i.e., closed structural dynamics); these dynamics may coexist. These analyses may help to identify the psychological functions of delusions with religious content and, therefore, to better conceptualize interventions when dealing with them in psychotherapy.

Relevance: 30.00%
Publisher:
Abstract:

BACKGROUND: Shared Decision Making (SDM) is increasingly advocated as a model for medical decision making. However, there is still low use of SDM in clinical practice. High impact factor journals might represent an efficient way for its dissemination. We aimed to identify and characterize publication trends of SDM in 15 high impact medical journals. METHODS: We selected the 15 general and internal medicine journals with the highest impact factor publishing original articles, letters and editorials. We retrieved publications from 1996 to 2011 through the full-text search function on each journal website and abstracted bibliometric data. We included publications of any type containing the phrase "shared decision making" or five other variants in their abstract or full text. These were referred to as SDM publications. A polynomial Poisson regression model with logarithmic link function was used to assess the evolution across the period of the number of SDM publications according to publication characteristics. RESULTS: We identified 1285 SDM publications out of 229,179 publications in 15 journals from 1996 to 2011. The absolute number of SDM publications by journal ranged from 2 to 273 over 16 years. SDM publications increased both in absolute and relative numbers per year, from 46 (0.32% relative to all publications from the 15 journals) in 1996 to 165 (1.17%) in 2011. This growth was exponential (P < 0.01). We found fewer research publications (465, 36.2% of all SDM publications) than non-research publications, which included non-systematic reviews, letters, and editorials. The increase of research publications across time was linear. Full-text search retrieved ten times more SDM publications than a similar PubMed search (1285 vs. 119). CONCLUSION: This review in full-text showed that SDM publications increased exponentially in major medical journals from 1996 to 2011.
This growth might reflect an increased dissemination of the SDM concept to the medical community.
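The growth model can be sketched as a log-linear Poisson regression. The minimal IRLS fit below assumes a single linear term in time, whereas the paper used a polynomial model; the counts generated here are hypothetical.

```python
from math import exp, log

def wls2(t, z, w):
    """Weighted least squares for z ~ b0 + b1*t (closed-form 2x2 solve)."""
    sw = sum(w)
    swt = sum(wi * ti for wi, ti in zip(w, t))
    swtt = sum(wi * ti * ti for wi, ti in zip(w, t))
    swz = sum(wi * zi for wi, zi in zip(w, z))
    swtz = sum(wi * ti * zi for wi, ti, zi in zip(w, t, z))
    det = sw * swtt - swt * swt
    return ((swtt * swz - swt * swtz) / det,
            (sw * swtz - swt * swz) / det)

def fit_poisson_loglinear(t, y, iters=50):
    """Poisson GLM with log link, log E[y] = b0 + b1*t, fitted by
    iteratively reweighted least squares. Started from an OLS fit
    to log(y + 0.5) so the iteration begins near the solution."""
    b0, b1 = wls2(t, [log(yi + 0.5) for yi in y], [1.0] * len(t))
    for _ in range(iters):
        mu = [exp(b0 + b1 * ti) for ti in t]
        # working response and weights for the log link
        z = [b0 + b1 * ti + (yi - mi) / mi
             for ti, yi, mi in zip(t, y, mu)]
        b0, b1 = wls2(t, z, mu)
    return b0, b1

# Hypothetical yearly counts growing exponentially at rate 0.2/year:
t = list(range(11))
y = [exp(0.5 + 0.2 * ti) for ti in t]
b0, b1 = fit_poisson_loglinear(t, y)
```

With the log link, exp(b1) is the implied yearly multiplicative growth factor, which is what "exponential growth" means in this modelling framework.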

Relevance: 30.00%
Publisher:
Abstract:

This thesis develops a comprehensive and flexible statistical framework for the analysis and detection of spatial, temporal and spatio-temporal clusters of environmental point data. The developed clustering methods were applied to both simulated datasets and real-world environmental phenomena; however, only the cases of forest fires in the Canton of Ticino (Switzerland) and in Portugal are expounded in this document. Environmental phenomena can typically be modelled as stochastic point processes, where each event, e.g. a forest fire ignition point, is characterised by its spatial location and its occurrence in time. Additionally, information such as burned area, ignition causes, land use, and topographic, climatic and meteorological features can be used to further characterise the studied phenomenon. The space-time pattern characterisation thereby represents a powerful tool to understand the distribution and behaviour of the events and their correlation with underlying processes, for instance socio-economic, environmental and meteorological factors. Consequently, we propose a methodology based on the adaptation and application of statistical and fractal point-process measures for both global analysis (e.g. the Morisita index, the box-counting fractal method, the multifractal formalism and Ripley's K-function) and local analysis (e.g. scan statistics). Many measures describing the space-time distribution of environmental phenomena have been proposed in a wide variety of disciplines; nevertheless, most of these measures are global in character and do not consider complex spatial constraints, high variability or the multivariate nature of the events.
Therefore, we propose a statistical framework that takes into account the complexities of the geographical space where phenomena take place, by introducing the Validity Domain concept and carrying out clustering analyses on data with differently constrained geographical spaces, hence assessing the relative degree of clustering of the real distribution. Moreover, specifically for the forest fire case, this research proposes two new methodologies: one for defining and mapping the Wildland-Urban Interface (WUI), described as the interaction zone between burnable vegetation and anthropogenic infrastructures, and one for predicting fire ignition susceptibility. The main objective of this thesis was thus to carry out basic statistical/geospatial research with a strong applied component, in order to analyse and describe complex phenomena and to overcome unsolved methodological problems in the characterisation of space-time patterns, in particular forest fire occurrences. This thesis thereby responds to the increasing demand for environmental monitoring and management tools for the assessment of natural and anthropogenic hazards and risks, sustainable development, retrospective success analysis, etc. The major contributions of this work were presented at national and international conferences and published in 5 scientific journals. National and international collaborations were also established and successfully accomplished.
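Of the global measures listed, the Morisita index is the simplest to state. A minimal sketch assuming Q equal-sized quadrats, as a toy illustration rather than the adapted, constraint-aware measures developed in the thesis:

```python
def morisita_index(counts):
    """Morisita index of dispersion over Q equal-sized quadrats:
    I = Q * sum(n_i * (n_i - 1)) / (N * (N - 1)), with N total points.
    I ~ 1 for a random pattern, > 1 for clustering, < 1 for regularity."""
    Q = len(counts)
    N = sum(counts)
    return Q * sum(n * (n - 1) for n in counts) / (N * (N - 1))

# Hypothetical quadrat counts of ignition points:
uniform = morisita_index([2, 2, 2, 2])    # evenly spread events
clustered = morisita_index([8, 0, 0, 0])  # all events in one quadrat
```

In practice the index is computed across a range of quadrat sizes, which is what links it to the box-counting and multifractal analyses also used in the framework.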

Relevance: 30.00%
Publisher:
Abstract:

Defining the digital humanities might be an endless debate if we stick to discussing the boundaries of this concept as an academic "discipline". In an attempt to identify this field and its actors concretely, this paper shows that it is possible to analyse them through Twitter, a social media platform widely used by this "community of practice". Based on a network analysis of 2,500 users identified as members of this movement, the visualisation of the "who's following whom?" graph allows us to highlight the structure of the network's relationships and to identify users who occupy particular positions. Specifically, we show that linguistic groups are key factors in explaining clustering within a network whose characteristics resemble those of a small world.
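Small-world structure is typically diagnosed by comparing the average clustering coefficient and the characteristic path length against those of a comparable random graph. A minimal sketch of the clustering part, on an undirected version of a follower graph with hypothetical data (not the paper's actual pipeline):

```python
def avg_clustering(adj):
    """Average local clustering coefficient of an undirected graph
    given as {node: set(neighbours)}. Node labels must be mutually
    comparable (e.g. all ints or all strings) for the a < b dedup.
    Nodes with fewer than 2 neighbours contribute a coefficient of 0."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        # count edges among the neighbours of v (each pair once)
        links = sum(1 for a in nbrs for b in nbrs
                    if a < b and b in adj[a])
        total += 2 * links / (k * (k - 1))
    return total / len(adj)

# Toy graphs: a triangle is maximally clustered, a path not at all.
triangle = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}}
path = {1: {2}, 2: {1, 3}, 3: {2}}
```

A high clustering coefficient combined with a short average path length, relative to a random graph of the same size and density, is the usual operational definition of a small-world network.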