921 results for COMPLEXITY


Relevance: 20.00%

Abstract:

Objectives: To analyze the relationship between pharmacotherapeutic complexity and achievement of therapeutic objectives in HIV+ patients on antiretroviral treatment and concomitant dyslipidemia therapy. Materials and methods: A retrospective observational study including HIV patients on stable antiretroviral treatment during the previous 6 months and on dyslipidemia treatment between January and December 2013. The complexity index was calculated with the tool developed by McDonald et al. Other variables analyzed were: age, gender, HIV risk factor, smoking, alcohol and drug use, psychiatric disorders, adherence to antiretroviral treatment and lipid-lowering drugs, and clinical parameters (HIV viral load, CD4 count, and plasma levels of total cholesterol, LDL, HDL, and triglycerides). To identify the predictive factors associated with achievement of the therapeutic objectives, univariate analysis was conducted through logistic regression, followed by a multivariate analysis. Results: The study included 89 patients; 56.8% of them met the therapeutic objectives for dyslipidemia. The complexity index was significantly higher (p = 0.02) in patients who did not reach the objective values (median 51.8 vs. 38.9). Adherence to lipid-lowering treatment was significantly associated with achievement of the therapeutic objectives established for dyslipidemia treatment. Overall, 67.0% of patients met the objectives for their antiretroviral treatment; however, the complexity index was not significantly higher (p = 0.06) in patients who did not meet those objectives. Conclusions: Pharmacotherapeutic complexity is a key factor in achieving health objectives in HIV+ patients treated for dyslipidemia.
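
The univariate-then-multivariate logistic regression step described in the methods can be sketched as follows; the column names and the 0.05 screening threshold are illustrative assumptions, not the study's actual variable coding.

```python
# Sketch of a univariate logistic screen followed by a multivariate model.
# Column names and the alpha threshold are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm

def screen_predictors(df: pd.DataFrame, candidates: list[str], outcome: str,
                      alpha: float = 0.05):
    """Fit one logistic model per candidate, then a multivariate model
    with the candidates whose univariate p-value is below `alpha`."""
    univariate_p = {}
    for var in candidates:
        fit = sm.Logit(df[outcome], sm.add_constant(df[[var]])).fit(disp=0)
        univariate_p[var] = fit.pvalues[var]
    selected = [v for v, p in univariate_p.items() if p < alpha]
    multivariate = sm.Logit(df[outcome], sm.add_constant(df[selected])).fit(disp=0)
    return univariate_p, multivariate

# Example call on a hypothetical one-row-per-patient cohort table:
# screen_predictors(cohort, ["complexity_index", "age", "adherent_lipid_lowering"],
#                   outcome="meets_dyslipidemia_objective")
```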

Relevance: 20.00%

Abstract:

The ability to estimate the impact of ongoing climate change on the hydrological behaviour of hydro-systems is essential for anticipating the inevitable and necessary adaptations that our societies will have to consider. In this context, this doctoral project evaluates the sensitivity of future hydrological projections to: (i) the non-robustness of hydrological model parameter identification, (ii) the use of several equifinal parameter sets, and (iii) the use of different hydrological model structures. To quantify the impact of the first source of uncertainty on model outputs, four climatically contrasted sub-periods are first identified within the observed records. The models are calibrated on each of these four periods and the resulting outputs are analysed in calibration and in validation following the four configurations of the Differential Split-Sample Test (Klemeš, 1986; Wilby, 2005; Seiller et al., 2012; Refsgaard et al., 2014). To study the second source of uncertainty, related to parameter equifinality, the outputs associated with equifinal parameter sets are then considered for each calibration type. Finally, to evaluate the third source of uncertainty, five hydrological models of different levels of complexity (GR4J, MORDOR, HSAMI, SWAT and HYDROTEL) are applied to the Au Saumon River watershed in Québec. The three sources of uncertainty are evaluated both under past observed climatic conditions and under future climatic conditions. The results show that, given the evaluation method followed in this thesis, the use of hydrological models of different levels of complexity is the main source of variability in streamflow projections under future climatic conditions, followed by the lack of robustness of parameter identification. Hydrological projections generated by an ensemble of equifinal parameter sets are close to those associated with the optimal parameter set. Consequently, more effort should be invested in improving model robustness for climate change impact studies, in particular by developing more appropriate model structures and by proposing calibration procedures that increase their robustness. This work provides a detailed assessment of our ability to diagnose the impacts of climate change on the water resources of the Au Saumon watershed and proposes an original methodological framework that can be applied directly, or adapted, to other hydro-climatic contexts.
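
The calibration/validation design described above can be illustrated schematically; the model interface (calibrate/simulate methods) and the sub-period keys below are hypothetical placeholders, not the actual setup used with GR4J, MORDOR, HSAMI, SWAT or HYDROTEL.

```python
# Schematic Differential Split-Sample Test loop: calibrate on one climatically
# contrasted sub-period and validate on another, for every ordered pair.
from itertools import permutations

def dsst(model, forcing, observed, score):
    """model: object exposing calibrate(forcing, observed) -> params and
    simulate(params, forcing) -> flows (hypothetical interface).
    forcing / observed: dicts keyed by sub-period name ('dry', 'wet', ...)."""
    results = {}
    for cal, val in permutations(forcing, 2):
        params = model.calibrate(forcing[cal], observed[cal])
        simulated = model.simulate(params, forcing[val])
        results[(cal, val)] = score(simulated, observed[val])
    return results
```

With four contrasted sub-periods this generic loop yields twelve calibration/validation pairs; the thesis retains the four DSST configurations, which are a subset of such transfers between climatically different periods.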

Relevance: 20.00%

Abstract:

In this work we consider several instances of the following problem: "how complicated can the isomorphism relation for countable models be?" Using the Borel reducibility framework, we investigate this question for the space of countable models of particular complete first-order theories. We also investigate to what extent this complexity is mirrored in the number of back-and-forth inequivalent models of the theory. We consider this question for two large and related classes of theories. First, we consider o-minimal theories, showing that if T is o-minimal, then the isomorphism relation is either Borel complete or Borel. Further, if it is Borel, we characterize exactly which values can occur, and when they occur. In all cases Borel completeness implies lambda-Borel completeness for all lambda. Second, we consider colored linear orders, which are (complete theories of) a linear order expanded by countably many unary predicates. We obtain the same characterization as for o-minimal theories, taking the same values, except that all finite values other than two are possible. We characterize exactly when each possibility occurs, much as in the o-minimal case. Additionally, we extend Schirrman's theorem, showing that if the language is finite, then T is countably categorical or Borel complete. As before, in all cases Borel completeness implies lambda-Borel completeness for all lambda.
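
For background (this is the standard definition, not something specific to the thesis), an equivalence relation E on a Polish space X is Borel reducible to F on a Polish space Y when:

```latex
% Standard definition of Borel reducibility, given as background only.
E \le_B F \iff \exists\, f\colon X \to Y \text{ Borel such that }
\forall x_1, x_2 \in X,\ \; x_1 \mathrel{E} x_2 \iff f(x_1) \mathrel{F} f(x_2).
```

An isomorphism relation is Borel complete when every isomorphism relation on countable structures in a countable language Borel-reduces to it, so Borel completeness is the maximal possible complexity in this framework.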

Relevance: 20.00%

Abstract:

The farm-gate value of extensive beef production from the northern Gulf region of Queensland, Australia, is ~$150 million annually. Poor profitability and declining equity are common issues for most beef businesses in the region. The beef industry relies primarily on native pasture systems, and studies continue to report a decline in the condition and productivity of important land types in the region. Governments and Natural Resource Management groups are investing significant resources to restore landscape health and productivity. Fundamental community expectations also include broader environmental outcomes, such as reducing the beef industry's greenhouse gas emissions. Whole-of-business analysis results are presented from 18 extensive beef businesses (producers) to highlight the complex social and economic drivers of management decisions that affect the natural resource base and the environment. Business analysis activities also focussed on improving enterprise performance. Profitability, herd performance and greenhouse emission benchmarks are documented and discussed.

Relevance: 20.00%

Abstract:

Nitrogen (N) is an essential plant nutrient in maize production and, when only natural sources are considered, is often the factor limiting grain yield worldwide. For this reason, many farmers around the world supplement available soil N with synthetic forms. Years of over-application of N fertilizer have led to increased N in groundwater and streams due to leaching and run-off from agricultural sites. In the Midwest Corn Belt much of this excess N eventually makes its way to the Gulf of Mexico, leading to eutrophication (an increase in phytoplankton) and a hypoxic (reduced oxygen) dead zone. Growing concern about these problems and the desire for greater input use efficiency have created demand for crops with improved N use efficiency (NUE), which would allow reduced N fertilizer application rates and subsequently lower N pollution. It is well known that roots are responsible for N uptake, but relatively little is known about how root architecture affects this ability. This research was conducted to better understand the influence of root complexity (RC) in maize on a plant's response to N stress, as well as the influence of RC on other above-ground plant traits. Thirty-one above-ground plant traits were measured for 64 recombinant inbred lines (RILs) from the intermated B73 × Mo17 (IBM) population and their backcrosses (BCs) to either parent, B73 or Mo17, under normal (182 kg N ha-1) and N-deficient (0 kg N ha-1) conditions. The RILs were selected based on results from an earlier experiment by Novais et al. (2011), which screened 232 RILs from the IBM population to obtain root complexity measurements. The 64 selected RILs comprised 31 of the lowest-complexity RILs (RC1) and 33 of the highest-complexity RILs (RC2) in terms of root architecture (characterized as fractal dimensions). The use of the parental BCs classifies the experiment as Design III, an experimental design developed by Comstock and Robinson (1952) that allows estimation of the significance and level of dominance. Of the 31 traits measured, 12 were whole-plant traits chosen for their documented response to N stress. The other 19 were ear traits commonly measured for their influence on yield. Results showed that genotypes from RC1 and RC2 differ significantly for several above-ground phenotypes. We also observed a difference in the number and magnitude of N treatment responses between the two RC classes. Differences in phenotypic trait correlations, and in how they changed in response to N, were also observed between the RC classes. RC did not appear to be strongly correlated with calculated NUE (ΔYield/ΔN). Quantitative genetic analysis using the Design III framework revealed significant dominance effects acting on several traits, as well as changes in the significance and level of dominance between N treatments. Several QTL were mapped for 26 of the 31 traits, and significant N effects were observed across the majority of the genome for some traits indicative of N stress (e.g. stay-green). This research and related projects are essential to a better understanding of plant N uptake and metabolism, a necessary step toward the goal of breeding crops with better NUE.
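
The NUE figure used above (ΔYield/ΔN) amounts to the yield gain per unit of applied N between the two treatments. A minimal sketch of that calculation is below; the column names and the example values are made up for illustration, not data from the study.

```python
# NUE = ΔYield/ΔN, computed per genotype from the two N treatments (182 and 0 kg N ha-1).
import pandas as pd

def nue(df: pd.DataFrame) -> pd.Series:
    """df columns: genotype, n_rate (kg N ha-1), yield_kg_ha (hypothetical names)."""
    wide = df.pivot_table(index="genotype", columns="n_rate", values="yield_kg_ha")
    return (wide[182] - wide[0]) / (182 - 0)

data = pd.DataFrame({
    "genotype": ["RIL_01", "RIL_01", "RIL_02", "RIL_02"],
    "n_rate": [182, 0, 182, 0],
    "yield_kg_ha": [9500.0, 6200.0, 8800.0, 6900.0],  # made-up example values
})
print(nue(data))  # kg grain gained per kg N applied, per genotype
```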

Relevance: 20.00%

Abstract:

Purpose: Current thinking about ‘patient safety’ emphasises the causal relationship between the work environment and the delivery of clinical care. This research draws on the theory of Normal Accidents to extend this analysis and better understand the ‘organisational factors’ that threaten safety. Methods: Ethnographic research methods were used, with 18 months of observation in the operating department setting and interviews with 80 members of hospital staff. The setting for the study was the Operating Department of a large teaching hospital in the North-West of England. Results: The work of the operating department is determined by inter-dependent, ‘tightly coupled’ organisational relationships between hospital departments, based upon the timely exchange of information, services and resources required for the delivery of care. Failures within these processes, manifest as ‘breakdowns’ in inter-departmental relationships, lead to situations of constraint, rapid change and uncertainty in the work of the operating department that require staff to break with established routines and to work under increased time and emotional pressure. This means that staff focus on working quickly rather than working safely. Conclusion: Analysis of safety needs to move beyond a focus on the immediate work environment and individual practice to consider the more complex and deeply structured organisational systems of hospital activity. For departmental managers, the scope for service planning to control for safety may be limited, as the structured ‘real world’ situation of service delivery is shaped by inter-departmental and organisational factors that are perhaps beyond the scope of departmental management.

Relevance: 20.00%

Abstract:

In the Guaymas Basin, the presence of cold seeps and hydrothermal vents in close proximity, similar sedimentary settings and comparable depths offers a unique opportunity to assess and compare the functioning of these deep-sea chemosynthetic ecosystems. The food webs of five seep and four vent assemblages were studied using stable carbon and nitrogen isotope analyses. Although the two ecosystems shared similar potential basal sources, their food webs differed: seeps relied predominantly on methanotrophy and thiotrophy via the Calvin-Benson-Bassham (CBB) cycle, and vents on petroleum-derived organic matter and thiotrophy via the CBB and reductive tricarboxylic acid (rTCA) cycles. In contrast to symbiotic species, the heterotrophic fauna exhibited high trophic flexibility among assemblages, suggesting weak trophic links to the metabolic diversity of chemosynthetic primary producers. In both ecosystems, food webs did not appear to be organised through predator-prey links but rather through weak trophic relationships among co-occurring species. Examples of trophic or spatial niche differentiation highlighted the importance of species-sorting processes within chemosynthetic ecosystems. Variability in food-web structure, addressed through Bayesian metrics, revealed consistent trends across ecosystems. Food-web complexity decreased significantly with increasing methane concentration, a common proxy for the intensity of seep and vent fluid fluxes. Although high fluid fluxes have the potential to enhance primary productivity, they generate environmental constraints that may limit microbial diversity, colonisation by consumers and the structuring role of competitive interactions, leading to an overall reduction in food-web complexity and an increase in trophic redundancy. Heterogeneity provided by foundation species was identified as an additional structuring factor. Through their biological activities, foundation species may partly release the competitive pressure within communities of low fluid-flux habitats. Finally, ecosystem functioning in vents and seeps was highly similar despite environmental differences (e.g. physico-chemistry, dominant basal sources), suggesting that ecological niches are not specifically linked to the nature of the fluids. This comparison of seep and vent functioning in the Guaymas Basin thus provides further support for the hypothesis of continuity among deep-sea chemosynthetic ecosystems.
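
As background on the nitrogen-isotope side of such analyses, a widely used trophic-position formula (Post, 2002) is sketched below; the baseline level and the 3.4‰ per-level enrichment are conventional assumptions, and the study's own isotope treatment and Bayesian food-web metrics are not reproduced here.

```python
# Standard trophic-position estimate from δ15N (background only, not the study's method).
def trophic_position(d15n_consumer: float, d15n_baseline: float,
                     baseline_level: float = 2.0, enrichment: float = 3.4) -> float:
    """Trophic position = baseline level + (δ15N_consumer - δ15N_baseline) / enrichment,
    where `enrichment` is the assumed per-trophic-level 15N enrichment in ‰."""
    return baseline_level + (d15n_consumer - d15n_baseline) / enrichment

print(trophic_position(12.8, 5.1))  # ≈ 4.26 under these default assumptions
```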

Relevance: 20.00%

Abstract:

This article deals with climate change from a linguistic perspective. Climate change is an extremely complex issue that has exercised the minds of experts and policy makers with renewed urgency in recent years. It has prompted an explosion of writing in the media, on the internet and in the domain of popular science and literature, as well as a proliferation of new compounds around the word ‘carbon’ as a hub, such as ‘carbon indulgence’, a new compound that will be studied in this article. Through a linguistic analysis of lexical and discourse formations around such ‘carbon compounds’ we aim to contribute to a broader understanding of the meaning of climate change. Lexical carbon compounds are used here as indicators for observing how human symbolic cultures change and adapt in response to environmental threats and how symbolic innovation and transmission occurs.
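
As a toy illustration of how such ‘carbon compounds’ can be harvested from text, the sketch below counts two-word sequences headed by ‘carbon’; the tokenisation and the sample sentence are placeholders, not the article's actual corpus or method.

```python
# Toy extraction of 'carbon X' compounds from running text.
import re
from collections import Counter

def carbon_compounds(text: str) -> Counter:
    """Count two-word compounds whose head word is 'carbon' (e.g. 'carbon indulgence')."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return Counter(
        f"carbon {nxt}" for tok, nxt in zip(tokens, tokens[1:]) if tok == "carbon"
    )

sample = "Carbon footprint, carbon indulgence and carbon trading dominate the debate."
print(carbon_compounds(sample))
```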

Relevance: 20.00%

Abstract:

Heart rate complexity analysis is a powerful non-invasive means to diagnose several cardiac ailments. Non-linear tools of complexity measurement are indispensable for bringing out the complete non-linear behavior of physiological signals. The most popular non-linear tools for measuring signal complexity are entropy measures such as approximate entropy (ApEn) and sample entropy (SampEn). However, these methods can become unreliable and inaccurate, particularly for short data lengths. Recently, a novel method of complexity measurement called distribution entropy (DistEn) was introduced, which showed reliable performance in capturing the complexity of both short-term synthetic and short-term physiological data. This study aims to (i) examine the competence of DistEn in discriminating arrhythmia from normal sinus rhythm (NSR) subjects using RR interval time series data; (ii) explore the consistency of DistEn across data lengths N; and (iii) compare the performance of DistEn with ApEn and SampEn. Sixty-six RR interval time series belonging to two groups of cardiac conditions, 'Arrhythmia' and 'NSR', were used for the analysis. The data length N was varied from 50 to 1000 beats with embedding dimension m = 2 for all entropy measurements. The maximum ROC areas obtained using ApEn, SampEn and DistEn were 0.83, 0.86 and 0.94 for data lengths of 1000, 1000 and 500 beats, respectively. The results show that DistEn exhibits consistently high performance as a classification feature in comparison with ApEn and SampEn. DistEn therefore shows promise as a biomarker for detecting arrhythmia from short RR interval recordings.
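
A minimal sketch of distribution entropy is given below, following the commonly described recipe (embed the series, take all pairwise Chebyshev distances, histogram them, and normalise the Shannon entropy of that histogram); the bin count is a common default rather than a setting reported in this study, and the RR series shown is synthetic.

```python
# Sketch of Distribution Entropy (DistEn) for an RR interval series.
import numpy as np

def dist_en(x, m: int = 2, bins: int = 512) -> float:
    x = np.asarray(x, dtype=float)
    n = len(x) - m + 1
    # Embedding: n state vectors of length m.
    emb = np.lib.stride_tricks.sliding_window_view(x, m)
    # Chebyshev (max-norm) distance between every pair of distinct vectors.
    d = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=-1)
    upper = d[np.triu_indices(n, k=1)]
    # Empirical distribution of distances, then normalised Shannon entropy.
    p, _ = np.histogram(upper, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum() / np.log2(bins))

rr = np.random.default_rng(0).normal(0.8, 0.05, 300)  # synthetic RR intervals (s)
print(dist_en(rr, m=2))
```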

Relevance: 20.00%

Abstract:

Scientific workflow offers a framework for cooperation between remote and shared resources in a grid computing environment (GCE) for scientific discovery. One major function of a scientific workflow is to schedule a collection of computational subtasks in well-defined orders for efficient output by estimating task durations at runtime. In this paper, we propose a novel time computation model based on algorithm complexity (termed the TCMAC model) for high-level, data-intensive scientific workflow design. The proposed model schedules subtasks based on their durations and the complexities of the participating algorithms. Characterized by its use of a task duration computation function for time efficiency, the TCMAC model has three features for full-aspect scientific workflows covering both dataflow and control flow: (1) it provides flexible and reusable task duration functions in the GCE; (2) it facilitates better parallelism in iteration structures by providing more precise task durations; and (3) it accommodates dynamic task durations for rescheduling in selective structures of the control flow. We also present theories and examples in scientific workflows to show the efficiency of the TCMAC model, especially for control flow.
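
The core idea of estimating a subtask's duration from its algorithm's complexity can be illustrated roughly as follows; the cost functions, throughput figure and longest-first ordering are generic illustrative assumptions, not the TCMAC model's actual formulation.

```python
# Generic illustration: each subtask declares the asymptotic cost of its algorithm,
# and the scheduler turns that into an estimated runtime for a given input size.
import math
from dataclasses import dataclass
from typing import Callable

@dataclass
class Subtask:
    name: str
    cost: Callable[[int], float]   # operation count as a function of input size
    input_size: int
    ops_per_second: float = 1e9    # assumed node throughput

    def estimated_duration(self) -> float:
        return self.cost(self.input_size) / self.ops_per_second

tasks = [
    Subtask("sort", lambda n: n * math.log2(n), 50_000_000),
    Subtask("pairwise", lambda n: n ** 2, 40_000),
    Subtask("scan", lambda n: n, 200_000_000),
]

# Order subtasks by estimated duration (one simple list-scheduling heuristic).
for t in sorted(tasks, key=Subtask.estimated_duration, reverse=True):
    print(f"{t.name}: ~{t.estimated_duration():.2f} s")
```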

Relevance: 20.00%

Abstract:

Drawing on the system complexity literature, this study investigated how supply chain complexity impacts firms' operational performance and what role supply chain orientation plays in the complexity-performance relationship. The study provided empirical evidence for the argument that dynamic complexity, as opposed to structural complexity, is more difficult for firms to accommodate effectively.

Relevance: 20.00%

Abstract:

“Students as co-researchers” is a mode of engagement between students and teachers in school systems that has been likened to a bridge. This article explores the bridge metaphor with reference to one school’s experience of a students as co-researchers project involving students and teachers in the school and a university partner. We use the bridge metaphor, inspired by the imagist poet Ezra Pound, to explore particular challenges faced in this project, and to envision new modes of teacher/student relationships in education. We argue that the purpose of building such a bridge between students and teachers is not an instrumental one (to reach the other side), but rather that the bridge offers up zones of affective relational encounters between students and teachers.