916 results for Compactification and String Models
Abstract:
SUMMARY A recent systematic review demonstrated that, overall, orthodontic treatment might result in a small worsening of periodontal status. The aim of this retrospective study was to test the hypothesis that a change of mandibular incisor inclination promotes development of labial gingival recessions. One hundred and seventy-nine subjects who met the following inclusion criteria were selected: age 11-14 years at the start of orthodontic treatment (TS), bonded retainer placed immediately after treatment (T₀), and dental casts and lateral cephalograms available pre-treatment (TS), post-treatment (T₀), 2 years post-treatment (T₂), and 5 years post-treatment (T₅). Depending on the change of lower incisor inclination during treatment (ΔInc_Incl), the sample was divided into three groups: Retro (N = 34; ΔInc_Incl ≤ -1 degree), Stable (N = 22; ΔInc_Incl > -1 degree and ≤1 degree), and Pro (N = 123; ΔInc_Incl > 1 degree). Clinical crown heights of mandibular incisors and the presence of gingival recessions in this region were assessed on plaster models. Fisher's exact tests, one-way analysis of variance, and regression models were used for analysis of inter-group differences. The mean increase in clinical crown height (T₀ to T₅) of the mandibular incisors ranged from 0.6 to 0.91 mm across the Retro, Stable, and Pro groups; the difference was not significant (P = 0.534). At T₅, gingival recessions were present in 8.8, 4.5, and 16.3 per cent of patients from the Retro, Stable, and Pro groups, respectively. The difference was not significant (P = 0.265). The change of lower incisor inclination during treatment did not affect the development of labial gingival recessions in this patient group.
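The group comparisons described above follow a standard pattern; a minimal Python sketch of that pattern is below. The counts are illustrative reconstructions from the reported percentages (Retro and Stable are pooled, since scipy's fisher_exact accepts only 2x2 tables) and the crown-height data are simulated, so this is not the study's raw data.

```python
# A minimal sketch of the analysis pattern described above, using scipy.
import numpy as np
from scipy.stats import fisher_exact, f_oneway

# Recession present/absent at T5; Retro and Stable pooled because
# scipy's fisher_exact accepts only 2x2 contingency tables.
table = np.array([[20, 103],   # Pro: ~16.3% of 123 subjects (illustrative)
                  [4,  52]])   # Retro + Stable pooled: ~3/34 and ~1/22
odds_ratio, p_recession = fisher_exact(table)

# Simulated clinical crown-height increases (mm, T0 to T5) per group.
rng = np.random.default_rng(0)
retro  = rng.normal(0.60, 0.40, 34)
stable = rng.normal(0.75, 0.40, 22)
pro    = rng.normal(0.91, 0.40, 123)
f_stat, p_height = f_oneway(retro, stable, pro)

print(f"Fisher exact p = {p_recession:.3f}; one-way ANOVA p = {p_height:.3f}")
```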
Abstract:
OBJECTIVES This study sought to validate the Logistic Clinical SYNTAX (Synergy Between Percutaneous Coronary Intervention With Taxus and Cardiac Surgery) score in patients with non-ST-segment elevation acute coronary syndromes (ACS), in order to further legitimize its clinical application. BACKGROUND The Logistic Clinical SYNTAX score allows for an individualized prediction of 1-year mortality in patients undergoing contemporary percutaneous coronary intervention. It is composed of a "Core" Model (anatomical SYNTAX score, age, creatinine clearance, and left ventricular ejection fraction) and an "Extended" Model (the Core Model plus 6 additional clinical variables), and has previously been cross-validated in 7 contemporary stent trials (>6,000 patients). METHODS One-year all-cause death was analyzed in 2,627 patients undergoing percutaneous coronary intervention from the ACUITY (Acute Catheterization and Urgent Intervention Triage Strategy) trial. Mortality predictions from the Core and Extended Models were studied with respect to discrimination, that is, separation of those with and without 1-year all-cause death (assessed by the concordance [C] statistic), and calibration, that is, agreement between observed and predicted outcomes (assessed with validation plots). Decision curve analyses, which weigh the harms (false positives) against the benefits (true positives) of using a risk score to make mortality predictions, were undertaken to assess clinical usefulness. RESULTS In the ACUITY trial, the median SYNTAX score was 9.0 (interquartile range 5.0 to 16.0); approximately 40% of patients had 3-vessel disease, 29% diabetes, and 85% underwent drug-eluting stent implantation. Validation plots confirmed agreement between observed and predicted mortality. The Core and Extended Models demonstrated substantial improvements in the discriminative ability for 1-year all-cause death compared with the anatomical SYNTAX score in isolation (C-statistics: SYNTAX score: 0.64, 95% confidence interval [CI]: 0.56 to 0.71; Core Model: 0.74, 95% CI: 0.66 to 0.79; Extended Model: 0.77, 95% CI: 0.70 to 0.83). Decision curve analyses confirmed the increasing ability to correctly identify patients who would die at 1 year with the Extended Model versus the Core Model versus the anatomical SYNTAX score, over a wide range of thresholds for mortality risk predictions. CONCLUSIONS Compared with the anatomical SYNTAX score alone, the Core and Extended Models of the Logistic Clinical SYNTAX score more accurately predicted individual 1-year mortality in patients presenting with non-ST-segment elevation acute coronary syndromes undergoing percutaneous coronary intervention. These findings support the clinical application of the Logistic Clinical SYNTAX score.
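For readers unfamiliar with the two validation metrics, here is a minimal sketch of the C-statistic and the decision-curve net benefit on simulated data; the event rate and score values are placeholders, not ACUITY data.

```python
# A minimal sketch of discrimination (C-statistic) and decision curve
# analysis (net benefit) on simulated outcomes and risk predictions.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
died = rng.binomial(1, 0.05, 2627)   # simulated 1-year all-cause death
pred = np.clip(0.05 + 0.10 * died + rng.normal(0, 0.05, 2627), 0.001, 0.999)

c_statistic = roc_auc_score(died, pred)   # discrimination

def net_benefit(y, p, threshold):
    """Decision-curve net benefit at a given mortality-risk threshold."""
    treat = p >= threshold
    tp = np.sum(treat & (y == 1))
    fp = np.sum(treat & (y == 0))
    n = len(y)
    return tp / n - (fp / n) * threshold / (1 - threshold)

print(f"C-statistic: {c_statistic:.2f}")
for t in (0.02, 0.05, 0.10):
    print(f"threshold {t:.2f}: net benefit = {net_benefit(died, pred, t):+.4f}")
```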
Abstract:
Prevention and treatment of osteoporosis rely on understanding of the micromechanical behaviour of bone and its influence on fracture toughness and cell-mediated adaptation processes. Postyield properties may be assessed by nonlinear finite element simulations of nanoindentation using elastoplastic and damage models. This computational study aims at determining the influence of yield surface shape and damage on the depth-dependent response of bone to nanoindentation using spherical and conical tips. Yield surface shape and damage were shown to have a major impact on the indentation curves. Their influence on indentation modulus, hardness, their ratio as well as the elastic-to-total work ratio is well described by multilinear regressions for both tip shapes. For conical tips, indentation depth was not statistically significant (p<0.0001). For spherical tips, damage was not a significant parameter (p<0.0001). The gained knowledge can be used for developing an inverse method for identification of postelastic properties of bone from nanoindentation.
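A minimal sketch of the kind of multilinear regression referred to above, fitted by ordinary least squares on simulated stand-ins for yield-surface shape, damage, and indentation depth; the variable names and coefficients are illustrative assumptions, not the study's parameterization.

```python
# A minimal multilinear-regression sketch: indentation output vs. material
# parameters, on simulated data (illustrative only).
import numpy as np

rng = np.random.default_rng(4)
n = 200
yield_shape = rng.uniform(0.2, 0.8, n)   # stand-in for yield surface shape
damage      = rng.uniform(0.0, 0.9, n)   # stand-in for damage variable
depth       = rng.uniform(0.1, 2.0, n)   # indentation depth (um)

# Simulated response, e.g. hardness, with assumed illustrative coefficients.
hardness = 0.5 + 1.2 * yield_shape - 0.8 * damage + rng.normal(0, 0.05, n)

X = np.column_stack([np.ones(n), yield_shape, damage, depth])
coef, *_ = np.linalg.lstsq(X, hardness, rcond=None)
print("fitted coefficients (intercept, shape, damage, depth):",
      np.round(coef, 3))
```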
Abstract:
Background: WHO's 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources in view of other competing priorities such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs; we quantified impact as disability-adjusted life years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers significant benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring without CD4 cell count every 6-12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility. Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, first at a CD4 cell count lower than 350 cells per μL, and then at a CD4 cell count lower than 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other factors reducing costs might make viral load monitoring more affordable in future. Funding: Bill & Melinda Gates Foundation, WHO.
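The comparison logic here is a standard incremental cost-effectiveness calculation; a minimal sketch follows, with placeholder costs and DALYs averted rather than outputs of the three models.

```python
# A minimal incremental cost-effectiveness (ICER) sketch. Costs and DALYs
# averted per patient are placeholder values; dominated strategies are
# assumed to have been removed already.
strategies = [
    # (name, cost, DALYs averted) -- ordered by increasing cost
    ("no monitoring",           0.0, 0.00),
    ("clinical monitoring",    40.0, 0.50),
    ("CD4 monitoring",         90.0, 0.60),
    ("viral load monitoring", 250.0, 0.65),
]

prev_name, prev_cost, prev_dalys = strategies[0]
for name, cost, dalys in strategies[1:]:
    icer = (cost - prev_cost) / (dalys - prev_dalys)  # cost per DALY averted
    print(f"{name} vs {prev_name}: ICER = {icer:.0f} per DALY averted")
    prev_name, prev_cost, prev_dalys = name, cost, dalys
```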
Abstract:
We discuss non-geometric supersymmetric heterotic string models in D=4, in the framework of the free fermionic construction. We perform a systematic scan of models with four a priori left-right asymmetric Z2 projections and shifts. We analyze some 220 models, identifying 18 inequivalent classes and addressing variants generated by discrete torsions. They do not contain geometrical or trivial neutral moduli, apart from the dilaton. However, we show the existence of flat directions in the form of exactly marginal deformations and identify patterns of symmetry breaking where product gauge groups, realized at level one, are broken to their diagonal at higher level. We also describe an “inverse Gepner map” from heterotic to Type II models that could be used, in certain non-geometric settings, to define “effective” topological invariants.
Abstract:
Low self-esteem and depression are strongly related, but there is not yet consistent evidence on the nature of the relation. Whereas the vulnerability model states that low self-esteem contributes to depression, the scar model states that depression erodes self-esteem. Furthermore, it is unknown whether the models are specific for depression or whether they are also valid for anxiety. We evaluated the vulnerability and scar models of low self-esteem and depression, and low self-esteem and anxiety, by meta-analyzing the available longitudinal data (covering 77 studies on depression and 18 studies on anxiety). The mean age of the samples ranged from childhood to old age. In the analyses, we used a random-effects model and examined prospective effects between the variables, controlling for prior levels of the predicted variables. For depression, the findings supported the vulnerability model: The effect of self-esteem on depression (β = -.16) was significantly stronger than the effect of depression on self-esteem (β = -.08). In contrast, the effects between low self-esteem and anxiety were relatively balanced: Self-esteem predicted anxiety with β = -.10, and anxiety predicted self-esteem with β = -.08. Moderator analyses were conducted for the effect of low self-esteem on depression; these suggested that the effect is not significantly influenced by gender, age, measures of self-esteem and depression, or time lag between assessments. If future research supports the hypothesized causality of the vulnerability effect of low self-esteem on depression, interventions aimed at increasing self-esteem might be useful in reducing the risk of depression.
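A minimal sketch of a DerSimonian-Laird random-effects pooling step, the kind of model named above; the per-study effects and standard errors are illustrative placeholders, not the meta-analysis data.

```python
# A minimal random-effects meta-analysis sketch (DerSimonian-Laird),
# pooling standardized prospective effects across studies.
import numpy as np

beta = np.array([-0.20, -0.14, -0.18, -0.10, -0.16])  # per-study effects
se   = np.array([0.05, 0.04, 0.06, 0.05, 0.03])       # standard errors

w_fixed = 1.0 / se**2
beta_fixed = np.sum(w_fixed * beta) / np.sum(w_fixed)

# Between-study heterogeneity (tau^2), DerSimonian-Laird estimator.
q = np.sum(w_fixed * (beta - beta_fixed) ** 2)
df = len(beta) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

w_random = 1.0 / (se**2 + tau2)
beta_random = np.sum(w_random * beta) / np.sum(w_random)
print(f"pooled beta (random effects) = {beta_random:.3f}, tau^2 = {tau2:.4f}")
```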
Abstract:
BACKGROUND: Research on comorbidity of psychiatric disorders identifies broad superordinate dimensions as the underlying structure of psychopathology. While a syndrome-level approach informs diagnostic systems, a symptom-level approach is more likely to represent the dimensional components within existing diagnostic categories. It may capture general emotional, cognitive or physiological processes as underlying liabilities of different disorders and thus further develop dimensional-spectrum models of psychopathology. METHODS: Exploratory and confirmatory factor analyses were used to examine the structure of psychopathological symptoms assessed with the Brief Symptom Inventory in two outpatient samples (n=3171), including several correlated-factors and bifactor models. The preferred models were correlated with DSM-diagnoses. RESULTS: A model containing eight correlated factors for depressed mood, phobic fear, aggression, suicidal ideation, nervous tension, somatic symptoms, information processing deficits, and interpersonal insecurity, as well as a bifactor model, fit the data best. Distinct patterns of correlations with DSM-diagnoses identified a) distress-related disorders, i.e., mood disorders, PTSD, and personality disorders, which were associated with all correlated factors as well as the underlying general distress factor; b) anxiety disorders with more specific patterns of correlations; and c) disorders defined by behavioural or somatic dysfunctions, which were characterised by non-significant or negative correlations with most factors. CONCLUSIONS: This study identified emotional, somatic, cognitive, and interpersonal components of psychopathology as transdiagnostic psychopathological liabilities. These components can contribute to a more accurate description and taxonomy of psychopathology, may serve as phenotypic constructs for further aetiological research, and can inform the development of tailored general and specific interventions to treat mental disorders.
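A minimal sketch of the exploratory step on simulated item-level data; sklearn's FactorAnalysis with varimax rotation stands in for the full EFA/CFA and bifactor workflow, which would normally use dedicated SEM software. Item counts and loadings are assumptions for illustration.

```python
# A minimal exploratory factor analysis sketch on simulated symptom data
# (stand-in for the 53-item Brief Symptom Inventory; 8 factors as above).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
n_subjects, n_items, n_factors = 3171, 53, 8
latent = rng.normal(size=(n_subjects, n_factors))
loadings = rng.normal(scale=0.6, size=(n_factors, n_items))
X = latent @ loadings + rng.normal(scale=0.5, size=(n_subjects, n_items))

fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
fa.fit(X)
print("loading matrix shape:", fa.components_.shape)   # (8, 53)
```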
Abstract:
Methane is an important greenhouse gas, responsible for about 20% of the warming induced by long-lived greenhouse gases since pre-industrial times. By reacting with hydroxyl radicals, methane reduces the oxidizing capacity of the atmosphere and generates ozone in the troposphere. Although most sources and sinks of methane have been identified, their relative contributions to atmospheric methane levels are highly uncertain. As such, the factors responsible for the observed stabilization of atmospheric methane levels in the early 2000s, and the renewed rise after 2006, remain unclear. Here, we construct decadal budgets for methane sources and sinks between 1980 and 2010, using a combination of atmospheric measurements and results from chemical transport models, ecosystem models, climate chemistry models and inventories of anthropogenic emissions. The resultant budgets suggest that data-driven approaches and ecosystem models overestimate total natural emissions. We build three contrasting emission scenarios, which differ in fossil fuel and microbial emissions, to explain the decadal variability in atmospheric methane levels detected, here and in previous studies, since 1985. Although uncertainties in emission trends do not allow definitive conclusions to be drawn, we show that the observed stabilization of methane levels between 1999 and 2006 can potentially be explained by decreasing-to-stable fossil fuel emissions, combined with stable-to-increasing microbial emissions. We show that a rise in natural wetland emissions and fossil fuel emissions probably accounts for the renewed increase in global methane levels after 2006, although the relative contribution of these two sources remains uncertain.
Abstract:
Scholars have increasingly theorized, and debated, the decision by states to create and delegate authority to international courts, as well as the subsequent autonomy and behavior of those courts, with principal–agent and trusteeship models disagreeing on the nature and extent of states’ influence on international judges. This article formulates and tests a set of principal–agent hypotheses about the ways in which, and the conditions under which, member states are able to use their powers of judicial nomination and appointment to influence the endogenous preferences of international judges. The empirical analysis surveys the record of all judicial appointments to the Appellate Body (AB) of the World Trade Organization over a 15-year period. We present a view of an AB appointment process that, far from representing a pure search for expertise, is deeply politicized and offers member-state principals opportunities to influence AB members ex ante and possibly ex post. We further demonstrate that the AB nomination process has become progressively more politicized over time as member states, responding to earlier and controversial AB decisions, became far more concerned about judicial activism and more interested in the substantive opinions of AB candidates, systematically championing candidates whose views on key issues most closely approached their own, and opposing candidates perceived to be activist or biased against their substantive preferences. Although specific to the WTO, our theory and findings have implications for the judicial politics of a large variety of global and regional international courts and tribunals.
Abstract:
Digital technologies have profoundly changed not only the ways we create, distribute, access, use and re-use information but also many of the governance structures we had in place. Overall, "older" institutions at all governance levels have grappled with, and often failed to master, the multi-faceted and multi-directional issues of the Internet. Regulatory entrepreneurs have yet to discover and fully mobilize the potential of digital technologies as an influential factor impacting upon the regulability of the environment and as a potential regulatory tool in themselves. At the same time, we have seen a deterioration of some public spaces and lower prioritization of public objectives when strong private commercial interests are at play, most tellingly in the field of copyright. Less tangibly, private ordering has taken hold and captured, through contracts, spaces previously regulated by public law. Code embedded in technology often replaces law. Non-state action has in general proliferated and put serious pressure upon conventional state-centered, command-and-control models. Under the conditions of this "messy" governance, the provision of key public goods, such as freedom of information, has been made difficult or is indeed jeopardized. The grand question is how we can navigate this complex multi-actor, multi-issue space and secure the attainment of fundamental public interest objectives. This is also the question that Ian Brown and Chris Marsden seek to answer with their book, Regulating Code, recently published in the "Information Revolution and Global Politics" series of MIT Press. This book review critically assesses the bold effort by Brown and Marsden.
Abstract:
Ice sheet thickness is determined mainly by the strength of ice-bed coupling that controls holistic transitions from slow sheet flow to fast stream flow to buttressing shelf flow. Byrd Glacier has the largest ice drainage system in Antarctica and is the fastest ice stream entering Ross Ice Shelf. In 2004 two large subglacial lakes at the head of Byrd Glacier suddenly drained and increased the terminal ice velocity of Byrd Glacier from 820 m yr⁻¹ to 900 m yr⁻¹. This resulted in partial ice-bed recoupling above the lakes and partial decoupling along Byrd Glacier. An attempt to quantify this behavior is made using flowband and flowline models in which the controlling variable for ice height above the bed is the floating fraction φ of ice along the flowband and flowline. Changes in φ before and after drainage are obtained from available data, but more reliable data in the map plane are required before Byrd Glacier can be modeled adequately. A holistic sliding velocity is derived that depends on φ, with contributions from ice shearing over coupled beds and ice stretching over uncoupled beds, as is done in state-of-the-art sliding theories.
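The abstract does not give the derived sliding law, but its stated structure suggests a φ-weighted blend of a bed-coupled shearing contribution and an uncoupled stretching contribution; the sketch below is an assumed illustrative form of that idea, not the paper's derivation.

```python
# An assumed, illustrative phi-weighted blend of the two velocity
# contributions described above (not the paper's derived formula).
def holistic_velocity(phi, u_shear, u_stretch):
    """Blend shearing over coupled beds with stretching over uncoupled beds,
    weighted by the floating fraction phi (0 = fully coupled, 1 = floating)."""
    return (1.0 - phi) * u_shear + phi * u_stretch

# Toy usage, with the reported terminal velocities (m/yr) as a scale:
print(holistic_velocity(0.4, 820.0, 900.0))
```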
Abstract:
OBJECTIVES: The aim of the study was to assess whether prospective follow-up data within the Swiss HIV Cohort Study can be used to predict patients who stop smoking, or, among smokers who stop, those who start smoking again. METHODS: We built prediction models first using clinical reasoning ('clinical models') and then by selecting from numerous candidate predictors using advanced statistical methods ('statistical models'). Our clinical models were based on literature suggesting that motivation drives smoking cessation, while dependence drives relapse in those attempting to stop. Our statistical models were based on automatic variable selection using additive logistic regression with component-wise gradient boosting. RESULTS: Of 4833 smokers, 26% stopped smoking, at least temporarily; among those who stopped, however, 48% started smoking again. The predictive performance of our clinical and statistical models was modest. A basic clinical model for cessation, with patients classified into three motivational groups, was nearly as discriminatory as a constrained statistical model with just the most important predictors (the ratio of nonsmoking visits to total visits, alcohol or drug dependence, psychiatric comorbidities, recent hospitalization and age). A basic clinical model for relapse, based on the maximum number of cigarettes per day prior to stopping, was not as discriminatory as a constrained statistical model with just the ratio of nonsmoking visits to total visits. CONCLUSIONS: Predicting smoking cessation and relapse is difficult, so simple models are nearly as discriminatory as complex ones. Patients with a history of attempting to stop and those known to have stopped recently are the best candidates for an intervention.
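A minimal sketch of additive logistic regression fitted by component-wise gradient boosting, the statistical-model approach named above. The data are simulated, and this bare-bones loop stands in for dedicated packages such as R's mboost; it illustrates the idea, not the study's code.

```python
# Component-wise gradient boosting for logistic regression: at each step,
# fit every predictor alone to the current gradient and update only the
# best-fitting one, which performs automatic variable selection.
import numpy as np

rng = np.random.default_rng(2)
n, p = 500, 10
X = rng.normal(size=(n, p))
logits = 1.2 * X[:, 0] - 0.8 * X[:, 3]          # only predictors 0 and 3 matter
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logits)))

coef = np.zeros(p)
intercept = np.log(y.mean() / (1.0 - y.mean()))
nu = 0.1                                        # shrinkage (step size)

for _ in range(200):
    eta = intercept + X @ coef
    grad = y - 1.0 / (1.0 + np.exp(-eta))       # negative gradient of log-loss
    slopes = (X * grad[:, None]).sum(axis=0) / (X**2).sum(axis=0)
    gains = slopes**2 * (X**2).sum(axis=0)      # squared-error reduction per predictor
    j = int(np.argmax(gains))
    coef[j] += nu * slopes[j]

print("selected predictors:", np.flatnonzero(np.abs(coef) > 0.05))
```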
An Early-Warning System for Hypo-/Hyperglycemic Events Based on Fusion of Adaptive Prediction Models
Abstract:
Introduction: Early warning of future hypoglycemic and hyperglycemic events can improve the safety of type 1 diabetes mellitus (T1DM) patients. The aim of this study is to design and evaluate a hypoglycemia/hyperglycemia early warning system (EWS) for T1DM patients under sensor-augmented pump (SAP) therapy. Methods: The EWS is based on the combination of data-driven online adaptive prediction models and a warning algorithm. Three modeling approaches were investigated: (i) autoregressive (ARX) models, (ii) autoregressive models with an output correction module (cARX), and (iii) recurrent neural network (RNN) models. The warning algorithm post-processes the models' outputs and issues alerts if upcoming hypoglycemic/hyperglycemic events are detected. Fusion of the cARX and RNN models, owing to their complementary prediction performances, resulted in the hybrid autoregressive with an output correction module/recurrent neural network (cARN)-based EWS. Results: The EWS was evaluated on 23 T1DM patients under SAP therapy. The ARX-based system achieved hypoglycemic (hyperglycemic) event prediction with median values of accuracy of 100.0% (100.0%), detection time of 10.0 (8.0) min, and daily false alarms of 0.7 (0.5). The respective values for the cARX-based system were 100.0% (100.0%), 17.5 (14.8) min, and 1.5 (1.3), and, for the RNN-based system, 100.0% (92.0%), 8.4 (7.0) min, and 0.1 (0.2). The hybrid cARN-based EWS outperformed both, with 100.0% (100.0%) prediction accuracy, detection 16.7 (14.7) min in advance, and 0.8 (0.8) daily false alarms. Conclusion: Combined use of the cARX and RNN models in an EWS outperformed the use of either model alone, achieving accurate and prompt event prediction with few false alarms, thus providing increased safety and comfort.
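A minimal sketch of one building block of such a system: an ARX-type glucose predictor iterated over a roughly 30-minute horizon, plus a threshold-based warning rule. The coefficients, thresholds, and horizon here are illustrative assumptions, not the published models.

```python
# An illustrative ARX-style predictor with a simple glycemic warning rule.
import numpy as np

HYPO_MGDL, HYPER_MGDL = 70.0, 180.0   # assumed alert thresholds (mg/dL)

def arx_predict(glucose_history, insulin_history, a, b):
    """One-step-ahead ARX prediction: y[t+1] = sum(a*y_past) + sum(b*u_past)."""
    return np.dot(a, glucose_history[-len(a):][::-1]) + \
           np.dot(b, insulin_history[-len(b):][::-1])

def warn(predicted_trajectory):
    """Issue an alert if any predicted sample crosses a glycemic threshold."""
    if np.any(predicted_trajectory < HYPO_MGDL):
        return "HYPO warning"
    if np.any(predicted_trajectory > HYPER_MGDL):
        return "HYPER warning"
    return None

# Toy usage: CGM samples every 5 min; iterate the model 6 steps (~30 min).
g = list(np.linspace(120, 85, 12))    # falling glucose trend (mg/dL)
u = [0.0] * 12                        # insulin input (arbitrary units)
a, b = np.array([1.8, -0.82]), np.array([-0.5])   # assumed coefficients
for _ in range(6):
    g.append(arx_predict(np.array(g), np.array(u), a, b))
    u.append(0.0)
print(warn(np.array(g[-6:])))         # -> "HYPO warning" for this trend
```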
Abstract:
The pregnane X receptor (PXR) has been postulated to play a role in the metabolism of α-tocopherol owing to the up-regulation of hepatic cytochrome P450 (P450) 3A in human cell lines and murine models after α-tocopherol treatment. However, in vivo studies confirming the role of PXR in α-tocopherol metabolism in humans present significant difficulties and have not been performed. PXR-humanized (hPXR), wild-type, and Pxr-null mouse models were used to determine whether α-tocopherol metabolism is influenced by species-specific differences in PXR function in vivo. No significant difference in the concentration of the major α-tocopherol metabolites was observed among the hPXR, wild-type, and Pxr-null mice through mass spectrometry-based metabolomics. Gene expression analysis revealed significantly increased expression of Cyp3a11 as well as several other P450s only in wild-type mice, suggesting species-specificity for α-tocopherol activation of PXR. A luciferase reporter assay confirmed activation of mouse PXR by α-tocopherol. Analysis of the Cyp2c family of genes revealed increased expression of Cyp2c29, Cyp2c37, and Cyp2c55 in wild-type, hPXR, and Pxr-null mice, which suggests PXR-independent induction of Cyp2c gene expression. This study revealed that α-tocopherol is a partial agonist of PXR and that PXR is necessary for Cyp3a induction by α-tocopherol. The implications of a novel role for α-tocopherol in Cyp2c gene regulation are also discussed.
Abstract:
Denmark and Switzerland are small and successful countries with exceptionally content populations. However, they have very different political institutions and economic models. They have followed the general tendency in the West toward economic convergence, but both countries have managed to stay on top. They both have a strong liberal tradition, but otherwise their economic strategies are a welfare state model for Denmark and a safe haven model for Switzerland. The Danish welfare state is tax-based, while the expenditures for social welfare are insurance-based in Switzerland. The political institutions are a multiparty unicameral system in Denmark, and a permanent coalition system with many referenda and strong local government in Switzerland. Both approaches have managed to ensure smoothly working political power-sharing and economic systems that allocate resources in a fairly efficient way. To date, they have also managed to adapt the economies to changes in the external environment with a combination of stability and flexibility.