911 results for Technicolor and Composite Models


Relevance:

100.00%

Publisher:

Abstract:

For half a century the integrated circuits (ICs) that make up the heart of electronic devices have been steadily improving by shrinking at an exponential rate. However, as the current crop of ICs gets smaller and the insulating layers involved become thinner, electrons leak through due to quantum mechanical tunneling. This is one of several issues that will bring an end to this incredible streak of exponential improvement of this type of transistor device, after which future improvements will have to come from employing fundamentally different transistor architectures rather than fine-tuning and miniaturizing the metal-oxide-semiconductor field-effect transistors (MOSFETs) in use today. Several new transistor designs, some designed and built here at Michigan Tech, involve electrons tunneling their way through arrays of nanoparticles. We use a multi-scale approach to model these devices and study their behavior. For investigating the tunneling characteristics of the individual junctions, we use a first-principles approach to model conduction between sub-nanometer gold particles. To estimate the change in energy due to the movement of individual electrons, we use the finite element method to calculate electrostatic capacitances. The kinetic Monte Carlo method allows us to use our knowledge of these details to simulate the dynamics of an entire device (sometimes consisting of hundreds of individual particles) and watch as a device ‘turns on’ and starts conducting an electric current. Scanning tunneling microscopy (STM) and the closely related scanning tunneling spectroscopy (STS) are a family of powerful experimental techniques that allow for the probing and imaging of surfaces and molecules at atomic resolution. However, interpretation of the results often requires comparison with theoretical and computational models. We have developed a new method for calculating STM topographs and STS spectra. This method combines an established method for approximating the geometric variation of the electronic density of states with a modern method for calculating spin-dependent tunneling currents, offering a unique balance between accuracy and accessibility.
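
The kinetic Monte Carlo step described in this abstract can be illustrated with a minimal Gillespie-style loop. This is a hedged sketch with hypothetical names; in the actual work the event rates would come from the first-principles junction model and the finite-element capacitances mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmc_step(rates, state, apply_event):
    """One kinetic Monte Carlo step over the candidate tunneling events.

    rates: non-negative rate of each candidate electron hop (1/s)
    state: current charge configuration of the nanoparticle array
    apply_event: function(state, event_index) -> new charge configuration
    Returns (new_state, dt), where dt is the stochastic waiting time.
    """
    rates = np.asarray(rates, float)
    total = rates.sum()
    if total == 0.0:
        raise RuntimeError("no tunneling event possible (Coulomb blockade)")
    # pick an event with probability proportional to its rate
    event = int(np.searchsorted(np.cumsum(rates), rng.random() * total))
    # exponentially distributed waiting time before that event occurs
    dt = -np.log(rng.random()) / total
    return apply_event(state, event), dt
```

Repeating this step and accumulating the charge transferred between the electrodes is what lets such a simulation trace out the current-voltage behaviour of a device.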

Relevance:

100.00%

Publisher:

Abstract:

Over the past several decades, it has become apparent that anthropogenic activities have resulted in the large-scale enhancement of the levels of many trace gases throughout the troposphere. More recently, attention has been given to the transport pathway taken by these emissions as they are dispersed throughout the atmosphere. The transport pathway determines the physical characteristics of emissions plumes and therefore plays an important role in the chemical transformations that can occur downwind of source regions. For example, the production of ozone (O3) is strongly dependent upon the transport that its precursors undergo. O3 can initially be formed within air masses while still over polluted source regions. These polluted air masses can experience continued O3 production or O3 destruction downwind, depending on the air mass's chemical and transport characteristics. At present, however, there are a number of uncertainties in the relationships between transport and O3 production in the North Atlantic lower free troposphere. The first phase of the study presented here used measurements made at the Pico Mountain observatory and model simulations to determine transport pathways for US emissions to the observatory. The Pico Mountain observatory was established in the summer of 2001 in order to address the need to understand the relationships between transport and O3 production. Measurements from the observatory were analyzed in conjunction with simulations from the Lagrangian particle dispersion model (LPDM) FLEXPART in order to determine the transport pathway for events observed at the Pico Mountain observatory during July 2003. A total of 16 events were observed, 4 of which were analyzed in detail. The transport time for these 16 events varied from 4.5 to 7 days, while the transport altitudes over the ocean ranged from 2 to 8 km but were typically less than 3 km. In three of the case studies, eastward advection and transport in a weak warm conveyor belt (WCB) airflow were responsible for the export of North American emissions into the FT, while transport in the FT was governed by easterly winds driven by the Azores/Bermuda High (ABH) and transient northerly lows. In the fourth case study, North American emissions were lofted to 6-8 km in a WCB before being entrained in the same cyclone's dry airstream and transported down to the observatory. The results of this study show that the lower marine FT may provide an important transport environment where O3 production may continue, in contrast to transport in the marine boundary layer, where O3 destruction is believed to dominate. The second phase of the study presented here focused on improving the analysis methods that are available with LPDMs. While LPDMs are popular and useful for the analysis of atmospheric trace gas measurements, identifying the transport pathway of emissions from their source to a receptor (the Pico Mountain observatory in our case) using the standard gridded model output can be difficult or impossible, particularly during complex meteorological scenarios. The transport study in phase 1 was limited to only 1 month out of more than 3 years of available data and included only 4 case studies out of the 16 events specifically because of this confounding factor.
The second phase of this study addressed this difficulty by presenting a method to clearly and easily identify the pathway taken by only those emissions that arrive at a receptor at a particular time, by combining the standard gridded output from forward (i.e., concentrations) and backward (i.e., residence time) LPDM simulations, greatly simplifying similar analyses. The ability of the method to successfully determine the source-to-receptor pathway, restoring this Lagrangian information that is lost when the data are gridded, is proven by comparing the pathway determined from this method with the particle trajectories from both the forward and backward models. A sample analysis is also presented, demonstrating that this method is more accurate and easier to use than existing methods using standard LPDM products. Finally, we discuss potential future work that would be possible by combining the backward LPDM simulation with gridded data from other sources (e.g., chemical transport models) to obtain a Lagrangian sampling of the air that will eventually arrive at a receptor.
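
A minimal numpy sketch of the idea behind the combined forward/backward method: overlap the forward-run concentration field with the backward-run residence-time field on a common grid, so that only the part of the plume that is en route to the receptor remains. The function and variable names are hypothetical, and the published method works on the standard FLEXPART gridded products and handles units and normalization more carefully.

```python
import numpy as np

def source_to_receptor_field(forward_conc, backward_restime):
    """Overlap a forward plume with a receptor's backward influence field.

    forward_conc: tracer concentration from the forward LPDM run
    backward_restime: residence time from the backward (receptor-based) run,
        given on the same longitude/latitude/altitude/time grid
    Returns a normalized field highlighting only the portion of the
    emissions plume that eventually arrives at the receptor.
    """
    overlap = np.asarray(forward_conc, float) * np.asarray(backward_restime, float)
    total = overlap.sum()
    return overlap / total if total > 0 else overlap
```

Plotted along the transport path, a field of this kind restores the source-to-receptor Lagrangian information that is otherwise lost when particle positions are gridded.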

Relevance:

100.00%

Publisher:

Abstract:

The Upper Devonian-Mississippian Bakken Formation in the Williston Basin is one of the most prolific onshore petroleum systems in the continental U.S., consisting of a middle carbonate-siliciclastic member sandwiched between two organic-rich units, the Lower and Upper Bakken shales. Dr. Egenhoff discusses the formation’s surprising departures from standard stratigraphic and depositional models, which contribute to its unique characteristics.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Estimates of the decrease in CD4+ cell counts in untreated patients with human immunodeficiency virus (HIV) infection are important for patient care and public health. We analyzed CD4+ cell count decreases in the Cape Town AIDS Cohort and the Swiss HIV Cohort Study. METHODS: We used mixed-effects models and joint models that allowed for the correlation between CD4+ cell count decreases and survival and stratified analyses by the initial cell count (50-199, 200-349, 350-499, and 500-750 cells/μL). Results are presented as the mean decrease in CD4+ cell count with 95% confidence intervals (CIs) during the first year after the initial CD4+ cell count. RESULTS: A total of 784 South African (629 nonwhite) and 2030 Swiss (218 nonwhite) patients with HIV infection contributed 13,388 CD4+ cell counts. Decreases in CD4+ cell count were steeper in white patients, patients with higher initial CD4+ cell counts, and older patients. Decreases ranged from a mean of 38 cells/μL (95% CI, 24-54 cells/μL) in nonwhite patients from the Swiss HIV Cohort Study 15-39 years of age with an initial CD4+ cell count of 200-349 cells/μL to a mean of 210 cells/μL (95% CI, 143-268 cells/μL) in white patients in the Cape Town AIDS Cohort ≥40 years of age with an initial CD4+ cell count of 500-750 cells/μL. CONCLUSIONS: Among both patients from Switzerland and patients from South Africa, CD4+ cell count decreases were greater in white patients with HIV infection than they were in nonwhite patients with HIV infection.
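
A minimal sketch of the kind of linear mixed-effects model used for the CD4+ decline, with a random intercept and slope per patient. The file and column names are hypothetical, and the joint modelling of decline and survival mentioned above is not reproduced here.

```python
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical long-format data: one row per CD4 measurement
df = pd.read_csv("cd4_counts.csv")   # columns assumed: patient_id, years_since_first_count, cd4

# random intercept and slope per patient; the fixed slope coefficient
# estimates the mean CD4 change per year of follow-up
model = smf.mixedlm("cd4 ~ years_since_first_count", df,
                    groups=df["patient_id"],
                    re_formula="~years_since_first_count")
result = model.fit()
print(result.summary())
```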

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: Hierarchical modeling has been proposed as a solution to the multiple exposure problem. We estimate associations between metabolic syndrome and different components of antiretroviral therapy using both conventional and hierarchical models. STUDY DESIGN AND SETTING: We use discrete-time survival analysis to estimate the association between metabolic syndrome and cumulative exposure to 16 antiretrovirals from four drug classes. We fit a hierarchical model where the drug class provides a prior model of the association between metabolic syndrome and exposure to each antiretroviral. RESULTS: One thousand two hundred and eighteen patients were followed for a median of 27 months, with 242 cases of metabolic syndrome (20%) at a rate of 7.5 cases per 100 patient-years. Metabolic syndrome was more likely to develop in patients exposed to stavudine, but was less likely to develop in those exposed to atazanavir. The estimate for exposure to atazanavir increased from a hazard ratio of 0.06 per 6 months' use in the conventional model to 0.37 in the hierarchical model (or from 0.57 to 0.81 when using spline-based covariate adjustment). CONCLUSION: These results are consistent with trials that show the disadvantage of stavudine and the advantage of atazanavir relative to other drugs in their respective classes. The hierarchical model gave more plausible results than the equivalent conventional model.
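
Stripped to its core, the hierarchical model pulls each drug's estimate toward its drug-class mean, shrinking imprecise estimates the most. A hedged numerical sketch follows; the effect sizes, standard errors and the between-drug standard deviation tau are illustrative, not the paper's.

```python
import numpy as np

def shrink_to_class(log_hr, se, tau=0.5):
    """Partial pooling of per-drug log hazard ratios toward their
    drug-class mean. `se` are drug-level standard errors and `tau` is
    the assumed between-drug standard deviation within a class.
    """
    log_hr, se = np.asarray(log_hr, float), np.asarray(se, float)
    class_mean = log_hr.mean()
    w = tau**2 / (tau**2 + se**2)          # shrinkage weight: 1 keeps the drug's own estimate
    return w * log_hr + (1 - w) * class_mean

# an extreme, imprecise estimate (e.g. HR 0.06, log HR ~ -2.8) is pulled
# strongly toward its class mean, mirroring the move toward HR 0.37
print(np.exp(shrink_to_class(np.log([0.06, 0.8, 1.1, 0.9]), [1.2, 0.3, 0.25, 0.3])))
```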

Relevance:

100.00%

Publisher:

Abstract:

High-resolution and highly precise age models for recent lake sediments (last 100–150 years) are essential for quantitative paleoclimate research. These are particularly important for sedimentological and geochemical proxies, where transfer functions cannot be established and calibration must be based upon the relation of sedimentary records to instrumental data. High-precision dating for the calibration period is most critical as it directly determines the quality of the calibration statistics. Here, as an example, we compare radionuclide age models obtained on two high-elevation glacial lakes in the Central Chilean Andes (Laguna Negra: 33°38′S/70°08′W, 2,680 m a.s.l. and Laguna El Ocho: 34°02′S/70°19′W, 3,250 m a.s.l.). We show the different numerical models that produce accurate age-depth chronologies based on 210Pb profiles, and we explain how to obtain reduced age-error bars at the bottom part of the profiles, i.e., typically around the end of the 19th century. In order to constrain the age models, we propose a method with the following steps: (i) sampling at irregularly-spaced intervals for 226Ra, 210Pb and 137Cs depending on the stratigraphy and microfacies, (ii) a systematic comparison of numerical models for the calculation of 210Pb-based age models: constant flux constant sedimentation (CFCS), constant initial concentration (CIC), constant rate of supply (CRS) and sediment isotope tomography (SIT), (iii) numerical constraining of the CRS and SIT models with the 137Cs chronomarker of AD 1964, and (iv) step-wise cross-validation with independent diagnostic environmental stratigraphic markers of known age (e.g., volcanic ash layers, historical floods and earthquakes). In both examples, we also use airborne pollutants such as spheroidal carbonaceous particles (reflecting the history of fossil fuel emissions), excess atmospheric Cu deposition (reflecting the production history of a large local Cu mine), and turbidites related to historical earthquakes. Our results show that the SIT model constrained with the 137Cs AD 1964 peak performs best over the entire chronological profile (last 100–150 years) and yields the smallest standard deviations for the sediment ages. Such precision is critical for the calibration statistics, and ultimately, for the quality of the quantitative paleoclimate reconstruction. The systematic comparison of CRS and SIT models also helps to validate the robustness of the chronologies in different sections of the profile. Although surprisingly poorly known and under-explored in paleolimnological research, the SIT model has great potential in paleoclimatological reconstructions based on lake sediments.
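
As an illustration of one of the numerical models named above, here is a hedged sketch of the constant rate of supply (CRS) calculation. Variable names are hypothetical, and the CFCS, CIC and SIT models, the 137Cs constraint and the error propagation are not reproduced.

```python
import numpy as np

PB210_DECAY = np.log(2) / 22.3        # 210Pb decay constant in 1/yr (half-life ~22.3 yr)

def crs_ages(excess_pb210, dry_mass):
    """Constant rate of supply (CRS) ages for the top of each sediment slice.

    excess_pb210: unsupported 210Pb activity of each slice (Bq/kg),
        ordered from the sediment surface downward
    dry_mass: dry mass per unit area of each slice (kg/m^2)
    Returns ages in years before coring; the surface is age 0.
    """
    inventory = np.asarray(excess_pb210, float) * np.asarray(dry_mass, float)
    below = inventory[::-1].cumsum()[::-1]        # 210Pb inventory below the top of each slice
    total = inventory.sum()
    return np.log(total / below) / PB210_DECAY    # t = (1/lambda) * ln(A(0) / A(z))
```

Age errors grow toward the bottom of the profile because the remaining 210Pb inventory becomes small, which is why the abstract stresses constraining the models with the 137Cs AD 1964 chronomarker and independent stratigraphic markers.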

Relevance:

100.00%

Publisher:

Abstract:

SUMMARY A recent systematic review demonstrated that, overall, orthodontic treatment might result in a small worsening of periodontal status. The aim of this retrospective study was to test the hypothesis that a change of mandibular incisor inclination promotes development of labial gingival recessions. One hundred and seventy-nine subjects who met the following inclusion criteria were selected: age 11-14 years at start of orthodontic treatment (TS), bonded retainer placed immediately after treatment (T₀), dental casts and lateral cephalograms available pre-treatment (TS), post-treatment (T₀), 2 years post-treatment (T₂), and 5 years post-treatment (T₅). Depending on the change of lower incisor inclination during treatment (ΔInc_Incl), the sample was divided into three groups: Retro (N = 34; ΔInc_Incl ≤ -1 degree), Stable (N = 22; ΔInc_Incl > -1 degree and ≤ 1 degree), and Pro (N = 123; ΔInc_Incl > 1 degree). Clinical crown heights of mandibular incisors and the presence of gingival recessions in this region were assessed on plaster models. Fisher's exact tests, one-way analysis of variance, and regression models were used for analysis of inter-group differences. The mean increase of clinical crown heights (T₀ to T₅) of mandibular incisors ranged from 0.6 to 0.91 mm across the Retro, Stable, and Pro groups; the difference was not significant (P = 0.534). At T₅, gingival recessions were present in 8.8, 4.5, and 16.3 per cent of patients from the Retro, Stable, and Pro groups, respectively. The difference was not significant (P = 0.265). The change of lower incisor inclination during treatment did not affect development of labial gingival recessions in this patient group.

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVES This study sought to validate the Logistic Clinical SYNTAX (Synergy Between Percutaneous Coronary Intervention With Taxus and Cardiac Surgery) score in patients with non-ST-segment elevation acute coronary syndromes (ACS), in order to further legitimize its clinical application. BACKGROUND The Logistic Clinical SYNTAX score allows for an individualized prediction of 1-year mortality in patients undergoing contemporary percutaneous coronary intervention. It is composed of a "Core" Model (anatomical SYNTAX score, age, creatinine clearance, and left ventricular ejection fraction) and an "Extended" Model (comprising an additional six clinical variables), and has previously been cross-validated in 7 contemporary stent trials (>6,000 patients). METHODS One-year all-cause death was analyzed in 2,627 patients undergoing percutaneous coronary intervention from the ACUITY (Acute Catheterization and Urgent Intervention Triage Strategy) trial. Mortality predictions from the Core and Extended Models were studied with respect to discrimination, that is, separation of those with and without 1-year all-cause death (assessed by the concordance [C] statistic), and calibration, that is, agreement between observed and predicted outcomes (assessed with validation plots). Decision curve analyses, which weigh the harms (false positives) against the benefits (true positives) of using a risk score to make mortality predictions, were undertaken to assess clinical usefulness. RESULTS In the ACUITY trial, the median SYNTAX score was 9.0 (interquartile range 5.0 to 16.0); approximately 40% of patients had 3-vessel disease, 29% diabetes, and 85% underwent drug-eluting stent implantation. Validation plots confirmed agreement between observed and predicted mortality. The Core and Extended Models demonstrated substantial improvements in the discriminative ability for 1-year all-cause death compared with the anatomical SYNTAX score in isolation (C-statistics: SYNTAX score: 0.64, 95% confidence interval [CI]: 0.56 to 0.71; Core Model: 0.74, 95% CI: 0.66 to 0.79; Extended Model: 0.77, 95% CI: 0.70 to 0.83). Decision curve analyses confirmed the increasing ability to correctly identify patients who would die at 1 year with the Extended Model versus the Core Model versus the anatomical SYNTAX score, over a wide range of thresholds for mortality risk predictions. CONCLUSIONS Compared with the anatomical SYNTAX score alone, the Core and Extended Models of the Logistic Clinical SYNTAX score more accurately predicted individual 1-year mortality in patients presenting with non-ST-segment elevation acute coronary syndromes undergoing percutaneous coronary intervention. These findings support the clinical application of the Logistic Clinical SYNTAX score.
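
For readers unfamiliar with the C statistic quoted above, a minimal sketch of how it can be computed for a binary 1-year outcome (the arrays are hypothetical, and the published analysis would also have to account for censoring):

```python
import numpy as np

def c_statistic(predicted_risk, died_within_1yr):
    """Concordance (C) statistic for a binary outcome: the probability
    that a randomly chosen patient who died received a higher predicted
    risk than a randomly chosen survivor (ties count as 1/2).
    """
    risk = np.asarray(predicted_risk, float)
    died = np.asarray(died_within_1yr, bool)
    cases, controls = risk[died], risk[~died]
    higher = (cases[:, None] > controls[None, :]).sum()
    ties = (cases[:, None] == controls[None, :]).sum()
    return (higher + 0.5 * ties) / (cases.size * controls.size)

# e.g. a value around 0.74 would correspond to the Core Model above
```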

Relevance:

100.00%

Publisher:

Abstract:

Prevention and treatment of osteoporosis rely on understanding of the micromechanical behaviour of bone and its influence on fracture toughness and cell-mediated adaptation processes. Postyield properties may be assessed by nonlinear finite element simulations of nanoindentation using elastoplastic and damage models. This computational study aims at determining the influence of yield surface shape and damage on the depth-dependent response of bone to nanoindentation using spherical and conical tips. Yield surface shape and damage were shown to have a major impact on the indentation curves. Their influence on indentation modulus, hardness, their ratio as well as the elastic-to-total work ratio is well described by multilinear regressions for both tip shapes. For conical tips, indentation depth was not statistically significant (p<0.0001). For spherical tips, damage was not a significant parameter (p<0.0001). The gained knowledge can be used for developing an inverse method for identification of postelastic properties of bone from nanoindentation.

Relevance:

100.00%

Publisher:

Abstract:

Background: WHO's 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources in view of other competing priorities such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs; we quantified impact as disability-adjusted life years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers significant benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring without CD4 cell count every 6—12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility. Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, firstly at CD4 cell count lower than 350 cells per μL, and then at a CD4 cell count lower than 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other factors reducing costs might make viral load monitoring more affordable in future. Funding: Bill & Melinda Gates Foundation, WHO.
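
The incremental cost-effectiveness analysis mentioned above boils down to the extra cost per additional DALY averted when moving from one monitoring strategy to another. A hedged sketch with made-up numbers, not the study's estimates:

```python
def icer(cost_new, dalys_new, cost_ref, dalys_ref):
    """Incremental cost-effectiveness ratio: extra cost per additional
    DALY averted when moving from a reference strategy to a new one."""
    return (cost_new - cost_ref) / (dalys_new - dalys_ref)

# Illustrative numbers only: viral load monitoring vs CD4 monitoring for a
# cohort, in USD spent and DALYs averted over the model horizon.
print(icer(cost_new=2_400_000, dalys_new=10_500,
           cost_ref=1_800_000, dalys_ref=10_200))   # -> 2000 USD per DALY averted
```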

Relevance:

100.00%

Publisher:

Abstract:

Low self-esteem and depression are strongly related, but there is not yet consistent evidence on the nature of the relation. Whereas the vulnerability model states that low self-esteem contributes to depression, the scar model states that depression erodes self-esteem. Furthermore, it is unknown whether the models are specific for depression or whether they are also valid for anxiety. We evaluated the vulnerability and scar models of low self-esteem and depression, and low self-esteem and anxiety, by meta-analyzing the available longitudinal data (covering 77 studies on depression and 18 studies on anxiety). The mean age of the samples ranged from childhood to old age. In the analyses, we used a random-effects model and examined prospective effects between the variables, controlling for prior levels of the predicted variables. For depression, the findings supported the vulnerability model: The effect of self-esteem on depression (β = -.16) was significantly stronger than the effect of depression on self-esteem (β = -.08). In contrast, the effects between low self-esteem and anxiety were relatively balanced: Self-esteem predicted anxiety with β = -.10, and anxiety predicted self-esteem with β = -.08. Moderator analyses were conducted for the effect of low self-esteem on depression; these suggested that the effect is not significantly influenced by gender, age, measures of self-esteem and depression, or time lag between assessments. If future research supports the hypothesized causality of the vulnerability effect of low self-esteem on depression, interventions aimed at increasing self-esteem might be useful in reducing the risk of depression.
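
A hedged sketch of the random-effects pooling step described above, using the standard DerSimonian-Laird estimator. The study-level effect sizes and standard errors are placeholders; the published analysis additionally controls for prior levels of the predicted variables and runs moderator analyses.

```python
import numpy as np

def random_effects_meta(effects, se):
    """DerSimonian-Laird random-effects pooling of study-level effects
    (e.g. cross-lagged betas of self-esteem predicting later depression).
    Returns the pooled effect and its standard error.
    """
    effects, se = np.asarray(effects, float), np.asarray(se, float)
    w = 1.0 / se**2                                  # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)           # Cochran's Q heterogeneity statistic
    tau2 = max(0.0, (q - (len(effects) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (se**2 + tau2)                    # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    return pooled, np.sqrt(1.0 / np.sum(w_star))

# e.g. pooling placeholder betas around -.16 for the self-esteem -> depression path
print(random_effects_meta([-0.20, -0.12, -0.18, -0.14], [0.04, 0.05, 0.03, 0.06]))
```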

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Research on comorbidity of psychiatric disorders identifies broad superordinate dimensions as the underlying structure of psychopathology. While a syndrome-level approach informs diagnostic systems, a symptom-level approach is more likely to represent the dimensional components within existing diagnostic categories. It may capture general emotional, cognitive or physiological processes as underlying liabilities of different disorders and thus further develop dimensional-spectrum models of psychopathology. METHODS: Exploratory and confirmatory factor analyses were used to examine the structure of psychopathological symptoms assessed with the Brief Symptom Inventory in two outpatient samples (n=3171), including several correlated-factors and bifactor models. The preferred models were correlated with DSM diagnoses. RESULTS: A model containing eight correlated factors for depressed mood, phobic fear, aggression, suicidal ideation, nervous tension, somatic symptoms, information processing deficits, and interpersonal insecurity, as well as a bifactor model, fit the data best. Distinct patterns of correlations with DSM diagnoses identified a) distress-related disorders, i.e., mood disorders, PTSD, and personality disorders, which were associated with all correlated factors as well as the underlying general distress factor; b) anxiety disorders with more specific patterns of correlations; and c) disorders defined by behavioural or somatic dysfunctions, which were characterised by non-significant or negative correlations with most factors. CONCLUSIONS: This study identified emotional, somatic, cognitive, and interpersonal components of psychopathology as transdiagnostic psychopathological liabilities. These components can contribute to a more accurate description and taxonomy of psychopathology, may serve as phenotypic constructs for further aetiological research, and can inform the development of tailored general and specific interventions to treat mental disorders.
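
A minimal sketch of the exploratory step described above, assuming item-level Brief Symptom Inventory scores in a CSV file (the file name is hypothetical; the correlated-factors and bifactor models in the study require dedicated confirmatory/SEM software):

```python
import pandas as pd
from sklearn.decomposition import FactorAnalysis

items = pd.read_csv("bsi_items.csv")              # rows: patients, columns: BSI items
fa = FactorAnalysis(n_components=8, rotation="varimax")
scores = fa.fit_transform(items.values)           # per-patient scores on the eight factors
loadings = pd.DataFrame(fa.components_.T, index=items.columns)
print(loadings.round(2))                          # inspect which items load on which factor
```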

Relevance:

100.00%

Publisher:

Abstract:

Methane is an important greenhouse gas, responsible for about 20% of the warming induced by long-lived greenhouse gases since pre-industrial times. By reacting with hydroxyl radicals, methane reduces the oxidizing capacity of the atmosphere and generates ozone in the troposphere. Although most sources and sinks of methane have been identified, their relative contributions to atmospheric methane levels are highly uncertain. As such, the factors responsible for the observed stabilization of atmospheric methane levels in the early 2000s, and the renewed rise after 2006, remain unclear. Here, we construct decadal budgets for methane sources and sinks between 1980 and 2010, using a combination of atmospheric measurements and results from chemical transport models, ecosystem models, climate chemistry models and inventories of anthropogenic emissions. The resultant budgets suggest that data-driven approaches and ecosystem models overestimate total natural emissions. We build three contrasting emission scenarios, which differ in fossil fuel and microbial emissions, to explain the decadal variability in atmospheric methane levels detected, here and in previous studies, since 1985. Although uncertainties in emission trends do not allow definitive conclusions to be drawn, we show that the observed stabilization of methane levels between 1999 and 2006 can potentially be explained by decreasing-to-stable fossil fuel emissions, combined with stable-to-increasing microbial emissions. We show that a rise in natural wetland emissions and fossil fuel emissions probably accounts for the renewed increase in global methane levels after 2006, although the relative contribution of these two sources remains uncertain.
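
The budget construction rests on a simple mass balance: the atmospheric growth rate must equal total sources minus total sinks. A back-of-the-envelope sketch with illustrative numbers (not the paper's estimates), assuming the commonly used conversion of roughly 2.78 Tg of CH4 per ppb of global mean mixing ratio:

```python
TG_PER_PPB = 2.78   # assumed conversion: Tg of CH4 per ppb of global mean mixing ratio

def implied_growth_ppb_per_yr(sources_tg_per_yr, sinks_tg_per_yr):
    """Methane growth rate implied by a given annual source/sink budget."""
    return (sources_tg_per_yr - sinks_tg_per_yr) / TG_PER_PPB

# e.g. 550 Tg/yr of emissions against 545 Tg/yr of sinks implies ~1.8 ppb/yr growth,
# while a balanced budget (sources == sinks) reproduces a plateau like 1999-2006
print(implied_growth_ppb_per_yr(550, 545))
```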

Relevance:

100.00%

Publisher:

Abstract:

Scholars have increasingly theorized, and debated, the decision by states to create and delegate authority to international courts, as well as the subsequent autonomy and behavior of those courts, with principal–agent and trusteeship models disagreeing on the nature and extent of states’ influence on international judges. This article formulates and tests a set of principal–agent hypotheses about the ways in which, and the conditions under which, member states are able to use their powers of judicial nomination and appointment to influence the endogenous preferences of international judges. The empirical analysis surveys the record of all judicial appointments to the Appellate Body (AB) of the World Trade Organization over a 15-year period. We present a view of an AB appointment process that, far from representing a pure search for expertise, is deeply politicized and offers member-state principals opportunities to influence AB members ex ante and possibly ex post. We further demonstrate that the AB nomination process has become progressively more politicized over time as member states, responding to earlier and controversial AB decisions, became far more concerned about judicial activism and more interested in the substantive opinions of AB candidates, systematically championing candidates whose views on key issues most closely approached their own, and opposing candidates perceived to be activist or biased against their substantive preferences. Although specific to the WTO, our theory and findings have implications for the judicial politics of a large variety of global and regional international courts and tribunals.

Relevance:

100.00%

Publisher:

Abstract:

Digital technologies have profoundly changed not only the ways we create, distribute, access, use and re-use information but also many of the governance structures we had in place. Overall, "older" institutions at all governance levels have grappled with, and often failed to master, the multi-faceted and multi-directional issues of the Internet. Regulatory entrepreneurs have yet to discover and fully mobilize the potential of digital technologies as an influential factor impacting upon the regulability of the environment and as a potential regulatory tool in themselves. At the same time, we have seen a deterioration of some public spaces and a lower prioritization of public objectives when strong private commercial interests are at play, such as most tellingly in the field of copyright. Less tangibly, private ordering has taken hold and captured, through contracts, spaces previously regulated by public law. Code embedded in technology often replaces law. Non-state action has in general proliferated and put serious pressure upon conventional state-centered, command-and-control models. Under the conditions of this "messy" governance, the provision of key public goods, such as freedom of information, has been made difficult or is indeed jeopardized. The grand question is how we can navigate this complex multi-actor, multi-issue space and secure the attainment of fundamental public interest objectives. This is also the question that Ian Brown and Chris Marsden seek to answer with their book, Regulating Code, recently published in the "Information Revolution and Global Politics" series of MIT Press. This book review critically assesses this bold effort by Brown and Marsden.