856 results for complexity


Relevance:

20.00%

Publisher:

Abstract:

Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several one-thousand-year-long, idealized 2× and 4× CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The response of surface air temperature is the linear sum of the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the pre-industrial portions of the last millennium simulations are used to assess historical model carbon–climate feedbacks. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.
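The additivity statement can be made concrete with a simple check: if each single-forcing run yields a temperature anomaly series relative to the control, additivity holds when the all-forcings response stays close to the sum of those series. Below is a minimal sketch of such a check; the array names, the use of annual-mean anomalies, and the placeholder numbers are assumptions for illustration, not part of the study's protocol.

    # Minimal sketch: test whether the all-forcings temperature response is
    # approximately the linear sum of the single-forcing responses.
    # Hypothetical inputs: annual-mean global SAT anomalies (K, relative to the
    # control run) for the all-forcings run and each of seven single-forcing runs.
    import numpy as np

    years = np.arange(850, 2006)                       # 850 CE to 2005 CE
    n = years.size
    rng = np.random.default_rng(0)

    # Placeholder data standing in for model output.
    single_forcing_anom = rng.normal(0.0, 0.05, size=(7, n))
    all_forcings_anom = single_forcing_anom.sum(axis=0) + rng.normal(0.0, 0.02, n)

    linear_sum = single_forcing_anom.sum(axis=0)
    residual = all_forcings_anom - linear_sum

    # A small RMS residual relative to the signal suggests near-linear additivity.
    rms_residual = np.sqrt(np.mean(residual**2))
    rms_signal = np.sqrt(np.mean(all_forcings_anom**2))
    print(f"RMS residual / RMS signal = {rms_residual / rms_signal:.2f}")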

Relevance:

20.00%

Publisher:

Abstract:

Myxobacteria are single-celled but social eubacterial predators. Upon starvation they build multicellular fruiting bodies using a developmental program that progressively changes the pattern of cell movement and the repertoire of genes expressed. Development terminates with spore differentiation and is coordinated by both diffusible and cell-bound signals. The growth and development of Myxococcus xanthus is regulated by the integration of multiple signals from outside the cells with physiological signals from within. A collection of M. xanthus cells behaves, in many respects, like a multicellular organism. For these reasons M. xanthus offers unparalleled access to a regulatory network that controls development and that organizes cell movement on surfaces. The genome of M. xanthus is large (9.14 Mb), considerably larger than those of the other sequenced delta-proteobacteria. We suggest that gene duplication and divergence were major contributors to genomic expansion from its progenitor. More than 1,500 duplications specific to the myxobacterial lineage were identified, representing >15% of the total genes. Genes were not duplicated at random; rather, genes for cell-cell signaling, small-molecule sensing, and integrative transcription control were amplified selectively. Families of genes encoding the production of secondary metabolites are overrepresented in the genome but may have been acquired by horizontal gene transfer and are likely to be important for predation.

Relevance:

20.00%

Publisher:

Abstract:

Intensity modulated radiation therapy (IMRT) is a technique that delivers a highly conformal dose distribution to a target volume while attempting to maximally spare the surrounding normal tissues. IMRT is a common treatment modality used for treating head and neck (H&N) cancers, and the presence of many critical structures in this region requires accurate treatment delivery. The Radiological Physics Center (RPC) acts as both a remote and on-site quality assurance agency that credentials institutions participating in clinical trials. To date, about 30% of all IMRT participants have failed the RPC’s remote audit using the IMRT H&N phantom. The purpose of this project is to evaluate possible causes of H&N IMRT delivery errors observed by the RPC, specifically IMRT treatment plan complexity and the use of improper dosimetry data from machines that were thought to be matched but in reality were not. Eight H&N IMRT plans with a range of complexity defined by total MU (1460-3466), number of segments (54-225), and modulation complexity scores (MCS) (0.181-0.609) were created in Pinnacle v.8m. These plans were delivered to the RPC’s H&N phantom on a single Varian Clinac. One of the IMRT plans (1851 MU, 88 segments, and MCS=0.469) was equivalent to the median H&N plan from 130 previous RPC H&N phantom irradiations. This median-complexity plan was also delivered on four matched Varian Clinac machines, and its dose distribution was calculated using a different 6 MV beam model. Radiochromic film and TLD within the phantom were used to analyze the dose profiles and absolute doses, respectively. The measured and calculated doses were compared to evaluate dosimetric accuracy. All deliveries met the RPC acceptance criteria of ±7% absolute dose difference and 4 mm distance-to-agreement (DTA). Additionally, gamma index analysis was performed for all deliveries using ±7%/4 mm and ±5%/3 mm criteria. Increasing the treatment plan complexity by varying the MU, the number of segments, or the MCS resulted in no clear trend toward an increase in dosimetric error as determined by the absolute dose difference, DTA, or gamma index. Varying the delivery machine as well as the beam model (a Clinac 6EX 6 MV beam model vs. a Clinac 21EX 6 MV model) also did not show any clear trend toward increased dosimetric error using the same criteria indicated above.
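The ±7%/4 mm and ±5%/3 mm criteria refer to the standard gamma index of Low et al., in which a measured point passes if some calculated point lies within a combined dose-difference/distance tolerance. The sketch below shows a simple one-dimensional, globally normalized version; the toy profiles and the choice of normalization dose are assumptions for illustration, not the RPC's actual analysis code.

    # Sketch of a 1D gamma-index evaluation (global dose normalization).
    # Assumed inputs: measured and calculated dose profiles on the same grid;
    # the 7%/4 mm tolerance mirrors the criteria mentioned above.
    import numpy as np

    def gamma_1d(dose_meas, dose_calc, positions_mm, dose_tol=0.07, dta_mm=4.0):
        """Return the gamma value at each measured point."""
        norm_dose = dose_calc.max()          # global normalization (assumption)
        gammas = np.empty_like(dose_meas)
        for i, (x_m, d_m) in enumerate(zip(positions_mm, dose_meas)):
            dist = (positions_mm - x_m) / dta_mm
            ddiff = (dose_calc - d_m) / (dose_tol * norm_dose)
            gammas[i] = np.sqrt(dist**2 + ddiff**2).min()
        return gammas

    # Toy example: a slightly shifted and scaled Gaussian "profile".
    x = np.linspace(-50, 50, 201)                      # mm
    measured = 2.0 * np.exp(-(x**2) / (2 * 15**2))     # Gy
    calculated = 2.05 * np.exp(-((x - 1.0)**2) / (2 * 15**2))

    g = gamma_1d(measured, calculated, x)
    print(f"pass rate at 7%/4 mm: {np.mean(g <= 1.0):.1%}")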

Relevance:

20.00%

Publisher:

Abstract:

The relationship between time in dreams and real time has intrigued scientists for centuries. The question of whether actions in dreams take the same time as in wakefulness can be tested using lucid dreams, in which the dreamer is able to mark time intervals with prearranged eye movements that can be objectively identified in EOG recordings. Previous research showed an equivalence of time for counting in lucid dreams and in wakefulness (LaBerge, 1985; Erlacher and Schredl, 2004), but Erlacher and Schredl (2004) found that performing squats required about 40% more time in lucid dreams than in the waking state. To find out whether task modality, task length, or task complexity results in prolonged times in lucid dreams, an experiment with three different conditions was conducted. In the first condition, five proficient lucid dreamers spent one to three non-consecutive nights in the sleep laboratory. Participants counted to 10, 20, and 30 in wakefulness and in their lucid dreams. Lucidity and task intervals were time stamped with left-right-left-right eye movements. The same procedure was used for the second condition, in which eight lucid dreamers had to walk 10, 20, or 30 steps. In the third condition, eight lucid dreamers performed a gymnastics routine, which in the waking state lasted the same time as walking 10 steps. Again, we found that performing a motor task in a lucid dream requires more time than in wakefulness. Longer durations in the dream state were present for all three tasks, but significant differences were found only for the tasks with motor activity (walking and gymnastics). However, no difference was found for relative times (no disproportionate time effects), and a more complex motor task did not result in further prolonged times. Longer durations in lucid dreams might be related to the lack of muscular feedback or slower neural processing during REM sleep. Future studies should explore factors that might be associated with prolonged durations.
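The comparison behind these results is a paired one: each participant provides a duration for the same task in wakefulness and in the lucid dream. A hedged sketch of such an analysis is shown below, using a non-parametric paired test because of the small sample sizes; the numbers are placeholders, not the study's data.

    # Paired comparison of task durations (seconds) in wakefulness vs. lucid
    # dreaming, in the spirit of the design described above. Values are
    # illustrative placeholders only.
    import numpy as np
    from scipy.stats import wilcoxon

    wake = np.array([11.8, 12.5, 10.9, 13.1, 12.0, 11.4, 12.7, 11.1])
    dream = np.array([14.9, 16.0, 13.8, 17.2, 15.1, 14.0, 16.3, 13.9])

    prolongation = (dream - wake) / wake
    stat, p = wilcoxon(dream, wake)

    print(f"median prolongation: {np.median(prolongation):.0%}")
    print(f"Wilcoxon signed-rank test: W = {stat:.1f}, p = {p:.3f}")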

Relevance:

20.00%

Publisher:

Abstract:

Species adapted to cold-climatic mountain environments are expected to face a high risk of range contractions, if not local extinctions, under climate change. Yet the populations of many endothermic species may not be primarily affected by physiological constraints, but rather indirectly by climate-induced changes in habitat characteristics. In mountain forests, where vertebrate species largely depend on vegetation composition and structure, deteriorating habitat suitability may thus be mitigated or even compensated by habitat management aimed at compositional and structural enhancement. We tested this possibility using four cold-adapted bird species with complementary habitat requirements as model organisms. Based on species data and environmental information collected in 300 1-km2 grid cells distributed across four mountain ranges in central Europe, we investigated (1) how species’ occurrence is explained by climate, landscape, and vegetation, (2) to what extent climate change and climate-induced vegetation changes will affect habitat suitability, and (3) whether these changes could be compensated by adaptive habitat management. Species presence was modelled as a function of climate, landscape, and vegetation variables under current climate; moreover, vegetation-climate relationships were assessed. The models were extrapolated to the climatic conditions of 2050, assuming the moderate IPCC scenario A1B, and changes in species’ occurrence probability were quantified. Finally, we assessed the maximum increase in occurrence probability that could be achieved by modifying one or multiple vegetation variables under altered climate conditions. Climate variables contributed significantly to explaining species occurrence, and expected climatic changes, as well as climate-induced vegetation trends, decreased the occurrence probability of all four species, particularly at the low-altitudinal margins of their distributions. These effects could be partly compensated by modifying single vegetation factors, but full compensation would only be achieved if several factors were changed in concert. The results illustrate the possibilities and limitations of adaptive species conservation management under climate change.
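In practice this kind of analysis amounts to fitting a presence/absence model on climate, landscape, and vegetation predictors, projecting it under altered climate, and then searching over the manageable vegetation variables for the largest attainable gain in occurrence probability. The sketch below illustrates the idea with a logistic regression; the variable names, the single vegetation predictor, and the data are assumptions for illustration, not the study's models.

    # Illustrative sketch of the habitat-model logic described above:
    # fit occurrence ~ climate + vegetation, project to a warmer climate,
    # then ask how much a management-driven change in one vegetation variable
    # could raise the occurrence probability. Data and names are invented.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 300                                           # e.g. 1-km2 grid cells
    temp = rng.normal(6.0, 2.0, n)                    # mean temperature (deg C)
    canopy = rng.uniform(0.2, 0.9, n)                 # canopy openness (fraction)
    logit = 2.0 - 0.8 * temp + 3.0 * canopy
    presence = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    X = np.column_stack([temp, canopy])
    model = LogisticRegression().fit(X, presence)

    # Project to a warmer climate (+2 deg C) with unchanged vegetation ...
    p_future = model.predict_proba(np.column_stack([temp + 2.0, canopy]))[:, 1]
    # ... and with vegetation enhanced toward the most favourable observed value.
    p_managed = model.predict_proba(np.column_stack([temp + 2.0, np.full(n, canopy.max())]))[:, 1]

    print(f"mean occurrence probability, future climate:  {p_future.mean():.2f}")
    print(f"mean occurrence probability, plus management: {p_managed.mean():.2f}")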

Relevance:

20.00%

Publisher:

Abstract:

Complexity has long been recognized and is increasingly becoming mainstream in geomorphology. However, the relative novelty of various concepts and techniques associated with it means that ambiguity continues to surround complexity. In this commentary, we present and discuss a variety of recent contributions that have the potential to help clarify issues and advance the use of complexity in geomorphology.

Relevance:

20.00%

Publisher:

Abstract:

We present applicative theories of words corresponding to weak, and especially logarithmic, complexity classes. The theories for the logarithmic hierarchy and alternating logarithmic time formalise function algebras with concatenation recursion as their main principle. We present two theories for logarithmic space: the first formalises a new two-sorted algebra that is very similar to Cook and Bellantoni's famous two-sorted algebra B for polynomial time [4]; the second describes logarithmic space by formalising concatenation recursion and sharply bounded recursion. All theories contain the predicates W, representing words, and V, representing temporarily inaccessible words. They are inspired by Cantini's theories [6] formalising B.
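For orientation, the two recursion principles named above have the following standard shape in function algebras over binary words (written here in the style of Clote's algebras, with * denoting word concatenation; the exact formulation in the theories themselves may differ). Concatenation recursion on notation may append at most one bit per step, while sharply bounded recursion keeps all intermediate values below a length bound and hence of logarithmic size.

    % Concatenation recursion on notation: one appended bit per step,
    % so |f(y, x)| <= |g(x)| + |y|.
    \begin{align*}
      f(\epsilon, \vec{x}) &= g(\vec{x}),\\
      f(yi, \vec{x})       &= f(y, \vec{x}) * h_i(y, \vec{x}),
        \qquad i \in \{0,1\},\ |h_i(y, \vec{x})| \le 1.
    \end{align*}

    % Sharply bounded recursion: recursion on notation whose values stay
    % below a length bound |t|, hence of logarithmic size.
    \begin{align*}
      f(\epsilon, \vec{x}) &= g(\vec{x}),\\
      f(yi, \vec{x})       &= h_i(y, \vec{x}, f(y, \vec{x})),
        \qquad f(y, \vec{x}) \le |t(y, \vec{x})|.
    \end{align*}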

Relevance:

20.00%

Publisher:

Abstract:

CONTEXT Complex steroid disorders such as P450 oxidoreductase deficiency or apparent cortisone reductase deficiency may be recognized by steroid profiling using chromatographic mass spectrometric methods. These methods are highly specific and sensitive, and provide a complete spectrum of steroid metabolites in a single measurement of one sample, which makes them superior to immunoassays. The steroid metabolome during the fetal-neonatal transition is characterized by a) the metabolites of the fetal-placental unit at birth, b) the fetal adrenal androgens until involution of the fetal adrenal zone 3-6 months postnatally, and c) the steroid metabolites produced by the developing endocrine organs. All these developmental events change the steroid metabolome in an age- and sex-dependent manner during the first year of life. OBJECTIVE The aim of this study was to provide normative values for the urinary steroid metabolome of healthy newborns at short time intervals in the first year of life. METHODS We conducted a prospective, longitudinal study to measure 67 urinary steroid metabolites in 21 male and 22 female term healthy newborn infants at 13 time-points from week 1 to week 49 of life. Urine samples were collected from newborn infants before discharge from hospital and from healthy infants at home. Steroid metabolites were measured by gas chromatography-mass spectrometry (GC-MS), and steroid concentrations corrected for urinary creatinine excretion were calculated. RESULTS 61 steroids showed age specificity and 15 steroids showed sex specificity. The highest urinary steroid concentrations were found in both sexes for progesterone derivatives, in particular 20α-DH-5α-DH-progesterone, and for highly polar 6α-hydroxylated glucocorticoids. The steroids peaked at week 3 and decreased by ∼80% by week 25 in both sexes. The decline of progestins, androgens, and estrogens was more pronounced than that of glucocorticoids, whereas the excretion of corticosterone and its metabolites and of mineralocorticoids remained constant during the first year of life. CONCLUSION The urinary steroid profile changes dramatically during the first year of life and correlates with the physiologic developmental changes during the fetal-neonatal transition. Thus, detailed normative data for this period permit the use of steroid profiling as a powerful diagnostic tool.
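Creatinine correction is a simple ratio: each steroid concentration is divided by the urinary creatinine concentration of the same spot sample, giving an excretion value per unit creatinine that is comparable across urine dilutions. A minimal sketch with invented numbers and assumed units (µg/L for steroids, mmol/L for creatinine):

    # Creatinine-corrected steroid excretion from a spot urine sample.
    # Units and values are assumptions for illustration, not reference data.
    steroid_ug_per_L = {"20a-DH-5a-DH-progesterone": 850.0, "6a-OH-cortisol": 420.0}
    creatinine_mmol_per_L = 1.6

    corrected = {name: conc / creatinine_mmol_per_L
                 for name, conc in steroid_ug_per_L.items()}

    for name, value in corrected.items():
        print(f"{name}: {value:.0f} ug/mmol creatinine")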

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present the evaluation design for a complex multilevel program recently introduced in Switzerland. The evaluation embraces the federal level, the cantonal program level, and the project level where target groups are directly addressed. We employ Pawson and Tilley’s realist evaluation approach in order to do justice to the varying context factors that affect the cantonal programs, leading to varying effectiveness of the implemented activities. The application of the model to the canton of Uri shows that the numerous vertical and horizontal relations play a crucial role for the program’s effectiveness. As a general lesson for the evaluation of complex programs, we conclude that there is a need to consider all affected levels of a program and that no monocausal effects can be singled out in programs where multiple interventions address the same problem. Moreover, considering all affected levels of a program can mean going beyond the boundaries of the actual program organization and including factors that do not directly interfere with the policy delivery as such. In particular, we found that the relationship between the cantonal and the federal level was a crucial organizational factor influencing the effectiveness of the cantonal program.

Relevance:

20.00%

Publisher:

Abstract:

Currently several thousand objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). In this context both the correct associations among the observations and the orbits of the objects have to be determined. The complexity of the MTT problem is defined by its dimension S, where S stands for the number of 'fences' used in the problem; each fence consists of a set of observations that all originate from different targets. For a dimension of S > 2 the MTT problem becomes NP-hard. As of now no algorithm exists that can solve an NP-hard problem in an optimal manner within a reasonable (polynomial) computation time. However, there are algorithms that can approximate the solution with a realistic computational effort. To this end an Elitist Genetic Algorithm is implemented to approximately solve the S > 2 MTT problem in an efficient manner. Its complexity is studied and it is found that an approximate solution can be obtained in polynomial time. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the correlation and orbit determination problems simultaneously, and is able to efficiently process large data sets with minimal manual intervention.
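A minimal version of an elitist genetic algorithm for this kind of data-association problem is sketched below. A candidate solution assigns each observation in the first fence to one observation in each remaining fence (a tuple of permutations), the fitness is a summed association cost, and elitism copies the best individuals unchanged into the next generation. The random cost tensor, problem size, and GA parameters are placeholders, not those of the study.

    # Sketch of an elitist genetic algorithm for an S-dimensional assignment
    # (multiple-target-tracking association) problem. Illustrative only.
    import numpy as np

    rng = np.random.default_rng(42)
    S, N = 3, 12                                  # fences and observations per fence
    cost = rng.random((N,) * S)                   # cost[i, j, k] = association cost

    def fitness(ind):
        """Lower total association cost = fitter. ind holds S-1 permutations."""
        return -sum(cost[i, ind[0][i], ind[1][i]] for i in range(N))

    def random_individual():
        return [rng.permutation(N) for _ in range(S - 1)]

    def crossover(a, b):
        """Order crossover (OX1) applied fence by fence, preserving permutations."""
        child = []
        for pa, pb in zip(a, b):
            lo, hi = sorted(rng.choice(N, 2, replace=False))
            segment = list(pa[lo:hi])
            rest = [g for g in pb if g not in segment]
            child.append(np.array(rest[:lo] + segment + rest[lo:]))
        return child

    def mutate(ind, rate=0.2):
        for p in ind:
            if rng.random() < rate:
                i, j = rng.choice(N, 2, replace=False)
                p[i], p[j] = p[j], p[i]           # swap two assignments
        return ind

    pop = [random_individual() for _ in range(60)]
    for generation in range(200):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:6]                           # elitism: best individuals survive
        offspring = []
        while len(offspring) < len(pop) - len(elite):
            a, b = (pop[rng.integers(0, 30)] for _ in range(2))   # truncation selection
            offspring.append(mutate(crossover(a, b)))
        pop = elite + offspring

    print("best total cost:", -fitness(max(pop, key=fitness)))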

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES The purpose of this study was to compare the 2-year safety and effectiveness of new- versus early-generation drug-eluting stents (DES) according to the severity of coronary artery disease (CAD) as assessed by the SYNTAX (Synergy between Percutaneous Coronary Intervention with Taxus and Cardiac Surgery) score. BACKGROUND New-generation DES are considered the standard of care in patients with CAD undergoing percutaneous coronary intervention. However, there are few data investigating the effects of new- over early-generation DES according to the anatomic complexity of CAD. METHODS Patient-level data from 4 contemporary, all-comers trials were pooled. The primary device-oriented clinical endpoint was the composite of cardiac death, myocardial infarction, or ischemia-driven target-lesion revascularization (TLR). The principal effectiveness and safety endpoints were TLR and definite stent thrombosis (ST), respectively. Adjusted hazard ratios (HRs) with 95% confidence intervals (CIs) were calculated at 2 years for overall comparisons, as well as stratified for patients with lower (SYNTAX score ≤11) and higher complexity (SYNTAX score >11). RESULTS A total of 6,081 patients were included in the study. New-generation DES (n = 4,554) compared with early-generation DES (n = 1,527) reduced the primary endpoint (HR: 0.75 [95% CI: 0.63 to 0.89]; p = 0.001) without interaction (p = 0.219) between patients with lower (HR: 0.86 [95% CI: 0.64 to 1.16]; p = 0.322) versus higher CAD complexity (HR: 0.68 [95% CI: 0.54 to 0.85]; p = 0.001). In patients with SYNTAX score >11, new-generation DES significantly reduced TLR (HR: 0.36 [95% CI: 0.26 to 0.51]; p < 0.001) and definite ST (HR: 0.28 [95% CI: 0.15 to 0.55]; p < 0.001) to a greater extent than in the low-complexity group (TLR: p for interaction = 0.059; ST: p for interaction = 0.013). New-generation DES decreased the risk of cardiac mortality in patients with SYNTAX score >11 (HR: 0.45 [95% CI: 0.27 to 0.76]; p = 0.003) but not in patients with SYNTAX score ≤11 (p for interaction = 0.042). CONCLUSIONS New-generation DES improve clinical outcomes compared with early-generation DES, with greater safety and effectiveness in patients with SYNTAX score >11.
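The stratified hazard ratios and interaction p-values reported here come from time-to-event models in which the treatment effect is allowed to differ across SYNTAX strata. A hedged sketch of how such an interaction term can be fitted is shown below using the lifelines library; the synthetic data, column names, and effect sizes are invented for illustration, and the model is simplified (no adjustment covariates).

    # Sketch: Cox proportional-hazards model with a treatment-by-SYNTAX-stratum
    # interaction, in the spirit of the stratified analysis described above.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(7)
    n = 6081
    new_des = rng.integers(0, 2, n)                 # 1 = new-generation DES
    high_syntax = rng.integers(0, 2, n)             # 1 = SYNTAX score > 11

    # Simulate event times with a stronger benefit of new DES in the high-SYNTAX group.
    log_hr = -0.15 * new_des - 0.25 * new_des * high_syntax + 0.4 * high_syntax
    time = rng.exponential(scale=2.0 / np.exp(log_hr))
    event = (time < 2.0).astype(int)                # administrative censoring at 2 years
    time = np.minimum(time, 2.0)

    df = pd.DataFrame({"time": time, "event": event, "new_des": new_des,
                       "high_syntax": high_syntax,
                       "des_x_syntax": new_des * high_syntax})

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    cph.print_summary()                             # HRs for treatment, stratum, interaction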