28 results for Intergenerational resource allocation


Relevance:

80.00%

Publisher:

Abstract:

This paper summarizes the main contributions of the PhD thesis published in [1]. The research contributions of the thesis are driven by the question of how to design simple, yet efficient and robust, run-time adaptive resource allocation schemes within the communication stack of Wireless Sensor Network (WSN) nodes. The thesis addresses several problem domains, with contributions on different layers of the WSN communication stack. The main contributions can be summarized as follows: First, a novel run-time adaptive MAC protocol is introduced, which stepwise allocates the power-hungry radio interface in an on-demand manner when the encountered traffic load requires it. Second, the thesis outlines a methodology for robust, reliable and accurate software-based energy estimation, calculated at run-time on the sensor node itself. Third, the thesis evaluates several Forward Error Correction (FEC) strategies for adaptively allocating the correctional power of Error Correcting Codes (ECCs) to cope with temporally and spatially variable bit error rates. Fourth, in the context of TCP-based communication in WSNs, the thesis evaluates distributed caching and local retransmission strategies to overcome the performance-degrading effects of packet corruption and transmission failures when transmitting data over multiple hops. All developed protocols are evaluated on a self-developed real-world WSN testbed and achieve superior performance over selected existing approaches, especially where traffic load and channel conditions are subject to rapid variations over time.
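The stepwise, on-demand allocation of the radio interface described above can be illustrated with a small sketch. Everything here (function name, queue watermarks, step size) is an illustrative assumption, not the protocol designed in the thesis:

```python
# Hypothetical traffic-adaptive duty-cycle controller: the radio's duty cycle
# is stepped up when the transmit queue grows and stepped down as it drains.
def adapt_duty_cycle(current_duty, queue_len, high_wm=8, low_wm=2,
                     step=0.05, min_duty=0.01, max_duty=1.0):
    """Return the new radio duty cycle given the observed TX queue length."""
    if queue_len >= high_wm:
        current_duty = min(max_duty, current_duty + step)   # demand rising
    elif queue_len <= low_wm:
        current_duty = max(min_duty, current_duty - step)   # demand falling
    return current_duty

duty = 0.10
for q in [0, 1, 9, 12, 10, 3, 1, 0]:   # queue lengths sampled per interval
    duty = adapt_duty_cycle(duty, q)
```

The point of such a scheme is that the power-hungry radio is only kept on in proportion to the encountered traffic load, rather than at a fixed duty cycle.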

Relevance:

80.00%

Publisher:

Abstract:

Background. Few studies consider the incidence of individual AIDS-defining illnesses (ADIs) at higher CD4 counts, which is relevant on a population level for monitoring and resource allocation. Methods. Individuals from the Collaboration of Observational HIV Epidemiological Research Europe (COHERE) aged ≥14 years with ≥1 CD4 count of ≥200 cells/µL between 1998 and 2010 were included. Incidence rates (per 1000 person-years of follow-up [PYFU]) were calculated for each ADI within different CD4 strata; Poisson regression, using generalized estimating equations and robust standard errors, was used to model rates of ADIs with current CD4 ≥500 cells/µL. Results. A total of 12 135 ADIs occurred at a CD4 count of ≥200 cells/µL among 207 539 persons with 1 154 803 PYFU. Incidence rates declined from 20.5 per 1000 PYFU (95% confidence interval [CI], 20.0–21.1) with current CD4 200–349 cells/µL to 4.1 per 1000 PYFU (95% CI, 3.6–4.6) with current CD4 ≥1000 cells/µL. Persons with a current CD4 of 500–749 cells/µL had a significantly higher rate of ADIs (adjusted incidence rate ratio [aIRR], 1.20; 95% CI, 1.10–1.32), whereas those with a current CD4 of ≥1000 cells/µL had a similar rate (aIRR, 0.92; 95% CI, 0.79–1.07), compared to a current CD4 of 750–999 cells/µL. Results were consistent in persons with high or low viral load. Findings were stronger for malignant ADIs (aIRR, 1.52; 95% CI, 1.25–1.86) than for nonmalignant ADIs (aIRR, 1.12; 95% CI, 1.01–1.25), comparing persons with a current CD4 of 500–749 cells/µL to those with 750–999 cells/µL. Discussion. The incidence of ADIs was higher in individuals with a current CD4 count of 500–749 cells/µL than in those with 750–999 cells/µL, but did not decrease further at higher CD4 counts. Results were similar in patients virologically suppressed on combination antiretroviral therapy, suggesting that immune reconstitution is not complete until the CD4 count increases to >750 cells/µL.
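The crude rates quoted above (events per 1000 PYFU with a 95% CI) follow from simple arithmetic on the event and follow-up totals. This sketch covers only the crude-rate step; the paper's actual modelling uses Poisson regression with generalized estimating equations and robust standard errors:

```python
import math

def incidence_rate_per_1000(events, pyfu):
    """Crude incidence rate per 1000 person-years of follow-up (PYFU)
    with a normal-approximation 95% confidence interval."""
    rate = events / pyfu * 1000
    se = math.sqrt(events) / pyfu * 1000   # Poisson SE of the count, rescaled
    return rate, rate - 1.96 * se, rate + 1.96 * se

# Overall crude rate from the abstract's totals: 12 135 ADIs over 1 154 803 PYFU
rate, lo, hi = incidence_rate_per_1000(12135, 1154803)
```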

Relevance:

80.00%

Publisher:

Abstract:

ABSTRACT: Fourier transform infrared spectroscopy (FTIRS) can provide detailed information on organic and minerogenic constituents of sediment records. Based on a large number of sediment samples of varying age (0–340 000 yrs) and from very diverse lake settings in Antarctica, Argentina, Canada, Macedonia/Albania, Siberia, and Sweden, we have developed universally applicable calibration models for the quantitative determination of biogenic silica (BSi; n = 816), total inorganic carbon (TIC; n = 879), and total organic carbon (TOC; n = 3164) using FTIRS. These models are based on the differential absorbance of infrared radiation at specific wavelengths as concentrations of the individual parameters vary, owing to the molecular vibrations associated with each parameter. The calibration models have low prediction errors, and the predicted values are highly correlated with conventionally measured values (R = 0.94–0.99). Robustness tests indicate that the accuracy of the newly developed FTIRS calibration models is similar to that of conventional geochemical analyses. Consequently, FTIRS offers a useful and rapid alternative to conventional analyses for the quantitative determination of BSi, TIC, and TOC. The rapidity, cost-effectiveness, and small sample size required enable FTIRS determination of geochemical properties at higher resolutions than would otherwise be possible with the same resource allocation, thus providing crucial sedimentological information for climatic and environmental reconstructions.
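The core idea of such a calibration model, regressing a conventionally measured property on absorbances at selected wavenumbers, can be sketched on synthetic data. The published models are multivariate calibrations built from hundreds to thousands of real samples; the loadings and data below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
true_coef = np.array([2.0, -1.0, 0.5])          # assumed spectral loadings
X = rng.random((50, 3))                         # absorbance at 3 wavenumbers
y = X @ true_coef + rng.normal(0.0, 0.01, 50)   # "conventionally measured" TOC

# Fit the calibration by least squares and check predicted vs. measured values
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
r = np.corrcoef(y, pred)[0, 1]                  # cf. the reported R = 0.94-0.99
```

Once calibrated, predicting a property for a new sample is a single matrix product, which is what makes FTIRS so much faster than wet-chemical analysis.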

Relevance:

80.00%

Publisher:

Abstract:

Identifying drivers of species diversity is a major challenge in understanding and predicting the dynamics of species-rich semi-natural grasslands. In temperate grasslands in particular, changes in land use and their consequences, i.e. increasing fragmentation, the ongoing loss of habitat and the declining importance of regional processes such as seed dispersal by livestock, are considered key drivers of the diversity loss witnessed within the last decades. To what degree current temperate grassland communities already reflect a decline of regional processes such as longer-distance seed dispersal remains largely unresolved. Answering this question is challenging, since it requires both a mechanistic approach to community dynamics and a data basis sufficient for identifying general patterns. Here, we present results of a local individual- and trait-based community model that was initialized with plant functional types (PFTs) derived from an extensive empirical data set of species-rich grasslands within the 'Biodiversity Exploratories' in Germany. Driving model processes included above- and belowground competition, dynamic resource allocation to shoots and roots, clonal growth, grazing, and local seed dispersal. To test for the impact of regional processes, we also simulated seed input from a regional species pool. Model output, with and without regional seed input, was compared with empirical community response patterns along a grazing gradient. Simulated response patterns of changes in PFT richness, Shannon diversity, and biomass production matched observed grazing response patterns surprisingly well when only local processes were considered. Even low levels of additional regional seed input led to stronger deviations from the empirical community patterns.
While these findings cannot rule out that regional processes other than those considered in the modelling study play a role in shaping local grassland communities, our comparison indicates that European grasslands are largely isolated, i.e. local mechanisms explain observed community patterns to a large extent.

Relevance:

80.00%

Publisher:

Abstract:

The present study investigated extraversion-related individual differences in visual short-term memory (VSTM) functioning. Event-related potentials were recorded from 50 introverts and 50 extraverts while they performed a VSTM task based on a color-change detection paradigm with three different set sizes. Although introverts and extraverts showed almost identical hit rates and reaction times, introverts displayed larger N1 amplitudes than extraverts, independent of color change or set size. Extraverts also showed larger P3 amplitudes than introverts when there was a color change, whereas no extraversion-related difference in P3 amplitude was found in the no-change condition. Our findings provide the first experimental evidence that introverts' greater reactivity to punctate physical stimulation, as indexed by larger N1 amplitudes, also holds for complex visual stimulus patterns. Furthermore, the larger P3 amplitude of extraverts in the change condition suggests a higher sensitivity to context change. Finally, there were no set-size-dependent extraversion-related differences in P3 amplitude. This latter finding does not support the resource-allocation explanation of differences between introverts and extraverts.

Relevance:

80.00%

Publisher:

Abstract:

Background: Patients presenting to the emergency department (ED) currently face unacceptable delays in initial treatment and long, costly hospital stays due to suboptimal initial triage and site-of-care decisions. Accurate ED triage should focus not only on initial treatment priority, but also on prediction of medical risk and nursing needs, to improve site-of-care decisions and to simplify early discharge management. Different triage scores have been proposed, such as the Manchester Triage System (MTS). Yet these scores focus only on treatment priority, have suboptimal performance and lack validation in the Swiss health care system. Because the MTS will be introduced into clinical routine at the Kantonsspital Aarau, we propose a large prospective cohort study to optimize initial patient triage. Specifically, the aim of this trial is to derive a three-part triage algorithm to better predict (a) treatment priority; (b) medical risk and thus need for in-hospital treatment; and (c) post-acute care needs of patients at the most proximal time point of ED admission. Methods/design: Prospective, observational, multicenter, multi-national cohort study. We will include all consecutive medical patients seeking ED care in this observational registry. There will be no exclusions except for non-adult and non-medical patients. Vital signs will be recorded, and leftover blood samples will be stored for later batch analysis of blood markers. Upon ED admission, the post-acute care discharge score (PACD) will be recorded. Attending ED physicians will adjudicate triage priority based on all available results at the time of ED discharge to the medical ward. Patients will be reassessed daily during the hospital course for medical stability and readiness for discharge from the nurses' and, where involved, the social workers' perspective.
To assess outcomes, data from electronic medical records will be used, and all patients will be contacted 30 days after hospital admission to assess vital and functional status, re-hospitalization, satisfaction with care and quality-of-life measures. We aim to include between 5000 and 7000 patients over one year of recruitment to derive the three-part triage algorithm. The respective main endpoints are defined as (a) initial triage priority (high vs. low priority) adjudicated by the attending ED physician at ED discharge; (b) adverse 30-day outcome (death or intensive care unit admission) within 30 days following ED admission, to assess patients' risk and thus need for in-hospital treatment; and (c) post-acute care needs after hospital discharge, defined as transfer of patients to a post-acute care institution, for early recognition and planning of post-acute care needs. Other outcomes are time to first physician contact, time to initiation of adequate medical therapy, time to social worker involvement, length of hospital stay, reasons for discharge delays, patients' satisfaction with care, overall hospital costs and patients' care needs after returning home. Discussion: Using a reliable initial triage system to estimate initial treatment priority, need for in-hospital treatment and post-acute care needs is an innovative and persuasive approach for a more targeted and efficient management of medical patients in the ED. The proposed interdisciplinary, multi-national project has unprecedented potential to improve initial triage decisions and to optimize allocation of resources to the sickest patients from admission to discharge. The algorithms derived in this study will later be compared, in a randomized controlled trial, against a usual-care control group in terms of resource use, length of hospital stay, overall costs and patient outcomes, including mortality, re-hospitalization, quality of life and satisfaction with care.

Relevance:

80.00%

Publisher:

Abstract:

Upon attack by leaf herbivores, many plants reallocate photoassimilates below ground. However, little is known about how plants respond when the roots themselves come under attack. We investigated induced resource allocation in maize plants infested by larvae of the Western corn rootworm, Diabrotica virgifera virgifera. Using radioactive 11CO2, we demonstrate that root-attacked maize plants allocate more newly fixed 11C carbon from source leaves to stems, but not to roots. Reduced meristematic activity and reduced invertase activity in attacked maize root systems are identified as possible drivers of this shoot reallocation response. The increased allocation of photoassimilates to stems is shown to be associated with a marked thickening of these tissues and increased growth of stem-borne crown roots. A strong quantitative correlation between stem thickness and root regrowth across different watering levels suggests that retaining photoassimilates in the shoot may help root-attacked plants compensate for the loss of belowground tissues. Taken together, our results indicate that induced tolerance may be an important strategy by which plants withstand belowground attack. Furthermore, root herbivore-induced carbon reallocation needs to be taken into account when studying plant-mediated interactions between herbivores.

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND Recently, two simple clinical scores were published to predict survival in trauma patients. Both scores may successfully guide major trauma triage, but neither has been independently validated in a hospital setting. METHODS This is a cohort study with 30-day mortality as the primary outcome to validate two new trauma scores, the Mechanism, Glasgow Coma Scale (GCS), Age, and Pressure (MGAP) score and the GCS, Age, and Pressure (GAP) score, using data from the UK Trauma Audit and Research Network. First, we assessed discrimination, using the area under the receiver operating characteristic (ROC) curve, and calibration, comparing mortality rates with those originally published. Second, we calculated sensitivity, specificity, predictive values, and likelihood ratios for prognostic score performance. Third, we propose new cutoffs for the risk categories. RESULTS A total of 79,807 adult (≥16 years) major trauma patients (2000-2010) were included; 5,474 (6.9%) died. Mean (SD) age was 51.5 (22.4) years, median GCS score was 15 (interquartile range, 15-15), and median Injury Severity Score (ISS) was 9 (interquartile range, 9-16). More than 50% of the patients had a low-risk GAP or MGAP score (1% mortality). With regard to discrimination, areas under the ROC curve were 87.2% for the GAP score (95% confidence interval, 86.7-87.7) and 86.8% for the MGAP score (95% confidence interval, 86.2-87.3). With regard to calibration, 2,390 (3.3%), 1,900 (28.5%), and 1,184 (72.2%) patients died in the low-, medium-, and high-risk GAP categories, respectively. In the low- and medium-risk groups, these rates were almost double those previously published. For MGAP, 1,861 (2.8%), 1,455 (15.2%), and 2,158 (58.6%) patients died in the low-, medium-, and high-risk categories, consonant with the results originally published. Reclassifying score point cutoffs improved likelihood ratios, sensitivity and specificity, as well as areas under the ROC curve.
CONCLUSION We found both scores to be valid triage tools to stratify emergency department patients according to their risk of death. MGAP calibrated better, whereas GAP slightly improved discrimination. The newly proposed cutoffs better differentiate risk classification and may therefore facilitate hospital resource allocation. LEVEL OF EVIDENCE Prognostic study, level II.
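The cutoff statistics evaluated above (sensitivity, specificity and likelihood ratios) all derive from a standard 2x2 table for a chosen score cutoff. A minimal helper, with purely illustrative counts rather than the actual Trauma Audit and Research Network data:

```python
def triage_test_stats(tp, fp, fn, tn):
    """Sensitivity, specificity and likelihood ratios for one score cutoff,
    e.g. treating patients above a GAP/MGAP cutoff as high risk of death."""
    sens = tp / (tp + fn)            # proportion of deaths flagged high risk
    spec = tn / (tn + fp)            # proportion of survivors flagged low risk
    lr_pos = sens / (1 - spec)       # positive likelihood ratio
    lr_neg = (1 - sens) / spec       # negative likelihood ratio
    return sens, spec, lr_pos, lr_neg

# Hypothetical counts: 100 deaths and 900 survivors at some cutoff
sens, spec, lr_pos, lr_neg = triage_test_stats(tp=80, fp=100, fn=20, tn=800)
```

Reclassifying cutoffs, as the study does, amounts to recomputing these quantities at different score thresholds and picking the cutoffs with the best trade-off.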

Relevance:

80.00%

Publisher:

Abstract:

Cloud computing enables provisioning and distribution of highly scalable services in a reliable, on-demand and sustainable manner. However, the objectives of managing enterprise distributed applications in cloud environments under Service Level Agreement (SLA) constraints lead to challenges in maintaining optimal resource control. Furthermore, conflicting objectives in the management of cloud infrastructure and distributed applications might lead to violations of SLAs and inefficient use of hardware and software resources. This dissertation focuses on how SLAs can be used as an input to the cloud management system (CMS), increasing the efficiency of resource allocation as well as of infrastructure scaling. First, we present an extended SLA semantic model for modelling complex service dependencies in distributed applications and for enabling automated cloud infrastructure management operations. Second, we describe a multi-objective VM allocation algorithm for optimised resource allocation in infrastructure clouds. Third, we describe a method for discovering relations between the performance indicators of services belonging to distributed applications, and for using these relations to build scaling rules that a CMS can use for automated management of VMs. Fourth, we introduce two novel VM-scaling algorithms, which optimally scale systems composed of VMs based on given SLA performance constraints. All presented research was implemented and tested using enterprise distributed applications.
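A weighted-sum host-selection heuristic gives the flavour of multi-objective VM allocation. The objectives, weights and field names below are illustrative assumptions, not the dissertation's algorithm:

```python
def place_vm(vm_cpu, vm_mem, hosts, w_util=0.7, w_power=0.3, target=0.8):
    """Pick the host minimizing a weighted combination of two toy objectives:
    distance of the resulting CPU utilisation from a target, and power cost.
    Hosts that cannot satisfy the VM's capacity demands are skipped."""
    best, best_score = None, float("inf")
    for h in hosts:
        if h["cpu_free"] < vm_cpu or h["mem_free"] < vm_mem:
            continue                       # capacity constraint violated
        util = 1 - (h["cpu_free"] - vm_cpu) / h["cpu_cap"]
        score = w_util * abs(util - target) + w_power * h["power_cost"]
        if score < best_score:
            best, best_score = h["name"], score
    return best

hosts = [
    {"name": "h1", "cpu_free": 4, "cpu_cap": 8, "mem_free": 8, "power_cost": 0.5},
    {"name": "h2", "cpu_free": 2, "cpu_cap": 8, "mem_free": 8, "power_cost": 0.2},
]
```

Real multi-objective allocators search over whole placements rather than greedily per VM, but the trade-off between conflicting objectives is encoded in essentially this way.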

Relevance:

80.00%

Publisher:

Abstract:

Paper I: Corporate aging and internal resource allocation

Abstract: Various observers argue that established firms are at a disadvantage in pursuing new growth opportunities. In this paper, we provide systematic evidence that established firms allocate fewer resources to high-growth lines of business. However, we find no evidence of inefficient resource allocation in established firms. Redirecting resources from high-growth to low-growth lines of business does not result in lower profitability. Also, resource allocation towards new growth opportunities does not increase when managers of established firms are exposed to takeover and product market threats. Rather, it seems that conservative resource allocation strategies are driven by pressures to meet investors' expectations. Our empirical evidence thus favors the hypothesis that established firms wisely choose to allocate fewer resources to new growth opportunities, as external pressures force them to focus on efficiency rather than novelty (Holmström 1989).

Paper II: Corporate aging and asset sales

Abstract: This paper asks whether divestitures are motivated by strategic considerations about the scope of the firm's activities. Limited managerial capacity implies that exploiting core competences becomes comparatively more attractive than exploring new growth opportunities as firms mature. Divestitures help established firms free up management time and increase the focus on core competences. The testable implications of this attention hypothesis are that established firms are the main sellers of assets, that their divestiture activity increases when managerial capacity is scarcer, that they sell non-core activities, and that they return the divestiture proceeds to the providers of capital instead of reinvesting them in the firm. We find strong empirical support for these predictions.

Paper III: Corporate aging and lobbying expenditures

Abstract: Creative destruction forces constantly challenge established firms, especially in competitive markets. This paper asks whether corporate lobbying is a competitive weapon of established firms to counteract the decline in rents over time. We find a statistically and economically significant positive relation between firm age and lobbying expenditures. Moreover, the documented age effect is weaker when firms have unique products or operate in concentrated product markets. To address endogeneity, we use industry distress as an exogenous nonlegislative shock to future rents and show that established firms are relatively more likely to lobby when in distress. Finally, we provide empirical evidence that corporate lobbying efforts by established firms forestall the creative destruction process. In sum, our findings suggest that corporate lobbying is a competitive weapon of established firms to retain profitability in competitive environments.

Relevance:

80.00%

Publisher:

Abstract:

As a consequence of artificial selection for specific traits, crop plants underwent considerable genotypic and phenotypic changes during the process of domestication. These changes may have led to reduced resistance in the cultivated plant due to shifts in resource allocation from defensive traits to increased growth rates and yield. Modern maize (Zea mays ssp. mays) was domesticated from its ancestor Balsas teosinte (Z. mays ssp. parviglumis) approximately 9000 years ago. Although maize displays a high genetic overlap with its direct ancestor and other annual teosintes, several studies show that maize and its ancestors differ in their resistance phenotypes, with teosintes being less susceptible to herbivore damage. However, the underlying mechanisms are poorly understood. Here we addressed the question of to what extent maize domestication has affected two crucial chemical defence traits and one physical defence trait, and whether differences in their expression may explain the differences in herbivore resistance levels. The ontogenetic trajectories of 1,4-benzoxazin-3-ones, maysin and leaf toughness were monitored for different leaf types across several maize cultivars and teosinte accessions during early vegetative growth stages. We found significant quantitative and qualitative differences in 1,4-benzoxazin-3-one accumulation in an initial pairwise comparison, but we did not find consistent differences between wild and cultivated genotypes in a more thorough examination employing several cultivars/accessions. Yet 1,4-benzoxazin-3-one levels tended to decline more rapidly with plant age in the modern maize cultivars. Foliar maysin levels and leaf toughness increased with plant age in a leaf-specific manner, but were likewise unaffected by domestication. Based on our findings, we suggest that defence traits other than those investigated are responsible for the observed differences in herbivore resistance between teosinte and maize.
Furthermore, our results indicate that single pairwise comparisons may lead to false conclusions regarding the effects of domestication on defensive and possibly other traits.

Relevance:

80.00%

Publisher:

Abstract:

Advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance might suffer, leading to violations of Service Level Agreements (SLAs) and possibly inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource-management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs as a means for specifying quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from the instantiation of service VMs in the correct order using an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads, such as the number of users. The application owner is interested in finding the optimal amount of computing and network resources to use to ensure that the performance requirements of all his/her applications are met. He/she is also interested in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, the infrastructure provider is interested in optimally provisioning the virtual resources onto the available physical infrastructure so that his/her operational costs are minimized, while the performance of tenants' applications is maximized.
Motivated by the complexities associated with the management and scaling of distributed applications while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of virtual machines (VMs) bound to application services. We describe several algorithms for adapting the number of VMs allocated to a distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for the dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces. We show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We also provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it. We present a resource management system based on a genetic algorithm, which performs allocation of virtual resources while considering the optimization of multiple criteria. We show that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
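A reactive, SLA-driven scaling rule of the kind such a platform compares against can be sketched in a few lines. The function name and thresholds are illustrative assumptions, not the thesis' benchmark-derived rules:

```python
def scale_decision(avg_response_ms, n_vms, sla_limit_ms=200,
                   up_frac=0.9, down_frac=0.5, min_vms=1):
    """Return the new VM count: add a VM when the measured response time
    nears the SLA limit, remove one when there is ample headroom."""
    if avg_response_ms > up_frac * sla_limit_ms:
        return n_vms + 1     # scale out before the SLA limit is breached
    if avg_response_ms < down_frac * sla_limit_ms and n_vms > min_vms:
        return n_vms - 1     # scale in to reduce resource cost
    return n_vms
```

Rules composed from monitoring traces, as described above, would replace these fixed thresholds with application-specific ones learned from benchmarks.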

Relevance:

80.00%

Publisher:

Abstract:

Various software packages for project management include a procedure for resource-constrained scheduling. In several packages, the user can influence this procedure by selecting a priority rule. However, the resource-allocation methods that are implemented in the procedures are proprietary information; therefore, the question of how the priority-rule selection impacts the performance of the procedures arises. We experimentally evaluate the resource-allocation methods of eight recent software packages using the 600 instances of the PSPLIB J120 test set. The results of our analysis indicate that applying the default rule tends to outperform a randomly selected rule, whereas applying two randomly selected rules tends to outperform the default rule. Applying a small set of more than two rules further improves the project durations considerably. However, a large number of rules must be applied to obtain the best possible project durations.
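Resource-constrained scheduling with a pluggable priority rule, as evaluated above, can be sketched with a serial schedule-generation scheme. The tiny instance and the rule shown are toy assumptions (PSPLIB J120 instances have 120 activities and four resources):

```python
def serial_sgs(acts, prio, cap=4):
    """Serial schedule-generation scheme for one renewable resource.
    acts: {id: (duration, demand, predecessors)}; prio(a, acts) gives the
    priority value of activity a (lower = scheduled first).
    Returns a dict of finish times."""
    finish, usage = {}, {}            # usage[t] = resource units busy at time t
    unscheduled = set(acts)
    while unscheduled:
        # activities whose predecessors are all scheduled
        eligible = [a for a in unscheduled
                    if all(p in finish for p in acts[a][2])]
        a = min(eligible, key=lambda x: prio(x, acts))   # apply priority rule
        dur, dem, preds = acts[a]
        t = max([finish[p] for p in preds], default=0)   # precedence-feasible
        while any(usage.get(t + i, 0) + dem > cap for i in range(dur)):
            t += 1                                       # resource-feasible
        for i in range(dur):
            usage[t + i] = usage.get(t + i, 0) + dem
        finish[a] = t + dur
        unscheduled.remove(a)
    return finish

acts = {1: (3, 2, []), 2: (2, 3, []), 3: (2, 2, [1]), 4: (1, 4, [2, 3])}
spt = lambda a, d: d[a][0]            # shortest-processing-time rule
makespan = max(serial_sgs(acts, spt).values())
```

Swapping `spt` for another rule (e.g. most total successors, latest finish time) can yield a different schedule, which is exactly why rule selection affects the project durations the packages produce.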