925 results for resource allocation
Abstract:
The present study investigated extraversion-related individual differences in visual short-term memory (VSTM) functioning. Event-related potentials were recorded from 50 introverts and 50 extraverts while they performed a VSTM task based on a color-change detection paradigm with three different set sizes. Although introverts and extraverts showed almost identical hit rates and reaction times, introverts displayed larger N1 amplitudes than extraverts independent of color change or set size. Extraverts also showed larger P3 amplitudes compared to introverts when there was a color change, whereas no extraversion-related difference in P3 amplitude was found in the no-change condition. Our findings provide the first experimental evidence that introverts' greater reactivity to punctate physical stimulation, as indicated by larger N1 amplitudes, also holds for complex visual stimulus patterns. Furthermore, P3 amplitude in the change condition was larger for extraverts than for introverts, suggesting a higher sensitivity to context change. Finally, there were no extraversion-related differences in P3 amplitude as a function of set size. This latter finding does not support the resource allocation explanation as a source of differences between introverts and extraverts.
Abstract:
Background: Patients presenting to the emergency department (ED) currently face unacceptable delays in initial treatment, and long, costly hospital stays due to suboptimal initial triage and site-of-care decisions. Accurate ED triage should focus not only on initial treatment priority, but also on prediction of medical risk and nursing needs to improve site-of-care decisions and to simplify early discharge management. Different triage scores have been proposed, such as the Manchester triage system (MTS). Yet, these scores focus only on treatment priority, have suboptimal performance and lack validation in the Swiss health care system. Because the MTS will be introduced into clinical routine at the Kantonsspital Aarau, we propose a large prospective cohort study to optimize initial patient triage. Specifically, the aim of this trial is to derive a three-part triage algorithm to better predict (a) treatment priority; (b) medical risk and thus need for in-hospital treatment; and (c) post-acute care needs of patients at the most proximal time point of ED admission. Methods/design: Prospective, observational, multicenter, multi-national cohort study. We will include all consecutive medical patients seeking ED care in this observational registry. There will be no exclusions except for non-adult and non-medical patients. Vital signs will be recorded and leftover blood samples will be stored for later batch analysis of blood markers. Upon ED admission, the post-acute care discharge score (PACD) will be recorded. Attending ED physicians will adjudicate triage priority based on all available results at the time of ED discharge to the medical ward. Patients will be reassessed daily during the hospital course for medical stability and readiness for discharge from the nurses' and, if involved, the social workers' perspective. To assess outcomes, data from electronic medical records will be used and all patients will be contacted 30 days after hospital admission to assess vital and functional status, re-hospitalization, satisfaction with care and quality of life measures. We aim to include between 5000 and 7000 patients over one year of recruitment to derive the three-part triage algorithm. The respective main endpoints were defined as (a) initial triage priority (high vs. low priority) adjudicated by the attending ED physician at ED discharge, (b) adverse 30-day outcome (death or intensive care unit admission) within 30 days following ED admission to assess patients' risk and thus need for in-hospital treatment, and (c) post-acute care needs after hospital discharge, defined as transfer of patients to a post-acute care institution, for early recognition and planning of post-acute care needs. Other outcomes are time to first physician contact, time to initiation of adequate medical therapy, time to social worker involvement, length of hospital stay, reasons for discharge delays, patients' satisfaction with care, overall hospital costs and patients' care needs after returning home. Discussion: Using a reliable initial triage system for estimating initial treatment priority, need for in-hospital treatment and post-acute care needs is an innovative and persuasive approach for a more targeted and efficient management of medical patients in the ED. The proposed interdisciplinary, multi-national project has unprecedented potential to improve initial triage decisions and optimize resource allocation to the sickest patients from admission to discharge.
The algorithms derived in this study will be compared in a later randomized controlled trial against a usual-care control group in terms of resource use, length of hospital stay, overall costs and patient outcomes, including mortality, re-hospitalization, quality of life and satisfaction with care.
Abstract:
Upon attack by leaf herbivores, many plants reallocate photoassimilates below ground. However, little is known about how plants respond when the roots themselves come under attack. We investigated induced resource allocation in maize plants infested by larvae of the Western corn rootworm Diabrotica virgifera virgifera. Using radioactive 11CO2, we demonstrate that root-attacked maize plants allocate more newly assimilated 11C carbon from source leaves to stems, but not to roots. Reduced meristematic activity and reduced invertase activity in attacked maize root systems are identified as possible drivers of this shoot reallocation response. The increased allocation of photoassimilates to stems is shown to be associated with a marked thickening of these tissues and increased growth of stem-borne crown roots. A strong quantitative correlation between stem thickness and root regrowth across different watering levels suggests that retaining photoassimilates in the shoots may help root-attacked plants to compensate for the loss of belowground tissues. Taken together, our results indicate that induced tolerance may be an important strategy of plants to withstand belowground attack. Furthermore, root herbivore-induced carbon reallocation needs to be taken into account when studying plant-mediated interactions between herbivores.
Abstract:
BACKGROUND Recently, two simple clinical scores were published to predict survival in trauma patients. Both scores may successfully guide major trauma triage, but neither has been independently validated in a hospital setting. METHODS This is a cohort study with 30-day mortality as the primary outcome to validate two new trauma scores, the Mechanism, Glasgow Coma Scale (GCS), Age, and Pressure (MGAP) score and the GCS, Age, and Pressure (GAP) score, using data from the UK Trauma Audit and Research Network. First, an assessment of discrimination, using the area under the receiver operating characteristic (ROC) curve, and of calibration, comparing mortality rates with those originally published, was performed. Second, we calculated sensitivity, specificity, predictive values, and likelihood ratios for prognostic score performance. Third, we propose new cutoffs for the risk categories. RESULTS A total of 79,807 adult (≥16 years) major trauma patients (2000-2010) were included; 5,474 (6.9%) died. Mean (SD) age was 51.5 (22.4) years, median GCS score was 15 (interquartile range, 15-15), and median Injury Severity Score (ISS) was 9 (interquartile range, 9-16). More than 50% of the patients had a low-risk GAP or MGAP score (1% mortality). With regard to discrimination, areas under the ROC curve were 87.2% for the GAP score (95% confidence interval, 86.7-87.7) and 86.8% for the MGAP score (95% confidence interval, 86.2-87.3). With regard to calibration, 2,390 (3.3%), 1,900 (28.5%), and 1,184 (72.2%) patients died in the low-, medium-, and high-risk GAP categories, respectively. In the low- and medium-risk groups, these rates were almost double those previously published. For MGAP, 1,861 (2.8%), 1,455 (15.2%), and 2,158 (58.6%) patients died in the low-, medium-, and high-risk categories, consonant with the results originally published. Reclassifying score point cutoffs improved likelihood ratios, sensitivity and specificity, as well as areas under the ROC curve. CONCLUSION We found both scores to be valid triage tools to stratify emergency department patients according to their risk of death. MGAP calibrated better, but GAP slightly improved discrimination. The newly proposed cutoffs better differentiate risk classification and may therefore facilitate hospital resource allocation. LEVEL OF EVIDENCE Prognostic study, level II.
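To make the discrimination and cutoff analysis above concrete, the following Python sketch computes an area under the ROC curve and the sensitivity/specificity of one candidate cutoff on synthetic data. The score distribution, mortality model and cutoff are illustrative assumptions; they do not reproduce the TARN results.

```python
# Hypothetical illustration of ROC discrimination and cutoff analysis
# (scores, mortality model and cutoff are assumptions, not TARN data).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
score = np.clip(rng.normal(20, 4, 5000), 3, 24)   # hypothetical GAP-like scores
# Lower scores indicate higher risk; simulate mortality rising as the score falls.
p_death = 1 / (1 + np.exp(0.8 * (score - 12)))
death = rng.random(5000) < p_death

# Discrimination: AUC on the negated score, so that higher values mean higher risk.
print(f"AUC = {roc_auc_score(death, -score):.3f}")

# Sensitivity/specificity at one candidate cutoff (flag patients with score <= 18).
flagged = score <= 18
sensitivity = np.sum(flagged & death) / death.sum()
specificity = np.sum(~flagged & ~death) / (~death).sum()
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```

Shifting the cutoff trades sensitivity against specificity, which is the trade-off the proposed reclassified cutoffs aim to balance.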
Abstract:
Cloud Computing enables provisioning and distribution of highly scalable services in a reliable, on-demand and sustainable manner. However, objectives of managing enterprise distributed applications in cloud environments under Service Level Agreement (SLA) constraints lead to challenges for maintaining optimal resource control. Furthermore, conflicting objectives in management of cloud infrastructure and distributed applications might lead to violations of SLAs and inefficient use of hardware and software resources. This dissertation focuses on how SLAs can be used as an input to the cloud management system (CMS), increasing the efficiency of allocating resources, as well as that of infrastructure scaling. First, we present an extended SLA semantic model for modelling complex service dependencies in distributed applications, and for enabling automated cloud infrastructure management operations. Second, we describe a multi-objective VM allocation algorithm for optimised resource allocation in infrastructure clouds. Third, we describe a method of discovering relations between the performance indicators of services belonging to distributed applications and then using these relations for building scaling rules that the CMS can use for automated management of VMs. Fourth, we introduce two novel VM-scaling algorithms, which optimally scale systems composed of VMs, based on given SLA performance constraints. All presented research works were implemented and tested using enterprise distributed applications.
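As a hedged illustration of what a single multi-objective VM allocation step can look like (not the dissertation's actual algorithm), the following Python sketch scores each feasible host on two competing objectives, consolidation and CPU/memory balance, and places the VM on the best-scoring host. The objectives, weights and capacities are assumptions made for this example.

```python
# Hypothetical illustration of one multi-objective VM placement step.
def place_vm(vm, hosts, w_consolidation=0.7, w_balance=0.3):
    """Pick the feasible host with the best weighted score and book the VM onto it."""
    best, best_score = None, float("-inf")
    for host in hosts:
        cpu_after = (host["used_cpu"] + vm["cpu"]) / host["cpu"]
        mem_after = (host["used_mem"] + vm["mem"]) / host["mem"]
        if cpu_after > 1.0 or mem_after > 1.0:
            continue  # host lacks capacity
        consolidation = max(cpu_after, mem_after)   # objective 1: pack hosts tightly
        imbalance = abs(cpu_after - mem_after)      # objective 2: keep CPU/RAM balanced
        score = w_consolidation * consolidation - w_balance * imbalance
        if score > best_score:
            best, best_score = host, score
    if best is not None:
        best["used_cpu"] += vm["cpu"]
        best["used_mem"] += vm["mem"]
    return best

hosts = [
    {"name": "host-a", "cpu": 16, "mem": 64, "used_cpu": 8, "used_mem": 16},
    {"name": "host-b", "cpu": 16, "mem": 64, "used_cpu": 2, "used_mem": 48},
]
chosen = place_vm({"cpu": 4, "mem": 8}, hosts)
print(chosen["name"] if chosen else "no feasible host")
```

A full multi-objective allocator would optimize such criteria jointly across all VMs and against SLA constraints rather than greedily per VM, as the abstract indicates.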
Abstract:
Paper I: Corporate aging and internal resource allocation. Abstract: Various observers argue that established firms are at a disadvantage in pursuing new growth opportunities. In this paper, we provide systematic evidence that established firms allocate fewer resources to high-growth lines of business. However, we find no evidence of inefficient resource allocation in established firms. Redirecting resources from high-growth to low-growth lines of business does not result in lower profitability. Also, resource allocation towards new growth opportunities does not increase when managers of established firms are exposed to takeover and product market threats. Rather, it seems that conservative resource allocation strategies are driven by pressures to meet investors' expectations. Our empirical evidence thus favors the hypothesis that established firms wisely choose to allocate fewer resources to new growth opportunities as external pressures force them to focus on efficiency rather than novelty (Holmström 1989).
Paper II: Corporate aging and asset sales. Abstract: This paper asks whether divestitures are motivated by strategic considerations about the scope of the firm's activities. Limited managerial capacity implies that exploiting core competences becomes comparatively more attractive than exploring new growth opportunities as firms mature. Divestitures help established firms free management time and increase the focus on core competences. The testable implications of this attention hypothesis are that established firms are the main sellers of assets, that their divestiture activity increases when managerial capacity is scarcer, that they sell non-core activities, and that they return the divestiture proceeds to the providers of capital instead of reinvesting them in the firm. We find strong empirical support for these predictions.
Paper III: Corporate aging and lobbying expenditures. Abstract: Creative destruction forces constantly challenge established firms, especially in competitive markets. This paper asks whether corporate lobbying is a competitive weapon of established firms to counteract the decline in rents over time. We find a statistically and economically significant positive relation between firm age and lobbying expenditures. Moreover, the documented age effect is weaker when firms have unique products or operate in concentrated product markets. To address endogeneity, we use industry distress as an exogenous non-legislative shock to future rents and show that established firms are relatively more likely to lobby when in distress. Finally, we provide empirical evidence that corporate lobbying efforts by established firms forestall the creative destruction process. In sum, our findings suggest that corporate lobbying is a competitive weapon of established firms to retain profitability in competitive environments.
Abstract:
As a consequence of artificial selection for specific traits, crop plants underwent considerable genotypic and phenotypic changes during the process of domestication. These changes may have led to reduced resistance in the cultivated plant due to shifts in resource allocation from defensive traits to increased growth rates and yield. Modern maize (Zea mays ssp. mays) was domesticated from its ancestor Balsas teosinte (Z. mays ssp. parviglumis) approximately 9000 years ago. Although maize displays a high genetic overlap with its direct ancestor and other annual teosintes, several studies show that maize and its ancestors differ in their resistance phenotypes, with teosintes being less susceptible to herbivore damage. However, the underlying mechanisms are poorly understood. Here we addressed to what extent maize domestication has affected two crucial chemical defence traits and one physical defence trait, and whether differences in their expression may explain the differences in herbivore resistance levels. The ontogenetic trajectories of 1,4-benzoxazin-3-ones, maysin and leaf toughness were monitored for different leaf types across several maize cultivars and teosinte accessions during early vegetative growth stages. We found significant quantitative and qualitative differences in 1,4-benzoxazin-3-one accumulation in an initial pairwise comparison, but we did not find consistent differences between wild and cultivated genotypes during a more thorough examination employing several cultivars/accessions. Yet, 1,4-benzoxazin-3-one levels tended to decline more rapidly with plant age in the modern maize cultivars. Foliar maysin levels and leaf toughness increased with plant age in a leaf-specific manner, but were likewise unaffected by domestication. Based on our findings we suggest that defence traits other than the ones investigated are responsible for the observed differences in herbivore resistance between teosinte and maize. Furthermore, our results indicate that single pairwise comparisons may lead to false conclusions regarding the effects of domestication on defensive and possibly other traits.
Abstract:
Advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance might suffer, leading to violations of Service Level Agreements (SLAs) and possible inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs as a means for specifying quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from the instantiation of service VMs in the correct order using an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads, such as the number of users. The application owner is interested in finding the optimum amount of computing and network resources to use to ensure that the performance requirements of all their applications are met. They are also interested in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, the infrastructure providers are interested in optimally provisioning the virtual resources onto the available physical infrastructure so that their operational costs are minimized, while maximizing the performance of tenants' applications. Motivated by the complexities associated with the management and scaling of distributed applications, while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of application services bound to virtual machines (VMs). We describe several algorithms for adapting the number of VMs allocated to the distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces. We show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We also provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it. We present a resource management system based on a genetic algorithm, which performs allocation of virtual resources while considering the optimization of multiple criteria. We prove that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
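To illustrate the kind of SLA-driven, reactive VM-scaling rule referred to above (the baseline the thesis reports outperforming), here is a minimal threshold-based sketch in Python. The metric name, SLA limit and threshold fractions are assumptions for the example, not parameters from the thesis.

```python
# Hypothetical reactive scaling rule driven by an SLA performance guarantee.
from dataclasses import dataclass

@dataclass
class ScalingRule:
    metric: str            # e.g. mean response time of a service tier
    sla_limit: float       # SLA-defined performance guarantee
    scale_out_at: float    # fraction of the limit that triggers scale-out
    scale_in_at: float     # fraction of the limit that allows scale-in
    min_vms: int = 1
    max_vms: int = 20

    def decide(self, observed: float, current_vms: int) -> int:
        """Return the new VM count for one control interval."""
        if observed > self.scale_out_at * self.sla_limit:
            return min(current_vms + 1, self.max_vms)
        if observed < self.scale_in_at * self.sla_limit:
            return max(current_vms - 1, self.min_vms)
        return current_vms

rule = ScalingRule(metric="response_time_ms", sla_limit=200.0,
                   scale_out_at=0.8, scale_in_at=0.3)
print(rule.decide(observed=180.0, current_vms=3))   # -> 4 (scale out)
print(rule.decide(observed=40.0, current_vms=3))    # -> 2 (scale in)
```

Such per-metric rules react only after a threshold is crossed; the thesis instead derives and combines scaling rules from monitoring traces and embeds them in semantic SLAs.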
Abstract:
Various software packages for project management include a procedure for resource-constrained scheduling. In several packages, the user can influence this procedure by selecting a priority rule. However, the resource-allocation methods that are implemented in the procedures are proprietary information; therefore, the question of how the priority-rule selection impacts the performance of the procedures arises. We experimentally evaluate the resource-allocation methods of eight recent software packages using the 600 instances of the PSPLIB J120 test set. The results of our analysis indicate that applying the default rule tends to outperform a randomly selected rule, whereas applying two randomly selected rules tends to outperform the default rule. Applying a small set of more than two rules further improves the project durations considerably. However, a large number of rules must be applied to obtain the best possible project durations.
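For readers unfamiliar with priority-rule-based resource-constrained scheduling, the following Python sketch shows a textbook serial schedule generation scheme on a toy single-resource instance. The instance and the shortest-processing-time rule are illustrative assumptions; they do not correspond to any package's proprietary method or to the PSPLIB J120 instances.

```python
# Minimal serial schedule generation scheme (SSGS) with a pluggable priority rule.
def serial_sgs(durations, predecessors, demands, capacity, priority):
    """Schedule activities one by one, in priority order, at the earliest
    precedence- and resource-feasible start time (single renewable resource)."""
    n = len(durations)
    horizon = sum(durations)
    usage = [0] * (horizon + 1)          # resource usage per period
    start = [None] * n
    scheduled = set()

    def feasible(j, t):
        return all(usage[tau] + demands[j] <= capacity
                   for tau in range(t, t + durations[j]))

    while len(scheduled) < n:
        # eligible: unscheduled activities whose predecessors are all scheduled
        eligible = [j for j in range(n)
                    if j not in scheduled and all(p in scheduled for p in predecessors[j])]
        j = min(eligible, key=priority)  # pick the next activity by the priority rule
        t = max([start[p] + durations[p] for p in predecessors[j]], default=0)
        while not feasible(j, t):
            t += 1
        for tau in range(t, t + durations[j]):
            usage[tau] += demands[j]
        start[j] = t
        scheduled.add(j)
    return start

# Toy instance: 4 activities, one renewable resource with capacity 4.
durations    = [3, 2, 4, 2]
predecessors = [[], [0], [0], [1, 2]]
demands      = [2, 3, 2, 1]
starts = serial_sgs(durations, predecessors, demands, 4,
                    priority=lambda j: durations[j])   # shortest-processing-time rule
print(starts, "makespan:", max(s + d for s, d in zip(starts, durations)))
```

Swapping the `priority` function changes the resulting schedule, which is exactly the effect of the priority-rule selection evaluated in the study.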
Abstract:
Kenya has experienced a rapid expansion of the education system, partly due to high government expenditure on education. Despite the high level of expenditure on education, primary school enrolment had been declining since the early 1990s until 2003, when gross primary school enrolment increased to 104 percent after the introduction of free primary education. However, with an estimated net primary school enrolment rate of 77 percent, the country is far from achieving universal primary education. The worrying scenario is that the allocation of resources within the education sector seems to be ineffective, as the increasing expenditure on education goes to recurrent expenditure (to pay teachers' salaries). Kenya's Poverty Reduction Strategy Paper (PRSP) and the Economic Recovery Strategy for Wealth and Employment Creation (ERS) outline education targets of reaching universal primary education by 2015. The Government is faced with budget constraints and therefore the available resources need to be allocated efficiently in order to realize the education targets. The paper uses the Budget Negotiation Framework (BNF) to analyze cost-effective ways of resource allocation in the primary education sector to achieve universal primary education and other education targets. The Budget Negotiation Framework is a tool that aims at achieving equity and efficiency in resource allocation. Results from the analysis show that universal primary education by the year 2015 is a feasible target for Kenya. The results also show that with more cost-effective spending of education resources - increased trained teachers, enhanced textbook supplies and subsidies targeting the poor - the country could realize higher enrolment rates than what has been achieved with free primary education.
Abstract:
Background. Cardiovascular disease (CVD) is of striking public health significance owing to its high prevalence and mortality as well as the huge economic burden it imposes all over the world, especially in industrialized countries. Major risk factors for CVD have been the targets of population-wide prevention in the United States. Economic evaluations provide structured information regarding the efficiency of resource utilization, which can inform decisions on resource allocation. The main purpose of this review is to investigate the pattern of study design of economic evaluations for interventions of CVDs. Methods. Primary journal articles published during 2003-2008 were systematically retrieved via relevant keywords from Medline, NHS Economic Evaluation Database (NHS EED) and EBSCO Academic Search Complete. Only full economic evaluations for narrowly defined CVD interventions were included for this review. The methodological data of interest were extracted from the eligible articles and reorganized in a Microsoft Access database. Chi-square tests in SPSS were used to analyze the associations between pairs of categorical variables. Results. One hundred and twenty eligible articles were reviewed after two steps of literature selection with explicit inclusion and exclusion criteria. Descriptive statistics were reported regarding the evaluated interventions, outcome measures, unit costing and cost reports. The chi-square test of the association between prevention level of intervention and category of time horizon showed no statistical significance. The chi-square test showed that sponsor type was significantly associated with whether the new or the standard intervention was concluded to be more cost effective. Conclusions. Tertiary prevention and medication interventions are the major interests of economic evaluators. The majority of the evaluations were conducted from either a provider's or a payer's perspective. Almost all evaluations adopted a gross costing strategy for unit cost data rather than micro costing. EQ-5D is the most commonly used instrument for subjective outcome measurement. More than half of the evaluations used decision-analytic modeling techniques. The lack of consistency in study design standards in published evaluations appears in several aspects. Prevention level of intervention is not likely to be a factor for evaluators in deciding whether to design an evaluation over a lifetime horizon. Published evaluations sponsored by industry are more likely to conclude that the new intervention is more cost effective than the standard intervention.
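As a hypothetical illustration of the chi-square tests of association mentioned in the Methods, the following snippet tests whether sponsor type is associated with the cost-effectiveness conclusion, using SciPy rather than SPSS. The contingency counts are invented for the example and are not data from the review.

```python
# Hypothetical 2x2 chi-square test of association (counts are invented).
import numpy as np
from scipy.stats import chi2_contingency

# Rows: industry-sponsored vs. publicly funded evaluations.
# Columns: new vs. standard intervention concluded to be more cost effective.
table = np.array([[40, 10],
                  [35, 35]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```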
Abstract:
Evaluation methods for assessing the performance of non-profit funders are lacking. The purpose of the research was to create a comprehensive framework that systematically assesses the goals and objectives of a funder, how these relate to the funder's allocation of resources, and the potential impact of programs and services selected by the funder for resource allocation to address organizational goals and objectives. The Houston Affiliate of Susan G. Komen for the Cure, a local chapter of a national breast cancer awareness advocacy organization, was selected as the funding agency whose performance assessment was to assist in the creation of this framework. Evaluation approaches from the government sector were adapted and incorporated into the research to guide the methods used to answer the three research questions corresponding to the three phases of research within the study: (1) what are the funding goals and objectives of the Affiliate?; (2) what allocation scheme does the organization use to address these goals and objectives and select programs for funding?; and (3) to what extent do the programs funded by the Affiliate have potential long-term impact? Within the first stage of the research, document reviews of the Affiliate's mission-based documents and bylaws and interviews with organizational and community informants revealed a highly latent constellation of broad objectives that were not formalized into one guiding document, thus creating gaps in management and governance. Within the second phase of the research, reviews of grant applications from the 2008-2009 funding cycle and interviews with employees and volunteers familiar with the funding process revealed competing ideas regarding resource allocation in light of vague organizational documents describing funding goals and objectives. Within the final stage of the research, these findings translated into the Affiliate selecting programs with highly varying potential long-term impact with regard to addressing goals and objectives relating to breast cancer education, screening, diagnostics, treatment, and support. The resulting performance assessment framework, consisting of three phases of research utilizing organizational documents and key informant interviews, demonstrated the importance of clearly defined funding goals and objectives, reference documents and committee participation within the funding process, and regular reviews of potential long-term impact for selected programs, all supported by the active participation and governance of a funder's Board of Directors.
Abstract:
"Technology assessment is a comprehensive form of policy research that examines the short- and long-term social consequences of the application or use of technology" (US Congress 1967).^ This study explored a research methodology appropriate for technology assessment (TA) within the health industry. The case studied was utilization of external Small-Volume Infusion Pumps (SVIP) at a cancer treatment and research center. Primary and secondary data were collected in three project phases. In Phase I, hospital prescription records (N = 14,979) represented SVIP adoption and utilization for the years 1982-1984. The Candidate Adoption-Use (CA-U) diffusion paradigm developed for this study was germane. Compared to classic and unorthodox curves, CA-U more accurately simulated empiric experience. The hospital SVIP 1983-1984 trends denoted assurance in prescribing chemotherapy and concomitant balloon SVIP efficacy and efficiency. Abandonment of battery pumps was predicted while exponential demand for balloon SVIP was forecast for 1985-1987. In Phase II, patients using SVIP (N = 117) were prospectively surveyed from July to October 1984; the data represented a single episode of therapy. The questionnaire and indices, specifically designed to measure the impact of SVIP, evinced face validity. Compeer group data were from pre-SVIP case reviews rather than from an inpatient sample. Statistically significant results indicated that outpatients using SVIP interacted socially more than inpatients using the alternative technology. Additionally, the hospital's education program effectively taught clients to discriminate between self care and professional SVIP services. In these contexts, there was sufficient evidence that the alternative technology restricted patients activity whereas SVIP permitted patients to function more independently and in a social lifestyle, thus adding quality to life. In Phase III, diffusion forecast and patient survey findings were combined with direct observation of clinic services to profile some economic dimensions of SVIP. These three project phases provide a foundation for executing: (1) cost effectiveness analysis of external versus internal infusors, (2) institutional resource allocation, and (3) technology deployment to epidemiology-significant communities. The models and methods tested in this research of clinical technology assessment are innovative and do assess biotechnology. ^
Abstract:
Corticosterone, the main stress hormone in birds, mediates resource allocation, allowing animals to adjust their physiology and behaviour to changes in the environment. Incubation is a time- and energy-consuming phase of the avian reproductive cycle. It may be terminated prematurely when the parents' energy stores are depleted or when environmental conditions are severe. In this study, the effects of experimentally elevated baseline corticosterone levels on the parental investment of incubating male Adelie penguins were investigated. Incubation duration and reproductive success of 60 penguins were recorded. The clutches of some birds were replaced by dummy eggs, which recorded egg temperatures and rotation rates, enabling a detailed investigation of incubation behaviour. Corticosterone levels of treated birds were 2.4-fold higher than those of controls 18 days post treatment. Exogenous corticosterone triggered nest desertion in 61% of the treated birds, consequently reducing reproductive success and indicating that corticosterone can reduce or disrupt parental investment. Regarding egg temperatures, hypothermic events became more frequent and more pronounced in treated birds before these birds eventually abandoned their nests. The treatment also significantly decreased incubation temperatures by 1.3 °C and lengthened the incubation period by 2.1 days. However, the number of chicks at hatching was similar among successful nests, regardless of treatment. Weather conditions appeared to be particularly important in determining the extent to which corticosterone levels affected the behaviour of penguins, as treated penguins were more sensitive to severe weather conditions. This underlines the importance of considering the interactions of organisms with their environment in studies of animal behaviour and ecophysiology.
Abstract:
We investigated carbon acquisition by the N2-fixing cyanobacterium Trichodesmium IMS101 in response to CO2 levels of 15.1, 37.5, and 101.3 Pa (equivalent to 150, 370, and 1000 ppm). In these acclimations, growth rates as well as cellular C and N contents were measured. In vivo activities of carbonic anhydrase (CA), photosynthetic O2 evolution, and CO2 and HCO3- fluxes were measured using membrane inlet mass spectrometry and the 14C disequilibrium technique. While no differences in growth rates were observed, elevated CO2 levels caused higher C and N quotas and stimulated photosynthesis and N2 fixation. Minimal extracellular CA (eCA) activity was observed, indicating a minor role in carbon acquisition. Rates of CO2 uptake were small relative to total inorganic carbon (Ci) fixation, whereas HCO3- uptake contributed more than 90% and varied only slightly over the light period and between CO2 treatments. The low eCA activity and preference for HCO3- were verified by the 14C disequilibrium technique. Regarding apparent affinities, half-saturation concentrations (K1/2) for photosynthetic O2 evolution and HCO3- uptake changed markedly over the day and with CO2 concentration. Leakage (CO2 efflux : Ci uptake) showed pronounced diurnal changes. Our findings do not support a direct CO2 effect on the carboxylation efficiency of ribulose-1,5-bisphosphate carboxylase/oxygenase (RubisCO) but point to a shift in resource allocation among photosynthesis, carbon acquisition, and N2 fixation under elevated CO2 levels. The observed increase in photosynthesis and N2 fixation could have potential biogeochemical implications, as it may stimulate productivity in N-limited oligotrophic regions and thus provide a negative feedback on rising atmospheric CO2 levels.