977 results for Resource Utilization


Relevance: 60.00%

Abstract:

Mode of access: Internet.

Relevance: 60.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance: 60.00%

Abstract:

Orthotopic liver retransplantation (re-OLT) is highly controversial. The objectives of this study were to determine the validity of a recently developed United Network for Organ Sharing (UNOS) multivariate model using an independent cohort of patients undergoing re-OLT outside the United States, to determine whether incorporation of other variables that were incomplete in the UNOS registry would provide additional prognostic information, to develop new models combining data sets from both cohorts, and to evaluate the validity of the model for end-stage liver disease (MELD) in patients undergoing re-OLT. Two hundred eighty-one adult patients undergoing re-OLT (between 1986 and 1999) at 6 foreign transplant centers comprised the validation cohort. We found good agreement between actual survival and predicted survival in the validation cohort; 1-year patient survival rates in the low-, intermediate-, and high-risk groups (as assigned by the original UNOS model) were 72%, 68%, and 36%, respectively (P < .0001). In the patients for whom the international normalized ratio (INR) of prothrombin time was available, MELD correlated with outcome following re-OLT; the median MELD scores for patients surviving at least 90 days compared with those dying within 90 days were 20.75 versus 25.9, respectively (P = .004). Utilizing both patient cohorts (n = 979), a new model, based on recipient age, total serum bilirubin, creatinine, and interval to re-OLT, was constructed (whole model χ² = 105, P < .0001). Using the c-statistic with 30-day, 90-day, 1-year, and 3-year mortality as the end points, the areas under the receiver operating characteristic (ROC) curves of 4 different models were compared. In conclusion, prospective validation and use of these models as adjuncts to clinical decision making in the management of patients being considered for re-OLT are warranted.
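The abstract compares models by the c-statistic, i.e. the area under the ROC curve. As a minimal illustration of the pairwise-concordance computation, the following Python sketch can be used; the risk scores and mortality flags below are hypothetical, not the study's data:

```python
from itertools import combinations

def c_statistic(scores, events):
    """Concordance (c-statistic): the fraction of comparable pairs in
    which the subject with the event has the higher risk score
    (ties count as 0.5)."""
    concordant = tied = comparable = 0
    for i, j in combinations(range(len(scores)), 2):
        if events[i] == events[j]:
            continue  # only event/non-event pairs are comparable
        ev = i if events[i] else j    # subject with the event
        non = j if events[i] else i   # subject without the event
        comparable += 1
        if scores[ev] > scores[non]:
            concordant += 1
        elif scores[ev] == scores[non]:
            tied += 1
    return (concordant + 0.5 * tied) / comparable

# Hypothetical MELD-like risk scores and 90-day mortality flags (1 = died)
scores = [25.9, 20.7, 19.0, 18.2, 27.5, 22.0]
deaths = [1, 0, 1, 0, 1, 0]
print(c_statistic(scores, deaths))
```

With these illustrative values, 7 of the 9 comparable pairs are concordant, giving c ≈ 0.78; a value of 0.5 would indicate no discrimination.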

Relevance: 60.00%

Abstract:

The demand for palliative care is increasing, yet there are few data on the best models of care and few well-validated interventions that translate current evidence into clinical practice. Supporting multidisciplinary patient-centered palliative care while successfully conducting a large clinical trial is a challenge. The Palliative Care Trial (PCT) is a pragmatic 2 x 2 x 2 factorial cluster randomized controlled trial that tests the ability of educational outreach visiting and case conferencing to improve patient-based outcomes such as performance status and pain intensity. Four hundred sixty-one consenting patients and their general practitioners (GPs) were randomized to the following: (1) GP educational outreach visiting versus usual care, (2) structured patient and caregiver educational outreach visiting versus usual care, and (3) a coordinated palliative care model of case conferencing versus the standard model of palliative care in Adelaide, South Australia (3:1 randomization). Main outcome measures included patient functional status over time, pain intensity, and resource utilization. Participants were followed longitudinally until death or November 30, 2004. The interventions are aimed at translating current evidence into clinical practice, and particular attention was paid in the trial's design to addressing common pitfalls for clinical studies in palliative care. Given the need for evidence about optimal interventions and service delivery models that improve the care of people with life-limiting illness, the results of this rigorous, high-quality clinical trial will inform practice. Initial results are expected in mid 2005. (c) 2005 Elsevier Inc. All rights reserved.
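The 2 x 2 x 2 factorial design with 3:1 randomization on the case-conferencing factor can be illustrated with a short sketch. The arm labels follow the abstract, but the assignment code itself is a hypothetical illustration, not the trial's actual randomization procedure (which would use stratified, concealed allocation):

```python
import random

random.seed(0)  # reproducible illustration

def assign_cluster(rng=random):
    """Assign one GP cluster to a cell of a 2 x 2 x 2 factorial design.
    Factors 1 and 2 are allocated 1:1; factor 3 (case conferencing)
    is allocated 3:1 in favour of the coordinated model."""
    gp_outreach = rng.choice(["outreach", "usual care"])
    patient_outreach = rng.choice(["outreach", "usual care"])
    case_conf = rng.choices(["coordinated", "standard"], weights=[3, 1])[0]
    return gp_outreach, patient_outreach, case_conf

arms = [assign_cluster() for _ in range(1000)]
coordinated = sum(1 for a in arms if a[2] == "coordinated")
print(coordinated / len(arms))  # ≈ 0.75 under the 3:1 weighting
```

Each cluster lands in one of the 8 factorial cells, so all three interventions can be evaluated from the same cohort.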

Relevance: 60.00%

Abstract:

The Brazilian model for using biomass-derived resources can be considered a reference for substitution within the energy matrix. Among these resources, ethanol stands out as an increasingly used source of bioenergy, especially in the vehicle fleet. This incentive has been consolidated over almost four decades since the first oil crisis, from the launch of PROALCOOL to the development and application of flex-fuel vehicle technology, which today accounts for roughly 90% of the automobiles sold. The present work seeks to identify a relationship between automobile production indicators, the growth of ethanol production, and macroeconomic variables represented by the INCC, IPCA, and IGP-M indices, which are widely known and recognized by the government, businesses, and the population. Multivariate regression and correlation techniques were applied with the aid of SPSS software. The results suggest a correlation between lower macroeconomic indices and increased automobile and ethanol production.
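The correlation step of the analysis (performed in the study with SPSS) can be sketched in Python. The series below are hypothetical stand-ins for a macroeconomic index and ethanol production, not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical series: an inflation index (e.g. IPCA, % per year) and
# ethanol production (arbitrary units) over six periods
ipca = [6.5, 5.9, 6.4, 10.7, 6.3, 2.9]
ethanol = [24.0, 26.1, 25.5, 19.8, 26.0, 30.2]
print(pearson_r(ipca, ethanol))
```

With these illustrative values the coefficient is strongly negative, mirroring the abstract's finding that lower macroeconomic indices coincide with higher production.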

Relevance: 60.00%

Abstract:

In this paper, the implementation aspects and constraints of the simplest network coding (NC) schemes for a two-way relay channel (TWRC) composed of a user equipment (mobile terminal), an LTE relay station (RS) and an LTE base station (eNB) are considered in order to assess the usefulness of the NC in more realistic scenarios. The information exchange rate gain (IERG), the energy reduction gain (ERG) and the resource utilization gain (RUG) of the NC schemes with and without subcarrier division duplexing (SDD) are obtained by computer simulations. The usefulness of the NC schemes is evaluated for varying traffic load levels, geographical distances between the nodes, RS transmit powers, and maximum numbers of retransmissions. Simulation results show that the NC schemes with and without SDD achieve throughput gains of 0.5% and 25%, ERGs of 7-12% and 16-25%, and RUGs of 0.5-3.2%, respectively. It is found that the NC can also provide performance gains for users at the cell edge. Furthermore, the ERGs of the NC increase with the transmit power of the relay, and remain the same even when the maximum number of retransmissions is reduced.
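The energy reduction gain (ERG) reported above is, by its usual definition, the relative energy saved versus a reference (non-NC) scheme; assuming that definition, a minimal sketch with hypothetical per-exchange energy figures:

```python
def energy_reduction_gain(e_ref, e_nc):
    """ERG (%): relative energy saved by the NC scheme versus the
    reference scheme. Definition assumed from the abstract's usage,
    ERG = 100 * (E_ref - E_nc) / E_ref."""
    return 100.0 * (e_ref - e_nc) / e_ref

# Hypothetical energy per information exchange (joules): reference vs. NC
print(energy_reduction_gain(1.00, 0.84))
```

With these illustrative figures the ERG is 16%, at the low end of the 16-25% range the abstract reports for NC without SDD.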

Relevance: 60.00%

Abstract:

This paper, using detailed time measurements of patients complemented by interviews with hospital management and staff, examines three facets of an emergency room's (ER) operational performance: (1) effectiveness of the triage system in rationing patient treatment; (2) factors influencing the ER's operational performance in general and the trade-offs in flow times, inventory levels (that is, the number of patients waiting in the system), and resource utilization; (3) the impacts of potential process and staffing changes to improve the ER's performance. Specifically, the paper discusses four proposals for streamlining the patient flow: establishing designated tracks (fast track, diagnostic track), creating a holding area for certain types of patients, introducing a protocol that would reduce the load on physicians by allowing a registered nurse to order testing and treatment for some patients, and, potentially and in the longer term, moving from non-ER specialist physicians to ER specialists. The paper's findings are based on analyzing the paths and flow times of close to two thousand patients in the emergency room of the Medical Center of Leeuwarden (MCL), The Netherlands. Using exploratory data analysis the paper presents generalizable findings about the impacts of various factors on the ER's lead-time performance and shows how the proposals fit with well-documented process improvement theories. © 2010 Elsevier B.V. All rights reserved.
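The trade-off the paper examines between flow times, inventory levels, and utilization is commonly captured by Little's law, L = λW (average number in system equals arrival rate times average flow time). A minimal sketch with hypothetical ER figures, not taken from the MCL study:

```python
def littles_law_flow_time(arrival_rate, wip):
    """Little's law rearranged: average flow time W = L / lambda,
    where L is the average number of patients in the system and
    lambda is the patient arrival rate."""
    return wip / arrival_rate

# Hypothetical figures: 6 patients arrive per hour, 15 are in the system
print(littles_law_flow_time(6.0, 15.0))  # 2.5 hours average flow time
```

Under the law, reducing the average patient count at a fixed arrival rate (e.g. via a fast track) directly shortens the average flow time.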

Relevance: 60.00%

Abstract:

We used a one-dimensional, spatially explicit model to simulate the community of small fishes in the freshwater wetlands of southern Florida, USA. The seasonality of rainfall in these wetlands causes annual fluctuations in the amount of flooded area. We modeled fish populations that differed from each other only in efficiency of resource utilization and dispersal ability. The simulations showed that these trade-offs, along with the spatial and temporal variability of the environment, allow coexistence of several species competing exploitatively for a common resource type. This mechanism, while sharing some characteristics with other mechanisms proposed for coexistence of competing species, is novel in detail. Simulated fish densities resembled patterns observed in Everglades empirical data. Cells with hydroperiods less than 6 months accumulated negligible fish biomass. One unique model result was that, when multiple species coexisted, it was possible for one of the coexisting species to have both lower local resource utilization efficiency and lower dispersal ability than one of the other species. This counterintuitive result is a consequence of stronger effects of other competitors on the superior species.

Relevance: 60.00%

Abstract:

This dissertation presents and evaluates a methodology for scheduling medical application workloads in virtualized computing environments. Such environments are being widely adopted by providers of "cloud computing" services. In the context of provisioning resources for medical applications, such environments allow users to deploy applications on distributed computing resources while keeping their data secure. Furthermore, higher level services that further abstract the infrastructure-related issues can be built on top of such infrastructures. For example, a medical imaging service can allow medical professionals to process their data in the cloud, relieving them of the burden of having to deploy and manage these resources themselves. In this work, we focus on issues related to scheduling scientific workloads on virtualized environments. We build upon the knowledge base of traditional parallel job scheduling to address the specific case of medical applications while harnessing the benefits afforded by virtualization technology. To this end, we provide the following contributions: (1) an in-depth analysis of the execution characteristics of the target applications when run in virtualized environments; (2) a performance prediction methodology applicable to the target environment; (3) a scheduling algorithm that harnesses application knowledge and virtualization-related benefits to provide strong scheduling performance and quality of service guarantees. In the process of addressing these pertinent issues for our target user base (i.e., medical professionals and researchers), we provide insight that benefits a large community of scientific application users in industry and academia. Our execution time prediction and scheduling methodologies are implemented and evaluated on a real system running popular scientific applications. We find that we are able to predict the execution time of a number of these applications with an average error of 15%.
Our scheduling methodology, which is tested with medical image processing workloads, is compared to that of two baseline scheduling solutions and we find that it outperforms them in terms of both the number of jobs processed and resource utilization by 20–30%, without violating any deadlines. We conclude that our solution is a viable approach to supporting the computational needs of medical users, even if the cloud computing paradigm is not widely adopted in its current form.
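The reported 15% average prediction error corresponds to a mean absolute percentage error over the evaluated runs. A minimal sketch of that metric, with hypothetical runtimes rather than the dissertation's measurements:

```python
def mean_absolute_percentage_error(actual, predicted):
    """Average of |actual - predicted| / actual, in percent."""
    errors = [abs(a - p) / a for a, p in zip(actual, predicted)]
    return 100.0 * sum(errors) / len(errors)

# Hypothetical job runtimes (seconds): measured vs. predicted
actual = [120.0, 300.0, 60.0, 240.0]
predicted = [138.0, 255.0, 69.0, 204.0]
print(mean_absolute_percentage_error(actual, predicted))
```

With these illustrative values every prediction is off by 15%, so the metric returns 15.0; in practice individual errors vary and only the average is reported.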

Relevance: 60.00%

Abstract:

Physiological processes and local-scale structural dynamics of mangroves are relatively well studied. Regional-scale processes, however, are not as well understood. Here we provide long-term data on trends in structure and forest turnover at a large scale, following hurricane damage in mangrove ecosystems of South Florida, U.S.A. Twelve mangrove vegetation plots were monitored at periodic intervals, between October 1992 and March 2005. Mangrove forests of this region are defined by a −1.5 scaling relationship between mean stem diameter and stem density, mirroring self-thinning theory for mono-specific stands. This relationship is reflected in tree size frequency scaling exponents which, through time, have exhibited trends toward a community average that is indicative of full spatial resource utilization. These trends, together with an asymptotic standing biomass accumulation, indicate that coastal mangrove ecosystems do adhere to size-structured organizing principles as described for upland tree communities. Regenerative dynamics are different between areas inside and outside of the primary wind-path of Hurricane Andrew, which occurred in 1992. Forest dynamic turnover rates, however, are steady through time. This suggests that ecological, more so than structural, factors control forest productivity. In agreement, the relative mean rate of biomass growth exhibits an inverse relationship with the seasonal range of porewater salinities. The ecosystem average in forest scaling relationships may provide a useful investigative tool of mangrove community biomass relationships, as well as offer a robust indicator of general ecosystem health for use in mangrove forest ecosystem management and restoration.
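A scaling exponent such as the −1.5 self-thinning relationship is typically estimated as the slope of a log-log regression of mean stem diameter on stem density. A minimal sketch, using synthetic data constructed to follow the relationship exactly (the values are illustrative, not the study's plot data):

```python
import math

def loglog_slope(x, y):
    """Least-squares slope of log(y) against log(x), i.e. the estimated
    scaling exponent b in y ~ x**b."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

# Synthetic data obeying diameter ~ density**(-1.5) exactly
density = [1000.0, 2000.0, 4000.0, 8000.0]                  # stems per unit area
diameter = [10.0 * (n / 1000.0) ** -1.5 for n in density]   # mean stem diameter
print(round(loglog_slope(density, diameter), 2))  # -1.5
```

Field data would scatter around the line, and the fitted slope's drift through time is what the abstract tracks toward the community average.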

Relevance: 60.00%

Abstract:

Modern data centers host hundreds of thousands of servers to achieve economies of scale. Such a huge number of servers creates challenges for the data center network (DCN) to provide proportionally large bandwidth. In addition, the deployment of virtual machines (VMs) in data centers raises the requirements for efficient resource allocation and fine-grained resource sharing. Further, the large number of servers and switches in the data center consume significant amounts of energy. Even though servers become more energy efficient with various energy saving techniques, the DCN still accounts for 20% to 50% of the energy consumed by the entire data center. The objective of this dissertation is to enhance DCN performance as well as its energy efficiency by conducting optimizations on both the host and network sides. First, as the DCN demands huge bisection bandwidth to interconnect all the servers, we propose a parallel packet switch (PPS) architecture that directly processes variable length packets without segmentation-and-reassembly (SAR). The proposed PPS achieves large bandwidth by combining the switching capacities of multiple fabrics, and it further improves the switch throughput by avoiding padding bits in SAR. Second, since certain resource demands of the VM are bursty and demonstrate stochastic nature, to satisfy both deterministic and stochastic demands in VM placement, we propose the Max-Min Multidimensional Stochastic Bin Packing (M3SBP) algorithm. M3SBP calculates an equivalent deterministic value for the stochastic demands, and maximizes the minimum resource utilization ratio of each server. Third, to provide necessary traffic isolation for VMs that share the same physical network adapter, we propose the Flow-level Bandwidth Provisioning (FBP) algorithm. By reducing the flow scheduling problem to multiple stages of packet queuing problems, FBP guarantees the provisioned bandwidth and delay performance for each flow.
Finally, although DCNs are typically provisioned with full bisection bandwidth, DCN traffic demonstrates fluctuating patterns; we therefore propose a joint host-network optimization scheme to enhance the energy efficiency of DCNs during off-peak traffic hours. The proposed scheme utilizes a unified representation method that converts the VM placement problem to a routing problem and employs depth-first and best-fit search to find efficient paths for flows.
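The abstract says M3SBP "calculates an equivalent deterministic value for the stochastic demands". One common way to do this, used here purely as an assumed illustration and not necessarily the dissertation's exact formula, is to reserve the mean plus a quantile multiple of the standard deviation under a Gaussian demand model:

```python
def equivalent_demand(mean, std, z=1.645):
    """Equivalent deterministic demand for a stochastic VM demand,
    assuming a Gaussian model: mean + z * std covers the demand with
    probability Phi(z) (z = 1.645 -> roughly 95%). This is a common
    heuristic, not necessarily the paper's exact method."""
    return mean + z * std

# Hypothetical VM CPU demands as (mean, std) pairs, in cores
vms = [(2.0, 0.5), (1.0, 0.2)]
total = sum(equivalent_demand(m, s) for m, s in vms)
print(total)
```

Once each stochastic demand is collapsed to a single number, placement reduces to an ordinary multidimensional bin-packing decision against server capacities.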

Relevance: 60.00%

Abstract:

The thesis focuses on a central theme of the epidemiology and health economics of ankle sprains to inform health policy and the provision of health services. It describes the burden, prognosis, resource utilization, and costs attributed to these injuries. The first manuscript systematically reviewed 34 studies on the direct and indirect costs of treating ankle and foot injuries. The overall costs per patient ranged from $2,075-$3,799 (2014 USD) for ankle sprains; $290-$20,132 for ankle fractures; and $6,345-$45,731 for foot fractures, reflecting differences in injury severity, treatment methods, and study characteristics. The second manuscript provided an epidemiological and economic profile of non-fracture ankle and foot injuries in Ontario using linked databases from the Institute for Clinical Evaluative Sciences. The incidence rate of ankle sprains was 16.9/1,000 person-years. Annually, ankle and foot injuries cost $21,685,876 (2015 CAD). The mean expenses per case were $99.98 (95% CI, $99.70-100.26) for any injury. Costs ranged from $133.78-$210.75 for ankle sprains and $1,497.12-$1,755.69 for dislocations. The third manuscript explored the impact of body mass index on recovery from medically attended grade 1 and 2 ankle sprains using the Foot and Ankle Outcome Score. Data came from a randomized controlled trial of a physiotherapy intervention in Kingston, Ontario. At six months, the odds ratio of recovery for participants with obesity was 0.60 (0.37-0.97) before adjustment and 0.74 (0.43-1.29) after adjustment compared to non-overweight participants. The fourth manuscript used trial data to examine the health-related quality of life among ankle sprain patients using the Health Utilities Index version 3 (HUI-3). The greatest improvements in scores were seen at one month post-injury (HUI-3: 0.88, 95% CI: 0.86-0.90). Individuals with grade 2 sprains had significantly lower ambulation scores than those with grade 1 sprains (0.70 vs. 0.84; p<0.05).
The final manuscript used trial data to describe the financial burden (direct and indirect costs) of ankle sprains. The overall mean costs were $1,508 (SD: $1,452) at one month and increased to $2,206 (SD: $3,419) at six months. Individuals with more severe injuries at baseline had significantly higher (p<0.001) costs compared to individuals with less severe injuries, after controlling for confounders.
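The third manuscript reports an unadjusted odds ratio of recovery of 0.60 for participants with obesity. As a minimal illustration of how an odds ratio is computed from a 2x2 table, the following sketch uses hypothetical counts chosen to reproduce that value; they are not the trial's data:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
        a = exposed & recovered,    b = exposed & not recovered,
        c = unexposed & recovered,  d = unexposed & not recovered.
    OR = (a/b) / (c/d)."""
    return (a / b) / (c / d)

# Hypothetical counts: recovery at six months by obesity status
print(round(odds_ratio(30, 50, 60, 60), 2))  # (30/50) / (60/60) = 0.6
```

An OR below 1 means the exposed group (here, participants with obesity) had lower odds of recovery; the trial's adjusted estimate of 0.74 with a CI crossing 1 was no longer statistically significant.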

Relevance: 60.00%

Abstract:

Simulating the efficiency of business processes could reveal crucial bottlenecks for manufacturing companies and could lead to significant optimizations resulting in decreased time to market, more efficient resource utilization, and larger profit. While such business optimization software is widely utilized by larger companies, SMEs typically do not have the required expertise and resources to efficiently exploit these advantages. The aim of this work is to explore how simulation software vendors and consultancies can extend their portfolio to SMEs by providing business process optimization based on a cloud computing platform. By executing simulation runs on the cloud, software vendors and associated business consultancies can get access to large computing power and data storage capacity on demand, run large simulation scenarios on behalf of their clients, analyze simulation results, and advise their clients regarding process optimization. The solution is mutually beneficial for both vendor/consultant and the end-user SME. End-user companies will only pay for the service without requiring large upfront costs for software licenses and expensive hardware. Software vendors can extend their business towards the SME market with potentially huge benefits.

Relevance: 60.00%

Abstract:

Network Virtualization is a key technology for the Future Internet, allowing the deployment of multiple independent virtual networks that use resources of the same basic infrastructure. An important challenge in the dynamic provision of virtual networks resides in the optimal allocation of physical resources (nodes and links) to the requirements of virtual networks. This problem is known as Virtual Network Embedding (VNE). For the resolution of this problem, previous research has focused on designing algorithms based on the optimization of a single objective. In contrast, in this work we present a multi-objective algorithm, called VNE-MO-ILP, for solving the dynamic VNE problem, which calculates an approximation of the Pareto front considering resource utilization and load balancing simultaneously. Experimental results show evidence that the proposed algorithm is better than, or at least comparable to, a state-of-the-art algorithm. Two performance metrics were simultaneously evaluated: (i) Virtual Network Request Acceptance Ratio and (ii) Revenue/Cost Relation. The size of the test networks used in the experiments shows that the proposed algorithm scales well in execution time, for networks of up to 84 nodes.
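The core of any Pareto-front approximation is the dominance test between candidate solutions. A minimal sketch for two maximized objectives, using hypothetical (acceptance ratio, revenue/cost) pairs rather than results from the paper:

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b, assuming both objectives
    are maximized: a is at least as good in every objective and
    strictly better in at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(solutions):
    """Non-dominated subset of the candidate solutions."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical (acceptance ratio, revenue/cost) pairs for candidate embeddings
candidates = [(0.80, 0.70), (0.75, 0.78), (0.70, 0.60), (0.85, 0.65)]
print(pareto_front(candidates))
```

Here (0.70, 0.60) is dominated by (0.80, 0.70) and is dropped, while the remaining three candidates trade one objective against the other and together approximate the front a decision maker would choose from.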