919 resultados para Total Cost Management
Resumo:
Next-generation integrated wireless local area network (WLAN) and 3G cellular networks aim to take advantage of the roaming ability of a cellular network and the high data rate services of a WLAN. To ensure successful implementation of an integrated network, many issues must be carefully addressed, including network architecture design, resource management, quality-of-service (QoS), call admission control (CAC), and mobility management. This dissertation focuses on QoS provisioning, CAC, and network architecture design in the integration of WLANs and cellular networks. First, a new scheduling algorithm and a call admission control mechanism for the IEEE 802.11 WLAN are presented to support multimedia services with QoS provisioning. The proposed scheduling algorithm makes use of idle system time to reduce the average packet loss of real-time (RT) services. The admission control mechanism provides long-term transmission quality for both RT and non-real-time (NRT) services by ensuring the packet loss ratio for RT services and the throughput for NRT services. Second, a joint CAC scheme is proposed to efficiently balance traffic load in the integrated environment. A channel searching and replacement algorithm (CSR) is developed to relieve traffic congestion in the cellular network by using idle channels in the WLAN. The CSR is optimized to minimize the system cost in terms of the blocking probability in the interworking environment. Specifically, it is proved that there exists an optimal admission probability for passive handoffs that minimizes the total system cost, and a method for finding this probability is designed based on linear-programming techniques. Finally, a new integration architecture, Hybrid Coupling with Radio Access System (HCRAS), is proposed to lower the average cost of intersystem communication (IC) and the vertical handoff latency.
An analytical model is presented to evaluate the system performance of HCRAS in terms of the intersystem communication cost function and the handoff cost function. Based on this model, an algorithm is designed to determine the optimal route for each intersystem communication. Additionally, a fast handoff algorithm is developed to reduce the vertical handoff latency.
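The claim above, that a single admission probability for passive handoffs minimizes total system cost, can be illustrated in miniature. The cost model below is invented purely for demonstration (the dissertation's actual cost function involves blocking probabilities in the interworking environment); the sketch only shows how a one-dimensional search locates the minimizer of a convex cost over the admission probability.

```python
# Hypothetical illustration: find the admission probability p in [0, 1]
# that minimizes a total system cost. The cost model is invented for
# demonstration only and is not the dissertation's model.

def total_cost(p):
    # Toy trade-off: admitting more passive handoffs (higher p) lowers
    # cellular blocking cost but raises WLAN congestion cost.
    cellular_blocking = 0.4 * (1.0 - p) ** 2
    wlan_congestion = 0.3 * p ** 2
    return cellular_blocking + wlan_congestion

def ternary_search(f, lo=0.0, hi=1.0, tol=1e-6):
    # Valid for unimodal (e.g. convex) cost functions on [lo, hi].
    while hi - lo > tol:
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

p_star = ternary_search(total_cost)  # analytic minimizer here is 4/7
```

For this toy cost, setting the derivative to zero gives p* = 4/7; any unimodal cost over a single admission probability can be minimized the same way.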
Resumo:
E=MC³: Energy Equals Management's Continued Cost Concern is an essay written by Fritz G. Hagenmeyer, Associate Professor, School of Hospitality Management at Florida International University. In the writing, Hagenmeyer initially tenders: “Energy problems in the hospitality industry can be contained or reduced, yielding elevated profits as a result of applied, quality management principles. The concepts, processes and procedures presented in this article are intended to aid present and future managers to become more effective with a sharpened focus on profitability.” This article is an overview of energy efficiency and its management. In an expanding energy consumption market with its escalating costs, energy management has become an ever-increasing concern and component of responsible hospitality management, Hagenmeyer will have you know. “In endeavoring to ‘manage’ on a day-to-day basis a functioning hospitality building's energy system, the person in charge must take on the role of Justice with her scales, attempting to balance the often varying comfort needs of guests and occupants with the invariably rising costs of energy utilized to generate and maintain such comfort conditions, since comfort is seen as an integral part of the ‘service,’ ‘product,’ or ‘price/value’ perception of patrons,” says Hagenmeyer. In contrast to what was thought at the midpoint of this century, that energy would be abundant and cheap, the reality has set in that this is not the case; not by a long shot. The author wants you to be aware that energy costs in buildings are a force to be reckoned with; a major expense to be sure. “Since 1973, ‘energy-conscious design’ has begun to become part of the repertoire of architects, design engineers, and construction companies,” Hagenmeyer states.
“For instance, whereas office buildings of the early 1970s might have used 400,000 British Thermal Units (BTUs) per square foot year, new buildings are going up that use 55,000 to 65,000 BTUs per square foot year,” Hagenmeyer, like an incandescent bulb, illuminates you. Hagenmeyer references Robert E. Aulbach's article, Energy Management, when informing you that the hospitality manager should not become complacent in addressing the energy cost issue, but should and must maintain a diligent focus on the problem. Hagenmeyer also makes reference to the Middle East War and to OPEC, and their influence on energy prices. In closing, Hagenmeyer suggests an Energy Management Action Plan, which he outlines for you.
Resumo:
The Mara River in East Africa is currently experiencing poor water quality and increased fluctuations in seasonal flow. This study investigated technically effective and economically viable Best Management Practices (BMPs) for adoption in the Mara River Basin of Kenya that can stop further water resources degradation. A survey of 155 farmers was conducted in the upper catchment of the Kenyan side of the river basin. Farmers provided their assessment of BMPs that would best suit their farms in terms of water quality improvement, economic feasibility, and technical suitability. Cost data on different practices were collected from farmers and published literature. The results indicated that erosion control structures and runoff management practices were most suitable for adoption. The study estimated the total area that would need to be improved to restore water quality and reduce further water resources degradation. Farmers were found to incur losses from adopting new practices and would therefore require monetary support.
Resumo:
This paper presents a study on the implementation of Real-Time Pricing (RTP) based Demand Side Management (DSM) of water pumping at a clean water pumping station in Northern Ireland, with the intention of minimising electricity costs and maximising the usage of electricity from wind generation. A Genetic Algorithm (GA) was used to create pumping schedules based on system constraints and electricity tariff scenarios. Implementation of this method would allow the water network operator to make significant savings on electricity costs while also helping to mitigate the variability of wind generation.
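The GA-based scheduling idea described above can be sketched in miniature. The tariff vector, minimum-pumping constraint, and GA parameters below are invented for illustration and are not taken from the paper; a real deployment would encode hydraulic constraints, reservoir levels, and actual RTP data.

```python
import random

# Hypothetical sketch of GA-based pump scheduling under a time-of-use
# tariff. All numbers are invented for demonstration.

TARIFF = [0.05, 0.04, 0.04, 0.06, 0.10, 0.15, 0.18, 0.12]  # price per slot
MIN_ON_SLOTS = 4          # pump must run at least this many slots
PUMP_KWH_PER_SLOT = 50.0  # energy drawn per running slot

def cost(schedule):
    # Electricity cost plus a heavy penalty for pumping too little.
    energy = sum(price * PUMP_KWH_PER_SLOT
                 for price, on in zip(TARIFF, schedule) if on)
    shortfall = max(0, MIN_ON_SLOTS - sum(schedule))
    return energy + 1000.0 * shortfall

def evolve(pop_size=40, generations=60, mutation=0.1, seed=1):
    rng = random.Random(seed)
    n = len(TARIFF)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]    # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)       # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mutation:     # flip one random bit
                i = rng.randrange(n)
                child[i] = 1 - child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
```

With this toy tariff the GA converges toward running the pump in the cheapest slots while satisfying the minimum-volume constraint; the same chromosome encoding extends naturally to multi-pump stations.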
Resumo:
Teachers frequently struggle to cope with conduct problems in the classroom. The aim of this study was to assess the effectiveness of the Incredible Years Teacher Classroom Management Training Programme for improving teacher competencies and child adjustment. The study involved a group randomised controlled trial which included 22 teachers and 217 children (102 boys and 115 girls). The average age of children included in the study was 5.3 years (standard deviation = 0.89). Teachers were randomly allocated to an intervention group (n = 11 teachers; 110 children) or a waiting-list control group (n = 11; 107 children). The sample also included 63 ‘high-risk’ children (33 intervention; 30 control), who scored above the cut-off (>12) on the Strengths and Difficulties Questionnaire for abnormal socioemotional and behavioural difficulties. Teacher and child behaviours were assessed at baseline and 6 months later using psychometric and observational measures. Programme delivery costs were also analysed. Results showed an increase in teachers’ self-reported use of positive classroom management strategies (effect size = 0.56) and a reduction in their use of negative classroom management strategies (effect size = −0.43). Teacher reports also highlight improvements in the classroom behaviour of the high-risk group of children, while the estimated cost of delivering the Incredible Years Teacher Classroom Management Training Programme was modest. However, analyses of teacher and child observations were largely non-significant. A need for further research exploring the effectiveness and cost-effectiveness of the Incredible Years Teacher Classroom Management Training Programme is indicated.
Resumo:
Thesis (Master's)--University of Washington, 2016-06
Resumo:
BACKGROUND: Total hip replacements (THRs) and total knee replacements (TKRs) are common elective procedures. In the REsearch STudies into the ORthopaedic Experience (RESTORE) programme, we explored the care and experiences of patients with osteoarthritis after being listed for THR and TKR up to the time when an optimal outcome should be expected. OBJECTIVE: To undertake a programme of research studies to work towards improving patient outcomes after THR and TKR. METHODS: We used methodologies appropriate to research questions: systematic reviews, qualitative studies, randomised controlled trials (RCTs), feasibility studies, cohort studies and a survey. Research was supported by patient and public involvement. RESULTS: Systematic review of longitudinal studies showed that moderate to severe long-term pain affects about 7–23% of patients after THR and 10–34% after TKR. In our cohort study, 10% of patients with hip replacement and 30% with knee replacement showed no clinically or statistically significant functional improvement. In our review of pain assessment, few research studies used measures to capture the incidence, character and impact of long-term pain. Qualitative studies highlighted the importance of support by health and social professionals for patients at different stages of the joint replacement pathway. Our review of longitudinal studies suggested that patients with poorer psychological health, physical function or pain before surgery had poorer long-term outcomes and may benefit from pre-surgical interventions. However, uptake of a pre-operative pain management intervention was low. Although evidence relating to patient outcomes was limited, comorbidities are common and may lead to an increased risk of adverse events, suggesting the possible value of optimising pre-operative management. 
The evidence base on clinical effectiveness of pre-surgical interventions, occupational therapy and physiotherapy-based rehabilitation relied on small RCTs but suggested short-term benefit. Our feasibility studies showed that definitive trials of occupational therapy before surgery and post-discharge group-based physiotherapy exercise are feasible and acceptable to patients. Randomised trial results and systematic review suggest that patients with THR should receive local anaesthetic infiltration for the management of long-term pain, but in patients receiving TKR it may not provide additional benefit to femoral nerve block. From an NHS and Personal Social Services perspective, local anaesthetic infiltration was a cost-effective treatment in primary THR. In qualitative interviews, patients and health-care professionals recognised the importance of participating in the RCTs. To support future interventions and their evaluation, we conducted a study comparing outcome measures and analysed the RCTs as cohort studies. Analyses highlighted the importance of different methods in treating and assessing hip and knee osteoarthritis. There was an inverse association between radiographic severity of osteoarthritis and pain and function in patients waiting for TKR but no association in THR. Different pain characteristics predicted long-term pain in THR and TKR. Outcomes after joint replacement should be assessed with a patient-reported outcome and a functional test. CONCLUSIONS: The RESTORE programme provides important information to guide the development of interventions to improve long-term outcomes for patients with osteoarthritis receiving THR and TKR. Issues relating to their evaluation and the assessment of patient outcomes are highlighted. Potential interventions at key times in the patient pathway were identified and deserve further study, ultimately in the context of a complex intervention.
Resumo:
In today’s big data world, data is being produced in massive volumes, at great velocity and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are being increasingly used to derive value out of this big data. A large portion of this data is being stored and processed in the Cloud due to the several advantages provided by the Cloud, such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. 
I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has been traditionally used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud. 
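The sampling-based progressive analytics described above can be illustrated generically: the same aggregate is recomputed over progressively larger samples so that approximate answers arrive long before the full scan finishes. This sketch is not the NOW! system's actual semantics; the data set, sample sizes, and the choice of the mean as the aggregate are all invented for illustration.

```python
import random

# Generic sketch of progressive sampling: compute an aggregate (here,
# the mean) over nested samples of increasing size. Illustration only.

def progressive_mean(data, sample_sizes, seed=7):
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    estimates = []
    for k in sample_sizes:
        sample = shuffled[:k]   # each sample extends the previous one
        estimates.append(sum(sample) / len(sample))
    return estimates

data = list(range(10_000))      # true mean = 4999.5
ests = progressive_mean(data, [100, 1_000, 10_000])
```

Because the samples are nested prefixes of one shuffle, work done on an earlier sample is a strict subset of the later one, which is the property that makes reuse across samples possible.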
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly access the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
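The neighborhood-centric idea can be shown in a toy form: instead of a program that sees one vertex at a time, each task receives a whole 1-hop ego network and computes over it directly, here counting the triangles through each vertex. This is an illustration of the programming model's granularity, not NSCALE's API; the graph and helper names are invented.

```python
from itertools import combinations

# Toy neighborhood-centric analysis: each task operates on a full
# 1-hop ego network rather than a single vertex's state.

def ego_network(adj, v):
    # Subgraph induced on v and its neighbors.
    nodes = {v} | adj[v]
    return {u: adj[u] & nodes for u in nodes}

def triangles_at(adj, v):
    # Triangles through v = edges among v's neighbors.
    return sum(1 for a, b in combinations(sorted(adj[v]), 2) if b in adj[a])

adj = {
    0: {1, 2, 3},
    1: {0, 2},
    2: {0, 1, 3},
    3: {0, 2},
}
counts = {v: triangles_at(adj, v) for v in adj}
```

A vertex-centric framework would need multiple message-passing rounds to gather the same neighborhood state; materializing the ego network up front is exactly the extraction step the abstract argues for.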
Resumo:
This dissertation mainly focuses on coordinated pricing and inventory management problems; the related background is provided in Chapter 1. Several periodic-review models are then discussed in Chapters 2, 3, 4, and 5, respectively. Chapter 2 analyzes a deterministic single-product model in which a price adjustment cost is incurred whenever the current selling price is changed from that of the previous period. We develop exact algorithms for the problem under different conditions and find that computational complexity varies significantly with the cost structure. Moreover, our numerical study indicates that dynamic pricing strategies may outperform static pricing strategies even when the price adjustment cost accounts for a significant portion of the total profit. Chapter 3 develops a single-product model in which demand in a period depends not only on the current selling price but also on past prices through the so-called reference price. Strongly polynomial time algorithms are designed for the case with no fixed ordering cost, and a heuristic is proposed for the general case together with an error bound estimation. Moreover, we illustrate through numerical studies that incorporating the reference price effect into coordinated pricing and inventory models can have a significant impact on firms' profits. Chapter 4 discusses the stochastic version of the model in Chapter 3 when customers are loss averse. It extends the associated results developed in the literature and proves that the reference-price-dependent base-stock policy is optimal under certain conditions. Instead of dealing with specific problems, Chapter 5 establishes the preservation of supermodularity in a class of optimization problems. 
This property and its extensions include several existing results in the literature as special cases, and provide powerful tools as we illustrate their applications to several operations problems: the stochastic two-product model with cross-price effects, the two-stage inventory control model, and the self-financing model.
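The reference-price effect central to Chapters 3 and 4 can be sketched numerically: the reference price is commonly modeled as an exponentially smoothed memory of past prices, and demand drops when the current price exceeds it (a perceived loss) and rises when it falls below (a perceived gain). The smoothing parameter and demand coefficients below are illustrative assumptions, not the dissertation's calibrated values.

```python
# Hedged sketch of reference-price-dependent demand. All coefficients
# are invented for illustration.

def update_reference(r_prev, p, alpha=0.7):
    # Exponential smoothing: alpha in [0, 1) sets the memory length.
    return alpha * r_prev + (1 - alpha) * p

def demand(p, r, base=100.0, b=2.0, gamma=1.5):
    # Linear demand plus a reference-price term: a gain when p < r,
    # a loss when p > r.
    return base - b * p + gamma * (r - p)

r = 10.0
path = []
for p in [10.0, 12.0, 12.0, 9.0]:
    path.append(demand(p, r))
    r = update_reference(r, p)
```

Note how the second period's price hike depresses demand twice, through the price itself and through the loss term, while the final discount below the accumulated reference price boosts demand; this coupling of current profit to future reference prices is what makes pricing and inventory decisions interact dynamically.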
Resumo:
Early water resources modeling efforts were aimed mostly at representing hydrologic processes, but the need for interdisciplinary studies has led to increasing complexity and integration of environmental, social, and economic functions. The gradual shift from merely employing engineering-based simulation models to applying more holistic frameworks is an indicator of promising changes in the traditional paradigm for the application of water resources models, supporting more sustainable management decisions. This dissertation contributes to the application of a quantitative-qualitative framework for sustainable water resources management using system dynamics simulation, as well as environmental systems analysis techniques, to provide insights for water quality management in the Great Lakes basin. The traditional linear thinking paradigm lacks the mental and organizational framework for sustainable development trajectories, and may lead to quick-fix solutions that fail to address key drivers of water resources problems. To facilitate holistic analysis of water resources systems, systems thinking seeks to understand interactions among the subsystems. System dynamics provides a suitable framework for operationalizing systems thinking and its application to water resources problems by offering useful qualitative tools such as causal loop diagrams (CLD), stock-and-flow diagrams (SFD), and system archetypes. The approach provides a high-level quantitative-qualitative modeling framework for "big-picture" understanding of water resources systems, stakeholder participation, policy analysis, and strategic decision making. While quantitative modeling using extensive computer simulations and optimization is still very important and needed for policy screening, qualitative system dynamics models can improve understanding of general trends and the root causes of problems, and thus promote sustainable water resources decision making. 
Within the system dynamics framework, a growth and underinvestment (G&U) system archetype governing Lake Allegan's eutrophication problem was hypothesized to explain the system's problematic behavior and identify policy leverage points for mitigation. A system dynamics simulation model was developed to characterize the lake's recovery from its hypereutrophic state and assess a number of proposed total maximum daily load (TMDL) reduction policies, including phosphorus load reductions from point sources (PS) and non-point sources (NPS). It was shown that, for a TMDL plan to be effective, it should be considered a component of a continuous sustainability process, which considers the functionality of dynamic feedback relationships between socio-economic growth, land use change, and environmental conditions. Furthermore, a high-level simulation-optimization framework was developed to guide watershed-scale BMP implementation in the Kalamazoo watershed. Agricultural BMPs should be given priority in the watershed in order to facilitate cost-efficient attainment of Lake Allegan's total phosphorus (TP) concentration target. However, without adequate support policies, agricultural BMP implementation may adversely affect the agricultural producers. Results from a case study of the Maumee River basin show that coordinated BMP implementation across upstream and downstream watersheds can significantly improve the cost efficiency of TP load abatement.
Resumo:
This study aimed to survey farmers' knowledge and practices on the management of pastures, stocking rates and markets of meat goat-producing enterprises within New South Wales and Queensland, Australia. An interview-based questionnaire was conducted on properties that derived a significant proportion of their income from goats. The survey covered 31 landholders with a total land area of 567 177 ha and a reported total of 160 010 goats. A total of 55% (17/31) of producers were involved in both opportunistic harvesting and commercial goat operations, and 45% (14/31) were specialised seedstock producers. Goats were the most important livestock enterprise on 55% (17/31) of surveyed properties. Stocking rate varied considerably (0.3–9.3 goats/ha) within and across surveyed properties and was found to be negatively associated with property size and positively associated with rainfall. Overall, 81% (25/31) of producers reported that the purpose of running goats on their properties was to target international markets. Producers also cited the importance of targeting markets as a way to increase profitability. Fifty-three percent of producers were located over 600 km from a processing plant, and the high cost of freight can limit the continuity of goats supplied to abattoirs. Fencing was an important issue for goat farmers, with many producers acknowledging this could potentially add to capital costs associated with better goat management and production. Producers in the pastoral regions appear to have a low investment in pasture development, and opportunistic goat harvesting appears to be an important source of income.
Resumo:
In this thesis, the optimal operation of a neighborhood of smart households in terms of minimizing the total energy cost is analyzed. Each household may comprise several assets such as electric vehicles, controllable appliances, energy storage and distributed generation. Bi-directional power flow is considered for each household. Apart from the distributed generation unit, technological options such as vehicle-to-home and vehicle-to-grid are available to provide energy to cover self-consumption needs and to export excess energy to other households, respectively.
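A minimal sketch of the cost-minimization idea for one such asset: scheduling an electric vehicle's required charge into the cheapest hours of a known price vector. The prices, charger rating, and energy requirement below are invented; the thesis's actual model jointly optimizes many assets and households, typically via mathematical programming rather than this greedy rule.

```python
import math

# Illustrative only: place an EV's required charge into the cheapest
# slots of a day-ahead price vector. All parameters are invented.

PRICES = [0.30, 0.12, 0.08, 0.07, 0.09, 0.25]  # price per kWh, per slot
CHARGER_KWH_PER_SLOT = 7.0
REQUIRED_KWH = 20.0

def cheapest_schedule(prices, required_kwh, kwh_per_slot):
    slots_needed = math.ceil(required_kwh / kwh_per_slot)
    order = sorted(range(len(prices)), key=lambda i: prices[i])
    chosen = sorted(order[:slots_needed])   # cheapest slots, in time order
    total = sum(prices[i] * kwh_per_slot for i in chosen)
    return chosen, total

slots, total_cost = cheapest_schedule(PRICES, REQUIRED_KWH,
                                      CHARGER_KWH_PER_SLOT)
```

For a single flexible load against a fixed tariff, sorting by price is optimal; interactions between households (shared feeders, export limits, vehicle-to-grid) are what force the full problem into an optimization model.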
Resumo:
To evaluate the occurrence of severe obstetric complications associated with antepartum and intrapartum hemorrhage among women from the Brazilian Network for Surveillance of Severe Maternal Morbidity. Multicenter cross-sectional study. Twenty-seven obstetric referral units in Brazil between July 2009 and June 2010. A total of 9555 women categorized as having obstetric complications. The occurrence of potentially life-threatening conditions, maternal near miss and maternal deaths associated with antepartum and intrapartum hemorrhage was evaluated. Sociodemographic and obstetric characteristics and the use of criteria for management of severe bleeding were also assessed in these women. Prevalence ratios with their respective 95% confidence intervals, adjusted for the cluster effect of the design, were calculated, and multiple logistic regression analysis was performed to identify factors independently associated with the occurrence of severe maternal outcome. Antepartum and intrapartum hemorrhage occurred in only 8% (767) of women experiencing any type of obstetric complication. However, it was responsible for 18.2% (140) of maternal near miss and 10% (14) of maternal death cases. On multivariate analysis, maternal age and previous cesarean section were shown to be independently associated with an increased risk of severe maternal outcome (near miss or death). Severe maternal outcome due to antepartum and intrapartum hemorrhage was highly prevalent among Brazilian women. Certain risk factors, maternal age and previous cesarean delivery in particular, were associated with the occurrence of bleeding.