926 results for Static average-case analysis


Relevance:

100.00%

Publisher:

Abstract:

Subtitle D of the Resource Conservation and Recovery Act (RCRA) requires a post-closure period of 30 years for non-hazardous waste landfills. Post-closure care (PCC) activities under Subtitle D include leachate collection and treatment, groundwater monitoring, inspection and maintenance of the final cover, and monitoring to ensure that landfill gas does not migrate off site or into on-site buildings. The decision to reduce PCC duration requires exploration of a performance-based methodology for Florida landfills. PCC should be based on whether the landfill poses a threat to human health or the environment. Historically, no risk-based procedure has been available to establish an early end to PCC. Landfill stability depends on a number of factors, including variables related to operations both before and after the closure of a landfill cell. Therefore, PCC decisions should be based on location-specific factors, operational factors, design factors, post-closure performance, end use, and risk analysis. Determining an appropriate PCC period for Florida's landfills requires in-depth case studies focusing on the analysis of performance data from closed landfills in Florida. Based on data availability, Davie Landfill was identified as the case-study site for a case-by-case analysis of landfill stability. The performance-based PCC decision system developed by Geosyntec Consultants was used to assess site conditions and project PCC needs. The available data on leachate and gas quantity and quality, groundwater quality, and cap conditions were evaluated. The quality and quantity data for leachate and gas were analyzed to project the levels of pollutants in leachate and groundwater relative to the maximum contaminant level (MCL). In addition, the projected gas quantity was estimated. A set of contaminants (including metals and organics) detected in groundwater was identified for health-risk assessment.
These contaminants were selected based on their detection frequency and levels in leachate and groundwater, and on their historical and projected trends. During the evaluations, a range of discrepancies and problems related to data collection and documentation were encountered, and possible solutions were proposed. Based on the results of PCC performance assessment integrated with risk assessment, future PCC monitoring needs and sustainable waste-management options were identified. According to these results, landfill gas monitoring can be terminated, while leachate and groundwater monitoring for parameters above the MCL, and surveying of cap integrity, should continue. The parameters that necessitate longer monitoring periods can be eliminated in future sustainable landfills. In conclusion, the 30-year PCC period can be reduced for some landfill components based on their potential impacts on human health and the environment (HH&E).

Relevance:

100.00%

Publisher:

Abstract:

To promote regional or mutual improvement, numerous interjurisdictional efforts to share tax bases have been attempted. Most of these efforts fail to be consummated. Motivations to share revenues include narrowing fiscal disparities, enhancing regional cooperation and economic development, rationalizing land use, and minimizing revenue losses caused by competition to attract and keep businesses. Various researchers have developed theories to aid understanding of why interjurisdictional cooperation efforts succeed or fail. Walter Rosenbaum and Gladys Kammerer studied two contemporaneous Florida local-government consolidation attempts. Boyd Messinger subsequently tested their Theory of Successful Consolidation on nine consolidation attempts. Paul Peterson's dual theories of modern federalism posit that all governmental levels attempt to further economic development and that politicians act in ways that either further their futures or cement job security; actions related to the latter theory often interfere with the former. Samuel Nunn and Mark Rosentraub sought to learn how interjurisdictional cooperation evolves, and through multiple case studies they developed a model framing interjurisdictional cooperation in four dimensions. This dissertation investigates the ability of the above theories to help predict the success or failure of regional tax-base revenue-sharing attempts. A research plan was formed that used five sequenced steps to gather data, analyze it, and conclude whether hypotheses concerning the application of these theories were valid. The primary analytical tools were multiple case studies, cross-case analysis, and pattern matching. Data were gathered from historical records, questionnaires, and interviews. The results of this research indicate that the Rosenbaum-Kammerer theory can be a predictor of success or failure in implementing tax-base revenue sharing if it is amended as suggested by Messinger and further modified by a recommendation in this dissertation.
Peterson's functional and legislative theories, considered together, were able to predict revenue-sharing proposal outcomes. Many of the indicators of interjurisdictional cooperation forwarded in the Nunn-Rosentraub model appeared in the cases studied, but the model was not a reliable forecasting instrument.

Relevance:

100.00%

Publisher:

Abstract:

An increasing number of people with terminal cancer are being cared for at home, often by their partner. This study explores the identity, experiences and relationships of people caring for their partner at the end of life and how they construct their experience through personal and couple narratives. It draws upon dialogical approaches to narrative analysis to focus on caring partners and the care relationship. Six participants were recruited for the study. Two methods of data collection are used: narrative interviews and journals. Following individual case analysis, two methods of cross-narrative analysis are used: an analysis of narrative themes and an identification of narrative types. The key findings can be summarised as follows. First, in the period since their partner's terminal prognosis, participants sustained and reconstructed self and couple relationship narratives. These narratives aided the construction of meaning and coherence at a time of major biographical disruption: the anticipated loss of a partner. Second, the study highlights the complexity of spoken and unspoken narratives in terminal cancer and how these relate to individual and couple identities. Third, a typology of archetypal narratives based upon the data is identified. The blow-by-blow narratives illustrate how participants sought to construct coherence and meaning in the illness story, while champion and resilience narratives demonstrate how participants utilised positive self and relational narratives to manage a time of biographical disruption. The study highlights how this narrative approach can enhance understanding of the experiences and identities of people caring for a terminally ill partner.

Relevance:

100.00%

Publisher:

Abstract:

Skepticism of promised value-added is forcing suppliers to provide tangible evidence of the value they can deliver to customers in industrial markets. Despite this, quantifying customer benefits is considered one of the most difficult parts of business-to-business selling. The objective of this research is to identify the desired and perceived customer benefits of KONE JumpLift™ and to improve the overall customer-value quantification and selling process for the solution. The study was conducted as a qualitative case analysis comprising seven interviews with key stakeholders from three different market areas, chosen based on where the offering has been utilized; five interviews were conducted by telephone and two by email. The main desired and perceived benefits span several value types, for example economic, functional, symbolic, and epistemic value, but they vary across the studied market areas. The most important result of the research was identifying the biggest challenges in selling the offering: communicating and proving its potential value to customers. In addition, the sales arguments differ in relative importance across the studied market areas, which makes it challenging for salespeople to sell the offering effectively. At the managerial level, this implies a need to invest in a new sales tool and in training for salespeople.

Relevance:

100.00%

Publisher:

Abstract:

Even though the use of recommender systems is already widespread in several application areas, the accessibility research field still lacks such studies. One attempt to apply the benefits of recommender systems to accessibility needs is Vulcanus. The Vulcanus recommender system uses similarity analysis to compare users' trails; in this way, it can take advantage of a user's past behavior to deliver personalized content and services. Vulcanus combines concepts from ubiquitous computing, such as user profiles, context awareness, trail management, and similarity analysis, and it uses two different approaches for trail-similarity analysis: resource patterns and category patterns. In this work we performed an asymptotic analysis to identify the complexity of Vulcanus's algorithm. We also propose improvements achieved through dynamic programming: the ordinary case is improved by a bottom-up approach under which many unnecessary comparisons can be skipped. Vulcanus 2.0 is presented with these improvements to its average-case scenario.
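The abstract credits the average-case improvement to a bottom-up dynamic-programming approach. As a hedged sketch of that idea (the actual Vulcanus similarity measure is not specified here; longest common subsequence stands in for trail comparison), the example below fills a DP table bottom-up, avoiding the recomputation of overlapping subproblems that a naive recursive comparison would repeat:

```python
def trail_similarity(a, b):
    """Longest-common-subsequence length via bottom-up dynamic programming.

    Each trail is a sequence of visited resources/categories; dp[i][j] holds
    the LCS length of the first i items of `a` and the first j items of `b`,
    so every subproblem is solved exactly once.
    """
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

# Hypothetical trails: sequences of places visited by two users.
trail_u = ["lab", "library", "cafe", "lab"]
trail_v = ["library", "lab", "cafe"]
print(trail_similarity(trail_u, trail_v))
```

The bottom-up table costs O(m·n) time and space, whereas the naive recursion is exponential in the worst case; that is the flavor of average-case gain the abstract describes.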

Relevance:

100.00%

Publisher:

Abstract:

In the standard Vehicle Routing Problem (VRP), we route a fleet of vehicles to deliver the demands of all customers such that the total distance traveled by the fleet is minimized. In this dissertation, we study variants of the VRP that minimize the completion time, i.e., we minimize the distance of the longest route. We call it the min-max objective function. In applications such as disaster relief efforts and military operations, the objective is often to finish the delivery or the task as soon as possible, not to plan routes with the minimum total distance. Even in commercial package delivery nowadays, companies are investing in new technologies to speed up delivery instead of focusing merely on the min-sum objective. In this dissertation, we compare the min-max and the standard (min-sum) objective functions in a worst-case analysis to show that the optimal solution with respect to one objective function can be very poor with respect to the other. The results motivate the design of algorithms specifically for the min-max objective. We study variants of min-max VRPs including one problem from the literature (the min-max Multi-Depot VRP) and two new problems (the min-max Split Delivery Multi-Depot VRP with Minimum Service Requirement and the min-max Close-Enough VRP). We develop heuristics to solve these three problems. We compare the results produced by our heuristics to the best-known solutions in the literature and find that our algorithms are effective. In the case where benchmark instances are not available, we generate instances whose near-optimal solutions can be estimated based on geometry. We formulate the Vehicle Routing Problem with Drones and carry out a theoretical analysis to show the maximum benefit from using drones in addition to trucks to reduce delivery time. The speed-up ratio depends on the number of drones loaded onto one truck and the speed of the drone relative to the speed of the truck.
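The worst-case gap between the min-sum and min-max objectives can be seen even on a toy instance. The brute-force sketch below (illustrative only; not the dissertation's heuristics) places four customers around a depot and shows that the partition minimizing total distance leaves its longest route well above the best achievable min-max value:

```python
import itertools
import math

depot = (0.0, 0.0)
# Four customers on the unit circle around the depot (invented instance).
customers = [(1, 0), (0, 1), (-1, 0), (0, -1)]

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def best_tour(points):
    """Shortest depot-to-depot tour over a subset (brute force over orders)."""
    if not points:
        return 0.0
    best = math.inf
    for perm in itertools.permutations(points):
        stops = [depot, *perm, depot]
        best = min(best, sum(dist(a, b) for a, b in zip(stops, stops[1:])))
    return best

# Enumerate every split of the customers between two vehicles.
solutions = []
for mask in range(2 ** len(customers)):
    s1 = [c for i, c in enumerate(customers) if mask >> i & 1]
    s2 = [c for i, c in enumerate(customers) if not mask >> i & 1]
    r1, r2 = best_tour(s1), best_tour(s2)
    solutions.append((r1 + r2, max(r1, r2)))  # (min-sum value, min-max value)

min_sum_sol = min(solutions, key=lambda s: s[0])  # best total distance
min_max_sol = min(solutions, key=lambda s: s[1])  # best longest route
print(min_sum_sol, min_max_sol)
```

Here the min-sum optimum sends one vehicle to all four customers (total 2+3√2), so its longest route is about 1.83 times the min-max optimum (2+√2, achieved by splitting adjacent pairs between the two vehicles), illustrating why the two objectives call for different algorithms.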

Relevance:

100.00%

Publisher:

Abstract:

This paper empirically investigates volatility transmission among stock and foreign exchange markets in seven major world economies over the period July 1988 to January 2015. To this end, we first perform a static and dynamic analysis to measure total volatility connectedness over the entire period (the system-wide approach) using a framework recently proposed by Diebold and Yilmaz (2014). Second, we make use of a dynamic analysis to evaluate the net directional connectedness for each market. To gain further insights, we examine the time-varying behaviour of net pairwise directional connectedness during the financial turmoil periods experienced in the sample period. Our results suggest that slightly more than half of the total variance of the forecast errors is explained by shocks across markets rather than by idiosyncratic shocks. Furthermore, we find that volatility connectedness varies over time, with a surge during periods of increasing economic and financial instability.
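A minimal sketch of the system-wide and net directional connectedness measures described above, using an invented 3-market variance-decomposition matrix (the actual paper estimates this matrix from data within the Diebold-Yilmaz framework):

```python
# Hypothetical forecast-error variance decomposition for three markets.
# D[i][j] = share of market i's forecast-error variance attributed to
# shocks in market j (each row sums to 1).
D = [
    [0.70, 0.20, 0.10],
    [0.25, 0.60, 0.15],
    [0.15, 0.25, 0.60],
]
N = len(D)

# Total connectedness: average cross-market (off-diagonal) share, in percent.
total = 100 * sum(D[i][j] for i in range(N) for j in range(N) if i != j) / N

# Net directional connectedness of market j: volatility it transmits to
# others (off-diagonal column sum) minus what it receives from others
# (off-diagonal row sum), in percent. Positive = net transmitter.
net = [
    100 * (sum(D[i][j] for i in range(N) if i != j)
           - sum(D[j][i] for i in range(N) if i != j))
    for j in range(N)
]
print(round(total, 1), [round(x, 1) for x in net])
```

With this invented matrix, cross-market shocks explain roughly a third of forecast-error variance; the paper's finding of "slightly more than half" corresponds to a total connectedness index above 50.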

Relevance:

100.00%

Publisher:

Abstract:

This paper tests the four-phase heuristic model of change in resource management regimes developed by Gunderson et al. (1995. In: Barriers and Bridges to the Renewal of Ecosystems and Institutions. Columbia University Press, New York, pp. 489-533) by applying it to a case analysis of rainforest management in northeastern Australia. The model suggests that resource management regimes change in four phases: (i) crisis caused by external factors, (ii) a search for alternative management solutions, (iii) creation of a new management regime, and (iv) bureaucratic implementation of the new arrangements. The history of human use and management of the tropical forests of this region is described and applied to this model. The ensuing analysis demonstrates that: (i) resource management tends to be characterized by a series of distinct eras; (ii) changes to management regimes are precipitated by crisis; and (iii) change is externally generated. The paper concludes by arguing that this theoretical perspective on institutional change in resource management systems has wider utility. (C) 2002 Elsevier Science Ltd. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

State-owned companies are important corporate organizations that can serve the public administrator in carrying out economic activities that constitute a relevant collective interest or a matter of national security. In order to stay within the limits on public investment and to comply with the principle of efficiency, they may use the corporate and contractual management mechanisms available to companies, whether state-owned or private. Corporate restructuring movements, mechanisms for control and the sharing of control, and the adoption of corporate governance practices are legal alternatives available to the managers of state-owned companies. The case of Companhia Paranaense de Energia Elétrica (Copel) is analyzed in this article to help illustrate the application of these institutions.

Relevance:

100.00%

Publisher:

Abstract:

The role of middle management is essential when managing integrative and emergent strategy-formation processes. We highlight the importance of its role in connecting the micro and macro organizational levels, an important contribution when examining the strategy-as-practice perspective and the integrative strategy-formation process. The main goal of this research is to analyse the relationship between the integrative strategy-formation process and the roles of middle management under the strategy-as-practice perspective. To examine this, we adopted a qualitative methodology, conducting a case analysis in a Spanish university. Data were collected by means of personal interviews with members of different levels of the institution, document analysis, and direct observation. Among the preliminary results, we find that the university develops an integrative strategy-formation process and confers on middle management an important role that extends across the organization.

Relevance:

100.00%

Publisher:

Abstract:

A preliminary version of this paper appeared in Proceedings of the 31st IEEE Real-Time Systems Symposium, 2010, pp. 239–248.

Relevance:

100.00%

Publisher:

Abstract:

Most research work on WSNs has focused on protocols or on specific applications; there is a clear lack of easy, ready-to-use WSN technologies and tools for planning, implementing, testing, and commissioning WSN systems in an integrated fashion. While there exists a plethora of papers about network planning and deployment methodologies, to the best of our knowledge none of them helps the designer match coverage requirements with network performance evaluation. In this paper we aim to fill this gap by presenting a unified toolset, i.e., a framework able to provide a global picture of the system, from network deployment planning to system test and validation. This toolset has been designed to back up the EMMON WSN system architecture for large-scale, dense, real-time embedded monitoring. It includes tools for network deployment planning, worst-case analysis and dimensioning, protocol simulation, and automatic remote programming and hardware testing. This toolset has been paramount in validating the system architecture through DEMMON1, the first EMMON demonstrator, i.e., a 300+ node test-bed, which is, to the best of our knowledge, the largest single-site WSN test-bed in Europe to date.

Relevance:

100.00%

Publisher:

Abstract:

Consider the problem of assigning implicit-deadline sporadic tasks on a heterogeneous multiprocessor platform comprising two different types of processors; such a platform is referred to as a two-type platform. We present two low-degree polynomial time-complexity algorithms, SA and SA-P, each providing the following guarantee. For a given two-type platform and a task set, if there exists a task assignment such that tasks can be scheduled to meet deadlines by allowing them to migrate only between processors of the same type (intra-migrative), then (i) SA is guaranteed to find such an assignment, with the same restriction on task migration, given a platform in which processors are 1+α/2 times faster, and (ii) SA-P succeeds in finding a task assignment in which tasks are not allowed to migrate between processors (non-migrative), given a platform in which processors are 1+α times faster. The parameter 0<α≤1 is a property of the task set; it is the maximum of all task utilizations that are no greater than 1. We evaluate the average-case performance of both algorithms by generating task sets randomly and measuring how much faster the processors need to be (upper bounded by 1+α/2 for SA and 1+α for SA-P) for the algorithms to output a feasible task assignment (intra-migrative for SA and non-migrative for SA-P). In our evaluations, for the vast majority of task sets, these algorithms require a significantly smaller processor speedup than indicated by their theoretical bounds. Finally, we consider a special case in which no task utilization in the given task set can exceed one, and for this case we (re-)prove the performance guarantees of SA and SA-P. We show, for both algorithms, that changing the adversary from intra-migrative to a more powerful one, namely fully-migrative, in which tasks can migrate between processors of any type, does not deteriorate the performance guarantees.
For this special case, we compare the average-case performance of SA-P and a state-of-the-art algorithm by generating task sets randomly. In our evaluations, SA-P outperforms the state-of-the-art by requiring much smaller processor speedup and by running orders of magnitude faster.
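The speedup bounds above depend only on α, the largest task utilization not exceeding 1. A minimal sketch computing α and the two theoretical speedup factors for a randomly generated, hypothetical task set (the utilization range and seed are invented for illustration):

```python
import random

def alpha(utilizations):
    """α: the maximum of all task utilizations that are no greater than 1."""
    eligible = [u for u in utilizations if u <= 1]
    return max(eligible)

random.seed(7)
# Hypothetical implicit-deadline sporadic task set: one utilization per task.
task_utils = [round(random.uniform(0.05, 1.0), 2) for _ in range(8)]
a = alpha(task_utils)

# Theoretical processor-speedup factors needed by the two algorithms:
sa_bound = 1 + a / 2    # SA (intra-migrative assignment)
sa_p_bound = 1 + a      # SA-P (non-migrative assignment)
print(task_utils, a, sa_bound, sa_p_bound)
```

Because 0 < α ≤ 1, SA never needs processors more than 1.5 times faster and SA-P never more than 2 times faster; the abstract's experiments show the speedup actually required is usually well below these bounds.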

Relevance:

100.00%

Publisher:

Abstract:

Many-core platforms are an emerging technology in the real-time embedded domain. These devices offer various options for power savings and cost reductions and contribute to overall system flexibility; however, issues such as unpredictability, scalability, and analysis pessimism pose serious challenges to their integration into this area. The focus of this work is on many-core platforms using a limited migrative model (LMM). LMM is an approach based on the fundamental concepts of the multi-kernel paradigm, a promising step towards scalable and predictable many-cores. In this work, we formulate the problem of real-time application mapping on a many-core platform using LMM and propose a three-stage method to solve it. An extended version of the existing analysis is used to ensure that derived mappings (i) guarantee the fulfilment of timing constraints posed on worst-case communication delays of individual applications, and (ii) provide an environment in which to perform load balancing for, e.g., energy/thermal management, fault tolerance, and/or performance reasons.