968 results for service limit state
Abstract:
The wide diffusion of cheap, small, and portable sensors integrated into an unprecedentedly large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect vast amounts of real-time information about the environment we live in. These data streams, if properly and promptly analyzed, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in diverse domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality-of-service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and by identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation: a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, through a large experimental study on the prototype of our novel LAAR dynamic replication technique. Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
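To make the idea of differentiated QoS concrete, the following is a minimal, hypothetical Python sketch (the descriptor and function names are illustrative assumptions, not the Quasit API): a per-stream QoS descriptor drives how much replication effort the middleware spends on each data flow, which is the intuition behind partial fault-tolerance policies such as LAAR.

from dataclasses import dataclass

@dataclass
class QoSDescriptor:
    max_latency_ms: int      # latency bound requested by the application
    loss_tolerance: float    # fraction of tuples the application can afford to lose

def replication_factor(qos: QoSDescriptor, full_replicas: int = 3) -> int:
    """Spend replication effort only where application semantics demand it."""
    if qos.loss_tolerance == 0.0:
        return full_replicas     # strict fault tolerance: replicate every operator
    if qos.loss_tolerance < 0.05:
        return 2                 # partial fault tolerance: cheaper, weaker guarantee
    return 1                     # best effort: no extra copies

# Example: a critical health-care alert stream vs. a lossy entertainment feed.
critical = QoSDescriptor(max_latency_ms=100, loss_tolerance=0.0)
casual = QoSDescriptor(max_latency_ms=2000, loss_tolerance=0.10)
print(replication_factor(critical), replication_factor(casual))   # -> 3 1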
Abstract:
Context: During the past 2 decades, a major transition in the clinical characterization of psychotic disorders has occurred. The construct of a clinical high-risk (HR) state for psychosis has evolved to capture the prepsychotic phase, describing people presenting with potentially prodromal symptoms. The importance of this HR state has been increasingly recognized to such an extent that a new syndrome is being considered as a diagnostic category in the DSM-5. Objective: To reframe the HR state in a comprehensive state-of-the-art review on the progress that has been made while also recognizing the challenges that remain. Data Sources: Available HR research of the past 20 years from PubMed, books, meetings, abstracts, and international conferences. Study Selection and Data Extraction: Critical review of HR studies addressing historical development, inclusion criteria, epidemiologic research, transition criteria, outcomes, clinical and functional characteristics, neurocognition, neuroimaging, predictors of psychosis development, treatment trials, socioeconomic aspects, nosography, and future challenges in the field. Data Synthesis: Relevant articles retrieved in the literature search were discussed by a large group of leading worldwide experts in the field. The core results are presented after consensus and are summarized in illustrative tables and figures. Conclusions: The relatively new field of HR research in psychosis is exciting. It has the potential to shed light on the development of major psychotic disorders and to alter their course. It also provides a rationale for service provision to those in need of help who could not previously access it and the possibility of changing trajectories for those with vulnerability to psychotic illnesses.
Abstract:
This thesis examines two panel data sets of 48 states from 1981 to 2009 and uses ordinary least squares (OLS) and fixed-effects models to explore the relationship between rural Interstate speed limits and fatality rates, and whether rural Interstate speed limits affect non-Interstate safety. The models provide evidence that rural Interstate speed limits higher than 55 MPH lead to higher fatality rates on rural Interstates, though this effect is somewhat tempered by reductions in fatality rates on roads other than rural Interstates. These results provide some, but not unequivocal, support for the traffic diversion hypothesis that rural Interstate speed limit increases lead to decreases in fatality rates on other roads. To the author's knowledge, this paper is the first econometric study to differentiate between the effects of 70 MPH speed limits and speed limits above 70 MPH on fatality rates using a multi-state data set. Considering both rural Interstates and other roads, rural Interstate speed limit increases above 55 MPH are responsible for 39,700 net fatalities, 4.1 percent of total fatalities from 1987, the year limits were first raised, to 2009.
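As an illustration of the kind of specification the abstract describes, a generic state-and-year fixed-effects regression with hypothetical variable names (not the author's exact model) could be written as

\[ F_{it} = \beta_1 \mathbf{1}[\text{limit}_{it} = 70] + \beta_2 \mathbf{1}[\text{limit}_{it} > 70] + \gamma' X_{it} + \alpha_i + \tau_t + \varepsilon_{it} \]

where F_{it} is the fatality rate in state i and year t, the two indicators separate 70 MPH limits from limits above 70 MPH (the distinction the thesis claims to be the first to draw), X_{it} collects control variables, and \alpha_i and \tau_t absorb state and year fixed effects.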
Abstract:
The ability of anesthetic agents to provide adequate analgesia and sedation is limited by the ventilatory depression associated with overdosing in spontaneously breathing patients. Quantitation of drug-induced ventilatory depression is therefore a pharmacokinetic-pharmacodynamic problem relevant to the practice of anesthesia. Although several studies describe the effect of respiratory depressant drugs on isolated endpoints, an integrated description of drug-induced respiratory depression with parameters identifiable from clinically available data is not available. This study proposes a physiological model of CO2 disposition, ventilatory regulation, and the effects of anesthetic agents on the control of breathing. The predictive performance of the model is evaluated through simulations aimed at reproducing experimental observations of drug-induced hypercarbia and hypoventilation associated with intravenous administration of a fast-onset, highly potent anesthetic mu-agonist (including previously unpublished experimental data obtained after administration of a 1 mg alfentanil bolus). The proposed model structure has substantial descriptive capability and can provide clinically relevant predictions of respiratory inhibition in the non-steady state to enhance the safety of drug delivery in anesthetic practice.
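The abstract does not reproduce the study's equations, but the coupled relations such a model rests on can be sketched in a generic, simplified form (the symbols below are illustrative assumptions, not the authors' own formulation):

\[ \frac{dC_e}{dt} = k_{e0}\,(C_p - C_e) \quad \text{(effect-site drug concentration)} \]
\[ \dot{V}_A = \dot{V}_{A,0}\left(1 - \frac{C_e}{C_{50} + C_e}\right)\left(\frac{P_a\mathrm{CO}_2}{P_a\mathrm{CO}_{2,0}}\right)^{G} \quad \text{(CO2-driven, drug-depressed ventilatory drive)} \]
\[ \frac{dP_a\mathrm{CO}_2}{dt} \;\propto\; \dot{V}_{\mathrm{CO}_2} - \dot{V}_A\,\frac{P_a\mathrm{CO}_2}{P_B} \quad \text{(CO2 mass balance)} \]

In this sketch, a rising effect-site concentration C_e depresses alveolar ventilation, CO2 accumulates, and the elevated P_aCO2 partially restores the drive; that interplay is what produces the non-steady-state hypercarbia and hypoventilation the simulations aim to reproduce.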
Abstract:
A considerable portion of public lands in the United States is at risk of uncharacteristically severe wildfires due to a history of fire suppression. Wildfires already have detrimental impacts on the landscape and on communities in the wildland-urban interface (WUI) because of unnatural and overstocked forests. Strategies to mitigate wildfire risk include mechanical thinning and prescribed burning in areas with high wildfire risk. The material removed is often of little or no economic value. Woody biomass utilization (WBU) could offset the costs of hazardous fuel treatments if the removed material could be used for wood products, heat, or electricity production. However, barriers such as transportation costs, removal costs, and physical constraints (such as steep slopes) hinder WBU. Various federal and state policies attempt to overcome these barriers. WBU has the potential to aid in wildfire mitigation and to meet growing state mandates for renewable energy. This research uses interview data from individuals involved with on-the-ground woody biomass removal and utilization to determine how federal and state policies influence WBU. Results suggest that there is not one over-arching policy that hinders or promotes WBU; rather, WBU is hindered by organizational constraints related to the time, cost, and quality of land management agencies' actions. However, the use of stewardship contracting (a hybrid timber sale and service contract) shows promise for increased WBU, especially in states with favorable tax policies and renewable energy mandates. Policy recommendations to promote WBU include renewal of stewardship contracting legislation and a re-evaluation of land cover types suited for WBU. Potential future policies to consider include the indirect role of carbon dioxide emission reduction activities in promoting wood energy and the future impacts of air quality regulations.
Abstract:
The past decade has seen the energy consumption of servers and Internet Data Centers (IDCs) skyrocket. A recent survey estimated that worldwide spending on servers and cooling has risen above $30 billion and is likely to exceed spending on new server hardware. The rapid rise in energy consumption poses a serious threat to both energy resources and the environment, which makes green computing not only worthwhile but also necessary. This dissertation tackles the challenges of reducing both the energy consumption of server systems and the costs borne by Online Service Providers (OSPs). Two distinct subsystems account for most of an IDC's power: the server system, which accounts for 56% of the total power consumption of an IDC, and the cooling and humidification systems, which account for about 30%. The server system dominates the energy consumption of an IDC, and its power draw can vary drastically with data center utilization. In this dissertation, we propose three models to achieve energy efficiency in web server clusters: an energy-proportional model, an optimal server allocation and frequency adjustment strategy, and a constrained Markov model. The proposed models combine Dynamic Voltage/Frequency Scaling (DVFS) and Vary-On/Vary-Off (VOVF) mechanisms that work together for greater energy savings, and corresponding strategies are proposed to deal with the transition overheads. We further extend server energy management to the IDC's cost management, helping OSPs conserve energy, manage their own electricity costs, and lower carbon emissions. We have developed an optimal energy-aware load dispatching strategy that periodically maps more requests to the locations with lower electricity prices. A carbon emission limit is imposed, and the volatility of the carbon offset market is also considered. Two energy-efficient strategies are applied to the server system and the cooling system, respectively. With the rapid development of cloud services, we also carry out research to reduce server energy in cloud computing environments. In this work, we propose a new live virtual machine (VM) placement scheme that can effectively map VMs to Physical Machines (PMs) with substantial energy savings in a heterogeneous server cluster. A VM/PM mapping probability matrix is constructed, in which each VM request is assigned a probability of running on each PM. The matrix takes into account resource limitations, VM operation overheads, server reliability, and energy efficiency. The evolution of Internet Data Centers and the increasing demands of web services raise great challenges for improving the energy efficiency of IDCs. We also identify several potential areas for future research in each chapter.
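As a rough illustration of the VM/PM mapping probability matrix idea (a hypothetical Python sketch, not the dissertation's actual formulation), each row below corresponds to a VM request and each column to a physical machine, with probabilities biased toward machines that are more energy efficient and still have spare capacity.

# Hypothetical illustration only: build a row-stochastic VM/PM mapping matrix.
import numpy as np

def mapping_matrix(vm_demand, pm_capacity, pm_used, pm_watts_per_unit):
    vm_demand = np.asarray(vm_demand, float)                  # CPU units each VM asks for
    headroom = np.asarray(pm_capacity, float) - np.asarray(pm_used, float)
    efficiency = 1.0 / np.asarray(pm_watts_per_unit, float)   # higher is better
    # Score each (VM, PM) pair; infeasible placements get zero weight.
    score = np.outer(np.ones_like(vm_demand), efficiency * headroom)
    score[vm_demand[:, None] > headroom[None, :]] = 0.0
    # Normalize rows so each VM's probabilities over PMs sum to one.
    row_sums = score.sum(axis=1, keepdims=True)
    return np.divide(score, row_sums, out=np.zeros_like(score), where=row_sums > 0)

# Example: three VM requests, two PMs with different power profiles.
P = mapping_matrix(vm_demand=[2, 4, 1],
                   pm_capacity=[8, 16], pm_used=[2, 10],
                   pm_watts_per_unit=[5.0, 8.0])
print(P)   # each row sums to 1 wherever at least one PM is feasible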
Abstract:
The current climate of increasing performance expectations and diminishing resources, along with innovations in evidence-based practices (EBPs), creates new dilemmas for substance abuse treatment providers, policymakers, funders, and the service delivery system. This paper describes findings from baseline interviews with representatives from 49 state substance abuse authorities (SSAs). Interviews assessed efforts aimed at facilitating EBP adoption in each state and the District of Columbia. Results suggested that SSAs are concentrating more effort on EBP implementation strategies such as education, training, and infrastructure development, and less effort on financial mechanisms, regulations, and accreditation. The majority of SSAs use EBPs as a criterion in their contracts with providers, and just over half reported that EBP use is tied to state funding. To date, Oregon remains the only state with legislation that mandates treatment expenditures for EBPs; North Carolina follows suit with legislation that requires EBP promotion within current resources.
Abstract:
The aim of the study was to report on the oral, dental, and prosthetic conditions of temporarily institutionalized geriatric patients, as well as the therapeutic measures taken. The patients were referred to the dentist because dental problems were observed by the physicians or reported by the patients themselves. This resulted in a selection among the geriatric patients, but they are considered representative of this segment of patients exhibiting typical signs of undertreatment. The main problem was the poor retention of the prostheses, which was associated with insufficient masticatory function and poor nutritional status. Forty-seven percent of the patients were edentulous or had at most two non-functional residual roots. Altogether, 70% of the maxillary and 51% of the mandibular jaws had no remaining teeth. Eighty-nine percent of the patients had a removable denture, and it was observed that maxillary dentures were worn regularly, in contrast to mandibular dentures. The partially edentulous patients had a mean of ten teeth, significantly more in the mandible than in the maxilla. Treatment consisted mainly of the adaptation and repair of dentures, tooth extractions, and fillings. Only a few appointments (usually two) were necessary to improve the dental conditions, resulting in low costs. Patients without dentures or with no need for denture repair generated the lowest costs. Slightly more visits were necessary for patients with dementia and musculoskeletal problems. The present findings show that regular maintenance care for institutionalized geriatric patients would limit costs in the long term, improve the oral situation, and reduce the need for invasive treatment.
Abstract:
The past decade has witnessed a period of intense economic globalisation. The growing significance of international trade, investment, production and financial flows appears to be curtailing the autonomy of individual nation states. In particular, globalisation appears to be encouraging, if not demanding, a decline in social spending and standards. However, many authors believe that this thesis ignores the continued impact of national political and ideological pressures and lobby groups on policy outcomes. In particular, it has been argued that national welfare consumer and provider groups remain influential defenders of the welfare state. For example, US aged care groups are considered to be particularly effective defenders of social security pensions. According to this argument, governments engaged in welfare retrenchment may experience considerable electoral backlash (Pierson 1996; Mishra 1999). Yet, it is also noted that governments can take action to reduce the impact of such groups by reducing their funding, and their access to policy-making and consultation processes. These actions are then justified on the basis of removing potential obstacles to economic competitiveness (Pierson 1994; Melville 1999).
Abstract:
This article discusses how new kinds of individual needs develop in parallel with changes in the welfare state. A study of Victim Support Sweden shows how this organisation has grown alongside those changes. The empirical material also shows that the need for support often stems from secondary victimisation. Those helped by Victim Support are often people with weak ties to society and from lower social classes. As victims they can get help from Victim Support, but the need derives from a lack of services in the welfare state. NGOs have replaced organisations in the public sector at the same time as the neo-liberal conception of crime, threats and risk has replaced social democratic ideas of social security.
Abstract:
Species coexistence has been a fundamental issue in understanding ecosystem functioning since the beginnings of ecology as a science. The search for a reliable and all-encompassing explanation of this issue has become a complex goal with several apparently opposing trends. On the other hand, and seemingly unconnected with species coexistence, an ecological state equation based on the inverse correlation between an indicator of dispersal that fits a gamma distribution and species diversity has recently been developed. This article explores two factors, whose effects in such an equation are inconspicuous at first sight, in order to develop an alternative general theoretical background that provides a better understanding of species coexistence. Our main outcomes are: (i) the fit of dispersal and diversity values to a gamma distribution is an important factor that promotes species coexistence, mainly due to the right-skewed character of the gamma distribution; (ii) the opposite correlation between species diversity and dispersal implies that any increase in diversity is equivalent to a route of “ecological cooling” whose maximum limit should be constrained by the influence of the third law of thermodynamics; this is in agreement with the well-known asymptotic trend of diversity values in space and time; (iii) there are plausible empirical and theoretical ways to apply physical principles to explain important ecological processes; (iv) the gap between theoretical and empirical ecology in those cases where species diversity is paradoxically high could be narrowed by a wave model of species coexistence based on the concurrency of local equilibrium states. In such a model, competitive exclusion has a limited but indispensable role, in harmonious coexistence with functional redundancy. We analyze several literature references as well as ecological and evolutionary examples that support our approach, reinforcing the equivalence in meaning between important physical and ecological principles.
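For reference, the gamma density the abstract invokes, written here in generic shape-scale notation (k, θ) rather than the article's own symbols, is

\[ f(x; k, \theta) = \frac{x^{k-1} e^{-x/\theta}}{\Gamma(k)\,\theta^{k}}, \qquad x > 0, \]

whose skewness is \(2/\sqrt{k}\); the distribution is therefore always right-skewed, and increasingly so for small shape values, which is the property the first outcome relies on.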
Abstract:
Digitization, sophisticated fiber-optic networks and the resultant convergence of the media, communications and information technology industries have completely transformed the communications ecosystem in the last couple of decades. New contingent business and social models were created that have been mirrored in the amended communications regimes. Yet, despite an overhaul of the communications regulation paradigm, the status of and the rules on universal service have remained surprisingly intact, both during and after the liberalization exercise. The present paper looks into this paradox and examines the sustainability of the existing concept of universal service. It suggests that there is a need for a novel concept of universal service in the digital networked communications environment, whose objectives go beyond the conventional internalizing and redistributional rationales and concentrate on communication and information networks as a public good, where not only access to infrastructure but also access to content may be essential.