928 results for Service Level Agreement (SLA)


Relevance: 30.00%

Abstract:

Web content hosting, in which a Web server stores and provides Web access to documents for different customers, is becoming increasingly common. For example, a web server can host webpages for several different companies and individuals. Traditionally, Web Service Providers (WSPs) provide all customers with the same level of performance (best-effort service); most service differentiation has been in the pricing structure (individual vs. business rates) or the connectivity type (dial-up access vs. leased line, etc.). This report presents DiffServer, a program that implements two simple, server-side, application-level mechanisms (server-centric and client-centric) to provide different levels of web service. The experiments show that the added layer of abstraction between the client and the Apache web server introduces little overhead under light load conditions, and that the average waiting time for high-priority requests decreases significantly under priority scheduling compared to a FIFO approach.
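The abstract does not reproduce DiffServer's internals, but the core idea of serving requests by priority class rather than by arrival order can be illustrated with a minimal sketch. The two-class scheme and the `RequestQueue` helper below are hypothetical stand-ins, not the report's actual code.

```python
import heapq
import itertools

# Minimal sketch of priority-based request dispatch vs. FIFO.
# The two priority classes (0 = high, 1 = low) are illustrative only.

class RequestQueue:
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tie-break within a class

    def submit(self, priority, request):
        # Lower priority value = served first; the counter preserves
        # arrival order among requests of the same class.
        heapq.heappush(self._heap, (priority, next(self._counter), request))

    def next_request(self):
        priority, _, request = heapq.heappop(self._heap)
        return request

queue = RequestQueue()
queue.submit(1, "GET /individual/page.html")  # low-priority customer
queue.submit(0, "GET /business/index.html")   # high-priority customer
print(queue.next_request())  # the business request is served first
```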

Relevance: 30.00%

Abstract:

In this action research study of my 5th grade mathematics class, I investigated the issue of homework and its relationship with students and parents. I found that the majority of students and parents felt the assigned math homework was fairly easy, yet incomplete assignments and failed homework quizzes remained persistent problems for some individuals. Some respondents even asked for easier and shorter assignments despite the acknowledged ease of the work. As a result of this research, I plan to look more closely at the history and development of homework, as well as the psychological implications and “hereditary” issues involving homework, which I believe are passed from one generation of learners to the next. My intent is to continue studying this phenomenon in future school years, trying to develop methods of instilling successful, intrinsic motivational skills to aid students in their homework endeavors. Finally, I will take a close inventory of my own beliefs and understandings toward homework: What is the purpose of having students do work away from the classroom, and how can homework serve as a proactive service for all who are involved?

Relevance: 30.00%

Abstract:

There is a wide range of video services over complex transmission networks, and in some cases end users fail to receive an acceptable quality level. In this paper, the different factors that degrade users' quality of experience (QoE) in video streaming services that use TCP as the transmission protocol are studied. In this specific service, the impairment factors are the number of pauses, their duration, and their temporal location. In order to measure the effect that each temporal segment has on the overall video quality, subjective tests were conducted. Because current subjective test methodologies are not adequate for assessing video streaming over TCP, some recommendations are provided here. At the application layer, a customized player is used to evaluate the behavior of the player buffer and, consequently, the end-user QoE. Video subjective test results demonstrate a close correlation between application parameters and subjective scores. Based on this fact, a new metric named VsQM is defined, which considers the importance of the temporal location of pauses in assessing the user QoE of video streaming services. A useful application scenario is also presented, in which the proposed metric is used to improve video services.
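The exact VsQM formula is defined in the paper and not reproduced in this abstract; the sketch below is only a hypothetical illustration of the underlying idea, weighting each pause by the temporal segment in which it occurs, with made-up segment weights and scaling constants.

```python
# Hypothetical pause-weighted quality score (NOT the actual VsQM
# definition): the video is split into temporal segments, later
# segments are assumed to carry more weight, and pauses falling in
# heavily weighted segments degrade the score more.

def pause_quality_score(pauses, duration, n_segments=4):
    """pauses: list of (start_time_s, length_s) playback interruptions."""
    seg_len = duration / n_segments
    # Illustrative weights: later segments assumed to matter more.
    weights = [(i + 1) / n_segments for i in range(n_segments)]
    impairment = 0.0
    for start, length in pauses:
        seg = min(int(start / seg_len), n_segments - 1)
        impairment += weights[seg] * length
    # Map impairment onto a 0..5 MOS-like scale (illustrative constant).
    return max(0.0, 5.0 - 0.5 * impairment)

# Two 3 s pauses, one early and one late: the late one hurts more.
print(pause_quality_score([(10, 3), (110, 3)], duration=120))
```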

Relevance: 30.00%

Abstract:

Objectives. This study recorded and evaluated the degree of intra- and inter-group agreement among different examiners in the classification of lower third molars according to both Winter's and Pell & Gregory's systems. Study Design. An observational, cross-sectional study was carried out on forty lower third molars from twenty digital panoramic radiographs. Four examiner groups (undergraduates, maxillofacial surgeons, oral radiologists and clinical dentists) from Aracaju, Sergipe, Brazil, classified them with respect to angulation, class and position. Analysis of variance (ANOVA) was applied to the examiners' findings, with a significance level of p<0.05 and 95% confidence intervals. Results. Intra- and inter-group agreement was observed in Winter's classification system among all examiners. Pell & Gregory's classification system showed average intra-group agreement, and a statistically significant inter-group difference for the position variable, with the greatest disagreement in the clinical dentists group (p<0.05). Conclusions. High reproducibility was associated with Winter's classification, whereas the system proposed by Pell & Gregory did not demonstrate appropriate levels of reliability.

Relevance: 30.00%

Abstract:

Objective: to evaluate the effects of low-level laser therapy for perineal pain and healing after episiotomy. Design: a double-blind, randomised, controlled clinical trial comparing perineal pain scores and episiotomy healing in women treated with low-level laser therapy (LLLT) and with a simulation of the treatment. Setting: the study was conducted in the Birth Centre and rooming-in units of Amparo Maternal, a maternity service located in the city of Sao Paulo, Brazil. Participants: fifty-two postpartum women who had had mediolateral episiotomies during their first normal delivery were randomly divided into two groups of 26: an experimental group and a control group. Intervention: in the experimental group, the women were treated with LLLT. Irradiation was applied at three points directly on the episiotomy after the suture and in three postpartum sessions: up to 2 hrs postpartum, between 20 and 24 hrs postpartum and between 40 and 48 hrs postpartum. The LLLT was performed with a diode laser with a wavelength of 660 nm (red light), spot size of 0.04 cm², energy density of 3.8 J/cm², radiant power of 15 mW and 10 s per point, which resulted in an energy of 0.15 J per point and a total energy of 0.45 J per session. The control group participants also underwent three treatment sessions, but without the emission of radiation (simulation group), to assess the possible effects of placebo treatment. Main outcomes: perineal pain scores, rated on a scale from 0 to 10, were evaluated before and immediately after the irradiation in the three sessions. The healing process was assessed using the REEDA scale (Redness, Edema, Ecchymosis, Discharge, Approximation) before each laser therapy session and 15 and 20 days after the women's discharge. Findings: comparing the pain scores before and after the LLLT sessions, the experimental group presented a significant within-group reduction in mean pain scores after the second and third sessions (p=0.003 and p<0.001, respectively), and the control group showed a significant reduction after the first treatment simulation (p=0.043). However, the comparison of the perineal pain scores between the experimental and control groups indicated no statistical difference at any of the evaluated time points. There was no significant difference in perineal healing scores between the groups. All postpartum women approved of the low-level laser therapy. Conclusions: this pilot study showed that LLLT did not accelerate episiotomy healing. Although there was a reduction in mean perineal pain scores in the experimental group, we cannot conclude that the laser relieved perineal pain. This study led to the suggestion of a new research proposal involving another irradiation protocol to evaluate LLLT's effect on perineal pain relief.
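The reported dosimetry can be checked directly from the quoted values, since energy per point is simply radiant power multiplied by irradiation time; a quick verification:

```python
# Verify the reported LLLT dosimetry: energy = power x time per point.
power_w = 15e-3        # 15 mW radiant power
time_s = 10            # 10 s per point
points_per_session = 3

energy_per_point_j = power_w * time_s                            # 0.15 J
energy_per_session_j = energy_per_point_j * points_per_session   # 0.45 J

# Energy density over the 0.04 cm^2 spot: 0.15 J / 0.04 cm^2 = 3.75 J/cm^2,
# consistent (after rounding) with the 3.8 J/cm^2 quoted above.
print(energy_per_point_j, energy_per_session_j, energy_per_point_j / 0.04)
```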

Relevance: 30.00%

Abstract:

The dynamicity and heterogeneity that characterize pervasive environments raise new challenges in the design of mobile middleware. Pervasive environments are characterized by a significant degree of heterogeneity, variability, and dynamicity that conventional middleware solutions are not able to manage adequately. Originally designed for use in a relatively static context, such middleware systems tend to hide low-level details to provide applications with a transparent view of the underlying execution platform. In mobile environments, however, the context is extremely dynamic and cannot be managed by a priori assumptions. Novel middleware should therefore support mobile computing applications in the task of adapting their behavior to frequent changes in the execution context; that is, it should become context-aware. In particular, this thesis has identified the following key requirements that existing context-aware middleware solutions do not yet fulfil. (i) Middleware solutions should support interoperability between possibly unknown entities by providing expressive representation models that make it possible to describe interacting entities, their operating conditions and the surrounding world, i.e., their context, according to an unambiguous semantics. (ii) Middleware solutions should support distributed applications in the task of reconfiguring and adapting their behavior/results to ongoing context changes. (iii) Context-aware middleware support should be deployable on heterogeneous devices under variable operating conditions, such as different user needs, application requirements, available connectivity and device computational capabilities, as well as changing environmental conditions. Our main claim is that the adoption of semantic metadata to represent context information and context-dependent adaptation strategies makes it possible to build context-aware middleware suitable for all dynamically available portable devices. Semantic metadata provide powerful knowledge representation means to model even complex context information, and support automated reasoning to infer additional and/or more complex knowledge from available context data. In addition, we suggest that, by adopting proper configuration and deployment strategies, semantic support features can be provided to differentiated users and devices according to their specific needs and current context. This thesis has investigated novel design guidelines and implementation options for semantic-based context-aware middleware solutions targeted at pervasive environments. These guidelines have been applied to different application areas within pervasive computing that would particularly benefit from the exploitation of context. Common to all applications is the key role of context in enabling mobile users to personalize applications based on their needs and current situation. The main contributions of this thesis are (i) the definition of a metadata model to represent and reason about context, (ii) the definition of a model for the design and development of context-aware middleware based on semantic metadata, (iii) the design of three novel middleware architectures and the development of a prototype implementation for each of them, and (iv) the proposal of a viable approach to the portability issues raised by the adoption of semantic support services in pervasive applications.
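The thesis's metadata model is not reproduced in this summary; as a toy illustration of the general approach, semantic context descriptions plus automated inference, the sketch below stores context as subject-predicate-object triples and applies a single hand-written transitivity rule (all names are hypothetical).

```python
# Illustrative-only sketch of semantic context metadata: a tiny triple
# store plus one inference rule. The thesis's actual metadata model and
# reasoning support are far richer; identifiers below are invented.

facts = {
    ("device42", "hasConnectivity", "wifi"),
    ("device42", "locatedIn", "room12"),
    ("room12", "partOf", "building7"),
}

def infer_located_in(facts):
    # Rule: locatedIn(x, y) and partOf(y, z) => locatedIn(x, z)
    derived = set()
    for (x, p1, y) in facts:
        if p1 != "locatedIn":
            continue
        for (a, p2, z) in facts:
            if p2 == "partOf" and a == y:
                derived.add((x, "locatedIn", z))
    return derived

print(infer_located_in(facts))  # device42 is also located in building7
```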

Relevance: 30.00%

Abstract:

Customer satisfaction has traditionally been studied and measured regardless of the time elapsed since the purchase. Some studies have recently reopened the debate about the temporal pattern of satisfaction. This research aims to explain why “how you evaluate a service depends on when you evaluate it” on the basis of the theoretical framework proposed by Construal-Level Theory (CLT). Although an empirical investigation is still lacking, the literature does not deny that CLT can also be applied to past events. Moreover, some studies support the idea that satisfaction is a good predictor of future intentions, while others do not. On the basis of CLT, we argue that these inconsistent results are due to the different construal levels of the information pertaining to retrospective and prospective evaluations. Building on Two-Factor Theory, we explain the persistence of certain attributes’ representations over time according to their relationship with overall performance. We present and discuss three experiments and one field study that were conducted a) to test the extensibility of CLT to past events, b) to disentangle memory and construal effects, c) to study the effect of different temporal perspectives on overall satisfaction judgements, and d) to investigate the temporal shift of the determinants of customer satisfaction as a function of temporal distance.

Relevance: 30.00%

Abstract:

The thesis deals with channel coding theory applied to the upper layers of the protocol stack of a communication link, and it is the outcome of four years of research activity. A specific aspect of this activity has been the continuous interaction between the natural curiosity of academic blue-sky research and the system-oriented design deriving from collaboration with European industry in the framework of European funded research projects. In this dissertation, the classical channel coding techniques that are traditionally applied at the physical layer find their application at upper layers, where the encoding units (symbols) are packets of bits rather than single bits; such upper layer coding techniques are therefore usually referred to as packet layer coding. The rationale behind the adoption of packet layer techniques is that physical layer channel coding is a suitable countermeasure against small-scale fading, but is less efficient against large-scale fading. This is mainly due to the limited time diversity of a physical layer interleaver of reasonable size, a size constrained by the need to avoid increasing modem complexity and the latency of all services. Packet layer techniques, thanks to their longer codeword duration (each codeword is composed of several packets of bits), offer intrinsically longer protection against long fading events. Furthermore, being implemented at upper layers, packet layer techniques have the indisputable advantages of simpler implementation (very close to a software implementation) and of selective applicability to different services, thus enabling a better match with service requirements (e.g. latency constraints). Packet coding techniques have been widely recognized in recent communication standards as a viable and efficient coding solution: Digital Video Broadcasting standards, like DVB-H, DVB-SH, and DVB-RCS mobile, and 3GPP standards (MBMS) employ packet coding techniques working at layers higher than the physical one. In this framework, the aim of the research work has been the study of state-of-the-art coding techniques working at the upper layer, the performance evaluation of these techniques in realistic propagation scenarios, and the design of new coding schemes for upper layer applications. After a review of the most important packet layer codes, i.e. Reed-Solomon, LDPC and Fountain codes, the thesis focuses on the performance evaluation of ideal codes (i.e. Maximum Distance Separable codes) working at the upper layer. In particular, we analyze the performance of UL-FEC techniques in Land Mobile Satellite channels. We derive an analytical framework which is a useful tool for system design, allowing the performance of the upper layer decoder to be predicted. We also analyze a system in which upper layer and physical layer codes work together, and we derive the optimal split of redundancy when a frequency non-selective, slowly varying fading channel is taken into account. The whole analysis is supported and validated through computer simulation. In the last part of the dissertation, we propose LDPC Convolutional Codes (LDPCCC) as a possible coding scheme for future UL-FEC applications. Since one of the main drawbacks of packet layer codes is the large decoding latency, we introduce a latency-constrained decoder for LDPCCC (called the windowed erasure decoder) and analyze the performance of state-of-the-art LDPCCC when this decoder is adopted.
Finally, we propose a design rule which allows performance and latency to be traded off.
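The codes studied in the thesis (Reed-Solomon, LDPC, Fountain and LDPC convolutional codes) are far more capable, but the packet-layer principle, adding redundancy packets above the physical layer so that whole lost packets can be rebuilt, can be conveyed by the simplest possible erasure code: a single XOR parity packet protecting a block of data packets.

```python
# Simplest packet-layer erasure code: one XOR parity packet over k data
# packets recovers any single lost packet. Real UL-FEC schemes tolerate
# much longer loss bursts, but the recovery principle is the same.

def xor_packets(packets):
    out = bytearray(len(packets[0]))
    for p in packets:
        for i, b in enumerate(p):
            out[i] ^= b
    return bytes(out)

data = [b"pkt0", b"pkt1", b"pkt2"]
parity = xor_packets(data)

# Suppose packet 1 is erased by a deep fade: XORing the surviving data
# packets with the parity packet reconstructs it exactly.
received = [data[0], data[2], parity]
recovered = xor_packets(received)
assert recovered == data[1]
print(recovered)
```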

Relevance: 30.00%

Abstract:

The modern stratigraphy of clastic continental margins is the result of the interaction between several geological processes acting on different time scales, among which sea level oscillations, sediment supply fluctuations and local tectonics are the main mechanisms. During the past three years my PhD research focused on understanding the impact of each of these processes on the deposition of the central and northern Adriatic sedimentary successions, with the aim of reconstructing and quantifying the Late Quaternary eustatic fluctuations. In the last few decades, several authors have tried to quantify past eustatic fluctuations through the analysis of direct sea level indicators, such as drowned barrier-island deposits or coral reefs, or through indirect methods, such as oxygen isotope ratios (δ18O) or modeling simulations. Sea level curves obtained from direct sea level indicators record a composite signal, formed by the contribution of the global eustatic change and regional factors such as tectonic processes or glacial-isostatic rebound effects: the eustatic signal has to be obtained by removing the contribution of these other mechanisms. To obtain the most realistic sea level reconstructions it is important to quantify the tectonic regime of the central Adriatic margin. This result has been achieved by integrating a numerical approach with the analysis of high-resolution seismic profiles. In detail, the subsidence trend obtained from the geohistory analysis and the backstripping of the borehole PRAD1.2 (a 71 m continuous borehole drilled in -185 m water depth, south of the Mid Adriatic Deep - MAD - during the European Project PROMESS 1, Profile Across Mediterranean Sedimentary Systems, Part 1) has been confirmed by the analysis of lowstand paleoshorelines and by the benthic foraminifera associations investigated through the borehole. This work showed an evolution from an inner-shelf environment during Marine Isotopic Stage (MIS) 10 to upper-slope conditions during MIS 2. Once the tectonic regime of the central Adriatic margin had been constrained, it was possible to investigate the impact of sea level and sediment supply fluctuations on the deposition of the Late Pleistocene-Holocene transgressive deposits. The Adriatic transgressive record (TST - Transgressive Systems Tract) is formed by three correlative sedimentary bodies deposited in less than 14 kyr since the Last Glacial Maximum (LGM); in particular, along the central Adriatic shelf and in the adjacent slope basin the TST is formed by marine units, while along the northern Adriatic shelf the TST is represented by coastal deposits in a backstepping configuration. The central Adriatic margin, characterized by a thick transgressive sedimentary succession, is the ideal site to investigate the impact of late Pleistocene climatic and eustatic fluctuations, among which Meltwater Pulses 1A and 1B and the Younger Dryas cold event. The central Adriatic TST is formed by a tripartite deposit bounded by two regional unconformities. In particular, the middle TST unit includes two prograding wedges, deposited in the interval between the two Meltwater Pulse events, as highlighted by several 14C age estimates, and likely recorded the Younger Dryas cold interval.
Modeling simulations, obtained with the two coupled models HydroTrend 3.0 and 2D-Sedflux 1.0C (developed by the Community Surface Dynamics Modeling System - CSDMS), integrated with the analysis of high resolution seismic profiles and core samples, indicate that: 1 - the prograding middle TST unit, deposited during the Younger Dryas, was formed as a consequence of an increase in sediment flux, likely connected to a decline in vegetation cover in the catchment area due to the establishment of sub-glacial arid conditions; 2 - the two-stage prograding geometry was the consequence of a sea level stillstand (or possibly a fall) during the Younger Dryas event. The northern Adriatic margin, characterized by a broad and gentle shelf (350 km wide with a low-angle plunge of 0.02° to the SE), is the ideal site to quantify the timing of each step of the post-LGM sea level rise. The modern shelf is characterized by sandy deposits of barrier-island systems in a backstepping configuration, showing younger ages at progressively shallower depths, which record the step-wise nature of the last sea level rise. The age-depth model, obtained from dated samples of basal peat layers, is in good agreement with previously published sea level curves, and highlights the post-glacial eustatic trend. The interval corresponding to the Younger Dryas cold reversal, instead, is more complex: two coeval coastal deposits characterize the northern Adriatic shelf at very different water depths. Several explanations and different models can be put forward to explain this conundrum, but the problem remains unsolved.
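The dated basal-peat samples behind the age-depth model are given in the thesis itself; the sketch below shows only the generic form of such a model, a least-squares line through (age, depth) pairs, with entirely made-up numbers.

```python
# Generic form of an age-depth model from dated basal peat layers:
# a least-squares line through (age, depth) pairs. The values below
# are invented for illustration; the thesis uses real 14C-dated samples.
import numpy as np

ages_kyr = np.array([14.0, 12.0, 10.5, 9.0])       # hypothetical ages
depths_m = np.array([-60.0, -45.0, -35.0, -25.0])  # hypothetical depths

slope, intercept = np.polyfit(ages_kyr, depths_m, 1)
print(f"apparent sea level rise rate ~ {-slope:.1f} m/kyr")
```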

Relevance: 30.00%

Abstract:

The wide diffusion of cheap, small, portable sensors integrated in an unprecedented variety of devices, and the availability of almost ubiquitous Internet connectivity, make it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly analyzed in a timely manner, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of cross-cutting domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality-of-service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers, and we present Quasit, its prototype implementation, offering a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, by performing a large experimental study on the prototype of our novel LAAR dynamic replication technique. Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
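Quasit's and LAAR's actual mechanisms are defined in the thesis; as a purely illustrative rendering of the weaker-guarantee idea, the sketch below replicates only the operators whose streams declare strict guarantees, making the cost/quality trade-off explicit (names and costs are hypothetical).

```python
# Hypothetical illustration of quality-differentiated replication:
# only operators processing streams with strict guarantees get a
# replica, trading fault-tolerance cost against the stream's QoS class.

from dataclasses import dataclass

@dataclass
class Operator:
    name: str
    qos_class: str   # "guaranteed" or "best_effort"
    cpu_cost: float

def plan_replication(operators):
    plan, total_cost = [], 0.0
    for op in operators:
        replicas = 2 if op.qos_class == "guaranteed" else 1
        plan.append((op.name, replicas))
        total_cost += op.cpu_cost * replicas
    return plan, total_cost

ops = [Operator("hr-monitor", "guaranteed", 1.0),
       Operator("tv-recommender", "best_effort", 4.0)]
print(plan_replication(ops))  # the health stream is replicated, the other is not
```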

Relevance: 30.00%

Abstract:

Resource management is of paramount importance in network scenarios and is a long-standing and still open issue. Unfortunately, while technology and innovation continue to evolve, our network infrastructure has been maintained in almost the same shape for decades, a phenomenon known as “Internet ossification”. Software-Defined Networking (SDN) is an emerging paradigm in computer networking that allows a logically centralized software program to control the behavior of an entire network. This is done by decoupling the network control logic from the underlying physical routers and switches that forward traffic to the selected destination. One mechanism that allows the control plane to communicate with the data plane is OpenFlow. Network operators can write high-level control programs that specify the behavior of an entire network. Moreover, centralized control makes it possible to define more specific and complex tasks that involve many network functionalities, e.g., security, resource management and control, within a single framework. Nowadays, the explosive growth of real-time applications that require stringent Quality of Service (QoS) guarantees pushes network programmers to design network protocols that deliver certain performance guarantees. This thesis exploits the use of SDN in conjunction with OpenFlow to manage differentiated network services with a high QoS. Initially, we define a QoS Management and Orchestration architecture that allows us to manage the network in a modular way. Then, we provide a seamless integration between this architecture and the standard SDN paradigm, following the separation between the control and data planes. This work is a first step towards the deployment of our proposal in the University of California, Los Angeles (UCLA) campus network, with differentiated services and stringent QoS requirements. We also plan to exploit our solution to manage the handoff between different network technologies, e.g., Wi-Fi and WiMAX. Indeed, the model can be run with different parameters, depending on the communication protocol, and can provide optimal results to be implemented on the campus network.
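No specific controller framework is assumed here; the sketch below only illustrates the general pattern of the approach, the controller classifying flows into QoS classes and emitting OpenFlow-style rules that steer each flow to a switch queue, with illustrative field names and port numbers.

```python
# Controller-side sketch (no particular SDN framework assumed): classify
# flows into QoS classes and emit abstract OpenFlow-like rules that
# direct each flow to a switch queue. All field names are illustrative.

QOS_QUEUES = {"voice": 0, "video": 1, "best_effort": 2}

def classify(flow):
    if flow["udp_dst"] == 5060:               # illustrative signalling port
        return "voice"
    if flow["udp_dst"] in range(5004, 5020):  # illustrative media ports
        return "video"
    return "best_effort"

def make_rule(flow):
    qos = classify(flow)
    return {
        "match": {"ip_src": flow["ip_src"], "udp_dst": flow["udp_dst"]},
        "actions": [("set_queue", QOS_QUEUES[qos]), ("output", "port1")],
        "priority": 100 if qos != "best_effort" else 10,
    }

print(make_rule({"ip_src": "10.0.0.5", "udp_dst": 5060}))
```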

Relevance: 30.00%

Abstract:

Sea level variation is one of the parameters directly related to climate change. Monitoring sea level rise is an important scientific issue, since many populated areas of the world and megacities are located in low-lying regions. At present, sea level is measured by means of two techniques: tide gauges and satellite radar altimetry. Tide gauges measure sea level relative to a ground benchmark; hence, their measurements are directly affected by vertical ground motion. Satellite radar altimetry measures sea level relative to a geocentric reference and is not affected by vertical land motion. In this study, the linear relative sea level trends of 35 tide gauge stations distributed across the Mediterranean Sea have been computed over the period 1993-2014. In order to extract the real sea level variation, the vertical land motion has been estimated using the observations of available GPS stations and removed from the tide gauge records. These GPS-corrected trends have then been compared with satellite altimetry measurements over the same time interval (AVISO data set). A further comparison has been performed, over the period 1993-2013, using the CCI satellite altimetry data set, which was generated with updated modeling. The absolute sea level trends obtained from satellite altimetry and from GPS-corrected tide gauge data are mostly consistent, meaning that GPS data have provided reliable corrections for most of the sites. The trend values range between +2.5 and +4 mm/yr almost everywhere in the Mediterranean area; the largest trends were found in the northern Adriatic Sea and in the Aegean. These results are in agreement with estimates of the global mean sea level rise over the last two decades. Where GPS data were not available, information on the vertical land motion deduced from the differences between absolute and relative trends is in agreement with the results of other studies.
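The correction step described above amounts to adding the GPS-measured vertical land velocity back to the relative tide gauge trend; the sketch below shows this with synthetic numbers (the study's real input is the 35-station data set).

```python
# Sketch of the GPS correction step: absolute trend = relative tide
# gauge trend + GPS vertical land velocity. Land subsidence (negative
# velocity) inflates the apparent relative rise, so adding the negative
# velocity back yields the true geocentric trend. Numbers are synthetic.
import numpy as np

years = np.arange(1993, 2015)
relative_sl_mm = 3.0 * (years - 1993)          # synthetic gauge record

relative_trend = np.polyfit(years, relative_sl_mm, 1)[0]  # mm/yr
gps_vertical_velocity = -0.8   # mm/yr, hypothetical subsidence

absolute_trend = relative_trend + gps_vertical_velocity
print(f"relative {relative_trend:.1f} mm/yr -> absolute {absolute_trend:.1f} mm/yr")
```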

Relevance: 30.00%

Abstract:

Our society uses a large diversity of co-existing wired and wireless networks in order to satisfy its communication needs. Cooperation between these networks can benefit performance, service availability and ease of deployment, and leads to the emergence of hybrid networks. This position paper focuses on a hybrid mobile-sensor network, identifying potential advantages and challenges of its use and defining feasible applications. The main value of the paper, however, is in the proposed analysis approach to evaluate the performance at the mobile network side given the mixed mobile-sensor traffic. The approach combines packet-level analysis with modelling of flow-level behaviour and can be applied to the study of various application scenarios. In this paper we consider two applications with distinct traffic models, namely multimedia traffic and best-effort traffic.
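The paper's analysis approach is not reproduced in this abstract; the toy flow-level model below merely illustrates the kind of capacity sharing involved, with multimedia flows reserving a fixed rate and best-effort flows fair-sharing the remainder (all parameters are invented).

```python
# Toy flow-level model in the spirit described above (the paper's actual
# analysis is more detailed): multimedia flows reserve a fixed rate,
# while best-effort flows fair-share the remaining link capacity.

def flow_rates(capacity_mbps, n_multimedia, mm_rate_mbps, n_best_effort):
    reserved = n_multimedia * mm_rate_mbps
    if reserved > capacity_mbps:
        raise ValueError("multimedia load exceeds link capacity")
    leftover = capacity_mbps - reserved
    be_rate = leftover / n_best_effort if n_best_effort else 0.0
    return mm_rate_mbps, be_rate

mm, be = flow_rates(capacity_mbps=20.0, n_multimedia=4,
                    mm_rate_mbps=2.0, n_best_effort=6)
print(f"multimedia: {mm} Mbps/flow, best-effort: {be} Mbps/flow")
```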

Relevance: 30.00%

Abstract:

Consultation is promoted throughout the school psychology literature as a best practice in service delivery. This method has numerous benefits, including being able to work with more students at one time, providing practitioners with preventative rather than strictly reactive strategies, and helping school professionals meet state and federal education mandates and initiatives. Despite the benefits of consultation, teachers are sometimes resistant to the process. This research studies variables hypothesized to lead to resistance (Gonzalez, Nelson, Gutkin, & Shwery, 2004) and attempts to distinguish differences between school levels (elementary, middle and high school) with respect to the role played by these variables, and to determine whether the model used to identify students for special education services influences resistance factors. Twenty-six teachers in elementary and middle schools responded to a demographic questionnaire and a survey developed by Gonzalez et al. (2004). This survey measures eight variables related to resistance to consultation. No high school teachers responded to the request to participate. Results of analysis of variance indicated a significant difference in the teaching efficacy subscale, with elementary teachers reporting more teaching efficacy than middle school teachers. Results also indicate a significant difference in classroom management efficacy: teachers who work in schools that identify students according to a Response to Intervention (RtI) model reported higher classroom management efficacy than teachers who work in schools that identify students according to a combined refer-test-place/RtI model. Implications, limitations and directions for future research are discussed.

Relevance: 30.00%

Abstract:

Compared the service use patterns of older adults with varying levels of mental impairment, and assessed the effects of services received on their mental health status over a 1-yr period. Data were obtained from a US General Accounting Office (1977, 1979) study of 531 elderly persons (mean age 76.1 yrs), which included administration of a modified version of the Older Americans Resources and Services Multidimensional Functional Assessment Questionnaire. Ss were interviewed twice, 1 yr apart. 174 Ss were classified as having a mild psychiatric impairment, and 118 Ss had a severe psychiatric impairment. The existence of mental impairment was related to marital status, race, and level of education. Usage of mental health services was low, although mentally impaired Ss were more likely than unimpaired Ss to use social and medical services. Results also suggest that such services can have an important effect on the mental health of older persons.