809 results for Service quality measurement


Relevance: 40.00%

Abstract:

Objective: To develop and validate a new short and simple measure of health-related quality of life (HRQL) in children with juvenile idiopathic arthritis (JIA). Methods: The Paediatric Rheumatology Quality of Life Scale (PRQL) is a 10-item questionnaire that explores HRQL in two domains: physical health (PhH) and psychosocial health (PsH). Validation of the parent proxy-report and child self-report versions of the instrument was accomplished by evaluating 472 JIA patients and approximately 800 healthy children. Validation analyses included assessment of feasibility, face and content validity; construct and discriminative ability; internal structure and consistency; test-retest reliability; responsiveness to clinical change; and minimal clinically important difference. Results: The PRQL was found to be feasible and to possess both face and content validity. The PRQL score correlated in the predicted range with most of the other JIA outcome measures, thereby demonstrating good construct validity, and discriminated well between different levels of disease severity. Assessment of internal structure (factor analysis) revealed that the PhH and PsH subscales identify two unambiguously separated domains. The internal consistency (Cronbach's alpha) was 0.86. The intraclass correlation coefficient for test-retest reliability was 0.91. The PRQL showed fair responsiveness, with a standardized response mean of 0.67 in improved patients. Overall, the PRQL appeared to be better able to capture physical HRQL than psychosocial HRQL. Conclusion: The PRQL was found to possess good measurement properties and is, therefore, a valid instrument for the assessment of HRQL in children with JIA. This tool is primarily proposed for use in standard clinical care.
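As a quick illustration of the internal-consistency figure reported above, the following is a minimal sketch (not the authors' code) of how Cronbach's alpha is computed from an item-score matrix; the data below are hypothetical.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    k = scores.shape[1]                          # number of items (10 for the PRQL)
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 472 respondents answering 10 items scored 0-3
rng = np.random.default_rng(0)
scores = rng.integers(0, 4, size=(472, 10)).astype(float)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```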

Relevance: 40.00%

Abstract:

Nowadays, networks must support applications such as distance learning, electronic commerce, Internet access, intranets and extranets, voice over IP (Internet Protocol), and many others. These new applications, carrying data, voice, and video traffic, require high bandwidth and Quality of Service (QoS). ATM (Asynchronous Transfer Mode) technology, together with dynamic resource allocation methods, offers network connections that guarantee QoS parameters such as minimum losses and delays. This paper presents a system that uses Network Management Functions together with dynamic resource allocation to provide end-to-end QoS parameters for rt-VBR (real-time Variable Bit Rate) connections.

Relevance: 40.00%

Abstract:

The use of QoS parameters to evaluate the quality of service in a mesh network is essential, especially when providing multimedia services. This paper proposes an algorithm for planning wireless mesh networks so as to satisfy a set of QoS parameters, given a set of test points (TPs) and potential access points (APs). Examples of QoS parameters include the probability of packet loss and the mean delay in responding to a request. The proposed algorithm uses a Mathematical Programming model to determine an adequate topology for the network and Monte Carlo simulation to verify whether the QoS parameters are satisfied. The results obtained show that the proposed algorithm is able to find satisfactory solutions.
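To illustrate the Monte Carlo verification step in general terms, here is a minimal sketch that estimates the end-to-end packet-loss probability of a candidate TP-to-AP path and checks it against a QoS threshold; the per-hop loss values, the two-hop topology, and the 10% threshold are assumptions for illustration only.

```python
import random

def estimate_loss_probability(link_loss, n_trials=100_000):
    """Monte Carlo estimate of end-to-end packet loss over a multi-hop path.

    link_loss: list of independent per-hop loss probabilities (assumed values).
    """
    lost = 0
    for _ in range(n_trials):
        # A packet is lost if it is dropped on any hop of the path.
        if any(random.random() < p for p in link_loss):
            lost += 1
    return lost / n_trials

# Hypothetical TP -> AP -> gateway path with two wireless hops
path_loss = [0.02, 0.05]
p_loss = estimate_loss_probability(path_loss)
print(f"estimated packet-loss probability: {p_loss:.3f}")
print("QoS satisfied" if p_loss <= 0.10 else "QoS violated")  # example threshold
```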

Relevance: 40.00%

Abstract:

This paper reports results from a series of experiments designed to evaluate aspects of the spatial resolution of the visual system of the opossum, Didelphis marsupialis aurita. This nocturnal marsupial has a well-developed eye, displaying features that reflect specialization for operation at low levels of luminosity. The species was shown to be slightly myopic, a feature that may prove valuable because of the increased depth of field it provides. Opossum visual acuity has previously been evaluated by determining the Contrast Sensitivity Function (CSF); the results indicate rather poor visual acuity compared with other nocturnal animals. In this paper, we describe the results obtained for the optical quality of the opossum's eye using a single-pass method. The results suggest that the opossum's optical system is capable of forming images that can be resolved when separated by an angular distance on the order of 6 minutes of arc.

Relevance: 40.00%

Abstract:

The aim of this study is to establish the factors that influence the quality of life of people living with HIV/AIDS being treated at a specialized public service. The participants answered a questionnaire on sociodemographic conditions, issues related to HIV, and daily habits. Quality of life was analyzed using the HIV/AIDS-Targeted Quality of Life (HAT-QoL) instrument, with 42 items divided into 9 domains: General Activity, Sexual Activity, Confidentiality Concerns, Health Concerns, Financial Concerns, HIV Awareness, Satisfaction with Life, Issues Related to Medication, and Trust in the Physician. Bivariate and multiple linear regressions were performed. Of the participants, 53.1% were women, and the mean age was 42 years. In the quality-of-life analysis, the HAT-QoL domain with the lowest average was Financial Concerns (39.4), followed by Confidentiality Concerns (43.2), Sexual Activity (55.2), and Health Concerns (62.88). Not being gainfully employed (p < 0.001), being mulatto or black (p = 0.045), and alcohol consumption (p = 0.041) were associated with the worst quality-of-life scores. Inadequate socioeconomic and health conditions had a negative impact on the quality of life of people with HIV/AIDS.
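For readers unfamiliar with the analysis described above, the following is a minimal sketch of a multiple linear regression of a QoL domain score on the covariates mentioned; the data, variable names, and coding are hypothetical and are not taken from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: a QoL domain score regressed on illustrative covariates
rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "qol_score": rng.normal(50, 15, n),        # e.g. a 0-100 domain score
    "employed": rng.integers(0, 2, n),         # 1 = gainfully employed
    "black_or_mixed": rng.integers(0, 2, n),   # self-reported race/colour
    "alcohol": rng.integers(0, 2, n),          # 1 = reports alcohol consumption
})

model = smf.ols("qol_score ~ employed + black_or_mixed + alcohol", data=df).fit()
print(model.params)    # coefficient estimates
print(model.pvalues)   # p-values analogous to those reported in the abstract
```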

Relevance: 40.00%

Abstract:

Increased railroad traffic volumes, speeds, and axle loads have created a need to better measure track quality. Previous research has indicated that vertical track deflection provides a meaningful indicator of track integrity. The measured deflection can be related to the bending stresses in the rail and can characterize the mechanical response of the track. This investigation summarizes the simulation, analysis, and development of a measurement system at the University of Nebraska (UNL) to measure vertical track deflection in real time from a car moving at revenue speeds. The UNL system operates continuously over long distances and in revenue service. Using a camera and two line lasers, the system measures three points of the rail shape beneath the loaded wheels over a distance of 10 ft. The resulting rail shape can then be related to the actual bending stress in the rail and used to estimate the track support through beam theory. Finite element simulations are used to characterize the track response as measured by the UNL system. The results of field tests using bondable resistance strain gages illustrate the system's capability of approximating the actual rail bending stresses under load.
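As a rough illustration of how three measured shape points can be related to bending stress through beam theory, here is a minimal sketch using a second-difference curvature estimate and Euler-Bernoulli bending; the material properties, chord spacing, and deflection values are assumed and are not the UNL system's actual parameters.

```python
def bending_stress_from_deflections(y1, y2, y3, h, E, c):
    """Estimate rail bending stress from three vertical deflection samples.

    y1, y2, y3 : deflections (m) at three points spaced h metres apart
    E          : Young's modulus of the rail steel (Pa)
    c          : distance from the neutral axis to the rail surface (m)

    Euler-Bernoulli beam theory gives sigma = E * c * kappa, with the
    curvature kappa approximated by the second central difference.
    """
    kappa = (y1 - 2.0 * y2 + y3) / h**2
    return E * c * kappa

# Hypothetical numbers, purely illustrative
E = 207e9    # steel, Pa
c = 0.092    # m, roughly half the height of a heavy rail section
h = 1.524    # m (half of the 10 ft measurement chord)
sigma = bending_stress_from_deflections(0.0, -0.004, 0.0, h, E, c)  # 4 mm dip at mid-chord
print(f"approximate bending stress: {sigma / 1e6:.1f} MPa")
```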

Relevance: 40.00%

Abstract:

There is a wide range of video services delivered over complex transmission networks, and in some cases end users fail to receive an acceptable quality level. In this paper, the different factors that degrade users' quality of experience (QoE) in video streaming services that use TCP as the transport protocol are studied. In this specific service, the impairment factors are the number of pauses, their duration, and their temporal location. In order to measure the effect that each temporal segment has on the overall video quality, subjective tests were performed. Because current subjective test methodologies are not adequate for assessing video streaming over TCP, some recommendations are provided here. At the application layer, a customized player is used to evaluate the behavior of the player buffer and, consequently, the end-user QoE. The video subjective test results demonstrate that there is a close correlation between application parameters and subjective scores. Based on this fact, a new metric named VsQM is defined, which considers the importance of the temporal location of pauses when assessing the user QoE of a video streaming service. A useful application scenario is also presented, in which the proposed metric is used to improve video services.
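To make the idea of weighting pauses by their temporal location concrete, the following is a minimal sketch of a pause-based impairment score. It is not the published VsQM formula; the segmentation, weights, and example pauses are assumptions chosen purely for illustration.

```python
def pause_impairment(pauses, weights, video_duration, n_segments=4):
    """Hypothetical pause-based impairment score (NOT the published VsQM formula).

    pauses   : list of (start_time_s, duration_s) re-buffering pauses
    weights  : per-segment weights reflecting how much a pause at that point
               of the video is assumed to annoy the viewer
    """
    segment_len = video_duration / n_segments
    score = 0.0
    for start, duration in pauses:
        segment = min(int(start // segment_len), n_segments - 1)
        score += weights[segment] * duration
    return score

# Example: two pauses in a 120 s clip, with later pauses weighted more heavily
print(pause_impairment([(10.0, 2.0), (100.0, 4.0)],
                       weights=[1.0, 1.2, 1.5, 2.0],
                       video_duration=120.0))
```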

Relevance: 40.00%

Abstract:

The research activity carried out during the PhD course in Electrical Engineering belongs to the branch of electric and electronic measurements. The main subject of this thesis is a distributed measurement system to be installed in medium-voltage power networks, together with the method developed to analyze the data acquired by the measurement system itself and to monitor power quality.

Chapter 2 illustrates the increasing interest in power quality in electrical systems, reporting the international research activity on the problem and the relevant standards and guidelines that have been issued. The quality of the voltage provided by utilities and influenced by customers at the various points of a network has come to attention only in recent years, in particular as a consequence of energy market liberalization. The concept of quality of the delivered energy has usually been associated mostly with its continuity, so reliability was the main characteristic to be ensured in power systems. Nowadays, the number and duration of interruptions are the "quality indicators" most commonly perceived by customers; for this reason, a short section is also dedicated to network reliability and its regulation. In this context, it should be noted that although the measurement system developed during the research activity belongs to the field of power quality evaluation systems, the information registered in real time by its remote stations can also be used to improve system reliability.

Given the vast range of power quality degrading phenomena that can occur in distribution networks, the study has focused on electromagnetic transients affecting line voltages. The outcome of this study has been the design and realization of a distributed measurement system that continuously monitors the phase signals at different points of a network, detects the occurrence of transients superposed on the fundamental steady-state component, and registers the time of occurrence of such events. The data set is finally used to locate the source of the transient disturbance propagating along the network lines. Most of the oscillatory transients affecting line voltages are due to faults occurring at any point of the distribution system and must be detected before protection equipment intervenes. An important conclusion is that the method can improve the reliability of the monitored network, since knowing the location of a fault allows the energy manager to reduce as much as possible both the area of the network to be disconnected for protection purposes and the time spent by technical staff to recover from the abnormal condition and/or the damage.

The part of the thesis presenting the results of this study and activity is structured as follows. Chapter 3 deals with the propagation of electromagnetic transients in power systems, defining the characteristics and causes of the phenomena and briefly reporting the theory and approaches used to study transient propagation. The state of the art concerning methods to detect and locate faults in distribution networks is then presented, followed by the particular technique adopted for this purpose in the thesis and the methods developed on the basis of that approach. Chapter 4 reports the configuration of the distribution networks on which the fault location method has been applied by means of simulations, as well as the results obtained case by case; in this way, the performance of the location procedure is tested first under ideal and then under realistic operating conditions. Chapter 5 presents the measurement system designed to implement the transient detection and fault location method. The hardware in the measurement chain of every acquisition channel of the remote stations is described. The global measurement system is then characterized by considering the non-ideal aspects of each device that can contribute to the final combined uncertainty on the estimated position of the fault in the network under test; this parameter is computed according to the Guide to the Expression of Uncertainty in Measurement by means of a numerical procedure. The last chapter describes a device designed and realized during the PhD activity to replace the commercial capacitive voltage divider in the conditioning block of the measurement chain. This study was carried out with the aim of providing an alternative transducer offering equivalent performance at lower cost, thereby significantly reducing the economic impact of the investment associated with the whole measurement system and making the method much more feasible to apply.
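As background for the fault location step described above, here is a minimal sketch of the classic two-end travelling-wave calculation that relates transient arrival times at two remote stations to the fault position; the thesis' own algorithm and its uncertainty treatment are more elaborate, and all numbers below are hypothetical.

```python
def fault_distance_two_end(t_a, t_b, line_length_km, wave_speed_km_per_s):
    """Classic two-end travelling-wave fault location (illustrative only).

    t_a, t_b : absolute arrival times (s) of the transient wavefront at the
               two remote stations bounding the faulted line section
    Returns the estimated distance (km) of the fault from station A.
    """
    return (line_length_km + wave_speed_km_per_s * (t_a - t_b)) / 2.0

# Hypothetical example: 20 km line, wave propagation speed ~2.9e5 km/s
t_a = 0.000030   # s, arrival at station A
t_b = 0.000051   # s, arrival at station B (fault is closer to A)
print(f"fault at ~{fault_distance_two_end(t_a, t_b, 20.0, 2.9e5):.2f} km from A")
```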

Relevance: 40.00%

Abstract:

The wide diffusion of cheap, small, and portable sensors integrated into an unprecedentedly large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly and timely analyzed, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, which offers a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, by performing a large experimental study on the prototype of our novel LAAR dynamic replication technique. Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.

Relevance: 40.00%

Abstract:

Resource management is of paramount importance in network scenarios, and it is a long-standing and still open issue. Unfortunately, while technology and innovation continue to evolve, our network infrastructure has remained in almost the same shape for decades, a phenomenon known as "Internet ossification". Software-Defined Networking (SDN) is an emerging paradigm in computer networking that allows a logically centralized software program to control the behavior of an entire network. This is done by decoupling the network control logic from the underlying physical routers and switches that forward traffic to the selected destination. One mechanism that allows the control plane to communicate with the data plane is OpenFlow. Network operators can write high-level control programs that specify the behavior of an entire network. Moreover, centralized control makes it possible to bring more specific and complex tasks involving many network functionalities, e.g., security, resource management, and control, into a single framework. Nowadays, the explosive growth of real-time applications that require stringent Quality of Service (QoS) guarantees leads network programmers to design protocols that deliver certain performance guarantees. This thesis exploits the use of SDN in conjunction with OpenFlow to manage differentiated network services with high QoS. Initially, we define a QoS Management and Orchestration architecture that allows us to manage the network in a modular way. Then, we provide a seamless integration between this architecture and the standard SDN paradigm, following the separation between the control and data planes. This work is a first step towards the deployment of our proposal in the University of California, Los Angeles (UCLA) campus network, with differentiated services and stringent QoS requirements. We also plan to exploit our solution to manage the handoff between different network technologies, e.g., Wi-Fi and WiMAX. Indeed, the model can be run with different parameters, depending on the communication protocol, and can provide optimal results to be implemented on the campus network.
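To illustrate the general idea of a controller differentiating traffic classes, here is a minimal, library-agnostic sketch of the kind of flow-to-queue mapping an SDN/OpenFlow controller application could install; the class table, ports, queue IDs, and rate figures are assumptions and are not the thesis' actual QoS Management and Orchestration framework.

```python
# Hypothetical queue plan: map traffic classes to per-port queues with
# different minimum-rate guarantees (illustrative only).
QOS_CLASSES = {
    "voip":        {"match": {"udp_dst": 5060}, "queue_id": 0, "min_rate_mbps": 10},
    "video":       {"match": {"udp_dst": 554},  "queue_id": 1, "min_rate_mbps": 50},
    "best_effort": {"match": {},                "queue_id": 2, "min_rate_mbps": 0},
}

def build_flow_rules(datapath_id: int):
    """Translate the class table into abstract flow-rule descriptions that a
    controller speaking OpenFlow could push to a switch."""
    rules = []
    for name, cls in QOS_CLASSES.items():
        rules.append({
            "datapath": datapath_id,
            "class": name,
            "priority": 100 if cls["match"] else 1,   # specific rules win over the default
            "match": cls["match"],
            "actions": [("set_queue", cls["queue_id"]), ("output", "NORMAL")],
        })
    return rules

for rule in build_flow_rules(datapath_id=1):
    print(rule)
```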

Relevance: 40.00%

Abstract:

There are two main types of bone in the human body: trabecular and cortical. Cortical bone is primarily found on the outer surface of most bones in the body, while trabecular bone is found in the vertebrae and at the ends of long bones (Ross 2007). Osteoporosis is a condition that compromises the structural integrity of trabecular bone, greatly reducing the ability of the bone to absorb energy from falls. The current method for diagnosing osteoporosis and predicting fracture risk is the measurement of bone mineral density. Limitations of this method include dependence on the bone density measurement device and on the type of test and measurement location (Rubin 2005). Each year there are approximately 250,000 hip fractures in the United States due to osteoporosis (Kleerekoper 2006). Currently, the most common method for repairing a hip fracture is hip fixation surgery. During surgery, a temporary guide wire is inserted to guide the permanent screw into place and is then removed. It is believed that directly measuring this screw pullout force may result in a better assessment of bone quality than current indirect measurement techniques (T. Bowen 2008-2010, pers. comm.). The objective of this project is to design a device that can measure the force required to extract this guide wire, which would give the surgeon a direct, quantitative measurement of bone quality at the site of the fixation. A first-generation device was designed by a Bucknell Biomedical Engineering senior design team during the 2008-2009 academic year. The first step of this project was to examine that device, conduct a thorough design analysis, and brainstorm new concepts. The concept selected uses a translational screw to extract the guide wire. The device was fabricated and underwent validation testing to ensure that it was functional and met the required engineering specifications. Two tests were conducted: one to assess the functionality of the device by checking whether it gave repeatable results, and the other to assess its sensitivity to misalignment. Guide wires were extracted from three materials (low-density polyethylene, ultra-high-molecular-weight polyethylene, and polypropylene) and the force of extraction was measured. During testing, it was discovered that the spring in the device did not have a high enough spring constant to reach the forces necessary for extracting the wires without excessive deflection of the spring. The test procedure was therefore modified slightly so that the wires were not fully threaded into the material. The testing results indicate that there is significant variation in the screw pullout force, up to 30% of the average value. This variation was attributed to problems in the testing and data collection, and a revised set of tests was proposed to better evaluate the performance of the device. The fabricated device is a fully functioning prototype, and further refinements and testing may lead to a third-generation version capable of measuring the screw pullout force during hip fixation surgery.

Relevance: 40.00%

Abstract:

PURPOSE: Computer-based feedback systems for assessing the quality of cardiopulmonary resuscitation (CPR) are widely used these days. Recordings usually involve compression- and ventilation-dependent variables. Thorax compression depth, sufficient decompression, and correct hand position are displayed but interpreted independently of one another. We aimed to generate a single parameter that combines all the relevant compression parameters to provide a rapid assessment of the quality of chest compression: the effective compression ratio (ECR). METHODS: The following parameters were used to determine the ECR: compression depth, correct hand position, correct decompression, and the proportion of time used for chest compressions relative to the total time spent on CPR. Based on the ERC guidelines, we calculated that guideline-compliant CPR (30:2) has a minimum ECR of 0.79. To calculate the ECR, we expanded the previously described software solution. In order to demonstrate the usefulness of the new ECR parameter, we first performed a PubMed search for studies that reported correct compression and no-flow time, after which we calculated the new parameter, the ECR. RESULTS: The PubMed search revealed 9 trials. Calculated ECR values ranged between 0.03 (for a basic life support [BLS] study, two rescuers, no feedback) and 0.67 (BLS with feedback from the 6th minute). CONCLUSION: The ECR enables a rapid, meaningful assessment of CPR and simplifies the comparability of studies as well as of the individual performance of trainees. The structure of the software solution allows it to be easily adapted to any manikin, CPR feedback device, and different resuscitation guidelines (e.g. ILCOR, ERC).
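The abstract does not spell out the exact ECR formula, so the following is only one plausible reading of it: a sketch that combines the fraction of fully correct compressions with the fraction of CPR time spent compressing, which under the assumed timings reproduces a value close to the 0.79 reported for guideline-compliant 30:2 CPR.

```python
def effective_compression_ratio(n_correct, n_total, hands_on_time_s, total_time_s):
    """One plausible reading of the ECR (illustrative only; the exact published
    formula is not given in the abstract).

    Combines the fraction of compressions that are fully correct (depth, hand
    position, complete decompression) with the fraction of CPR time actually
    spent compressing the chest.
    """
    if n_total == 0 or total_time_s == 0:
        return 0.0
    return (n_correct / n_total) * (hands_on_time_s / total_time_s)

# Hypothetical 30:2 cycle: 30 correct compressions at 100/min (~18 s) followed
# by ~4.7 s for two ventilations -> a value close to the reported 0.79.
print(round(effective_compression_ratio(30, 30, 18.0, 22.7), 2))
```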