989 results for Service Measurement
Abstract:
The aim of our study was to provide an innovative headspace-gas chromatography-mass spectrometry (HS-GC-MS) method applicable for the routine determination of blood CO concentration in forensic toxicology laboratories. The main drawback of the GC/MS methods discussed in the literature for CO measurement is the absence of a specific CO internal standard necessary for performing quantification. Even though a stable isotope of CO is commercially available in the gaseous state, it is essential to develop a safer method that limits the manipulation of gaseous CO and precisely controls the injected amount of CO for spiking and calibration. To avoid handling a stable isotope-labeled gas, we chose to generate an internal labeled standard gas ((13)CO) in situ in a vial by reacting labeled formic acid (H(13)COOH) with sulfuric acid. As sulfuric acid can also be employed to liberate CO from whole blood, the procedure allows CO to be liberated simultaneously with the generation of (13)CO. This method allows precise measurement of blood CO concentrations from a small amount of blood (10 μL). Finally, the method was applied to measure the CO concentration of blood samples from autopsies of intoxicated humans.
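As an illustration of the quantification step implied by this internal-standard approach, here is a minimal Python sketch assuming a hypothetical linear calibration with made-up peak areas and units: the CO/(13)CO peak-area ratio of spiked calibrators is regressed against the spiked CO concentration, and an unknown sample is then back-calculated from its own ratio.

```python
import numpy as np

# Hypothetical calibration data (units illustrative only): spiked CO
# concentration and the measured peak-area ratio CO / (13)CO internal standard.
cal_conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
cal_ratio = np.array([0.11, 0.21, 0.43, 0.82, 1.65])

# Linear calibration: ratio = slope * concentration + intercept
slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

def co_concentration(sample_ratio):
    """Back-calculate the blood CO concentration from a measured area ratio."""
    return (sample_ratio - intercept) / slope

print(co_concentration(0.95))  # roughly 0.46 in the same hypothetical units
```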
Abstract:
In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. How to multiplex two or more diverse traffic classes while providing different quality of service (QOS) requirements is a complicated open problem. Two distinct options are available: integration and segregation. In the integration approach, all the traffic from different connections is multiplexed onto one VP, which implies that the most restrictive QOS requirements must be applied to all services. Link utilization is therefore decreased, because unnecessarily stringent QOS is provided to all connections. With the segregation approach, the problem is much simplified: different types of traffic are separated by assigning each a VP with dedicated resources (buffers and links). However, resources may not be efficiently utilized, because no bandwidth sharing can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we carry out bandwidth allocation using the PC. We first focus on the influence of several parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class, using the new convolution approach. Numerical results are presented both to compare the required capacity and to determine under which conditions each approach is preferred.
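The convolution idea can be sketched as follows; this is a minimal Python illustration assuming independent two-state (on/off) sources with hypothetical peak rates and activity probabilities, not the paper's exact traffic model: the distribution of the aggregate bandwidth demand on a VP is the convolution of the per-connection demand distributions, and the PC is the probability mass above the VP capacity.

```python
import numpy as np

def pc_by_convolution(connections, capacity):
    """Probability of congestion for a set of independent on/off sources.

    Each connection is (peak, p_on): it demands `peak` bandwidth units with
    probability `p_on`, otherwise 0. The aggregate demand distribution is
    built by successive convolution; PC is the mass exceeding `capacity`.
    """
    dist = np.array([1.0])              # P(total demand = 0) = 1 initially
    for peak, p_on in connections:
        conn = np.zeros(peak + 1)
        conn[0] = 1.0 - p_on            # source idle
        conn[peak] = p_on               # source active at peak rate
        dist = np.convolve(dist, conn)  # add this source's demand
    return dist[capacity + 1:].sum()

# Hypothetical traffic mix: 20 bursty sources (peak 5, 30% active) and
# 10 smoother sources (peak 2, 60% active) sharing a VP of capacity 45.
mix = [(5, 0.3)] * 20 + [(2, 0.6)] * 10
print(pc_by_convolution(mix, capacity=45))
```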
Abstract:
In the accounting literature, interaction or moderating effects are usually assessed by means of OLS regression, and summated rating scales are constructed to reduce measurement error bias. Structural equation models and two-stage least squares regression could be used to eliminate this bias completely, but large samples are needed. Partial Least Squares is appropriate for small samples but does not correct for measurement error bias. In this article, disattenuated regression is discussed as a small-sample alternative and is illustrated on the data of Bisbe and Otley (in press), who examine the interaction effect of innovation and style of use of budgets on performance. Sizeable differences emerge between OLS and disattenuated regression.
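For readers unfamiliar with disattenuation, the following minimal sketch (hypothetical reliabilities and correlation, not the Bisbe and Otley data) shows the classical correction for attenuation that underlies disattenuated regression: the observed correlation between two summated scales is divided by the square root of the product of their reliabilities.

```python
import numpy as np

def disattenuate(r_observed, rel_x, rel_y):
    """Estimate the true-score correlation from an observed correlation
    and the reliabilities (e.g. Cronbach's alpha) of the two scales."""
    return r_observed / np.sqrt(rel_x * rel_y)

# Hypothetical values, for illustration only.
r_xy = 0.35                      # observed correlation between the scales
alpha_x, alpha_y = 0.78, 0.82    # scale reliabilities
print(round(disattenuate(r_xy, alpha_x, alpha_y), 3))  # ~0.438

# A standardized slope based on the corrected correlation is larger than its
# OLS counterpart, which is where the "sizeable differences" come from.
```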
Abstract:
This paper focuses on QoS routing with protection in an MPLS network over an optical layer. In this multi-layer scenario, each layer deploys its own fault management methods. A partially protected optical layer is proposed, and the rest of the network is protected at the MPLS layer. New protection schemes that avoid protection duplication are proposed. Moreover, this paper also introduces a new traffic classification based on the level of reliability. The failure impact is evaluated in terms of recovery time, depending on the traffic class. The proposed schemes also include a novel variation of minimum interference routing and shared segment backup computation. A complete set of experiments shows that the proposed schemes are more efficient than previous ones in terms of the resources used to protect the network, the failure impact and the request rejection ratio.
Abstract:
Introduction: Home blood pressure monitoring is recommended by several guidelines and has been shown to be feasible in elderly people. Wrist monitors have recently been proposed for home blood pressure measurement, but their accuracy has not previously been evaluated in elderly patients. Methods: Forty-eight participants (33 women and 15 men, mean age 81.3±8.0 years) had their blood pressure measured with a wrist device fitted with a position sensor and with an upper-arm device, in random order and in a seated position. Results: Mean blood pressure readings were systematically lower with the wrist device than with the arm device for systolic pressure (120.1±2.2 vs. 130.5±2.2 mmHg, P < 0.001, mean±standard deviation) and for diastolic pressure (66.0±1.3 vs. 69.7±1.3 mmHg, P < 0.001). Moreover, a difference of 10 mmHg or more between the arm and wrist devices was observed in 54.2% and 18.8% of the systolic and diastolic measurements, respectively. Conclusion: Compared with the arm device, the position-sensor wrist device systematically underestimated both systolic and diastolic blood pressure. The magnitude of the difference is clinically significant and calls into question the use of wrist devices for blood pressure monitoring in elderly people. This study indicates the need to validate blood pressure measurement devices in all age groups, including the elderly.
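The headline comparison can be reproduced on any set of paired readings; the Python sketch below uses hypothetical systolic values rather than the study data and simply computes the mean arm-wrist difference and the share of pairs differing by 10 mmHg or more.

```python
import numpy as np

# Hypothetical paired systolic readings (mmHg) from the same subjects.
arm_sys   = np.array([132, 145, 128, 150, 138, 125, 160, 142])
wrist_sys = np.array([120, 136, 121, 135, 130, 118, 148, 133])

diff = arm_sys - wrist_sys                       # arm minus wrist, per subject
print("mean difference:", diff.mean(), "mmHg")   # systematic underestimation
print(">=10 mmHg discrepancy:", np.mean(np.abs(diff) >= 10) * 100, "%")
```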
Abstract:
OBJECTIVES: To assess the prevalence and predictors of service disengagement in a treated epidemiological cohort of first-episode psychosis (FEP) patients. METHODS: The Early Psychosis Prevention and Intervention Centre (EPPIC) in Australia admitted 786 FEP patients from January 1998 to December 2000. Treatment at EPPIC is scheduled for 18 months. Data were collected from patients' files using a standardized questionnaire. Seven hundred four files were available; 44 were excluded because of a non-psychotic diagnosis at endpoint (n=43) or missing data on service disengagement (n=1). The rate of service disengagement was the outcome of interest; pre-treatment, baseline, and treatment predictors of service disengagement were examined via Cox proportional hazards models. RESULTS: 154 patients (23.3%) disengaged from service. A past forensic history (hazard ratio [HR]=1.69; 95%CI 1.17-2.45), lower severity of illness at baseline (HR=0.59; 95%CI 0.48-0.72), living without family at discharge (HR=1.75; 95%CI 1.22-2.50) and persistence of substance use disorder during treatment (HR=2.30; 95%CI 1.45-3.66) were significant predictors of disengagement from service. CONCLUSIONS: While engagement strategies are a core element in the treatment of first-episode psychosis, particular attention should be paid to these factors associated with disengagement. Involvement of the family in the treatment process, and focusing on reduction of substance use, need to be pursued in early intervention services.
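A survival analysis of this kind can be sketched in a few lines of Python; the example below uses the lifelines library on a purely illustrative dataframe (column names and values are assumptions, not the EPPIC data).

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort: follow-up time until disengagement or end of the
# 18-month treatment period, an event flag, and two binary predictors.
df = pd.DataFrame({
    "months_followed": [3, 18, 7, 18, 12, 5, 18, 9, 14, 18, 6, 18],
    "disengaged":      [1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0],   # 1 = event
    "forensic_history":         [1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0],
    "persistent_substance_use": [1, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_followed", event_col="disengaged")
cph.print_summary()   # hazard ratios with 95% confidence intervals
```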
Abstract:
The success story of hydroelectricity has long influenced and dominated Swiss scholarly literature devoted to the history of technology. This means of producing power, which emerged at the end of the 19th century and is still dominant today, has attracted much more attention than technologies overshadowed by its success. In spite of their important contribution to Swiss economic development, the distribution networks of pressurized water have been neglected by scholars. This article helps to close this historiographic gap by analyzing the introduction of pressurized water distribution in Lausanne in 1876, in the context of the building of the first Swiss cable funicular between Lausanne and Ouchy. The article shows how pressurized water distribution transformed socio-economic practices in the urban areas in which it was adopted. Indeed, this innovation, which allowed the use of distant hydraulic resources, enabled the rationalization of industrial and artisanal production and increased the density of the urban industrial base. By facilitating the introduction of electric lighting, pressurized water networks played a key role in the early development, and subsequent successes, of the Swiss hydroelectric industry.
Abstract:
The presentation will focus on the reasons for deploying an e-reader loan service at a virtual university library as part of an e-learning support system that aids user mobility, the concentration of documentary and electronic resources, and ICT skills acquisition, using the example of the UOC pilot project and its subsequent consolidation. E-reader devices at the UOC are an extension of the Virtual Campus: they are offered as a tool to aid user mobility, access to documentary and electronic resources, and the development of information and IT skills. The e-reader loan service began as a pilot project in 2009 and was consolidated in 2010. The UOC Library piloted the service from October to December 2009 with 15 devices and 37 loans. The project was extended into 2010 with the same number of devices and 218 loans (October 2010). In 2011 the e-reader loan service will involve 190 devices, thus offering an improved service. The reasons for deploying an e-reader loan service at the UOC are the following: a) to offer library users access to the many kinds of learning materials available at the UOC through a single device that facilitates student study and learning; b) to enhance access to and use of the e-book collections subscribed to by the UOC Library; c) to align with UOC strategy on the development of learning materials in multiple formats and to promote e-devices as an extension of the UOC Virtual Campus; and d) to increase UOC Library visibility within and beyond the institution. The presentation will conclude with an analysis of the key issues to be taken into account at a university library: the e-reader market, the unclear business and licensing model for e-book content, and the library's role in promoting new reading formats to increase the use of e-collections.
Abstract:
Bibliographic reference: Rol, 55330
Abstract:
BACKGROUND: Straylight gives the appearance of a veil of light thrown over a person's retinal image when a strong light source is present. We examined the reproducibility of straylight measurements obtained with the C-Quant, and assessed their correlation with characteristics of the eye and subjects' age. PARTICIPANTS AND METHODS: Five repeated straylight measurements were taken using the dominant eye of 45 healthy subjects (age 21-59) with a BCVA of 20/20: 14 emmetropic, 16 myopic, 8 hyperopic and 7 with astigmatism. We assessed the extent of reproducibility of the straylight measures using the intraclass correlation coefficient. RESULTS: The mean straylight value of all measurements was 1.01 (SD 0.23, median 0.97, interquartile range 0.85-1.1). Per 10 years of age, straylight increased on average by 0.10 (95%CI 0.04 to 0.16, p < 0.01). We found no independent association of refraction (range -5.25 dpt to +2 dpt) with straylight values (0.001; 95%CI -0.022 to 0.024, p = 0.92). Compared with emmetropic subjects, myopia reduced straylight (-0.011; -0.024 to 0.02, p = 0.11), whereas higher straylight values (0.09; -0.01 to 0.20, p = 0.09) were observed in subjects with blue irises than in those with dark-colored irises when correcting for age. The intraclass correlation coefficient (ICC) of repeated measurements was 0.83 (95%CI 0.76 to 0.90). CONCLUSIONS: Our study showed that straylight measurements with the C-Quant had high reproducibility, i.e. a lack of large intra-observer variability, making the method appropriate for long-term follow-up studies assessing the effect of surgical procedures on the quality of vision.
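As an illustration of the reproducibility statistic used here, the Python sketch below computes a one-way random-effects ICC(1,1) from hypothetical repeated readings; the specific ICC model used in the study is an assumption of this example.

```python
import numpy as np

# Hypothetical data: rows = subjects, columns = five repeated straylight
# readings from the same device. Real C-Quant data would be shaped the same way.
x = np.array([
    [0.95, 0.98, 0.93, 0.97, 0.96],
    [1.20, 1.15, 1.22, 1.18, 1.21],
    [0.85, 0.88, 0.84, 0.86, 0.90],
    [1.05, 1.02, 1.08, 1.04, 1.06],
])
n, k = x.shape
grand = x.mean()

# One-way ANOVA mean squares between and within subjects.
ms_between = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
ms_within = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))

# One-way random-effects ICC(1,1).
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(round(icc, 3))   # values near 1 indicate high reproducibility
```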
Abstract:
Leprosy is an infectious and contagious spectral disease accompanied by a series of immunological events triggered by the host response to the aetiologic agent, Mycobacterium leprae. The induction and maintenance of the immune/inflammatory response in leprosy are linked to multiple cell interactions and soluble factors, primarily through the action of cytokines. The purpose of the present study was to evaluate the serum levels of tumour necrosis factor (TNF)-α and its soluble receptors (sTNF-R1 and sTNF-R2) in leprosy patients at different stages of multidrug treatment (MDT) in comparison with non-infected individuals and to determine their role as putative biomarkers of the severity of leprosy or the treatment response. ELISA was used to measure the levels of these molecules in 30 healthy controls and 37 leprosy patients at the time of diagnosis and during and after MDT. Our results showed increases in the serum levels of TNF-α and sTNF-R2 in infected individuals in comparison with controls. The levels of TNF-α, but not sTNF-R2, decreased with treatment. The current results corroborate previous reports of elevated serum levels of TNF-α in leprosy and suggest a role for sTNF-R2 in the control of this cytokine during MDT.