972 results for Reliability prediction


Relevance: 60.00%

Abstract:

This study describes the rehabilitation length of stay (LOS), discharge destination and discharge functional status of 149 patients admitted with traumatic brain injury (TBI) to an Australian hospital over a 5-year period. Hospital charts of patients admitted between 1993 and 1998 were reviewed. Mean LOS over the 5-year period was 61.8 days and decreased only nominally over this time. Longer LOS was predicted by lower admission motor FIM scores and the presence of comorbidities. Mean admission and discharge motor FIM scores were 58 and 79, a gain of 21 points. Higher discharge motor FIM scores were predicted by higher admission motor FIM scores and younger age. FIM gain was predicted by cognitive status and age. Most patients (88%) were discharged back to the community, with 30% changing their living setting or situation. A change in living status was predicted by living alone and by poorer functional status on admission.

Relevance: 60.00%

Abstract:

OBJECTIVE: To assess the theoretical and practical knowledge of the Glasgow Coma Scale (GCS) among trained air-rescue physicians in Switzerland. METHODS: Prospective anonymous observational study with a specially designed questionnaire. General knowledge of the GCS and its use in a clinical case were assessed. RESULTS: Of 130 questionnaires sent out, 103 were returned (response rate 79.2%) and analyzed. Theoretical knowledge of the GCS was consistent across registrars, fellows, consultants and private practitioners active in physician-staffed helicopters. The clinical case was wrongly scored by 38 participants (36.9%). The motor component was evaluated incorrectly in 28 questionnaires (27.2%), and 19 errors were made on the verbal score (18.5%). Errors were made most frequently by registrars (47.5%, p = 0.09), followed by fellows (31.6%, p = 0.67) and private practitioners (18.4%, p = 1.00). Consultants made significantly fewer errors than the other participating physicians (0%, p < 0.05). No statistically significant differences were found between anesthetists, general practitioners, internal medicine trainees or others. CONCLUSION: Although out-of-hospital physicians' theoretical knowledge of the GCS is sound, significant errors were made in scoring a clinical case, and less experienced physicians had a higher error rate. Further emphasis on teaching the GCS is mandatory.
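GCS scoring itself is mechanical once the three components have been assessed; the errors above arise in choosing the component values, not in the arithmetic. A minimal Python sketch of the standard total, with illustrative example values:

```python
def glasgow_coma_scale(eye: int, verbal: int, motor: int) -> int:
    """Total GCS from its three components.

    Standard ranges: eye opening 1-4, verbal response 1-5,
    motor response 1-6, giving a total of 3 (deep coma) to 15 (alert).
    """
    if not (1 <= eye <= 4 and 1 <= verbal <= 5 and 1 <= motor <= 6):
        raise ValueError("component score out of range")
    return eye + verbal + motor

# Example: eyes open to pain (2), incomprehensible sounds (2),
# withdrawal from pain (4) -> GCS 8
print(glasgow_coma_scale(eye=2, verbal=2, motor=4))  # 8
```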

Relevance: 60.00%

Abstract:

Accelerated life testing (ALT) is widely used to obtain reliability information about a product within a limited time frame, and the Cox proportional hazards (PH) model is often used for reliability prediction. My master's thesis research focuses on designing accelerated life testing experiments for reliability estimation. We consider multiple step-stress ALT plans with censoring, and investigate the optimal stress levels and the times at which to change them. We discuss optimal designs under three optimality criteria: D-, A- and Q-optimality. We note that the classical designs are optimal only if the assumed model is correct. Because predictions from ALT data, collected at stress levels higher than the normal operating condition, require extrapolation, the assumed model cannot be tested. Therefore, to guard against possible misspecification of the PH model, the construction of robust designs is also explored.
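As a rough illustration of the kind of experiment the thesis designs, the sketch below simulates a two-step step-stress ALT with Type I censoring under a log-linear, PH-type hazard with exponential lifetimes. The function name, parameter values and two-step plan are illustrative assumptions, not the thesis's actual design:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_step_stress(n, stresses, change_times, censor_time,
                         lam0=0.01, beta=1.0):
    """Simulate a step-stress ALT under a log-linear (PH-type) hazard.

    The hazard at stress s is lam0 * exp(beta * s); with exponential
    lifetimes the memoryless property lets us draw the remaining life
    segment by segment. Returns (times, event_indicator).
    """
    times = np.empty(n)
    events = np.empty(n, dtype=bool)
    bounds = list(change_times) + [censor_time]
    for i in range(n):
        t_prev = 0.0
        for s, t_end in zip(stresses, bounds):
            lam = lam0 * np.exp(beta * s)
            t_fail = t_prev + rng.exponential(1.0 / lam)
            if t_fail <= t_end:
                times[i], events[i] = t_fail, True
                break
            t_prev = t_end
        else:  # survived every segment -> right-censored
            times[i], events[i] = censor_time, False
    return times, events

# Two-step plan: low stress until t=50, then high stress, censor at t=100.
t, d = simulate_step_stress(200, stresses=[1.0, 2.0],
                            change_times=[50.0], censor_time=100.0)
print(f"{d.mean():.0%} failures observed")
```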

Relevance: 40.00%

Abstract:

Measured rates of intrinsic clearance determined using cryopreserved trout hepatocytes can be extrapolated to the whole animal as a means of improving modeled bioaccumulation predictions for fish. To date, however, the intra- and interlaboratory reliability of this procedure has not been determined. In the present study, three laboratories determined the in vitro intrinsic clearance of six reference compounds (benzo[a]pyrene, 4-nonylphenol, di-tert-butyl phenol, fenthion, methoxychlor and o-terphenyl) by conducting substrate depletion experiments with cryopreserved trout hepatocytes from a single source. o-Terphenyl was excluded from the final analysis due to non-first-order depletion kinetics and significant loss from denatured controls. For the other five compounds, intralaboratory variability (% CV) in measured in vitro intrinsic clearance values ranged from 4.1 to 30%, while interlaboratory variability ranged from 27 to 61%. Predicted bioconcentration factors based on the in vitro clearance values exhibited lower interlaboratory variability (5.3-38% CV). These results demonstrate that cryopreserved trout hepatocytes can be used to reliably obtain the in vitro intrinsic clearance of xenobiotics, supporting the application of this in vitro method in a weight-of-evidence approach to chemical bioaccumulation assessment.
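The substrate depletion approach reduces to fitting a first-order decay and scaling the rate constant by cell density. A minimal sketch under that assumption, with a hypothetical time course and an illustrative cell density; the % CV computation mirrors the interlaboratory variability metric quoted above:

```python
import numpy as np

def intrinsic_clearance(t_h, conc, cell_density=2.0e6):
    """First-order depletion rate from ln(concentration) vs time,
    scaled to an in vitro intrinsic clearance (mL/h/10^6 cells).

    cell_density is cells per mL of incubation (illustrative value).
    """
    k = -np.polyfit(t_h, np.log(conc), 1)[0]   # depletion rate, 1/h
    return k / (cell_density / 1.0e6)          # mL/h/10^6 cells

# Hypothetical depletion time course (hours, relative concentration)
t = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
c = 100.0 * np.exp(-0.35 * t) * np.random.default_rng(1).normal(1, 0.02, t.size)
cl = intrinsic_clearance(t, c)

# Interlaboratory variability expressed as a coefficient of variation
labs = np.array([cl, cl * 1.3, cl * 0.8])      # hypothetical lab results
print(f"CL_int = {cl:.3f} mL/h/1e6 cells, "
      f"CV = {100 * labs.std(ddof=1) / labs.mean():.0f}%")
```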

Relevance: 30.00%

Abstract:

The technical reliability (i.e., interinstrument and interoperator reliability) of three SEAC swept-frequency bioimpedance monitors was assessed for both errors of measurement and the associated analyses. In addition, intraoperator and intrainstrument variability was evaluated for repeat measures over a 4-hour period. The measured impedance values from a range of resistance-capacitance circuits were accurate to within 3% of theoretical values over a range of 50-800 ohms. Similarly, phase was measured over the range 1°-19° with a maximum deviation of 1.3° from the theoretical value. The extrapolated impedance at zero frequency was equally well determined (±3%). However, the accuracy of the extrapolated value at infinite frequency was decreased, particularly at impedances below 50 ohms (approaching the lower limit of the instrument's measurement range). Interinstrument and interoperator variation for whole-body measurements, recorded on human volunteers, showed biases of less than ±1% for measured impedance values and less than 3% for phase. The variation in the extrapolated impedance values at zero and infinite frequencies included variation due to the operator's choice of analysis parameters but was still less than ±0.5%. © 1997 Wiley-Liss, Inc.
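The zero- and infinite-frequency impedances being extrapolated follow from the behaviour of a standard resistance-capacitance test circuit. A minimal sketch assuming the common Fricke-type topology, with illustrative component values rather than the circuits actually used in the study:

```python
import numpy as np

def fricke_impedance(f_hz, re=400.0, ri=300.0, cm=2.0e-9):
    """Complex impedance of a Fricke-type test circuit: extracellular
    resistance re in parallel with (intracellular resistance ri in
    series with membrane capacitance cm). Values are illustrative.
    """
    w = 2 * np.pi * np.asarray(f_hz)
    z_branch = ri + 1.0 / (1j * w * cm)
    return re * z_branch / (re + z_branch)

f = np.logspace(3, 6, 50)                     # 1 kHz .. 1 MHz sweep
z = fricke_impedance(f)

# Limits the analyser extrapolates to:
r0 = 400.0                                    # Z -> re as f -> 0
r_inf = 400.0 * 300.0 / (400.0 + 300.0)       # re || ri as f -> infinity
print(f"|Z| sweep range {abs(z).min():.0f}-{abs(z).max():.0f} ohm, "
      f"R0 = {r0:.0f} ohm, Rinf = {r_inf:.1f} ohm")
```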

Relevance: 30.00%

Abstract:

Objective. To determine the slow crack growth (SCG) and Weibull parameters of five dental ceramics: a vitreous porcelain (V), a leucite-based porcelain (D), a leucite-based glass-ceramic (E1), a lithium disilicate glass-ceramic (E2) and a glass-infiltrated alumina composite (IC). Methods. Eighty disks (Ø 12 mm × 1.1 mm thick) of each material were fabricated according to the manufacturers' recommendations and polished. The stress corrosion susceptibility coefficient (n) was obtained by dynamic fatigue testing, with specimens tested in biaxial flexure at five stress rates while immersed in artificial saliva at 37 °C. Weibull parameters were calculated for the 30 specimens tested at 1 MPa/s in artificial saliva at 37 °C. The 80 specimens were distributed as follows: 10 for each of the stress rates 10⁻², 10⁻¹, 10¹ and 10² MPa/s, 10 for inert strength (10² MPa/s, silicone oil) and 30 for 10⁰ MPa/s. Fractographic analysis was also performed to investigate the fracture origin. Results. E2 showed the lowest slow crack growth susceptibility coefficient (17.2), followed by D (20.4) and V (26.3). E1 and IC presented the highest n values (30.1 and 31.1, respectively). Porcelain V presented the lowest Weibull modulus (5.2); all other materials showed similar Weibull modulus values, ranging from 9.4 to 11.7. Fractographic analysis indicated that for porcelain D, glass-ceramics E1 and E2, and composite IC, crack deflection was the main toughening mechanism. Significance. This study provides a detailed microstructural and slow crack growth characterization of widely used dental ceramics. This is important from a clinical standpoint to assist the clinician in choosing the best ceramic material for each situation as well as in predicting its clinical longevity. It can also be helpful in developing new materials for dental prostheses. © 2010 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
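Both reported parameters come from simple log-linear regressions: the dynamic fatigue relation gives the SCG coefficient n from strength versus stress rate, and a median-rank fit gives the Weibull modulus. A sketch of those standard estimators with hypothetical strength data (the relations are textbook; the numbers are not from the study):

```python
import numpy as np

def scg_exponent(stress_rates, strengths):
    """Dynamic-fatigue estimate of the slow-crack-growth coefficient n:
    log10(sigma_f) = log10(rate)/(n+1) + const, so n = 1/slope - 1."""
    slope = np.polyfit(np.log10(stress_rates), np.log10(strengths), 1)[0]
    return 1.0 / slope - 1.0

def weibull_modulus(strengths):
    """Weibull modulus m from a linear fit of ln(ln(1/(1-F))) against
    ln(sigma), using median-rank plotting positions."""
    s = np.sort(np.asarray(strengths))
    f = (np.arange(1, s.size + 1) - 0.3) / (s.size + 0.4)  # median ranks
    return np.polyfit(np.log(s), np.log(-np.log(1 - f)), 1)[0]

# Hypothetical mean strengths at the five stress rates
rates = np.array([1e-2, 1e-1, 1e0, 1e1, 1e2])              # MPa/s
sigma = np.array([62.0, 68.5, 75.2, 82.9, 91.3])           # MPa
print(f"n = {scg_exponent(rates, sigma):.1f}")
print(f"m = {weibull_modulus(np.random.default_rng(0).weibull(10, 30) * 80):.1f}")
```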

Relevance: 30.00%

Abstract:

In the last few years we have seen an exponential increase in information systems, and parking information is one more example. Reliable, up-to-date information on parking-slot availability is very important to the goal of reducing traffic, and parking-slot prediction is a new topic that has already begun to be applied; San Francisco in the United States and Santander in Spain are examples of projects carried out to obtain this kind of information. The aim of this thesis is the study and evaluation of methodologies for parking-slot prediction and their integration into a web application, where all kinds of users will be able to see the current parking status as well as future status according to the model's predictions. The source of the data is ancillary in this work, but it still needs to be understood in order to understand parking behaviour. Many modelling techniques are used for this purpose, such as time series analysis, decision trees, neural networks and clustering. In this work the author explains the techniques best suited to this task, analyzes the results and points out the advantages and disadvantages of each one. The model learns the periodic and seasonal patterns of the parking status, and with this knowledge it can predict future status values for a given date. The data come from the Smart Park Ontinyent and consist of parking occupancy status together with timestamps, stored in a database. After data acquisition, data analysis and pre-processing were needed before the models could be implemented. The first test used a boosting ensemble classifier over a set of decision trees, created with the C5.0 algorithm from a set of training samples, to assign a prediction value to each object. In addition to the predictions, this work reports measurement errors that indicate how reliable the predictions are. The second test used the TBATS seasonal exponential smoothing model. Finally, a combination of the two previous models was tested to see the result of this combination. The results were quite good for all of them, with average errors of 6.2, 6.6 and 5.4 vacancies for the three models respectively; for a car park of 47 places this means roughly a 10% average error in parking-slot predictions. This result could be even better with a longer data record. To make this kind of information visible and reachable by anyone with an internet-connected device, a web application was built. Besides displaying the data, this application also offers other functions to make searching for parking easier. Apart from parking prediction, the new functions were:

- Distances to car parks. Provides the distances from the user's current location to the different car parks in the city.
- Geocoding. The service that matches a literal description or an address to a concrete location.
- Geolocation. The service that positions the user.
- Parking list panel. Not a service or a function, but a clearer visualization and handling of the information.
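As a rough sketch of the first test described above, the snippet below trains a boosted decision-tree ensemble on synthetic occupancy data with daily and weekly seasonality and reports the mean absolute error in vacancies. scikit-learn's GradientBoostingRegressor stands in for the C5.0 boosting used in the thesis, and the data generator is entirely hypothetical:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

# Hypothetical stand-in for the occupancy feed: hourly vacancy counts
# for a 47-place car park with daily and weekly seasonality.
rng = np.random.default_rng(7)
hours = np.arange(24 * 7 * 8)                      # 8 weeks, hourly
vacancies = np.clip(
    24 + 12 * np.sin(2 * np.pi * hours / 24)       # daily pattern
       + 6 * np.sin(2 * np.pi * hours / (24 * 7))  # weekly pattern
       + rng.normal(0, 3, hours.size),
    0, 47).round()

# Calendar features play the role of the thesis's date inputs.
X = np.column_stack([hours % 24, (hours // 24) % 7])
train, test = slice(None, -24 * 7), slice(-24 * 7, None)

model = GradientBoostingRegressor().fit(X[train], vacancies[train])
pred = model.predict(X[test])
print(f"MAE = {mean_absolute_error(vacancies[test], pred):.1f} vacancies")
```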

Relevance: 30.00%

Abstract:

The major objective of this research project was to use thermal analysis techniques in conjunction with x-ray analysis methods to identify and explain chemical reactions that promote aggregate-related deterioration in portland cement concrete. Twenty-two different carbonate aggregate samples were subjected to a chemical testing scheme that included:

- bulk chemistry (major, minor and selected trace elements)
- bulk mineralogy (minor phases concentrated by acid extraction)
- solid solution in the major carbonate phases
- crystallite size determinations for the major carbonate phases
- a salt treatment study to evaluate the impact of deicer salts

Test results from these different studies were then compared to information that had been obtained using thermogravimetric analysis techniques. Since many of the limestones and dolomites used in the study had extensive field service records, it was possible to correlate many of the variables with service life. The results of this study indicate that thermogravimetric analysis can play an important role in categorizing carbonate aggregates; indeed, with modern automated thermal analysis systems it should be possible to use such methods on a quality-control basis. Strong correlations were found between several of the monitored variables, and several of the variables exhibited significant correlations to concrete service life. When the full data set was used (n = 18), the significant correlations to service life were as follows (α = 5% level):

- r = -0.73 for premature TG loss versus service life
- r = 0.74 for relative crystallite size versus service life
- r = 0.53 for ASTM C666 durability factor versus service life
- r = -0.52 for acid-insoluble residue versus service life

Separating the carbonate aggregates into their mineralogical categories (i.e., calcites and dolomites) tended to increase the correlation coefficients for some specific variables (r sometimes approached 0.90); however, the reliability of such correlations was questionable because of the small number of samples in each category.
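Each of the correlations above is a Pearson coefficient tested at the 5% level. A minimal sketch of that computation with synthetic stand-in data (the real n = 18 aggregate measurements are not reproduced here):

```python
import numpy as np
from scipy import stats

# Hypothetical data: premature TG loss (%) vs service life (years),
# standing in for the n = 18 sample behind the r = -0.73 correlation.
rng = np.random.default_rng(3)
tg_loss = rng.uniform(0.5, 4.0, 18)
service_life = 60 - 9 * tg_loss + rng.normal(0, 6, 18)

# Pearson correlation and its two-sided p-value
r, p = stats.pearsonr(tg_loss, service_life)
print(f"r = {r:.2f}, p = {p:.4f}, "
      f"significant at alpha = 0.05: {p < 0.05}")
```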

Relevance: 30.00%

Abstract:

Both the intermolecular interaction energies and the geometries for M···thiophene, M···pyrrole, Mⁿ⁺···thiophene, and Mⁿ⁺···pyrrole complexes (with M = Li, Na, K, Ca, and Mg; and Mⁿ⁺ = Li⁺, Na⁺, K⁺, Ca²⁺, and Mg²⁺) have been estimated using four commonly used density functional theory (DFT) methods: B3LYP, B3PW91, PBE, and MPW1PW91. Results have been compared to those provided by HF, MP2, and MP4 conventional ab initio methods. PBE and MPW1PW91 are the only DFT methods able to provide a reasonable description of the neutral M···π complexes. Regarding the Mⁿ⁺···π complexes, all four DFT methods have proven adequate in the prediction of these electrostatically stabilized systems, even though they tend to overestimate the interaction energies.
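The quantity compared across these methods is the supermolecular interaction energy: the energy of the complex minus the energies of its isolated fragments. A trivial sketch of that arithmetic with placeholder energies (not values from the paper):

```python
# Supermolecular interaction energy, the quantity compared across the
# DFT and ab initio methods above. Energies in hartree below are
# hypothetical placeholders, not results from the paper.
HARTREE_TO_KCAL = 627.509

def interaction_energy(e_complex, e_metal, e_ligand):
    """E_int = E(M...ligand) - E(M) - E(ligand), converted to kcal/mol."""
    return (e_complex - e_metal - e_ligand) * HARTREE_TO_KCAL

# e.g. a cation-pi complex with placeholder fragment energies:
print(f"{interaction_energy(-372.105, -162.081, -210.000):.1f} kcal/mol")
```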

Relevance: 30.00%

Abstract:

The main objective of this master's thesis is to examine whether Weibull analysis is a suitable method for warranty forecasting in the Case Company. The Case Company has used ReliaSoft's Weibull++ software, which is based on the Weibull method, but has noticed that the analysis has not given correct results. This study was conducted by running Weibull simulations in different profit centers of the Case Company and then comparing actual costs with forecast costs. Simulations were made using different time frames and two methods for determining future deliveries. The first sub-objective is to examine which simulation parameters give the best result for each profit center. The second sub-objective is to create a simple control model for following forecast costs against actual realized costs. The third sub-objective is to document all Qlikview parameters of the profit centers. This is a constructive research study in which solutions to the company's problems are worked out. The theory part introduces quality topics, for example what quality is, quality costing and the cost of poor quality; quality is one of the major concerns in the Case Company, so understanding the link between quality and warranty forecasting is important. Warranty management and other tools for warranty forecasting are also introduced, along with the Weibull method, its mathematical properties and reliability engineering. The main result of this master's thesis is that the Weibull analysis forecast costs that were too high when calculating the provision. Although some profit centers' forecast values were lower than the actual values, the method works better for planning purposes. One reason is that quality improvement, or alternatively quality deterioration, does not show up in the results of the analysis in the short run. Another reason for the excessive values is that the products of the Case Company are complex and the analyses were made at the profit-center level. The Weibull method was developed for standard products and, according to the theory, for homogeneous data, whereas the Case Company's products consist of many complex components. The most important finding is therefore that the analysis should be made at the product level, not the profit-center level, where the data are more homogeneous.
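A common way to turn fitted Weibull parameters into a warranty forecast is to sum, over the delivered fleet, each surviving unit's conditional probability of failing within the forecast horizon. A minimal sketch of that calculation with placeholder shape and scale parameters and a hypothetical fleet (Weibull++'s own forecasting logic is more elaborate):

```python
import numpy as np

def weibull_cdf(t, beta, eta):
    """Weibull failure probability by age t (beta: shape, eta: scale)."""
    return 1.0 - np.exp(-(np.asarray(t) / eta) ** beta)

def expected_warranty_failures(ages, horizon, beta, eta):
    """Expected failures of surviving units over the next `horizon`
    months, conditioned on each unit's current age."""
    ages = np.asarray(ages, dtype=float)
    f_now = weibull_cdf(ages, beta, eta)
    f_later = weibull_cdf(ages + horizon, beta, eta)
    return np.sum((f_later - f_now) / (1.0 - f_now))

# Hypothetical fleet: 1000 delivered units, 0-23 months old, with
# placeholder Weibull parameters beta=1.3, eta=60 months.
rng = np.random.default_rng(5)
ages = rng.uniform(0, 24, 1000)
n_fail = expected_warranty_failures(ages, horizon=12, beta=1.3, eta=60)
print(f"Expected failures over the next 12 months: {n_fail:.0f}")
```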

Relevance: 30.00%

Abstract:

Software systems are progressively being deployed in many facets of human life, and the implications of the failure of such systems vary widely for their users. The fundamental aspect that underpins a software system is a focus on quality. Reliability describes the ability of a system to function in a specified environment for a specified period of time, and is used to measure quality objectively. Evaluating the reliability of a computing system involves computing both hardware and software reliability. Most earlier work focused on software reliability with no consideration of the hardware, or vice versa. However, a complete estimate of the reliability of a computing system requires these two elements to be considered together, and thus demands a combined approach. The present work focuses on this and presents a model for evaluating the reliability of a computing system. The method involves identifying failure data for hardware and software components and building a model on those data to predict reliability. To develop such a model, the focus is on systems based on Open Source Software, since there is an increasing trend towards their use and only a few studies have been reported on modeling and measuring the reliability of such products. The present work includes a thorough study of the role of Free and Open Source Software, an evaluation of reliability growth models, and an integrated model for predicting the reliability of a computing system. The developed model has been compared with existing models and its usefulness is discussed.
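A minimal sketch of the combined idea: treat hardware and software as a series system, with exponential hardware reliability and software reliability taken from the Goel-Okumoto NHPP model, a standard reliability growth model used here purely for illustration. All parameter values are assumed:

```python
import numpy as np

def r_hardware(x, lam=1e-4):
    """Exponential hardware reliability over a mission of x hours."""
    return np.exp(-lam * x)

def r_software(x, t, a=120.0, b=2e-4):
    """Goel-Okumoto NHPP reliability growth model: with mean value
    function m(t) = a(1 - e^{-bt}), the probability of no software
    failure in (t, t+x] is exp(-(m(t+x) - m(t)))."""
    m = lambda u: a * (1.0 - np.exp(-b * u))
    return np.exp(-(m(t + x) - m(t)))

def r_system(x, t):
    """Series combination: the system survives the mission only if
    both the hardware and the software do."""
    return r_hardware(x) * r_software(x, t)

# Hypothetical scenario: after t = 5000 h of testing, the reliability
# of a 100 h mission.
print(f"R_sys(100 h) = {r_system(100.0, 5000.0):.4f}")
```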

Relevance: 30.00%

Abstract:

Forecasting atmospheric blocking is one of the main problems facing medium-range weather forecasters in the extratropics. The European Centre for Medium-Range Weather Forecasts (ECMWF) Ensemble Prediction System (EPS) provides an excellent basis for medium-range forecasting, as it provides a number of different possible realizations of the meteorological future. This ensemble of forecasts attempts to account for uncertainties in both the initial conditions and the model formulation. Since 18 July 2000, routine output from the EPS has included the field of potential temperature on the surface where potential vorticity (PV) equals 2 PV units (PVU), the dynamical tropopause. This has enabled the objective identification of blocking using an index based on the reversal of the meridional potential-temperature gradient. A year of EPS probability forecasts of Euro-Atlantic and Pacific blocking have been produced and are assessed in this paper, concentrating on the Euro-Atlantic sector. Standard verification techniques such as Brier scores, Relative Operating Characteristic (ROC) curves and reliability diagrams are used. It is shown that Euro-Atlantic sector-blocking forecasts are skilful relative to climatology out to 10 days, and are more skilful than the deterministic control forecast at all lead times. The EPS is also more skilful than a probabilistic version of this deterministic forecast, though the difference is smaller. In addition, it is shown that the onset of a sector-blocking episode is less well predicted than its decay. As the lead time increases, the probability forecasts tend towards a model climatology with slightly less blocking than is seen in the real atmosphere. This small under-forecasting bias in the blocking forecasts is possibly related to a westerly bias in the ECMWF model. Copyright © 2003 Royal Meteorological Society
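The Brier score used in this verification is simply the mean squared error of the probability forecasts against binary outcomes, and skill is usually expressed relative to climatology. A minimal sketch with synthetic forecasts and outcomes (illustrative, not the paper's data):

```python
import numpy as np

def brier_score(p_forecast, occurred):
    """Mean squared error of probability forecasts vs 0/1 outcomes."""
    p = np.asarray(p_forecast, dtype=float)
    o = np.asarray(occurred, dtype=float)
    return np.mean((p - o) ** 2)

def brier_skill_score(p_forecast, occurred):
    """Skill relative to a constant climatological forecast; positive
    values mean the forecasts beat climatology."""
    o = np.asarray(occurred, dtype=float)
    bs = brier_score(p_forecast, occurred)
    bs_clim = brier_score(np.full_like(o, o.mean()), o)
    return 1.0 - bs / bs_clim

# Hypothetical daily Euro-Atlantic blocking forecasts over one season:
rng = np.random.default_rng(11)
blocked = rng.random(90) < 0.25            # ~25% blocking frequency
p_eps = np.clip(0.25 + 0.5 * (blocked - 0.25) + rng.normal(0, 0.1, 90), 0, 1)
print(f"BSS = {brier_skill_score(p_eps, blocked):.2f}")
```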