955 results for time-variant reliability


Relevance: 30.00%

Abstract:

Background: Effective treatment for breast cancer requires accurate preoperative planning, developing and implementing a consistent definition of margin clearance, and using tools that provide detailed real-time intraoperative information on margin status. Intraoperative ultrasound (IOUS) may fulfil these requirements and may offer advantages that other preoperative localization and intraoperative margin assessment techniques do not. Purpose: The goal of the present work is to determine how accurate intraoperative ultrasound must be to achieve complete surgical excision with negative histological margins in patients undergoing breast-conserving surgery. Design: A diagnostic test study with a cross-sectional design, carried out in the Breast Pathology Unit of a tertiary referral hospital in Girona. Participants: Women diagnosed with breast cancer undergoing breast-conserving surgery in the Breast Pathology Unit at Hospital Universitari de Girona Dr. Josep Trueta.

Relevance: 30.00%

Abstract:

The concept of Process Management has been used by managers and consultants who seek to improve both operational and managerial industrial processes. Its strength lies in focusing on the external client and on optimizing internal processes in order to fulfil the client's needs. As the needs of internal clients are addressed, a series of improvements takes place. The Taguchi method, because it calls for knowledge sharing between design engineers and the people engaged in the process, is a candidate for process management implementation. The objective of this paper is to propose such an application, aiming at improvements in the reliability of results revealed by Taguchi's robust design method.
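The robust design step of the Taguchi method evaluates factor settings through signal-to-noise (S/N) ratios. A minimal sketch of the three standard S/N formulas (the function names are ours, purely illustrative; the paper does not supply code):

```python
import math

def sn_smaller_is_better(y):
    # SN = -10 log10( mean(y_i^2) ): higher is better when the target is zero
    return -10 * math.log10(sum(v * v for v in y) / len(y))

def sn_larger_is_better(y):
    # SN = -10 log10( mean(1 / y_i^2) ): higher is better when large responses are good
    return -10 * math.log10(sum(1 / (v * v) for v in y) / len(y))

def sn_nominal_is_best(y):
    # SN = 10 log10( mean^2 / variance ): rewards low spread around the target
    n = len(y)
    m = sum(y) / n
    s2 = sum((v - m) ** 2 for v in y) / (n - 1)
    return 10 * math.log10(m * m / s2)
```

For each row of an orthogonal-array experiment, the responses observed under noise conditions are reduced to one S/N value, and the factor levels maximizing it are selected.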

Relevance: 30.00%

Abstract:

The main characteristic of the nursing Interactive Observation Scale for Psychiatric Inpatients (IOSPI) is the necessity of interaction between raters and patients during assessment. The aim of this study was to evaluate the reliability and validity of the scale in the "real" world of daily ward practice and to determine whether the IOSPI can increase the interaction time between raters and patients and influence the raters' opinions about mental illness. All inpatients of a general university hospital psychiatric ward were assessed daily over a period of two months by 9 nursing aides during the morning and afternoon shifts, yielding 273 pairs of daily observations. Once a week the patients were interviewed by a psychiatrist who filled in the Brief Psychiatric Rating Scale (BPRS). The IOSPI total score showed significant test-retest reliability (intraclass correlation coefficient = 0.83) and significant correlation with the BPRS total score (r = 0.69), meeting the criteria of concurrent validity. The instrument can also discriminate patients in need of further inpatient treatment from those about to be discharged (negative predictive value for discharge = 0.91). Using this scale, the interaction time between nursing aides and patients increased significantly (t = 2.93, P < 0.05) and their opinions about mental illness changed: the "social restrictiveness" factor of the opinion scale about mental illness showed a significant reduction (t = 4.27, P < 0.01) and the "interpersonal etiology" factor tended to increase (t = 1.98, P = 0.08). The IOSPI was confirmed as a reliable and valid scale and as an efficient tool to stimulate the therapeutic attitudes of the nursing staff.
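The test-retest coefficient reported above is an intraclass correlation. A minimal one-way random-effects ICC(1,1) sketch for paired observations (our own illustration, not the study's software):

```python
import numpy as np

def icc_oneway(scores):
    # scores: (n_subjects, k_ratings) array; one-way random-effects ICC(1,1).
    # Between-subject mean square vs. within-subject mean square.
    x = np.asarray(scores, float)
    n, k = x.shape
    grand = x.mean()
    msb = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    msw = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

For test-retest data, each row holds one patient's two assessments; identical repeat ratings drive the coefficient toward 1.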

Relevance: 30.00%

Abstract:

The objective of the present study was to determine the reliability of the Brazilian version of the Composite International Diagnostic Interview 2.1 (CIDI 2.1) in clinical psychiatry. The CIDI 2.1 was translated into Portuguese following WHO guidelines, and reliability was studied using the inter-rater reliability method. The study sample consisted of 186 subjects from psychiatric hospitals and clinics, primary care centers and community services. The interviewers were a group of 13 lay and three non-lay interviewers who underwent CIDI training. The average interview time was 2 h and 30 min. Overall reliability ranged from kappa = 0.50 to 1. For lifetime diagnoses, reliability ranged from kappa = 0.77 (Bipolar Affective Disorder) to 1 (Substance-Related Disorder, Alcohol-Related Disorder, Eating Disorders). For previous-year diagnoses, reliability ranged from kappa = 0.66 (Obsessive-Compulsive Disorder) to 1 (Dissociative Disorders, Manic Disorders, Eating Disorders). The poorest reliability was found for Mild Depressive Episode (kappa = 0.50) during the previous year. Training proved to be a fundamental factor in maintaining good reliability, and technical knowledge of the questionnaire compensated for the lay interviewers' lack of psychiatric knowledge. Inter-rater reliability was good to excellent for persons in psychiatric practice.
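Inter-rater agreement of the kind reported here is typically summarized by Cohen's kappa, which corrects observed agreement for chance. A minimal two-rater sketch (illustrative only, not the study's analysis code):

```python
def cohens_kappa(r1, r2):
    # r1, r2: categorical ratings by two raters on the same subjects.
    # kappa = (observed agreement - chance agreement) / (1 - chance agreement)
    assert len(r1) == len(r2)
    n = len(r1)
    cats = sorted(set(r1) | set(r2))
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance
    return (po - pe) / (1 - pe)
```

Perfect agreement across mixed categories yields kappa = 1, while agreement no better than chance yields kappa near 0.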

Relevance: 30.00%

Abstract:

This study compared the effectiveness of multifocal visual evoked cortical potentials (mfVEP) elicited by pattern pulse stimulation with that of pattern reversal in producing reliable responses (signal-to-noise ratio > 1.359). Participants were 14 healthy subjects. Visual stimulation was delivered with a 60-sector dartboard display consisting of 6 concentric rings presented in either pulse or reversal mode. Each sector, consisting of 16 checks at 99% Michelson contrast and 80 cd/m² mean luminance, was controlled by a binary m-sequence in the time domain. The signal-to-noise ratio was generally larger in the pattern reversal than in the pattern pulse mode. The number of reliable responses was similar in the central sectors for the two stimulation modes; at the periphery, pattern reversal produced a larger number of reliable responses. Pattern pulse stimuli performed similarly to pattern reversal stimuli in generating reliable waveforms in rings R1 and R2. The advantage of using both protocols to study mfVEP responses is their complementarity: in some patients, reliable waveforms in specific sectors may be obtained with only one of the two methods. The joint analysis of pattern reversal and pattern pulse stimuli increased the rate of reliability for central sectors by 7.14% in R1, 5.35% in R2, 4.76% in R3, 3.57% in R4, 2.97% in R5, and 1.78% in R6. From R1 to R4, the reliability of generating mfVEPs was above 70% when both protocols were used. Thus, for very high reliability and a thorough examination of visual performance, using both stimulation protocols is recommended.
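The "reliable response" criterion above is a simple threshold on the per-sector signal-to-noise ratio. A sketch of such a screen, assuming an RMS-amplitude SNR definition (the paper does not specify its exact formula, so this is an assumption):

```python
import numpy as np

def sector_snr(signal_window, noise_window):
    # Ratio of RMS amplitude in the response window to a noise-only window
    rms = lambda v: float(np.sqrt(np.mean(np.square(v))))
    return rms(signal_window) / rms(noise_window)

def reliable_sectors(responses, threshold=1.359):
    # responses: dict mapping sector id -> (signal_window, noise_window)
    return [s for s, (sig, noi) in responses.items()
            if sector_snr(sig, noi) > threshold]
```

Running this screen under each stimulation protocol and taking the union of reliable sectors mirrors the joint analysis described in the abstract.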

Relevance: 30.00%

Abstract:

Knowledge seems to need the admixture of de facto reliability and epistemic responsibility, but philosophers have had a hard time combining the two into a satisfactory account of knowledge. In this paper I attempt to find a solution by capitalizing on a real and ubiquitous human phenomenon: the social dispersal of epistemic labour through time. More precisely, the central objective of the paper is to deliver a novel and plausible social account of knowledge-relevant responsibility, and to consider the merits of the proposed combination of reliability and responsibility with respect to certain cases of unreflective epistemic subjects.

Relevance: 30.00%

Abstract:

Laser scribing is currently a growing material processing method in industry. The benefits of laser scribing technology are being studied, for example, for improving the efficiency of solar cells. Because of the high quality requirements of the fast scribing process, it is important to monitor the process in real time to detect possible defects as they occur. However, there is a lack of studies on real-time monitoring of laser scribing: commonly used monitoring methods developed for other laser processes, such as laser welding, are too slow, and existing applications cannot be used for fast laser scribing monitoring. The aim of this thesis is to find a method for monitoring laser scribing with a high-speed camera and to evaluate the reliability and performance of the developed monitoring system experimentally. The laser used in the experiments is an IPG ytterbium pulsed fiber laser with 20 W maximum average power, and the scan head optics are a Scanlab Hurryscan 14 II with an f100 telecentric lens. The camera was connected to the laser scanner with a camera adapter to follow the laser process, and a powerful, fully programmable industrial computer was chosen to execute the image processing and analysis. Algorithms for defect analysis, based on particle analysis, were developed using LabVIEW system design software. The performance of the algorithms was analyzed on a non-moving image of the scribing line with a resolution of 960x20 pixels; the maximum analysis speed was 560 frames per second. The reliability of the algorithm was evaluated by imaging a scribing path with a variable number of defects at 2000 mm/s with the laser turned off, at an image analysis speed of 430 frames per second. The experiment was successful: the algorithms detected all defects in the scribing path. The final monitoring experiment was performed during a laser process. However, it was challenging to make active laser illumination work with the laser scanner because of the physical dimensions of the laser lens and the scanner, so for reliable defect detection the illumination system needs to be replaced.
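A particle-analysis pass of the kind described, run on each grabbed frame, can be sketched as follows. The threshold, minimum defect area, connectivity and frame values below are our assumptions for illustration, not the thesis parameters (which were implemented in LabVIEW):

```python
import numpy as np

def find_defects(frame, dark_thresh=60, min_area=4):
    # frame: 2-D grayscale array (e.g. 20 x 960) of the scribing line.
    # A defect is modelled as a connected region of pixels that stayed
    # dark where the scribe should appear bright.
    mask = frame < dark_thresh
    labels = np.zeros(mask.shape, int)
    current = 0
    # simple 4-connected component labelling via iterative flood fill
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                stack = [(i, j)]
                while stack:
                    a, b = stack.pop()
                    if (0 <= a < mask.shape[0] and 0 <= b < mask.shape[1]
                            and mask[a, b] and labels[a, b] == 0):
                        labels[a, b] = current
                        stack += [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]
    areas = [int((labels == l).sum()) for l in range(1, current + 1)]
    return [a for a in areas if a >= min_area]  # particle areas kept as defects
```

At the 430-560 frames per second reported in the thesis, such a pass must complete in roughly 2 ms per frame; this sketch favors clarity over that kind of speed.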

Relevance: 30.00%

Abstract:

This document could not have been completed without the hard work of a number of individuals. First and foremost, my supervisor, Dr. David Gabriel, deserves the utmost recognition for the immense effort and time spent guiding the production of this document through the various stages of completion. Also aiding in the data collection, technical support, and general thought processing were Lab Technician Greig Inglis and fellow members of the Electromyographic Kinesiology Laboratory Jon Howard, Sean Lenhardt, Lara Robbins, and Corrine Davies-Schinkel. The input of Drs. Ted Clancy, Phil Sullivan and external examiner Dr. Anita Christie, all members of the assessment committee, was incredibly important and vital to the completion of this work; their expertise provided a strong source of knowledge and ensured that this project was completed at an exemplary level. There were a number of other individuals who were an immense help in getting this project off the ground and completed. The donation of their time and effort was very generous and much needed in order to fulfill the requirements of this study. Finally, I cannot exclude the contributions of my family throughout this project, especially those of my parents, whose support never wavers.

Relevance: 30.00%

Abstract:

Background: Case-control studies are very frequently used by epidemiologists to assess the impact of certain exposures on a particular disease. These exposures may be represented by several time-dependent variables, and new methods are needed to estimate their effects accurately. Indeed, logistic regression, the conventional method for analyzing case-control data, does not directly account for changes in covariate values over time. By contrast, survival analysis methods such as the Cox proportional hazards model can directly incorporate time-dependent covariates representing individual exposure histories. However, this requires careful handling of the risk sets because of the oversampling of cases, relative to controls, in case-control studies. As shown in a previous simulation study, the optimal definition of risk sets for the analysis of case-control data remains to be elucidated, particularly for time-dependent variables. Objective: The general objective is to propose and study new versions of the Cox model for estimating the impact of time-varying exposures in case-control studies, and to apply them to real case-control data on lung cancer and smoking. Methods: I identified new, potentially optimal risk-set definitions (the Weighted Cox model and the Simple weighted Cox model), in which different weights are assigned to cases and controls in order to reflect the proportions of cases and non-cases in the source population. The properties of the exposure-effect estimators were studied by simulation. Different aspects of exposure were generated (intensity, duration, cumulative exposure). The generated case-control data were then analyzed with different versions of the Cox model, including the previous and new risk-set definitions, as well as with conventional logistic regression, for comparison. The different regression models were then applied to real case-control data on lung cancer; the estimates of the effects of different smoking variables, obtained with the different methods, were compared with each other and with the simulation results. Results: The simulation results show that the estimates from the proposed new weighted Cox models, especially the Weighted Cox model, are much less biased than the estimates from existing Cox models that simply include or exclude the future cases from each risk set. Moreover, the estimates of the Weighted Cox model were slightly, but systematically, less biased than those of logistic regression. The application to real data shows larger differences between the estimates of logistic regression and the weighted Cox models for some time-dependent smoking variables. Conclusions: The results suggest that the proposed new Weighted Cox model could be an attractive alternative to logistic regression for estimating the effects of time-dependent exposures in case-control studies.
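The weighting idea can be illustrated with a toy Breslow-type weighted partial log-likelihood, in which each subject carries a weight meant to reflect the case/non-case proportions in the source population. This is a sketch of the general mechanism only, not the thesis's estimator (a single scalar covariate is assumed for brevity):

```python
import numpy as np

def weighted_cox_loglik(beta, times, events, x, w):
    # Breslow-form weighted Cox partial log-likelihood.
    # times: follow-up times; events: 1 for cases, 0 otherwise;
    # x: scalar covariate per subject; w: subject weights.
    order = np.argsort(times)
    t, d, xv, wv = times[order], events[order], x[order], w[order]
    eta = beta * xv                      # linear predictor
    ll = 0.0
    for i in range(len(t)):
        if d[i]:
            at_risk = t >= t[i]          # weighted risk set at this event time
            ll += wv[i] * (eta[i]
                           - np.log(np.sum(wv[at_risk] * np.exp(eta[at_risk]))))
    return ll
```

Maximizing this function over beta (e.g. by a grid or Newton step) yields the weighted estimate; setting all weights to 1 recovers the ordinary partial likelihood.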

Relevance: 30.00%

Abstract:

Enterotoxigenic E. coli (ETEC) are a frequent cause of post-weaning diarrhea in pigs. Two types of enterotoxins are found in ETEC: heat-labile toxins, such as LT, and heat-stable toxins, such as EAST-1, STa and STb. The latter is composed of 48 amino acids and is implicated in the pathology caused by ETEC. A variant of the STb toxin was discovered for the first time in one study. We therefore hypothesized that variants are present in the Quebec ETEC strain population. Of the 100 STb+ strains analyzed, 23 carried the toxin gene with a variation in the genetic sequence: an asparagine was present at position 12, replacing the histidine. A correlation analysis between the presence of the variant and the presence of virulence factors found in these 100 ETEC strains was carried out. The variant appears strongly associated with the STa toxin, since all variant strains hybridized with the gene encoding it. Given its widespread presence in the Quebec ETEC strain population, we further hypothesized that this variant has altered biological characteristics relative to the wild-type toxin. Circular dichroism analysis showed that the variant and the wild-type toxin have similar secondary structure and stability. Next, binding to the toxin's receptor, sulfatide, was studied by surface plasmon resonance (Biacore). The variant has slightly reduced affinity for sulfatide compared with the wild-type toxin. Since internalization of the toxin was observed in a previous study and appears linked to toxicity, we compared the internalization of the variant and the wild-type toxin into IPEC-J2 cells; internalization of the variant is slightly higher than that of the wild-type toxin. These results suggest that the variant is biochemically and structurally comparable to the wild-type toxin.

Relevance: 30.00%

Abstract:

Objective To determine overall, test-retest and inter-rater reliability of posture indices among persons with idiopathic scoliosis. Design A reliability study using two raters and two test sessions. Setting Tertiary care paediatric centre. Participants Seventy participants aged between 10 and 20 years with different types of idiopathic scoliosis (Cobb angle 15 to 60°) were recruited from the scoliosis clinic. Main outcome measures Based on the XY co-ordinates of natural reference points (e.g. eyes) as well as markers placed on several anatomical landmarks, 32 angular and linear posture indices taken from digital photographs in the standing position were calculated by a specially developed software program. Generalisability theory served to estimate the reliability and standard error of measurement (SEM) for the overall, test-retest and inter-rater designs. Bland and Altman's method was also used to document agreement between sessions and raters. Results In the random design, dependability coefficients demonstrated a moderate level of reliability for six posture indices (ϕ = 0.51 to 0.72) and a good level of reliability for 26 posture indices out of 32 (ϕ ≥ 0.79). Error attributable to marker placement was negligible for most indices. Limits of agreement and SEM values were larger for shoulder protraction, trunk list, Q angle, cervical lordosis and scoliosis angles. The most reproducible indices were the waist angles and knee valgus and varus. Conclusions Posture can be assessed in a global fashion from photographs in persons with idiopathic scoliosis. Despite the good reliability of marker placement, further studies are needed to minimise measurement errors in order to provide a suitable tool for monitoring change in posture over time.
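The agreement statistics used above reduce to short formulas: the SEM follows from a reliability coefficient, and Bland and Altman's limits of agreement from the session-to-session differences. Illustrative helpers (not the study's software):

```python
import numpy as np

def bland_altman_limits(s1, s2):
    # 95% limits of agreement between two sessions or raters:
    # mean difference +/- 1.96 * SD of the differences
    d = np.asarray(s1, float) - np.asarray(s2, float)
    bias = d.mean()
    sd = d.std(ddof=1)
    return bias - 1.96 * sd, bias + 1.96 * sd

def sem_from_reliability(scores_sd, reliability):
    # SEM = SD * sqrt(1 - reliability coefficient)
    return scores_sd * np.sqrt(1.0 - reliability)
```

For example, an index with SD 10° and dependability 0.91 has an SEM of 3°, which sets the smallest change in posture that can be distinguished from measurement error.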

Relevance: 30.00%

Abstract:

In this thesis, a T-policy is implemented in an inventory system with random lead time, and also in the repair of a k-out-of-n system in reliability. An inventory system may be considered as a system for keeping records of the amounts of commodities in stock. Reliability is defined as the ability of an entity to perform a required function under given conditions for a given time interval; it is measured by the probability that an entity E can perform a required function under given conditions for the time interval. The thesis considers a k-out-of-n system with repair and two modes of service under a T-policy: the first server is always available, and the second server is activated on the elapse of T time units. The lead time is exponentially distributed, and T is exponentially distributed from the epoch at which the second server was deactivated after completing the repair of all failed units in the previous cycle, or from the moment n-k failed units accumulate. The repaired units are assumed to be as good as new. Three different situations are studied: cold, warm and hot systems. A k-out-of-n system is called cold, warm or hot according as the functional units do not fail, fail at a lower rate, or fail at the same rate when the system is down as when it is up.
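The availability of such repairable k-out-of-n systems is often checked by simulation. Below is a minimal continuous-time sketch for the "hot" case with a single, always-available repair facility; the T-policy second server and the lead-time mechanics of the thesis are deliberately omitted, so this is a simplified illustration rather than the thesis model:

```python
import random

def simulate_availability(n, k, lam, mu, horizon=50000.0, seed=7):
    # Hot k-out-of-n:G system: each working unit fails at rate lam,
    # one repair facility restores a failed unit at rate mu.
    # Returns the fraction of time at least k units were working.
    rng = random.Random(seed)
    t, up = 0.0, 0.0
    working = n
    while t < horizon:
        rate_fail = working * lam
        rate_rep = mu if working < n else 0.0
        total = rate_fail + rate_rep
        dt = min(rng.expovariate(total), horizon - t)
        if working >= k:                 # system is up in this interval
            up += dt
        t += dt
        if t >= horizon:
            break
        if rng.random() < rate_fail / total:
            working -= 1                 # a unit fails
        else:
            working += 1                 # a repair completes
    return up / horizon
```

With repair much faster than failure (mu >> lam), the estimated availability approaches 1, as expected for a lightly loaded system.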

Relevance: 30.00%

Abstract:

In the present environment, industry must provide products of high quality. The quality of a product is judged by the period of time it can successfully perform its intended functions without failure. The causes of failures can be ascertained through life-testing experiments, and the times to failure due to different causes are likely to follow different distributions. Knowledge of these distributions is essential to eliminate causes of failure and thereby improve the quality and reliability of products. The main accomplishment expected from the study is the development of statistical tools that can facilitate solutions to lifetime data problems arising in such and similar contexts.

Relevance: 30.00%

Abstract:

The service quality of any sector has two major aspects, namely technical and functional. Technical quality can be attained by maintaining the technical specifications decided by the organization. Functional quality refers to the manner in which the service is delivered to the customer, and can be assessed through customer feedback. A field survey was conducted based on the management tool SERVQUAL, by designing 28 constructs under 7 dimensions of service quality. Stratified sampling techniques were used to obtain 336 valid responses, and the gap scores between expectations and perceptions were analyzed using statistical techniques to identify the weakest dimension. To assess the technical aspect of availability, six months of live outage data from base transceiver stations were collected, and statistical and exploratory techniques were used to model the network performance. The failure patterns were modeled as competing-risk models, and the probability distributions of service outage and restoration were parameterized. Since the availability of a network is a function of the reliability and maintainability of the network elements, any service provider who wishes to keep up their service-level agreements on availability should be aware of the variability of these elements and the effects of their interactions. The availability variations were studied by designing a discrete-event simulation model with probabilistic input parameters. The probability distribution parameters derived from the live data analysis were used to design experiments defining the availability domain of the network under consideration; this availability domain can be used as a reference for planning and implementing maintenance activities. A new metric is proposed which incorporates a consistency index along with key service parameters and can be used to compare the performance of different service providers. The developed tool can be used for reliability analysis of mobile communication systems and assumes greater significance in the wake of mobile number portability. It also makes possible a relative measure of the effectiveness of different service providers.
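The SERVQUAL gap analysis described above reduces to a small computation: the mean of perception-minus-expectation scores per dimension, with the most negative gap marking the weakest dimension. A sketch (the dimension names below are placeholders, not the survey's actual seven dimensions):

```python
def servqual_gaps(expectations, perceptions):
    # expectations / perceptions: dict mapping dimension -> list of item scores
    # (e.g. 1-7 Likert). Gap = mean(perception - expectation) per dimension.
    gaps = {}
    for dim in expectations:
        e, p = expectations[dim], perceptions[dim]
        gaps[dim] = sum(pi - ei for pi, ei in zip(p, e)) / len(e)
    return gaps

def weakest_dimension(gaps):
    # The most negative gap marks the largest shortfall of service delivered
    # relative to what customers expect.
    return min(gaps, key=gaps.get)
```

In the study's setting, the same computation would be run over the 28 constructs grouped into 7 dimensions across the 336 responses.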

Relevance: 30.00%

Abstract:

Reliability analysis is a well-established branch of statistics that deals with the statistical study of different aspects of the lifetimes of a system of components. As pointed out earlier, the major part of the theory and applications of reliability analysis has been discussed in terms of the distribution function. In the opening chapters of the thesis, we describe some attractive features of quantile functions and the relevance of their use in reliability analysis. Motivated by the works of Parzen (1979), Freimer et al. (1988) and Gilchrist (2000), who indicated the scope of quantile functions in reliability analysis, and as a follow-up to the systematic study in this connection by Nair and Sankaran (2009), the present work attempts to extend their ideas and develop the necessary theoretical framework for lifetime data analysis. Chapter 1 gives the relevance and scope of the study and a brief outline of the work carried out. Chapter 2 presents various concepts and brief reviews of them, which are useful for the discussions in the subsequent chapters. The introduction of Chapter 4 points out the role of ageing concepts in reliability analysis and in identifying life distributions. Chapter 6 studies the first two L-moments of residual life and their relevance in various applications of reliability analysis; we show that the first L-moment of the residual life function is equivalent to the vitality function, which has been widely discussed in the literature. Chapter 7 defines the percentile residual life in reversed time (RPRL) and derives its relationship with the reversed hazard rate (RHR). We discuss the characterization problem of the RPRL and demonstrate with an example that the RPRL at a given percentile level does not determine the distribution uniquely.
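Two quantities central to these chapters, the vitality function E[X | X > t] and the percentile residual life, can be estimated directly from lifetime data in a few lines. This is an empirical sketch for intuition, not the thesis's quantile-function-based derivations:

```python
import numpy as np

def vitality(data, t):
    # Empirical vitality function: mean remaining total life E[X | X > t],
    # i.e. the first L-moment of residual life shifted by t.
    x = np.asarray(data, float)
    tail = x[x > t]
    return float(tail.mean()) if tail.size else float('nan')

def percentile_residual_life(data, t, p=0.5):
    # p-th percentile of the remaining life X - t, given survival past t
    x = np.asarray(data, float)
    tail = x[x > t] - t
    return float(np.quantile(tail, p)) if tail.size else float('nan')
```

At t = 0 the vitality function reduces to the ordinary mean lifetime; as t grows, it tracks how much total life survivors can expect.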