979 results for reliability testing


Relevance: 20.00%

Publisher:

Abstract:

We empirically applied the GrooFiWorld agent-based model (Puga-González et al. 2009) to a group of captive mangabeys (Cercocebus torquatus). We analysed several measurements related to aggression and affiliative patterns. The group adopted a combination of despotic and egalitarian behaviours, reflecting the behavioural flexibility observed in the Cercopithecinae subfamily. Our study also demonstrates that the GrooFiWorld agent-based model can be extended to other members of the Cercopithecinae subfamily, generating parsimonious hypotheses about their social organization.

Relevance: 20.00%

Publisher:

Abstract:

Objective: The present study evaluated the reliability of digital panoramic radiography in the diagnosis of carotid artery calcifications. Materials and Methods: Thirty-five patients at high risk of developing carotid artery calcifications who had undergone digital panoramic radiography were referred for ultrasonography, so that 70 arteries were assessed by both methods. The main parameters used to evaluate the reliability of panoramic radiography in the diagnosis of carotid artery calcifications were the accuracy, sensitivity, specificity and positive predictive value of this method compared with ultrasonography. Additionally, McNemar's test was used to verify whether there was a statistically significant difference between digital panoramic radiography and ultrasonography. Results: Ultrasonography demonstrated carotid artery calcifications in 17 (48.57%) patients; these individuals presented a total of 29 (41.43%) carotid arteries affected by calcification. Radiography was accurate in 71.43% (n = 50) of the cases evaluated. The sensitivity of this method was 37.93%, its specificity 95.12% and its positive predictive value 84.61%. A statistically significant difference (p < 0.001) was observed between the methods in their capacity to diagnose carotid artery calcifications. Conclusion: Digital panoramic radiography should not be indicated as a method of choice in the investigation of carotid artery calcifications.
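
To see how the reported figures fit together, the 2x2 table implied by the abstract can be reconstructed with ultrasonography as the reference standard. The counts below (TP = 11, FN = 18, FP = 2, TN = 39) are inferred from the percentages rather than stated in the abstract; the snippet simply recomputes the published metrics from them:

# Counts inferred from the reported percentages (not stated explicitly in the abstract).
TP, FN, FP, TN = 11, 18, 2, 39          # 29 calcified and 41 non-calcified arteries

sensitivity = TP / (TP + FN)                    # 11/29, about 37.93%
specificity = TN / (TN + FP)                    # 39/41, about 95.12%
ppv = TP / (TP + FP)                            # 11/13, about 84.61%
accuracy = (TP + TN) / (TP + FN + FP + TN)      # 50/70, about 71.43%

print(f"sensitivity={sensitivity:.2%}, specificity={specificity:.2%}, "
      f"PPV={ppv:.2%}, accuracy={accuracy:.2%}")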

Relevance: 20.00%

Publisher:

Abstract:

Irritant dermatitis is described as a reversible, non-immunological reaction characterized by lesions of highly variable appearance, ranging from simple redness to the formation of blisters or even necrosis, accompanied by pruritus or a burning sensation, following the application of a chemical substance. Since the 1940s, the traditional test for predicting skin irritation has been the Draize test, which consists of applying a chemical substance to shaved rabbit skin for 4 h and checking at 24 h whether clinical signs of irritation are present. This method, questionable from both an ethical and a qualitative point of view, currently remains the most widely used test. Since the early 2000s, new in vitro methods have been developed, such as the reconstructed human epidermis (RHE) model, a multilayer of well-differentiated keratinocytes obtained from a culture derived from oocyte donation. However, besides being very costly, this method achieves at best only 76% agreement with the human in vivo test. There is therefore a need to develop a new in vitro method that would better reproduce the anatomical and physiological reality found in vivo. Our objective was to develop this new in vitro method. To do so, we worked with human skin harvested directly after abdominoplasty. After preparation with a dermatome, a knife with an adjustable blade used to cut skin to the desired thickness, the sample is mounted in a diffusion cell system. The stratum corneum is then optimally exposed to 1 ml of the test chemical for 4 h. The skin sample is then fixed in formaldehyde to allow the preparation of standard haematoxylin and eosin slides. Irritation is assessed according to the histopathological criteria of spongiosis, necrosis and cellular vacuolization. The results of this first battery of tests are more than promising: compared with the in vivo results, we obtained 100% concordance for the same 4 substances tested, irritant or non-irritant, which is higher than the reconstructed human epidermis model (76%). Moreover, the coefficient of variation between the 3 series is below 0.1, which indicates good reproducibility within a single laboratory. In the future, this method will have to be tested with a larger number of chemical substances and its reproducibility evaluated in different laboratories. Nevertheless, this very encouraging first evaluation opens up valuable avenues for the future of irritation testing.
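
The reproducibility claim rests on the coefficient of variation (standard deviation divided by the mean) across the three replicate series. The sketch below shows that calculation with made-up scores; the actual irritation scores of the study are not given in the abstract:

import statistics

# Hypothetical irritation scores of one test substance across the 3 replicate series.
series_scores = [2.1, 2.3, 2.2]

cv = statistics.stdev(series_scores) / statistics.mean(series_scores)
print(f"coefficient of variation = {cv:.3f}")  # below 0.1 indicates good within-laboratory reproducibility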

Relevance: 20.00%

Publisher:

Abstract:

Bloodstream infections and sepsis are a major cause of morbidity and mortality. The successful outcome of patients suffering from bacteremia depends on rapid identification of the infectious agent to guide optimal antibiotic treatment. The analysis of Gram stains from a positive blood culture can be conducted rapidly and already significantly influences the antibiotic regimen; however, accurate identification of the infectious agent is still required to establish the optimal targeted treatment. We present here a simple and fast bacterial pellet preparation from a positive blood culture that can be used as a sample for several essential downstream applications, such as identification by MALDI-TOF MS, antibiotic susceptibility testing (AST) by disc diffusion assay or automated AST systems, and automated PCR-based diagnostic testing. The performance of these identification and AST systems applied directly to the blood culture bacterial pellets is very similar to the performance normally obtained from isolated colonies grown on agar plates. Compared to conventional approaches, the rapid acquisition of a bacterial pellet significantly reduces the time to report both identification and AST. Thus, following blood culture positivity, identification by MALDI-TOF can be reported in less than 1 hr, whereas AST results from automated systems or disc diffusion assays are available within 8 to 18 hr, respectively. Similarly, the results of a rapid PCR-based assay can be communicated to the clinicians less than 2 hr after a bacteremia is reported. Together, these results demonstrate that the rapid preparation of a blood culture bacterial pellet has a significant impact on identification and AST turnaround times and thus on the successful outcome of patients suffering from bloodstream infections.

Relevance: 20.00%

Publisher:

Abstract:

Many educators and educational institutions have yet to integrate web-based practices into their classrooms and curricula. As a result, it can be difficult to prototype and evaluate approaches to transforming classrooms from static endpoints to dynamic, content-creating nodes in the online information ecosystem. But many scholastic journalism programs have already embraced the capabilities of the Internet for virtual collaboration, dissemination, and reader participation. Because of this, scholastic journalism can act as a test-bed for integrating web-based sharing and collaboration practices into classrooms. Student Journalism 2.0 was a research project to integrate open copyright licenses into two scholastic journalism programs, to document outcomes, and to identify recommendations and remaining challenges for similar integrations. Video and audio recordings of two participating high school journalism programs informed the research. In describing the steps of our integration process, we note some important legal, technical, and social challenges. Legal worries such as uncertainty over copyright ownership could lead districts and administrators to disallow open licensing of student work. Publication platforms among journalism classrooms are far from standardized, making any integration of new technologies and practices difficult to achieve at scale. And teachers and students face challenges re-conceptualizing the role their class work can play online.

Relevance: 20.00%

Publisher:

Abstract:

The work aims to analyze the possibilities of utilizing old crane-drive AC induction motors in modern pulse-width-modulated variable-frequency drives. Bearing currents and voltage stresses are the two main problems associated with modern IGBT inverters, and they may cause premature failure of an old induction motor. The origins of these two problems are studied, and an analysis of the mechanism of bearing failure is proposed. Certain types of bearing currents are considered in detail, and the most effective and economical means are chosen for bearing-current mitigation. Transient phenomena in cables and the mechanism of overvoltages occurring at motor terminals are studied in the work. The weakest points of the stator winding insulation system are identified, and recommendations are given for mitigating voltage stresses. Only the most appropriate and cost-effective preventive methods are chosen for old motor drives. Rewinding of old motors is also considered.
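
One mechanism behind the terminal overvoltages mentioned above is wave reflection on the motor cable: if the cable is longer than a critical length tied to the inverter's voltage rise time, the reflected wave can nearly double the voltage at the motor terminals. The sketch below applies this common rule of thumb with assumed values; neither the rise time nor the cable data come from the thesis:

# Reflected-wave rule of thumb (assumed values, not from the thesis).
v_cable = 1.5e8      # wave propagation velocity in the motor cable, m/s (~0.5 * c)
t_rise = 0.1e-6      # inverter output voltage rise time, s (100 ns assumed)

# Cables longer than this allow the reflection to build up to roughly
# double the DC-link voltage at the motor terminals.
l_critical = v_cable * t_rise / 2
print(f"critical cable length = {l_critical:.1f} m")   # about 7.5 m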

Relevance: 20.00%

Publisher:

Abstract:

The central goal of food safety policy in the European Union (EU) is to protect consumer health by guaranteeing a high level of food safety throughout the food chain. This goal can in part be achieved by testing foodstuffs for the presence of various chemical and biological hazards. The aim of this study was to facilitate food safety testing by providing rapid and user-friendly methods for the detection of particular food-related hazards. Heterogeneous competitive time-resolved fluoroimmunoassays were developed for the detection of selected veterinary residues, namely coccidiostat residues, in eggs and chicken liver. After a simplified sample preparation procedure, the immunoassays were performed either in a manual format with dissociation-enhanced measurement or in an automated format with pre-dried assay reagents and surface measurement. Although the assays were primarily designed for screening purposes and provide only qualitative results, they could also be used in a quantitative mode. All the developed assays had good performance characteristics, enabling reliable screening of samples at the concentration levels required by the authorities. A novel polymerase chain reaction (PCR)-based assay system was developed for the detection of Salmonella spp. in food. The sample preparation included a short non-selective pre-enrichment step, after which the target cells were collected with immunomagnetic beads and applied to PCR reaction vessels containing all the reagents required for the assay in dry form. The homogeneous PCR assay was performed with a novel instrument platform, GenomEra, and the qualitative assay results were automatically interpreted based on end-point time-resolved fluorescence measurements and cut-off values. The assay was validated using various food matrices spiked with sub-lethally injured Salmonella cells at levels of 1-10 colony-forming units (CFU)/25 g of food. The main advantage of the system was the exceptionally short time to result: the entire process, starting from the pre-enrichment and ending with the PCR result, could be completed in eight hours. In conclusion, molecular methods using state-of-the-art assay techniques were developed for food safety testing. The combination of time-resolved fluorescence detection and ready-to-use reagents enabled sensitive assays that are easily amenable to automation. Consequently, together with the simplified sample preparation, these methods could prove to be applicable in routine testing.
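
The automated qualitative interpretation mentioned above reduces to comparing an end-point time-resolved fluorescence signal with a cut-off value. The sketch below illustrates the idea with a hypothetical signal level and threshold; the actual GenomEra cut-off values are not given in the abstract:

def interpret_endpoint(signal_counts, cutoff):
    """Qualitative call from an end-point time-resolved fluorescence measurement."""
    return "Salmonella DNA detected" if signal_counts >= cutoff else "not detected"

# Hypothetical signal and cut-off value.
print(interpret_endpoint(signal_counts=15800, cutoff=5000))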

Relevance: 20.00%

Publisher:

Abstract:

Control of regional government budgets is important in a monetary union, as lower tiers of government have fewer incentives to consolidate debt. According to the Fiscal Theory of the Price Level, unsustainable non-Ricardian fiscal policies eventually force monetary policy to adjust. Uncoordinated and unregulated regional fiscal policies would therefore threaten price stability for the monetary union as a whole. However, the union's central bank is not without defense. A federal government that internalises the spillover effect of non-Ricardian fiscal policies on the price level can offset non-Ricardian regional fiscal policies. A federal government which taxes and transfers resources between regions may compensate for unsustainable regional fiscal policies so as to keep fiscal policy Ricardian on aggregate. Following Canzoneri et al. (2001), we test the validity of the Fiscal Theory of the Price Level for both federal and regional governments in Germany. We find evidence of a spillover effect of unsustainable policies on the price level for other Länder. However, the German federal government offsets this effect on the price level by running Ricardian policies. These results have implications for the regulation of fiscal policies in the EMU.
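
Operationally, a Canzoneri et al. (2001)-style test amounts to estimating a small VAR in the primary surplus/GDP and liabilities/GDP ratios and checking whether a positive surplus innovation is followed by lower liabilities, the pattern expected under Ricardian policy. The sketch below illustrates this with placeholder series; the data, lag length and variable definitions are assumptions rather than the specification used in the paper:

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Placeholder series; in the actual test these would be the primary surplus/GDP and
# liabilities/GDP ratios of the federal government or of a single Land.
rng = np.random.default_rng(0)
data = pd.DataFrame({
    "surplus_gdp": rng.normal(0.0, 1.0, 80),
    "liabilities_gdp": rng.normal(0.0, 1.0, 80),
})

res = VAR(data).fit(maxlags=2)   # lag length chosen arbitrarily here
irf = res.irf(10)                # impulse responses over 10 periods

# Ricardian pattern: a positive surplus shock should be followed by falling liabilities.
resp = irf.irfs[:, data.columns.get_loc("liabilities_gdp"), data.columns.get_loc("surplus_gdp")]
print("response of liabilities to a surplus shock:", np.round(resp, 3))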

Relevance: 20.00%

Publisher:

Abstract:

Distribution companies are facing numerous challenges in the near future. Regulation defines a correlation between power quality and the revenue cap, so companies have to take measures to increase reliability in order to compete successfully under modern conditions. Most of the failures seen by customers originate in medium voltage networks. Implementation of network automation is a very effective measure to reduce the duration and number of outages and, consequently, outage costs. The topic of this diploma work is the study of the effect of automation investments on outage costs and other reliability indices. A calculation model has been made to perform the needed reliability calculations. A theoretical study of different automation scenarios has been done. A case feeder from an actual distribution company has been studied, and various renovation plans have been suggested. Network automation proved to be an effective measure for increasing medium voltage network reliability.
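
A calculation model of this kind typically turns a list of outages into customer interruption indices and an outage cost figure. The sketch below shows one common way to compute SAIFI, SAIDI and a two-part interruption cost; the event data and cost parameters are placeholders, not values from the thesis:

# Placeholder outage list: (customers interrupted, duration in hours, interrupted power in kW)
outages = [(120, 1.5, 300.0), (40, 0.5, 90.0), (250, 2.0, 600.0)]
total_customers = 1000

saifi = sum(n for n, _, _ in outages) / total_customers        # interruptions per customer
saidi = sum(n * t for n, t, _ in outages) / total_customers    # interruption hours per customer

# Two-part outage cost: a fixed cost per interrupted kW plus a cost per kWh of energy not supplied.
COST_PER_KW = 1.1      # EUR per kW per interruption (placeholder)
COST_PER_KWH = 11.0    # EUR per kWh of energy not supplied (placeholder)
outage_cost = sum(p * COST_PER_KW + p * t * COST_PER_KWH for _, t, p in outages)

print(f"SAIFI = {saifi:.2f}, SAIDI = {saidi:.2f} h, outage cost = {outage_cost:.0f} EUR")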

Relevance: 20.00%

Publisher:

Abstract:

Fatal and permanently disabling accidents form only one per cent of all occupational accidents, but in many branches of industry they account for more than half of the accident costs. Furthermore, the human suffering of the victim and his family is greater in severe accidents than in slight ones. For both human and economic reasons, severe accident risks should be identified before injuries occur. It is for this purpose that different safety analysis methods have been developed. This study presents two new possible approaches to the problem. The first is the hypothesis that it is possible to estimate the potential severity of accidents independently of their actual severity. The second is the hypothesis that when workers are also asked to report near accidents, they are particularly prone to report potentially severe near accidents on the basis of their own subjective risk assessment. A field study was carried out in a steel factory. The results supported both hypotheses. The reliability and validity of post-incident estimates of an accident's potential severity were reasonable. About 10% of accidents were estimated to be potentially critical; they could have led to death or very severe permanent disability. Reported near accidents were significantly more severe: about 60% of them were estimated to be critical. Furthermore, the validity of the workers' subjective risk assessment, manifested in the near accident reports, proved to be reasonable. The new methods studied require further development and testing. They could be used both routinely in workplaces and in research for identifying and prioritizing accident risks.

Relevance: 20.00%

Publisher:

Abstract:

Software faults are expensive and cause serious damage, particularly if discovered late or not at all. Some software faults tend to be hidden. One goal of the thesis is to establish the status quo in the field of software fault elimination, since there are no recent surveys of the whole area. A basis for a structural framework is proposed for this unstructured field, paying attention to compatibility and to how studies can be found. Bug elimination means are surveyed, including bug know-how, defect prevention and prediction, analysis, testing, and fault tolerance. The most common research issues for each area are identified and discussed, along with issues that do not receive enough attention. Recommendations are presented for software developers, researchers, and teachers. Only the main lines of research are covered, and the main emphasis is on technical aspects. The survey was done by performing searches in the IEEE, ACM, Elsevier, and Inspec databases. In addition, a systematic search was done in a few well-known related journals over recent time intervals. Some other journals, some conference proceedings, and a few books, reports, and Internet articles have been investigated too. The following problems were found and solutions for them discussed. The belief that quality assurance consists of testing only is a common misunderstanding, and many checks are done and some methods applied only in the late testing phase. Many types of static review are almost forgotten, even though they reveal faults that are hard to detect by other means. Other forgotten areas are knowledge of bugs, awareness of continuously repeated bugs, and lightweight means of increasing reliability. Compatibility between studies is not always good, which also makes documents harder to understand. Some means, methods, and problems are considered method- or domain-specific when they are not. The field lacks cross-field research.

Relevance: 20.00%

Publisher:

Abstract:

This thesis concentrates on studying the operational disturbance behavior of machine tools integrated into FMS. Operational disturbances are short-term failures of machine tools which are especially disruptive to unattended or unmanned operation of FMS. The main objective was to examine the effect of operational disturbances on reliability and on the operation time distribution of machine tools. The theoretical part of the thesis covers the fundamentals of FMS relating to the subject of this study. The concept of FMS, its benefits and the operator's role in FMS operation are reviewed, and the importance of reliability is presented. The terms describing the operation time of machine tools are defined by adopting standards and references. The concept of failure and the indicators describing reliability and operational performance for machine tools in FMSs are presented. The empirical part of the thesis describes the research methodology, which is a combination of automated data collection (ADC) and manual data collection. Using this methodology it is possible to obtain a complete view of the operation time distribution for the studied machine tools. Data collection was carried out in four FMSs consisting of a total of 17 machine tools. Each FMS's basic features and the ADC signals are described. The indicators describing the reliability and operation time distribution of machine tools were calculated from the collected data. The results showed that operational disturbances have a significant influence on machine tool reliability and operational performance. On average, an operational disturbance occurs every 8.6 hours of operation time and has a down time of 0.53 hours. Operational disturbances cause a 9.4% loss in operation time, which is twice the loss caused by technical failures (4.3%). Operational disturbances therefore reduce the utilization rate: poor operational disturbance behavior decreases the utilization rate. It was found that the features of the part family to be machined and the related method technology define the operational disturbance behavior of a machine tool. The main causes of operational disturbances were related to material quality variations, tool maintenance, NC program errors, the ATC and machine tool control. The operator's role was emphasized: the failure recording activity of the operators correlates with the utilization rate, so the more precisely the operators record failures, the higher the utilization rate. FMS organizations which record failures more precisely also have fewer operational disturbances.
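
The reliability indicators quoted above (mean operation time between disturbances, mean down time per disturbance, share of operation time lost) can be computed directly from disturbance log data. The sketch below uses a made-up event list and common definitions of these indicators, not the exact data or definitions of the thesis:

# Hypothetical disturbance log: (operation hours preceding the disturbance, down time in hours)
events = [(5.0, 0.6), (6.5, 0.8), (4.5, 0.5)]

operation_time = sum(up for up, _ in events)
down_time = sum(dt for _, dt in events)

mtbd = operation_time / len(events)                 # mean operation time between disturbances
mdt = down_time / len(events)                       # mean down time per disturbance
loss = down_time / (operation_time + down_time)     # share of time lost to disturbances

print(f"MTBD = {mtbd:.1f} h, MDT = {mdt:.2f} h, time lost = {loss:.1%}")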

Relevance: 20.00%

Publisher:

Abstract:

This master's thesis deals with the development of production planning and control in an industrial company's small-batch production. The subject of the work is the Wind Power Generators profit unit of ABB Oy, which manufactures standard products on a customer-order-driven basis. The thesis first presents the theory of production and production planning and control: in addition to basics such as definitions, objectives and tasks, the production control process and the information technology used in production control are reviewed. The empirical part following the theory presents the means developed in this work for improving production planning and control. The study was carried out through theoretical and empirical research. The theoretical research involved a review of Finnish and foreign literature sources. The empirical research was carried out as independent problem-solving work, which included analysing the development targets, specifying the more detailed development needs, and development work through experiments. The main objective of the study was to determine how the development of production planning and control can improve the productivity and profitability of the profit unit in question. Based on the main objective, six sub-objectives were formed: improving delivery reliability, raising the capacity loading rate, developing capacity planning, shortening lead times, specifying the requirements for a new ERP system, and defining the production control process. For the first four sub-objectives, software applications were built that enable the planning and control of these sub-objectives. For the applications, each product was assigned, for example, its operation sequences with lead times, its load groups, the capacities of the load groups, the load each product imposes, and the critical tools. The work showed that information technology greatly assists production planning and control. Increased transparency, improved information flow, simulation possibilities and graphical presentation make it easier to prepare various plans and thus improve the quality of decision-making. Disciplined updating of basic production and transaction data is the foundation for exploiting information technology; for this reason, information systems should be built as simple as possible.
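
As a simple illustration of the capacity loading calculations such applications support, the loading rate of a load group can be computed from the routed hours of the orders scheduled on it. The figures below are placeholders, not data from the profit unit:

# Hypothetical weekly load calculation for one load group (work-center group).
capacity_hours = 120.0                   # available capacity of the load group, h/week
order_load_hours = [35.0, 28.5, 42.0]    # routed hours of the orders scheduled on the group

load = sum(order_load_hours)
loading_rate = load / capacity_hours
print(f"load = {load:.1f} h, loading rate = {loading_rate:.0%}")  # 105.5 h, 88%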