928 results for Residual-based tests


Relevance:

30.00%

Publisher:

Abstract:

This thesis covers activities carried out at the Laser Center of the Polytechnic University of Madrid and in the laboratories of the University of Bologna in Forlì. It focuses on Laser Shock Peening (LSP), a surface mechanical treatment for metallic materials. The process is a surface enhancement treatment that induces a significant layer of beneficial compressive residual stresses beneath the surface of metal components in order to mitigate the detrimental crack growth rate in them. The innovative aspect of this work is the application of LSP to specimens of extremely low thickness. After a bibliographic study and a comparison with the main treatments used for the same purposes, this work analyzes the physics of laser operation, the laser's interaction with the material surface, and the generation of the surface residual stresses that are fundamental to obtaining the benefits of LSP. In particular, this thesis concerns the application of the treatment to Al2024-T351 specimens of low thickness. Among the improvements that this operation can yield, the most important in the aeronautical field is the extended fatigue life of the treated components. As demonstrated in this work, a well-executed LSP treatment can slow the growth of defects in the material that could otherwise lead to sudden failure of the structure. Part of this thesis is the simulation of this phenomenon with the program AFGROW, with which different geometric configurations of the treatment were analyzed to verify which performed best for large panels of typical aeronautical interest. The core of the LSP process is the residual stress field induced in the material by its interaction with the laser light; these stresses can be simulated with finite elements, but it is essential to verify and measure them experimentally.
The thesis introduces the main methods for the detection of these stresses, which can be mechanical or diffraction-based. In particular, it describes the principles and the detailed procedure of the hole-drilling measurement and gives an introduction to X-ray diffraction; the results obtained with both techniques are then presented. In addition to these two measurement techniques, the neutron diffraction method is also introduced. The last part covers the experimental fatigue-life tests of the specimens, with a detailed description of the apparatus and the procedure used, from the initial specimen preparation to the fatigue test with the press. The results obtained are then presented and discussed.
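The fatigue-life benefit described above can be illustrated with a simplified Paris-law integration, in which a compressive residual stress reduces the effective stress range at the crack tip. This is a minimal sketch of the general idea, not AFGROW's actual retardation model, and all material parameters below are hypothetical, not values from the thesis:

```python
import math

def paris_crack_growth(a0, a_f, C, m, delta_sigma, sigma_res=0.0, da=1e-5):
    """Integrate Paris-law crack growth da/dN = C * (dK_eff)^m from
    crack length a0 to a_f and return the number of load cycles.

    A compressive residual stress (sigma_res < 0, in MPa) is assumed to
    reduce the effective stress range at the crack tip -- a simplified
    stand-in for the retardation behaviour modelled in tools like AFGROW.
    """
    a, cycles = a0, 0.0
    while a < a_f:
        # Effective stress range, clipped at zero (crack fully closed)
        sigma_eff = max(delta_sigma + sigma_res, 0.0)
        if sigma_eff == 0.0:
            return float('inf')  # crack arrested
        dK = sigma_eff * math.sqrt(math.pi * a)  # MPa*sqrt(m); geometry factor omitted
        cycles += da / (C * dK ** m)
        a += da
    return cycles

# Hypothetical Al2024-like parameters: grow a crack from 1 mm to 10 mm
N_untreated = paris_crack_growth(1e-3, 10e-3, C=1e-11, m=3.0, delta_sigma=100.0)
N_treated = paris_crack_growth(1e-3, 10e-3, C=1e-11, m=3.0,
                               delta_sigma=100.0, sigma_res=-40.0)
print(N_treated > N_untreated)  # compressive residual stress extends fatigue life
```

The same qualitative trend, slower growth under compressive residual stress, is what the AFGROW simulations in the thesis quantify for realistic panel geometries.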


This thesis aims to assess similarities and mismatches between the outputs of two independent methods for cloud cover quantification and classification that rest on quite different physical bases. One is the SAFNWC software package, designed to process radiance data acquired by the SEVIRI sensor in the VIS/IR range. The other is the MWCC algorithm, which uses the brightness temperatures acquired by the AMSU-B and MHS sensors in their channels centered on the MW water vapour absorption band. At a first stage their cloud detection capability was tested by comparing the cloud masks they produced. These showed good agreement between the two methods, although some critical situations stand out: the MWCC fails to reveal clouds that, according to SAFNWC, are fractional, cirrus, very low, or high opaque clouds. In the second stage of the inter-comparison, the pixels classified as cloudy by both packages were compared. The overall tendency of the MWCC method is an overestimation of the lower cloud classes; conversely, the higher the cloud top, the larger the cloud portion that the MWCC fails to reveal but that the SAFNWC tool detects. This also emerges from a series of tests carried out using the cloud top height information to evaluate the height ranges in which each MWCC category is defined. Therefore, although the two methods intend to provide the same kind of information, in reality they return quite different details on the same atmospheric column. The SAFNWC retrieval, being very sensitive to the top temperature of a cloud, captures the actual level reached by it; the MWCC, by exploiting the penetration capability of microwaves, gives information about levels located more deeply within the atmospheric column.
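The first-stage cloud mask comparison amounts to a pixel-wise contingency analysis between two binary masks. A minimal sketch of that bookkeeping, using toy masks rather than actual SEVIRI or AMSU-B/MHS retrievals:

```python
def mask_agreement(mask_a, mask_b):
    """Pixel-wise contingency between two binary cloud masks (1 = cloudy).

    Returns (both cloudy, cloudy only in A, cloudy only in B, agreement
    fraction) -- the kind of inter-comparison applied to the SAFNWC and
    MWCC cloud masks. Illustrative sketch only.
    """
    both = only_a = only_b = neither = 0
    for a, b in zip(mask_a, mask_b):
        if a and b:
            both += 1
        elif a:
            only_a += 1
        elif b:
            only_b += 1
        else:
            neither += 1
    total = both + only_a + only_b + neither
    agreement = (both + neither) / total
    return both, only_a, only_b, agreement

# Toy masks over 8 pixels
safnwc = [1, 1, 0, 0, 1, 0, 1, 1]
mwcc   = [1, 0, 0, 0, 1, 0, 1, 0]
print(mask_agreement(safnwc, mwcc))  # (3, 2, 0, 0.75)
```

The "only in A" count is where the abstract's critical situations show up: pixels cloudy for SAFNWC (fractional, cirrus, very low, high opaque) that MWCC misses.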


The research activities focused on evaluating the effect of Mo addition on the mechanical properties and microstructure of the A354 aluminium casting alloy. Samples with increasing amounts of Mo were produced and heat treated. After heat treatment and exposure to high temperatures, the samples underwent microstructural and chemical analyses, hardness tests and tensile tests. The collected data led to the optimization of both the casting parameters, to obtain a homogeneous Mo distribution in the alloy, and the heat treatment parameters, allowing the formation of Mo-based strengthening precipitates that are stable at high temperature. Microstructural and chemical analyses highlighted how Mo additions above 0.1 wt.% can modify the silicon eutectic morphology and hinder the formation of iron-based β intermetallics. High-temperature exposure curves, instead, showed that after long exposure the hardness is only slightly influenced by the heat treatment, while the effect of Mo additions above 0.3 wt.% is negligible. Tensile tests confirmed that the addition of 0.3 wt.% Mo induces an increase of about 10% in ultimate tensile strength after high-temperature exposure (250°C for 100 h), while heat treatments have a slight influence on the mechanical behaviour. These results could be exploited to develop innovative heat treatment sequences able to reduce residual stresses in castings produced with Mo-modified A354.


In dentistry the restoration of decayed teeth is challenging and makes great demands on both the dentist and the materials; hence, fiber-reinforced posts have been introduced. The effects of different variables on the ultimate load of teeth restored using fiber-reinforced posts are controversial, perhaps because the results are mostly based on non-standardized in vitro tests and are therefore inhomogeneous. This study combines the advantages of in vitro tests and finite element analysis (FEA) to clarify the effects of ferrule height, post length and the cementation technique used for the restoration. Sixty-four single-rooted premolars were decoronated (ferrule height 1 or 2 mm), endodontically treated and restored using fiber posts (length 2 or 7 mm), composite fillings and metal crowns (resin bonded or conventionally cemented). After thermocycling and chewing simulation, the samples were loaded until fracture, recording first damage events. Analyzing the recorded fracture loads with UNIANOVA, ferrule height and cementation technique were found to be significant: increased ferrule height and resin bonding of the crown resulted in higher fracture loads. Post length had no significant effect. All conventionally cemented crowns with a 1-mm ferrule height failed during artificial ageing, in contrast to resin-bonded crowns (75% survival rate). FEA confirmed these results and provided information about the stress and force distribution within the restoration. Based on the findings of the in vitro tests and the computations, we concluded that crowns, especially those with a small ferrule height, should be resin bonded. Finally, centrally positioned fiber-reinforced posts did not contribute to load transfer as long as the bond between the tooth and the composite core was intact.


It is well known that the early initiation of a specific anti-infective therapy is crucial to reducing mortality in severe infection. Procedures that culture pathogens are the diagnostic gold standard in such diseases. However, these methods yield results after 24 to 48 hours at the earliest. Severe infections such as sepsis therefore need to be treated with an empirical antimicrobial therapy, which is ineffective in an unknown fraction of these patients. Today's microbiological point-of-care tests are pathogen specific and therefore not appropriate for an infection with a variety of possible pathogens. Molecular nucleic acid diagnostics such as the polymerase chain reaction (PCR) allow the identification of pathogens and resistances. These methods are used routinely to speed up the analysis of positive blood cultures. The newest PCR-based system allows the identification of the 25 most frequent sepsis pathogens in parallel, without previous culture, in less than 6 hours. Thereby, these systems might shorten the time of possibly insufficient anti-infective therapy. However, such extensive tools are not suitable as point-of-care diagnostics. Miniaturization and automation of the nucleic-acid-based methods are pending, as is an increase in the number of pathogens and resistance genes detectable by these methods. It is assumed that molecular PCR techniques will have an increasing impact on microbiological diagnostics in the future.


We propose a new and clinically oriented approach to perform atlas-based segmentation of brain tumor images. A mesh-free method is used to model tumor-induced soft tissue deformations in a healthy brain atlas image with subsequent registration of the modified atlas to a pathologic patient image. The atlas is seeded with a tumor position prior and tumor growth simulating the tumor mass effect is performed with the aim of improving the registration accuracy in case of patients with space-occupying lesions. We perform tests on 2D axial slices of five different patient data sets and show that the approach gives good results for the segmentation of white matter, grey matter, cerebrospinal fluid and the tumor.


Gastroesophageal reflux disease (GERD) remains the most common GI-related condition in the out-patient setting. While primary care physicians often use empirical trials with proton pump inhibitors (PPI trial) to diagnose GERD, specialised tests are often required to confirm or exclude gastroesophageal reflux as the cause of esophageal or extraesophageal symptoms. The most commonly used procedures to diagnose GERD include conventional (catheter-based) pH monitoring, wireless esophageal pH monitoring (Bravo), bilirubin monitoring (Bilitec), and combined multichannel intraluminal impedance-pH monitoring (MII-pH). Each technique has strengths and limitations of which clinicians and investigators should be aware when deciding which one to choose.


The use of antibiotics is highest in primary care and is directly associated with antibiotic resistance in the community. We assessed regional variations in antibiotic use in primary care in Switzerland and explored prescription patterns in relation to the use of point-of-care tests. Defined daily doses of antibiotics per 1000 inhabitants per day (DDD(1000pd)) were calculated for the year 2007 from reimbursement data of the largest Swiss health insurer, based on the anatomic therapeutic chemical classification and the DDD methodology recommended by the WHO. We present ecological associations by means of descriptive and regression analysis. We analysed data from 1 067 934 adults, representing 17.1% of the Swiss population. The rate of outpatient antibiotic prescriptions in the entire population was 8.5 DDD(1000pd), and varied between 7.28 DDD(1000pd) in northwest Switzerland and 11.33 DDD(1000pd) in the Lake Geneva region. The DDD(1000pd) figures for the three most prescribed antibiotics were 2.90 for amoxicillin and amoxicillin-clavulanate, 1.77 for fluoroquinolones, and 1.34 for macrolides. Regions with higher DDD(1000pd) showed higher seasonal variability in antibiotic use and lower use of all point-of-care tests. In the regression analysis for each class of antibiotics, the use of any point-of-care test was consistently associated with fewer antibiotic prescriptions. Prescription rates of primary care physicians varied between Swiss regions and were lower in northwest Switzerland and among physicians using point-of-care tests. Ecological studies are prone to bias, and whether point-of-care tests reduce antibiotic use has to be investigated in pragmatic primary care trials.
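The DDD(1000pd) metric divides the total number of defined daily doses dispensed over the period by the population and the number of days, scaled to 1000 inhabitants. A sketch of that arithmetic; the total-DDD figure below is hypothetical, chosen only so that the result lands near the study's overall rate of 8.5:

```python
def ddd_per_1000_per_day(total_ddd, population, days=365):
    """WHO-style consumption metric: defined daily doses per 1000
    inhabitants per day, the DDD(1000pd) figure used in the study.
    """
    return total_ddd / population / days * 1000

# Hypothetical: ~3.32 million DDDs dispensed to the 1 067 934 insured adults in 2007
rate = ddd_per_1000_per_day(3_316_000, 1_067_934)
print(round(rate, 1))  # ~8.5
```

Because the metric is normalized per inhabitant and per day, regional rates such as 7.28 and 11.33 DDD(1000pd) are directly comparable despite different population sizes.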


INTRODUCTION: Rivaroxaban (RXA) is licensed for prophylaxis of venous thromboembolism after major orthopaedic surgery of the lower limbs. Currently, no test to quantify RXA in plasma has been validated in an inter-laboratory setting. Our study had three aims: to assess (i) the feasibility of RXA quantification with a commercial anti-FXa assay, (ii) its accuracy and precision in an inter-laboratory setting, and (iii) the influence of 10 mg of RXA on routine coagulation tests. METHODS: The same chromogenic anti-FXa assay (Hyphen BioMed) was used in all participating laboratories. RXA calibrators and sets of blinded probes (aim ii) were prepared in vitro by spiking normal plasma. The precise RXA content was assessed by high-pressure liquid chromatography-tandem mass spectrometry. For the ex vivo studies (aim iii), plasma samples from 20 healthy volunteers taken before and 2-3 hours after ingestion of 10 mg of RXA were analyzed by the participating laboratories. RESULTS: RXA can be assayed chromogenically. Among the participating laboratories, the mean accuracy and the mean coefficient of variation for precision of RXA quantification were 7.0% and 8.8%, respectively. The mean RXA concentration was 114±43 µg/L. RXA significantly altered the prothrombin time, the activated partial thromboplastin time, and factor analyses for intrinsic and extrinsic factors. Determinations of thrombin time, fibrinogen, FXIII and D-dimer levels were not affected. CONCLUSIONS: RXA plasma levels can be quantified accurately and precisely by a chromogenic anti-FXa assay on different coagulometers in different laboratories. Ingestion of 10 mg of RXA results in significant alterations of both PT- and aPTT-based coagulation assays.
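Accuracy and precision here are the standard figures of merit for a quantitative assay: relative deviation of the mean from the spiked target, and the coefficient of variation. A sketch of those two calculations; the measurement values below are illustrative, not the study's raw data:

```python
import statistics

def accuracy_and_cv(measured, true_value):
    """Accuracy (% deviation of the mean from the spiked target) and
    coefficient of variation (%) for repeated measurements of one
    calibrator -- the two figures reported for the inter-laboratory
    rivaroxaban quantification. Illustrative only.
    """
    mean = statistics.mean(measured)
    accuracy = abs(mean - true_value) / true_value * 100
    cv = statistics.stdev(measured) / mean * 100
    return accuracy, cv

measured_ug_l = [108, 95, 102, 112, 99]  # hypothetical lab results, µg/L
acc, cv = accuracy_and_cv(measured_ug_l, true_value=100)
print(round(acc, 1), round(cv, 1))
```

In the study, the "true" concentration of each blinded probe was fixed independently by HPLC-tandem mass spectrometry, so the chromogenic results could be scored against it.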


We present a new approach for corpus-based speech enhancement that significantly improves over a method published by Xiao and Nickel in 2010. Corpus-based enhancement systems do not merely filter an incoming noisy signal, but resynthesize its speech content via an inventory of pre-recorded clean signals. The goal of the procedure is to perceptually improve the sound of speech signals in background noise. The proposed new method modifies Xiao's method in four significant ways. Firstly, it employs a Gaussian mixture model (GMM) instead of a vector quantizer in the phoneme recognition front-end. Secondly, the state decoding of the recognition stage is supported with an uncertainty modeling technique. With the GMM and the uncertainty modeling it is possible to eliminate the need for noise dependent system training. Thirdly, the post-processing of the original method via sinusoidal modeling is replaced with a powerful cepstral smoothing operation. And lastly, due to the improvements of these modifications, it is possible to extend the operational bandwidth of the procedure from 4 kHz to 8 kHz. The performance of the proposed method was evaluated across different noise types and different signal-to-noise ratios. The new method was able to significantly outperform traditional methods, including the one by Xiao and Nickel, in terms of PESQ scores and other objective quality measures. Results of subjective CMOS tests over a smaller set of test samples support our claims.
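The GMM front-end scores each incoming feature against one mixture model per phoneme and picks the best match. A minimal one-dimensional sketch of that scoring step, with toy models rather than the paper's actual acoustic features or training data:

```python
import math

def log_gauss(x, mean, var):
    """Log density of a 1-D Gaussian."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def gmm_loglik(x, weights, means, vars_):
    """Log-likelihood of x under a 1-D Gaussian mixture (log-sum-exp
    for numerical stability)."""
    terms = [math.log(w) + log_gauss(x, m, v)
             for w, m, v in zip(weights, means, vars_)]
    mx = max(terms)
    return mx + math.log(sum(math.exp(t - mx) for t in terms))

def classify(x, phoneme_gmms):
    """Pick the phoneme whose GMM assigns the feature the highest
    likelihood -- the role the GMM plays in the recognition front-end."""
    return max(phoneme_gmms, key=lambda p: gmm_loglik(x, *phoneme_gmms[p]))

# Toy 1-D 'feature' models for two phonemes: (weights, means, variances)
models = {
    "aa": ([0.5, 0.5], [0.0, 1.0], [0.3, 0.3]),
    "s":  ([0.7, 0.3], [4.0, 5.0], [0.3, 0.3]),
}
print(classify(0.4, models))  # "aa"
print(classify(4.5, models))  # "s"
```

In the real system the features are multi-dimensional spectral vectors and the per-frame likelihoods feed a state decoder with uncertainty modeling, but the per-model scoring follows this pattern.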


Background: Synchronization programs have become standard in the dairy industry in many countries. In Switzerland, these programs are not routinely used for groups of cows, but predominantly as a therapy for individual problem cows. The objective of this study was to compare the effect of a CIDR-Select Synch and a 12-d CIDR protocol on the pregnancy rate in healthy, multiparous dairy cows on Swiss dairy farms. Methods: Cows (N = 508) were randomly assigned to the CIDR-Select Synch (N = 262) or 12-d CIDR (N = 246) protocol. Cows in the CIDR-Select Synch group received a CIDR and 2.5 ml of buserelin i.m. on d 0. On d 7, the CIDR insert was removed and 5 ml of dinoprost was administered i.m. Cows in the 12-d CIDR group received the CIDR on d 0 and it was removed on d 12 (the routine CIDR protocol in Swiss dairies). On d 0 a milk sample was taken for progesterone analysis. Cows were inseminated upon observed estrus. Pregnancy was determined 35 or more days after artificial insemination. As a first step, the two groups were compared as to indication for treatment, breed, stud book, stall, pasture, and the farmer's business using chi-square tests or Fisher's exact test. Furthermore, the groups were compared as to age, DIM, number of AIs, number of cows per farm, and yearly milk yield per cow using nonparametric ANOVA. A multiple logistic model was used to relate the success of the protocols to all of the available factors, in particular treatment (CIDR-Select Synch vs. 12-d CIDR), milk progesterone value, age, DIM, previous treatment of the uterus, previous gynecological treatment, and number of preceding inseminations. Results: The pregnancy rate was higher in cows following the CIDR-Select Synch protocol than the 12-d CIDR protocol (50.4% vs. 22.4%; P < 0.0001). Conclusion: The CIDR-Select Synch protocol can be highly recommended for multiparous dairy cows.
The reduced time span of the progesterone insert decreased the number of days open and improved the pregnancy rate compared to the 12-d CIDR protocol, and the cows did not have to be handled more often.
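The group comparison behind a result like 50.4% vs. 22.4% can be sketched as a Pearson chi-square test on a 2x2 table of pregnant vs. not-pregnant counts per protocol. The counts below are reconstructed from the reported rates and group sizes, not taken from the paper's tables:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table
        [[a, b],
         [c, d]]
    e.g. rows = protocol group, columns = pregnant / not pregnant.
    """
    n = a + b + c + d
    expected = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
                (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Approximate counts from the reported rates: 132/262 = 50.4%, 55/246 = 22.4%
chi2 = chi_square_2x2(132, 130, 55, 191)
print(chi2 > 10.83)  # exceeds the chi-square(1 df) critical value at P = 0.001
```

A statistic this far past the 0.001 critical value is consistent with the reported P < 0.0001; the paper's multiple logistic model additionally adjusts this comparison for the other recorded factors.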


The development of novel implants in orthopaedic trauma surgery is based on limited datasets from cadaver trials or artificial bone models. A method has been developed whereby implants can be constructed in an evidence-based manner, founded on a large anatomic database consisting of more than 2,000 datasets of bones extracted from CT scans. The aim of this study was the development and clinical application of an anatomically pre-contoured plate for the treatment of distal fibular fractures based on this anatomical database. 48 Caucasian and Asian bone models (left and right) from the database were used for the preliminary optimization process and validation of the fibula plate. The implant was constructed to fit bilaterally in a lateral position on the fibula. A biomechanical comparison of the designed implant with the current gold standard in the treatment of distal fibular fractures (locking 1/3 tubular plate) was then conducted. Finally, a clinical surveillance study was performed to evaluate the grade of implant fit achieved. The results showed that with a virtual anatomic database it was possible to design a fibula plate with an optimized fit for a large proportion of the population. Biomechanical testing showed the novel fibula plate to be superior to 1/3 tubular plates in 4-point bending tests. The clinical application showed a very high degree of primary implant fit; only in a small minority of cases was further intra-operative implant bending necessary. Therefore, the goal of developing an implant for the treatment of distal fibular fractures based on the evidence of a large anatomical database was attained. Biomechanical testing showed good results regarding stability, and the clinical application confirmed the high grade of anatomical fit.


Traditionally, the routine artificial digestion test is applied to assess the presence of Trichinella larvae in pigs. However, this diagnostic method has a low sensitivity compared to serological tests. The results of artificial digestion tests in Switzerland were evaluated over a period of 15 years to determine by when freedom from infection could be confirmed on the basis of these data. Freedom was defined as a 95% probability that the prevalence of infection was below 0.0001%. Freedom was demonstrated after 12 years at the latest. A new risk-based surveillance approach based on serology was then developed. Risk-based surveillance was also assessed over 15 years, starting in 2010. It was shown that by using this design, the sample size could be reduced by at least a factor of 4 compared with the traditional testing regimen, without lowering the level of confidence in the Trichinella-free status of the pig population.
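The sample-size logic behind demonstrating freedom from infection can be sketched with the standard survey formula: the smallest n such that, if the true prevalence equalled the design prevalence, at least one sampled animal would test positive with the desired confidence. This is a generic sketch of that calculation, not the study's full risk-based design:

```python
import math

def freedom_sample_size(design_prevalence, sensitivity=1.0, confidence=0.95):
    """Smallest sample size n with P(at least one test-positive) >= confidence
    when the true prevalence equals the design prevalence.

    Solves 1 - (1 - p*Se)^n >= confidence for n (infinite-population
    approximation). `sensitivity` is the diagnostic test sensitivity.
    """
    p_detect = design_prevalence * sensitivity  # chance one sampled animal tests positive
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_detect))

# Design prevalence 0.0001% = 1e-6, confidence 95%, as in the Swiss evaluation
print(freedom_sample_size(1e-6))

# A test with 4x lower sensitivity needs roughly 4x the sample size --
# the direction of the saving reported when moving to serology
print(freedom_sample_size(1e-6, sensitivity=0.25) / freedom_sample_size(1e-6))
```

The roughly linear trade-off between test sensitivity and required sample size is what makes a more sensitive serological test attractive for a risk-based surveillance design.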


OBJECTIVES: The disease alveolar echinococcosis (AE), caused by the larval stage of the cestode Echinococcus multilocularis, is fatal if treatment is unsuccessful. Current treatment options are, at best, parasitostatic, and involve taking benzimidazoles (albendazole, mebendazole) for the whole of a patient's life. In conjunction with the recent development of optimized procedures for E. multilocularis metacestode cultivation, we aimed to develop a rapid and reliable drug screening test, which enables efficient screening of a large number of compounds in a relatively short time frame. METHODS: Metacestodes were treated in vitro with albendazole, the nitro-thiazole nitazoxanide and 29 nitazoxanide derivatives. The resulting leakage of phosphoglucose isomerase (PGI) activity into the medium supernatant was measured and provided an indication of compound efficacy. RESULTS: We show that upon in vitro culture of E. multilocularis metacestodes in the presence of active drugs such as albendazole, the nitro-thiazole nitazoxanide and 30 different nitazoxanide derivatives, the activity of PGI in culture supernatants increased. The increase in PGI activity correlated with the progressive degeneration and destruction of metacestode tissue in a time- and concentration-dependent manner, which allowed us to perform a structure-activity relationship analysis on the thiazolide compounds used in this study. CONCLUSIONS: The assay presented here is inexpensive, rapid, can be used in 24- and 96-well formats and will serve as an ideal tool for first-round in vitro tests on the efficacy of large numbers of antiparasitic compounds.


In the past few decades, integrated circuits have become a major part of everyday life. Every circuit that is created needs to be tested for faults so that faulty circuits are not sent to end-users. The creation of these tests is time consuming, costly and difficult to perform on larger circuits. This research presents a novel method for fault detection and test pattern reduction in integrated circuitry under test. By leveraging the FPGA's reconfigurability and parallel processing capabilities, a speed-up in fault detection can be achieved over previous computer simulation techniques. This work presents the following contributions to the field of stuck-at fault detection. First, a new method for inserting faults into a circuit netlist: given any circuit netlist, our tool can insert multiplexers at the correct internal nodes to aid in fault emulation on reconfigurable hardware. Second, a parallel method of fault emulation: the benefit of the FPGA is not only its ability to implement any circuit, but its ability to process data in parallel. This research utilizes this to create a more efficient emulation method that implements numerous copies of the same circuit in the FPGA. Third, a new method to organize the most efficient faults: most methods for determining the minimum number of inputs to cover the most faults require sophisticated software programs that use heuristics. By utilizing hardware, this research is able to process data faster and use a simpler method for efficiently minimizing inputs.
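The underlying stuck-at fault model can be illustrated in software: force one node of a netlist to a constant value and check whether a test pattern makes the faulty output differ from the fault-free one. This is a minimal simulation sketch of that principle, standing in for the multiplexer-based injection the work performs in FPGA hardware; the netlist and pattern names are hypothetical:

```python
def eval_netlist(netlist, inputs, fault=None):
    """Evaluate a topologically ordered combinational netlist of
    (output, op, input_list) gates. `fault` = (node, stuck_value)
    forces that node to a constant, emulating a stuck-at fault.
    """
    vals = dict(inputs)
    if fault and fault[0] in vals:          # fault on a primary input
        vals[fault[0]] = fault[1]
    for out, op, ins in netlist:
        a = [vals[i] for i in ins]
        if op == "AND":
            v = int(all(a))
        elif op == "OR":
            v = int(any(a))
        elif op == "NOT":
            v = int(not a[0])
        vals[out] = v
        if fault and out == fault[0]:       # fault on this gate's output
            vals[out] = fault[1]
    return vals

# Toy netlist: c = a AND b, y = c OR d
net = [("c", "AND", ["a", "b"]), ("y", "OR", ["c", "d"])]

def detects(pattern, fault, output="y"):
    """A pattern detects a fault iff faulty and fault-free outputs differ."""
    good = eval_netlist(net, pattern)[output]
    bad = eval_netlist(net, pattern, fault)[output]
    return good != bad

print(detects({"a": 1, "b": 1, "d": 0}, ("c", 0)))  # True: pattern exposes c stuck-at-0
print(detects({"a": 0, "b": 0, "d": 1}, ("c", 0)))  # False: d=1 masks the fault
```

Test-pattern reduction is then the search for a small set of patterns for which `detects` is true for every modeled fault; the FPGA approach accelerates exactly this inner evaluate-and-compare loop by running many circuit copies in parallel.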