987 results for "Correction method"
Abstract:
Purpose: Although several approaches have already been used to reduce radiation dose, CT doses remain among the highest in diagnostic radiology. Recently, General Electric introduced a new image reconstruction technique, adaptive statistical iterative reconstruction (ASIR), which takes into account the statistical fluctuation of noise. The benefits of the ASIR method were assessed through classic metrics and the evaluation of cardiac structures by radiologists. Methods and materials: A 64-row multidetector CT (MDCT) was employed. Catphan600 phantom acquisitions and 10 routine-dose CT examinations performed at 80 kVp were reconstructed with filtered back projection (FBP) and with 50% ASIR. Six radiologists then assessed the visibility of the main cardiac structures using the visual grading analysis (VGA) method. Results: On phantoms, for a constant noise value (SD = 25 HU), CTDIvol is halved (8 mGy to 4 mGy) when 50% ASIR is used. At constant CTDIvol, the MTF at medium frequencies was also significantly improved. First results indicated that clinical images reconstructed with ASIR had better overall image quality than those obtained with conventional reconstruction; at constant image quality, the radiation dose can therefore be strongly reduced. Conclusion: The first results of this study show that the ASIR method improves image quality on phantoms compared with the classical reconstruction, decreasing noise and improving resolution. Moreover, the benefit is greater at lower doses. In the clinical environment, a further dose reduction can be expected for 80 kVp low-dose pediatric protocols using 50% iterative reconstruction. The best ASIR percentage as a function of cardiac structure, along with detailed protocols, will be presented for cardiac examinations.
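As a rough sanity check on the reported dose halving: quantum noise in CT scales approximately as the inverse square root of dose, so keeping SD constant while halving the dose implies the reconstruction removes about 29% of the noise (1 − 1/√2). A minimal sketch, assuming this 1/√dose scaling (both the scaling law and the noise-reduction factor are illustrative assumptions, not values from the study):

```python
import math

def required_dose(target_sd, k, noise_reduction=1.0):
    """CTDIvol needed to reach target_sd, assuming sigma = noise_reduction * k / sqrt(dose)."""
    return (noise_reduction * k / target_sd) ** 2

# Calibrate k so that FBP at 8 mGy gives SD = 25 HU (the phantom operating point above).
k = 25 * math.sqrt(8.0)

dose_fbp = required_dose(25, k)                                     # 8.0 mGy with FBP
dose_asir = required_dose(25, k, noise_reduction=1 / math.sqrt(2))  # 4.0 mGy with ~29% less noise
```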
Abstract:
BACKGROUND: The efficacy of cardiac pacing for the prevention of syncopal recurrences in patients with neurally mediated syncope is controversial. We wanted to determine whether pacing therapy reduces syncopal recurrences in patients with severe asystolic neurally mediated syncope. METHODS AND RESULTS: Double-blind, randomized, placebo-controlled study conducted in 29 centers in the Third International Study on Syncope of Uncertain Etiology (ISSUE-3) trial. Patients were ≥40 years old and had experienced ≥3 syncopal episodes in the previous 2 years. Initially, 511 patients received an implantable loop recorder; 89 of these had documentation of syncope with ≥3 s of asystole, or ≥6 s of asystole without syncope, within 12 ± 10 months and met the criteria for pacemaker implantation; 77 of the 89 patients were randomly assigned to dual-chamber pacing with rate drop response or to sensing only. The data were analyzed on the intention-to-treat principle. Syncope recurred during follow-up in 27 patients, 19 of whom had been assigned to pacemaker OFF and 8 to pacemaker ON. The 2-year estimated syncope recurrence rate was 57% (95% CI, 40-74) with pacemaker OFF and 25% (95% CI, 13-45) with pacemaker ON (log-rank P=0.039, at the threshold of statistical significance of 0.04). The risk of recurrence was reduced by 57% (95% CI, 4-81). Five patients had procedural complications: lead dislodgment requiring correction in 4 and subclavian vein thrombosis in 1. CONCLUSIONS: Dual-chamber permanent pacing is effective in reducing the recurrence of syncope in patients ≥40 years old with severe asystolic neurally mediated syncope. The observed 32% absolute and 57% relative reduction in syncope recurrence supports this invasive treatment for the relatively benign neurally mediated syncope. CLINICAL TRIAL REGISTRATION: URL: http://www.clinicaltrials.gov. Unique identifier: NCT00359203.
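The reported absolute and relative reductions follow from the two recurrence rates (note the trial's 57% relative figure comes from the time-to-event model; the crude ratio of the 2-year rates gives a value close to it):

```python
rate_off = 0.57  # 2-year estimated recurrence, pacemaker OFF
rate_on = 0.25   # 2-year estimated recurrence, pacemaker ON

absolute_reduction = rate_off - rate_on             # 0.32 -> the reported 32%
relative_reduction = absolute_reduction / rate_off  # ~0.56, close to the reported 57%
```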
Abstract:
A new aggregation method for decision making is presented, based on induced aggregation operators and the index of maximum and minimum level. Its main advantage is that it can assess complex reordering processes in the aggregation, representing complex attitudinal characters of the decision maker such as psychological or personal factors. A wide range of properties and particular cases of this new approach are studied. A further generalization using hybrid averages and immediate weights is also presented. The key advantage of this approach over the previous model is that the weighted average and the ordered weighted average can be used in the same formulation. Thus, we are able to consider the subjective attitude and the degree of optimism of the decision maker in the decision process. The paper ends with an application to a decision-making problem based on the use of assignment theory.
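To make the combination of the weighted average (WA) and the ordered weighted average (OWA) concrete, here is a minimal sketch. The "immediate weights" construction shown (multiplying each OWA positional weight by the WA weight of the argument occupying that ordered position, then renormalizing) is one common variant and an assumption here, not necessarily the exact formulation of the paper:

```python
def wa(values, weights):
    """Plain weighted average: weights attach to the sources of the values."""
    return sum(w * v for w, v in zip(weights, values))

def owa(values, weights):
    """Ordered weighted average: weights attach to ordered positions
    (values sorted in descending order), encoding the degree of optimism."""
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

def immediate_weights_avg(values, wa_weights, owa_weights):
    """One 'immediate weights' combination of WA and OWA: each OWA weight is
    multiplied by the WA weight of the argument occupying that ordered
    position, and the products are renormalized to sum to one."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    raw = [owa_weights[j] * wa_weights[order[j]] for j in range(len(values))]
    total = sum(raw)
    return sum((raw[j] / total) * values[order[j]] for j in range(len(values)))
```

With payoffs [70, 50, 90], source weights [0.5, 0.3, 0.2] and position weights [0.4, 0.4, 0.2], the WA gives 68, the OWA gives 74, and the combined operator lands between the two, blending the subjective importance of sources with the decision maker's optimism.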
Abstract:
The construction from 2004 of a second metro line (M2) through downtown Lausanne was the opportunity to develop a methodology for microgravity surveys in a disturbed urban environment. Terrain corrections take on a special meaning in such a setting, because many non-geological objects of anthropogenic origin, such as empty basements of all kinds, perturb the gravity measurements. The preliminary civil engineering studies for the metro provided a large amount of cadastral information, notably on building outlines, the planned position of the M2 tube, basement depths in the vicinity of the tube, and the geology encountered along the M2 corridor (derived from the lithological data of geotechnical boreholes). The basement footprints were derived from the building outlines in a GIS (Geographic Information System), while a door-to-door survey was needed to measure or estimate basement heights. Starting from an existing DEM (Digital Elevation Model) on a one-metre grid, it was then possible to update it with the voids that these basements represent. The gravity measurement cycles were processed in Access databases, allowing greater control of the data, faster processing, and easier retroactive terrain correction, notably when the topography is updated during construction. The Caroline district (between the Bessières bridge and the Place de l'Ours) was chosen as the study area, because the pre- and post-excavation phases of the M2 tunnel both fell within the period of this thesis. This allowed us to conduct two gravity surveys (before excavation, in summer 2005, and after excavation, in summer 2007). These re-occupations enabled us to test our modeling of the tunnel: by comparing the difference between the two surveys with the gravity response of the tube model discretized into rectangular prisms, we were able to validate our modeling method. The modeling approach we developed makes it possible to build the shape of the object in detail, with the ability to cross geological interfaces and the topographic surface several times. This type of modeling can be applied to any linear anthropogenic structure.
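The gravity effect of the tunnel was modelled with rectangular prisms; as a much simpler illustrative stand-in, the vertical attraction of an infinite horizontal cylinder gives the order of magnitude of the signal such a void produces. All numbers below are hypothetical, not the actual M2 geometry:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def cylinder_gravity_anomaly(x, depth, radius, delta_rho):
    """Vertical gravity effect (m/s^2) at horizontal offset x from the axis of
    an infinite horizontal cylinder buried at `depth` (to the axis), with the
    given `radius` and density contrast `delta_rho` (negative for a void)."""
    return 2 * math.pi * G * delta_rho * radius**2 * depth / (x**2 + depth**2)

# Hypothetical tunnel: 5 m diameter, axis 15 m deep, -2000 kg/m^3 contrast.
anomaly_ugal = cylinder_gravity_anomaly(0.0, 15.0, 2.5, -2000.0) * 1e8  # microGal
```

Directly above the axis this yields a negative anomaly of a few tens of microGal, which is why microgravity repeats before and after excavation can resolve the tunnel once basements and topography are corrected for.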
Abstract:
The main objective of this work was to compare two methods of estimating the deposition of pesticide applied by aerial spraying. One hundred and fifty pieces of water-sensitive paper were distributed over an area 50 m long by 75 m wide to sample the droplets sprayed by an aircraft calibrated to apply a spray volume of 32 L/ha. The samples were analysed by a visual microscopic method using an NG 2 Porton graticule and by an image-analysis computer program. The results obtained by the visual microscopic method were as follows: volume median diameter, 398±62 μm; number median diameter, 159±22 μm; droplet density, 22.5±7.0 droplets/cm²; and estimated deposited volume, 22.2±9.4 L/ha. The respective values obtained with the computer program were 402±58 μm, 161±32 μm, 21.9±7.5 droplets/cm², and 21.9±9.2 L/ha. Graphs of the spatial distribution of droplet density and deposited spray volume over the area were produced by the computer program.
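The two diameter statistics above differ only in how each droplet is weighted: the number median diameter weights every droplet equally, while the volume median diameter weights each droplet by d³, so a few large droplets dominate it. A minimal sketch (the interpolation between size classes that actual analysers perform is omitted, and the diameters are made-up):

```python
def median_diameter(diams, weight=lambda d: 1.0):
    """Diameter at which the cumulative weight first reaches half the total
    (no interpolation; the droplet crossing 50% is returned)."""
    ds = sorted(diams)
    total = sum(weight(d) for d in ds)
    acc = 0.0
    for d in ds:
        acc += weight(d)
        if acc >= total / 2:
            return d

diams = [100, 150, 200, 250, 400]  # hypothetical droplet diameters, micrometres
nmd = median_diameter(diams)                         # number median diameter
vmd = median_diameter(diams, weight=lambda d: d**3)  # volume median diameter
```

As in the field results, the VMD comes out well above the NMD because the largest droplets carry most of the deposited volume.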
Abstract:
Increasing anthropogenic pressures urge enhanced knowledge and understanding of the current state of marine biodiversity. This baseline information is pivotal to explore present trends, detect future modifications and propose adequate management actions for marine ecosystems. Coralligenous outcrops are a highly diverse and structurally complex deep-water habitat faced with major threats in the Mediterranean Sea. Despite its ecological, aesthetic and economic value, coralligenous biodiversity patterns are still poorly understood. There is currently no single sampling method that has been demonstrated to be sufficiently representative to ensure adequate community assessment and monitoring in this habitat. Therefore, we propose a rapid non-destructive protocol for biodiversity assessment and monitoring of coralligenous outcrops providing good estimates of its structure and species composition, based on photographic sampling and the determination of presence/absence of macrobenthic species. We used an extensive photographic survey, covering several spatial scales (100s of m to 100s of km) within the NW Mediterranean and including 2 different coralligenous assemblages: the Paramuricea clavata assemblage (PCA) and the Corallium rubrum assemblage (CRA). This approach allowed us to determine the minimal sampling area for each assemblage (5000 cm² for PCA and 2500 cm² for CRA). In addition, we conclude that 3 replicates provide an optimal sampling effort to maximize the species number and to assess the main biodiversity patterns of the studied assemblages in variability studies requiring replicates. We contend that the proposed sampling approach provides a valuable tool for management and conservation planning, monitoring and research programs focused on coralligenous outcrops, and is potentially also applicable to other benthic ecosystems.
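The minimal-area and replicate-number decisions above rest on a species accumulation curve: photo quadrats are added until an extra quadrat contributes few new species. A minimal sketch, where the 5% gain threshold is an arbitrary illustrative criterion rather than the one used in the study:

```python
def species_accumulation(quadrats):
    """quadrats: sets of species seen in each photo quadrat, in sampling order.
    Returns the cumulative species-richness curve."""
    seen, curve = set(), []
    for q in quadrats:
        seen |= q
        curve.append(len(seen))
    return curve

def minimal_quadrats(curve, gain_threshold=0.05):
    """Smallest number of quadrats after which adding one more raises
    richness by less than gain_threshold (an arbitrary 5% criterion here)."""
    for i in range(1, len(curve)):
        if (curve[i] - curve[i - 1]) / curve[i - 1] < gain_threshold:
            return i
    return len(curve)
```

In practice the curve would be averaged over many random orderings of the quadrats before applying the stopping criterion, since a single ordering is sensitive to which quadrats happen to come first.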
Abstract:
Alveolar haemorrhage (AH) is a rare and potentially life-threatening condition characterised by diffuse blood leakage from the pulmonary microcirculation into the alveolar spaces due to microvascular damage. It is not a single disease but a clinical syndrome that may have numerous causes. Autoimmune disorders account for fewer than half of cases, whereas the majority are due to nonimmune causes such as left heart disease, infections, drug toxicities, coagulopathies and malignancies. The clinical picture includes haemoptysis, diffuse alveolar opacities at imaging and anaemia. Bronchoalveolar lavage is the gold standard method for diagnosing AH. The lavage fluid appears macroscopically haemorrhagic and/or contains numerous haemosiderin-laden macrophages. The diagnostic work-up includes a search for autoimmune disorders, review of drugs and exposures, assessment of coagulation and left heart function, and a search for infectious agents. Renal biopsy is often indicated if AH is associated with renal involvement, whereas lung biopsy is only rarely useful. Therapy aims at correcting reversible factors, with immunosuppressive therapy for autoimmune causes and plasmapheresis in selected situations.
Abstract:
A comprehensive field detection method is proposed, aimed at developing an advanced capability for reliable monitoring, inspection and life estimation of bridge infrastructure. The goal is to use motion-sensing radio-frequency identification transponders (RFIDs) for fully adaptive bridge monitoring, minimizing the problems inherent in human inspections of bridges. We developed a novel integrated condition-based maintenance (CBM) framework that combines transformative research in RFID sensors and sensing architecture for in-situ scour monitoring with state-of-the-art, computationally efficient multiscale modeling for scour assessment.
Abstract:
Even though much improvement has been made in plant transformation methods, the screening of transgenic plants is often laborious. Most approaches for detecting the transgene in transformed plants are still time-consuming and can be quite expensive. The objective of this study was to search for a simpler method to screen for transgenic plants. The infiltration of kanamycin (100 mg/mL) into tobacco leaves resulted in conspicuous chlorotic spots on the leaves of non-transgenic plants, while no spots were seen on the leaves of transformed plants. This reaction occurred regardless of the age of the tested plants, and the method has proven to be simple, fast, non-destructive, relatively cheap, and reliable. The results were comparable to those obtained by polymerase chain reaction (PCR) amplification of the transgene using specific primers.
Abstract:
The ability of a PCR-based restriction fragment length polymorphism (RFLP) analysis of the cytochrome b gene (mtDNA) to distinguish Apodemus alpicola from two other Apodemus species was investigated. Partial sequencing of the cytochrome b gene allowed the identification of one enzyme as potentially diagnostic. This was supported by an analysis of 131 specimens previously identified using morphometric and/or allozymic data, indicating that the PCR-based RFLP method provides a rapid and reliable tool for distinguishing A. alpicola from its two co-occurring congeners. The method is applicable to samples taken in the field for ecological studies, and could easily be adapted to the identification of museum samples.
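The logic of a PCR-RFLP diagnosis is that the diagnostic enzyme cuts the amplified cytochrome b fragment of one species but not of the others, producing distinct band patterns on a gel. A minimal sketch with made-up sequences, using the EcoRI site (GAATTC) purely as illustration since the abstract does not name the actual enzyme:

```python
def digest(seq, site):
    """Cut `seq` at every occurrence of the recognition `site` (cut position
    taken at the start of the site, for simplicity) and return fragment lengths."""
    cuts, start = [], 0
    while True:
        i = seq.find(site, start)
        if i == -1:
            break
        cuts.append(i)
        start = i + 1
    bounds = [0] + cuts + [len(seq)]
    return [bounds[j + 1] - bounds[j] for j in range(len(bounds) - 1)]

amp_with_site = "AAGAATTCAAAA"   # hypothetical amplicon carrying the site
amp_without   = "AAACCCAAAAAA"   # hypothetical amplicon lacking it
```

Running `digest` on the first amplicon returns two fragment lengths (two bands), on the second a single full-length fragment (one band), which is the visual difference scored on the gel.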
Abstract:
A simple method for determining airborne monoethanolamine has been developed. Monoethanolamine determination has traditionally been difficult owing to analytical separation problems. Even in recent sophisticated methods, this difficulty remains the major issue, often resulting in time-consuming sample preparation. Impregnated glass-fiber filters were used for sampling. Desorption of monoethanolamine was followed by capillary GC analysis with nitrogen-phosphorus selective detection. Separation was achieved using a column suited to ethanolamines (35% diphenyl and 65% dimethyl polysiloxane). The internal standard was quinoline. No derivatization steps were needed. The calibration range was 0.5-80 μg/mL with good correlation (R² = 0.996). Averaged overall precisions and accuracies were 4.8% and -7.8% for intraday (n = 30), and 10.5% and -5.9% for interday (n = 72). Mean recovery from spiked filters was 92.8% for the intraday variation and 94.1% for the interday variation. Monoethanolamine on stored spiked filters was stable for at least 4 weeks at 5°C. The newly developed method was applied among professional cleaners; air concentrations (n = 4) were 0.42 and 0.17 mg/m³ for personal measurements and 0.23 and 0.43 mg/m³ for stationary measurements. The method described here is simple, sensitive, and convenient in terms of both sampling and analysis.
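The calibration step amounts to an ordinary least-squares fit of detector response (relative to the quinoline internal standard) against concentration, with R² as the quality figure. A self-contained sketch; the calibration points below are hypothetical, not the study's data:

```python
def linfit(x, y):
    """Ordinary least-squares fit y = slope*x + intercept, plus R^2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical calibration points: concentration (ug/mL) vs. peak-area ratio.
conc = [0.5, 5, 10, 20, 40, 80]
ratio = [0.04, 0.41, 0.79, 1.62, 3.18, 6.45]
slope, intercept, r2 = linfit(conc, ratio)
```

Unknown air samples are then quantified by inverting the fitted line, (ratio − intercept) / slope, provided the result falls inside the calibrated 0.5-80 μg/mL range.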
Abstract:
A new method was developed for breaking high-strength prestressed cable. The old method used aluminum oxide grit packed into a special gripping jaw. The new method wraps aluminum shims around the cable, which is then gripped with a V-grip. The new method gives nearly 100% "good breaks" on the cable, compared with approximately 10% good breaks with the old method. In addition, the new cable-breaking method gives higher ultimate tensile strengths, is more reproducible, and is quicker, cleaner and easier on equipment.
Abstract:
Background: Conventional magnetic resonance imaging (MRI) techniques are highly sensitive for detecting multiple sclerosis (MS) plaques, enabling a quantitative assessment of inflammatory activity and lesion load. In quantitative analyses of focal lesions, manual or semi-automated segmentations have been widely used to compute the total number of lesions and the total lesion volume. These techniques, however, are both challenging and time-consuming, and are also prone to intra-observer and inter-observer variability. Aim: To develop an automated approach to segment brain tissues and MS lesions from brain MRI images. The goal is to reduce user interaction and to provide an objective tool that eliminates the inter- and intra-observer variability. Methods: Based on the recent methods developed by Souplet et al. and de Boer et al., we propose a novel pipeline comprising the following steps: bias correction, skull stripping, atlas registration, tissue classification, and lesion segmentation. After the initial pre-processing steps, an MRI scan is automatically segmented into 4 classes: white matter (WM), grey matter (GM), cerebrospinal fluid (CSF) and partial volume. An expectation-maximisation method that fits a multivariate Gaussian mixture model to the T1-w, T2-w and PD-w images is used for this purpose. Based on the obtained tissue masks, and using the estimated GM mean and variance, we apply an intensity threshold to the FLAIR image, which provides the lesion segmentation. To improve this initial result, spatial information from the neighbouring tissue labels is used to refine the final lesion segmentation. Results: The experimental evaluation was performed using real 1.5T data sets and the corresponding ground-truth annotations provided by expert radiologists. The following values were obtained: 64% true positive (TP) fraction, 80% false positive (FP) fraction, and an average surface distance of 7.89 mm.
The results of our approach were quantitatively compared with our implementations of the methods of Souplet et al. and de Boer et al., obtaining higher TP and lower FP values. Conclusion: Promising MS lesion segmentation results have been obtained in terms of TP. However, the high number of FP, which is still a well-known problem of all automated MS lesion segmentation approaches, has to be reduced before such methods can be used in standard clinical practice. Our future work will focus on tackling this issue.
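The GM-based intensity threshold on FLAIR described above can be sketched in a few lines of NumPy. The factor k applied to the GM standard deviation is a tunable assumption for illustration, not a value reported by the authors:

```python
import numpy as np

def lesion_mask(flair, gm_mask, brain_mask, k=3.0):
    """Initial lesion mask: brain voxels whose FLAIR intensity exceeds
    mean_GM + k * std_GM, with GM statistics taken from the tissue
    classification (the pipeline then refines this mask using the
    neighbouring tissue labels)."""
    gm = flair[gm_mask]
    thr = gm.mean() + k * gm.std()
    return brain_mask & (flair > thr)
```

Because MS lesions appear hyperintense on FLAIR while normal GM defines the upper end of healthy intensities, voxels far above the GM distribution are flagged; choosing k trades TP fraction against the FP problem discussed in the conclusion.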