917 results for post-processing method
Abstract:
Ultrasmall superparamagnetic iron oxide (USPIO) particles are promising contrast media, especially for molecular and cellular imaging as well as lymph node staging, owing to their superior NMR efficacy, macrophage uptake and lymphotropic properties. The goal of the present prospective clinical work was to validate quantification of the signal decrease on high-resolution T2-weighted MR sequences before and 24-36 h after USPIO administration for accurate differentiation between benign and malignant normal-sized pelvic lymph nodes. Fifty-eight patients with bladder or prostate cancer were examined on a 3 T MR unit, and their respective lymph node signal intensities (SI), signal-to-noise ratios (SNR) and contrast-to-noise ratios (CNR) were determined on pre- and post-contrast 3D T2-weighted turbo spin echo (TSE) images. Based on histology and/or localization, the USPIO-uptake-related SI/SNR decrease of benign vs malignant and pelvic vs inguinal lymph nodes was compared. Out of 2182 resected lymph nodes, 366 were selected for MRI post-processing. Benign pelvic lymph nodes showed a significantly higher SI/SNR decrease than malignant nodes (p < 0.0001). Inguinal lymph nodes presented a smaller SI/SNR decrease than pelvic lymph nodes (p < 0.0001). CNR did not differ significantly between benign and malignant lymph nodes. Receiver operating characteristic (ROC) analysis yielded an area under the curve of 0.96, and the point of optimal accuracy was found at a threshold of 13.5% SNR decrease. The overlap of SI and SNR changes between benign and malignant lymph nodes was attributed to partial voluming, lipomatosis, histiocytosis or focal lymphoreticular hyperplasia. USPIO-enhanced MRI improves the diagnostic accuracy of lymph node staging in normal-sized lymph nodes, although some overlap of SI/SNR changes remained. Quantification of the USPIO-dependent SNR decrease will enable the validation of this promising technique, with the final goal of improving and individualizing patient care.
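The quantification described above reduces to a simple relative-change computation per node. The following is a minimal sketch in Python; only the 13.5% cut-off comes from the abstract's ROC analysis, while the function names and example ROI values are illustrative assumptions, not the study's actual pipeline.

```python
# Hedged sketch of the SNR-decrease quantification; example values are invented.

THRESHOLD = 13.5  # % SNR decrease at the optimal-accuracy ROC point (from the abstract)

def snr_decrease_percent(snr_pre: float, snr_post: float) -> float:
    """Relative SNR decrease between pre- and post-USPIO images, in percent."""
    return 100.0 * (snr_pre - snr_post) / snr_pre

def classify_node(snr_pre: float, snr_post: float) -> str:
    # Benign nodes take up USPIO via macrophages and lose T2-weighted signal;
    # malignant nodes retain signal, hence a smaller SNR decrease.
    drop = snr_decrease_percent(snr_pre, snr_post)
    return "benign" if drop >= THRESHOLD else "suspicious for malignancy"

print(classify_node(snr_pre=48.0, snr_post=31.0))  # ~35% drop -> benign
print(classify_node(snr_pre=45.0, snr_post=42.0))  # ~7% drop  -> suspicious
```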
Abstract:
PURPOSE: To prospectively evaluate whether intravenous morphine co-medication improves bile duct visualization of dual-energy CT-cholangiography. MATERIALS AND METHODS: Forty potential donors for living-related liver transplantation underwent CT-cholangiography with infusion of a hepatobiliary contrast agent over 40 min. Twenty minutes after the beginning of the contrast agent infusion, either normal saline (n=20 patients; control group [CG]) or morphine sulfate (n=20 patients; morphine group [MG]) was injected. Forty-five minutes after initiation of the contrast agent, a dual-energy CT acquisition of the liver was performed. Applying dual-energy post-processing, pure iodine images were generated. Primary study goals were determination of bile duct diameters and visualization scores (on a scale of 0 to 3: 0 = not visualized; 3 = excellent visualization). RESULTS: Bile duct visualization scores for second-order and third-order branch ducts were significantly higher in the MG compared to the CG (2.9±0.1 versus 2.6±0.2 [P<0.001] and 2.7±0.3 versus 2.1±0.6 [P<0.01], respectively). Bile duct diameters for the common duct and main ducts were significantly higher in the MG compared to the CG (5.9±1.3 mm versus 4.9±1.3 mm [P<0.05] and 3.7±1.3 mm versus 2.6±0.5 mm [P<0.01], respectively). CONCLUSION: Intravenous morphine co-medication significantly improved biliary visualization on dual-energy CT-cholangiography in potential donors for living-related liver transplantation.
Abstract:
The aim of this prospective trial was to evaluate the sensitivity and specificity of bright-lumen magnetic resonance colonography (MRC) in comparison with conventional colonoscopy (CC). A total of 120 consecutive patients with clinical indications for CC were prospectively examined using MRC (1.5 T), followed by CC. Prior to MRC, the cleansed colon was filled with a gadolinium-water solution. A 3D GRE sequence was performed with the patient in the prone and supine positions, each acquired during one breath-hold period. After division of the colon into five segments, interactive data analysis was carried out using three-dimensional post-processing, including a virtual intraluminal view. The results of CC served as the reference standard. MRC was performed successfully in all patients and no complications occurred. Image quality was diagnostic in 92% (574/620 colonic segments). On a per-patient basis, the results of MRC were as follows: sensitivity 84% (95% CI 71.7-92.3%), specificity 97% (95% CI 89.0-99.6%). Five flat adenomas and 6/16 small polyps (≤5 mm) were not identified by MRC. MRC offers high sensitivity and excellent specificity in patients with clinical indications for CC. Improved MRC techniques are needed to detect small polyps and flat adenomas.
Abstract:
This study focuses on a specific engine, i.e., a dual-spool, separate-flow turbofan engine with an Interstage Turbine Burner (ITB). This conventional turbofan engine has been modified to include a secondary isobaric burner, the ITB, in a transition duct between the high-pressure turbine and the low-pressure turbine. The preliminary design phase for this modified engine starts with an aerothermodynamic cycle analysis consisting of parametric (i.e., on-design) and performance (i.e., off-design) cycle analyses. In the parametric analysis, the performance parameters of the modified engine are evaluated and compared with the baseline engine in terms of design limitations (maximum turbine inlet temperature), flight conditions (such as flight Mach number, ambient temperature and pressure), and design choices (such as compressor pressure ratio, fan pressure ratio, and fan bypass ratio). A turbine cooling model is also included to account for the effect of cooling air on engine performance. The results from the on-design analysis confirmed the advantage of using an ITB, i.e., higher specific thrust with small increases in thrust-specific fuel consumption, less cooling air, and less NOx production, provided that the main burner exit temperature and the ITB exit temperature are properly specified. It is also important to identify the critical ITB temperature, beyond which the ITB offers no advantage and should be turned off. With these encouraging results from the parametric cycle analysis, a detailed performance cycle analysis of the same engine was also conducted for steady-state engine performance prediction. The results from the off-design cycle analysis show that the ITB engine at full throttle has enhanced performance over the baseline engine. Furthermore, the ITB engine operating at partial throttle settings exhibits higher thrust at lower specific fuel consumption and improved thermal efficiency over the baseline engine. A mission analysis is also presented to predict the fuel consumption in certain mission phases. Excel macro code written in Visual Basic for Applications, combined with Excel cells, enables Excel to perform these cycle analyses. These user-friendly programs compute and plot the data sequentially without forcing users to open separate post-processing programs.
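One concrete piece of such an on-design analysis is the steady-flow energy balance that fixes the fuel-air ratio of each burner, main burner and ITB alike. A minimal Python sketch follows; the specific-heat, heating-value and station-temperature numbers are illustrative assumptions, not values from the study, and this is not the study's VBA tool.

```python
# Ideal-burner energy balance: cp*(T_out - T_in) = f*(h_PR - cp*T_out),
# solved for the fuel-air ratio f. All constants below are assumed.
CP_HOT = 1156.0   # J/(kg K), hot-gas specific heat (assumed)
H_PR   = 42.8e6   # J/kg, fuel lower heating value (assumed)

def burner_fuel_air_ratio(T_in: float, T_out: float) -> float:
    """Fuel-air ratio needed to raise the flow from T_in to T_out (ideal burner)."""
    return CP_HOT * (T_out - T_in) / (H_PR - CP_HOT * T_out)

# Main burner: compressor exit -> turbine inlet temperature limit (assumed values)
f_main = burner_fuel_air_ratio(T_in=850.0, T_out=1800.0)
# ITB: HP-turbine exit -> ITB exit temperature, which must stay below the
# critical value past which the abstract notes the ITB gives no benefit
f_itb = burner_fuel_air_ratio(T_in=1400.0, T_out=1600.0)
print(f"main burner f = {f_main:.4f}, ITB f = {f_itb:.4f}")
```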
Abstract:
The primary goal of this project is to demonstrate the practical use of data-mining algorithms to cluster a solved steady-state computational fluid dynamics (CFD) flow domain into a simplified lumped-parameter network. A commercial-quality code, "cfdMine", was created using volume-weighted k-means clustering; it can cluster a 20-million-cell CFD domain on a single CPU in several hours or less. Additionally, agglomeration and Mahalanobis-distance k-means were added as optional post-processing steps to further enhance the separation of the clusters. The resultant nodal network is considered a reduced-order model and can be solved transiently at very low computational cost. The reduced-order network is then instantiated in the commercial thermal solver MuSES to perform transient conjugate heat transfer, with convection predicted by the lumped network (based on steady-state CFD). When inserting the lumped nodal network into a MuSES model, the potential for developing a "localized heat transfer coefficient" is shown to be an improvement over existing techniques. The clustering was also found to provide a new flow-visualization technique. Finally, fixing clusters near equipment demonstrates a new capability to track temperatures near specific objects (such as equipment in vehicles).
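The core step is an ordinary k-means loop in which each CFD cell's pull on its cluster centroid is weighted by the cell volume, so large cells dominate the lumped nodes as they do the flow solution. A minimal sketch under that assumption follows; it is not the cfdMine code, and the feature choice (cell-centre position plus velocity, say) is up to the caller.

```python
# Volume-weighted k-means sketch: the centroid update is a weighted mean.
import numpy as np

def volume_weighted_kmeans(X, vol, k, iters=50, seed=0):
    """X: (n, d) cell-centre features; vol: (n,) cell volumes; returns labels, centroids."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each cell to its nearest centroid
        d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # volume-weighted centroid update
        for j in range(k):
            m = labels == j
            if m.any():
                w = vol[m]
                centroids[j] = (X[m] * w[:, None]).sum(axis=0) / w.sum()
    return labels, centroids

# Toy usage with random "cells":
rng = np.random.default_rng(1)
labels, c = volume_weighted_kmeans(rng.normal(size=(1000, 3)), rng.uniform(0.1, 1.0, 1000), k=8)
print(c.shape)  # (8, 3)
```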
Abstract:
Non-invasive imaging methods are increasingly entering the field of forensic medicine. Given the intricacies of classical neck dissection techniques, postmortem imaging may provide new diagnostic possibilities that could also improve forensic reconstruction. The aim of this study was to determine the value of postmortem neck imaging in comparison to forensic autopsy regarding the evaluation of the cause of death and the analysis of biomechanical aspects of neck trauma. For this purpose, five deceased persons (one female and four male; mean age 49.8 years, range 20-80 years) who had suffered odontoid fractures or atlantoaxial distractions, with or without medullary injuries, were studied using multislice computed tomography (MSCT), magnetic resonance imaging (MRI) and subsequent forensic autopsy. Evaluation of the findings was performed by radiologists, forensic pathologists and neuropathologists. The cause of death could be established radiologically in three of the five cases. MRI data were insufficient due to metal artefacts in one case, and in another, ascending medullary edema as the cause of delayed death was only detected by histological analysis. Regarding forensic reconstruction, the imaging methods were superior to autopsy neck exploration in all cases owing to the post-processing possibilities for viewing the imaging data. In living patients who suffer medullary injury, follow-up MRI should be considered to exclude ascending medullary edema.
Abstract:
BACKGROUND Nebulized surfactant therapy has been proposed as an alternative method of surfactant administration. The use of a perforated vibrating-membrane nebulizer provides a variety of advantages over conventional nebulizers. We investigated the molecular structure and integrity of poractant alfa pre- and post-nebulization. METHOD Curosurf® was nebulized using an Investigational eFlow® Nebulizer System. Non-nebulized surfactant ("NN"), surfactant droplets recollected after nebulization through an endotracheal tube ("NT") and surfactant nebulized directly onto a surface ("ND") were investigated by transmission electron microscopy. Biophysical characteristics were assessed with the Langmuir-Wilhelmy balance and the Captive Bubble Surfactometer. RESULTS Volume densities of lamellar-body-like forms (LBL) and multi-lamellar forms (ML) were high for the "NN" and "NT" samples (38.8% vs. 47.7% for LBL and 58.2% vs. 47.8% for ML). In the "ND" sample we found virtually no LBLs; instead, MLs (72.6%) and uni-lamellar forms (16.4%) dominated, along with a new structure, the "garland-like" forms (9.4%). Surface tension for "NN" and "NT" was 23.33 ± 0.29 and 25.77 ± 1.12 mN/m, respectively. Minimum surface tensions under dynamic compression-expansion cycling were between 0.91 and 1.77 mN/m. CONCLUSION The similarity in surfactant characteristics between surfactant nebulized through a tube and non-nebulized surfactant suggests that vibrating-membrane nebulizers are suitable for surfactant nebulization. Alterations in surfactant morphology and characteristics after nebulization were transient. A new structural subtype of surfactant was identified. Pediatr Pulmonol. 2014; 49:348-356.
Abstract:
In 1999, all student teachers at secondary level I at the University of Bern who had to undertake an internship were asked to participate in a study on learning processes during the practicum: 150 students and their mentors in three types of practicum participated—introductory practicum (after the first half-year of studies), intermediate practicum (after two years of studies) and final practicum (after three years of studies). At the end of the practicum, student teachers and mentors completed questionnaires on preparing, teaching and post-processing lessons. All student teachers additionally rated their professional skills and aspects of personality (attitudes towards pupils, self-assuredness and well-being) before and after the practicum. Forty-six student teachers wrote daily semi-structured diaries about essential learning situations during their practicum. Results indicate that in each practicum students improved significantly in preparing, conducting and post-processing lessons. The mentors rated these changes as greater than the student teachers did. From the perspective of the student teachers, their general teaching skills also improved and their attitudes toward pupils became more open. Furthermore, during the practicum their self-esteem and subjective well-being increased. The diary data confirmed that there are no differences between the levels of practicum in terms of learning outcomes, but they give a first insight into different ways of learning during the internship.
Abstract:
The domain of context-free languages has been extensively explored and numerous techniques exist for parsing (all or a subset of) context-free languages. Unfortunately, some programming languages are not context-free. Using standard context-free parsing techniques to parse a context-sensitive programming language poses a considerable challenge. Implementors of programming language parsers have adopted various techniques, such as hand-written parsers, special lexers, or post-processing of an ambiguous parser output, to deal with that challenge. In this paper we suggest a simple extension of a top-down parser with contextual information. Contrary to the traditional approach that uses only the input stream as input to a parsing function, we use a parsing context that provides access to a stream and possibly to other context-sensitive information. At the same time we keep the context-free formalism, so a grammar definition stays simple, without mind-blowing context-sensitive rules. We show that our approach can be used for various purposes, such as indent-sensitive parsing, high-precision island parsing, or XML parsing (with arbitrary element names). We demonstrate our solution with PetitParser, a parsing-expression-grammar-based, top-down parser combinator framework written in Smalltalk.
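To make the idea concrete, here is a toy sketch in Python rather than the paper's Smalltalk/PetitParser setting: parsing functions receive a context object carrying the stream position plus an indentation stack, instead of a bare input stream, which is enough for indent-sensitive parsing. All names here are illustrative assumptions, not the PetitParser API.

```python
# Toy "parsing context" combinators: shared mutable state beside the stream.
from dataclasses import dataclass, field

@dataclass
class Context:
    text: str
    pos: int = 0
    indents: list = field(default_factory=lambda: [0])  # context-sensitive state

def literal(s):
    def parse(ctx):
        if ctx.text.startswith(s, ctx.pos):
            ctx.pos += len(s)
            return s
        raise SyntaxError(f"expected {s!r} at {ctx.pos}")
    return parse

def indent(ctx):
    """Consume leading spaces and require a deeper indentation than the current one."""
    start = ctx.pos
    while ctx.pos < len(ctx.text) and ctx.text[ctx.pos] == " ":
        ctx.pos += 1
    width = ctx.pos - start
    if width <= ctx.indents[-1]:
        raise SyntaxError(f"expected indent > {ctx.indents[-1]} at {start}")
    ctx.indents.append(width)  # push the new indentation level onto the context
    return width

ctx = Context("if:\n    body\n")
literal("if:\n")(ctx)
indent(ctx)                  # succeeds: 4 > 0
print(literal("body")(ctx))  # -> 'body'
```

The grammar rules themselves stay context-free in shape; only the context object, threaded through every parsing function, carries the extra information.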
Abstract:
In clinical practice, traditional X-ray radiography is widely used, and knowledge of landmarks and contours in anteroposterior (AP) pelvis X-rays is invaluable for computer-aided diagnosis, hip surgery planning and image-guided interventions. This paper presents a fully automatic approach for landmark detection and shape segmentation of both pelvis and femur in conventional AP X-ray images. Our approach is based on the framework of landmark detection via Random Forest (RF) regression and shape regularization via hierarchical sparse shape composition. We propose a visual feature, FL-HoG (Flexible-Level Histogram of Oriented Gradients), and a feature selection algorithm based on trace-ratio optimization to improve the robustness and efficacy of RF-based landmark detection. The landmark detection result is then used in a hierarchical sparse shape composition framework for shape regularization. Finally, the extracted shape contour is fine-tuned by a post-processing step based on low-level image features. The experimental results demonstrate that our feature selection algorithm reduces the feature dimension by a factor of 40 and improves both training and test efficiency. Further experiments conducted on 436 clinical AP pelvis X-rays show that our approach achieves an average point-to-curve error of around 1.2 mm for the femur and 1.9 mm for the pelvis.
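The RF-regression idea can be sketched compactly: a forest is trained to regress, from a patch descriptor, the 2D offset from the patch centre to a landmark, and at test time every patch votes for the landmark position. The sketch below uses random stand-in data in place of FL-HoG descriptors and is an assumption-laden illustration, not the authors' pipeline.

```python
# RF-regression landmark voting sketch with synthetic stand-in features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_patches, n_features = 500, 36            # e.g. a 36-bin HoG-like descriptor (assumed)
features = rng.normal(size=(n_patches, n_features))
offsets = rng.normal(size=(n_patches, 2))  # (dx, dy) from patch centre to landmark

forest = RandomForestRegressor(n_estimators=100, random_state=0)
forest.fit(features, offsets)              # multi-output regression

# At detection time, patches at known centres each cast a vote:
centres = rng.uniform(0, 512, size=(200, 2))
test_feats = rng.normal(size=(200, n_features))
votes = centres + forest.predict(test_feats)
landmark = np.median(votes, axis=0)        # robust aggregation of the votes
print(landmark)
```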
Abstract:
Stray light contamination considerably reduces the precision of photometric measurements of faint stars for low-altitude spaceborne observatories. When measuring faint objects, stray light contamination must be dealt with in order to avoid systematic effects on low signal-to-noise images. Stray light contamination can be represented by a flat offset in CCD data. Mitigation begins with a comprehensive study during the design phase, followed by the use of target-pointing optimisation and post-processing methods. We present a code that aims at simulating the stray light contamination in low Earth orbit coming from reflection of solar light by the Earth. StrAy Light SimulAtor (SALSA) is a tool intended to be used at an early stage to evaluate the effective visible region of the sky and, therefore, to optimise the observation sequence. SALSA can compute Earth stray light contamination for significant periods of time, allowing mission-wide parameters to be optimised (e.g. imposing constraints on the point source transmission function (PST) and/or on the altitude of the satellite). It can also be used to study the behaviour of the stray light at different seasons or latitudes. Given the position of the satellite with respect to the Earth and the Sun, SALSA computes the stray light at the entrance of the telescope following a geometrical technique. After characterising the illuminated region of the Earth, the portion of the illuminated Earth that affects the satellite is calculated. Then the flux of reflected solar photons is evaluated at the entrance of the telescope. Using the PST of the instrument, the final stray light contamination at the detector is calculated. The analysis tools include time-series analysis of the contamination, evaluation of the sky coverage and an object-visibility predictor. Effects of the South Atlantic Anomaly and of any shutdown periods of the instrument can be added. Several designs or mission concepts can easily be tested and compared. The code is not conceived as a stand-alone mission designer: its mandatory inputs are a time series describing the trajectory of the satellite and the characteristics of the instrument. This software suite has been applied to the design and analysis of CHEOPS (CHaracterising ExOPlanet Satellite), a mission that requires very high precision photometry to detect very shallow transits of exoplanets. Different altitudes and characteristics of the detector have been studied in order to find the parameters that best reduce the effect of contamination.
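The geometrical step described above hinges on a simple double test per Earth surface element: it contributes stray light only if it is both sunlit and visible from the satellite. Here is a toy Python version of that test under unit-sphere geometry; the albedo/flux scaling and all numbers are illustrative assumptions, not the SALSA model.

```python
# Toy sunlit-and-visible test for an Earth surface element.
import numpy as np

R_EARTH = 6371e3  # m

def contributes(p_hat, sun_hat, sat_pos):
    """p_hat: unit outward normal of the surface element; sun_hat: unit vector
    to the Sun; sat_pos: satellite position in an Earth-centred frame (m)."""
    sunlit = np.dot(p_hat, sun_hat) > 0.0
    # Element sees the satellite iff the satellite is above its local horizon:
    # dot(p_hat, sat_pos - R*p_hat) = dot(p_hat, sat_pos) - R > 0
    visible = np.dot(p_hat, sat_pos) - R_EARTH > 0.0
    return sunlit and visible

sun = np.array([1.0, 0.0, 0.0])
sat = np.array([0.0, 0.0, R_EARTH + 700e3])   # ~700 km altitude, over the pole
theta = np.radians(20)                         # element 20 deg from the sub-satellite point
elem = np.array([np.sin(theta), 0.0, np.cos(theta)])
print(contributes(elem, sun, sat))             # True: sunlit and within the horizon
```

Summing the reflected solar flux over all contributing elements, then applying the instrument PST, yields the detector-level contamination in the manner the abstract describes.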
Abstract:
Today, little is known about the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in recent years, but are still limited with respect to accurate estimates of motion vector orientations and magnitudes. Such knowledge is needed especially for the preparation of Active Debris Removal (ADR) missions, as planned by ESA's Clean Space initiative, and for contingency scenarios for ESA spacecraft like ENVISAT. ESA's "Debris Attitude Motion Measurements and Modelling" project (ESA Contract No. 40000112447), led by the Astronomical Institute of the University of Bern (AIUB), addresses this problem. The goal of the project is to achieve a good understanding of the attitude evolution and of the considerable internal and external effects that occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF), and radar measurements and signal-level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). The In-Orbit Tumbling Analysis tool (ιOTA) is a prototype software package currently in development by Hyperschall Technologie Göttingen GmbH (HTG) within the framework of the project. ιOTA will be a highly modular software tool to perform short- (days), medium- (months) and long-term (years) propagation of the orbit and attitude motion (six degrees of freedom) of spacecraft in Earth orbit. The simulation takes into account all relevant forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micrometeoroid impact, as well as the optional definition of particular spacecraft-specific influences like tank sloshing, reaction wheel behaviour, magnetic torquer activity and thruster firing. The purpose of ιOTA is to provide high-accuracy short-term simulations to support observers and potential ADR missions, as well as medium- and long-term simulations to study the significance of the particular internal and external influences on the attitude, especially damping factors and momentum transfer. The simulation will also enable investigation of the altitude dependency of the particular external influences. ιOTA's post-processing modules will generate synthetic measurements for observers and for software validation. The validation of the software will be done by cross-calibration with observations and measurements acquired by the project partners.
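The rotational core of such a 6-DOF propagation is the integration of Euler's rigid-body equations under the acting torques. A minimal sketch follows, with an eddy-current-like damping torque standing in for the effect list above; the inertia tensor, damping coefficient and initial rates are assumptions for illustration, and this is not the ιOTA code.

```python
# Fixed-step integration of Euler's equations: I dω/dt = T − ω × (I ω).
import numpy as np

I = np.diag([1000.0, 1200.0, 800.0])    # principal inertia tensor, kg m^2 (assumed)
I_inv = np.linalg.inv(I)
omega = np.array([0.02, -0.01, 0.03])   # initial body rates, rad/s (assumed)
k_damp = 0.05                           # eddy-current-like damping coefficient (assumed)

dt, steps = 1.0, 86400                  # 1 s step over one day
for _ in range(steps):
    torque = -k_damp * omega            # crude damping model, stand-in for real torques
    omega_dot = I_inv @ (torque - np.cross(omega, I @ omega))
    omega = omega + dt * omega_dot      # explicit Euler step

print(f"residual spin after one day: {np.linalg.norm(omega):.2e} rad/s")
```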
Abstract:
OBJECTIVES
To test the applicability, accuracy, precision, and reproducibility of various 3D superimposition techniques for radiographic data, transformed to triangulated surface data.
METHODS
Five superimposition techniques (3P: three-point registration; AC: anterior cranial base; AC + F: anterior cranial base + foramen magnum; BZ: both zygomatic arches; 1Z: one zygomatic arch) were tested using eight pairs of pre-existing CT data (pre- and post-treatment). These were obtained from non-growing orthodontic patients treated with rapid maxillary expansion. All datasets were superimposed by three operators independently, who repeated the whole procedure one month later. Accuracy was assessed by the distance (D) between superimposed datasets on three form-stable anatomical areas, located on the anterior cranial base and the foramen magnum. Precision and reproducibility were assessed using the distances between models at four specific landmarks. Non-parametric multivariate models and Bland-Altman difference plots were used for the analyses.
RESULTS
There was no difference among operators or between time points in the accuracy of each superimposition technique (p>0.05). The AC + F technique was the most accurate (D<0.17 mm), as expected, followed by the AC and BZ superimpositions, which presented a similar level of accuracy (D<0.5 mm). 3P and 1Z were the least accurate superimpositions (0.79
Abstract:
Preservation of foods by hurdle technology is a processing alternative in which traditional preservation techniques are applied less intensively, through a combination of obstacles, in order to maintain the organoleptic qualities of the final product. The objective of this work was to develop a preservation method for beans using low-cost technology, with acceptability comparable to the conventional method (Appert method). The methodology used to prepare preserved beans by hurdle technology was as follows: selection, washing, soaking for 24 hours, draining, and pressure cooking in 0.5% brine for five minutes at 121°C. The beans were then packed in sterilized containers, and the boiling filling solution was added (vinegar 7.50%, lactic acid 0.30%, ascorbic acid 0.35%, citric acid 0.35%, salt 1.00% and sugar 0.50%). The product was not sterilized and was left to cool to room temperature. The quality of the preserves was determined after six months of storage by microbiological, physicochemical and sensory evaluation. The resulting preserve is acceptable, safe, and can be stored at room temperature.
Abstract:
We re-evaluate the Greenland mass balance for the recent period using low-pass Independent Component Analysis (ICA) post-processing of the Level-2 GRACE data (2002-2010) from different official providers (UTCSR, JPL, GFZ) and confirm the substantial ongoing ice mass loss of this ice sheet, in the range of -70 to -90 Gt/yr, due to negative contributions of the glaciers on the east coast. We highlight the high interannual variability of the mass variations of the Greenland Ice Sheet (GrIS), especially the recent deceleration of ice loss in 2009-2010, once seasonal cycles are robustly removed by Seasonal-Trend decomposition using Loess (STL). Interannual variability leads to varying trend estimates depending on the considered time span. Correction of post-glacial rebound effects on ice mass trend estimates represents no more than 8 Gt/yr over the whole ice sheet. We also investigate possible climatic causes that can explain these interannual ice mass variations, as strong correlations between the GRACE-based mass balance and atmosphere/ocean parameters are established: (1) changes in snow accumulation, and (2) the influence of inflows of warm ocean water that periodically accelerate the calving of glaciers in coastal regions, together with the feedback effect of coastal water cooling by fresh currents from glacier melting. These results suggest that the Greenland mass balance is driven by coastal sea surface temperature at time scales shorter than those of accumulation.
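The seasonal-cycle removal step can be illustrated with an STL fit on a monthly mass-anomaly series. The sketch below uses a synthetic series standing in for a GRACE-derived GrIS time series (the -80 Gt/yr trend and amplitude are invented for the example, not the study's data):

```python
# STL-based deseasonalization sketch on a synthetic monthly mass series.
import numpy as np
from statsmodels.tsa.seasonal import STL

months = np.arange(108)                           # 2002-2010, monthly samples
trend = -80.0 / 12.0 * months                     # ~ -80 Gt/yr mass loss (assumed)
seasonal = 60.0 * np.sin(2 * np.pi * months / 12) # annual cycle (assumed amplitude)
mass = trend + seasonal + np.random.default_rng(1).normal(0, 5, months.size)

res = STL(mass, period=12, robust=True).fit()
deseasonalized = mass - res.seasonal              # trend + interannual signal
rate = np.polyfit(months / 12.0, deseasonalized, 1)[0]
print(f"fitted trend: {rate:.1f} Gt/yr")          # recovers ~ -80 Gt/yr
```

As the abstract notes, trend estimates from such a series still vary with the chosen time span, which is why robust removal of the seasonal cycle matters before quoting a rate.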