991 results for Combining method


Relevance:

30.00%

Publisher:

Abstract:

This work introduces a new technique for tetrahedral mesh optimization. The procedure relocates boundary and inner nodes without changing the mesh topology. In order to maintain the boundary approximation while boundary nodes are moved, a local refinement of tetrahedra with faces on the solid boundary is necessary in some cases. New nodes are projected onto the boundary by using a surface parameterization. In this work, the proposed method is applied to tetrahedral meshes of genus-zero solids that are generated by the meccano method. In this case, the solid boundary is automatically decomposed into six surface patches, which are mapped onto the six faces of a cube with the Floater parameterization...
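
The node-relocation idea — moving inner nodes while leaving the connectivity untouched — can be illustrated with a minimal Laplacian-smoothing sketch. This is a generic stand-in, not the paper's quality-driven optimization; it omits boundary-node handling and the surface parameterization entirely, and the toy mesh and function names are illustrative assumptions.

```python
import numpy as np

def smooth_inner_nodes(nodes, tets, boundary_ids, iterations=10):
    """Relocate inner nodes by Laplacian averaging over mesh neighbours.

    The mesh topology (the `tets` connectivity array) is never modified;
    only the coordinates of non-boundary nodes move.
    """
    # Build node-to-neighbour adjacency from the tetrahedral connectivity.
    neighbours = {i: set() for i in range(len(nodes))}
    for tet in tets:
        for a in tet:
            for b in tet:
                if a != b:
                    neighbours[a].add(b)

    boundary = set(boundary_ids)
    nodes = nodes.copy()
    for _ in range(iterations):
        updated = nodes.copy()
        for i, nbrs in neighbours.items():
            if i in boundary:
                continue  # boundary nodes are handled separately in the paper
            updated[i] = nodes[list(nbrs)].mean(axis=0)
        nodes = updated
    return nodes

# Single tetrahedron split around an interior node (node 4) as a toy example.
nodes = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [0.4, 0.4, 0.4]], float)
tets = np.array([[0, 1, 2, 4], [0, 1, 3, 4], [0, 2, 3, 4], [1, 2, 3, 4]])
print(smooth_inner_nodes(nodes, tets, boundary_ids=[0, 1, 2, 3]))
```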

Relevance:

30.00%

Publisher:

Abstract:

Reactive halogen compounds are known to play an important role in a wide variety of atmospheric processes such as atmospheric oxidation capacity and coastal new particle formation. In this work, novel analytical approaches combining diffusion denuder/impinger sampling techniques with gas chromatographic–mass spectrometric (GC–MS) determination are developed to measure activated chlorine compounds (HOCl and Cl2), activated bromine compounds (HOBr, Br2, BrCl, and BrI), activated iodine compounds (HOI and ICl), and molecular iodine (I2). The denuder/GC–MS methods have been applied in field measurements in the marine boundary layer (MBL). High mixing ratios (of the order of 100 ppt) of activated halogen compounds and I2 are observed in the coastal MBL in Ireland, which explains the ozone destruction observed there. The emission of I2 is found to correlate inversely with tidal height and positively with the levels of O3 in the surrounding air. In addition, the release is found to be dominated by algal species composition and biomass density, which supports the “hot-spot” hypothesis of atmospheric iodine chemistry. The observations of elevated I2 concentrations substantially support the existence of higher concentrations of littoral iodine oxides and thus the connection to the strong ultra-fine particle formation events in the coastal MBL.
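
The reported relationships (I2 versus tidal height, I2 versus ambient O3) amount to a standard bivariate correlation analysis. A minimal sketch with synthetic data — not the campaign measurements; all values and variable names below are invented for illustration:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
tide_m = rng.uniform(0.5, 4.0, 50)                    # tidal height (m), synthetic
i2_ppt = 120 - 25 * tide_m + rng.normal(0, 5, 50)     # I2 mixing ratio (ppt)
o3_ppb = 20 + 0.1 * i2_ppt + rng.normal(0, 1, 50)     # ambient O3 (ppb)

r_tide, p_tide = pearsonr(tide_m, i2_ppt)   # expected: strong negative correlation
r_o3, p_o3 = pearsonr(o3_ppb, i2_ppt)       # expected: positive correlation
print(f"I2 vs tide: r={r_tide:.2f} (p={p_tide:.1e})")
print(f"I2 vs O3:   r={r_o3:.2f} (p={p_o3:.1e})")
```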

Relevance:

30.00%

Publisher:

Abstract:

Satellite image classification involves designing and developing efficient image classifiers. With satellite image data and image analysis methods multiplying rapidly, selecting the right mix of data sources and data analysis approaches has become critical to the generation of quality land-use maps. In this study, a new postprocessing information fusion algorithm for the extraction and representation of land-use information based on high-resolution satellite imagery is presented. This approach can produce land-use maps with sharp interregional boundaries and homogeneous regions. The proposed approach proceeds in five steps. First, a GIS layer (ATKIS data) was used to generate two coarse homogeneous regions, i.e. urban and rural areas. Second, a thematic (class) map was generated using a hybrid spectral classifier combining the Gaussian Maximum Likelihood (GML) algorithm and the ISODATA classifier. Third, a probabilistic relaxation algorithm was applied to the thematic map, resulting in a smoothed thematic map. Fourth, edge detection and edge thinning techniques were used to generate a contour map with pixel-width interclass boundaries. Fifth, the contour map was superimposed on the thematic map by means of a region-growing algorithm, with the contour map and the smoothed thematic map as two constraints. To operate the proposed method, a software package was developed in the C programming language. This package comprises the GML algorithm, a probabilistic relaxation algorithm, the TBL edge detector, an edge thresholding algorithm, a fast parallel thinning algorithm, and a region-growing information fusion algorithm. The county of Landau in the state of Rheinland-Pfalz, Germany, was selected as a test site. High-resolution IRS-1C imagery was used as the principal input data.
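
The five-step fusion chain lends itself to a compact processing skeleton. The sketch below is a loose stand-in using SciPy primitives: a median filter replaces the probabilistic relaxation, a Sobel gradient magnitude replaces the TBL edge detector and thinning, and a marker-based watershed acts as the constrained region grower. The paper's actual package was written in C; none of its algorithms are reproduced here, and the toy image is invented.

```python
import numpy as np
from scipy import ndimage

def fuse_landuse(image, thematic_map):
    """Toy sketch of the post-classification fusion chain (steps 3-5)."""
    # Step 3 stand-in: smooth the class map (a median filter in place of
    # the paper's probabilistic relaxation algorithm).
    smoothed = ndimage.median_filter(thematic_map, size=3)

    # Step 4 stand-in: gradient-magnitude edges (the paper uses the TBL
    # detector, edge thresholding and a fast parallel thinning algorithm).
    grad = np.hypot(ndimage.sobel(image, axis=0), ndimage.sobel(image, axis=1))
    grad_u8 = (255 * grad / grad.max()).astype(np.uint8)

    # Step 5 stand-in: region growing from confident class cores, with the
    # growth fronts meeting on gradient ridges (the interclass boundaries).
    markers = np.zeros(smoothed.shape, np.int32)
    for c in np.unique(smoothed):
        core = ndimage.binary_erosion(smoothed == c, iterations=2)
        markers[core] = int(c) + 1
    return ndimage.watershed_ift(grad_u8, markers) - 1

rng = np.random.default_rng(1)
img = np.kron([[40.0, 200.0]], np.ones((32, 16))) + rng.normal(0, 5, (32, 32))
fused = fuse_landuse(img, (img > 120).astype(np.int32))
print(np.unique(fused))
```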

Relevance:

30.00%

Publisher:

Abstract:

Genetic evidence indicates that the major gelatinases MMP-2 and MMP-9 are involved in mammalian craniofacial development. Since these matrix metalloproteinases are secreted as proenzymes that require activation, their tissue distribution does not necessarily reflect the sites of enzymatic activity. Information regarding the spatial and temporal expression of gelatinolytic activity in the head of the mammalian embryo is sparse. Sensitive in situ zymography with dye-quenched gelatin (DQ-gelatin) has been introduced recently; gelatinolytic activity results in a local increase in fluorescence. Using frontal sections of wild-type mouse embryo heads from embryonic days 14.5–15.5, we optimized and validated a simple double-labeling in situ technique combining DQ-gelatin zymography with immunofluorescence staining. MMP inhibitors were tested to confirm the specificity of the reaction in situ, and results were compared to standard SDS-gel zymography of tissue extracts. Double-labeling was used to show the spatial relationship in situ between gelatinolytic activity and immunostaining for the gelatinases MMP-2 and MMP-9, collagenase 3 (MMP-13), and MT1-MMP (MMP-14), a major activator of pro-gelatinases. Strong gelatinolytic activity, which partially overlapped with MMP proteins, was confirmed for Meckel's cartilage and developing mandibular bone. In addition, we combined in situ zymography with immunostaining for extracellular matrix proteins that are potential gelatinase substrates. Interestingly, gelatinolytic activity colocalized precisely with laminin-positive basement membranes at specific sites around growing epithelia in the developing mouse head, such as the ducts of salivary glands or the epithelial fold between the tongue and the lower jaw region. Thus, this sensitive method makes it possible to associate gelatinolytic activity with epithelial morphogenesis in the embryo at high spatial resolution.

Relevance:

30.00%

Publisher:

Abstract:

AIMS: Although an added diagnostic and prognostic value of the global coronary artery calcification (CAC) score as an adjunct to single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI) has been repeatedly documented, none of the previous studies took advantage of the anatomic information provided by the unenhanced cardiac CT. Therefore, no co-registration has so far been used to match a myocardial perfusion defect with calcifications in the subtending coronary artery. The aim of this study was to evaluate the prognostic value of integrating SPECT-MPI with CAC images obtained from non-enhanced cardiac computed tomography (CT) for attenuation correction in predicting major adverse cardiac events (MACE). METHODS AND RESULTS: Follow-up was obtained in 462 patients undergoing a 1-day stress/rest (99m)Tc-tetrofosmin SPECT and non-enhanced cardiac CT for attenuation correction. Survival free of MACE was determined using the Kaplan-Meier method. After integrating MPI and CT findings, patients were divided into three groups: (i) MPI defect matched by calcification (CAC ≥ 1) in the subtending coronary artery; (ii) unmatched MPI and CT findings; (iii) normal findings on both MPI and CT. At a mean follow-up of 34.5 ± 13 months, a MACE was observed in 80 patients (33 deaths, 6 non-fatal myocardial infarctions, 9 hospitalizations due to unstable angina, and 32 revascularizations). Survival analysis revealed the most unfavourable outcome (P < 0.001, log-rank test) for patients with a matched finding. CONCLUSION: In the present study, a novel approach integrating SPECT-MPI with CAC imaging allows refined risk stratification, as a matched defect emerged as an independent predictor of MACE.
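
The survival comparison across the three groups is a standard Kaplan-Meier/log-rank setup. A minimal sketch with the Python `lifelines` package and synthetic follow-up data — group assignments, event times, and effect sizes below are invented, not the study data:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

rng = np.random.default_rng(2)
n = 462
# Synthetic groups: 0 = matched defect, 1 = unmatched, 2 = normal MPI and CT.
group = rng.integers(0, 3, n)
# Shorter event-free times for the "matched" group, mirroring the paper's finding.
scale = np.select([group == 0, group == 1, group == 2], [30.0, 60.0, 90.0])
time = np.minimum(rng.exponential(scale), 48.0)   # follow-up in months, censored at 48
event = (time < 48.0).astype(int)                 # MACE observed before censoring

kmf = KaplanMeierFitter()
for g, label in [(0, "matched"), (1, "unmatched"), (2, "normal")]:
    kmf.fit(time[group == g], event[group == g], label=label)
    print(label, "median event-free time:", kmf.median_survival_time_)

# Log-rank test across the three groups (analogous to the paper's P < 0.001).
print(multivariate_logrank_test(time, group, event).p_value)
```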

Relevance:

30.00%

Publisher:

Abstract:

This study develops an automated analysis tool that combines total internal reflection fluorescence microscopy (TIRFM), an evanescent-wave imaging technique used to capture time-sequential images, with corresponding Matlab image-processing code to identify the movements of single individual particles. The developed code enables examination of the two-dimensional hindered tangential Brownian motion of nanoparticles with sub-pixel (nanoscale) resolution. The measured mean square displacements of the nanoparticles are compared with theoretical predictions to estimate particle diameters and fluid viscosity using a nonlinear regression technique; the estimates are then checked against the diameters and viscosities specified by the manufacturers to validate the analysis tool. The nanoparticles used in these experiments are yellow-green polystyrene fluorescent nanospheres (200 nm, 500 nm, and 1000 nm in nominal diameter; 505 nm excitation and 515 nm emission wavelengths). The solutions used are de-ionized (DI) water, 10% d-glucose, and 10% glycerol. Mean square displacements obtained near the surface show significant deviations from theoretical predictions, attributed to DLVO forces in that region, but conform to the predictions beyond ~125 nm from the surface. Unlike traditional measurement techniques that require fixing the cells, the proposed automated analysis tool can be employed in bio-applications such as single-protein (DNA and/or vesicle) tracking, drug delivery, and cytotoxicity studies. Furthermore, the tool can also be applied in microfluidics for non-invasive thermometry, particle tracking velocimetry (PTV), and non-invasive viscometry.
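
The core of this analysis — mean square displacement (MSD) from tracked positions, a linear fit, and Stokes-Einstein inversion — can be sketched compactly. For free 2D diffusion, MSD(τ) = 4Dτ with D = kBT/(3πηd); the near-wall hindrance and DLVO effects discussed above are deliberately omitted, and the trajectory below is simulated rather than tracked from TIRFM images:

```python
import numpy as np
from scipy.optimize import curve_fit

kB, T = 1.380649e-23, 298.0          # Boltzmann constant (J/K), temperature (K)
eta_true, d_true = 1.0e-3, 500e-9    # water viscosity (Pa s), 500 nm sphere
D_true = kB * T / (3 * np.pi * eta_true * d_true)   # Stokes-Einstein

# Simulate a free 2D Brownian trajectory sampled at interval dt.
dt, nsteps = 1e-2, 20000
rng = np.random.default_rng(3)
steps = rng.normal(0, np.sqrt(2 * D_true * dt), size=(nsteps, 2))
pos = np.cumsum(steps, axis=0)

# Time-averaged MSD over a range of lag times.
lags = np.arange(1, 50)
msd = np.array([np.mean(np.sum((pos[l:] - pos[:-l])**2, axis=1)) for l in lags])

# Fit MSD(tau) = 4 D tau, then invert Stokes-Einstein for the diameter.
(D_fit,), _ = curve_fit(lambda tau, D: 4 * D * tau, lags * dt, msd)
d_fit = kB * T / (3 * np.pi * eta_true * D_fit)
print(f"D = {D_fit:.3e} m^2/s, estimated diameter = {d_fit*1e9:.0f} nm")
```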

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE Cognitive impairments are regarded as a core component of schizophrenia. However, the cognitive dimension of psychosis is hardly considered by ultra-high risk (UHR) criteria. Therefore, we studied whether the combination of symptomatic UHR criteria and the basic symptom criterion "cognitive disturbances" (COGDIS) is superior in predicting first-episode psychosis. METHOD In a naturalistic 48-month follow-up study, the conversion rate to first-episode psychosis was studied in 246 outpatients of an early detection of psychosis service (FETZ); thereby, the association of conversion with the combined and the singular use of UHR criteria and COGDIS was compared. RESULTS Patients who met both UHR criteria and COGDIS (n=127) at baseline had a significantly higher risk of conversion (hr=0.66 at month 48) and a shorter time to conversion than patients who met only UHR criteria (n=37; hr=0.28) or only COGDIS (n=30; hr=0.23). Furthermore, the risk of conversion was higher for the combined criteria than for UHR criteria (n=164; hr=0.56 at month 48) and COGDIS (n=158; hr=0.56 at month 48) when each was considered irrespective of the other. CONCLUSIONS Our findings support the merits of considering both COGDIS and UHR criteria in the early detection of persons who are at high risk of developing a first psychotic episode within 48 months. Applying both sets of criteria improves sensitivity and individual risk estimation, and may thereby support the development of stage-targeted interventions. Moreover, since the combined approach enables the identification of considerably more homogeneous at-risk samples, it should support both preventive and basic research.

Relevance:

30.00%

Publisher:

Abstract:

Phase-sensitive X-ray imaging shows a high sensitivity towards electron density variations, making it well suited for imaging of soft-tissue matter. However, there are still open questions about the details of the image formation process. Here, a framework for numerical simulations of phase-sensitive X-ray imaging is presented that takes both the particle- and wave-like properties of X-rays into consideration: a split approach combines a Monte Carlo (MC) based simulation of the sample with a wave-optics-based simulation of the propagation. The framework can be adapted to different phase-sensitive imaging methods, such as grating interferometry or propagation-based imaging, and has been validated through comparisons with experiments for both. The validation shows that the combination of wave optics and MC has been successfully implemented and yields good agreement between measurements and simulations, demonstrating that the physical processes relevant for developing a deeper understanding of scattering in the context of phase-sensitive imaging are modelled sufficiently accurately.
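
The wave-optics half of such a split framework is free-space propagation of a complex wavefront. A minimal angular-spectrum propagator with a Fresnel (paraxial) transfer function is sketched below; the MC sample-interaction part is not modelled, and all parameters (20 keV X-rays, 1 µm pixels, a weak phase disc, 1 m propagation) are illustrative rather than taken from the paper:

```python
import numpy as np

def fresnel_propagate(field, wavelength, pixel, distance):
    """Propagate a complex wavefront by `distance` using the angular
    spectrum method with the Fresnel transfer function."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel)
    fy = np.fft.fftfreq(ny, d=pixel)
    FX, FY = np.meshgrid(fx, fy)
    H = np.exp(-1j * np.pi * wavelength * distance * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Illustrative numbers: 20 keV X-rays (lambda ~ 0.62 A), 1 um pixels, a weakly
# phase-shifting disc; 1 m propagation produces edge-enhanced intensity fringes.
lam, pix = 6.2e-11, 1e-6
y, x = np.mgrid[-256:256, -256:256] * pix
phase = -0.5 * (np.hypot(x, y) < 100e-6)        # pure phase object
field = np.exp(1j * phase)
intensity = np.abs(fresnel_propagate(field, lam, pix, 1.0))**2
print(intensity.min(), intensity.max())          # fringing around the disc edge
```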

Relevance:

30.00%

Publisher:

Abstract:

Continuous theta burst stimulation (cTBS) represents a promising approach in the treatment of neglect syndrome. However, it is not known whether cTBS in conjunction with another technique may enhance the therapeutic effects. In the present sham-controlled study, we aimed to combine cTBS with smooth pursuit training (SPT), another method known to effectively improve neglect symptoms, and to evaluate whether this combination would result in a stronger effect than SPT alone. Eighteen patients with left spatial neglect after right-hemispheric stroke were included in the study and performed a cancellation task on a large 54.6″ touchscreen monitor. A sequential application of cTBS and SPT induced a significantly greater improvement of neglect than SPT alone. After the combined application of the two methods, patients detected significantly more targets, and their cancellation behaviour showed a significantly greater shift towards the contralesional hemispace. We suggest that a combined, sequential application of cTBS and SPT is a promising new approach to treating neglect.

Relevance:

30.00%

Publisher:

Abstract:

Pencil beam scanned (PBS) proton therapy has many advantages over conventional radiotherapy, but its effectiveness for treating mobile tumours remains questionable. Gating dose delivery to the breathing pattern is a well-developed method in conventional radiotherapy for mitigating tumour motion, but its clinical efficiency for PBS proton therapy is not yet well documented. In this study, the dosimetric benefits and the treatment efficiency of beam gating for PBS proton therapy have been comprehensively evaluated. A series of dedicated 4D dose calculations (4DDC) has been performed on 9 different 4DCT(MRI) liver data sets, which provide realistic 4DCTs with motion information extracted from 4DMRI. The value of 4DCT(MRI) lies in its capability of providing not only patient geometries and deformable breathing characteristics, but also variations in the breathing patterns between breathing cycles. In order to monitor target motion and derive a gating signal, we simulate time-resolved beam's eye view (BEV) X-ray images as an online motion surrogate. 4DDCs have been performed using three amplitude-based gating window sizes (10/5/3 mm), with motion surrogates derived from either pre-implanted fiducial markers or the diaphragm. In addition, gating has also been simulated in combination with up to 19 times rescanning, using either volumetric or layered approaches. The quality of the resulting 4DDC plans has been quantified in terms of the plan homogeneity index (HI), total treatment time, and duty cycle. Results show that neither beam gating nor rescanning alone can fully retrieve the plan homogeneity of the static reference plan. Especially for variable breathing patterns, reductions of the effective duty cycle to as low as 10% have been observed with the smallest gating window (3 mm), implying that gating on its own would result in much longer treatment times for such cases. In addition, when rescanning is applied on its own, large differences between volumetric and layered rescanning have been observed as a function of the increasing number of re-scans. However, once gating and rescanning are combined, an HI within 2% of the static plan could be achieved in the clinical target volume, with only moderately prolonged treatment times, irrespective of the rescanning strategy used. Moreover, these results are independent of the motion surrogate used. In conclusion, our results suggest that image-guided beam gating, combined with rescanning, is a feasible, effective and efficient motion mitigation approach for PBS-based liver tumour treatments.
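
Amplitude-based gating keeps the beam on only while the motion surrogate lies within a window around the reference (end-exhale) position, so the effective duty cycle is simply the fraction of time spent inside that window. A toy sketch with a synthetic, cycle-to-cycle variable breathing trace — only the 10/5/3 mm window sizes are taken from the study; everything else is invented:

```python
import numpy as np

def duty_cycle(surrogate_mm, window_mm, baseline_mm=0.0):
    """Fraction of time the surrogate lies within +/- window of baseline,
    i.e. the fraction of time the beam would be gated on."""
    gate_on = np.abs(surrogate_mm - baseline_mm) <= window_mm
    return gate_on.mean()

t = np.linspace(0, 120, 12000)                   # 2 minutes sampled at 100 Hz
amp = 8 + 2 * np.sin(2 * np.pi * t / 30)         # slow cycle-to-cycle amplitude drift
trace = amp * np.sin(2 * np.pi * t / 4.5) ** 2   # ~4.5 s breathing period, exhale at 0

for w in (10.0, 5.0, 3.0):                       # gating windows from the study
    print(f"{w:>4} mm window: duty cycle = {duty_cycle(trace, w):.0%}")
```

As in the study, shrinking the window tightens motion mitigation but cuts the duty cycle, lengthening delivery unless it is combined with an efficient rescanning strategy.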

Relevance:

30.00%

Publisher:

Abstract:

Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and in the spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impacts of spatially variable and spatially correlated elevation errors in high-spatial-resolution DEMs on coastal inundation mapping. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation in elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. Each of 1,000 error simulations was added to the original DEM, and the result was reclassified using a hydrologically correct bathtub method. The probability of inundation under a scenario combining a 1-in-100-year storm event with a 1 m SLR was calculated by counting the proportion of the 1,000 simulations in which a location was inundated. This probabilistic approach can be used in a risk-averse decision-making process by planning for scenarios with different probabilities of occurrence. For example, results showed that when considering a 1% exceedance probability, the inundated area was approximately 11% larger than that mapped using the deterministic bathtub approach. The probabilistic approach provides visually intuitive maps that convey the uncertainties inherent to spatial data and analysis.
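
The probabilistic mapping reduces to adding many simulated error surfaces to the DEM and counting, per cell, how often it falls below the flood level. The sketch below is heavily simplified: it uses uncorrelated Gaussian errors instead of the paper's regression-kriging/sequential-Gaussian-simulation model, and a plain elevation threshold instead of the hydrologically connected bathtub; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
dem = rng.uniform(0.0, 3.0, size=(100, 100))   # synthetic elevations (m)
flood_level = 1.0 + 0.8                        # 1 m SLR + storm surge (illustrative)
sigma = 0.15                                   # assumed DEM error std dev (m)

n_sim = 1000
inundated_count = np.zeros(dem.shape)
for _ in range(n_sim):
    # Paper: spatially correlated errors via regression-kriging + sequential
    # Gaussian simulation; here plain i.i.d. noise for brevity.
    realization = dem + rng.normal(0.0, sigma, dem.shape)
    inundated_count += realization <= flood_level

prob = inundated_count / n_sim
deterministic = (dem <= flood_level).mean()
print(f"deterministic bathtub area fraction: {deterministic:.3f}")
print(f"area fraction with P(inundation) >= 1%: {(prob >= 0.01).mean():.3f}")
```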

Relevance:

30.00%

Publisher:

Abstract:

Effective static analyses have been proposed that infer functions bounding the number of resolutions or reductions. These have the advantage of being independent of the platform on which the programs are executed, and such bounds have been shown to be useful in a number of applications, such as granularity control in parallel execution. On the other hand, in certain distributed computation scenarios where different platforms come into play, each with different capabilities, it is more interesting to express costs in metrics that include the characteristics of the platform. In particular, it is especially interesting to be able to infer upper and lower bounds on actual execution time. With this objective in mind, we propose a method which allows inferring upper and lower bounds on the execution times of the procedures of a program on a given execution platform. The approach combines compile-time cost bounds analysis with a one-time profiling of the platform in order to determine the values of certain constants for that platform. These constants calibrate a cost model which, from then on, is able to statically compute time-bound functions for procedures and to predict with a significant degree of accuracy the execution times of such procedures on the given platform. The approach has been implemented and integrated in the CiaoPP system.
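
The calibration step amounts to fitting one time constant per abstract cost event so that predicted time = Σᵢ cᵢ·nᵢ, where the nᵢ are the counts inferred by the static analysis. A minimal least-squares sketch — the event names, counts, and timings below are invented for illustration, and CiaoPP's actual cost model and profiling harness are richer:

```python
import numpy as np

# Static analysis output: per-procedure counts of abstract cost events
# (e.g. resolutions, unifications, arithmetic ops) -- names are illustrative.
ops = ["resolution", "unification", "arith"]
counts = np.array([
    [120, 340, 50],     # procedure A
    [800, 150, 900],    # procedure B
    [40, 60, 10],       # procedure C
    [500, 500, 500],    # procedure D
], dtype=float)

# One-time profiling of the platform: measured wall-clock times (microseconds).
measured_us = np.array([61.0, 325.0, 15.5, 240.0])

# Calibrate: least-squares fit of one time constant per event type.
constants, *_ = np.linalg.lstsq(counts, measured_us, rcond=None)
for name, c in zip(ops, constants):
    print(f"{name}: {c * 1000:.1f} ns per event")

# From now on, a static count vector yields a time prediction for this platform.
new_counts = np.array([300.0, 200.0, 100.0])
print(f"predicted time: {new_counts @ constants:.1f} us")
```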

Relevance:

30.00%

Publisher:

Abstract:

The method reported in the literature to calculate the stress–strain curve of nuclear fuel cladding from the ring tensile test is revisited in this paper and a new alternative is presented. In the former method, two universal curves are introduced under the assumption of small strain. In this paper it is shown that these curves are not universal, but material-dependent once geometric nonlinearity is taken into account. The new method is valid beyond small strains, takes geometric nonlinearity into consideration, and does not need universal curves. The stress–strain curves in the hoop direction are determined by combining numerical calculations with experimental results in a convergent loop. To this end, ring tensile tests were performed on unirradiated hydrogen-charged samples. The agreement between the simulations and the experimental results is excellent for the range of concentrations tested (up to 2000 wppm hydrogen). The calculated stress–strain curves show that the mechanical properties do not depend strongly on the hydrogen concentration, and that no noticeable strain hardening occurs. However, ductility decreases with the hydrogen concentration, especially beyond 500 wppm hydrogen. The fractographic results indicate that as-received samples fail in a ductile fashion, whereas quasi-cleavage is observed in the hydrogen-charged samples.
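
The convergent numerical-experimental loop repeatedly adjusts the trial material description until the simulated ring-test response reproduces the measurement. The sketch below compresses this to a single scalar "strength" parameter and a mock linear response in place of the geometrically nonlinear finite-element simulation; every number and function here is illustrative, not the paper's procedure.

```python
def simulate_ring_test(strength_mpa):
    """Stand-in for the FE simulation of the ring tensile test: returns a
    predicted peak load (N) for a trial material strength (MPa)."""
    return 0.85 * strength_mpa + 40.0     # fictitious response surface

measured_peak_load = 520.0                # N, illustrative experimental value

# Convergent loop: relax the trial strength towards the measured response.
strength = 400.0                          # initial guess (MPa)
for it in range(100):
    residual = measured_peak_load - simulate_ring_test(strength)
    if abs(residual) < 1e-3:
        break
    strength += 0.5 * residual            # under-relaxed fixed-point update
print(f"converged after {it} iterations: strength = {strength:.1f} MPa")
```

In the actual method the loop updates the whole hoop stress-strain curve, point by point, against the full measured load-displacement record rather than a single scalar.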

Relevance:

30.00%

Publisher:

Abstract:

In this position paper, we claim that the need for time-consuming data preparation and result interpretation tasks in knowledge discovery, as well as for the costly expert consultation and consensus building activities required for ontology building, can be reduced by exploiting the interplay of data mining and ontology engineering. The aim is to obtain, in a semi-automatic way, new knowledge from distributed data sources that can be used for inference and reasoning, as well as to guide the extraction of further knowledge from these data sources. The proposed approach is based on the creation of a novel knowledge discovery method relying on the combination, through an iterative “feedback loop”, of (a) data mining techniques to let implicit models emerge from data and (b) pattern-based ontology engineering to capture these models in reusable, conceptual and inferable artefacts.

Relevance:

30.00%

Publisher:

Abstract:

Modern sensor technologies and simulators applied to large and complex dynamic systems (such as road traffic networks, sets of river channels, etc.) produce large amounts of behavior data that are difficult for users to interpret and analyze. Software tools that generate presentations combining text and graphics can help users understand this data. In this paper we describe the results of our research on automatic multimedia presentation generation (including text, graphics, maps, images, etc.) for interactive exploration of behavior datasets. We designed a novel user interface that combines automatically generated text and graphical resources. We describe the general knowledge-based design of our presentation generation tool. We also present applications that we developed to validate the method, and a comparison with related work.