893 results for Shape optimization method


Relevance: 30.00%

Abstract:

Point Distribution Models (PDM) are among the most popular shape description techniques, and their usefulness has been demonstrated in a wide variety of medical imaging applications. However, to adequately characterize the underlying modeled population it is essential to have a representative number of training samples, which is not always possible. This problem is especially relevant as the complexity of the modeled structure increases, with the modeling of ensembles of multiple 3D organs being one of the most challenging cases. In this paper, we introduce a new GEneralized Multi-resolution PDM (GEM-PDM) for multi-organ analysis, able to efficiently characterize the different inter-object relations as well as the particular locality of each object separately. Importantly, unlike previous approaches, the configuration of the algorithm is automated thanks to a new agglomerative landmark clustering method proposed here, which also allows us to identify smaller anatomically significant regions within organs. The significant advantage of the GEM-PDM method over two previous approaches (PDM and hierarchical PDM) in terms of shape modeling accuracy and robustness to noise has been successfully verified on two different databases of sets of multiple organs: six subcortical brain structures, and seven abdominal organs. Finally, we propose the integration of the new shape modeling framework into an active-shape-model-based segmentation algorithm. The resulting algorithm, named GEMA, provides better overall performance than the two classical approaches tested, ASM and hierarchical ASM, when applied to the segmentation of 3D brain MRI.
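The agglomerative landmark clustering step can be illustrated with a generic bottom-up clustering of 3D landmark coordinates. This is a didactic sketch using standard Ward linkage, not the GEM-PDM algorithm itself; the function name and the two-blob test data are assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_landmarks(landmarks, n_clusters):
    """Group 3D landmarks into spatially coherent regions via
    agglomerative (Ward) clustering -- a generic stand-in for the
    landmark-clustering idea, not the GEM-PDM method itself."""
    Z = linkage(landmarks, method="ward")          # bottom-up merge tree
    return fcluster(Z, t=n_clusters, criterion="maxclust")

# Two well-separated landmark groups should fall into two clusters.
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0.0, 0.1, (20, 3)),
                 rng.normal(5.0, 0.1, (20, 3))])
labels = cluster_landmarks(pts, 2)
```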

Relevance: 30.00%

Abstract:

An in-depth study, using simulations and covariance analysis, is performed to identify the optimal sequence of observations to obtain the most accurate orbit propagation. The accuracy of the results of an orbit determination/improvement process depends on: tracklet length, number of observations, type of orbit, astrometric error, time interval between tracklets, and observation geometry. The latter depends on the position of the object along its orbit and the location of the observing station. This covariance analysis aims to optimize the observation strategy, taking into account the influence of the orbit shape, the relative object-observer geometry, and the interval between observations.
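The central step of such a covariance analysis is linear propagation of the state covariance through the state transition matrix, P' = Φ P Φᵀ. A minimal sketch with illustrative two-state numbers (not the authors' orbital setup):

```python
import numpy as np

def propagate_covariance(P, Phi):
    """Linearly map a state covariance P through the state transition
    matrix Phi: P' = Phi @ P @ Phi.T (standard covariance analysis step)."""
    return Phi @ P @ Phi.T

# Toy 2-state example (position, velocity) over a time step dt:
# position uncertainty grows with dt^2 times the velocity variance.
dt = 10.0
Phi = np.array([[1.0, dt],
                [0.0, 1.0]])
P0 = np.diag([1.0, 0.01])            # initial variances
P1 = propagate_covariance(P0, Phi)
```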

Relevance: 30.00%

Abstract:

Finite element (FE) analysis is an important computational tool in biomechanics. However, its adoption into clinical practice has been hampered by its computational complexity and the high level of technical competence it demands of clinicians. In this paper we propose a supervised learning approach to predict the outcome of the FE analysis. We demonstrate our approach on clinical CT and X-ray femur images for FE prediction (FEP), with features extracted, respectively, from a statistical shape model and from 2D-based morphometric and density information. Using leave-one-out experiments and sensitivity analysis on a database of 89 clinical cases, our method is capable of predicting the distribution of stress values for a walking loading condition with average correlation coefficients of 0.984 and 0.976 for CT and X-ray images, respectively. These findings suggest that supervised learning approaches have the potential to support the clinical integration of mechanical simulations for the treatment of musculoskeletal conditions.
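The supervised-learning idea, mapping image-derived shape features to FE stress outputs, can be sketched with a minimal linear surrogate. The features, targets, and sizes below are synthetic assumptions; the paper's actual features come from a statistical shape model and its predictor is not specified here.

```python
import numpy as np

def fit_linear_surrogate(X, Y):
    """Least-squares map from shape features X (n x p) to FE stress
    outputs Y (n x m) -- a minimal stand-in for a learned FE predictor."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
    W, *_ = np.linalg.lstsq(Xb, Y, rcond=None)
    return W

def predict(W, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return Xb @ W

# Synthetic data: stress targets are a slightly noisy linear
# function of 5 shape features, so the surrogate should fit well.
rng = np.random.default_rng(1)
X = rng.normal(size=(80, 5))
W_true = rng.normal(size=(5, 3))
Y = X @ W_true + 0.01 * rng.normal(size=(80, 3))
W = fit_linear_surrogate(X, Y)
r = np.corrcoef(predict(W, X).ravel(), Y.ravel())[0, 1]
```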

Relevance: 30.00%

Abstract:

We present a novel surrogate model-based global optimization framework allowing a large number of function evaluations. The method, called SpLEGO, is based on a multi-scale expected improvement (EI) framework relying on both sparse and local Gaussian process (GP) models. First, a bi-objective approach relying on a global sparse GP model is used to determine potential next sampling regions. Local GP models are then constructed within each selected region. The method subsequently employs the standard expected improvement criterion to deal with the exploration-exploitation trade-off within the selected local models, leading to a decision on where to perform the next function evaluation(s). The potential of our approach is demonstrated using the so-called Sparse Pseudo-input GP as a global model. The algorithm is tested on four benchmark problems, whose number of starting points ranges from 10^2 to 10^4. Our results show that SpLEGO is effective and capable of solving problems with a large number of starting points, and it even provides significant advantages when compared with state-of-the-art EI algorithms.
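The standard expected improvement criterion mentioned above has a well-known closed form for minimization: EI(x) = (f_min − μ(x))Φ(z) + σ(x)φ(z) with z = (f_min − μ(x))/σ(x), where μ and σ are the GP posterior mean and standard deviation. A minimal sketch; the posterior values below are illustrative placeholders, not SpLEGO's local-model outputs.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_min):
    """Closed-form EI for minimization, given GP posterior mean mu and
    std sigma at candidate points and the best observed value f_min."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    safe = np.where(sigma > 0, sigma, 1.0)
    z = (f_min - mu) / safe
    ei = (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    # At sigma == 0 the EI reduces to the deterministic improvement.
    return np.where(sigma > 0, ei, np.maximum(f_min - mu, 0.0))

# A candidate with a lower predicted mean scores higher, but even a
# worse mean retains a small positive EI through its uncertainty.
ei = expected_improvement(mu=[0.5, 1.5], sigma=[0.2, 0.2], f_min=1.0)
```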

Relevance: 30.00%

Abstract:

Pharmacokinetic and pharmacodynamic properties of a chiral drug can differ significantly between application of the racemate and of the single enantiomers. During drug development, the characteristics of candidate compounds have to be assessed prior to clinical testing. Since biotransformation significantly influences drug actions in an organism, metabolism studies represent a crucial part of such tests. Hence, an optimized and economical capillary electrophoretic method for on-line studies of the enantioselective drug metabolism mediated by cytochrome P450 enzymes was developed. It comprises a diffusion-based procedure, which enables mixing of the enzyme with virtually any compound inside the nanoliter-scale capillary reactor, without the need for additional optimization of mixing conditions. Using CYP3A4, ketamine as probe substrate, and highly sulfated γ-cyclodextrin as chiral selector, improved separation conditions for ketamine and norketamine enantiomers were elucidated, compared to a previously published electrophoretically mediated microanalysis method. The new approach was thoroughly validated for the CYP3A4-mediated N-demethylation pathway of ketamine and applied to the determination of its kinetic parameters and the inhibition characteristics in the presence of ketoconazole and dexmedetomidine. The determined parameters were found to be comparable to literature data obtained with different techniques. The presented method constitutes a miniaturized and cost-effective tool, which should be suitable for the assessment of the stereoselective aspects of kinetic and inhibition studies of cytochrome P450-mediated metabolic steps within early stages of the development of a new drug.
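Enzyme kinetic parameters such as those determined here are conventionally obtained by fitting the Michaelis-Menten model v = Vmax·S/(Km + S) to velocity-versus-substrate data. A generic fitting sketch with synthetic values (the concentrations and parameter values are illustrative, not the paper's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(S, vmax, km):
    """Reaction velocity as a function of substrate concentration S."""
    return vmax * S / (km + S)

# Synthetic velocities generated with known Vmax = 10, Km = 2
# (arbitrary units); the fit should recover both parameters.
S = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
v = michaelis_menten(S, 10.0, 2.0)
(vmax_fit, km_fit), _ = curve_fit(michaelis_menten, S, v, p0=[5.0, 1.0])
```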

Relevance: 30.00%

Abstract:

Diamonds are known for both their beauty and their durability. Jefferson National Lab in Newport News, VA has found a way to utilize the diamond's strength to view the beauty of the inside of the atomic nucleus with the hopes of finding exotic forms of matter. By firing very fast electrons at a diamond sheet no thicker than a human hair, high energy particles of light known as photons are produced with a high degree of polarization that can illuminate the constituents of the nucleus known as quarks. The University of Connecticut Nuclear Physics group has responsibility for crafting these extremely thin, high quality diamond wafers. These wafers must be cut from larger stones that are about the size of a human finger, and then carefully machined down to the final thickness. The thinning of these diamonds is extremely challenging, as the diamond's greatest strength also becomes its greatest weakness. The Connecticut Nuclear Physics group has developed a novel technique to assist industrial partners in assessing the quality of the final machining steps, using a technique based on laser interferometry. The images of the diamond surface produced by the interferometer encode the thickness and shape of the diamond surface in a complex way that requires detailed analysis to extract. We have developed a novel software application to analyze these images based on the method of simulated annealing. Being able to image the surface of these diamonds without requiring costly X-ray diffraction measurements allows rapid feedback to the industrial partners as they refine their thinning techniques. Thus, by utilizing a material found to be beautiful by many, the beauty of nature can be brought more clearly into view.
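The simulated annealing method used by the image-analysis software can be sketched generically: perturb a candidate solution at random, always accept improvements, and accept worse moves with probability exp(−Δ/T) under a decaying temperature T. The toy 1-D objective below is an illustrative assumption, not the interferogram model.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995,
                        iters=5000, seed=0):
    """Minimize f by random perturbation with Metropolis acceptance:
    worse moves are accepted with probability exp(-delta/T)."""
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                      # geometric cooling schedule
    return best_x, best_f

# Toy objective with its global minimum at x = 2.
best_x, best_f = simulated_annealing(lambda x: (x - 2.0) ** 2, x0=-5.0)
```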

Relevance: 30.00%

Abstract:

In this study, multibeam angular backscatter data acquired on the eastern slope of the Porcupine Seabight are analysed. Processing of the angular backscatter data using the 'NRGCOR' software was carried out for 29 locations comprising different geological provinces: carbonate mounds, buried mounds, seafloor channels, and inter-channel areas. A detailed methodology is developed to produce a map of angle-invariant (normalized) backscatter data by correcting the local angular backscatter values. The present paper details the processing steps and related technical aspects of the normalization approach. The presented angle-invariant backscatter map possesses a 12 dB dynamic range in terms of grey scale. A clear distinction is seen between the mound-dominated northern area (Belgica province) and the Gollum channel seafloor at the southern end of the site. Qualitative analyses of the calculated mean backscatter values, i.e., grey scale levels, utilizing angle-invariant backscatter data generally indicate that backscatter is highest (lighter grey scale) in the mound areas, followed by buried mounds. The backscatter values are lowest in the inter-channel areas (lowest grey scale level). Moderate backscatter values (medium grey level) are observed from the Gollum and Kings channel data, with significant variability within the channel seafloor provinces. The channel seafloor provinces are segmented on the basis of the computed grey scale levels for further analysis of the angular backscatter strength. Three major parameters are utilized to classify four different seafloor provinces of the Porcupine Seabight by employing a semi-empirical method to analyse multibeam angular backscatter data. The predicted backscatter response, computed at 20°, is highest for the mound areas. The coefficient of variation (CV) of the mean backscatter response is also highest for the mound areas.
Interestingly, the slope values of the buried mound areas are found to be the highest. However, the channel seafloor, with its moderate backscatter response, presents the lowest slope and CV values. A critical examination of the inter-channel areas indicates less variability within the three estimated parameters.

Relevance: 30.00%

Abstract:

This doctoral thesis explores some of the possibilities that near-field optics can bring to photovoltaics, and in particular to quantum-dot intermediate band solar cells (QD-IBSCs). Our main focus is the analytical optimization of the electric field distribution produced in the vicinity of single scattering particles, in order to produce the highest possible absorption enhancement in the surrounding photovoltaic medium. Near-field scattering structures have also been fabricated in the laboratory, allowing the application of the previously studied theoretical concepts to real devices. We start by looking into the electrostatic scattering regime, which is only applicable to sub-wavelength-sized particles. In this regime it was found that metallic nano-spheroids can produce absorption enhancements of about two orders of magnitude in the material in their vicinity, due to their strong plasmonic resonance. The frequency of this resonance can be tuned with the shape of the particles, allowing us to match it to the optimal transition energies of the intermediate band material. Since these metallic nanoparticles (MNPs) are to be inserted inside the cell's photovoltaic medium, they should be coated with a thin insulating layer to prevent electron-hole recombination at their surface. This analysis is then generalized, using an analytical separation-of-variables method implemented in Mathematica 7.0, to compute scattering by spheroids of any size and material. This code allowed the study of the scattering properties of wavelength-sized particles (mesoscopic regime), and it was verified that in this regime dielectric spheroids perform better than metallic ones. The light intensity scattered from such dielectric spheroids can exceed the incident intensity by more than two orders of magnitude, and the focal region in front of the particle can be shaped in several ways by changing the particle geometry and/or material.
Experimental work was also performed in this PhD to put into practice the concepts studied in the analysis of sub-wavelength MNPs. A wet-coating method was developed to self-assemble regular arrays of colloidal MNPs on the surface of several materials, such as silicon wafers, amorphous silicon films, gallium arsenide and glass. A series of thermal and chemical tests was performed, showing which treatments the nanoparticles can withstand prior to their embedment in a photovoltaic medium. MNP arrays were then inserted in an amorphous silicon medium to study the effect of their plasmonic near-field enhancement on the absorption spectrum of the material. The self-assembled arrays of MNPs constructed in these experiments inspired a new strategy for fabricating IBSCs using colloidal quantum dots (CQDs). Such CQDs can be deposited in self-assembled monolayers, using procedures similar to those developed for the patterning of colloidal MNPs. The use of CQDs to form the intermediate band presents several important practical and physical advantages relative to the conventional dots epitaxially grown by the Stranski-Krastanov method. In addition, this provides a fast and inexpensive method for patterning the binary arrays of QDs and MNPs envisioned in the theoretical part of this thesis, in which the MNPs act as antennas focusing the light into the QDs and thereby boosting their absorption.

Relevance: 30.00%

Abstract:

The technique of Abstract Interpretation has allowed the development of very sophisticated global program analyses which are at the same time provably correct and practical. We present in a tutorial fashion a novel program development framework which uses abstract interpretation as a fundamental tool. The framework uses modular, incremental abstract interpretation to obtain information about the program. This information is used to validate programs, to detect bugs with respect to partial specifications written using assertions (in the program itself and/or in system libraries), to generate and simplify run-time tests, and to perform high-level program transformations such as multiple abstract specialization, parallelization, and resource usage control, all in a provably correct way. In the case of validation and debugging, the assertions can refer to a variety of program points such as procedure entry, procedure exit, points within procedures, or global computations. The system can reason with much richer information than, for example, traditional types. This includes data structure shape (including pointer sharing), bounds on data structure sizes, and other operational variable instantiation properties, as well as procedure-level properties such as determinacy, termination, nonfailure, and bounds on resource consumption (time or space cost). CiaoPP, the preprocessor of the Ciao multi-paradigm programming system, which implements the described functionality, will be used to illustrate the fundamental ideas.
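The core idea of abstract interpretation, executing a program over an abstract domain instead of concrete values, can be illustrated with a tiny sign domain. This is a didactic sketch, far simpler than the rich domains (shape, sharing, cost) that CiaoPP actually implements.

```python
# Abstract sign domain: NEG, ZERO, POS, TOP (sign unknown).
NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

def alpha(n):
    """Abstraction of a concrete integer into the sign domain."""
    return ZERO if n == 0 else (POS if n > 0 else NEG)

def abstract_mul(a, b):
    """Sound abstract multiplication over signs."""
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG

def abstract_add(a, b):
    """Sound abstract addition: x + 0 = x, like signs keep the sign,
    opposite nonzero signs yield TOP (result sign unknown)."""
    if a == ZERO:
        return b
    if b == ZERO:
        return a
    if a == b and a in (POS, NEG):
        return a
    return TOP

# The analysis proves x*x + 1 is positive for ANY negative x,
# without ever running the program on a concrete input.
x = NEG
result = abstract_add(abstract_mul(x, x), alpha(1))
```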

Relevance: 30.00%

Abstract:

We have recently demonstrated a biosensor based on a lattice of SU8 pillars on a 1 μm SiO2/Si wafer, interrogated by measuring vertical reflectivity as a function of wavelength. Biodetection has been proven with the combination of Bovine Serum Albumin (BSA) protein and its antibody (antiBSA). A BSA layer is attached to the pillars; the biorecognition of antiBSA involves a shift in the reflectivity curve related to the concentration of antiBSA. A detection limit on the order of 2 ng/ml is achieved for a rhombic lattice of pillars with a lattice parameter (a) of 800 nm, a height (h) of 420 nm and a diameter (d) of 200 nm. These results correlate with calculations using the 3D finite-difference time-domain method. A simplified 2D model is proposed, consisting of a multilayer model in which the pillars are replaced by a 420 nm layer with an effective refractive index obtained using a Beam Propagation Method (BPM) algorithm. Results provided by this model are in good agreement with experimental data, reducing computation time from one day to 15 minutes. This gives a fast but accurate tool to optimize the design, maximize sensitivity, and analyze the influence of different variables (diameter, height and lattice parameter). Sensitivity is obtained for a variety of configurations, reaching a limit of detection under 1 ng/ml. The optimum design is chosen not only for its sensitivity but also for its feasibility, from both a fabrication standpoint (limited by aspect ratio and pillar proximity) and a fluidic one. (© 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
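Once the pillar lattice is reduced to an effective-index layer, its reflectivity can be evaluated with the standard characteristic (transfer) matrix method for thin-film stacks. The sketch below is a textbook normal-incidence implementation, not the BPM-based model of the paper; the indices and wavelength are illustrative.

```python
import numpy as np

def multilayer_reflectivity(n_layers, d_layers, n_in, n_sub, wavelength):
    """Normal-incidence reflectivity of a thin-film stack via the
    characteristic (transfer) matrix method: [B, C] = M [1, n_sub],
    r = (n_in*B - C) / (n_in*B + C)."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2 * np.pi * n * d / wavelength      # phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B = M[0, 0] + M[0, 1] * n_sub
    C = M[1, 0] + M[1, 1] * n_sub
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# Sanity check: with no layers the stack reduces to the bare Fresnel
# reflectivity between the ambient (n = 1.0) and substrate (n = 1.5).
R = multilayer_reflectivity([], [], n_in=1.0, n_sub=1.5, wavelength=633.0)
```

A quarter-wave layer with index sqrt(n_sub) should act as a perfect antireflection coating, which gives a second easy check on the implementation.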

Relevance: 30.00%

Abstract:

We develop a novel remote sensing technique for the observation of waves on the ocean surface. Our method infers the 3-D waveform and radiance of oceanic sea states via a variational stereo imagery formulation. In this setting, the shape and radiance of the wave surface are given by minimizers of a composite energy functional that combines a photometric matching term with regularization terms involving the smoothness of the unknowns. The desired ocean surface shape and radiance are the solution of a system of coupled partial differential equations derived from the optimality conditions of the energy functional. The proposed method is naturally extended to study the spatiotemporal dynamics of ocean waves and applied to three sets of stereo video data. Statistical and spectral analyses are carried out. Our results provide evidence that the observed omnidirectional wavenumber spectrum S(k) decays as k^-2.5, in agreement with Zakharov's theory (1999). Furthermore, the 3-D spectrum of the reconstructed wave surface is exploited to estimate wave dispersion and currents.
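A decay law such as S(k) ~ k^-2.5 is typically verified by fitting the slope of the spectrum in log-log coordinates. A minimal sketch with synthetic spectrum data (the wavenumber range and amplitude are illustrative, not the paper's measurements):

```python
import numpy as np

def spectral_slope(k, S):
    """Least-squares slope of log S versus log k, i.e. the exponent p
    in S(k) ~ k**p over the fitted wavenumber range."""
    p, _ = np.polyfit(np.log(k), np.log(S), 1)
    return p

# Synthetic omnidirectional spectrum following a k^-2.5 decay;
# the fitted exponent should recover -2.5.
k = np.linspace(0.1, 10.0, 200)
S = 0.3 * k ** -2.5
slope = spectral_slope(k, S)
```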

Relevance: 30.00%

Abstract:

The Simultaneous Multiple Surfaces (SMS) method was developed as a design method in Nonimaging Optics during the 90s. Later, the method was extended to the design of Imaging Optics. We present an overview of the method applied to imaging optics in planar (2D) geometry and compare the results with more classical designs based on achieving aplanatism of different orders. These classical designs can also be viewed as particular cases of SMS designs. Systems with up to 4 aspheric surfaces are shown. The SMS design strategy is shown to always perform better than the classical designs in terms of image quality. Moreover, the SMS method is a direct method, i.e., it is not based on multi-parametric optimization techniques. This gives the SMS method additional interest, since it can be used to explore solutions where multi-parameter techniques can get lost among the multiple local minima.

Relevance: 30.00%

Abstract:

This article focuses on the evaluation of a biometric technique based on performing an identifying gesture while holding a telephone with an embedded accelerometer in the hand. The acceleration signals obtained when users perform gestures are analyzed following a mathematical method based on global sequence alignment. In this article, eight different scores are proposed and evaluated in order to quantify the differences between gestures, obtaining an optimal EER of 3.42% when analyzing a random set of 40 users from a database of 80 users with real forgery attempts. Moreover, a temporal study of the technique is presented, leading to the need to update the template to adapt to the way users modify how they perform their identifying gesture over time. Six updating schemes have been assessed on a database of 22 users repeating their identifying gesture in 20 sessions over 4 months, concluding that the more often the template is updated, the better and more stable the performance of the technique.
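Global sequence alignment of the kind used to compare gestures is classically computed with the Needleman-Wunsch dynamic program. The sketch below aligns short symbol strings standing in for quantized acceleration samples; the scoring values and the idea of quantizing the signal are assumptions, not the paper's eight scores.

```python
def global_alignment_score(a, b, match=1, mismatch=-1, gap=-1):
    """Needleman-Wunsch global alignment score between two symbol
    sequences (e.g. quantized acceleration samples)."""
    n, m = len(a), len(b)
    # dp[i][j] = best score aligning a[:i] with b[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap
    for j in range(1, m + 1):
        dp[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + s,   # align both symbols
                           dp[i - 1][j] + gap,     # gap in b
                           dp[i][j - 1] + gap)     # gap in a
    return dp[n][m]

# Identical gestures score higher than completely different ones.
same = global_alignment_score("AABBA", "AABBA")
diff = global_alignment_score("AABBA", "CCDDC")
```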

Relevance: 30.00%

Abstract:

Modeling the evolution of the state of program memory during program execution is critical to many parallelization techniques. Current memory analysis techniques either provide very accurate information but run prohibitively slowly, or produce very conservative results. An approach based on abstract interpretation is presented for analyzing programs at compile time, which can accurately determine many important program properties such as aliasing, logical data structures and shape. These properties are known to be critical for transforming a single-threaded program into a version that can be run on multiple execution units in parallel. The analysis is shown to be of polynomial complexity in the size of the memory heap. Experimental results for benchmarks in the Jolden suite are given. These results show that in practice the analysis method is efficient and is capable of accurately determining shape information in programs that create and manipulate complex data structures.

Relevance: 30.00%

Abstract:

We present in a tutorial fashion CiaoPP, the preprocessor of the Ciao multi-paradigm programming system, which implements a novel program development framework which uses abstract interpretation as a fundamental tool. The framework uses modular, incremental abstract interpretation to obtain information about the program. This information is used to validate programs, to detect bugs with respect to partial specifications written using assertions (in the program itself and/or in system libraries), to generate and simplify run-time tests, and to perform high-level program transformations such as multiple abstract specialization, parallelization, and resource usage control, all in a provably correct way. In the case of validation and debugging, the assertions can refer to a variety of program points such as procedure entry, procedure exit, points within procedures, or global computations. The system can reason with much richer information than, for example, traditional types. This includes data structure shape (including pointer sharing), bounds on data structure sizes, and other operational variable instantiation properties, as well as procedure-level properties such as determinacy, termination, non-failure, and bounds on resource consumption (time or space cost).