404 results for regularization


Relevance: 10.00%

Abstract:

A modification of the Dubinin-Radushkevich pore filling model by incorporation of the repulsive contribution to the pore potential, and of bulk non-ideality, is proposed in this paper for characterization of activated carbon using liquid phase adsorption. For this purpose experiments have been performed using ethyl propionate, ethyl butyrate, and ethyl isovalerate as adsorbates and the microporous-mesoporous activated carbons Filtrasorb 400, Norit ROW 0.8 and Norit ROX 0.8 as adsorbents. The repulsive contribution to the pore potential is incorporated through a Lennard-Jones intermolecular potential model, and the bulk-liquid phase non-ideality through the UNIFAC activity coefficient model. For the characterization of activated carbons, the generalized adsorption isotherm is utilized with a bimodal gamma function as the pore size distribution function. It is found that the model can represent the experimental data very well, and significantly better than when the classical energy-size relationship is used, or when bulk non-ideality is neglected. Excellent agreement between the bimodal gamma pore size distribution and DFT-cum-regularization based pore size distribution is also observed, supporting the validity of the proposed model. (C) 2001 Elsevier Science Ltd. All rights reserved.
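The bimodal gamma pore size distribution used in the generalized adsorption isotherm can be sketched as a two-component gamma mixture; the shape and scale parameters below are illustrative placeholders, not fitted values from the study.

```python
import math

def gamma_pdf(r, k, theta):
    """Gamma density with shape k and scale theta (r: pore size, e.g. in nm)."""
    return r ** (k - 1) * math.exp(-r / theta) / (math.gamma(k) * theta ** k)

def bimodal_gamma_psd(r, w, k1, t1, k2, t2):
    """Bimodal pore size distribution: a weighted mixture of two gamma modes,
    one for the micropore and one for the mesopore population."""
    return w * gamma_pdf(r, k1, t1) + (1.0 - w) * gamma_pdf(r, k2, t2)
```

In the generalized adsorption isotherm, the total uptake is the local (DR-type) isotherm integrated against this distribution over pore size.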

Relevance: 10.00%

Abstract:

This paper presents a new approach for the design of genuinely finite-length shim and gradient coils, intended for use in magnetic resonance imaging equipment. A cylindrical target region is located asymmetrically, at an arbitrary position within a coil of finite length. A desired target field is specified on the surface of that region, and a method is given that enables winding patterns on the surface of the coil to be designed, to produce the desired field at the inner target region. The method uses a minimization technique combined with regularization, to find the current density on the surface of the coil. The method is illustrated for linear, quadratic and cubic magnetic target fields located asymmetrically within a finite-length coil.
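The minimization-plus-regularization step can be sketched as Tikhonov-regularized least squares; the operator `A` and the basis it acts on are assumptions standing in for the paper's actual discretization of the surface current.

```python
import numpy as np

def design_winding(A, b_target, lam):
    """Tikhonov-regularized least squares: find current-density coefficients j
    minimizing ||A j - b||^2 + lam ||j||^2, where A maps the surface-current
    basis to the field at points in the target region."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b_target)
```

Larger `lam` trades field accuracy for smoother, physically realizable winding patterns.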

Relevance: 10.00%

Abstract:

Radio-frequency (RF) coils are a necessary component of magnetic resonance imaging (MRI) systems. When used in transmit operation, they act to generate a homogeneous RF magnetic field within a volume of interest and when in receive operation, they act to receive the nuclear magnetic resonance signal from the RF-excited specimen. This paper outlines a procedure for the design of open RF coils using the time-harmonic inverse method. This method entails the calculation of an ideal current density on a multipaned planar surface that would generate a specified magnetic field within the volume of interest. Because of the averaging effect of the regularization technique in the matrix solution, the specified magnetic field is shaped within an iterative procedure until the generated magnetic field matches the desired magnetic field. The stream-function technique is used to ascertain conductor positions and a method of moments package is then used to finalize the design. An open head/neck coil was designed to operate in a clinical 2T MRI system and the presented results prove the efficacy of this design methodology.
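The iterative field-shaping loop can be sketched as follows: because the regularized solve averages the field, the *specified* field is nudged by the mismatch each pass until the *generated* field matches the desired one. The matrix `A` and its dimensions are illustrative assumptions.

```python
import numpy as np

def shape_target_field(A, b_desired, lam, n_iter=200):
    """Iterative field shaping (sketch): the specified field b_spec is
    adjusted each pass by the residual so that the regularized solution
    ends up generating the desired field."""
    n = A.shape[1]
    solve = lambda b: np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
    b_spec = b_desired.astype(float).copy()
    for _ in range(n_iter):
        j = solve(b_spec)
        b_gen = A @ j
        b_spec = b_spec + (b_desired - b_gen)  # reshape the specification
    return j, b_gen
```

The loop converges geometrically when the desired field lies in the range of `A`, since the iteration contracts the residual by a factor below one on that subspace.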

Relevance: 10.00%

Abstract:

The use of a fitted parameter watershed model to address water quantity and quality management issues requires that it be calibrated under a wide range of hydrologic conditions. However, rarely does model calibration result in a unique parameter set. Parameter nonuniqueness can lead to predictive nonuniqueness. The extent of model predictive uncertainty should be investigated if management decisions are to be based on model projections. Using models built for four neighboring watersheds in the Neuse River Basin of North Carolina, the application of the automated parameter optimization software PEST in conjunction with the Hydrologic Simulation Program Fortran (HSPF) is demonstrated. Parameter nonuniqueness is illustrated, and a method is presented for calculating many different sets of parameters, all of which acceptably calibrate a watershed model. A regularization methodology is discussed in which models for similar watersheds can be calibrated simultaneously. Using this method, parameter differences between watershed models can be minimized while maintaining fit between model outputs and field observations. In recognition of the fact that parameter nonuniqueness and predictive uncertainty are inherent to the modeling process, PEST's nonlinear predictive analysis functionality is then used to explore the extent of model predictive uncertainty.
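The simultaneous-calibration idea can be sketched with two toy linear watershed models whose parameter difference is penalized; the quadratic objective and its closed-form solve below are an illustrative stand-in for PEST's regularization mode, not its actual algorithm.

```python
import numpy as np

def calibrate_pair(X1, y1, X2, y2, gamma):
    """Joint calibration of two neighbouring-watershed models (sketch):
    minimize ||X1 p1 - y1||^2 + ||X2 p2 - y2||^2 + gamma ||p1 - p2||^2,
    so inter-model parameter differences are penalized while each model
    still fits its own observations."""
    n = X1.shape[1]
    I = np.eye(n)
    H = np.block([[X1.T @ X1 + gamma * I, -gamma * I],
                  [-gamma * I,            X2.T @ X2 + gamma * I]])
    g = np.concatenate([X1.T @ y1, X2.T @ y2])
    p = np.linalg.solve(H, g)
    return p[:n], p[n:]
```

With `gamma = 0` each model is fitted independently; as `gamma` grows, the two parameter sets are driven together, illustrating how nonuniqueness can be resolved toward inter-watershed consistency.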

Relevance: 10.00%

Abstract:

Image segmentation is a ubiquitous task in medical image analysis, required to estimate morphological or functional properties of given anatomical targets. While automatic processing is highly desirable, image segmentation remains to date a supervised process in daily clinical practice. Indeed, challenging data often require user interaction to capture the required level of anatomical detail. To optimize the analysis of 3D images, the user should be able to efficiently interact with the result of any segmentation algorithm to correct any possible disagreement. Building on a previously developed real-time 3D segmentation algorithm, we propose in the present work an extension towards an interactive application where user information can be used online to steer the segmentation result. This enables a synergistic collaboration between the operator and the underlying segmentation algorithm, contributing to higher segmentation accuracy while keeping total analysis time competitive. To this end, we formalize the user interaction paradigm using a geometrical approach, where the user input is mapped to a non-Cartesian space and used to drive the boundary towards the position provided by the user. Additionally, we propose a shape regularization term which improves the interaction with the segmented surface, making the interactive segmentation process less cumbersome. The resulting algorithm offers competitive performance both in terms of segmentation accuracy and total analysis time, contributing to a more efficient use of existing segmentation tools in daily clinical practice. Furthermore, it compares favorably to state-of-the-art interactive segmentation software based on a 3D livewire algorithm.
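The interplay of a user-attraction term and a shape-regularization term can be sketched on a 2D contour; the update rule, weights, and the discrete Laplacian below are illustrative assumptions, not the paper's actual energy.

```python
import numpy as np

def interactive_edit(contour, user_pt, idx, alpha=0.5, beta=0.1, n_iter=20):
    """One interactive editing cycle (sketch): the vertex nearest the user's
    click is pulled toward it (user term), while a discrete Laplacian acts
    as the shape-regularization term keeping the boundary smooth."""
    c = contour.astype(float).copy()
    for _ in range(n_iter):
        c[idx] += alpha * (user_pt - c[idx])                # user attraction
        lap = np.roll(c, 1, axis=0) + np.roll(c, -1, axis=0) - 2.0 * c
        c += beta * lap                                      # smoothness prior
    return c
```

The regularization spreads the user's correction to neighbouring vertices, so a single click adjusts a smooth patch of the boundary rather than one isolated point.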

Relevance: 10.00%

Abstract:

Recently, several distributed video coding (DVC) solutions based on the distributed source coding (DSC) paradigm have appeared in the literature. Wyner-Ziv (WZ) video coding, a particular case of DVC in which side information is made available at the decoder, enables a flexible distribution of the computational complexity between the encoder and decoder, promising to fulfill novel requirements from applications such as video surveillance, sensor networks and mobile camera phones. The quality of the side information at the decoder plays a critical role in determining the WZ video coding rate-distortion (RD) performance, notably in raising it to a level as close as possible to the RD performance of standard predictive video coding schemes. Towards this target, efficient motion search algorithms for powerful frame interpolation are much needed at the decoder. In this paper, the RD performance of a Wyner-Ziv video codec is improved by using novel, advanced motion-compensated frame interpolation techniques to generate the side information. The development of this type of side information estimator is a difficult problem in WZ video coding, especially because the decoder has available only some reference, decoded frames. Based on the regularization of the motion field, novel side information creation techniques are proposed in this paper, along with a new frame interpolation framework able to generate higher quality side information at the decoder. To illustrate the RD performance improvements, this novel side information creation framework has been integrated in a transform-domain turbo-coding-based Wyner-Ziv video codec. Experimental results show that the novel side information creation solution leads to better RD performance than available state-of-the-art side information estimators, with improvements up to 2 dB; moreover, it outperforms H.264/AVC Intra by up to 3 dB with a lower encoding complexity.
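Motion-field regularization can be sketched as iterative smoothing of the block motion vectors before interpolation; the neighbourhood blending below (with wrap-around borders, for brevity) is a simple stand-in for the paper's technique.

```python
import numpy as np

def regularize_motion_field(mv, n_iter=5, w=0.5):
    """Spatial regularization of a block motion-vector field (H x W x 2):
    each vector is blended with the mean of its four neighbours, which
    suppresses spurious vectors that would degrade the interpolated side
    information. Borders wrap around (a simplification)."""
    f = mv.astype(float).copy()
    for _ in range(n_iter):
        nb = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
              np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0
        f = (1.0 - w) * f + w * nb
    return f
```

After regularization, an outlier vector is pulled toward its neighbours while coherent regions are left essentially unchanged.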

Relevance: 10.00%

Abstract:

We compare the magnetic field at the centre and the self-magnetic flux through a current-carrying circular loop, with those obtained for current-carrying polygons with the same perimeter. As the magnetic field diverges at the position of the wires, we compare the self-fluxes utilizing several regularization procedures. The calculation is best performed utilizing the vector potential, thus highlighting its usefulness in practical applications. Our analysis answers some of the intuition challenges students face when they encounter a related simple textbook example. These results can be applied directly to the determination of mutual inductances in a variety of situations.
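The centre-field comparison can be reproduced directly from the Biot-Savart law (the self-flux comparison is omitted here, since it requires the regularization procedures the paper discusses):

```python
import math

MU0 = 4.0 * math.pi * 1e-7  # vacuum permeability (T m / A)

def b_center_circle(current, perimeter):
    """Field at the centre of a circular loop of the given perimeter."""
    radius = perimeter / (2.0 * math.pi)
    return MU0 * current / (2.0 * radius)

def b_center_polygon(current, perimeter, n):
    """Field at the centre of a regular n-gon with the same perimeter:
    the finite-segment Biot-Savart field (mu0 I / 4 pi d)(sin a1 + sin a2)
    of one side, evaluated at the apothem distance, summed over n sides."""
    side = perimeter / n
    apothem = side / (2.0 * math.tan(math.pi / n))
    return n * MU0 * current * math.sin(math.pi / n) / (2.0 * math.pi * apothem)
```

For equal perimeter the polygon field exceeds the circle's by a factor n² sin(π/n) tan(π/n)/π², which tends to 1 from above as n grows: the circle minimizes the centre field among these shapes.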

Relevance: 10.00%

Abstract:

Fluorescence confocal microscopy (FCM) is now one of the most important tools in biomedical research. In fact, it makes it possible to accurately study the dynamic processes occurring inside the cell and its nucleus by following the motion of fluorescent molecules over time. Due to the small amount of acquired radiation and the huge optical and electronic amplification, FCM images are usually corrupted by a severe type of Poisson noise. This noise may be even more damaging when very low intensity incident radiation is used to avoid phototoxicity. In this paper, a Bayesian algorithm is proposed to remove the intensity-dependent Poisson noise corrupting FCM image sequences. The observations are organized in a 3-D tensor where each plane is one of the images acquired over time of a cell nucleus using the fluorescence loss in photobleaching (FLIP) technique. The method removes the noise while jointly accounting for spatial and temporal correlations. This is accomplished by using an anisotropic 3-D filter that may be tuned separately in the space and time dimensions. Tests using synthetic and real data are presented to illustrate the application of the algorithm, together with a comparison against several state-of-the-art algorithms.
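The paper's Bayesian anisotropic filter is not reproduced here; as a simpler illustration of handling intensity-dependent Poisson noise, the Anscombe transform stabilizes the variance so that ordinary Gaussian smoothing (e.g. with separate spatial and temporal widths) can be applied before transforming back.

```python
import math

def anscombe(count):
    """Variance-stabilizing transform: Poisson data -> approximately
    unit-variance Gaussian (accurate for means above a few counts)."""
    return 2.0 * math.sqrt(count + 3.0 / 8.0)

def inverse_anscombe(y):
    """Simple algebraic inverse (slightly biased; exact unbiased inverses
    exist but are more involved)."""
    return (y / 2.0) ** 2 - 3.0 / 8.0
```

This two-step route (stabilize, denoise, invert) is a common baseline against which dedicated Poisson-noise algorithms are compared.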

Relevance: 10.00%

Abstract:

The Schwinger proper-time method is an effective calculation method, explicitly gauge-invariant and nonperturbative. We make use of this method to investigate the radiatively induced Lorentz- and CPT-violating effects in quantum electrodynamics when an axial-vector interaction term is introduced in the fermionic sector. The induced Lorentz- and CPT-violating Chern-Simons term coincides with the one obtained using a covariant derivative expansion but differs from the result usually obtained in other regularization schemes. A possible ambiguity in the approach is also discussed. (C) 2001 Published by Elsevier Science B.V.
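The structure of the induced term can be written schematically as follows; the coefficient C is left unspecified because its value is exactly the scheme-dependent, potentially ambiguous quantity the paper investigates.

```latex
% Axial-vector term added to the fermionic sector:
\mathcal{L}_{b} \;=\; -\,b_{\mu}\,\bar{\psi}\,\gamma^{\mu}\gamma_{5}\,\psi .
% Radiatively induced Lorentz- and CPT-violating Chern-Simons term:
\mathcal{L}_{CS} \;=\; \tfrac{1}{2}\,k_{\mu}\,
  \epsilon^{\mu\nu\lambda\rho}\, A_{\nu}\,\partial_{\lambda} A_{\rho},
\qquad k_{\mu} \;=\; C\, b_{\mu},
```

where C takes different values in different regularization schemes (including zero in some), which is the ambiguity discussed in the abstract.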

Relevance: 10.00%

Abstract:

The Evidence Accumulation Clustering (EAC) paradigm is a clustering ensemble method which derives a consensus partition from a collection of base clusterings obtained using different algorithms. It collects from the partitions in the ensemble a set of pairwise observations about the co-occurrence of objects in the same cluster and uses these co-occurrence statistics to derive a similarity matrix, referred to as the co-association matrix. The Probabilistic Evidence Accumulation for Clustering Ensembles (PEACE) algorithm is a principled approach for extracting a consensus clustering from the observations encoded in the co-association matrix, based on a probabilistic model for that matrix parameterized by the unknown assignments of objects to clusters. In this paper we extend the PEACE algorithm by deriving a consensus solution according to a MAP approach with Dirichlet priors defined over the unknown probabilistic cluster assignments. In particular, we study the positive regularization effect of Dirichlet priors on the final consensus solution with both synthetic and real benchmark data.
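The regularization effect of a Dirichlet prior on a MAP estimate can be seen in the standard closed form for multinomial probabilities; this sketch illustrates the smoothing mechanism, not the PEACE optimization itself.

```python
import numpy as np

def map_cluster_probs(counts, alpha):
    """MAP estimate of cluster-assignment probabilities for one object under
    a Dirichlet(alpha) prior: (n_k + alpha_k - 1) / (N + sum(alpha) - K).
    Requires alpha_k >= 1; alpha_k > 1 pulls the estimate toward uniform."""
    counts = np.asarray(counts, dtype=float)
    alpha = np.asarray(alpha, dtype=float)
    k = counts.size
    return (counts + alpha - 1.0) / (counts.sum() + alpha.sum() - k)
```

With `alpha_k > 1`, clusters with no supporting co-occurrence evidence still retain some probability mass, which is the positive regularization effect studied in the paper.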

Relevance: 10.00%

Abstract:

Dissertation submitted for the degree of Doctor in Mathematics

Relevance: 10.00%

Abstract:

The processes of mobilizing land for infrastructures in the public and private domain follow their own legal frameworks and are systematically confronted with the weak national situation regarding cadastral identification and regularization, which leads to major inefficiencies and sometimes has a very negative impact on overall effectiveness. This project report describes the Ferbritas Cadastre Information System (FBSIC) project and tools, which, in conjunction with other applications, allow managing the entire life cycle of land acquisition and cadastre, including support to field activities with the integration of information collected in the field, the development of multi-criteria analysis information, the monitoring of all information in the exploration stage, and the automated generation of outputs. The benefits are evident at the level of operational efficiency: the tools enable process integration and standardization of procedures, facilitate analysis and quality control, and maximize performance in the acquisition, maintenance and management of registration and expropriation information (expropriation projects). The implemented system therefore achieves levels of robustness, comprehensiveness, openness, scalability and reliability suitable for a structural platform. The resulting solution, FBSIC, is a fit-for-purpose cadastre information system rooted in the field of railway infrastructures. The integrating nature of FBSIC allows it: to meet present needs and scale to future services; to collect, maintain, manage and share all information in one common platform, and transform it into knowledge; to interface with other platforms; and to increase the accuracy and productivity of business processes related to land property management.

Relevance: 10.00%

Abstract:

Integrated research project for the International Master's degree in Sustainability of the Built Environment

Relevance: 10.00%

Abstract:

Keywords: imaging; microwave reconstruction; dielectric contrast; regularization; iterative; multiport cavity measurement

Relevance: 10.00%

Abstract:

Background: Cardiac magnetic resonance imaging provides detailed anatomical information on infarction. However, few studies have investigated the association of these data with mortality after acute myocardial infarction. Objective: To study the association between data regarding infarct size and anatomy, as obtained from cardiac magnetic resonance imaging after acute myocardial infarction, and long-term mortality. Methods: A total of 1959 reports of “infarct size” were identified in 7119 cardiac magnetic resonance imaging studies, of which 420 had clinical and laboratory confirmation of previous myocardial infarction. The variables studied were the classic risk factors – left ventricular ejection fraction, categorized ventricular function, and location of acute myocardial infarction. Infarct size and acute myocardial infarction extent and transmurality were analyzed alone and together, using the variable named “MET-AMI”. The statistical analysis was carried out using elastic net regularization, with the Cox model and survival trees. Results: The mean age was 62.3 ± 12 years, and 77.3% were males. During the mean follow-up of 6.4 ± 2.9 years, there were 76 deaths (18.1%). Serum creatinine, diabetes mellitus and previous myocardial infarction were independently associated with mortality. Age was the main explanatory factor. The cardiac magnetic resonance imaging variables independently associated with mortality were transmurality of acute myocardial infarction (p = 0.047), ventricular dysfunction (p = 0.0005) and infarct size (p = 0.0005); the latter was the main explanatory variable for ischemic heart disease death. The MET-AMI variable was the most strongly associated with risk of ischemic heart disease death (HR: 16.04; 95%CI: 2.64-97.5; p = 0.003).
Conclusion: The anatomical data of infarction, obtained from cardiac magnetic resonance imaging after acute myocardial infarction, were independently associated with long-term mortality, especially for ischemic heart disease death.
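The elastic net regularization used in the survival analysis combines l1 and l2 penalties; the penalty and the soft-thresholding step behind its variable selection can be sketched as follows (illustrative function names, not the study's actual fitting code).

```python
import numpy as np

def elastic_net_penalty(beta, lam, alpha):
    """Elastic net penalty: lam * (alpha * ||b||_1 + (1-alpha)/2 * ||b||_2^2).
    alpha = 1 gives the lasso, alpha = 0 gives ridge regression."""
    beta = np.asarray(beta, dtype=float)
    return lam * (alpha * np.abs(beta).sum()
                  + 0.5 * (1.0 - alpha) * (beta ** 2).sum())

def soft_threshold(z, t):
    """Proximal operator of the l1 part: shrinks coefficients toward zero and
    sets small ones exactly to zero, which is how predictors are dropped."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)
```

Applied to a Cox partial likelihood, this penalty selects a sparse set of predictors (here, the clinical and imaging variables) while stabilizing the correlated ones.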