962 results for Lagrangian pseudo-isotopy


Relevance:

20.00%

Publisher:

Abstract:

A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed-effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed-data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to estimate the likelihood unbiasedly and to perform inference and make decisions using an exact-approximate algorithm. Two estimators are proposed: one based on quasi-Monte Carlo methods and one based on the Laplace approximation with importance sampling. Both of these approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
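As a concrete illustration of the likelihood-estimation step, the sketch below forms an unbiased importance-sampling estimate of the observed-data likelihood for a single block of a simple random-intercept logistic model, using a Gaussian proposal centred at the Laplace mode. This is only a minimal sketch under an assumed model, made-up data and hypothetical function names; it is not the authors' implementation (which also covers a quasi-Monte Carlo variant and parallelisation).

```python
import numpy as np
from scipy import optimize, stats

def log_joint(b, y, x, beta, tau):
    """log p(y | b) + log p(b) for one block: logistic responses with a
    N(0, tau^2) random intercept b."""
    eta = x @ beta + b
    return np.sum(y * eta - np.log1p(np.exp(eta))) + stats.norm.logpdf(b, 0.0, tau)

def is_likelihood_estimate(y, x, beta, tau, n_draws=200, rng=None):
    """Unbiased importance-sampling estimate of the observed-data likelihood of
    one block, using a Gaussian (Laplace-style) proposal centred at the mode."""
    rng = np.random.default_rng() if rng is None else rng
    # Locate the mode of the integrand and approximate its curvature numerically.
    neg = lambda b: -log_joint(b, y, x, beta, tau)
    b_hat = optimize.minimize_scalar(neg).x
    h = 1e-4
    curv = (neg(b_hat + h) - 2.0 * neg(b_hat) + neg(b_hat - h)) / h**2
    prop_sd = 1.0 / np.sqrt(curv)
    # Draw from the proposal and average the importance weights.
    draws = rng.normal(b_hat, prop_sd, size=n_draws)
    log_w = np.array([log_joint(b, y, x, beta, tau) for b in draws])
    log_w -= stats.norm.logpdf(draws, b_hat, prop_sd)
    return float(np.mean(np.exp(log_w)))

# Hypothetical usage on simulated data for a single block of five observations.
rng = np.random.default_rng(1)
x = np.column_stack([np.ones(5), rng.normal(size=5)])
beta, tau = np.array([0.5, -1.0]), 0.8
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(x @ beta + rng.normal(0.0, tau)))))
print(is_likelihood_estimate(y, x, beta, tau, rng=rng))
```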

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a physically motivated reappraisal of manoeuvring models for ships and presents a new model developed from first principles by applying low-aspect-ratio aerodynamic theory and Lagrangian mechanics. The coefficients of the model are shown to be related to physical processes, and validation is presented using results from a planar motion mechanism dataset.

Relevance:

20.00%

Publisher:

Abstract:

The development of microfinance in Vietnam since the 1990s has coincided with remarkable progress in poverty reduction. Numerous descriptive studies have illustrated that microfinance is an effective tool for eradicating poverty in Vietnam, but evidence from quantitative studies is mixed. This study contributes to the literature by providing new evidence on the impact of microfinance on poverty reduction in Vietnam, using repeated cross-sectional data from the Vietnam Living Standards Survey (VLSS) over the period 1992-2010. Our results show that micro-loans contribute significantly to household consumption.

Relevance:

20.00%

Publisher:

Abstract:

In estuaries and natural water channels, estimates of velocity and dispersion coefficients are critical to understanding scalar transport and mixing. Such estimates are rarely available experimentally at sub-tidal time scales in shallow water channels, where high-frequency sampling is required to capture their spatio-temporal variation. This study estimates Lagrangian integral scales and autocorrelation curves, which are key parameters for obtaining velocity fluctuations and dispersion coefficients, and their spatio-temporal variability from deployments of Lagrangian drifters sampled at 10 Hz over a 4-hour period. The power spectral densities of the velocities between 0.0001 and 0.8 Hz were well fitted by the -5/3 slope predicted by Kolmogorov's similarity hypothesis within the inertial subrange, and were similar to the Eulerian power spectra previously observed within the estuary. The results showed that large velocity fluctuations determine the magnitude of the integral time scale, TL. Overlapping short segments improved the stability of the TL estimate by taking advantage of the redundant data included in the autocorrelation function. The integral time scales were about 20 s and varied by up to a factor of 8. These results are essential inputs for spatial binning of velocities, Lagrangian stochastic modelling and single-particle analysis of the tidal estuary.
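A minimal sketch of the key calculation: estimating the Lagrangian integral time scale TL by averaging the velocity-fluctuation autocorrelation over overlapping segments and integrating it up to its first zero crossing. The segment length, overlap, truncation convention and the synthetic AR(1) record standing in for a drifter velocity series are assumptions for illustration, not the study's processing chain.

```python
import numpy as np

def integral_time_scale(u, dt, seg_len, overlap=0.5):
    """Estimate the Lagrangian integral time scale TL from a velocity record.

    The autocorrelation of the velocity fluctuations is computed on overlapping
    segments, averaged, and integrated up to its first zero crossing (one common
    convention for truncating the integral)."""
    step = max(1, int(seg_len * (1.0 - overlap)))
    acfs = []
    for start in range(0, len(u) - seg_len + 1, step):
        fluct = u[start:start + seg_len] - u[start:start + seg_len].mean()
        var = np.dot(fluct, fluct) / seg_len
        if var == 0.0:
            continue
        acf = np.array([np.dot(fluct[:seg_len - k], fluct[k:]) / seg_len
                        for k in range(seg_len // 2)]) / var
        acfs.append(acf)
    mean_acf = np.mean(acfs, axis=0)
    zero = int(np.argmax(mean_acf <= 0.0))          # index of first zero crossing
    head = mean_acf[:zero] if zero > 0 else mean_acf
    return dt * head.sum(), mean_acf                # rectangle-rule integral

# Hypothetical example: a synthetic 10 Hz, 4-hour record with a ~20 s decorrelation time.
rng = np.random.default_rng(0)
dt, n, tl_true = 0.1, 4 * 3600 * 10, 20.0
a = np.exp(-dt / tl_true)
u = np.zeros(n)
for i in range(1, n):                               # AR(1) surrogate velocity series
    u[i] = a * u[i - 1] + rng.normal(0.0, 0.05)
tl_hat, _ = integral_time_scale(u, dt, seg_len=6000)
print(f"estimated TL ~ {tl_hat:.1f} s")
```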

Relevance:

20.00%

Publisher:

Abstract:

We define a new statistical fluid registration method based on Lagrangian mechanics. Although several authors have suggested that empirical statistics on brain variation should be incorporated into the registration problem, few algorithms have included this information, instead using regularizers that guarantee diffeomorphic mappings. Here we combine the advantages of a large-deformation fluid matching approach with empirical statistics on population variability in anatomy. We reformulated the Riemannian fluid algorithm developed in [4], and used a Lagrangian framework to incorporate 0th- and 1st-order statistics in the regularization process. 92 2D midline corpus callosum traces from a twin MRI database were fluidly registered using the non-statistical version of the algorithm (algorithm 0), giving initial vector fields and deformation tensors. Covariance matrices were computed for both distributions and incorporated either separately (algorithms 1 and 2) or together (algorithm 3) in the registration. We computed heritability maps and two vector- and tensor-based distances to compare the power and the robustness of the algorithms.
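To make the statistical ingredient concrete, the sketch below estimates a voxel-wise covariance of population displacement vectors from an initial round of registrations and uses its inverse to score a candidate deformation with a Mahalanobis-type (covariance-weighted) penalty, i.e. a 0th-order statistic; 1st-order (tensor) statistics can be treated analogously. The array shapes, the small regularisation constant and the random stand-in data are assumptions; this is not the authors' code.

```python
import numpy as np

def voxelwise_covariance(fields, eps=1e-6):
    """fields: (n_subjects, n_voxels, 2) displacement vectors from an initial,
    non-statistical round of 2-D registrations.
    Returns the per-voxel mean (n_voxels, 2) and covariance (n_voxels, 2, 2)."""
    mean = fields.mean(axis=0)
    centred = fields - mean
    cov = np.einsum("svi,svj->vij", centred, centred) / (fields.shape[0] - 1)
    cov += eps * np.eye(2)              # stabilise near-singular voxels
    return mean, cov

def mahalanobis_penalty(field, mean, cov):
    """Covariance-weighted (0th-order) penalty on a candidate displacement field."""
    diff = field - mean
    sol = np.linalg.solve(cov, diff[..., None])[..., 0]
    return float(np.sum(diff * sol))

# Hypothetical usage with random stand-in data for 92 subjects on a small 2-D grid.
rng = np.random.default_rng(0)
pop = rng.normal(size=(92, 64 * 64, 2))
mu, sigma = voxelwise_covariance(pop)
print(mahalanobis_penalty(rng.normal(size=(64 * 64, 2)), mu, sigma))
```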

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we used a nonconservative Lagrangian mechanics approach to formulate a new statistical algorithm for fluid registration of 3-D brain images. This algorithm is named SAFIRA, an acronym for statistically assisted fluid image registration algorithm. A nonstatistical version of this algorithm was implemented, where the deformation was regularized by penalizing deviations from a zero rate of strain. In earlier work, the terms regularizing the deformation included the covariance of the deformation matrices (Σ) and of the vector fields (q). Here, we used a Lagrangian framework to reformulate this algorithm, showing that the regularizing terms essentially allow nonconservative work to occur during the flow. Given 3-D brain images from a group of subjects, vector fields and their corresponding deformation matrices are computed in a first round of registrations using the nonstatistical implementation. Covariance matrices for both the deformation matrices and the vector fields are then obtained and incorporated (separately or jointly) into the nonconservative terms, creating four versions of SAFIRA. We evaluated and compared our algorithms' performance on 92 3-D brain scans from healthy monozygotic and dizygotic twins; 2-D validations are also shown for corpus callosum shapes delineated at midline in the same subjects. After preliminary tests to demonstrate each method, we compared their detection power using tensor-based morphometry (TBM), a technique for analyzing local volumetric differences in brain structure. We compared the accuracy of each algorithm variant using various statistical metrics derived from the images and deformation fields. All these tests were also run with a traditional fluid method, which has been widely used in TBM studies. The versions incorporating vector-based empirical statistics on brain variation were consistently more accurate than their counterparts when used for automated volumetric quantification in new brain images, which suggests the advantages of this approach for large-scale neuroimaging studies.
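Tensor-based morphometry summarises local volume change through the determinant of the Jacobian of the deformation. The minimal sketch below computes a log-Jacobian map from a displacement field by central differences; the array layout, grid spacing and synthetic field are assumptions for illustration and are not part of SAFIRA itself.

```python
import numpy as np

def log_jacobian_map(disp, spacing=1.0):
    """Log-determinant of the Jacobian of phi(x) = x + u(x) for a 3-D displacement
    field disp of shape (nx, ny, nz, 3), via central differences.
    Positive values indicate local expansion, negative values contraction."""
    jac = np.empty(disp.shape[:3] + (3, 3))
    for c in range(3):                               # displacement component u_c
        grads = np.gradient(disp[..., c], spacing)   # d u_c / d x_a for a = 0, 1, 2
        for a in range(3):
            jac[..., c, a] = grads[a]
    jac += np.eye(3)                                 # Jacobian of identity + u
    return np.log(np.linalg.det(jac))

# Hypothetical usage: a smooth sinusoidal displacement field on a small grid.
x = np.linspace(0.0, 2.0 * np.pi, 32)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
u = 0.05 * np.stack([np.sin(X), np.sin(Y), np.sin(Z)], axis=-1)
lj = log_jacobian_map(u, spacing=x[1] - x[0])
print("mean log-Jacobian:", float(lj.mean()))
```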

Relevance:

20.00%

Publisher:

Abstract:

Aerosol deposition in cylindrical tubes is a subject of interest to researchers and engineers in many applications of aerosol physics and metrology. Investigation of nano-particles in contexts such as the lungs, upper airways, batteries and vehicle exhaust gases is vital because of their smaller size, adverse health effects and the greater difficulty of trapping them compared with micro-particles. Lagrangian particle tracking provides an effective method for simulating the deposition of nano-particles as well as micro-particles, as it accounts for particle inertia as well as Brownian excitation. However, use of the Lagrangian approach for simulating ultrafine particles has been limited by computational cost and numerical difficulties. In this paper, the deposition of nano-particles in cylindrical tubes under laminar conditions is studied using the Lagrangian particle tracking method. The commercial Fluent software is used to simulate the fluid flow in the pipes and to study the deposition and dispersion of nano-particles. Different particle diameters as well as different flow rates are examined. A point analysis in a uniform flow is performed to validate the Brownian motion model. The results show good agreement between the calculated deposition efficiency and analytic correlations from the literature. Furthermore, for nano-particles with diameters larger than 40 nm, the deposition efficiency calculated by the Lagrangian method is lower than the analytic correlations based on the Eulerian method, owing to statistical error or the inertia effect.
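A greatly simplified Lagrangian random-walk sketch of the same physical picture: particles advected by a laminar Poiseuille profile and jiggled laterally by Brownian steps with Stokes-Einstein diffusivity, counted as deposited when they reach the wall. Particle inertia and the Cunningham slip correction are deliberately neglected, and the tube dimensions, flow rate, time step and particle count are assumptions for illustration; this is not the Fluent-based methodology of the paper.

```python
import numpy as np

KB, T, MU = 1.380649e-23, 293.0, 1.81e-5  # Boltzmann const (J/K), temperature (K), air viscosity (Pa*s)

def track_deposition(d_p, radius, length, u_mean, n_particles=500, dt=5e-4,
                     max_steps=200_000, rng=None):
    """Fraction of particles depositing on the wall of a cylindrical tube.

    Each particle is advected axially by the local laminar (Poiseuille) velocity and
    displaced laterally by an uncorrelated Brownian step with Stokes-Einstein
    diffusivity; particle inertia and slip correction are neglected."""
    rng = np.random.default_rng() if rng is None else rng
    D = KB * T / (3.0 * np.pi * MU * d_p)                 # Stokes-Einstein diffusivity
    sigma = np.sqrt(2.0 * D * dt)                         # per-step Brownian displacement
    r0 = radius * np.sqrt(rng.uniform(size=n_particles))  # uniform over inlet cross-section
    th = rng.uniform(0.0, 2.0 * np.pi, n_particles)
    x, y, z = r0 * np.cos(th), r0 * np.sin(th), np.zeros(n_particles)
    alive = np.ones(n_particles, dtype=bool)
    deposited = 0
    for _ in range(max_steps):
        if not alive.any():
            break
        rr = np.hypot(x[alive], y[alive])
        z[alive] += 2.0 * u_mean * (1.0 - (rr / radius) ** 2) * dt
        x[alive] += rng.normal(0.0, sigma, alive.sum())
        y[alive] += rng.normal(0.0, sigma, alive.sum())
        hit = np.hypot(x, y) >= radius
        deposited += int(np.count_nonzero(hit & alive))
        alive &= ~hit & (z < length)      # drop deposited and exited particles
    return deposited / n_particles

# Hypothetical case: 40 nm particles, 1 mm radius, 0.1 m long tube, 0.1 m/s mean velocity.
print(track_deposition(40e-9, 1e-3, 0.1, 0.1, rng=np.random.default_rng(0)))
```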

Relevance:

20.00%

Publisher:

Abstract:

Recently, efficient scheduling algorithms based on Lagrangian relaxation have been proposed for scheduling parallel machine systems and job shops. In this article, we develop real-world extensions to these scheduling methods. In the first part of the paper, we consider the problem of scheduling single-operation jobs on parallel identical machines and extend the methodology to handle multiple classes of jobs, taking into account setup times and setup costs. The proposed methodology uses Lagrangian relaxation and simulated annealing in a hybrid framework. In the second part of the paper, we consider a Lagrangian relaxation-based method for scheduling job shops and extend it to obtain a scheduling methodology for a real-world flexible manufacturing system with centralized material handling.
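The core of such methods is the subgradient update of the Lagrange multipliers attached to the relaxed coupling constraints. The sketch below shows that update on a tiny, made-up binary problem whose subproblem is solved by brute force; it stands in for the machine-scheduling subproblems only as a generic illustration of the technique, not as the paper's algorithm.

```python
import itertools
import numpy as np

# Toy problem: min c.x  s.t.  A x <= b (complicating constraints),  x in {0, 1}^3.
c = np.array([-5.0, -4.0, -3.0])
A = np.array([[2.0, 3.0, 1.0],
              [4.0, 1.0, 2.0]])
b = np.array([5.0, 6.0])
X = [np.array(v, dtype=float) for v in itertools.product([0, 1], repeat=3)]

def solve_subproblem(lam):
    """Minimise c.x + lam.(A x - b) over the 'easy' set X (here, by enumeration)."""
    vals = [c @ x + lam @ (A @ x - b) for x in X]
    k = int(np.argmin(vals))
    return X[k], vals[k]

lam = np.zeros(len(b))
best_bound = -np.inf
for it in range(1, 101):
    x, dual_val = solve_subproblem(lam)
    best_bound = max(best_bound, dual_val)       # dual value is a valid lower bound
    subgrad = A @ x - b                          # constraint violation at the minimiser
    step = 1.0 / it                              # diminishing step size
    lam = np.maximum(0.0, lam + step * subgrad)  # projected subgradient ascent
print("best Lagrangian lower bound:", round(best_bound, 3))
```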

Relevance:

20.00%

Publisher:

Abstract:

A pseudo-dynamical approach for a class of inverse problems involving static measurements is proposed and explored. Following linearization of the minimizing functional associated with the underlying optimization problem, the new strategy results in a system of linearized ordinary differential equations (ODEs) whose steady-state solutions yield the desired reconstruction. We consider some explicit and implicit schemes for integrating the ODEs and thus establish a deterministic reconstruction strategy without an explicit use of regularization. A stochastic reconstruction strategy is then developed making use of an ensemble Kalman filter wherein these ODEs serve as the measurement model. Finally, we assess the numerical efficacy of the developed tools against a few linear and nonlinear inverse problems of engineering interest.
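For the linear case, the flavour of the approach can be illustrated by the ODE system dx/dt = -K^T (K x - y), whose steady state satisfies the normal equations K^T K x = K^T y; integrating it with explicit Euler reduces to Landweber-type iteration, with early termination acting as implicit regularization. The toy operator, step size and data below are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

def pseudo_dynamic_solve(K, y, dt=None, n_steps=20_000):
    """Integrate dx/dt = -K^T (K x - y) with explicit Euler towards steady state;
    the steady state satisfies the normal equations K^T K x = K^T y."""
    if dt is None:
        dt = 1.0 / np.linalg.norm(K, 2) ** 2      # stable step for explicit Euler
    x = np.zeros(K.shape[1])
    for _ in range(n_steps):
        x -= dt * (K.T @ (K @ x - y))
    return x

# Hypothetical toy problem: a smoothing operator and noisy static "measurements".
rng = np.random.default_rng(0)
n = 40
K = np.exp(-0.5 * ((np.arange(n)[:, None] - np.arange(n)[None, :]) / 3.0) ** 2)
x_true = np.sin(np.linspace(0.0, 3.0 * np.pi, n))
y = K @ x_true + 1e-3 * rng.normal(size=n)
x_rec = pseudo_dynamic_solve(K, y)
print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```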

Relevance:

20.00%

Publisher:

Abstract:

The need for reexamination of the standard model of strong, weak, and electromagnetic interactions is discussed, especially with regard to 't Hooft's criterion of naturalness. It has been argued that theories with fundamental scalar fields tend to be unnatural at relatively low energies. There are two solutions to this problem: (i) a global supersymmetry, which ensures the absence of all the naturalness-violating effects associated with scalar fields, and (ii) composite structure of the scalar fields, which starts showing up at energy scales where unnatural effects would otherwise have appeared. With reference to the second solution, this article reviews the case for dynamical breaking of the gauge symmetry and the technicolor scheme for the composite Higgs boson. This new interaction, of the scaled-up quantum chromodynamic type, keeps the new set of fermions, the technifermions, together in the Higgs particles. It also provides masses for the electroweak gauge bosons W± and Z0 through technifermion condensate formation. In order to give masses to the ordinary fermions, a new interaction, the extended technicolor interaction, which would connect the ordinary fermions to the technifermions, is required. The extended technicolor group breaks down spontaneously to the technicolor group, possibly as a result of the "tumbling" mechanism, which is discussed here. In addition, the author presents schemes for the isospin breaking of mass matrices of ordinary quarks in the technicolor models. In generalized technicolor models with more than one doublet of technifermions or with more than one technicolor sector, we have additional low-lying degrees of freedom, the pseudo-Goldstone bosons. The pseudo-Goldstone bosons in the technicolor model of Dimopoulos are reviewed and their masses computed. In this context the vacuum alignment problem is also discussed. An effective Lagrangian is derived describing colorless low-lying degrees of freedom for models with two technicolor sectors in the combined limits of chiral symmetry and large number of colors and technicolors. Finally, the author discusses suppression of flavor-changing neutral currents in the extended technicolor models.

Relevance:

20.00%

Publisher:

Abstract:

Arylalkylcyclopropenethiones undergo highly regioselective photochemical α-cleavage via thioketene carbene intermediates, giving rise to products derived from the less stabilized carbene. UHF MINDO/3 calculations provide insight into this unexpected regioselectivity. The nπ* triplet of cyclopropenethione is calculated to have a highly unsymmetrical geometry with an elongated C-C bond, a delocalized thiaallyl fragment, and a pyramidal radicaloid carbon (which eventually becomes the carbene center). From this molecular electronic structure, aryl group stabilization is expected to be more effective at the thiaallyl group than at the pyramidal radical center. Thus, the stability of the substituted triplet thione rather than that of the thioketene carbene determines the preferred regiochemistry of cleavage. The unusual structure of the cyclopropenethione triplet is suggested to be related to one of the Jahn-Teller distorted forms of the cyclopropenyl radical. An alternative symmetrical structure is adopted by the corresponding triplet of cyclopropenone, partly accounting for its differing photobehavior. A similar structural dichotomy is demonstrated for the corresponding radical anions as well.

Relevance:

20.00%

Publisher:

Abstract:

Pseudo-marginal methods such as the grouped independence Metropolis-Hastings (GIMH) and Markov chain within Metropolis (MCWM) algorithms have been introduced in the literature as an approach to performing Bayesian inference in latent variable models. These methods replace intractable likelihood calculations with unbiased estimates within Markov chain Monte Carlo algorithms. The GIMH method has the posterior of interest as its limiting distribution, but suffers from poor mixing if it is too computationally intensive to obtain high-precision likelihood estimates. The MCWM algorithm has better mixing properties, but less theoretical support. In this paper we propose to use Gaussian processes (GPs) to accelerate the GIMH method, using a short pilot run of MCWM to train the GP. Our new method, GP-GIMH, is illustrated on simulated data from a stochastic volatility model and a gene network model.
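To illustrate the idea (this is not the paper's implementation), the sketch below fits a plain RBF-kernel Gaussian process to noisy log-likelihood estimates collected at pilot-run parameter values, then uses the GP posterior mean as the log-likelihood inside a random-walk Metropolis-Hastings sampler. The kernel, its fixed hyper-parameters, the one-parameter toy target and the uniform prior are all assumptions.

```python
import numpy as np

def rbf(A, B, ell=0.5, sf=1.0):
    """Squared-exponential kernel between parameter sets A (m, d) and B (n, d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

class GPLogLik:
    """GP surrogate for a noisily estimated log-likelihood surface."""
    def __init__(self, theta, loglik_hat, noise_var=0.5**2):
        self.X, self.m = theta, loglik_hat.mean()
        K = rbf(theta, theta) + noise_var * np.eye(len(theta))
        self.alpha = np.linalg.solve(K, loglik_hat - self.m)
    def __call__(self, th):
        return float(self.m + rbf(th[None, :], self.X) @ self.alpha)

# --- Hypothetical pilot run: noisy log-likelihood estimates around the truth.
rng = np.random.default_rng(0)
true_loglik = lambda th: -0.5 * ((th - 1.0) ** 2).sum() / 0.3**2
pilot_theta = rng.uniform(0.0, 2.0, size=(60, 1))
pilot_ll = np.array([true_loglik(t) + rng.normal(0.0, 0.5) for t in pilot_theta])
gp = GPLogLik(pilot_theta, pilot_ll)

# --- Random-walk Metropolis-Hastings using the GP mean as the log-likelihood.
log_prior = lambda th: 0.0 if 0.0 <= th[0] <= 2.0 else -np.inf
th, chain = np.array([0.5]), []
lp = gp(th) + log_prior(th)
for _ in range(5000):
    prop = th + rng.normal(0.0, 0.1, size=1)
    lp_prop = gp(prop) + log_prior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        th, lp = prop, lp_prop
    chain.append(th[0])
print("approximate posterior mean ~", round(float(np.mean(chain[1000:])), 2))
```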

Relevance:

20.00%

Publisher:

Abstract:

A pair of Latin squares, A and B, of order n, is said to be pseudo-orthogonal if each symbol in A is paired with every symbol in B precisely once, except for one symbol with which it is paired twice and one symbol with which it is not paired at all. A set of t Latin squares of order n is said to be mutually pseudo-orthogonal if its members are pairwise pseudo-orthogonal. A special class of pseudo-orthogonal Latin squares is the mutually nearly orthogonal Latin squares (MNOLS), first discussed in 2002, with general constructions given in 2007. In this paper we develop row-complete MNOLS from difference covering arrays. We use this connection to settle the spectrum question for sets of 3 mutually pseudo-orthogonal Latin squares of even order, for all orders except 146.
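The definition translates directly into a check on symbol-pairing counts, as in the minimal sketch below. The array encoding and the order-5 cyclic example are assumptions for illustration; the example pair happens to be fully orthogonal, so the pseudo-orthogonality check correctly returns False.

```python
import numpy as np
from itertools import product

def pairing_counts(A, B):
    """counts[a, b] = number of cells where square A holds symbol a and B holds symbol b."""
    n = len(A)
    counts = np.zeros((n, n), dtype=int)
    for i, j in product(range(n), repeat=2):
        counts[A[i][j], B[i][j]] += 1
    return counts

def are_pseudo_orthogonal(A, B):
    """Check the definition above: every symbol of A meets each symbol of B exactly
    once, except for one symbol met twice and one symbol not met at all."""
    n = len(A)
    target = [0] + [1] * (n - 2) + [2]
    return all(sorted(row) == target for row in pairing_counts(A, B))

# Hypothetical usage with two cyclic Latin squares of order 5 (symbols 0..4).
# These two are fully orthogonal (every pair occurs exactly once), so the check is False.
A = [[(i + j) % 5 for j in range(5)] for i in range(5)]
B = [[(i + 2 * j) % 5 for j in range(5)] for i in range(5)]
print(are_pseudo_orthogonal(A, B), bool(np.all(pairing_counts(A, B) == 1)))
```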

Relevance:

20.00%

Publisher:

Abstract:

The paper presents a method for evaluating the external stability of reinforced soil walls subjected to earthquakes within the framework of the pseudo-dynamic method. The seismic reliability of the wall is evaluated by considering the different possible failure modes: sliding along the base, overturning about the toe of the wall, bearing capacity failure and eccentricity of the resultant force. The analysis treats the properties of the reinforced backfill, the foundation soil below the base of the wall, the length of the geosynthetic reinforcement and characteristics of the earthquake ground motion, such as the shear-wave and primary-wave velocities, as random variables. The optimum length of reinforcement needed to maintain stability against the four modes of failure is obtained by targeting various component reliability indices. Differences between the pseudo-static and pseudo-dynamic methods are clearly highlighted in the paper. A complete analysis of the pseudo-static and pseudo-dynamic methodologies shows that the pseudo-dynamic method results in realistic design values for the length of geosynthetic reinforcement under earthquake conditions.
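For a single failure mode, a component reliability index can be illustrated with a crude Monte Carlo estimate of the failure probability p_f and beta = -Phi^{-1}(p_f). The limit-state function and distributions below are generic placeholders (resistance minus demand with made-up parameters), not the paper's pseudo-dynamic sliding, overturning, bearing-capacity or eccentricity formulations.

```python
import numpy as np
from scipy.stats import norm

def reliability_index(limit_state, sample_inputs, n=200_000, rng=None):
    """Monte Carlo estimate of the component reliability index beta = -Phi^-1(p_f),
    where p_f = P(limit_state(X) < 0) for random inputs X."""
    rng = np.random.default_rng() if rng is None else rng
    g = limit_state(sample_inputs(n, rng))
    p_f = np.count_nonzero(g < 0) / n
    return -norm.ppf(p_f), p_f

# Hypothetical placeholder limit state for a sliding-type mode: g = resistance - demand,
# with lognormal resistance and normally distributed demand (units kN/m, values made up).
def sample_inputs(n, rng):
    resistance = rng.lognormal(mean=np.log(300.0), sigma=0.15, size=n)
    demand = rng.normal(loc=180.0, scale=30.0, size=n)
    return resistance, demand

beta, p_f = reliability_index(lambda X: X[0] - X[1], sample_inputs,
                              rng=np.random.default_rng(0))
print(f"p_f ~ {p_f:.4f}, beta ~ {beta:.2f}")
```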