989 results for Monte Alegre - PA


Relevance:

20.00%

Publisher:

Abstract:

The nonequilibrium phase transition has been studied by Monte Carlo simulation in a ferromagnetically interacting (nearest-neighbour) kinetic Ising model in the presence of a sinusoidally oscillating magnetic field. The ('specific-heat') temperature derivative of the energy (averaged over a full cycle of the oscillating field) diverges near the dynamic transition point.
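
As an illustration of the kind of simulation described above, the following minimal sketch runs Metropolis dynamics for a 2D nearest-neighbour Ising lattice in a sinusoidal field and records the cycle-averaged energy whose temperature derivative peaks near the dynamic transition. The lattice size, field amplitude, period, and sweep counts are illustrative assumptions, not the parameters used in the study.

    import numpy as np

    def cycle_averaged_energy(L=32, T=2.0, h0=0.3, period=100, n_cycles=50, seed=0):
        """Metropolis dynamics of a 2D Ising model in a field h(t) = h0*sin(2*pi*t/period).
        Returns the energy per spin averaged over full field cycles (transient cycles discarded)."""
        rng = np.random.default_rng(seed)
        spins = rng.choice([-1, 1], size=(L, L))
        energies = []
        for t in range(n_cycles * period):
            h = h0 * np.sin(2 * np.pi * t / period)
            for _ in range(L * L):                      # one Monte Carlo sweep
                i, j = rng.integers(L, size=2)
                nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                      + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
                dE = 2 * spins[i, j] * (nn + h)         # J = 1, k_B = 1
                if dE <= 0 or rng.random() < np.exp(-dE / T):
                    spins[i, j] *= -1
            if t >= 10 * period:                        # discard transient cycles
                bond_sum = np.sum(spins * (np.roll(spins, 1, 0) + np.roll(spins, 1, 1)))
                energies.append((-bond_sum - h * spins.sum()) / L**2)
        return np.mean(energies)

    # Scanning T and differentiating the returned value numerically locates the peak
    # in the 'specific-heat'-like quantity near the dynamic transition point.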

Relevance:

20.00%

Publisher:

Abstract:

The compositional evolution in sputter-deposited LiCoO2 thin films is influenced by the process parameters involved during deposition. The electrochemical performance of these films strongly depends on their microstructure, preferential orientation and stoichiometry. The transport process of sputtered Li and Co atoms from the LiCoO2 target to the substrate, through Ar plasma in a planar magnetron configuration, was investigated based on the Monte Carlo technique. The effect of sputtering gas pressure and the substrate-target distance (dst) on the Li/Co ratio, as well as the energy and angular distribution of sputtered atoms on the substrate, were examined. Stable Li/Co ratios have been obtained at 5 Pa pressure and dst in the range 5-11 cm. The kinetic energy and incident angular distribution of Li and Co atoms reaching the substrate have been found to be dependent on sputtering pressure. Simulations were extended to predict compositional variations in films prepared at various process conditions. These results were compared with the composition of films determined experimentally using x-ray photoelectron spectroscopy (XPS). The Li/Co ratio calculated using XPS was in moderate agreement with the simulated value. The measured film thickness followed the same trend as predicted by simulation. These studies are shown to be useful in understanding the complexities in multicomponent sputtering. (C) 2011 American Institute of Physics. [doi:10.1063/1.3597829]
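
The transport calculation can be pictured with a toy Monte Carlo of gas-phase transport (not the authors' code): sputtered atoms leave the target with a Thompson-like energy spectrum and cosine emission, travel exponentially distributed free paths set by the Ar pressure, and lose energy in hard-sphere collisions until they cross the substrate plane. The cross-section, masses, binding energy, and distances below are illustrative assumptions.

    import numpy as np

    def transport(n_atoms=20000, pressure_pa=5.0, d_st=0.08, T_gas=300.0,
                  E_b=4.0, m_atom=59.0, m_gas=40.0, sigma=3e-19, seed=1):
        """Toy Monte Carlo of sputtered-atom transport through Ar gas.
        Atoms start with a Thompson-like energy spectrum (surface binding energy E_b, eV)
        and cosine emission, travel exponentially distributed free paths, and lose energy
        in hard-sphere collisions until they cross the substrate plane at d_st (m).
        Returns (arrival fraction, mean arrival energy in eV)."""
        rng = np.random.default_rng(seed)
        kB = 1.380649e-23
        n_gas = pressure_pa / (kB * T_gas)        # ideal-gas number density of Ar
        mfp = 1.0 / (n_gas * sigma)               # mean free path for cross-section sigma
        arrived = []
        for _ in range(n_atoms):
            u = rng.random()
            E = E_b * u / (1.0 - u + 1e-12)       # Thompson distribution by inverse transform
            mu = np.sqrt(rng.random())            # cosine angular emission (direction cosine)
            z = 0.0
            while 0.0 <= z < d_st and E > 0.01:
                z += mu * rng.exponential(mfp)
                if z >= d_st:
                    arrived.append(E)             # reached the substrate
                elif z >= 0.0:
                    # hard-sphere collision with a near-stationary Ar atom:
                    # mean fractional energy loss 2*m1*m2/(m1+m2)^2
                    E *= 1.0 - 2.0 * m_atom * m_gas / (m_atom + m_gas) ** 2
                    mu = 2.0 * rng.random() - 1.0  # isotropic redirection
        frac = len(arrived) / n_atoms
        return frac, (np.mean(arrived) if arrived else 0.0)

    # Lower pressure -> fewer collisions -> higher arrival flux and energy:
    print(transport(pressure_pa=1.0), transport(pressure_pa=5.0))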

Relevance:

20.00%

Publisher:

Abstract:

An improved Monte Carlo technique is presented in this work to simulate nanoparticle formation through a micellar route. The technique builds on the simulation technique proposed by Bandyopadhyaya et al. (Langmuir 2000, 16, 7139), which is general and rigorous but at the same time very computation intensive, so much so that nanoparticle formation in low-occupancy systems cannot be simulated in reasonable time. In view of this, several strategies, rationalized by simple mathematical analyses, are proposed to accelerate Monte Carlo simulations. These are elimination of infructuous events, removal of excess reactant postreaction, and use of a smaller micelle population a large number of times. Infructuous events include collision of an empty micelle with another empty one or with one containing only one molecule or only a solid particle. These strategies are incorporated in a new simulation technique which divides the entire micelle population into four classes and shifts micelles from one class to another as the simulation proceeds. The simulation results, thoroughly tested using chi-square and other tests, show that the predictions of the improved technique remain unchanged, but with more than an order of magnitude decrease in computational effort for some of the simulations reported in the literature. An a posteriori validation scheme for the correctness of the simulation results has been utilized to propose a new simulation strategy to arrive at converged simulation results with near-minimum computational effort.
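
The bookkeeping idea behind the acceleration can be sketched as follows (a simplified stand-in for the full four-class scheme, not the authors' complete algorithm): micelles are grouped by their contents, collisions between classes that cannot change any micelle's state are never generated, and colliding pairs are drawn in proportion to the number of fruitful pairings. The class labels and counts below are illustrative assumptions.

    import random
    from collections import Counter

    # Simplified micelle classes, a stand-in for the four classes mentioned above:
    # 'E' = empty, 'M' = a single reactant molecule, 'P' = a solid particle only,
    # 'X' = everything else (contents that can still react or grow)
    INFRUCTUOUS = {frozenset({'E'}), frozenset({'E', 'M'}), frozenset({'E', 'P'})}

    def fruitful(a, b):
        """A collision is infructuous (and can be skipped) if the pairing cannot
        change either micelle's contents."""
        return frozenset({a, b}) not in INFRUCTUOUS

    def draw_fruitful_pair(counts, rng=random):
        """Draw a colliding pair of classes with probability proportional to the
        number of micelle pairs of that kind, restricted to fruitful combinations;
        an accelerated simulation only ever processes these events."""
        classes = list(counts)
        pairs, weights = [], []
        for i, a in enumerate(classes):
            for b in classes[i:]:
                if not fruitful(a, b):
                    continue
                w = counts[a] * counts[b] if a != b else counts[a] * (counts[a] - 1) // 2
                if w > 0:
                    pairs.append((a, b))
                    weights.append(w)
        return rng.choices(pairs, weights=weights, k=1)[0]

    # Low-occupancy system: almost all micelles are empty, so almost all naive
    # collision events would be infructuous and are never generated here.
    counts = Counter({'E': 9000, 'M': 600, 'P': 300, 'X': 100})
    print(draw_fruitful_pair(counts))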

Relevance:

20.00%

Publisher:

Abstract:

Given an unweighted undirected or directed graph with n vertices, m edges and edge connectivity c, we present a new deterministic algorithm for edge splitting. Our algorithm splits-off any specified subset S of vertices satisfying standard conditions (even degree for the undirected case and in-degree ≥ out-degree for the directed case) while maintaining connectivity c for vertices outside S in Õ(m + nc^2) time for an undirected graph and Õ(mc) time for a directed graph. This improves the current best deterministic time bounds due to Gabow [8], who splits-off a single vertex in Õ(nc^2 + m) time for an undirected graph and Õ(mc) time for a directed graph. Further, for appropriate ranges of n, c, |S| it improves the current best randomized bounds due to Benczúr and Karger [2], who split-off a single vertex in an undirected graph in Õ(n^2) Monte Carlo time. We give two applications of our edge splitting algorithms. Our first application is a sub-quadratic (in n) algorithm to construct Edmonds' arborescences. A classical result of Edmonds [5] shows that an unweighted directed graph with c edge-disjoint paths from any particular vertex r to every other vertex has exactly c edge-disjoint arborescences rooted at r. For a c edge-connected unweighted undirected graph, the same theorem holds on the digraph obtained by replacing each undirected edge by two directed edges, one in each direction. The current fastest construction of these arborescences by Gabow [7] takes Õ(n^2 c^2) time. Our algorithm takes Õ(nc^3 + m) time for the undirected case and Õ(nc^4 + mc) time for the directed case. The second application of our splitting algorithm is a new Steiner edge connectivity algorithm for undirected graphs which matches the best known bound of Õ(nc^2 + m) time due to Bhalgat et al. [3]. Finally, our algorithm can also be viewed as an alternative proof for existential edge splitting theorems due to Lovász [9] and Mader [11].
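
For readers unfamiliar with Edmonds' theorem, the quantity c it refers to, the edge connectivity from the root r to every other vertex, can be computed naively as a minimum of unit-capacity max-flows; the sketch below (using networkx, far slower than the bounds quoted above) only illustrates that quantity, not the splitting-off algorithm itself. The toy graph is an assumed example.

    import networkx as nx

    def root_connectivity(G, r):
        """The c in Edmonds' theorem: the minimum, over all other vertices v, of the
        number of edge-disjoint r->v paths, computed here naively as unit-capacity
        max-flows (illustration only; nowhere near the Õ(nc^3 + m) bound above)."""
        return min(nx.maximum_flow_value(G, r, v, capacity='capacity')
                   for v in G.nodes if v != r)

    # Toy digraph with two edge-disjoint paths from r to every other vertex:
    G = nx.DiGraph()
    edges = [('r', 'a'), ('r', 'b'), ('a', 'c'), ('b', 'c'),
             ('a', 'b'), ('b', 'a'), ('c', 'a'), ('c', 'b')]
    for u, v in edges:
        G.add_edge(u, v, capacity=1)
    print(root_connectivity(G, 'r'))  # -> 2, so exactly 2 edge-disjoint arborescences rooted at r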

Relevance:

20.00%

Publisher:

Abstract:

Given an undirected unweighted graph G = (V, E) and an integer k ≥ 1, we consider the problem of computing the edge connectivities of all those (s, t) vertex pairs whose edge connectivity is at most k. We present an algorithm with expected running time Õ(m + nk^3) for this problem, where |V| = n and |E| = m. Our output is a weighted tree T whose nodes are the sets V1, V2, ..., Vl of a partition of V, with the property that the edge connectivity in G between any two vertices s ∈ Vi and t ∈ Vj, for i ≠ j, is equal to the weight of the lightest edge on the path between Vi and Vj in T. Also, two vertices s and t belong to the same Vi for any i if and only if they have an edge connectivity greater than k. Currently, the best algorithm for this problem needs to compute all-pairs min-cuts in an O(nk) edge graph; this takes Õ(m + n^(5/2) k min{k^(1/2), n^(1/6)}) time. Our algorithm is much faster for small values of k; in fact, it is faster whenever k is o(n^(5/6)). Our algorithm yields the useful corollary that in Õ(m + nc^3) time, where c is the size of the global min-cut, we can compute the edge connectivities of all those pairs of vertices whose edge connectivity is at most αc for some constant α. We also present an Õ(m + n) Monte Carlo algorithm for the approximate version of this problem. This algorithm is applicable to weighted graphs as well. Our algorithm, with some modifications, also solves another problem called the minimum T-cut problem. Given T ⊆ V of even cardinality, we present an Õ(m + nk^3) algorithm to compute a minimum cut that splits T into two odd-cardinality components, where k is the size of this cut.
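
Once the weighted tree T has been built, queries are straightforward; the sketch below shows how the output structure described above would be consulted, with the tree, block map, and weights being illustrative assumptions.

    import networkx as nx

    def connectivity_from_tree(T, block_of, s, t, k):
        """Query the structure described above: T is the weighted tree on the blocks
        V1, ..., Vl and block_of maps each vertex to its block.  The (s, t) edge
        connectivity is the minimum edge weight on the tree path between their blocks;
        vertices in the same block have connectivity greater than k."""
        bs, bt = block_of[s], block_of[t]
        if bs == bt:
            return f"> {k}"
        path = nx.shortest_path(T, bs, bt)            # the unique tree path
        return min(T[u][v]['weight'] for u, v in zip(path, path[1:]))

    # Toy instance with three blocks and k = 3:
    T = nx.Graph()
    T.add_edge('A', 'B', weight=2)
    T.add_edge('B', 'C', weight=1)
    block_of = {'s': 'A', 'u': 'A', 't': 'C'}
    print(connectivity_from_tree(T, block_of, 's', 't', 3))  # -> 1
    print(connectivity_from_tree(T, block_of, 's', 'u', 3))  # -> "> 3"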

Relevance:

20.00%

Publisher:

Abstract:

The effect of structural and aerodynamic uncertainties on the performance predictions of a helicopter is investigated. An aerodynamic model based on blade element and momentum theory is used to predict the helicopter performance. The aeroelastic parameters, such as blade chord, rotor radius, two-dimensional lift-curve slope, blade profile drag coefficient, rotor angular velocity, blade pitch angle, and blade twist rate per radius of the rotor, are considered as random variables. The propagation of these uncertainties to the performance parameters, such as thrust coefficient and power coefficient, is studied using Monte Carlo simulations. The simulations are performed with 100,000 samples of structural and aerodynamic uncertain variables with a coefficient of variation ranging from 1 to 5%. The scatter in power predictions in hover, axial climb, and forward flight for the untwisted and linearly twisted blades is studied. It is found that about 20-25% excess power can be required by the helicopter relative to the deterministic predictions due to uncertainties.
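
A minimal version of such an uncertainty propagation, restricted to hover and to a closed-form blade-element/momentum estimate (uniform inflow, linear twist), is sketched below; the nominal rotor values, the choice of independent normal distributions, and the induced-power factor are illustrative assumptions rather than the rotor and distributions used in the study.

    import numpy as np

    def hover_power_coefficient(Nb, c, R, a, cd0, theta0, theta_tw, kappa=1.15):
        """Closed-form hover estimate from blade-element/momentum theory with uniform
        inflow and linear twist:
            sigma = Nb*c/(pi*R)
            C_T   = (sigma*a/2) * (theta0/3 + theta_tw/4 - lambda/2),  lambda = sqrt(C_T/2)
            C_P   = kappa*C_T**1.5/sqrt(2) + sigma*cd0/8
        The loop below solves the coupled C_T / lambda relation by fixed-point iteration."""
        sigma = Nb * c / (np.pi * R)
        lam = 0.05
        for _ in range(50):
            CT = 0.5 * sigma * a * (theta0 / 3 + theta_tw / 4 - lam / 2)
            lam = np.sqrt(max(CT, 1e-9) / 2)
        CP = kappa * CT ** 1.5 / np.sqrt(2) + sigma * cd0 / 8
        return CT, CP

    def monte_carlo_power(n=20000, cov=0.05, seed=0):
        """Propagate parameter uncertainty (independent normals with a common
        coefficient of variation) to the power coefficient; nominal values are illustrative."""
        rng = np.random.default_rng(seed)
        nominal = dict(c=0.4, R=6.0, a=5.7, cd0=0.01,
                       theta0=np.radians(12.0), theta_tw=np.radians(-8.0))
        s = {k: rng.normal(v, cov * abs(v), n) for k, v in nominal.items()}
        CP = np.empty(n)
        for i in range(n):
            _, CP[i] = hover_power_coefficient(4, s['c'][i], s['R'][i], s['a'][i],
                                               s['cd0'][i], s['theta0'][i], s['theta_tw'][i])
        # the ratio of a high percentile to the mean gives a feel for the 'excess power'
        return CP.mean(), CP.std(), np.percentile(CP, 99) / CP.mean()

    print(monte_carlo_power())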

Relevance:

20.00%

Publisher:

Abstract:

The compositional evolution in sputter-deposited LiCoO2 thin films is influenced by the process parameters involved during deposition. The electrochemical performance of these films strongly depends on their microstructure, preferential orientation and stoichiometry. The transport process of sputtered Li and Co atoms from the LiCoO2 target to the substrate, through Ar plasma in a planar magnetron configuration, was investigated based on the Monte Carlo technique. The effect of sputtering gas pressure and the substrate-target distance (dst) on the Li/Co ratio, as well as the energy and angular distribution of sputtered atoms on the substrate, were examined. Stable Li/Co ratios have been obtained at 5 Pa pressure and dst in the range 5-11 cm. The kinetic energy and incident angular distribution of Li and Co atoms reaching the substrate have been found to be dependent on sputtering pressure. Simulations were extended to predict compositional variations in films prepared at various process conditions. These results were compared with the composition of films determined experimentally using x-ray photoelectron spectroscopy (XPS). The Li/Co ratio calculated using XPS was in moderate agreement with the simulated value. The measured film thickness followed the same trend as predicted by simulation. These studies are shown to be useful in understanding the complexities in multicomponent sputtering.

Relevance:

20.00%

Publisher:

Abstract:

Based on the analogy between polytypes and spin-half Ising chains with competing short- and infinite-range interactions, a Monte Carlo simulation of polytypes has been attempted. A general double-layer mechanism connects different states of the polytype chain with about the same probability as the spin-flip mechanism in magnetic Ising chains. It has been possible to simulate various polytypes with periodicities extending up to 12 layers. The Monte Carlo method should be useful in testing different interaction models that may be proposed in the future to describe polytypism.
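
A toy version of such a simulation is sketched below: a spin-half chain with competing nearest-neighbour and infinite-range (mean-field-like) couplings, updated by flipping pairs of adjacent spins as a stand-in for the double-layer mechanism. Couplings, temperature, and chain length are illustrative assumptions, not a fitted interaction model for any real polytype.

    import numpy as np

    def polytype_mc(N=48, J1=-1.0, Jinf=0.5, T=0.5, sweeps=20000, seed=0):
        """Toy Metropolis simulation of a spin-half chain with competing
        nearest-neighbour (J1) and infinite-range (Jinf) couplings, the analogy to
        polytype stacking used above.  The elementary move flips a pair of adjacent
        spins, standing in for the double-layer mechanism."""
        rng = np.random.default_rng(seed)
        s = rng.choice([-1, 1], size=N)

        def energy(spins):
            e_nn = -J1 * np.sum(spins * np.roll(spins, 1))           # short-range part
            e_lr = -(Jinf / (2 * N)) * (np.sum(spins) ** 2 - N)      # infinite-range part
            return e_nn + e_lr

        E = energy(s)
        for _ in range(sweeps):
            i = rng.integers(N)
            j = (i + 1) % N
            s[i] *= -1
            s[j] *= -1                                               # double-layer flip
            E_new = energy(s)
            if E_new > E and rng.random() >= np.exp(-(E_new - E) / T):
                s[i] *= -1
                s[j] *= -1                                           # reject the move
            else:
                E = E_new
        return s

    # The period of the final up/down (h/c-like) stacking sequence identifies the polytype.
    print(''.join('h' if x > 0 else 'c' for x in polytype_mc()))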

Relevance:

20.00%

Publisher:

Abstract:

The van der Waals and Platteeuw (vdWP) theory has been successfully used to model the thermodynamics of gas hydrates. However, earlier studies have shown that this could be due to the presence of a large number of adjustable parameters whose values are obtained through regression with experimental data. To test this assertion, we carry out a systematic and rigorous study of the performance of various models of the vdWP theory that have been proposed over the years. The hydrate phase equilibrium data used for this study are obtained from Monte Carlo molecular simulations of methane hydrates. The parameters of the vdWP theory are regressed from these equilibrium data and compared with their true values obtained directly from simulations. This comparison reveals that (i) methane-water interactions beyond the first cage and methane-methane interactions make a significant contribution to the partition function and thus cannot be neglected, (ii) rigorous Monte Carlo integration should be used to evaluate the Langmuir constant instead of the spherically smoothed cell approximation, (iii) the parameter values describing the methane-water interactions cannot be correctly regressed from the equilibrium data using the vdWP theory in its present form, (iv) the regressed empty hydrate property values closely match their true values irrespective of the level of rigor in the theory, and (v) the flexibility of the water lattice forming the hydrate phase needs to be incorporated in the vdWP theory. Since methane is among the simplest of hydrate-forming molecules, the conclusions from this study should also hold true for more complicated hydrate guest molecules.
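
Point (ii) can be made concrete with a small sketch of the Monte Carlo integration in question: the Langmuir constant is the Boltzmann-weighted volume integral of the guest-water interaction over the cage, estimated here by uniform sampling inside a sphere rather than by the spherically smoothed cell approximation. The interaction function, cage radius, and the toy harmonic well in the usage example are assumptions supplied only to make the sketch self-contained.

    import numpy as np

    kB = 1.380649e-23  # J/K

    def langmuir_constant_mc(energy_fn, R_cage, T, n_samples=50000, seed=0):
        """Monte Carlo estimate of the Langmuir constant
            C = (1 / (kB*T)) * Integral_cage exp(-w(r) / (kB*T)) dV
        by uniform sampling inside a sphere of radius R_cage, instead of the
        spherically smoothed cell approximation.  energy_fn(r_vec) -> w in joules is
        the guest-water interaction (e.g. a sum of Lennard-Jones terms over the
        water positions of the cage)."""
        rng = np.random.default_rng(seed)
        # uniform points in a sphere: isotropic direction, radius ~ R * u**(1/3)
        u = rng.random((n_samples, 1))
        v = rng.normal(size=(n_samples, 3))
        pts = R_cage * u ** (1.0 / 3.0) * v / np.linalg.norm(v, axis=1, keepdims=True)
        boltz = np.exp(-np.array([energy_fn(p) for p in pts]) / (kB * T))
        volume = 4.0 / 3.0 * np.pi * R_cage ** 3
        return volume * boltz.mean() / (kB * T)

    # Usage with a toy harmonic well w(r) = 0.5*k*|r|^2 (not a real methane-water potential):
    k_spring = 5e20 * kB  # J/m^2, purely illustrative
    print(langmuir_constant_mc(lambda r: 0.5 * k_spring * np.dot(r, r), R_cage=4e-10, T=270.0))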

Relevance:

20.00%

Publisher:

Abstract:

Given the increasing cost of designing and building new highway pavements, reliability analysis has become vital to ensure that a given pavement performs as expected in the field. Recognizing the importance of failure analysis to safety, reliability, performance, and economy, back analysis has been employed in various engineering applications to evaluate the inherent uncertainties of the design and analysis. The probabilistic back analysis method formulated on Bayes' theorem and solved using the Markov chain Monte Carlo simulation method with a Metropolis-Hastings algorithm has proved to be highly efficient at addressing this issue. It is also quite flexible and is applicable to any type of prior information. In this paper, this method has been used to back-analyze the parameters that influence the pavement life and to consider the uncertainty of the mechanistic-empirical pavement design model. The load-induced pavement structural responses (e.g., stresses, strains, and deflections) used to predict the pavement life are estimated using a response surface methodology model developed from the results of linear elastic analysis. The failure criteria adopted for the analysis were based on the factor of safety (FOS), and the study was carried out for different sample sizes and jumping distributions to estimate the most robust posterior statistics. From the posterior statistics of the case considered, it was observed that after approximately 150 million standard axle load repetitions, the mean values of the pavement properties decrease as expected, with a significant decrease in the values of the elastic moduli of the affected layers. An analysis of the posterior statistics indicated that the parameters that contributed significantly to pavement failure were the moduli of the base and surface layers, which is consistent with the findings from other studies. After the back analysis, the mean value of the base layer modulus shows a significant decrease of 15.8% and that of the surface layer modulus a decrease of 3.12%. The usefulness of the back analysis methodology is further highlighted by estimating the design parameters for specified values of the factor of safety. The analysis revealed that for the pavement section considered, a reliability of 89% and 94% can be achieved by adopting FOS values of 1.5 and 2, respectively. The methodology proposed can therefore be effectively used to identify the parameters that are critical to pavement failure in the design of pavements for specified levels of reliability. DOI: 10.1061/(ASCE)TE.1943-5436.0000455. (C) 2013 American Society of Civil Engineers.
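
The sampling machinery involved is compact enough to sketch: a random-walk Metropolis-Hastings chain over the uncertain layer moduli, with a Gaussian prior and a likelihood built from an observed performance measure. The two-parameter performance model, prior values, and observation below are hypothetical stand-ins for the paper's response-surface model and data, included only to make the sketch runnable.

    import numpy as np

    def metropolis_hastings(log_post, x0, step, n_iter=20000, seed=0):
        """Generic random-walk Metropolis-Hastings sampler, the engine behind the
        probabilistic back analysis described above."""
        rng = np.random.default_rng(seed)
        x = np.array(x0, dtype=float)
        lp = log_post(x)
        chain = np.empty((n_iter, len(x)))
        for i in range(n_iter):
            prop = x + rng.normal(0.0, step, size=len(x))   # jumping distribution
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:         # accept/reject
                x, lp = prop, lp_prop
            chain[i] = x
        return chain

    # Hypothetical back analysis of two layer moduli (MPa): the 'observed' pavement
    # life and the linear performance model are stand-ins for the paper's
    # response-surface model, included only to make the sketch self-contained.
    prior_mean = np.array([300.0, 3000.0])
    prior_sd = np.array([60.0, 600.0])
    observed_life, sigma_obs = 120.0, 10.0                  # million standard axles

    def predicted_life(E):
        return 0.2 * E[0] + 0.02 * E[1]

    def log_post(E):
        log_prior = -0.5 * np.sum(((E - prior_mean) / prior_sd) ** 2)
        log_like = -0.5 * ((predicted_life(E) - observed_life) / sigma_obs) ** 2
        return log_prior + log_like

    chain = metropolis_hastings(log_post, x0=prior_mean, step=np.array([10.0, 100.0]))
    print(chain[5000:].mean(axis=0))                        # posterior mean moduli after burn-in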

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose low-complexity algorithms based on Monte Carlo sampling for signal detection and channel estimation on the uplink in large-scale multiuser multiple-input multiple-output (MIMO) systems with tens to hundreds of antennas at the base station (BS) and a similar number of uplink users. A BS receiver that employs a novel mixed sampling technique (which makes a probabilistic choice between Gibbs sampling and random uniform sampling in each coordinate update) for detection and a Gibbs-sampling-based method for channel estimation is proposed. The algorithm proposed for detection alleviates the stalling problem encountered at high signal-to-noise ratios (SNRs) in conventional Gibbs-sampling-based detection and achieves near-optimal performance in large systems with M-ary quadrature amplitude modulation (M-QAM). A novel ingredient in the detection algorithm that is responsible for achieving near-optimal performance at low complexity is the joint use of a mixed Gibbs sampling (MGS) strategy coupled with a multiple-restart (MR) strategy with an efficient restart criterion. Near-optimal detection performance is demonstrated for a large number of BS antennas and users (e.g., 64 and 128 BS antennas and users). The proposed Gibbs-sampling-based channel estimation algorithm refines an initial estimate of the channel obtained during the pilot phase through iterations with the proposed MGS-based detection during the data phase. In time-division duplex systems where channel reciprocity holds, these channel estimates can be used for multiuser MIMO precoding on the downlink. The proposed receiver is shown to achieve good performance and scale well for large dimensions.
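
The detection idea can be illustrated with a stripped-down sketch for BPSK symbols (a simplification of the M-QAM setting): each coordinate is normally updated by a Gibbs draw from its conditional posterior, but with a small mixing probability it is drawn uniformly at random, which is what counters stalling at high SNR; multiple restarts would simply wrap this routine. The mixing ratio, system size, and noise level are illustrative assumptions.

    import numpy as np

    def mgs_detect(y, H, sigma2, n_iter=200, q=None, seed=0):
        """Mixed Gibbs sampling detection sketch for y = H x + n with BPSK symbols.
        Each coordinate update is a Gibbs draw from the conditional posterior, except
        that with mixing probability q the symbol is drawn uniformly at random.
        The best (minimum-cost) vector visited is returned."""
        rng = np.random.default_rng(seed)
        K = H.shape[1]
        q = 1.0 / K if q is None else q                   # mixing ratio (assumed choice)
        cost = lambda v: np.sum((y - H @ v) ** 2)
        x = rng.choice([-1.0, 1.0], size=K)
        best_x, best_cost = x.copy(), cost(x)
        for _ in range(n_iter):
            for k in range(K):
                if rng.random() < q:                      # random uniform coordinate update
                    x[k] = rng.choice([-1.0, 1.0])
                else:                                     # Gibbs coordinate update
                    c = np.empty(2)
                    for idx, s in enumerate((-1.0, 1.0)):
                        x[k] = s
                        c[idx] = cost(x)
                    p_plus = 1.0 / (1.0 + np.exp((c[1] - c[0]) / (2 * sigma2)))
                    x[k] = 1.0 if rng.random() < p_plus else -1.0
                if cost(x) < best_cost:
                    best_x, best_cost = x.copy(), cost(x)
        return best_x

    # Small toy system: 8 users, 16 BS antennas, BPSK, high SNR
    rng = np.random.default_rng(1)
    H = rng.normal(size=(16, 8)) / np.sqrt(16)
    x_true = rng.choice([-1.0, 1.0], size=8)
    y = H @ x_true + 0.05 * rng.normal(size=16)
    print(np.array_equal(mgs_detect(y, H, sigma2=0.05 ** 2), x_true))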

Relevance:

20.00%

Publisher:

Abstract:

Estimation of design quantiles of hydrometeorological variables at critical locations in river basins is necessary for hydrological applications. To arrive at reliable estimates for locations (sites) where no or limited records are available, various regional frequency analysis (RFA) procedures have been developed over the past five decades. The most widely used procedure is based on the index-flood approach and L-moments. It assumes that the values of the scale and shape parameters of the frequency distribution are identical across all the sites in a homogeneous region. In a real-world scenario, this assumption may not be valid even if a region is statistically homogeneous. To address this issue, a novel mathematical approach is proposed. It involves (i) identification of an appropriate frequency distribution to fit the random variable being analyzed for the homogeneous region, (ii) use of a proposed transformation mechanism to map observations of the variable from the original space to a dimensionless space where the form of the distribution does not change and the variation in the values of its parameters is minimal across sites, (iii) construction of a growth curve in the dimensionless space, and (iv) mapping the curve to the original space for the target site by applying the inverse transformation to arrive at the required quantile(s) for the site. The effectiveness of the proposed approach (PA) in predicting quantiles for ungauged sites is demonstrated through Monte Carlo simulation experiments considering five frequency distributions that are widely used in RFA, and by a case study on watersheds in the conterminous United States. Results indicate that the PA outperforms methods based on the index-flood approach.
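
For context, the baseline the proposed approach is compared against, the conventional index-flood method, can be sketched as follows: records at gauged sites are scaled by their site means, pooled into a regional growth curve, and an ungauged site's quantile is its index flood times the regional growth factor. The GEV choice, maximum-likelihood fitting (in place of L-moments), and the synthetic records below are simplifying assumptions; this is not the paper's transformation-based procedure.

    import numpy as np
    from scipy import stats

    def index_flood_quantile(site_records, target_index, p=0.99):
        """Conventional index-flood estimate: scale each gauged site's record by its
        site mean, pool the scaled data, fit a regional GEV growth curve, and return
        the target site's quantile as its index flood times the regional growth factor."""
        pooled = np.concatenate([np.asarray(x) / np.mean(x) for x in site_records])
        c, loc, scale = stats.genextreme.fit(pooled)          # regional growth curve
        growth_factor = stats.genextreme.ppf(p, c, loc=loc, scale=scale)
        return target_index * growth_factor

    # Toy region: three gauged sites with synthetic annual maxima; ungauged site index = 85 m^3/s
    rng = np.random.default_rng(0)
    sites = [stats.genextreme.rvs(-0.1, loc=m, scale=0.3 * m, size=40, random_state=rng)
             for m in (50.0, 120.0, 200.0)]
    print(index_flood_quantile(sites, target_index=85.0, p=0.99))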

Relevance:

20.00%

Publisher:

Abstract:

The flexibility of the water lattice in clathrate hydrates and guest-guest interactions have been shown in previous studies to significantly affect the values of thermodynamic properties such as chemical potentials and free energies. Here we describe methods for computing occupancies, chemical potentials, and free energies that account for the flexibility of the water lattice and guest-guest interactions in the hydrate phase. The methods are validated for a wide variety of guest molecules, such as methane, ethane, carbon dioxide, and tetrahydrofuran, by comparing the predicted occupancy values of guest molecules with those obtained from isothermal-isobaric semigrand Monte Carlo simulations. The proposed methods extend the van der Waals and Platteeuw theory for clathrate hydrates, and the Langmuir constant is calculated based on the structure of the empty hydrate lattice. These methods, in combination with the development of advanced molecular models for water and guest molecules, should lead to a more thermodynamically consistent theory for clathrate hydrates.
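
For completeness, once Langmuir constants are available (for example from Monte Carlo integration over the cage), vdWP-type fractional occupancies for a single guest species follow from the Langmuir adsorption form; the constants and fugacity below are assumed values used only to show the arithmetic.

    # theta = C*f / (1 + C*f) for a single guest species, with f the guest fugacity.
    C_small, C_large = 7.0e-7, 4.4e-6   # Pa^-1, assumed Langmuir constants for the two cage types
    f_methane = 3.0e6                   # Pa, assumed methane fugacity
    theta_small = C_small * f_methane / (1 + C_small * f_methane)
    theta_large = C_large * f_methane / (1 + C_large * f_methane)
    print(theta_small, theta_large)     # fractional cage occupancies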

Relevance:

20.00%

Publisher:

Abstract:

The present work deals with the prediction of the stiffness of an Indian nanoclay-reinforced polypropylene composite (which can be termed a nanocomposite) using a Monte Carlo finite element analysis (FEA) technique. Nanocomposite samples are first prepared in the laboratory using a torque rheometer, for achieving desirable dispersion of nanoclay during master batch preparation, followed by extrusion for the fabrication of tensile test dog-bone specimens. It has been observed through SEM (scanning electron microscopy) images of the prepared nanocomposite containing a given percentage (3-9% by weight) of the considered nanoclay that nanoclay platelets tend to remain in clusters. By ascertaining the average size of these nanoclay clusters from the images mentioned, a planar finite element model is created in which nanoclay groups and the polymer matrix are modeled as separate entities assuming a given homogeneous distribution of the nanoclay clusters. Using a Monte Carlo simulation procedure, the distribution of nanoclay is varied randomly in an automated manner in a commercial FEA code, and virtual tensile tests are performed to compute the linear stiffness for each case. The values of computed stiffness modulus of highest frequency for nanocomposites with different nanoclay contents correspond well with the experimentally obtained measures of stiffness, establishing the effectiveness of the present approach for further applications.
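
The Monte Carlo part of the procedure can be mimicked with a toy stand-in (no finite elements, no commercial code): stiff square clusters are dropped at random into a matrix grid, a crude series/parallel (Reuss/Voigt style) estimate of the tensile modulus is computed for each realization, and the modal value over many realizations is reported. Grid size, cluster size, phase moduli, and weight fraction are illustrative assumptions.

    import numpy as np

    def random_composite_modulus(frac=0.06, n=64, cluster=3, E_m=1.5, E_c=180.0, seed=0):
        """Toy stand-in for the Monte Carlo FEA described above: square clusters of a
        stiff phase (nanoclay) are placed at random in a matrix grid, and an effective
        tensile modulus along x is estimated by treating pixels within a column as
        springs in parallel and the columns as springs in series (a crude Voigt/Reuss
        style bound, not a finite-element solution).  Moduli are in GPa."""
        rng = np.random.default_rng(seed)
        E = np.full((n, n), E_m)
        target = frac * n * n
        while np.count_nonzero(E == E_c) < target:
            i, j = rng.integers(0, n - cluster, size=2)
            E[i:i + cluster, j:j + cluster] = E_c          # drop one stiff cluster
        E_cols = E.mean(axis=0)                            # parallel (iso-strain) within a column
        return 1.0 / np.mean(1.0 / E_cols)                 # series (iso-stress) across columns

    # Repeat over many random placements and report the most frequent (modal) stiffness:
    samples = [random_composite_modulus(seed=s) for s in range(200)]
    hist, edges = np.histogram(samples, bins=20)
    mode = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
    print(round(mode, 3))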