850 results for John Lewis Model


Relevance:

30.00%

Publisher:

Abstract:

We develop an inhomogeneous mean-field theory for the extended Bose-Hubbard model with a quadratic confining potential. In the absence of this potential, our mean-field theory yields the phase diagram of the homogeneous extended Bose-Hubbard model. This phase diagram shows a superfluid (SF) phase and lobes of Mott-insulator (MI), density-wave (DW), and supersolid (SS) phases in the plane of the chemical potential μ and on-site repulsion U; we present phase diagrams for representative values of V, the repulsive energy for bosons on nearest-neighbor sites. We demonstrate that, when the confining potential is present, the superfluid and density-wave order parameters are nonuniform; in particular, we obtain, for a few representative values of the parameters, spherical shells of SF, MI, DW, and SS phases. We explore the implications of our study for experiments on cold-atom dipolar condensates in optical lattices in a confining potential.
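
For context, a standard single-band form of the extended Bose-Hubbard Hamiltonian with a quadratic trap is sketched below; the symbols (hopping amplitude t, trap curvature V_T) are ours, and the paper's exact conventions may differ.

```latex
H = -t \sum_{\langle i,j \rangle} \bigl( b_i^{\dagger} b_j + \mathrm{h.c.} \bigr)
    + \frac{U}{2} \sum_i n_i (n_i - 1)
    + V \sum_{\langle i,j \rangle} n_i n_j
    - \sum_i \bigl( \mu - V_T\, r_i^{2} \bigr) n_i ,
```

where $b_i^{\dagger}$, $b_i$ are bosonic creation and annihilation operators, $n_i = b_i^{\dagger} b_i$, and $r_i$ is the distance of site $i$ from the trap center; setting $V_T = 0$ recovers the homogeneous model.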

Relevance:

30.00%

Publisher:

Abstract:

We have developed a graphical user interface (GUI) based dendrimer builder toolkit (DBT) which can be used to generate the dendrimer configuration of desired generation for various dendrimer architectures. The structures generated by this tool were validated by studying the structural properties of two well-known classes of dendrimers: the ethylenediamine-cored poly(amidoamine) (PAMAM) dendrimer and the diaminobutyl-cored poly(propylene imine) (PPI) dendrimer. Using fully atomistic molecular dynamics (MD) simulation we calculated the radius of gyration, shape tensor, and monomer density distribution for the PAMAM and PPI dendrimers at neutral and high pH. Good agreement was observed between the calculated radius of gyration and the available simulation and experimental (small-angle X-ray and neutron scattering; SAXS, SANS) results. With this validation we used DBT to build another class, the nitrogen-cored poly(propyl ether imine) dendrimer, and studied its structural features using all-atomistic MD simulation. DBT is a versatile tool and can easily be used to generate other dendrimer structures with different chemistry and topology. The use of the general AMBER force field to describe the intra-molecular interactions allows us to integrate this tool easily with the widely used molecular dynamics software AMBER. This makes our tool a useful utility that can help facilitate the study of dendrimer interactions with nucleic acids, proteins, and lipid bilayers for various biological applications. (c) 2012 Wiley Periodicals, Inc.
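
As an illustration of the kind of analysis described (not the DBT/AMBER workflow itself), a minimal NumPy sketch of the mass-weighted radius of gyration and gyration (shape) tensor for a single frame of coordinates:

```python
import numpy as np

def gyration_analysis(coords, masses):
    """Mass-weighted radius of gyration and gyration (shape) tensor
    from one frame of atomic coordinates.

    coords : (N, 3) array of positions (e.g. in Angstrom)
    masses : (N,) array of atomic masses
    """
    com = np.average(coords, axis=0, weights=masses)      # center of mass
    d = coords - com                                      # displacements
    # Gyration tensor S_ab = sum_i m_i d_ia d_ib / sum_i m_i
    S = np.einsum("i,ia,ib->ab", masses, d, d) / masses.sum()
    rg = np.sqrt(np.trace(S))                             # radius of gyration
    eig = np.sort(np.linalg.eigvalsh(S))[::-1]            # principal moments
    asphericity = eig[0] - 0.5 * (eig[1] + eig[2])
    return rg, S, asphericity

# Random coordinates standing in for one MD frame:
rng = np.random.default_rng(0)
coords = rng.normal(scale=10.0, size=(500, 3))
masses = np.full(500, 12.0)
rg, S, asph = gyration_analysis(coords, masses)
print(f"Rg = {rg:.2f} A, asphericity = {asph:.2f} A^2")
```

In practice these quantities would be averaged over the frames of an equilibrated MD trajectory.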

Relevance:

30.00%

Publisher:

Abstract:

A mathematical model was developed for simulating runoff generation and soil erosion on hillslopes. The model comprises three modules: one for overland flow, one for soil infiltration, and one for soil erosion, including both rill and interrill erosion.

Relevance:

30.00%

Publisher:

Abstract:

Sediment transport in rill flows exhibits the characteristics of non-equilibrium transport: the sediment transport rate of a rill flow gradually recovers toward the transport capacity along the flow direction as erosion proceeds. By employing the concept of partial-equilibrium sediment transport from open-channel hydraulics, a dynamic model of rill erosion on hillslopes was developed. In the model, a parameter called the restoration coefficient of sediment transport capacity was used to express the recovery process of the sediment transport rate; it was analysed by dimensional analysis and determined from laboratory experimental data. The soil loss simulated by the model agreed with observed values. The model results showed that the length and gradient of the hillslope and the rainfall intensity had different influences on rill erosion. Copyright (c) 2006 John Wiley & Sons, Ltd.
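
One common way to write such a partial-equilibrium formulation, assumed here only for illustration (the paper's exact equation may differ), is a first-order recovery of the transport rate toward capacity:

```latex
\frac{dq_s}{dx} = \alpha \left( T_c - q_s \right),
\qquad
q_s(x) = T_c \left( 1 - e^{-\alpha x} \right) \ \ \text{for } q_s(0) = 0,
```

where $q_s$ is the sediment transport rate per unit width, $T_c$ the transport capacity of the rill flow, and $\alpha$ a restoration coefficient that sets how quickly the transport rate recovers toward capacity downslope.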

Relevance:

30.00%

Publisher:

Abstract:

We investigate the 2d O(3) model with the standard action by Monte Carlo simulation at couplings β up to 2.05. We measure the energy density, mass gap and susceptibility of the model, and gather high statistics on lattices of size L ≤ 1024 using the Floating Point Systems T-series vector hypercube and the Thinking Machines Corp.'s Connection Machine 2. Asymptotic scaling does not appear to set in for this action, even at β = 2.10, where the correlation length is 420. We observe a 20% difference between our estimate $m/\Lambda_{\overline{MS}} = 3.52(6)$ at this β and the recent exact analytical result $m/\Lambda_{\overline{MS}} = 2.943$. We use the overrelaxation algorithm interleaved with Metropolis updates and show that the decorrelation time scales with the correlation length and the number of overrelaxation steps per sweep. We determine its effective dynamical critical exponent to be z' = 1.079(10); thus critical slowing down is reduced significantly for this local algorithm, which is vectorizable and parallelizable.
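
For illustration, a minimal serial sketch of the overrelaxation update for O(3) spins is given below; it is not the thesis code, which used orderings suited to vector and parallel hardware. Each spin is reflected about the direction of the sum of its nearest neighbours, which leaves the standard-action energy unchanged, so the update must be interleaved with Metropolis steps to be ergodic.

```python
import numpy as np

def overrelax_sweep(spins):
    """One overrelaxation sweep for the 2D O(3) model (illustrative sketch).

    spins : (L, L, 3) array of unit vectors on a periodic lattice.
    """
    L = spins.shape[0]
    for x in range(L):
        for y in range(L):
            # Local field: sum of the four nearest-neighbour spins
            h = (spins[(x + 1) % L, y] + spins[(x - 1) % L, y] +
                 spins[x, (y + 1) % L] + spins[x, (y - 1) % L])
            hnorm = np.linalg.norm(h)
            if hnorm < 1e-12:
                continue
            hhat = h / hnorm
            s = spins[x, y]
            # Reflect s about hhat: s' = 2(s.hhat)hhat - s, so s'.h = s.h
            spins[x, y] = 2.0 * np.dot(s, hhat) * hhat - s
    return spins
```

A checkerboard ordering of the sites makes the same update vectorizable and parallelizable, as described in the text.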

We also use cluster Monte Carlo algorithms, non-local update schemes that can greatly increase the efficiency of computer simulations of spin models. The major computational task in these algorithms is connected-component labeling, which identifies clusters of connected sites on a lattice. We have devised some new SIMD component-labeling algorithms and implemented them on the Connection Machine, and we investigate their performance when applied to the cluster update of the two-dimensional Ising spin model.
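
A minimal serial union-find sketch of the labeling task is shown below; the SIMD algorithms developed in the thesis are organized quite differently, and this only illustrates what connected-component labeling computes for a given bond configuration.

```python
import numpy as np

def label_clusters(bonds_right, bonds_down):
    """Serial union-find labeling of bond clusters on an L x L periodic lattice.

    bonds_right[x, y] / bonds_down[x, y] : booleans saying whether site (x, y)
    is connected to its right / lower neighbour (e.g. Swendsen-Wang bonds).
    Returns an (L, L) array of cluster labels (root site indices).
    """
    L = bonds_right.shape[0]
    parent = np.arange(L * L)

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri

    for x in range(L):
        for y in range(L):
            i = x * L + y
            if bonds_right[x, y]:
                union(i, x * L + (y + 1) % L)
            if bonds_down[x, y]:
                union(i, ((x + 1) % L) * L + y)

    return np.array([find(i) for i in range(L * L)]).reshape(L, L)
```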

Finally we use a Monte Carlo Renormalization Group method to directly measure the couplings of block Hamiltonians at different blocking levels. For the usual averaging block transformation we confirm the renormalized trajectory (RT) observed by Okawa. For another, improved probabilistic block transformation we find the RT and show that it is much closer to the standard action. We then use this block transformation to obtain the discrete β-function of the model, which we compare to the perturbative result. We do not see convergence, except when using a rescaled coupling β_E to effectively resum the series. For the latter case we see agreement for $m/\Lambda_{\overline{MS}}$ at β = 2.14, 2.26, 2.38 and 2.50. To three loops, $m/\Lambda_{\overline{MS}} = 3.047(35)$ at β = 2.50, which is very close to the exact value $m/\Lambda_{\overline{MS}} = 2.943$. Our last point, at β = 2.62, however, disagrees with this estimate.

Relevance:

30.00%

Publisher:

Abstract:

Threefold-symmetric Fe phosphine complexes have been used to model the structural and functional aspects of biological N2 fixation by nitrogenases. Low-valent bridging Fe-S-Fe complexes in the formal oxidation states Fe(II)/Fe(II), Fe(II)/Fe(I), and Fe(I)/Fe(I) have been synthesized and display rich spectroscopic and magnetic behavior. A series of cationic tris-phosphine borane (TPB)-ligated Fe complexes have been synthesized and shown to bind a variety of nitrogenous ligands, including N2H4, NH3, and NH2−. These complexes are all high-spin S = 3/2 and display EPR and magnetic characteristics typical of this spin state. Furthermore, a sequential protonation and reduction sequence of a terminal amide results in loss of NH3 and uptake of N2. These stoichiometric transformations represent the final steps in potential N2 fixation schemes.

Treatment of an anionic Fe-N2 complex with excess acid also results in the formation of some NH3, suggesting the possibility of a catalytic cycle for the conversion of N2 to NH3 mediated by Fe. Indeed, the use of excess acid and reductant results in the formation of seven equivalents of NH3 per Fe center, demonstrating Fe-mediated catalytic N2 fixation with acid and reductant for the first time. Numerous control experiments indicate that this catalysis is likely mediated by a molecular species.

A number of other phosphine-ligated Fe complexes have also been tested for catalysis; the results suggest that a hemilabile Fe-B interaction may be critical for catalysis. Additionally, various conditions for the catalysis have been investigated. These studies further support the assignment of a molecular species and delineate some of the conditions required for catalysis.

Finally, combined spectroscopic studies have been performed on a putative intermediate for catalysis. These studies converge on an assignment of this new species as a hydrazido(2-) complex. Such species have been known on group 6 metals for some time, but this represents the first characterization of this ligand on Fe. Further spectroscopic studies suggest that this species is present in catalytic mixtures, which suggests that the first steps of a distal mechanism for N2 fixation are feasible in this system.

Relevance:

30.00%

Publisher:

Abstract:

An economic air pollution control model, which determines the least cost of reaching various air quality levels, is formulated. The model takes the form of a general nonlinear mathematical programming problem. Primary contaminant emission levels are the independent variables. The objective function is the cost of attaining various emission levels; it is to be minimized subject to constraints requiring that given air quality levels be attained.
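
Schematically, and with symbols of our choosing rather than the paper's notation, the program is:

```latex
\min_{E \ge 0} \; C(E)
\quad \text{subject to} \quad
Q_j(E) \le S_j, \qquad j = 1, \dots, m,
```

where $E$ is the vector of primary contaminant emission levels, $C(E)$ the least cost of attaining those levels, $Q_j(E)$ the resulting air quality measures (here, expected violation days per year), and $S_j$ the air quality levels to be met.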

The model is applied to a simplified statement of the photochemical smog problem in Los Angeles County in 1975, with emissions specified by a two-dimensional vector: total reactive hydrocarbon (RHC) and nitrogen oxide (NOx) emissions. Air quality, also two-dimensional, is measured by the expected number of days per year that nitrogen dioxide (NO2) and mid-day ozone (O3) exceed standards in Central Los Angeles.

The minimum cost of reaching various emission levels is found by a linear programming model. The base or "uncontrolled" emission levels are those that will exist in 1975 with the present new-car control program and with the degree of stationary-source control existing in 1971. Controls, basically "add-on devices," are considered here for used cars, aircraft, and existing stationary sources. It is found that with these added controls, Los Angeles County emission levels [(1300 tons/day RHC, 1000 tons/day NOx) in 1969 and (670 tons/day RHC, 790 tons/day NOx) at the base 1975 level] can be reduced to 260 tons/day RHC (minimum RHC program) and 460 tons/day NOx (minimum NOx program).

"Phenomenological" or statistical air quality models provide the relationship between air quality and emissions. These models estimate the relationship by using atmospheric monitoring data taken at one (yearly) emission level and by using certain simple physical assumptions, (e. g., that emissions are reduced proportionately at all points in space and time). For NO2, (concentrations assumed proportional to NOx emissions), it is found that standard violations in Central Los Angeles, (55 in 1969), can be reduced to 25, 5, and 0 days per year by controlling emissions to 800, 550, and 300 tons /day, respectively. A probabilistic model reveals that RHC control is much more effective than NOx control in reducing Central Los Angeles ozone. The 150 days per year ozone violations in 1969 can be reduced to 75, 30, 10, and 0 days per year by abating RHC emissions to 700, 450, 300, and 150 tons/day, respectively, (at the 1969 NOx emission level).

The control cost-emission level and air quality-emission level relationships are combined in a graphical solution of the complete model to find the cost of various air quality levels. Best possible air quality levels with the controls considered here are 8 O3 and 10 NO2 violations per year (minimum ozone program) or 25 O3 and 3 NO2 violations per year (minimum NO2 program) with an annualized cost of $230,000,000 (above the estimated $150,000,000 per year for the new car control program for Los Angeles County motor vehicles in 1975).

Relevance:

30.00%

Publisher:

Abstract:

A review is presented of the statistical bootstrap model of Hagedorn and Frautschi. This model is an attempt to apply the methods of statistical mechanics in high-energy physics, while treating all hadron states (stable or unstable) on an equal footing. A statistical calculation of the resonance spectrum on this basis leads to an exponentially rising level density $\rho(m) \sim c\, m^{-3} e^{\beta_0 m}$ at high masses.

In the present work, explicit formulae are given for the asymptotic dependence of the level density on quantum numbers, in various cases. Hamer and Frautschi's model for a realistic hadron spectrum is described.

A statistical model for hadron reactions is then put forward, analogous to the Bohr compound-nucleus model in nuclear physics, which makes use of this level density. Some general features of resonance decay are predicted. The model is applied to the process of $N\bar{N}$ annihilation at rest with overall success, and explains the high final-state pion multiplicity, together with the low individual branching ratios into two-body final states, which are characteristic of the process. For more general reactions, the model needs modification to take account of correlation effects. Nevertheless it is capable of explaining the phenomenon of limited transverse momenta, and the exponential decrease in the production frequency of heavy particles with their mass, as shown by Hagedorn. Frautschi's results on "Ericson fluctuations" in hadron physics are outlined briefly. The value of $\beta_0$ required in all these applications is consistently around $[120\ \text{MeV}]^{-1}$, corresponding to a "resonance volume" whose radius is very close to ƛπ. The construction of a "multiperipheral cluster model" for high-energy collisions is advocated.
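
A standard consequence of such an exponential spectrum, stated here for context rather than taken from the review, is a limiting temperature: with $\rho(m) \sim c\, m^{-3} e^{\beta_0 m}$, the partition-function integral

```latex
Z(T) \;\propto\; \int dm\, \rho(m)\, e^{-m/T}
```

diverges for $T > T_0 = 1/\beta_0$, so $\beta_0 \approx [120\ \text{MeV}]^{-1}$ corresponds to a maximum (Hagedorn) temperature $T_0 \approx 120$ MeV.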

Relevance:

30.00%

Publisher:

Abstract:

Background: Fentanyl is widely used off-label in the NICU. Our aim was to investigate its cerebral, cardiovascular and pulmonary effects, as well as its pharmacokinetics, in an experimental model for neonates. Methods: Fentanyl (5 μg/kg bolus immediately followed by a 90-minute infusion of 3 μg/kg/h) was administered to six mechanically ventilated newborn piglets. Cardiovascular, ventilation, pulmonary and oxygenation indices, as well as brain activity, were monitored from T = 0 up to the end of the experiments (T = 225-300 min). Plasma samples were also drawn for quantification of fentanyl. Results: A "reliable degree of sedation" was observed up to T = 210-240 min, consistent with the selected dosing regimen and the observed fentanyl plasma levels. Unlike the cardiovascular parameters, which were unmodified except for an increasing trend in heart rate, some of the ventilation and oxygenation indices, as well as brain activity, were significantly altered. The pulmonary and brain effects of fentanyl had mostly recovered between T = 210 min and the end of the experiment. Conclusion: The newborn piglet was shown to be a suitable experimental model for studying fentanyl disposition as well as its respiratory and cardiovascular effects in human neonates. Therefore, it could be extremely useful for further investigating the drug's behaviour under pathophysiological conditions.
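
Purely as an illustration of how such a dosing regimen translates into plasma levels, a one-compartment sketch is given below; the one-compartment assumption and the volume-of-distribution and clearance values are hypothetical placeholders, not the study's fitted parameters.

```python
import numpy as np

def fentanyl_conc(t_min, dose_ug_kg=5.0, rate_ug_kg_h=3.0, t_inf_min=90.0,
                  vd_l_kg=4.0, cl_l_kg_h=1.0):
    """Plasma concentration (ng/mL) from a one-compartment model with an
    IV bolus followed immediately by a constant-rate infusion.

    Dosing (5 ug/kg bolus + 3 ug/kg/h for 90 min) follows the study;
    Vd and CL are hypothetical, for illustration only.
    """
    t_h = np.asarray(t_min, dtype=float) / 60.0
    t_inf_h = t_inf_min / 60.0
    k = cl_l_kg_h / vd_l_kg                                # elimination rate (1/h)

    # Bolus contribution
    c_bolus = (dose_ug_kg / vd_l_kg) * np.exp(-k * t_h)

    # Infusion contribution: zero-order input up to t_inf, then washout
    t_run = np.minimum(t_h, t_inf_h)
    c_inf = (rate_ug_kg_h / cl_l_kg_h) * (1.0 - np.exp(-k * t_run))
    c_inf = c_inf * np.exp(-k * np.maximum(t_h - t_inf_h, 0.0))

    return c_bolus + c_inf                                 # ug/L == ng/mL

print(fentanyl_conc([0, 30, 90, 225]))
```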

Relevance:

30.00%

Publisher:

Abstract:

Using a bioenergetics model, we estimated daily ration and seasonal prey consumption rates for six age classes of juvenile sandbar sharks (Carcharhinus plumbeus) in the lower Chesapeake Bay summer nursery area. The model, incorporating habitat- and species-specific data on growth rates, metabolic rate, diet composition, water temperature (range 16.8−27.9°C), and population structure, predicted mean daily rations between 2.17 ±0.03 (age 0) and 1.30 ±0.02 (age 5) % body mass/day. These daily rations are higher than earlier predictions for sandbar sharks but are comparable to those for ecologically similar shark species. The total nursery population of sandbar sharks was predicted to consume ~124,000 kg of prey during their 4.5-month stay in the Chesapeake Bay nursery. The predicted consumption rates support the conclusion that juvenile sandbar sharks exert a lesser top-down effect on the Chesapeake Bay ecosystem than do teleost piscivores and humans.
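
The scaling from a per-individual daily ration to a seasonal population total is simple arithmetic; the sketch below uses hypothetical abundance and body-mass inputs only to show the calculation, not to reproduce the ~124,000 kg estimate.

```python
def seasonal_consumption(abundance, mean_mass_kg, daily_ration_pct, days=135):
    """Scale a per-individual daily ration (% body mass/day) to total prey
    consumed by an age class over its nursery residency (~4.5 months).

    All inputs here are hypothetical placeholders; the study's estimate
    rests on its own abundance, growth and temperature data.
    """
    per_shark_kg_per_day = mean_mass_kg * daily_ration_pct / 100.0
    return abundance * per_shark_kg_per_day * days

# e.g. 2,000 age-0 sharks of 5 kg eating 2.17% body mass/day for 4.5 months
print(f"{seasonal_consumption(2000, 5.0, 2.17):,.0f} kg")
```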

Relevance:

30.00%

Publisher:

Abstract:

We report a Monte Carlo representation of the long-term inter-annual variability of monthly snowfall on a detailed (1 km) grid of points throughout the Southwest. An extension of the local climate model of the southwestern United States (Stamm and Craig 1992) provides spatially based estimates of the mean and variance of monthly temperature and precipitation. The mean is the expected value from a canonical regression using independent variables that represent controls on climate in this area, including orography. The variance is computed as the standard error of the prediction and provides site-specific measures of (1) natural sources of variation and (2) errors due to limitations of the data and poor distribution of climate stations. Simulation of monthly temperature and precipitation over a sequence of years is achieved by drawing from a bivariate normal distribution. The conditional expectation of precipitation, given temperature in each month, is the basis of a numerical integration of the normal probability distribution of log precipitation below a threshold temperature (3°C) to determine snowfall as a percent of total precipitation. Snowfall predictions are tested at stations for which long-term records are available. At Donner Memorial State Park (elevation 1811 meters) a 34-year simulation, matching the length of the instrumental record, is within 15 percent of the observed mean annual snowfall. We also compute the resulting snowpack using a variation of the model of Martinec et al. (1983). This allows additional tests by examining spatial patterns of predicted snowfall and snowpack and their hydrologic implications.
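
A simplified Monte Carlo stand-in for the described calculation (draws from a bivariate normal of temperature and log precipitation, with snowfall taken as the precipitation falling below the 3°C threshold) might look like this; the parameter values are hypothetical:

```python
import numpy as np

def snow_fraction_mc(mu_t, mu_logp, sd_t, sd_logp, rho,
                     t_thresh=3.0, n=100_000, seed=None):
    """Monte Carlo estimate of the fraction of monthly precipitation falling
    as snow, given a bivariate normal model of temperature (deg C) and log
    precipitation. A simplified stand-in for the paper's numerical
    integration of the conditional distribution.
    """
    rng = np.random.default_rng(seed)
    cov = [[sd_t**2, rho * sd_t * sd_logp],
           [rho * sd_t * sd_logp, sd_logp**2]]
    t, logp = rng.multivariate_normal([mu_t, mu_logp], cov, size=n).T
    p = np.exp(logp)
    return (p * (t < t_thresh)).sum() / p.sum()

# Hypothetical January parameters for one grid cell:
print(snow_fraction_mc(mu_t=1.0, mu_logp=3.5, sd_t=3.0, sd_logp=0.6, rho=-0.3))
```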

Relevance:

30.00%

Publisher:

Abstract:

A density prediction model for juvenile brown shrimp (Farfantepenaeus aztecus) was developed by using three bottom types, five salinity zones, and four seasons to quantify patterns of habitat use in Galveston Bay, Texas. Sixteen years of quantitative density data were used. The bottom types were vegetated marsh edge, submerged aquatic vegetation, and shallow nonvegetated bottom. Multiple regression was used to develop density estimates, and the resultant formula was then coupled with a geographical information system (GIS) to provide a spatial mosaic (map) of predicted habitat use. Results indicated that juvenile brown shrimp (<100 mm) selected vegetated habitats in salinities of 15−25 ppt and that seagrasses were selected over marsh edge where the two co-occurred. Our results provide a spatially resolved estimate of high-density areas that will help designate essential fish habitat (EFH) in Galveston Bay. In addition, using this modeling technique, we were able to estimate the overall population of juvenile brown shrimp (<100 mm) in shallow-water habitats within the bay at approximately 1.3 billion. Furthermore, the geographic range of the model was assessed by plotting observed (actual) versus expected (model) brown shrimp densities in three other Texas bays. Similar habitat-use patterns were observed in all three bays, each having a coefficient of determination >0.50. These results indicate that this model may have broader geographic application and is a plausible approach to refining current EFH designations for all Gulf of Mexico estuaries with similar geomorphological and hydrological characteristics.
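
A toy version of such a categorical multiple regression, using the statsmodels formula interface and invented data (the paper's regression form and coefficients are not reproduced here), could look like:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical sample of density observations; the real model used 16 years
# of Galveston Bay data over three bottom types, five salinity zones and
# four seasons.
df = pd.DataFrame({
    "density":  [18.2, 31.0, 3.1, 10.4, 20.3, 1.8, 12.7, 24.9, 2.4, 16.0, 22.1, 2.0],
    "bottom":   ["marsh_edge", "sav", "nonveg"] * 4,
    "salinity": ["15-25", "15-25", "15-25", "5-15", "5-15", "5-15",
                 "25-35", "25-35", "25-35", "15-25", "5-15", "25-35"],
    "season":   ["spring", "spring", "spring", "summer", "summer", "summer",
                 "fall", "fall", "fall", "summer", "fall", "spring"],
})

# Additive categorical regression: predicted density by habitat combination
model = smf.ols("density ~ C(bottom) + C(salinity) + C(season)", data=df).fit()
print(model.params)

# Predictions for each habitat cell can then be joined to GIS polygons
# to map expected density across the bay.
new = pd.DataFrame({"bottom": ["sav"], "salinity": ["15-25"], "season": ["spring"]})
print(model.predict(new))
```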

Relevance:

30.00%

Publisher:

Abstract:

Tag release and recapture data for bigeye (Thunnus obesus) and yellowfin tuna (T. albacares) from the Hawaii Tuna Tagging Project (HTTP) were analyzed with a bulk transfer model incorporating size-specific attrition to infer population dynamics and transfer rates between various fishery components. For both species, the estimated transfer rates from the offshore handline fishery areas to the longline fishery area were higher than the estimated rates of transfer from those same areas into the inshore fishery areas. Natural and fishing mortality rates were estimated over three size classes: yellowfin 20–45, 46–55, and ≥56 cm; bigeye 29–55, 56–70, and ≥71 cm. For both species, the estimates of natural mortality were highest in the smallest size class. For bigeye tuna the estimates decreased with increasing size, whereas for yellowfin tuna there was a slight increase in the largest size class. In the Cross Seamount fishery, the fishing mortality rate of bigeye tuna was similar for all three size classes and represented roughly 12% of the gross attrition rate (which includes fishing mortality, natural mortality, and emigration). For yellowfin tuna, fishing mortality ranged between 7% and 30% of gross attrition, the highest being in the medium size class. For both species, the overall attrition rate from the entire fishery area was nearly the same. However, in the specific case of the Cross Seamount fishery, the attrition rate for yellowfin tuna was roughly twice that for bigeye. This result indicates that bigeye tuna are more resident at the Seamount than yellowfin tuna, and that larger bigeye tuna tend to reside longer than smaller individuals, which may make larger fish more vulnerable to capture in the Seamount fishery. The relatively low level of exchange between the Seamount and the inshore and longline fisheries suggests that fishing activity at the Seamount need not be of great management concern for either species. However, given that the current exploitation rates are considered moderate (10–30%), and that Seamount aggregations of yellowfin and bigeye tuna are highly vulnerable to low-cost gear types, it is recommended that further increases in fishing effort for these species at Cross Seamount be monitored.
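
In models of this type the gross attrition rate is the sum of its components; under the usual constant-rate assumption (made here only for illustration) the number of tagged fish remaining in an area decays exponentially:

```latex
Z = F + M + E, \qquad N(t) = N(0)\, e^{-Z t},
```

where $F$, $M$ and $E$ are the fishing mortality, natural mortality and emigration rates; the "roughly 12%" quoted for bigeye at Cross Seamount corresponds to $F/Z \approx 0.12$.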

Relevance:

30.00%

Publisher:

Abstract:

EXTRACT (SEE PDF FOR FULL ABSTRACT): We describe an empirical-statistical model of climates of the southwestern United States. Boundary conditions include sea surface temperatures, atmospheric transmissivity, and topography. Independent variables are derived from the boundary conditions along 1000-km paths of atmospheric circulation. ... Predictor equations are derived over a larger region than the application area to allow for the increased range of paleoclimate. This larger region is delimited by the autocorrelation properties of climatic data.

Relevance:

30.00%

Publisher:

Abstract:

EXTRACT (SEE PDF FOR FULL ABSTRACT): A local climate model (LCM) has been developed to simulate the modern and 18 ka climate of the southwestern United States. ... LCM solutions indicate summers were about 1°C cooler and winters 11°C cooler at 18 ka. Annual PREC (precipitation) increased 68% at 18 ka, with large increases in spring and fall PREC and diminished summer monsoonal PREC. ... Validation of simulations of 18 ka climate indicates general agreement with proxy estimates of climate for that time. However, the LCM estimates of summer temperatures are about 5 to 10°C higher than estimates from proxy reconstructions.