Abstract:
Geochemical data that are derived from the whole or partial analysis of various geologic materials represent a composition of mineralogies or solute species. Minerals are composed of structured relationships between cations and anions which, through atomic and molecular forces, keep the elements bound in specific configurations. The chemical compositions of minerals have specific relationships that are governed by these molecular controls. In the case of olivine, there is a well-defined relationship between Mn-Fe-Mg and Si. Balances between the principal elements defining olivine composition and other significant constituents in the composition (Al, Ti) have been defined, resulting in a near-linear relationship between the logarithmic relative proportion of Si versus (Mg,Mn,Fe) and Mg versus (Mn,Fe), which is typically described but poorly illustrated in the simplex. The present contribution corresponds to ongoing research, which attempts to relate stoichiometry and geochemical data using compositional geometry. We describe here the approach by which stoichiometric relationships based on mineralogical constraints can be accounted for in the space of simplicial coordinates, using olivines as an example. Further examples for other mineral types (plagioclases and more complex minerals such as clays) are needed. Issues that remain to be dealt with include the reduction of a bulk chemical composition of a rock comprised of several minerals from which appropriate balances can be used to describe the composition in a realistic mineralogical framework. The overall objective of our research is to answer the question: in the cases where the mineralogy is unknown, are there suitable proxies that can be substituted?
Key words: Aitchison geometry, balances, mineral composition, oxides
Abstract:
We use aggregate GDP data and within-country income shares for the period 1970-1998 to assign a level of income to each person in the world. We then estimate the Gaussian kernel density function for the worldwide distribution of income. We compute world poverty rates by integrating the density function below the poverty lines. The $1/day poverty rate has fallen from 20% to 5% over the last twenty-five years. The $2/day rate has fallen from 44% to 18%. There are between 300 and 500 million fewer poor people in 1998 than there were in the 70s. We estimate global income inequality using seven different popular indexes: the Gini coefficient, the variance of log-income, two of Atkinson's indexes, the Mean Logarithmic Deviation, the Theil index and the coefficient of variation. All indexes show a reduction in global income inequality between 1980 and 1998. We also find that most global disparities can be accounted for by across-country, not within-country, inequalities. Within-country disparities have increased slightly during the sample period, but not nearly enough to offset the substantial reduction in across-country disparities. The across-country reductions in inequality are driven mainly, but not fully, by the large growth rate of the incomes of the 1.2 billion Chinese citizens. Unless Africa starts growing in the near future, we project that income inequalities will start rising again. If Africa does not start growing, then China, India, the OECD and the rest of the middle-income and rich countries diverge away from it, and global inequality will rise. Thus, the aggregate GDP growth of the African continent should be the priority of anyone concerned with increasing global income inequality.
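The two core computations described in the abstract (a kernel density of income with poverty rates obtained by integrating below a poverty line, and an inequality index such as the Gini coefficient) can be sketched in a few lines. The sample below is a synthetic log-normal stand-in, not the paper's GDP-plus-income-shares data, and the numbers are purely illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

# synthetic stand-in for assigned individual incomes (illustrative only)
rng = np.random.default_rng(0)
incomes = rng.lognormal(mean=7.5, sigma=1.2, size=10_000)  # hypothetical annual $

# Gaussian kernel density on log-income; the poverty rate is the mass
# of the density below the (log) poverty line
kde = gaussian_kde(np.log(incomes))
poverty_line = 365.0  # $1/day expressed in annual dollars
rate = kde.integrate_box_1d(-np.inf, np.log(poverty_line))

def gini(x):
    """Gini coefficient of a sample (0 = perfect equality, 1 = maximal inequality)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    return 2.0 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

g = gini(incomes)
```

The same density could be integrated up to a $2/day line, and `gini` swapped for any of the other six indexes the abstract lists, without changing the structure of the computation.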
Abstract:
We present a new general concentration-of-measure inequality and illustrate its power by applications in random combinatorics. The results find direct applications in some problems of learning theory.
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
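The power-transformation link can be pictured with a Box-Cox-style family, which coincides with the raw data (up to centering) at alpha = 1 and tends to the logarithm as alpha approaches 0; stepping alpha in small increments gives exactly the "frame by frame" movie described above. The data values here are arbitrary.

```python
import numpy as np

def power_transform(x, alpha):
    """Box-Cox-style family: (x^alpha - 1)/alpha, with the log as its alpha->0 limit."""
    x = np.asarray(x, dtype=float)
    if np.isclose(alpha, 0.0):
        return np.log(x)
    return (x**alpha - 1.0) / alpha

x = np.array([0.2, 1.0, 5.0])
# successive "frames" move smoothly from the raw data toward its logarithm
frames = [power_transform(x, a) for a in (1.0, 0.5, 0.25, 0.1, 0.0)]
```

Recomputing an ordination (say, PCA) on each frame and animating the resulting maps shows the continuous deformation from the untransformed analysis to its logratio counterpart.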
Abstract:
A welfare analysis of unemployment insurance (UI) is performed in a general equilibrium job search model. Finitely-lived, risk-averse workers smooth consumption over time by accumulating assets, choose search effort when unemployed, and suffer disutility from work. Firms hire workers, purchase capital, and pay taxes to finance worker benefits; their equity is the asset accumulated by workers. A matching function relates unemployment, hiring expenditure, and search effort to the formation of jobs. The model is calibrated to US data; the parameters relating job search effort to the probability of job finding are chosen to match microeconomic studies of unemployment spells. Under logarithmic utility, numerical simulation shows rather small welfare gains from UI. Even without UI, workers smooth consumption effectively through asset accumulation. Greater risk aversion leads to substantially larger welfare gains from UI; however, even in this case much of its welfare impact is due not to consumption smoothing effects, but rather to decreased work disutility, or to a variety of externalities.
Abstract:
We present an analytical model to interpret nanoscale capacitance microscopy measurements on thin dielectric films. The model displays a logarithmic dependence on the tip-sample distance and on the film thickness-dielectric constant ratio and shows an excellent agreement with finite-element numerical simulations and experimental results on a broad range of values. Based on these results, we discuss the capabilities of nanoscale capacitance microscopy for the quantitative extraction of the dielectric constant and the thickness of thin dielectric films at the nanoscale.
Abstract:
We obtain the next-to-next-to-leading-logarithmic renormalization-group improvement of the spectrum of hydrogenlike atoms with massless fermions by using potential NRQED. These results can also be applied to the computation of the muonic hydrogen spectrum, where we are able to reproduce some known double logarithms at O(mα^6). We compare with other formalisms dealing with logarithmic resummation available in the literature.
Abstract:
We use the recently obtained theoretical expression for the complete QCD static energy at next-to-next-to-next-to-leading-logarithmic accuracy to determine r_0 Λ_(MS-bar) by comparison with available lattice data, where r_0 is the lattice scale and Λ_(MS-bar) is the QCD scale. We obtain r_0 Λ_(MS-bar) = 0.622(-0.015)(+0.019) for the zero-flavor case. The procedure we describe can be directly used to obtain r_0 Λ_(MS-bar) in the unquenched case, when unquenched lattice data for the static energy at short distances become available. Using the value of the strong coupling α_s as an input, the unquenched result would provide a determination of the lattice scale r_0.
Abstract:
In this paper, we study dynamical aspects of the two-dimensional (2D) gonihedric spin model using both numerical and analytical methods. This spin model has vanishing microscopic surface tension and actually describes an ensemble of loops living on a 2D surface. The self-avoidance of loops is parametrized by a parameter κ. The κ=0 model can be mapped to one of the six-vertex models discussed by Baxter, and it does not have critical behavior. We have found that allowing for κ≠0 does not lead to critical behavior either. Finite-size effects are rather severe, and in order to understand these effects, a finite-volume calculation for non-self-avoiding loops is presented. This model, like its 3D counterpart, exhibits very slow dynamics, but a careful analysis of dynamical observables reveals nonglassy evolution (unlike its 3D counterpart). We find, also in this κ=0 case, the law that governs the long-time, low-temperature evolution of the system, through a dual description in terms of defects. A power law, rather than a logarithmic law, for the approach to equilibrium has been found.
Abstract:
We have included an effective description of squark interactions with charginos/neutralinos in the MadGraph MSSM model. This effective description includes the effective Yukawa couplings and an additional logarithmic term which encodes the supersymmetry breaking. We have performed an extensive test of our implementation, comparing the partial decay widths of squarks into charginos and neutralinos obtained by using the FeynArts/FormCalc programs with those from the new model file in MadGraph. We present results for the cross-section of top-squark production with subsequent decay into charginos and neutralinos.
Abstract:
The magnetic properties of BaFe12O19 and BaFe10.2Sn0.74Co0.66O19 single crystals have been investigated in the temperature range 1.8 to 320 K with a field varying from -5 to +5 T applied parallel and perpendicular to the c axis. Low-temperature magnetic relaxation measurements, with the relaxation ascribed to domain-wall motion, were performed between 1.8 and 15 K. The relaxation of the magnetization exhibits a linear dependence on logarithmic time. The magnetic viscosity extracted from the relaxation data decreases linearly as the temperature goes down, which may correspond to the thermal depinning of domain walls. Below 2.5 K, the viscosity begins to deviate from the linear dependence on temperature, tending to become temperature independent. The near temperature independence of the viscosity suggests the existence of quantum tunneling of antiferromagnetic domain walls in this temperature range.
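A linear dependence of the magnetization on logarithmic time is what makes the magnetic viscosity S well defined: it is simply the slope of M versus ln t. A minimal fitting sketch on synthetic data (every number below is invented for illustration):

```python
import numpy as np

# synthetic relaxation data following M(t) = M0 - S*ln(t) plus small noise
rng = np.random.default_rng(1)
t = np.logspace(0, 4, 50)                 # measurement times (arbitrary units)
S_true, M0 = 0.02, 1.0                    # hypothetical viscosity and initial M
M = M0 - S_true * np.log(t) + rng.normal(0.0, 1e-4, t.size)

# linear fit of M against ln(t): the magnitude of the slope is the viscosity S
slope, intercept = np.polyfit(np.log(t), M, 1)
S_est = -slope
```

Repeating this fit at each temperature yields the S(T) curve whose low-temperature flattening is taken as the signature of quantum tunneling of domain walls.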
Abstract:
We study the dynamics of the late stages of the Fréedericksz transition in which a periodic transient pattern decays to a final homogeneous state. A stability analysis of an unstable stationary pattern is presented, and equations for the evolution of the domain walls are obtained. Using results of previous theories, we analyze the effect that the specific dynamics of the problem, incorporating hydrodynamic couplings, has on the expected logarithmic domain growth law.
Abstract:
Lettuce greenhouse experiments were carried out from March to June 2011 in order to analyze how pesticides behave from the time of application until their intake via human consumption, taking into account the primary distribution of pesticides, field dissipation, and post-harvest processing. In addition, the experimental conditions were used to evaluate a new dynamic plant uptake model by comparing its results with the experimentally derived residues. One application of imidacloprid and two of azoxystrobin were conducted. For evaluating primary pesticide distribution, two approaches based on leaf area index and vegetation cover were used, and the results were compared with those obtained from a tracer test. A high influence of lettuce density, growth stage and type of sprayer on primary distribution was observed, showing that low densities or early growth stages implied high losses of pesticides to soil. Washed and unwashed samples of lettuce were taken and analyzed from application to harvest to evaluate the removal of pesticides by food processing. Results show that residues found on the Spanish preharvest interval days were in all cases below the officially set maximum residue limits, although it was observed that the time between application and harvest is as important for residues as the application amounts. An overall reduction of 40–60% of pesticide residues was obtained from washing the lettuce. Experimentally derived residues were compared with modeled residues and deviated by factors of 1.2 and 1.4 for imidacloprid and azoxystrobin, respectively, indicating good model predictions. Resulting human intake fractions range from ... for imidacloprid to ... for azoxystrobin.
Abstract:
We present a computer-simulation study of the effect of the distribution of energy barriers in an anisotropic magnetic system on the relaxation behavior of the magnetization. While the relaxation law for the magnetization can be approximated in all cases by a time-logarithmic decay, the law for the dependence of the magnetic viscosity on temperature is found to be quite sensitive to the shape of the distribution of barriers. The low-temperature region for the magnetic viscosity never extrapolates to a positive nonzero value. Moreover, our computer-simulation results agree reasonably well with some recent relaxation experiments on highly anisotropic single-domain particles.
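The link between a broad barrier distribution and time-logarithmic decay can be reproduced numerically: with Arrhenius relaxation times τ = τ0·exp(E/kT) and a flat distribution of barriers E, the superposition of exponential decays is nearly linear in ln t over many decades. All parameters below are arbitrary illustrative choices, not values from the simulations described above.

```python
import numpy as np

kT, tau0 = 1.0, 1.0
E = np.linspace(0.0, 30.0, 3000)      # flat distribution of energy barriers
tau = tau0 * np.exp(E / kT)           # Arrhenius relaxation time per barrier

t = np.logspace(1, 8, 40)             # observation times spanning 7 decades
# each barrier relaxes exponentially; the magnetization averages over barriers
M = np.array([np.exp(-ti / tau).mean() for ti in t])

# the decay is close to linear in ln(t); for a flat distribution the slope
# is approximately -kT/E_max, i.e. the viscosity grows linearly with T
slope, intercept = np.polyfit(np.log(t), M, 1)
```

Replacing the flat distribution with a peaked or truncated one changes the viscosity-versus-temperature law while leaving the decay approximately logarithmic, which is the sensitivity the abstract describes.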