934 results for "Precipitation probabilities"
Abstract:
High-level CASSCF/MRCI calculations with a quintuple-zeta quality basis set are reported, characterizing for the first time a manifold of electronic states of the CAs radical that has yet to be investigated experimentally. Along with the potential energy curves and the associated spectroscopic constants, the dipole moment functions for selected electronic states, as well as the transition dipole moment functions for the most relevant electronic transitions, are also presented. Estimates of radiative transition probabilities and lifetimes complement this investigation, which also assesses the effect of spin-orbit interaction on the A ²Π state. Whenever pertinent, comparisons of similarities and differences with the isovalent CN and CP radicals are made.
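For orientation only, radiative lifetimes are commonly obtained from transition probabilities through the standard sum over Einstein spontaneous-emission coefficients; the relations below are the textbook forms, not necessarily the exact working equations of this paper.

```latex
% Standard relation between radiative lifetime and Einstein A coefficients
% (textbook form; the paper's working equations may differ in detail).
\begin{equation}
  \tau_{v'} = \frac{1}{\sum_{v''} A_{v'v''}},
  \qquad
  A_{v'v''} \propto \tilde{\nu}_{v'v''}^{\,3}\,
  \bigl|\langle \chi_{v'} \,|\, R_e(r) \,|\, \chi_{v''} \rangle\bigr|^{2},
\end{equation}
```

where R_e(r) denotes the transition dipole moment function and χ the vibrational wavefunctions of the upper and lower electronic states.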
Abstract:
In the present work, a method for the determination of Ca, Fe, Ga, Na, Si and Zn in alumina (Al₂O₃) by inductively coupled plasma optical emission spectrometry (ICP OES) with axial viewing is presented. Preliminary studies revealed intense aluminum spectral interference over the majority of the elements and a reaction between aluminum and quartz to form aluminosilicate, drastically reducing the lifetime of the torch. To overcome these problems, alumina samples (250 mg) were dissolved with 5 mL HCl + 1.5 mL H₂SO₄ + 1.5 mL H₂O in a microwave oven. After complete dissolution the volume was made up to 20 mL and aluminum was precipitated as Al(OH)₃ with NH₃ (by bubbling NH₃ into the solution up to a pH of approximately 8, for 10 min). The use of internal standards (Fe/Be, Ga/Dy, Zn/In and Na/Sc) was essential to obtain precise and accurate results. The reliability of the proposed method was checked by analysis of an alumina certified reference material (Alumina Reduction Grade 699, NIST). The concentrations found (0.037% w w⁻¹ CaO, 0.013% w w⁻¹ Fe₂O₃, 0.012% w w⁻¹ Ga₂O₃, 0.49% w w⁻¹ Na₂O, 0.014% w w⁻¹ SiO₂ and 0.013% w w⁻¹ ZnO) presented no statistical differences from the certified values at a 95% confidence level. © 2011 Elsevier B.V. All rights reserved.
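Results of this kind are reported as oxides, so the measured element mass fractions are converted with standard gravimetric factors. The sketch below illustrates the arithmetic for Fe → Fe₂O₃ using standard molar masses; the measured value used is a hypothetical placeholder, not data from the paper.

```python
# Convert an element mass fraction to its oxide equivalent using a
# gravimetric factor (standard stoichiometry; illustrative values only).
M = {"Fe": 55.845, "O": 15.999}  # molar masses, g/mol

def oxide_fraction(element_pct: float, n_metal: int, n_oxygen: int, metal: str = "Fe") -> float:
    """Return the oxide mass fraction (% w/w) for a measured element mass fraction."""
    m_oxide = n_metal * M[metal] + n_oxygen * M["O"]
    factor = m_oxide / (n_metal * M[metal])      # e.g. Fe2O3 / 2 Fe ~ 1.4297
    return element_pct * factor

# Hypothetical measurement: 0.0091% w/w Fe corresponds to ~0.013% w/w Fe2O3.
print(round(oxide_fraction(0.0091, n_metal=2, n_oxygen=3), 3))
```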
Abstract:
Titanium dioxide with and without the addition of neodymium ions was prepared using sol-gel and precipitation methods. The resulting catalysts were characterized by thermal analysis, X-ray diffraction and BET specific surface area. Neodymium addition exerted a remarkable influence on the phase transition temperature and the surface properties of the TiO₂ matrix. TiO₂ samples synthesized by precipitation exhibit an exothermic event related to the amorphous-to-anatase phase transition at 510 °C, whereas in Nd-doped TiO₂ this transition occurred at 527 °C. A similar effect was observed in samples obtained using the sol-gel method. The photocatalytic reactivity of the catalysts was evaluated by the photodegradation of Remazol Black B (RB) under ultraviolet irradiation. Nd-doped TiO₂ showed enhanced photodegradation ability compared to undoped TiO₂ samples, independent of the method of synthesis. In samples obtained by sol-gel, RB decoloration was enhanced by 16% for TiO₂ doped with 0.5% neodymium ions. © 2010 Elsevier B.V. All rights reserved.
Abstract:
Genetic algorithms are commonly used to solve combinatorial optimization problems. The implementation evolves using genetic operators (crossover, mutation, selection, etc.). However, genetic algorithms, like some other methods, have parameters (population size, probabilities of crossover and mutation) which need to be tuned or chosen. In this paper, our project is based on an existing hybrid genetic algorithm working on the multiprocessor scheduling problem. We propose a hybrid Fuzzy-Genetic Algorithm (FLGA) approach to solve the multiprocessor scheduling problem. The algorithm consists of adding a fuzzy logic controller to control and dynamically tune different parameters (probabilities of crossover and mutation), in an attempt to improve the algorithm's performance. For this purpose, we design a fuzzy logic controller based on fuzzy rules to control the probabilities of crossover and mutation. Compared with the Standard Genetic Algorithm (SGA), the results clearly demonstrate that the FLGA method performs significantly better.
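To make the idea concrete, the sketch below shows one way a controller could adapt crossover and mutation probabilities from generation to generation. The objective function, adaptation rules and thresholds are placeholders of mine, not the FLGA rule base described in the paper.

```python
import random

# Minimal GA whose crossover/mutation probabilities are adapted each generation
# by a tiny rule-based controller (a stand-in for a fuzzy logic controller;
# the rules and thresholds here are purely illustrative).

def fitness(bits):                      # placeholder objective: count of ones
    return sum(bits)

def adapt(p_cross, p_mut, improvement):
    """Crude adaptation: if the best fitness stagnates, raise mutation to
    diversify; if it improves, favour crossover and reduce mutation."""
    if improvement <= 0:                # stagnation -> explore more
        p_mut = min(0.5, p_mut * 1.5)
        p_cross = max(0.5, p_cross - 0.05)
    else:                               # progress -> exploit more
        p_mut = max(0.01, p_mut * 0.8)
        p_cross = min(0.95, p_cross + 0.05)
    return p_cross, p_mut

def run(n=30, length=40, generations=50, p_cross=0.8, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(n)]
    best_prev = max(map(fitness, pop))
    for _ in range(generations):
        # tournament selection
        parents = [max(random.sample(pop, 3), key=fitness) for _ in range(n)]
        children = []
        for a, b in zip(parents[::2], parents[1::2]):
            if random.random() < p_cross:           # one-point crossover
                cut = random.randrange(1, length)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            children += [a[:], b[:]]
        for child in children:                      # bit-flip mutation
            for i in range(length):
                if random.random() < p_mut:
                    child[i] ^= 1
        pop = children
        best = max(map(fitness, pop))
        p_cross, p_mut = adapt(p_cross, p_mut, best - best_prev)
        best_prev = best
    return best_prev

print(run())
```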
Abstract:
Using a physically based model, the microstructural evolution of Nb-microalloyed steels during rolling in SSAB Tunnplåt's hot strip mill was modeled. The model describes the evolution of dislocation density, the creation and diffusion of vacancies, dynamic and static recovery through climb and glide, subgrain formation and growth, dynamic and static recrystallization, and grain growth. The model also describes the dissolution and precipitation of particles, and the impeding effect of solute drag and particles on grain growth and recrystallization is accounted for. During hot strip rolling of Nb steels, Nb in solid solution retards recrystallization due to solute drag, and at lower temperatures strain-induced precipitation of Nb(C,N) may occur, which effectively retards recrystallization. The flow stress behavior during hot rolling was calculated, with mean flow stress values obtained both from the model and from measured mill data. The model showed that solute drag has an essential effect on recrystallization during hot rolling of Nb steels.
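For context, physically based recrystallization models of this kind are often compared against Avrami (JMAK) kinetics, with particle pinning described through a Zener-type limiting grain size. The relations below are the classical textbook forms, not the specific constitutive equations of the model described in this abstract.

```latex
% Classical reference relations for recrystallization kinetics and particle
% pinning (not the specific equations of the physically based model).
\begin{align}
  X(t) &= 1 - \exp\!\left[-0.693\left(\frac{t}{t_{0.5}}\right)^{n}\right]
  && \text{(JMAK recrystallized fraction, } t_{0.5} = \text{time to 50\% recrystallization)} \\
  R_{\mathrm{lim}} &= \frac{4\,r}{3\,f_v}
  && \text{(Zener limiting grain radius for particles of radius } r \text{, volume fraction } f_v)
\end{align}
```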
Abstract:
Allvac 718 Plus and Haynes 282 are relatively new precipitation-hardening nickel-based superalloys with good high-temperature mechanical properties. In addition, the weldability of these superalloys allows easy fabrication. The combination of high-temperature capability and superior weldability is unmatched by other precipitation-hardening superalloys and is linked to the amount of the γ' hardening precipitates in the materials. Hence, it is these properties that make Allvac 718 Plus and Haynes 282 desirable for the manufacture of hot sections of aero-engine components. Studies show that cast products are less weldable than wrought products. Segregation of elements in the cast results in an inhomogeneous composition, which consequently diminishes weldability. Segregation during solidification of the cast products results in a dendritic microstructure with the segregating elements occupying interdendritic regions. These segregating elements are trapped in secondary phases present alongside the γ matrix. Studies show that in Allvac 718 Plus the segregating phase is Laves, while in Haynes 282 the segregating phase has not yet been fully determined. Thus, the present study investigated the effects of homogenization heat treatments in eliminating segregation in cast Allvac 718 Plus and Haynes 282. Paramount to the study was the effect of different homogenization temperatures and dwell times on the removal of the segregating phases. Experimental methods used to both qualify and quantify the segregating phases included SEM, EDX analysis, manual point counting and macro Vickers hardness tests. The main results show that there is a reduction in the segregating phases in both materials as homogenization proceeds, and hence a disappearance of the dendritic structure. In Allvac 718 Plus, plate-like structures are observed to be closely associated with the Laves phase at low temperatures and dwell times. In addition, Nb is found to segregate in the interdendritic areas. The expected trend of an increase in Laves phase as a result of the dissolution of the plate-like structures at the initial stage of homogenization is only detectable in a few cases. In Haynes 282, white and grey phases are clearly distinguished, and Mo is observed to segregate in interdendritic areas.
Abstract:
A crucial aspect of evidential reasoning in crime investigation involves comparing the support that evidence provides for alternative hypotheses. Recent work in forensic statistics has shown how Bayesian Networks (BNs) can be employed for this purpose. However, the specification of BNs requires conditional probability tables describing the uncertain processes under evaluation. When these processes are poorly understood, it is necessary to rely on subjective probabilities provided by experts. Accurate probabilities of this type are normally hard to acquire from experts. Recent work in qualitative reasoning has developed methods to perform probabilistic reasoning using coarser representations. However, the latter types of approaches are too imprecise to compare the likelihood of alternative hypotheses. This paper examines this shortcoming of the qualitative approaches when applied to the aforementioned problem, and identifies and integrates techniques to refine them.
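As a minimal illustration of the underlying computation (a two-hypothesis toy example of mine, not the paper's networks or numbers), the support that a piece of evidence lends to competing hypotheses can be compared through a likelihood ratio built from the conditional probability table entries:

```python
# Toy comparison of two hypotheses H1, H2 given evidence E, using Bayes' rule.
# Priors and conditional probabilities are illustrative placeholders standing
# in for the expert-elicited CPT entries discussed in the abstract.
prior = {"H1": 0.5, "H2": 0.5}
p_evidence_given = {"H1": 0.8, "H2": 0.2}   # P(E | H)

joint = {h: prior[h] * p_evidence_given[h] for h in prior}
p_e = sum(joint.values())                   # P(E), by total probability
posterior = {h: joint[h] / p_e for h in joint}

likelihood_ratio = p_evidence_given["H1"] / p_evidence_given["H2"]
print(posterior)          # {'H1': 0.8, 'H2': 0.2}
print(likelihood_ratio)   # 4.0 -> the evidence supports H1 four times as strongly
```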
Abstract:
An operational complexity model (OCM) is proposed to enable the complexity of both the cognitive and the computational components of a process to be determined. From the complexity of formation of a set of traces via a specified route a measure of the probability of that route can be determined. By determining the complexities of alternative routes leading to the formation of the same set of traces, the odds ratio indicating the relative plausibility of the alternative routes can be found. An illustrative application to a BitTorrent piracy case is presented, and the results obtained suggest that the OCM is capable of providing a realistic estimate of the odds ratio for two competing hypotheses. It is also demonstrated that the OCM can be straightforwardly refined to encompass a variety of circumstances.
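The abstract does not state how the OCM maps complexity onto probability. Purely as an illustration of how an odds ratio could be formed once such a mapping is chosen, the sketch below assumes a probability that decays exponentially with operational complexity; that assumption, and the complexity values, are mine, not the OCM's.

```python
import math

# Illustrative only: turn route complexities into an odds ratio under an
# ASSUMED exponential complexity-to-probability mapping (the OCM's actual
# mapping is not specified in the abstract).
def route_probability(complexity: float, scale: float = 1.0) -> float:
    return math.exp(-complexity / scale)    # assumed decay law

c_route_a = 12.0   # hypothetical complexity of one explanation of the traces
c_route_b = 18.0   # hypothetical complexity of the competing explanation

odds_ratio = route_probability(c_route_a) / route_probability(c_route_b)
print(f"odds ratio (route A vs route B): {odds_ratio:.1f}")   # exp(6) ~ 403.4
```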
Abstract:
The regime of environmental flows (EF) must be included as a term of environmental demand in the management of water resources. Even though there are numerous methods for the computation of EF, the criteria applied at different steps of the calculation process are quite subjective, whereas the results are fixed values that must be met by water planners. This study presents a user-friendly tool for assessing the probability of compliance of a certain EF scenario with the natural regime in a semiarid area in southern Spain. A total of 250 replications of a 25-yr period of different hydrological variables (rainfall, minimum and maximum flows, ...) were obtained at the study site from the combination of the Monte Carlo technique and local hydrological relationships. Several assumptions are made, such as the year-to-year independence of annual rainfall and the variability of occurrence of the meteorological agents, with precipitation as the main source of uncertainty. Inputs to the tool are easily selected from a first menu and comprise measured rainfall data, EF values and the hydrological relationships for at least a 20-yr period. The outputs are the probabilities of compliance of the different components of the EF for the study period. From this, local optimization can be applied to establish EF components with a certain level of compliance in the study period. Different options for graphic output and analysis of results are included, in terms of graphs and tables in several formats. This methodology turned out to be a useful tool for the implementation of an uncertainty analysis within the scope of environmental flows in water management and allowed the simulation of the impacts of several water resource development scenarios at the study site.
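The probability-of-compliance idea reduces to counting, across Monte Carlo replications, how often the simulated flows meet the EF value. The sketch below is a stripped-down version under assumptions of mine (lognormal annual minimum flows, independent years, an arbitrary EF value), not the tool's site-specific hydrological relationships.

```python
import numpy as np

# Minimal Monte Carlo estimate of the probability that an environmental-flow
# (EF) requirement is met. The flow model and parameters are assumptions of
# mine, not the tool's local hydrological relationships.
rng = np.random.default_rng(42)

n_replications = 250          # replications of the study period (as in the abstract)
n_years = 25                  # length of each replication (years)
ef_requirement = 0.3          # hypothetical EF component (m3/s)

# Hypothetical natural regime: lognormal annual minimum flows, independent years.
min_flows = rng.lognormal(mean=0.0, sigma=0.5, size=(n_replications, n_years))

# A replication "complies" if the EF value is met in every year of the period.
complies = (min_flows >= ef_requirement).all(axis=1)
print(f"P(compliance over the whole period) ~ {complies.mean():.2f}")

# Per-year probability of meeting the EF value:
print(f"P(EF met in a given year) ~ {(min_flows >= ef_requirement).mean():.2f}")
```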
Abstract:
Jakarta is vulnerable to flooding, mainly caused by prolonged and heavy rainfall, and robust hydrological modeling is therefore called for. Good-quality spatial precipitation data are desired so that a good hydrological model can be achieved. Two types of rainfall sources are available: satellite and gauge station observations. At-site rainfall is considered to be a reliable and accurate source of rainfall; however, the limited number of stations makes spatial interpolation much less appealing. On the other hand, gridded rainfall nowadays has high spatial resolution and improved accuracy, but is still relatively less accurate than its counterpart. To achieve a better precipitation data set, this study proposes the cokriging method, a blending algorithm, to yield a blended satellite-gauge gridded rainfall at approximately 10-km resolution. The Global Satellite Mapping of Precipitation (GSMaP, 0.1°×0.1°) product and daily rainfall observations from gauge stations are used. The blended product is compared with the satellite data by a cross-validation method. The resulting blended product is then used to re-calibrate the hydrological model. Several scenarios are simulated by the hydrological models calibrated with gauge observations alone and with the blended product. The performance of the two calibrated hydrological models is then assessed and compared based on simulated and observed runoff.
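As a simplified stand-in for the validation step only (the cokriging itself is not reproduced), one common check is to take the grid value of each product co-located with every gauge and compare the errors. The synthetic data and the co-location below are placeholders of mine.

```python
import numpy as np

# Sketch of gauge-based evaluation of two gridded rainfall products
# (e.g. raw satellite vs blended). Data and co-location are synthetic
# placeholders; the paper's cokriging step is not reproduced here.
rng = np.random.default_rng(0)

n_gauges = 20
gauge_rain = rng.gamma(shape=2.0, scale=10.0, size=n_gauges)      # daily mm, synthetic

# Co-located grid-cell values at the gauge locations, constructed here as
# "truth plus noise", with the blended product given the smaller error.
satellite = gauge_rain + rng.normal(0.0, 8.0, n_gauges)
blended   = gauge_rain + rng.normal(0.0, 4.0, n_gauges)

def rmse(est, obs):
    return float(np.sqrt(np.mean((est - obs) ** 2)))

print(f"RMSE satellite vs gauges: {rmse(satellite, gauge_rain):.1f} mm")
print(f"RMSE blended   vs gauges: {rmse(blended, gauge_rain):.1f} mm")
```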
Abstract:
We apply the concept of exchangeable random variables to the case of non-additive probability distributions exhibiting uncertainty aversion, and in the class generated by a convex core (convex non-additive probabilities with a convex core). We are able to prove two versions of the law of large numbers (de Finetti's theorems). By making use of two definitions of independence we prove two versions of the strong law of large numbers. It turns out that we cannot assure the convergence of the sample averages to a constant. We then model the case where there is a "true" probability distribution behind the successive realizations of the uncertain random variable. In this case convergence occurs. This result is important because it renders true the intuition that it is possible "to learn" the "true" additive distribution behind an uncertain event if one repeatedly observes it (a sufficiently large number of times). We also provide a conjecture regarding the "learning" (or updating) process above, and prove a partial result for the case of the Dempster-Shafer updating rule and binomial trials.
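The "learning the true distribution" intuition can be illustrated in the classical additive setting, which is all the short simulation below covers; the paper's non-additive, uncertainty-averse case and the Dempster-Shafer updating rule are not reproduced here. Sample averages of repeated binomial trials converge to the true success probability.

```python
import random

# Classical illustration only: with a "true" additive distribution behind the
# trials, sample averages converge to the true success probability (strong law
# of large numbers). The non-additive setting of the paper is not captured here.
random.seed(1)
p_true = 0.3

def sample_average(n_trials: int) -> float:
    return sum(random.random() < p_true for _ in range(n_trials)) / n_trials

for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, round(sample_average(n), 3))
# The printed averages approach p_true = 0.3 as n grows.
```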
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)