947 results for Polytopic Uncertainty


Relevance:

10.00%

Publisher:

Abstract:

This work uses crystal plasticity finite element simulations to elucidate the role of elastoplastic anisotropy in instrumented indentation P-h(s) curve measurements in face-centered cubic (fcc) crystals. It is shown that although the experimental fluctuations in the loading stage of the P-h(s) curves can be attributed to anisotropy, the variability in the unloading stage of the experiments is much greater than that resulting from anisotropy alone. Moreover, it is found that the conventional procedure used to evaluate the contact variables ruling the unloading P-h(s) curve introduces an uncertainty that approximates the more fundamental influence of anisotropy. In view of these results, a robust procedure is proposed that uses contact area measurements in addition to the P-h(s) curves to extract homogenized J(2)-plasticity-equivalent mechanical properties from single crystals.
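A standard way to evaluate the contact variables from an unloading curve (not necessarily the exact procedure criticized above) is the Oliver-Pharr analysis, which converts the peak load, the unloading stiffness, and the contact area into a hardness and a reduced modulus. A minimal sketch with illustrative, made-up input values:

```python
import math

def oliver_pharr(p_max, s, a_c):
    """Oliver-Pharr relations for instrumented indentation.

    p_max : peak load (N)
    s     : unloading stiffness dP/dh at peak load (N/m)
    a_c   : projected contact area (m^2)
    """
    hardness = p_max / a_c                                    # H = P_max / A_c
    e_r = (math.sqrt(math.pi) / 2.0) * s / math.sqrt(a_c)     # reduced modulus
    return hardness, e_r

# Illustrative (invented) values for a sharp indenter on a metal:
H, E_r = oliver_pharr(p_max=10e-3, s=2.0e5, a_c=1.0e-12)
print(H / 1e9, "GPa hardness")
print(E_r / 1e9, "GPa reduced modulus")
```

With an independently measured contact area, as the robust procedure above advocates, the same relations sidestep the error introduced by inferring the contact area from the unloading curve alone.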

Relevance:

10.00%

Publisher:

Abstract:

The objective of this manuscript is to discuss the existing barriers to the dissemination of medical guidelines and to present strategies that facilitate the adoption of the recommendations into clinical practice. The literature shows that it usually takes several years until new scientific evidence is adopted in current practice, even when there is an obvious impact on patients' morbidity and mortality. There are examples where more than thirty years elapsed between the first case reports on the use of an effective therapy and its routine utilization; that is the case of fibrinolysis for the treatment of acute myocardial infarction. Some of the main barriers to the implementation of new recommendations are: lack of knowledge of a new guideline, personal resistance to change, uncertainty about the efficacy of the proposed recommendation, fear of potential side effects, difficulty in remembering the recommendations, absence of institutional policies reinforcing the recommendation, and even economic restraints. To overcome these barriers, a strategy involving a program with multiple tools is always best. It must include the implementation of easy-to-use algorithms, continuing medical education materials and lectures, electronic or paper alerts, tools to facilitate evaluation and prescription, and periodic audits to show results to the practitioners involved in the process. It is also fundamental that the medical societies concerned with the specific medical issue support the program for its scientific and ethical soundness. The creation of multidisciplinary committees in each institution and the inclusion of opinion leaders with proactive and lasting attitudes are the key points for the program's success. In this manuscript we use as an example the implementation of a guideline for venous thromboembolism prophylaxis, but the concepts described here can easily be applied to any other guideline. Therefore, these concepts could be very useful for institutions and services that aim at quality improvement of patient care. Changes in current medical practice recommended by guidelines may take some time; however, with broader participation of opinion leaders and the use of the several tools listed here, they surely have a greater probability of reaching the main objectives: improvement in the medical care provided and in patient safety.

Relevance:

10.00%

Publisher:

Abstract:

This work describes the seasonal and diurnal variations of downward longwave atmospheric irradiance (LW) at the surface in Sao Paulo, Brazil, using 5-min-averaged values of LW, air temperature, relative humidity, and solar radiation observed continuously and simultaneously from 1997 to 2006 on a micrometeorological platform located at the top of a 4-story building. An objective procedure, including 2-step filtering and dome emission effect correction, was used to evaluate the quality of the 9-yr-long LW dataset. The comparison between LW values observed and those yielded by the Surface Radiation Budget project shows spatial and temporal agreement, indicating that monthly and annual average values of LW observed at one point of Sao Paulo can be used as representative of the entire metropolitan region of Sao Paulo. The maximum monthly averaged value of the LW is observed during summer (389 +/- 14 W m(-2); January), and the minimum is observed during winter (332 +/- 12 W m(-2); July). The effective emissivity follows the LW, with a maximum in summer (0.907 +/- 0.032; January) and a minimum in winter (0.818 +/- 0.029; June). The mean cloud effect, identified objectively by comparing the monthly averaged values of the LW during clear-sky days and all-sky conditions, intensified the monthly average LW by about 32.0 +/- 3.5 W m(-2) and the atmospheric effective emissivity by about 0.088 +/- 0.024. In August, the driest month of the year in Sao Paulo, the diurnal evolution of the LW shows a minimum (325 +/- 11 W m(-2)) at 0900 LT and a maximum (345 +/- 12 W m(-2)) at 1800 LT, which lags (by 4 h) the maximum of the diurnal variation of the screen temperature. The diurnal evolution of effective emissivity shows a minimum (0.781 +/- 0.027) during daytime and a maximum (0.842 +/- 0.030) during nighttime. The difference in effective emissivity between all-sky conditions and clear-sky days remains relatively constant through the day (7% +/- 1%), indicating that clouds do not change the emissivity diurnal pattern. The relationship between effective emissivity and screen air temperature, and between effective emissivity and water vapor, is complex. During the night, when the planetary boundary layer is shallower, the effective emissivity can be estimated from screen parameters. During the day, the relationship between effective emissivity and screen parameters varies from place to place and depends on planetary boundary layer processes. Because the empirical expressions do not contain enough information about the diurnal variation of the vertical stratification of air temperature and moisture in Sao Paulo, they are likely to fail in reproducing the diurnal variation of the surface emissivity. The most accurate way to estimate the LW for clear-sky conditions in Sao Paulo is to use an expression derived from a purely empirical approach.
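The effective emissivity quoted above is, by definition, the ratio of the measured LW to the blackbody emission at screen-level air temperature. A one-line sketch (the 295 K input is an illustrative value, not the paper's raw data):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def effective_emissivity(lw_down, t_screen_k):
    """Effective atmospheric emissivity from downward longwave
    irradiance (W m^-2) and screen-level air temperature (K)."""
    return lw_down / (SIGMA * t_screen_k ** 4)

# Illustrative: a ~295 K summer day with LW = 389 W m^-2 gives an
# emissivity close to the January maximum reported above.
print(round(effective_emissivity(389.0, 295.0), 3))
```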

Relevance:

10.00%

Publisher:

Abstract:

Aims. A model-independent reconstruction of the cosmic expansion rate is essential to a robust analysis of cosmological observations. Our goal is to demonstrate that current data are able to provide reasonable constraints on the behavior of the Hubble parameter with redshift, independently of any cosmological model or underlying gravity theory. Methods. Using type Ia supernova data, we show that it is possible to analytically calculate the Fisher matrix components in a Hubble parameter analysis without assumptions about the energy content of the Universe. We used a principal component analysis to reconstruct the Hubble parameter as a linear combination of the Fisher matrix eigenvectors (principal components). To suppress the bias introduced by the high-redshift behavior of the components, we considered the value of the Hubble parameter at high redshift as a free parameter. We first tested our procedure using a mock sample of type Ia supernova observations; we then applied it to the real data compiled by the Sloan Digital Sky Survey (SDSS) group. Results. In the mock sample analysis, we demonstrate that it is possible to drastically suppress the bias introduced by the high-redshift behavior of the principal components. Applying our procedure to the real data, we show that it allows us to determine the behavior of the Hubble parameter with reasonable uncertainty, without introducing any ad hoc parameterizations. Beyond that, our reconstruction agrees with completely independent measurements of the Hubble parameter obtained from red-envelope galaxies.
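The reconstruction idea can be illustrated with a toy example (hypothetical numbers, not the SDSS data): treat H(z) in a few redshift bins as the parameters, diagonalize a Fisher matrix, and keep only the best-constrained principal components.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: H(z) in 5 redshift bins, with a mock Fisher matrix F
# standing in for the one computed analytically from supernova data.
h_true = np.array([70.0, 80.0, 95.0, 115.0, 140.0])  # km/s/Mpc, illustrative
A = rng.normal(size=(20, 5))
F = A.T @ A                                          # symmetric positive definite

# Principal components = eigenvectors of F, ordered from best- to
# worst-constrained direction in parameter space.
vals, vecs = np.linalg.eigh(F)
order = np.argsort(vals)[::-1]
vecs = vecs[:, order]

# Reconstruct H(z) keeping only the k best-constrained components,
# filtering out the noisiest directions.
k = 3
coeffs = vecs[:, :k].T @ h_true
h_rec = vecs[:, :k] @ coeffs
print(h_rec)
```

Keeping all components recovers the input exactly; truncation trades a small bias (the issue the authors control via the high-redshift free parameter) for reduced variance.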

Relevance:

10.00%

Publisher:

Abstract:

Background: Bayesian mixing models have allowed for the inclusion of uncertainty and prior information in the analysis of trophic interactions using stable isotopes. Formulating prior distributions is relatively straightforward when incorporating dietary data. However, the use of data that are related, but not directly proportional, to diet (such as prey availability data) is often problematic because such information is not necessarily predictive of diet, and the information required to build a reliable prior distribution for all prey species is often unavailable. Omitting prey availability data impacts the estimation of a predator's diet and introduces the strong assumption of consumer ultrageneralism (where all prey are consumed in equal proportions), particularly when multiple prey have similar isotope values. Methodology: We develop a procedure to incorporate prey availability data into Bayesian mixing models conditional on the similarity of isotope values between two prey. If a pair of prey have similar isotope values (resulting in highly uncertain mixing model results), our model increases the weight of availability data in estimating the contribution of prey to a predator's diet. We test the utility of this method in an intertidal community against independently measured feeding rates. Conclusions: Our results indicate that our weighting procedure increases the accuracy by which consumer diets can be inferred in situations where multiple prey have similar isotope values. This suggests that the exchange of formalism for predictive power is merited, particularly when the relationship between prey availability and a predator's diet cannot be assumed for all species in a system.
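The conditional weighting idea can be sketched as follows (an illustrative scheme with a hypothetical `blended_prior` helper, not the paper's exact model): the closer the closest pair of prey isotope values, the more weight the availability data receive relative to an uninformative uniform prior.

```python
import numpy as np

def blended_prior(prey_iso, availability, scale=1.0):
    """Blend a uniform prior with prey-availability data, weighting
    availability more when prey isotope values are similar.

    prey_iso     : 1-D array of prey delta values (e.g. d13C)
    availability : relative prey availability, summing to 1
    scale        : hypothetical tuning constant for 'similarity'
    """
    prey_iso = np.asarray(prey_iso, float)
    availability = np.asarray(availability, float)
    n = len(prey_iso)
    # Similarity weight in (0, 1]: 1 when the closest prey pair coincide.
    gaps = np.abs(prey_iso[:, None] - prey_iso[None, :])
    np.fill_diagonal(gaps, np.inf)
    w = np.exp(-gaps.min() / scale)
    uniform = np.full(n, 1.0 / n)
    return (1 - w) * uniform + w * availability

# Two prey with nearly identical d13C: the prior leans on availability.
p = blended_prior([-18.0, -18.1, -12.0], [0.6, 0.3, 0.1])
print(p)
```

When all prey are isotopically distinct the mixing model alone is informative and the prior stays close to uniform; when two prey overlap, availability breaks the tie, which is the behavior the Conclusions describe.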

Relevance:

10.00%

Publisher:

Abstract:

In many real situations, randomness is considered to be uncertainty or even confusion that impedes human beings from making correct decisions. Here we study the combined role of randomness and determinism in particle dynamics for complex network community detection. In the proposed model, particles walk in the network and compete with each other in such a way that each of them tries to possess as many nodes as possible. Moreover, we introduce a rule to adjust the level of randomness of particle walking in the network, and we have found that a portion of randomness can largely improve the community detection rate. Computer simulations show that the model has good community detection performance and at the same time presents low computational complexity. (C) 2008 American Institute of Physics.
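The competition mechanism can be caricatured in a few lines (a toy sketch of the general idea with made-up parameters, not the authors' exact dynamics): each particle mixes deterministic moves toward nodes it already dominates with random exploratory moves, and the mixing probability is the adjustable randomness level.

```python
import random

def particle_competition(adj, n_particles, steps=2000, p_det=0.6, seed=1):
    """Toy particle-competition labeling of graph nodes.

    adj   : adjacency dict {node: [neighbors]}
    p_det : probability of a deterministic (defensive) move; 1 - p_det
            is the randomness level discussed in the abstract.
    """
    rng = random.Random(seed)
    nodes = list(adj)
    counts = {v: [0] * n_particles for v in nodes}  # visits per particle
    pos = rng.sample(nodes, n_particles)            # random start nodes
    for _ in range(steps):
        for k in range(n_particles):
            nbrs = adj[pos[k]]
            if rng.random() < p_det:
                # deterministic: prefer the neighbor this particle dominates
                nxt = max(nbrs, key=lambda v: counts[v][k])
            else:
                nxt = rng.choice(nbrs)              # random exploration
            counts[nxt][k] += 1
            pos[k] = nxt
    # Each node is owned by the particle that visited it most.
    return {v: max(range(n_particles), key=lambda k: counts[v][k]) for v in nodes}

# Two 4-cliques joined by a single edge: a natural two-community graph.
adj = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2, 4],
       4: [3, 5, 6, 7], 5: [4, 6, 7], 6: [4, 5, 7], 7: [4, 5, 6]}
print(particle_competition(adj, n_particles=2))
```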

Relevance:

10.00%

Publisher:

Abstract:

The double helicity asymmetry in neutral pion production for p(T) = 1 to 12 GeV/c was measured with the PHENIX experiment to access the gluon-spin contribution, Delta G, to the proton spin. Measured asymmetries are consistent with zero, and at a theory scale of mu(2) = 4 GeV(2) a next-to-leading-order QCD analysis gives Delta G([0.02,0.3]) = 0.2, with a constraint of -0.7 < Delta G([0.02,0.3]) < 0.5 at Delta chi(2) = 9 (~3 sigma) for the sampled gluon momentum fraction (x) range, 0.02 to 0.3. The results are obtained using predictions for the measured asymmetries generated from four representative fits to polarized deep inelastic scattering data. We also consider the dependence of the Delta G constraint on the choice of the theoretical scale, a dominant uncertainty in these predictions.

Relevance:

10.00%

Publisher:

Abstract:

We present the results of an elliptic flow, v(2), analysis of Cu + Cu collisions recorded with the solenoidal tracker detector (STAR) at the BNL Relativistic Heavy Ion Collider at root s(NN) = 62.4 and 200 GeV. Elliptic flow as a function of transverse momentum, v(2)(p(T)), is reported for different collision centralities for charged hadrons h(+/-) and strangeness-containing hadrons K(S)(0), Lambda, Xi, and phi in the midrapidity region |eta| < 1.0. A significant reduction in the systematic uncertainty of the measurement due to nonflow effects has been achieved by correlating particles at midrapidity, |eta| < 1.0, with those at forward rapidity, 2.5 < |eta| < 4.0. We also present azimuthal correlations in p + p collisions at root s = 200 GeV to help in estimating nonflow effects. To study the system-size dependence of elliptic flow, we present a detailed comparison with previously published results from Au + Au collisions at root s(NN) = 200 GeV. We observe that v(2)(p(T)) of strange hadrons has similar scaling properties as were first observed in Au + Au collisions, that is, (i) at low transverse momenta, p(T) < 2 GeV/c, v(2) scales with transverse kinetic energy, m(T) - m, and (ii) at intermediate p(T), 2 < p(T) < 4 GeV/c, it scales with the number of constituent quarks, n(q). We have found that ideal hydrodynamic calculations fail to reproduce the centrality dependence of v(2)(p(T)) for K(S)(0) and Lambda. Eccentricity-scaled v(2) values, v(2)/epsilon, are larger in more central collisions, suggesting that stronger collective flow develops in more central collisions. The comparison with Au + Au collisions, which reach higher densities, shows that v(2)/epsilon depends on the system size, that is, on the number of participants N(part). This indicates that the ideal hydrodynamic limit is not reached in Cu + Cu collisions, presumably because the assumption of thermalization is not attained.
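The two scaling variables named above (transverse kinetic energy and constituent-quark number) are straightforward to compute; a small sketch, where the K(S)(0) mass is the standard value but the v(2) input is made up for illustration:

```python
def ncq_scaled(v2, pt, mass, n_q):
    """Constituent-quark scaling: map (pt, v2) for a hadron of given
    mass and quark number n_q to ((m_T - m)/n_q, v2/n_q), the variables
    in which different species are observed to collapse onto one curve."""
    m_t = (pt ** 2 + mass ** 2) ** 0.5   # transverse mass, GeV/c^2
    return (m_t - mass) / n_q, v2 / n_q

# Illustrative point for a K_S^0 (mass 0.4977 GeV/c^2, n_q = 2):
ke_q, v2_q = ncq_scaled(v2=0.12, pt=2.0, mass=0.4977, n_q=2)
print(ke_q, v2_q)
```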

Relevance:

10.00%

Publisher:

Abstract:

High precision measurements of the differential cross sections for pi(0) photoproduction at forward angles for two nuclei, (12)C and (208)Pb, have been performed for incident photon energies of 4.9-5.5 GeV to extract the pi(0) -> gamma gamma decay width. The experiment was done at Jefferson Lab using the Hall B photon tagger and a high-resolution multichannel calorimeter. The pi(0) -> gamma gamma decay width was extracted by fitting the measured cross sections using recently updated theoretical models for the process. The resulting value for the decay width is Gamma(pi(0) -> gamma gamma) = 7.82 +/- 0.14(stat) +/- 0.17(syst) eV. With the 2.8% total uncertainty, this result is a factor of 2.5 more precise than the current Particle Data Group average of this fundamental quantity, and it is consistent with current theoretical predictions.
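The 2.8% total uncertainty quoted above follows from combining the statistical and systematic contributions in quadrature; a short check using the numbers in the abstract:

```python
import math

def total_relative_uncertainty(value, stat, syst):
    """Combine independent statistical and systematic uncertainties in
    quadrature and express the total relative to the central value."""
    total = math.hypot(stat, syst)   # sqrt(stat**2 + syst**2)
    return total, total / value

# Numbers quoted for Gamma(pi0 -> gamma gamma): 7.82 +/- 0.14 +/- 0.17 eV
total, rel = total_relative_uncertainty(7.82, 0.14, 0.17)
print(f"{total:.2f} eV total, {100 * rel:.1f}% relative")  # 0.22 eV total, 2.8% relative
```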

Relevance:

10.00%

Publisher:

Abstract:

The uncertainty about the possible involvement of a luciferase in fungal bioluminescence has not only hindered the understanding of its biochemistry but also delayed the characterization of its constituents. The present report describes how in vitro light emission can be obtained enzymatically from the cold and hot extract assay using different species of fungi, which also indicates a common mechanism for all these organisms. Kinetic data suggest a consecutive two-step enzymatic mechanism and corroborate the enzymatic proposal of Airth and Foerster. Finally, the overlap of the light emission spectra from fungal bioluminescence and from the in vitro assay confirms that this reaction is the same one that occurs in live fungi.
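A consecutive two-step mechanism A -> B -> C predicts an intermediate, and hence a light output, that rises and then decays. A first-order sketch (the rate constants are made up for illustration, not fitted to the paper's data):

```python
import math

def intermediate(t, a0=1.0, k1=0.5, k2=0.1):
    """Concentration of the intermediate B in A -> B -> C with
    first-order steps; in vitro light output would track the second
    step, i.e. be proportional to k2 * B(t)."""
    return a0 * k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))

# B(t) starts at zero, peaks, then decays -- the qualitative shape a
# consecutive mechanism implies for the emission kinetics.
print([round(intermediate(t), 3) for t in (0.0, 1.0, 4.0, 20.0)])
```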

Relevance:

10.00%

Publisher:

Abstract:

Live aboveground biomass (AGB) is an important source of uncertainty in the carbon balance of tropical regions, in part due to the scarcity of reliable estimates of live AGB and its variation across landscapes and forest types. Studies of forest structure and biomass stocks of Neotropical forests are biased toward Amazonian and Central American sites. In particular, standardized estimates of aboveground biomass stocks for the Brazilian Atlantic forest are rarely available. Although the role of environmental variables that control the distribution and abundance of biomass in tropical lowland forests has been the subject of considerable research, the effect of short, steep elevational gradients on tropical forest structure and carbon dynamics is not well known. In order to evaluate forest structure and live AGB variation along an elevational gradient (0-1100 m a.s.l.) of coastal Atlantic Forest in SE Brazil, we carried out a standard census of woody stems >= 4.8 cm dbh in 13 1-ha permanent plots established at four different sites in 2006-2007. Live AGB ranged from 166.3 Mg ha(-1) (bootstrapped 95% CI: 144.4, 187.0) to 283.2 Mg ha(-1) (bootstrapped 95% CI: 253.0, 325.2) and increased with elevation. We found that local-scale topographic variation associated with elevation influences the distribution of trees >50 cm dbh and total live AGB. Across all elevations, we found more stems (64-75%) with limited crown illumination, but the largest proportion of the live AGB (68-85%) was stored in stems with highly illuminated or fully exposed crowns. Topography, disturbance, and associated changes in light and nutrient supply probably control biomass distribution along this short but representative elevational gradient. Our findings also showed that intact Atlantic forest sites store substantial amounts of carbon aboveground. The live tree AGB of the stands was found to be lower than in Central Amazonian forests, but within the range of Neotropical forests, in particular when compared to Central American forests. Our comparative data suggest that differences in live tree AGB among Neotropical forests are probably related to the heterogeneous distribution of large and medium-sized diameter trees within forests and to how the live biomass is partitioned among those size classes, in accordance with general trends found by previous studies. In addition, the elevational variation in live AGB stocks suggests large spatial variability across coastal Atlantic forests in Brazil, clearly indicating that it is important to consider regional differences in biomass stocks when evaluating the role of this threatened tropical biome in the global carbon cycle. (C) 2010 Elsevier B.V. All rights reserved.
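Confidence intervals of the kind quoted above can be obtained with a percentile bootstrap over plot-level values; a sketch with hypothetical AGB numbers (not the census data):

```python
import numpy as np

def bootstrap_ci(x, n_boot=10000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the mean."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float)
    # Resample with replacement and take the mean of each resample.
    means = rng.choice(x, size=(n_boot, x.size), replace=True).mean(axis=1)
    lo, hi = np.quantile(means, [alpha / 2, 1 - alpha / 2])
    return x.mean(), lo, hi

# Hypothetical per-plot AGB values (Mg/ha), invented for illustration:
agb = [150.2, 172.8, 160.5, 190.1, 148.7, 175.3, 168.0, 155.9]
mean, lo, hi = bootstrap_ci(agb)
print(f"mean {mean:.1f} Mg/ha, 95% CI [{lo:.1f}, {hi:.1f}]")
```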

Relevance:

10.00%

Publisher:

Abstract:

Cadmium is known to be a toxic agent that accumulates in living organisms and presents a high toxicity potential over a lifetime. Efforts toward the development of methods for microanalysis of environmental samples, including the determination of this element by graphite furnace atomic absorption spectrometry (GFAAS), inductively coupled plasma optical emission spectrometry (ICP OES), and inductively coupled plasma mass spectrometry (ICP-MS) techniques, have been increasing. Laser-induced breakdown spectroscopy (LIBS) is an emerging technique dedicated to microanalysis, and there is a lack of information dealing with the determination of cadmium. The aim of this work is to demonstrate the feasibility of LIBS for cadmium detection in soils. The experimental setup was designed using a Q-switched laser (Nd:YAG, 10 Hz, lambda = 1064 nm), and the emission signals were collimated by lenses into an optical fiber coupled to a high-resolution intensified charge-coupled device (ICCD)-echelle spectrometer. Samples were cryogenically ground and thereafter pelletized before LIBS analysis. Best results were achieved by exploring a test portion (i.e., sampling spots) with a larger surface area, which contributes to diminishing the uncertainty due to element-specific microheterogeneity. Calibration curves for cadmium determination were obtained using certified reference materials. The metrological figures of merit indicate that LIBS can be recommended for screening of cadmium contamination in soils. (C) 2009 Elsevier B.V. All rights reserved.
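Calibration against certified reference materials reduces, in the simplest case, to a linear fit of emission intensity versus certified concentration, inverted to screen unknowns. A sketch with invented numbers (real curves depend on the soil matrix and the Cd emission line chosen):

```python
import numpy as np

# Hypothetical calibration points: certified Cd concentration (mg/kg)
# vs background-corrected Cd emission intensity (arbitrary units).
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
intensity = np.array([120.0, 235.0, 480.0, 1190.0, 2400.0])

# Least-squares linear calibration: intensity = slope * conc + intercept
slope, intercept = np.polyfit(conc, intensity, 1)

def predict_concentration(signal):
    """Invert the linear calibration to screen an unknown soil pellet."""
    return (signal - intercept) / slope

print(round(predict_concentration(800.0), 2))
```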

Relevance:

10.00%

Publisher:

Abstract:

The metrological principles of neutron activation analysis are discussed. It has been demonstrated that this method can provide elemental amount-of-substance values fully traceable to the SI. The method has been used by several laboratories worldwide in a number of CCQM key comparisons (interlaboratory comparison tests at the highest metrological level), supplying results equivalent to values from other methods for elemental or isotopic analysis in complex samples, without the need to perform chemical destruction and dissolution of these samples. The CCQM therefore accepted in April 2007 the claim that neutron activation analysis should have a status similar to that of the methods originally listed by the CCQM as 'primary methods of measurement'. Analytical characteristics and the scope of application are given.

Relevance:

10.00%

Publisher:

Abstract:

The selenium detection limits of INAA are normally above selenium concentrations in most biological materials. Gamma-gamma coincidence methodology can be used to improve the detection limits and uncertainties in the determination of selenium. Here, some edible parts of plants were measured using an HPGe detector equipped with a NaI(Tl) active shield, producing spectra in both normal and coincidence modes. The results show a reduction of the selenium detection limits by a factor of 2 to 3 and an improvement in the uncertainty of up to a factor of 2.
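The gain can be rationalized with Currie's detection-limit formula, in which the limit scales roughly with the square root of the background: suppressing the Compton continuum by coincidence gating lowers the limit accordingly. A sketch with made-up background counts (ignoring the efficiency loss that coincidence gating also brings):

```python
import math

def currie_detection_limit(background_counts):
    """Currie detection limit (counts) for a paired blank measurement:
    L_D = 2.71 + 4.65 * sqrt(B)."""
    return 2.71 + 4.65 * math.sqrt(background_counts)

# Illustrative: a ~10x background suppression from gamma-gamma
# coincidence shrinks the detection limit by roughly 3x, consistent
# with the factor of 2 to 3 reported above.
ld_normal = currie_detection_limit(10000.0)
ld_coinc = currie_detection_limit(1000.0)
print(round(ld_normal / ld_coinc, 2))
```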

Relevance:

10.00%

Publisher:

Abstract:

Here, I investigate the use of Bayesian updating rules to model how social agents change their minds in continuous opinion models. Given another agent's statement about the continuous value of a variable, we will see that interesting dynamics emerge when an agent assigns to that value a likelihood that is a mixture of a Gaussian and a uniform distribution. This represents the idea that the other agent might have no idea what is being talked about. The effect of updating only the first moment of the distribution is studied, and we will see that this generates results similar to those of the bounded confidence models. On also updating the second moment, several different opinions always survive in the long run, as agents become more stubborn with time. However, depending on the probability of error and the initial uncertainty, those opinions might be clustered around a central value.
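The update described above can be sketched as a mixture-model Bayes step with moment matching (a simplified sketch with made-up parameters; the paper's exact equations may differ):

```python
import math

def update_opinion(mu, sigma2, x, p_err=0.2, width=2.0):
    """One Bayesian update of an agent's opinion (mean mu, variance
    sigma2) after hearing value x. The likelihood is a mixture of a
    unit-variance Gaussian around x and a uniform on [-width, width],
    modeling the chance the speaker 'has no idea'."""
    tau2 = 1.0  # assumed variance of an informative statement
    # Posterior probability that the statement was informative:
    g = math.exp(-(mu - x) ** 2 / (2 * (sigma2 + tau2))) / math.sqrt(
        2 * math.pi * (sigma2 + tau2))
    u = 1.0 / (2 * width)
    w = (1 - p_err) * g / ((1 - p_err) * g + p_err * u)
    # Moments if informative (standard Gaussian-Gaussian update):
    mu_g = (tau2 * mu + sigma2 * x) / (sigma2 + tau2)
    s2_g = sigma2 * tau2 / (sigma2 + tau2)
    # Mixture moments: agents move less when the statement is likely noise.
    mu_new = w * mu_g + (1 - w) * mu
    s2_new = (w * (s2_g + mu_g ** 2) + (1 - w) * (sigma2 + mu ** 2)
              - mu_new ** 2)
    return mu_new, s2_new

# The mean moves toward x and the variance shrinks, so repeated
# updates make the agent increasingly stubborn, as described above.
print(update_opinion(0.0, 1.0, 1.0))
```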