945 results for initialization uncertainty


Relevance: 10.00%

Abstract:

A method to quantify lycopene and β-carotene in freeze-dried tomato pulp by high-performance liquid chromatography (HPLC) was validated according to the criteria of selectivity, sensitivity, precision, and accuracy, and the measurement uncertainty was estimated using data obtained in the validation. The validated method is selective for the intended analysis and showed good precision and accuracy. The detection limits for lycopene and β-carotene were 4.2 and 0.23 mg 100 g-1, respectively. The results with expanded uncertainty (k = 2) were 104 ± 21 mg 100 g-1 for lycopene and 6.4 ± 1.5 mg 100 g-1 for β-carotene.
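
As an illustration of the expanded-uncertainty step mentioned in this abstract (the individual uncertainty components below are hypothetical, since the abstract does not list the budget), a minimal sketch in Python combining standard uncertainties in quadrature and applying a coverage factor k = 2:

```python
import math

def expanded_uncertainty(contributions, k=2.0):
    """Combine independent standard uncertainties in quadrature and
    multiply by the coverage factor k (k = 2 gives roughly 95% coverage)."""
    combined = math.sqrt(sum(u ** 2 for u in contributions))
    return k * combined

# Hypothetical uncertainty budget for a lycopene result of 104 mg 100 g-1
# (values are illustrative only; the abstract does not list the components).
u_components = [8.0, 5.5, 3.0]  # e.g. calibration, repeatability, recovery
U = expanded_uncertainty(u_components)
print(f"Result: 104 +/- {U:.0f} mg 100 g-1 (k = 2)")
```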

Relevance: 10.00%

Abstract:

Food-deprived pigeons were exposed to trials that could end with or without the presentation of food, independently of any response. During a trial, pecks could change the color of the response key from white to green (S+) or red (S-), depending on whether or not the feeder was to operate. At baseline, pecks produced both colors on variable-interval schedules with a mean of 15 s. In two distinct experimental conditions, a tandem VI DRH schedule was used to produce either S+ or S-. Results showed that the tandem schedule led to an overall decrease in the frequency of the discriminative stimuli produced, markedly for S+ but not for S-. These data support the model of conditioned reinforcement based on uncertainty reduction.

Relevance: 10.00%

Abstract:

On the southern coast of the state of São Paulo, an epidemic of encephalitis caused by the Rocio arbovirus occurred from 1975 to 1978. The high morbidity and mortality rates had a strong social impact. The objective of this work was to present a study of how the print media reported the social events related to the emergence of the epidemic in the first half of 1975. Reports on the epidemic on the southern coast were obtained from the databases of the newspapers A Tribuna, Folha de S.Paulo, and Jornal da Tarde. News published up to July 1975, the initial and most impactful phase of the epidemic, was analyzed. With the identification of encephalitis cases of unknown cause, the State Health Department advised tourists against traveling to the coast, using the media as its channel of communication. In response to the news, tourists fled and, consequently, local commerce went into crisis. Merchants revolted and clashed with the media over the way the epidemic was reported. Some mayors claimed that published reports were untrue. The ban issued by the health authorities was reported by the media in a comprehensive way, encompassing the actors involved in this discourse. Thus, the tensions generated between the holders of scientific knowledge and local economic power were revealed to the public. The newspapers provided broad coverage, addressing several themes; however, they disseminated uncertainty, used sensationalist images, and disconnected biological from social events. The themes reached readers in a fragmented form and with compromised social meanings.

Relevance: 10.00%

Abstract:

The structural engineering community in Brazil faces new challenges with the recent occurrence of high-intensity tornados. Satellite surveillance data show that the area covering the southeast of Brazil, Uruguay, and part of Argentina is one of the world's most tornado-prone areas, second only to the infamous tornado alley in the central United States. The design of structures subject to tornado winds is a typical example of decision making in the presence of uncertainty. Structural design involves finding a good balance between the competing goals of safety and economy. This paper presents a methodology to find the optimum balance between these goals in the presence of uncertainty. In this paper, reliability-based risk optimization is used to find the optimal safety coefficient that minimizes the total expected cost of a steel frame communications tower subject to extreme storm and tornado wind loads. The technique is not new, but it is applied to a practical problem of increasing interest to Brazilian structural engineers. The problem is formulated in the partial safety factor format used in current design codes, with an additional partial factor introduced to serve as the optimization variable. The expected cost of failure (or risk) is defined as the product of a limit state exceedance probability and the corresponding limit state exceedance cost. These costs include the costs of repairing, rebuilding, and paying compensation for injury and loss of life. The total expected failure cost is the sum of the individual expected costs over all failure modes. The steel frame communications tower that is the subject of this study has become very common in Brazil due to increasing mobile phone coverage. The study shows that the optimum reliability is strongly dependent on the cost (or consequences) of failure. Since failure consequences depend on the actual tower location, it turns out that different optimum designs should be used in different locations. Failure consequences are also different for the different parties involved in the design, construction, and operation of the tower. Hence, it is important that risk is well understood by the parties involved, so that proper contracts can be made. The investigation shows that when non-structural terms dominate design costs (e.g., in residential or office buildings), it is not too costly to over-design; this observation is in agreement with the observed practice for non-optimized structural systems. In this situation, it is much easier to lose money by under-design. When structural material cost is a significant part of the design cost (e.g., a concrete dam or bridge), one is likely to lose significant money by over-design. In this situation, a cost-risk-benefit optimization analysis is highly recommended. Finally, the study also shows that under time-varying loads such as tornados, the optimum reliability is strongly dependent on the selected design life.
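
A minimal sketch of the reliability-based risk optimization idea summarized above; the cost figures, the construction-cost law, and the failure-probability curve are illustrative assumptions, not the paper's tower model. The total expected cost is the construction cost, which grows with the partial safety factor, plus the failure probability times the failure cost, which decreases as the factor grows; the optimum factor minimizes the sum:

```python
from scipy.optimize import minimize_scalar
from scipy.stats import norm

# Illustrative cost model (all values are assumptions for the sketch).
C_CONSTRUCTION = 1.0e5   # reference construction cost
C_FAILURE = 2.0e6        # cost of collapse: rebuilding, compensation, downtime

def failure_probability(safety_factor):
    """Toy limit-state model: the reliability index grows linearly with the
    partial safety factor, and Pf = Phi(-beta)."""
    beta = 1.5 + 2.0 * (safety_factor - 1.0)
    return norm.cdf(-beta)

def total_expected_cost(safety_factor):
    construction = C_CONSTRUCTION * (0.8 + 0.2 * safety_factor)
    risk = failure_probability(safety_factor) * C_FAILURE
    return construction + risk

result = minimize_scalar(total_expected_cost, bounds=(1.0, 2.5), method="bounded")
print(f"optimum partial factor ~ {result.x:.2f}, "
      f"Pf ~ {failure_probability(result.x):.1e}")
```

With these toy numbers the optimum sits at an interior value of the partial factor; raising the failure cost pushes the optimum (and the implied reliability) upward, which is the dependence on failure consequences discussed in the abstract.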

Relevance: 10.00%

Abstract:

Nowadays, digital computer systems and networks are the main engineering tools, used in the planning, design, operation, and control of buildings, transportation systems, machinery, businesses, and life-sustaining devices of all sizes. Consequently, computer viruses have become one of the most important sources of uncertainty, contributing to a decrease in the reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on previous knowledge of the virus code. In spite of their good adaptation capability, these programs work just like vaccines against diseases and are not able to prevent new infections based on the state of the network. Here, an attempt to model the propagation dynamics of computer viruses relates them to other notable events occurring in the network, permitting the establishment of preventive policies in network management. Data from three different viruses were collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using the data collected from other viruses that formerly infected the network. Copyright (c) 2008 J. R. C. Piqueira and F. B. Cesar.
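
A minimal sketch of the autoregressive identification idea, assuming a synthetic "infected hosts per day" series in place of the real virus data used in the paper; the model order and series are placeholders:

```python
import numpy as np

def fit_ar(series, order):
    """Least-squares fit of an AR(order) model: x[t] ~ sum_k a[k] * x[t-k-1]."""
    X = np.column_stack([series[order - k - 1:len(series) - k - 1]
                         for k in range(order)])
    y = series[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def forecast(history, coeffs, steps):
    """Iterate the fitted AR model forward from the last observed values."""
    values = list(history[-len(coeffs):])
    out = []
    for _ in range(steps):
        nxt = sum(c * v for c, v in zip(coeffs, reversed(values)))
        out.append(nxt)
        values.append(nxt)
    return np.array(out)

# Synthetic series standing in for a previously observed virus.
rng = np.random.default_rng(0)
t = np.arange(200)
known_virus = 50 + 30 * np.sin(2 * np.pi * t / 30) + rng.normal(0, 2, t.size)

mean = known_virus.mean()
coeffs = fit_ar(known_virus - mean, order=5)       # identify on the demeaned series
prediction = mean + forecast(known_virus - mean, coeffs, steps=10)
print(prediction)
```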

Relevance: 10.00%

Abstract:

This work uses crystal plasticity finite element simulations to elucidate the role of elastoplastic anisotropy in instrumented indentation P-h(s) curve measurements in face-centered cubic (fcc) crystals. It is shown that although the experimental fluctuations in the loading stage of the P-h(s) curves can be attributed to anisotropy, the variability in the unloading stage of the experiments is much greater than that resulting from anisotropy alone. Moreover, it is found that the conventional procedure used to evaluate the contact variables ruling the unloading P-h(s) curve introduces an uncertainty comparable to the more fundamental influence of anisotropy. In view of these results, a robust procedure is proposed that uses contact area measurements in addition to the P-h(s) curves to extract homogenized J2-plasticity-equivalent mechanical properties from single crystals.
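
For context, the conventional contact-variable evaluation referred to above is usually based on the Oliver-Pharr relations; the sketch below applies them to placeholder indentation values (this is the textbook procedure, not the robust procedure proposed in the paper):

```python
import math

def oliver_pharr(p_max, stiffness, contact_area, beta=1.0):
    """Standard Oliver-Pharr relations for instrumented indentation:
    H = P_max / A_c and E_r = sqrt(pi) * S / (2 * beta * sqrt(A_c))."""
    hardness = p_max / contact_area
    reduced_modulus = math.sqrt(math.pi) * stiffness / (2.0 * beta * math.sqrt(contact_area))
    return hardness, reduced_modulus

# Placeholder values (P_max in mN, S in mN/nm, A_c in nm^2), for illustration only.
H, Er = oliver_pharr(p_max=4.0, stiffness=0.25, contact_area=2.0e6)
# 1 mN/nm^2 = 1e6 GPa, hence the conversion factor below.
print(f"H = {H * 1e6:.2f} GPa, Er = {Er * 1e6:.1f} GPa")
```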

Relevance: 10.00%

Abstract:

The objective of this manuscript is to discuss the existing barriers to the dissemination of medical guidelines and to present strategies that facilitate the adoption of the recommendations into clinical practice. The literature shows that it usually takes several years until new scientific evidence is adopted in current practice, even when there is an obvious impact on patients' morbidity and mortality. There are examples in which more than thirty years elapsed between the first case reports on the use of an effective therapy and its routine utilization; that is the case of fibrinolysis for the treatment of acute myocardial infarction. Some of the main barriers to the implementation of new recommendations are: lack of knowledge of a new guideline, personal resistance to change, uncertainty about the efficacy of the proposed recommendation, fear of potential side effects, difficulty in remembering the recommendations, absence of institutional policies reinforcing the recommendation, and even economic constraints. To overcome these barriers, a strategy involving a program with multiple tools is always best. It must include the implementation of easy-to-use algorithms, continuing medical education materials and lectures, electronic or paper alerts, tools to facilitate evaluation and prescription, and periodic audits to show results to the practitioners involved in the process. It is also fundamental that the medical societies concerned with the specific medical issue support the program for its scientific and ethical soundness. The creation of multidisciplinary committees in each institution and the inclusion of opinion leaders with proactive and lasting attitudes are the key points for the program's success. In this manuscript we use the implementation of a guideline for venous thromboembolism prophylaxis as an example, but the concepts described here can easily be applied to any other guideline. Therefore, these concepts can be very useful for institutions and services that aim at improving the quality of patient care. Changes in current medical practice recommended by guidelines may take some time; however, with broader participation of opinion leaders and the use of the several tools listed here, they surely have a greater probability of reaching the main objectives: improvement of the medical care provided and of patient safety.

Relevance: 10.00%

Abstract:

This work describes the seasonal and diurnal variations of downward longwave atmospheric irradiance (LW) at the surface in Sao Paulo, Brazil, using 5-min-averaged values of LW, air temperature, relative humidity, and solar radiation observed continuously and simultaneously from 1997 to 2006 on a micrometeorological platform located at the top of a 4-story building. An objective procedure, including 2-step filtering and dome emission effect correction, was used to evaluate the quality of the 9-yr-long LW dataset. The comparison between the LW values observed and those yielded by the Surface Radiation Budget project shows spatial and temporal agreement, indicating that monthly and annual average values of LW observed at one point of Sao Paulo can be used as representative of the entire metropolitan region of Sao Paulo. The maximum monthly averaged value of the LW is observed during summer (389 +/- 14 W m(-2); January), and the minimum is observed during winter (332 +/- 12 W m(-2); July). The effective emissivity follows the LW and shows a maximum in summer (0.907 +/- 0.032; January) and a minimum in winter (0.818 +/- 0.029; June). The mean cloud effect, identified objectively by comparing the monthly averaged values of the LW during clear-sky days and all-sky conditions, intensified the monthly average LW by about 32.0 +/- 3.5 W m(-2) and the atmospheric effective emissivity by about 0.088 +/- 0.024. In August, the driest month of the year in Sao Paulo, the diurnal evolution of the LW shows a minimum (325 +/- 11 W m(-2)) at 0900 LT and a maximum (345 +/- 12 W m(-2)) at 1800 LT, which lags (by 4 h) the maximum of the diurnal variation of the screen temperature. The diurnal evolution of the effective emissivity shows a minimum (0.781 +/- 0.027) during daytime and a maximum (0.842 +/- 0.030) during nighttime. The diurnal evolution of the difference in effective emissivity between all-sky conditions and clear-sky days remains relatively constant (7% +/- 1%), indicating that clouds do not change the diurnal pattern of the emissivity. The relationship between effective emissivity and screen air temperature, and between effective emissivity and water vapor, is complex. During the night, when the planetary boundary layer is shallower, the effective emissivity can be estimated from screen parameters. During the day, the relationship between effective emissivity and screen parameters varies from place to place and depends on planetary boundary layer processes. Because the empirical expressions do not contain enough information about the diurnal variation of the vertical stratification of air temperature and moisture in Sao Paulo, they are likely to fail in reproducing the diurnal variation of the surface emissivity. The most accurate way to estimate the LW for clear-sky conditions in Sao Paulo is to use an expression derived from a purely empirical approach.
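
A small illustration of the effective emissivity used above, taken here as the conventional ratio of the measured downward longwave irradiance to blackbody emission at screen temperature; the input values are placeholders of the same order as the monthly means quoted, not the actual Sao Paulo measurements:

```python
STEFAN_BOLTZMANN = 5.670374419e-8  # W m-2 K-4

def effective_emissivity(lw_down, screen_temp_c):
    """Effective atmospheric emissivity: ratio of measured downward longwave
    irradiance to blackbody emission at screen temperature."""
    t_kelvin = screen_temp_c + 273.15
    return lw_down / (STEFAN_BOLTZMANN * t_kelvin ** 4)

# Placeholder values roughly matching the summer and winter means quoted above.
print(effective_emissivity(lw_down=389.0, screen_temp_c=22.0))  # summer-like
print(effective_emissivity(lw_down=332.0, screen_temp_c=16.0))  # winter-like
```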

Relevance: 10.00%

Abstract:

We introduce the Coupled Aerosol and Tracer Transport model to the Brazilian developments on the Regional Atmospheric Modeling System (CATT-BRAMS). CATT-BRAMS is an on-line transport model fully consistent with the simulated atmospheric dynamics. Emission sources of trace gases from biomass burning and urban-industrial-vehicular activities, and of aerosol particles from biomass burning, are obtained from several published datasets and from remote sensing information. The tracer and aerosol mass concentration prognostics include the effects of sub-grid-scale turbulence in the planetary boundary layer, convective transport by shallow and deep moist convection, wet and dry deposition, and plume rise associated with vegetation fires, in addition to the grid-scale transport. The radiation parameterization takes into account the interaction between the simulated biomass burning aerosol particles and shortwave and longwave radiation. The atmospheric model BRAMS is based on the Regional Atmospheric Modeling System (RAMS), with several improvements associated with the cumulus convection representation, soil moisture initialization, and a surface scheme tuned for the tropics, among others. In this paper the CATT-BRAMS model is used to simulate carbon monoxide and particulate material (PM2.5) surface fluxes and atmospheric transport during the 2002 LBA field campaigns, conducted during the transition from the dry to the wet season in the southwest Amazon Basin. Model evaluation is addressed through comparisons between model results and near-surface, radiosonde, and airborne measurements performed during the field campaign, as well as remote sensing derived products. We show the matching of emission strengths to the carbon monoxide observed during the LBA campaign. A relatively good comparison with the MOPITT data is also obtained, in spite of the difficulties implied by the MOPITT a priori assumptions.

Relevance: 10.00%

Abstract:

Aims. A model-independent reconstruction of the cosmic expansion rate is essential to a robust analysis of cosmological observations. Our goal is to demonstrate that current data are able to provide reasonable constraints on the behavior of the Hubble parameter with redshift, independently of any cosmological model or underlying gravity theory. Methods. Using type Ia supernova data, we show that it is possible to analytically calculate the Fisher matrix components in a Hubble parameter analysis without assumptions about the energy content of the Universe. We used a principal component analysis to reconstruct the Hubble parameter as a linear combination of the Fisher matrix eigenvectors (principal components). To suppress the bias introduced by the high-redshift behavior of the components, we treated the value of the Hubble parameter at high redshift as a free parameter. We first tested our procedure on a mock sample of type Ia supernova observations and then applied it to the real data compiled by the Sloan Digital Sky Survey (SDSS) group. Results. In the mock sample analysis, we demonstrate that it is possible to drastically suppress the bias introduced by the high-redshift behavior of the principal components. Applying our procedure to the real data, we show that it allows us to determine the behavior of the Hubble parameter with reasonable uncertainty, without introducing any ad hoc parameterizations. Beyond that, our reconstruction agrees with completely independent measurements of the Hubble parameter obtained from red-envelope galaxies.
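
A minimal sketch of the principal component reconstruction step, assuming a synthetic Fisher matrix in place of one computed from the SDSS supernova sample: diagonalize the Fisher matrix for the binned Hubble parameter, keep the best-constrained eigenvectors, and write H(z) as a linear combination of them:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a Fisher matrix over n_bins values of H(z).
n_bins = 10
A = rng.normal(size=(n_bins, n_bins))
fisher = A @ A.T + np.diag(np.linspace(5.0, 0.5, n_bins))  # better constrained at low z

# Eigenvectors of the Fisher matrix are the principal components;
# large eigenvalues correspond to well-constrained directions.
eigenvalues, eigenvectors = np.linalg.eigh(fisher)
order = np.argsort(eigenvalues)[::-1]
components = eigenvectors[:, order]

# Reconstruct a fiducial H(z) by keeping only the best-constrained modes.
z = np.linspace(0.0, 1.5, n_bins)
h_true = 70.0 * np.sqrt(0.3 * (1 + z) ** 3 + 0.7)  # flat LCDM, used only as the toy target
n_keep = 4
coefficients = components[:, :n_keep].T @ h_true
h_reconstructed = components[:, :n_keep] @ coefficients
print(np.round(h_reconstructed - h_true, 2))  # residual from truncating the expansion
```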

Relevance: 10.00%

Abstract:

Background: Bayesian mixing models have allowed for the inclusion of uncertainty and prior information in the analysis of trophic interactions using stable isotopes. Formulating prior distributions is relatively straightforward when incorporating dietary data. However, the use of data that are related, but not directly proportional, to diet (such as prey availability data) is often problematic because such information is not necessarily predictive of diet, and the information required to build a reliable prior distribution for all prey species is often unavailable. Omitting prey availability data impacts the estimation of a predator's diet and introduces the strong assumption of consumer ultrageneralism (where all prey are consumed in equal proportions), particularly when multiple prey have similar isotope values.

Methodology: We develop a procedure to incorporate prey availability data into Bayesian mixing models conditional on the similarity of isotope values between two prey. If a pair of prey have similar isotope values (resulting in highly uncertain mixing model results), our model increases the weight of availability data in estimating the contribution of prey to a predator's diet. We test the utility of this method in an intertidal community against independently measured feeding rates.

Conclusions: Our results indicate that our weighting procedure increases the accuracy with which consumer diets can be inferred in situations where multiple prey have similar isotope values. This suggests that the exchange of formalism for predictive power is merited, particularly when the relationship between prey availability and a predator's diet cannot be assumed for all species in a system.
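
A toy illustration of the conditional weighting idea, not the authors' model: with two prey whose isotope means are close, a similarity term inflates the strength of an availability-based prior on the diet proportion before the posterior is computed on a grid. All numbers are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm, beta

# Toy two-prey mixing problem (all values are illustrative assumptions).
mu_a, mu_b = -14.0, -13.2      # prey isotope means (e.g. d13C)
sigma = 0.8                    # residual spread of the consumer signature
consumer = np.array([-13.7, -13.5, -13.9, -13.6])  # consumer measurements
availability_a = 0.7           # relative availability of prey A

# Weight the availability prior by prey similarity: the closer the prey
# isotope values, the more informative the availability data are made.
similarity = np.exp(-abs(mu_a - mu_b) / sigma)     # in (0, 1]
prior_strength = 1.0 + 20.0 * similarity           # pseudo-counts
alpha = 1.0 + prior_strength * availability_a
beta_param = 1.0 + prior_strength * (1.0 - availability_a)

# Grid posterior over the diet proportion p of prey A.
p = np.linspace(0.001, 0.999, 999)
log_prior = beta.logpdf(p, alpha, beta_param)
mix_mean = p[:, None] * mu_a + (1 - p[:, None]) * mu_b
log_like = norm.logpdf(consumer, loc=mix_mean, scale=sigma).sum(axis=1)
log_post = log_prior + log_like
post = np.exp(log_post - log_post.max())
post /= post.sum()
print("posterior mean diet proportion of prey A:", float((p * post).sum()))
```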

Relevance: 10.00%

Abstract:

In many real situations, randomness is considered to be uncertainty or even confusion that impedes human beings from making correct decisions. Here we study the combined role of randomness and determinism in particle dynamics for complex network community detection. In the proposed model, particles walk in the network and compete with each other in such a way that each of them tries to possess as many nodes as possible. Moreover, we introduce a rule to adjust the level of randomness of the particle walks in the network, and we find that a portion of randomness can largely improve the community detection rate. Computer simulations show that the model has good community detection performance and at the same time presents low computational complexity. (C) 2008 American Institute of Physics.
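
A heavily simplified toy in the spirit of the particle-competition walk described above (not the authors' algorithm; the graph, the update rule, and the epsilon parameter are assumptions): each particle reinforces its domination of visited nodes, weakens its rivals there, and moves either randomly or toward nodes it already dominates:

```python
import random
from collections import defaultdict

def particle_competition(adjacency, n_particles, epsilon, steps, seed=0):
    """Toy particle-competition walk: with probability epsilon a particle moves
    to a random neighbor, otherwise it prefers neighbors it already dominates."""
    rng = random.Random(seed)
    nodes = list(adjacency)
    positions = rng.sample(nodes, n_particles)
    domination = defaultdict(lambda: [1.0] * n_particles)  # per-node scores

    for _ in range(steps):
        for particle, node in enumerate(positions):
            neighbors = adjacency[node]
            if rng.random() < epsilon:
                target = rng.choice(neighbors)          # random move
            else:                                       # deterministic, defensive move
                target = max(neighbors, key=lambda v: domination[v][particle])
            domination[target][particle] += 1.0         # reinforce ownership
            for rival in range(n_particles):            # competition: weaken rivals
                if rival != particle:
                    domination[target][rival] = max(0.0, domination[target][rival] - 0.5)
            positions[particle] = target

    # Each node is assigned to the particle (community) that dominates it most.
    return {v: max(range(n_particles), key=lambda k: domination[v][k])
            for v in nodes}

# Two loosely connected triangles; with enough steps each particle typically
# ends up dominating one of them.
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
         3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
print(particle_competition(graph, n_particles=2, epsilon=0.3, steps=500))
```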

Relevance: 10.00%

Abstract:

The double helicity asymmetry in neutral pion production for p_T = 1 to 12 GeV/c was measured with the PHENIX experiment to access the gluon-spin contribution, Delta G, to the proton spin. Measured asymmetries are consistent with zero, and at a theory scale of mu^2 = 4 GeV^2 a next-to-leading-order QCD analysis gives Delta G([0.02, 0.3]) = 0.2, with a constraint of -0.7 < Delta G([0.02, 0.3]) < 0.5 at Delta chi^2 = 9 (approximately 3 sigma) for the sampled gluon momentum fraction (x) range, 0.02 to 0.3. The results are obtained using predictions for the measured asymmetries generated from four representative fits to polarized deep inelastic scattering data. We also consider the dependence of the Delta G constraint on the choice of the theoretical scale, a dominant uncertainty in these predictions.

Relevance: 10.00%

Abstract:

We present the results of an elliptic flow, v_2, analysis of Cu + Cu collisions recorded with the solenoidal tracker detector (STAR) at the BNL Relativistic Heavy Ion Collider at sqrt(s_NN) = 62.4 and 200 GeV. Elliptic flow as a function of transverse momentum, v_2(p_T), is reported for different collision centralities for charged hadrons h^(+/-) and strangeness-containing hadrons K_S^0, Lambda, Xi, and phi in the midrapidity region |eta| < 1.0. A significant reduction in the systematic uncertainty of the measurement due to nonflow effects has been achieved by correlating particles at midrapidity, |eta| < 1.0, with those at forward rapidity, 2.5 < |eta| < 4.0. We also present azimuthal correlations in p + p collisions at sqrt(s) = 200 GeV to help in estimating nonflow effects. To study the system-size dependence of elliptic flow, we present a detailed comparison with previously published results from Au + Au collisions at sqrt(s_NN) = 200 GeV. We observe that v_2(p_T) of strange hadrons has scaling properties similar to those first observed in Au + Au collisions, that is, (i) at low transverse momenta, p_T < 2 GeV/c, v_2 scales with the transverse kinetic energy, m_T - m, and (ii) at intermediate p_T, 2 < p_T < 4 GeV/c, it scales with the number of constituent quarks, n_q. We have found that ideal hydrodynamic calculations fail to reproduce the centrality dependence of v_2(p_T) for K_S^0 and Lambda. Eccentricity-scaled v_2 values, v_2/epsilon, are larger in more central collisions, suggesting that stronger collective flow develops in more central collisions. The comparison with Au + Au collisions, which reach higher densities, shows that v_2/epsilon depends on the system size, that is, on the number of participants N_part. This indicates that the ideal hydrodynamic limit is not reached in Cu + Cu collisions, presumably because the assumption of thermalization is not attained.
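
A small illustration of the flow coefficient quoted above: for azimuthal angles drawn from dN/dphi proportional to 1 + 2 v_2 cos(2(phi - Psi)), v_2 is the mean of cos(2(phi - Psi)). The event below is synthetic and the reaction plane is assumed known; the STAR analysis instead uses correlation methods with nonflow and resolution corrections:

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_azimuths(n, v2, psi_rp):
    """Draw azimuthal angles from dN/dphi ~ 1 + 2*v2*cos(2*(phi - psi_rp))
    by simple rejection sampling."""
    angles = []
    while len(angles) < n:
        phi = rng.uniform(0.0, 2.0 * np.pi)
        if rng.uniform(0.0, 1.0 + 2.0 * v2) < 1.0 + 2.0 * v2 * np.cos(2.0 * (phi - psi_rp)):
            angles.append(phi)
    return np.array(angles)

# Synthetic event with a known v2 and reaction-plane angle.
true_v2, psi_rp = 0.08, 0.6
phi = sample_azimuths(20000, true_v2, psi_rp)

# With the reaction plane known, v2 is the mean of cos(2*(phi - psi_rp)).
v2_estimate = np.mean(np.cos(2.0 * (phi - psi_rp)))
print(f"true v2 = {true_v2}, estimated v2 = {v2_estimate:.3f}")
```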

Relevance: 10.00%

Abstract:

High-precision measurements of the differential cross sections for pi^0 photoproduction at forward angles on two nuclei, 12C and 208Pb, have been performed for incident photon energies of 4.9-5.5 GeV to extract the pi^0 -> gamma gamma decay width. The experiment was done at Jefferson Lab using the Hall B photon tagger and a high-resolution multichannel calorimeter. The pi^0 -> gamma gamma decay width was extracted by fitting the measured cross sections using recently updated theoretical models for the process. The resulting value for the decay width is Gamma(pi^0 -> gamma gamma) = 7.82 +/- 0.14 (stat) +/- 0.17 (syst) eV. With a 2.8% total uncertainty, this result is a factor of 2.5 more precise than the current Particle Data Group average of this fundamental quantity, and it is consistent with current theoretical predictions.