35 results for Calibration uncertainty
Abstract:
This paper presents a mathematical model and a solution methodology for the transmission network expansion planning problem under uncertainty in demand and generation. The methodology finds the optimal expansion plan that allows the power system to operate adequately in an uncertain environment. The model results in an optimization problem that is solved with a specialized genetic algorithm. Results for well-known test systems from the literature show that cheaper plans can be found while still accommodating the uncertainty in demand and generation. © 2008 IEEE.
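The specialized genetic algorithm itself is not detailed in the abstract above; the following is a minimal, self-contained sketch of how such a plan search could look, with all problem data (candidate line costs, capacities, worst-case demand, and the infeasibility penalty) invented purely for illustration:

```python
import random

random.seed(1)

# Hypothetical toy data: cost of each candidate line and the extra
# capacity it provides under the worst uncertainty scenario.
LINE_COST = [10, 25, 8, 30, 12]
LINE_CAPACITY = [40, 90, 30, 110, 50]
DEMAND = 150      # worst-case demand the expanded network must cover
PENALTY = 1000    # cost penalty per unit of unserved demand

def fitness(plan):
    """Total cost of a binary expansion plan (lower is better)."""
    cost = sum(c for c, bit in zip(LINE_COST, plan) if bit)
    capacity = sum(cap for cap, bit in zip(LINE_CAPACITY, plan) if bit)
    unserved = max(0, DEMAND - capacity)
    return cost + PENALTY * unserved

def evolve(pop_size=20, generations=60, mutation=0.1):
    pop = [[random.randint(0, 1) for _ in LINE_COST] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(LINE_COST))
            child = a[:cut] + b[cut:]              # one-point crossover
            children.append([bit ^ (random.random() < mutation) for bit in child])
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

With these toy numbers the search space is tiny; a real planner would evaluate each plan with a power-flow model over the uncertainty scenarios rather than a scalar capacity check.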
Abstract:
pCT deals with relatively thick targets such as the human head or trunk. The fidelity of pCT as a tool for proton therapy planning therefore depends on the accuracy of the physical formulas used for proton interaction with thick absorbers. Although the overall accuracy of the proton stopping power in the Bethe-Bloch domain is about 1%, analytical calculations and Monte Carlo simulations with codes such as TRIM/SRIM, MCNPX, and GEANT4 do not agree with each other. Attempts to validate the codes against experimental data for thick absorbers face some difficulties: only a few data sets are available, and the existing data were acquired at different initial proton energies and for different absorber materials. In this work we compare the results of our Monte Carlo simulations with existing experimental data in terms of a reduced calibration curve, i.e., the range-energy dependence normalized, on the range scale, by the full projected CSDA range for a given initial proton energy in a given material (taken from the NIST PSTAR database), and, on the final-energy scale, by the given initial proton energy. This approach is almost energy- and material-independent. The results of our analysis are important for pCT development because the contradictions observed at arbitrarily low initial proton energies can now easily be scaled to typical pCT energies. © 2010 American Institute of Physics.
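In the reduced representation described above, each (absorber thickness, residual energy) measurement maps to a pair of dimensionless coordinates. A minimal sketch of that mapping, with illustrative numbers only (not actual PSTAR values):

```python
def reduced_point(depth_mm, e_final_mev, e_initial_mev, csda_range_mm):
    """Map one measurement onto the reduced calibration curve:
    depth is normalized by the full CSDA range for the initial energy,
    residual energy by the initial energy."""
    return depth_mm / csda_range_mm, e_final_mev / e_initial_mev

# Illustrative numbers only: a 200 MeV proton exits a 100 mm absorber
# with 160 MeV residual energy; assume a 260 mm CSDA range.
x, y = reduced_point(100.0, 160.0, 200.0, 260.0)
print(round(x, 3), round(y, 3))  # → 0.385 0.8
```

Because both axes are dimensionless, curves measured at different initial energies and in different materials become directly comparable, which is the point the abstract makes.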
Abstract:
GEANT4 simulations are essential for the development of medical tomography with proton beams (pCT). For thin absorbers, the latest releases of GEANT4 generate very similar final spectra that agree well with the results of other popular Monte Carlo codes such as TRIM/SRIM or MCNPX. For thick absorbers, however, the disagreements become evident. In part, these disagreements are due to known contradictions between the NIST PSTAR and SRIM reference data. It is therefore interesting to compare GEANT4 results with each other, with experiment, and with the results of other codes in a reduced form that is free from this kind of doubt. In this work such a comparison is carried out within the Reduced Calibration Curve concept elaborated for proton beam tomography. © 2010 IEEE.
Abstract:
This paper presents a novel methodology to price the reactive power support ancillary service of Distributed Generators (DGs) subject to primary-energy-source uncertainty. The proposed methodology prices the service based on the calculation of the Loss of Opportunity Costs (LOC). An algorithm is proposed to reduce the uncertainty present in these generators using Multiobjective Power Flows (MOPFs) evaluated over multiple probabilistic scenarios through Monte Carlo Simulation (MCS), with the time series of DG active power generation modeled by Markov Chains (MC). © 2011 IEEE.
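The combination of a Markov-chain output model with Monte Carlo scenario sampling can be sketched as follows; the states, transition matrix, and scenario counts are invented for illustration and carry no relation to the paper's data:

```python
import random

random.seed(7)

# Hypothetical 3-state Markov chain for a DG unit's active power output
# (e.g. low / medium / high primary resource), row-stochastic matrix.
STATES_MW = [0.0, 1.0, 2.0]
P = [[0.7, 0.2, 0.1],
     [0.3, 0.4, 0.3],
     [0.1, 0.3, 0.6]]

def simulate(hours, state=0):
    """One Monte Carlo scenario: a time series of DG output in MW."""
    series = []
    for _ in range(hours):
        series.append(STATES_MW[state])
        r, acc = random.random(), 0.0
        for nxt, p in enumerate(P[state]):
            acc += p
            if r < acc:
                state = nxt
                break
        # floating-point edge case: if no break fired, stay in the
        # current state for the next hour
    return series

# Expected DG output estimated over many sampled scenarios, as in an
# MCS study; each scenario would feed one power-flow evaluation.
scenarios = [simulate(24) for _ in range(500)]
mean_mw = sum(sum(s) for s in scenarios) / (500 * 24)
print(round(mean_mw, 2))
```

In the paper's setting each sampled series would parameterize one probabilistic scenario of the multiobjective power flow; here only the sampling layer is shown.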
Abstract:
Due to the renewed interest in distributed generation (DG), the number of DG units incorporated in distribution systems has been rapidly increasing in the past few years. This situation requires new analysis tools for understanding system performance, and taking advantage of the potential benefits of DG. This paper presents an evolutionary multi-objective programming approach to determine the optimal operation of DG in distribution systems. The objectives are the minimization of the system power losses and operation cost of the DG units. The proposed approach also considers the inherent stochasticity of DG technologies powered by renewable resources. Some tests were carried out on the IEEE 34 bus distribution test system showing the robustness and applicability of the proposed methodology. © 2011 IEEE.
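At the core of any evolutionary multi-objective approach like the one described above is a Pareto-dominance test over the two objectives (system power losses and DG operation cost). A minimal sketch, with hypothetical objective values:

```python
def dominates(a, b):
    """True if solution a is at least as good as b in every objective
    and strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated solutions."""
    return [s for s in solutions if not any(dominates(o, s) for o in solutions)]

# Hypothetical (power losses in MW, DG operation cost in $/h) pairs:
sols = [(4.0, 120.0), (3.5, 140.0), (4.2, 110.0), (5.0, 150.0), (3.5, 130.0)]
print(pareto_front(sols))  # → [(4.0, 120.0), (4.2, 110.0), (3.5, 130.0)]
```

The evolutionary algorithm would rank its population with such a test each generation; the stochastic renewable output mentioned in the abstract enters through the objective evaluation, not through the dominance logic itself.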
Abstract:
Despite the widespread use of the differential scanning calorimetry (DSC) technique in the characterization of advanced polymer materials, a new methodology, high-heating-rate DSC, has been developed. In conventional DSC experiments the heating rate varies from 10 to 20 °C·min-1, sample mass from 10 to 15 mg, and the standard aluminum sample pan weighs approximately 27 mg. To contribute to a better understanding of DSC behavior at different heating rates, this work correlates the influence of high heating rates with the thermal events observed in DSC experiments. Samples of metallic standards (In, Pb, Sn and Zn) with masses from 0.570 mg to 20.9 mg were analyzed at multiple heating rates from 4 to 324 °C·min-1. To perform all these experiments properly, precise and careful temperature and enthalpy calibrations were carried out and are discussed in depth. This work thus presents a DSC methodology able to generate reliable results at any heating rate chosen by the researcher to characterize advanced materials used, for example, in the aerospace industry. It also helps DSC users obtain better and more accurate test results from instruments already installed, improving analysis sensitivity and resolution in a single run. The melting and enthalpy thermal events of polypropylene are also studied using both the conventional and the high-heating-rate DSC methods.
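DSC temperature calibration of the kind discussed above typically fits the observed melting onsets of the metallic standards against their certified melting points, one fit per heating rate. A sketch with a plain least-squares line; the certified values are the commonly tabulated melting points of the four standards, while the measured onsets are invented for illustration:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Commonly tabulated melting points of the standards (°C):
CERTIFIED = {"In": 156.60, "Sn": 231.93, "Pb": 327.46, "Zn": 419.53}

# Hypothetical measured onset temperatures at one heating rate (°C);
# onsets shift with heating rate, so each rate needs its own fit:
MEASURED = {"In": 157.1, "Sn": 232.7, "Pb": 328.5, "Zn": 420.9}

a, b = linear_fit(list(MEASURED.values()), list(CERTIFIED.values()))

def correct(t_measured):
    """Apply the calibration to a measured onset temperature."""
    return a * t_measured + b

print(round(correct(232.7), 2))
```

The enthalpy calibration mentioned in the abstract follows the same pattern, fitting measured peak areas against certified heats of fusion.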
Abstract:
The aim of this work is to evaluate the influence of point measurements in images with subpixel accuracy and their contribution to the calibration of digital cameras. The effect of subpixel measurements on the 3D coordinates of check points in object space is also evaluated. For this purpose, an algorithm allowing subpixel accuracy, based on the Förstner operator, was implemented for the semi-automatic determination of points of interest. Experiments were carried out with a block of images acquired with the DuncanTech MS3100-CIR multispectral camera. The influence of subpixel measurements on the adjustment by the Least Squares Method (LSM) was evaluated by comparing the estimated standard deviations of the parameters in two situations: manual measurement (pixel accuracy) and subpixel estimation. Additionally, the influence of subpixel measurements on 3D reconstruction was analyzed. The obtained results, i.e., the quantified reduction of the standard deviation of the Inner Orientation Parameters (IOP) and of the relative error of the 3D reconstruction, show that measurements with subpixel accuracy are relevant for photogrammetric tasks in which metric quality is of great importance, such as camera calibration.
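The paper's Förstner-based measurement is not reproduced here; a much simpler generic technique, parabolic refinement of a discrete interest-operator peak, illustrates the basic idea of locating a feature between pixel centers:

```python
def subpixel_peak(values):
    """Refine the position of a discrete peak to subpixel precision by
    fitting a parabola through the peak sample and its two neighbours."""
    i = max(range(len(values)), key=values.__getitem__)
    if i == 0 or i == len(values) - 1:
        return float(i)  # peak on the border: no refinement possible
    left, mid, right = values[i - 1], values[i], values[i + 1]
    denom = left - 2 * mid + right
    if denom == 0:
        return float(i)  # flat neighbourhood: keep the integer position
    return i + 0.5 * (left - right) / denom

# An interest-operator response whose true maximum lies between samples:
resp = [0.1, 0.4, 0.9, 1.0, 0.5]
print(round(subpixel_peak(resp), 3))  # → 2.667
```

The Förstner operator proper instead intersects image-gradient lines in a weighted least-squares sense over a 2D window; the one-dimensional parabola above is only meant to show why subpixel estimates tighten the standard deviations that the paper compares.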
Abstract:
A foreground is formed through the possibilities, tendencies, propensities, obstructions, barriers, hindrances, et cetera, that a person's context provides. Simultaneously, a foreground is formed through the person's interpretations of these possibilities, tendencies, propensities, obstructions, barriers, and hindrances. A foreground is a fragmented, partial, and inconsistent constellation of bits and pieces of aspirations, hopes, and frustrations. It might be both promising and frightening; it is always being rebuilt and restructured. Foregrounds are multiple, as one person might see very different possibilities; at the same time they are collective, established through processes of communication. In this article educational meaning is discussed in terms of the relationships between students' foregrounds and classroom activities. I illustrate how students' dreams might be kept in cages, and what implications this has for whether or not they engage in learning processes. I investigate how a foreground might be ruined, and in what sense a ruined foreground might turn into a learning obstacle. Finally, I discuss processes of inclusion and exclusion with reference to the notion of foreground. © 2012 The Authors.
Abstract:
Increasing human demands on soil-derived ecosystem services require reliable data on global soil resources for sustainable development. The soil organic carbon (SOC) pool is a key indicator of soil quality, as it affects essential biological, chemical, and physical soil functions such as nutrient cycling, pesticide and water retention, and the maintenance of soil structure. However, information on the SOC pool and its temporal and spatial dynamics is unbalanced. Even in well-studied regions with a pronounced interest in environmental issues, information on soil carbon (C) is inconsistent. Several activities for the compilation of global soil C data are under way; however, different approaches to soil sampling and chemical analysis make even regional comparisons highly uncertain. The procedures used so far have often not allowed reliable estimation of the total SOC pool, partly because the available knowledge focuses on loosely defined upper soil horizons while the contribution of the subsoil to SOC stocks has received less consideration. Quantifying SOC pool changes over time is even more difficult: SOC consists of variable amounts of labile and recalcitrant molecules of plant, microbial, and animal origin that are often only operationally defined. An active soil expert community needs to agree on protocols for soil surveying and laboratory procedures to obtain reliable SOC pool estimates. Established long-term ecological research sites, where SOC changes are quantified and the underlying mechanisms are investigated, are potential backbones for regional, national, and international SOC monitoring programs. © 2013 Elsevier B.V.
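The stock estimate at issue above is conventionally computed per layer from carbon concentration, bulk density, layer thickness, and the coarse-fragment correction; the example profile values below are hypothetical:

```python
def soc_stock_t_ha(soc_g_kg, bulk_density_g_cm3, depth_cm, coarse_frac=0.0):
    """SOC stock of one soil layer in t C/ha.

    soc_g_kg: organic-carbon concentration (g C per kg fine earth)
    bulk_density_g_cm3: fine-earth bulk density
    depth_cm: layer thickness
    coarse_frac: volumetric fraction of coarse fragments (> 2 mm)
    """
    # g/kg * g/cm3 * cm gives kg C/m2 after unit bookkeeping;
    # the factor 0.1 converts the result to t C/ha.
    return 0.1 * soc_g_kg * bulk_density_g_cm3 * depth_cm * (1.0 - coarse_frac)

# Hypothetical profile, split the way the abstract argues one should:
topsoil = soc_stock_t_ha(20.0, 1.3, 30)   # 0-30 cm
subsoil = soc_stock_t_ha(6.0, 1.5, 70)    # 30-100 cm
print(round(topsoil, 1), round(subsoil, 1))  # → 78.0 63.0
```

Even with the low subsoil concentration assumed here, the 30-100 cm layer holds a stock comparable to the topsoil, which is why ignoring subsoil horizons biases total-pool estimates.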
Abstract:
Graduate program in Chemistry - IQ
Abstract:
Vibration monitoring requires acceleration transducers capable of providing data with high precision. Accelerometers are the most frequently used vibration transducers. Their calibration plays an important role in measuring vibrations and is key to ensuring the integrity of the vibration measurement. To manage secondary calibration data for accelerometers, a database computer system was implemented. This software has been an important step forward, providing a wide range of analysis and display tools. This paper reviews the main concepts of accelerometer secondary calibration and describes the tool developed and the methods used in its development. © 2013 Académie des sciences. Published by Elsevier Masson SAS. All rights reserved.
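Secondary calibration of the kind managed by the database described above is usually done by comparison: the device under test and a reference accelerometer sense the same vibration, and the unknown sensitivity follows from the output ratio. A minimal sketch, with hypothetical readings:

```python
def comparison_sensitivity(v_dut_mv, v_ref_mv, s_ref_mv_per_g):
    """Back-to-back comparison calibration: the sensitivity of the
    device under test is the reference sensitivity scaled by the
    ratio of the two outputs under the same vibration."""
    return s_ref_mv_per_g * v_dut_mv / v_ref_mv

# Hypothetical readings at one frequency point: the reference
# (100 mV/g) outputs 99.5 mV, the device under test 51.2 mV.
print(round(comparison_sensitivity(51.2, 99.5, 100.0), 2))  # → 51.46
```

A calibration database like the one in the paper would store such sensitivity values per frequency point, per device, per calibration date, so that drift can be tracked across recalibrations.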
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This paper proposes a Fuzzy Goal Programming (FGP) model for a real aggregate production-planning problem, applied to a Brazilian sugar and ethanol milling company. The FGP model depicts the comprehensive production process of sugar, ethanol, molasses, and derivatives, and considers the uncertainties involved in ethanol and sugar production. Decisions related to the agricultural and logistics phases were considered on a weekly planning horizon covering the whole harvesting season and the period between harvests. The research provided interesting results on decisions in the agricultural stages of cutting, loading, and transportation for sugarcane suppliers and, especially, on milling decisions, whose choice of production process includes storage and logistics distribution. © 2014 Elsevier B.V. All rights reserved.
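The building block of any fuzzy goal programming model like the one above is a membership function that turns a goal with a tolerance into a satisfaction degree between 0 and 1. A sketch with a linear, maximize-type membership; the goal and tolerance values are invented for illustration:

```python
def membership(value, goal, tolerance):
    """Linear satisfaction degree for a fuzzy goal of the form
    'roughly reach `goal`, with shortfalls up to `tolerance`
    partially acceptable' (maximize-type goal)."""
    if value >= goal:
        return 1.0
    if value <= goal - tolerance:
        return 0.0
    return (value - (goal - tolerance)) / tolerance

# Hypothetical weekly sugar-production goal: 1000 t, 200 t tolerance.
print(membership(900, 1000, 200))  # → 0.5
```

An FGP solver would attach one such function to each fuzzy goal (sugar, ethanol, molasses targets, and so on) and typically maximize the minimum membership over all goals, which is how conflicting weekly targets are traded off.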