Abstract:
Recent observations of type Ia supernovae and of cosmic microwave background (CMB) anisotropies have revealed that most of the energy content of the Universe interacts in a repulsive manner, composing the so-called dark energy constituent of the Universe. Determining the properties of dark energy is one of the most important tasks of modern cosmology, and this is the main motivation for this work. The analysis of cosmic gravitational waves (GW) represents, besides the CMB temperature and polarization anisotropies, an additional approach to determining parameters that may constrain dark energy models and their consistency. In recent work, a generalized Chaplygin gas model was considered in a flat universe and the corresponding spectrum of gravitational waves was obtained. In the present work we add a massless gas component to that model and compare the new spectrum to the previous one. The Chaplygin gas is also used to simulate a ΛCDM model by means of a particular combination of parameters, so that the Chaplygin gas and ΛCDM models can be easily distinguished in the theoretical scenarios established here. We find that the models are strongly degenerate in the range of frequencies studied. This degeneracy is in part expected, since the models must converge to each other for particular combinations of parameters.
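For reference, the generalized Chaplygin gas is usually characterized (in conventions the abstract does not spell out) by the equation of state and resulting density evolution

p = -\frac{A}{\rho^{\alpha}}, \qquad \rho(a) = \left[ A + \frac{B}{a^{3(1+\alpha)}} \right]^{\frac{1}{1+\alpha}},

with constants A, B > 0 and 0 < \alpha \le 1. For small scale factor a the density scales as a^{-3}, like pressureless matter, while for large a it tends to the constant A^{1/(1+\alpha)}, like a cosmological constant. This interpolation is why suitable parameter combinations mimic ΛCDM, and why some degeneracy between the two models is expected.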
Abstract:
Graphical user interfaces (GUIs) are critical components of today's software, and developers are dedicating a growing portion of their code to implementing them. Given this increased importance, the correctness of GUI code is becoming essential. This paper describes the latest results in the development of GUISurfer, a tool for reverse engineering the GUI layer of interactive computing systems. The ultimate goal of the tool is to enable the analysis of interactive systems from their source code.
Abstract:
More and more of today's software systems rely on non-trivial coordination logic to combine autonomous services, typically running on different platforms and often owned by different organizations. Often, however, coordination data is deeply entangled in the code and therefore difficult to isolate and analyse separately. COORDINSPECTOR is a software tool which combines slicing and program analysis techniques to isolate all coordination elements from the source code of an existing application. Such a reverse engineering process provides a clear view of the services actually invoked as well as of the orchestration patterns which bind them together. The tool analyses Common Intermediate Language (CIL) code, the native language of the Microsoft .Net Framework. The scope of application of COORDINSPECTOR is therefore quite large: potentially any piece of code written in any of the programming languages that compile to the .Net Framework. The tool generates graphical representations of the coordination layer and identifies the underlying business process orchestrations, rendering them as Orc specifications.
Abstract:
Purpose – Casting defects are usually easy to characterize, but eradicating them can be a difficult task. In many cases, defects are caused by the combined effect of different factors whose identification is often difficult; besides, the real non-quality costs are usually unknown, or even neglected. This paper describes the development of a modular tool for quality improvement in foundries; its main objective is to present the application's potential and the foundry process areas that it covers. Design/methodology/approach – The integrated model was conceived as an expert system, designated Qualifound, which performs both qualitative and quantitative analyses. In the qualitative analysis mode, the nomenclature and description of defects are based on the classification suggested by the International Committee of the Foundry Technical Association. A database of defects was thus established, enabling one to associate defects with the relevant process operations and to identify their possible causes. The quantitative analysis mode deals with the numbers of produced and rejected castings and includes the calculation of non-quality costs. Findings – Qualifound was validated in a Portuguese foundry whose quality system had been certified according to the ISO 9000 standards. Qualifound was used in every management area, and it was concluded that the application met the technological requisites for providing the information the foundry management needs to improve process quality. Originality/value – The paper presents a successful application of an informatics tool for quality improvement in foundries.
Abstract:
Survival analysis is applied when the time until the occurrence of an event is of interest. Such data are routinely collected in plant disease studies, although applications of the method are uncommon. The objective of this study was to use two studies on post-harvest diseases of peaches, considering the two harvests jointly and a random effect shared by fruits from the same tree, to describe the main techniques of survival analysis. The nonparametric Kaplan-Meier method, the log-rank test and the semi-parametric Cox proportional hazards model were used to estimate the effect of cultivar and of the number of days after full bloom on survival until the appearance of brown rot symptoms, and on the instantaneous risk of expressing them, in two consecutive harvests. The joint analysis, with a baseline effect varying between harvests, and the confirmation of the tree as a grouping factor with random effect were appropriate for interpreting the phenomenon (the disease) under evaluation, and these techniques can be important tools to replace or complement conventional analysis, respecting the nature of the variable and of the phenomenon.
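As an illustration of the workflow described above, the following is a minimal sketch in Python using the lifelines library; the data, column names and group coding are hypothetical, and the tree-level shared random effect (frailty) used in the paper is omitted, since it requires a frailty model not shown here.

import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

# Hypothetical fruit-level data: days until brown rot symptoms,
# censoring indicator (1 = symptom observed, 0 = censored) and cultivar.
df = pd.DataFrame({
    "days":     [5, 8, 10, 12, 6, 9, 11, 14],
    "observed": [1, 1, 0, 1, 1, 1, 1, 0],
    "cultivar": [0, 0, 0, 0, 1, 1, 1, 1],
})

# Nonparametric Kaplan-Meier estimate of the survival function
km = KaplanMeierFitter().fit(df["days"], event_observed=df["observed"])
print(km.survival_function_)

# Log-rank test comparing the two cultivars
a, b = df[df.cultivar == 0], df[df.cultivar == 1]
res = logrank_test(a["days"], b["days"],
                   event_observed_A=a["observed"],
                   event_observed_B=b["observed"])
print(res.p_value)

# Semi-parametric Cox proportional hazards model with cultivar as covariate
cph = CoxPHFitter().fit(df, duration_col="days", event_col="observed")
cph.print_summary()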
Abstract:
A fast and direct surface plasmon resonance (SPR) method for the kinetic analysis of the interactions between peptide antigens and immobilised monoclonal antibodies (mAb) has been established. Protocols have been developed to overcome the problems posed by the small size of the analytes (< 1600 Da). The interactions were well described by a simple 1:1 bimolecular interaction and the rate constants were self-consistent and reproducible. The key features for the accuracy of the kinetic constants measured were high buffer flow rates, medium antibody surface densities and high peptide concentrations. The method was applied to an extensive analysis of over 40 peptide analogues towards two distinct anti-FMDV antibodies, providing data in total agreement with previous competition ELISA experiments. Eleven linear 15-residue synthetic peptides, reproducing all possible combinations of the four replacements found in foot-and-mouth disease virus (FMDV) field isolate C-S30, were evaluated. The direct kinetic SPR analysis of the interactions between these peptides and three anti-site A mAbs suggested additivity in all combinations of the four relevant mutations, which was confirmed by parallel ELISA analysis. The four-point mutant peptide (A15S30) reproducing site A from the C-S30 strain was the least antigenic of the set, in disagreement with previously reported studies with the virus isolate. Increasing peptide size from 15 to 21 residues did not significantly improve antigenicity. Overnight incubation of A15S30 with mAb 4C4 in solution showed a marked increase in peptide antigenicity not observed for other peptide analogues, suggesting that conformational rearrangement could lead to a stable peptide-antibody complex. In fact, peptide cyclization clearly improved antigenicity, confirming an antigenic reversion in a multiply substituted peptide. Solution NMR studies of both linear and cyclic versions of the antigenic loop of FMDV C-S30 showed that structural features previously correlated with antigenicity were more pronounced in the cyclic peptide. Twenty-six synthetic peptides, corresponding to all possible combinations of five single-point antigenicity-enhancing replacements in the GH loop of FMDV C-S8c1, were also studied. SPR kinetic screening of these peptides was not possible due to problems mainly related to the high mAb affinities displayed by these synthetic antigens. Solution affinity SPR analysis was employed and affinities displayed were generally comparable to or even higher than those corresponding to the C-S8c1 reference peptide A15. The NMR characterisation of one of these multiple mutants in solution showed that it had a conformational behaviour quite similar to that of the native sequence A15 and the X-ray diffraction crystallographic analysis of the peptide–mAb 4C4 complex showed paratope–epitope interactions identical to all FMDV peptide–mAb complexes studied so far. Key residues for these interactions are those directly involved in epitope–paratope contacts (141Arg, 143Asp, 146His) as well as residues able to stabilise a particular peptide global folding. A quasi-cyclic conformation is held up by a hydrophobic cavity defined by residues 138, 144 and 147 and by other key intrapeptide hydrogen bonds, delineating an open turn at positions 141, 142 and 143 (corresponding to the Arg-Gly-Asp motif).
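For reference, the "simple 1:1 bimolecular interaction" invoked above corresponds to the standard Langmuir binding model used in SPR kinetic analysis (the abstract itself does not reproduce the equations). With R the sensorgram response, R_{max} the surface binding capacity and C the (constant) analyte concentration,

\frac{dR}{dt} = k_a C (R_{max} - R) - k_d R, \qquad K_D = \frac{k_d}{k_a},

so the association and dissociation rate constants k_a and k_d are fitted to the sensorgrams, and the equilibrium dissociation constant K_D follows from their ratio.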
Abstract:
Standardized debentures allow investors in secondary markets to dispense with complex contractual analyses and sophisticated calculations. Based on non-standardized debenture contracts, this article analyzes whether, if contractual differences were statistically controlled, the ratings of these debentures would be sufficient to capture the costs embedded in the interest rates of their issues. To that end, we analyzed a sample of 24 debenture issues from the 1999-2001 period and tested whether there was any statistically significant difference (binomial distribution) in the contractual clauses of high-rating versus low-rating issues. We conclude that, for high ratings, standardization affects interest rates as a reflection of the rating. For low ratings, however, standardization does not capture specific contractual differences, such as scheduled repricing and collateral.
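A minimal sketch of the kind of binomial test described above, in Python with scipy; the clause counts are invented for illustration and are not the paper's data.

from scipy.stats import binomtest

# Hypothetical example: a given covenant appears in 9 of 12 high-rating
# issues; test its incidence against a 50% baseline.
result = binomtest(k=9, n=12, p=0.5, alternative="two-sided")
print(result.pvalue)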
Abstract:
The participation of citizens in public policies is an opportunity not only to educate them but also to increase their empowerment. However, the best way to deploy participatory policies, and to define their scope and approach, remains an open and ongoing debate. Using as a case study the Brazilian National Agency of Electric Energy (Aneel) and its public hearings on tariff reviews, this paper analyzes the democratic aspects of these hearings and challenges the hypothesis of many scholars about the social participation bias in this kind of procedure. The study finds that participation is dominated by experts, contrasting with the political content of the discussions. In this way, it contributes to a critical analysis of public hearings as a participatory tool, indicating their strengths and the aspects that deserve special attention.
Abstract:
The purpose of this paper is to analyse whether multiple-choice tests may be considered an interesting alternative for assessing knowledge, particularly in mathematics, as opposed to traditional methods such as open-question exams. In this sense we review some opinions of researchers in this area. People often perceive this kind of exam as easy to create, but that is not true: constructing well-written tests is hard work and demands writing ability from teachers. Our proposal is to analyse the difficulties in constructing multiple-choice tests, as well as some advantages and limitations of this type of test. We also survey the frequent criticisms and concerns voiced since the beginning of this objective format's use. Finally, in this context, some examples of multiple-choice items in mathematics are given, and we illustrate how this kind of test can be exploited and improved.
Abstract:
The design and development of simulation models and tools for Demand Response (DR) programs are becoming more and more important for taking full advantage of such programs. Moreover, more active consumer participation in DR programs can help improve system reliability and decrease or defer the required investments. DemSi, a DR simulator designed and implemented by the authors of this paper, allows studying DR actions and schemes in distribution networks. It undertakes the technical validation of the solution using realistic network simulation based on PSCAD. DemSi considers the players involved in DR actions, and the results can be analyzed from each specific player's point of view.
Abstract:
The study of electricity market operation has been gaining importance in recent years, as a result of the new challenges produced by the restructuring of electricity markets. This restructuring increased the competitiveness of the market, but also its complexity. The growing complexity and unpredictability of the market's evolution in turn increase the difficulty of decision making, so the intervening entities are forced to rethink their behaviour and market strategies. Currently, a great deal of information concerning electricity markets is available. These data, covering numerous aspects of electricity market operation, are accessible free of charge and are essential for understanding and suitably modelling electricity markets. This paper proposes a tool which is able to handle, store and dynamically update such data. The proposed tool is expected to be of great importance in improving the comprehension of electricity markets and of the interactions among the involved entities.
Abstract:
This paper presents a simulator for electric vehicles in the context of smart grids and distribution networks. It aims to support network operators' planning and operation, but it can also be used by other entities for related studies. The paper describes the parameters supported by the current version of the Electric Vehicle Scenario Simulator (EVeSSi) tool and its current algorithm. EVeSSi enables the definition of electric vehicle scenarios on distribution networks using a built-in movement engine. The scenarios created with EVeSSi can be used by external tools (e.g., power flow analysis) for specific studies, for instance of grid impacts. Two scenarios are briefly presented to illustrate the simulator's capabilities.
Abstract:
Short-term risk management is highly dependent on previously established long-term contractual decisions, on the agent's risk aversion factor and on the accuracy of short-term price forecasts. To address this problem, this paper provides a different approach to short-term risk management in electricity markets. Based on long-term contractual decisions, and making use of a price range forecast method developed by the authors, the short-term risk management tool presented here aims to find the optimal spot market strategies that a producer should adopt on a specific day as a function of the producer's risk aversion factor, with the objective of maximizing profit while hedging against market price volatility. Due to the complexity of the optimization problem, the authors make use of Particle Swarm Optimization (PSO) to find the optimal solution. Results from realistic data, namely from the OMEL electricity market, are presented and discussed in detail.
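Since the abstract does not detail the optimization, the following is a minimal PSO sketch in Python for a problem of this shape; the objective function, decision bounds and PSO coefficients are illustrative assumptions, not the authors' settings.

import numpy as np

rng = np.random.default_rng(0)

def profit(x):
    # Placeholder for the risk-adjusted profit of hourly spot decisions x;
    # the real objective would embed the price range forecast and the
    # producer's risk aversion factor.
    return -np.sum((x - 1.5) ** 2)

n_particles, n_dims, n_iters = 30, 24, 200   # e.g. 24 hourly decisions
w, c1, c2 = 0.72, 1.49, 1.49                 # common PSO coefficients
low, high = 0.0, 3.0                         # decision bounds

x = rng.uniform(low, high, (n_particles, n_dims))   # positions
v = np.zeros_like(x)                                # velocities
pbest, pbest_val = x.copy(), np.array([profit(p) for p in x])
gbest = pbest[np.argmax(pbest_val)].copy()

for _ in range(n_iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, low, high)
    vals = np.array([profit(p) for p in x])
    better = vals > pbest_val
    pbest[better], pbest_val[better] = x[better], vals[better]
    gbest = pbest[np.argmax(pbest_val)].copy()

print("best strategy:", gbest.round(2))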
Abstract:
This paper proposes a swarm intelligence long-term hedging tool to support electricity producers in competitive electricity markets. The tool investigates the long-term hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward) and financial (options) settlement. To find the optimal portfolio, the producer's risk preference is stated by a utility function (U) expressing the trade-off between the expectation and the variance of the return. The expected return and the variance estimate are based on a forecasted scenario interval determined by a long-term price range forecast model, developed by the authors, whose explanation is outside the scope of this paper. The proposed tool makes use of Particle Swarm Optimization (PSO), and its performance has been evaluated by comparing it with a Genetic Algorithm (GA) based approach. To validate the risk management tool, a case study using real historical price data from the mainland Spanish market is presented to demonstrate the effectiveness of the proposed methodology.
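One common form of such a mean-variance utility (the abstract does not give the exact functional form used) is

U = \mathbb{E}[R] - \lambda \, \mathrm{Var}[R],

where R is the return of the portfolio of spot, forward and option positions and \lambda > 0 encodes the producer's risk aversion: a larger \lambda favours portfolios with lower return variance at the cost of expected return.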
Abstract:
Neonatal anthropometry is an inexpensive, noninvasive and convenient tool for bedside evaluation, especially in sick and fragile neonates. Anthropometry can be used in neonates for several purposes: diagnosis of foetal malnutrition and prediction of early postnatal complications; postnatal assessment of growth, body composition and nutritional status; prediction of long-term complications, including metabolic syndrome; assessment of dysmorphology; and estimation of body surface area. However, in this age group anthropometry has been notorious for its inaccuracy, and the main concern is to make validated indices available. Direct measurements, such as body weight, length and body circumferences, are the most commonly used measurements for nutritional assessment in clinical practice and in field studies. Body weight is the most reliable anthropometric measurement and is therefore often used alone in the assessment of nutritional status, despite not reflecting body composition. Indices derived from direct measurements have been proposed to improve the accuracy of anthropometry. Equations based on body weight and length, the mid-arm circumference/head circumference ratio, and upper-arm cross-sectional areas are among the most used derived indices for assessing nutritional status and body proportionality, even though these indices require further validation for the estimation of body composition in neonates.
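As a concrete illustration of the derived indices mentioned above, the sketch below computes two classic examples in Python; the formulas shown (Rohrer's ponderal index and the mid-arm/head circumference ratio) are standard in the field, but the abstract does not state which exact equations the paper evaluates.

def ponderal_index(weight_g: float, length_cm: float) -> float:
    # Rohrer's ponderal index: 100 * weight / length**3 (g/cm^3)
    return 100.0 * weight_g / length_cm ** 3

def mac_hc_ratio(mid_arm_cm: float, head_cm: float) -> float:
    # Mid-arm circumference / head circumference ratio (dimensionless)
    return mid_arm_cm / head_cm

print(ponderal_index(3200.0, 49.0))  # e.g. a term neonate, ~2.7
print(mac_hc_ratio(10.5, 34.0))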