967 results for Modeling methods


Relevance:

30.00%

Publisher:

Abstract:

This paper traces developments in credit risk modeling over the past 10 years. Our work can be divided into two parts: selecting articles and summarizing results. On the one hand, by constructing an ordered logit model on the historical Journal of Economic Literature (JEL) codes of articles about credit risk modeling, we sort out the articles most closely related to our topic. The result indicates that JEL codes have become the standard for classifying research in credit risk modeling. On the other hand, by comparison with the classical review by Altman and Saunders (1998), we observe some important changes in credit risk research methods. The main finding is that the focus of credit risk modeling has moved from static individual-level models to dynamic portfolio models.
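
The article-selection step relies on an ordered logit fit to JEL-code indicators. As a minimal sketch of that kind of model (not the authors' code; the JEL-code columns and relevance labels below are hypothetical placeholders), an ordered logit can be estimated with statsmodels:

import numpy as np
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Hypothetical data: binary JEL-code dummies (columns standing in for codes such as
# G21, G33, C25) and an ordinal relevance label (0 = unrelated ... 2 = highly related).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(60, 3))
score = X @ np.array([1.0, 0.5, -0.5]) + rng.normal(scale=0.8, size=60)
relevance = np.digitize(score, [0.3, 1.0])          # 0 / 1 / 2 labels

# Ordered logit: P(relevance <= j | x) = logistic(alpha_j - x' beta)
model = OrderedModel(relevance, X, distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())

# Articles can then be ranked by their predicted probability of the top relevance class.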

Relevance:

30.00%

Publisher:

Abstract:

Data mining can be used in the healthcare industry to "mine" clinical data and discover hidden information for intelligent and effective decision making. The discovery of hidden patterns and relationships often goes untapped, and advanced data mining techniques can help remedy this situation. This thesis mainly deals with Intelligent Prediction of Chronic Renal Disease (IPCRD). The data cover blood tests, urine tests, and external symptoms used to predict chronic renal disease. Data from the database are first imported into Weka (3.6), and the Chi-Square method is used for feature selection. After normalizing the data, three classifiers are applied and the efficiency of the output is evaluated: Decision Tree, Naïve Bayes, and the K-Nearest Neighbour (KNN) algorithm. Results show that each technique has its unique strengths in realizing the objectives of the defined mining goals. The efficiency of the Decision Tree and KNN was almost the same, but Naïve Bayes showed a comparative edge over the others. Sensitivity and specificity are further used as statistical measures of binary classification performance: sensitivity (also called recall in some fields) measures the proportion of actual positives that are correctly identified, while specificity measures the proportion of negatives that are correctly identified. The CRISP-DM methodology is applied to build the mining models; it consists of six major phases: business understanding, data understanding, data preparation, modeling, evaluation, and deployment.
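
The thesis performs the comparison in Weka; the same three-classifier comparison with sensitivity and specificity can be sketched in Python with scikit-learn, assuming a hypothetical feature matrix in place of the real blood/urine/symptom data:

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix

# Hypothetical feature matrix (stand-in for blood/urine measurements and symptoms)
# and binary chronic-renal-disease labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("Decision Tree", DecisionTreeClassifier()),
                  ("Naive Bayes", GaussianNB()),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    y_pred = clf.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_pred).ravel()
    sensitivity = tp / (tp + fn)   # proportion of actual positives correctly identified
    specificity = tn / (tn + fp)   # proportion of negatives correctly identified
    print(f"{name}: sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")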

Relevance:

30.00%

Publisher:

Abstract:

Distributed energy and water balance models require time-series surfaces of the meteorological variables involved in hydrological processes. Most hydrological GIS-based models apply simple interpolation techniques to extrapolate the point-scale values recorded at weather stations to the watershed scale. In mountainous areas, where the monitoring network covers the complex terrain heterogeneity only poorly, simple geostatistical methods for spatial interpolation are not always representative enough, and algorithms that explicitly or implicitly account for the features creating strong local gradients in the meteorological variables must be applied. Originally developed as a meteorological pre-processing tool for a complete hydrological model (WiMMed), MeteoMap has become independent software. The interpolation algorithms used to approximate the spatial distribution of each meteorological variable were carefully selected, taking into account both the specific variable being mapped and the common lack of input data in Mediterranean mountainous areas. They include corrections with height for both rainfall and temperature (Herrero et al., 2007) and topographic corrections for solar radiation (Aguilar et al., 2010). MeteoMap is GIS-based freeware, available upon registration. Input data include weather station records and topographic data, and the output consists of tables and maps of the meteorological variables at hourly, daily, predefined rainfall-event, or annual scales. It offers its own pre- and post-processing tools, including video outlook, map printing, and the possibility of exporting the maps to images or ASCII ArcGIS formats. This study presents the user-friendly interface of the software and shows some case studies with applications to hydrological modeling.
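
As a rough illustration of the kind of height correction applied to temperature (the actual MeteoMap/WiMMed algorithms follow Herrero et al., 2007 and are more elaborate), an inverse-distance interpolation combined with a constant lapse rate could be sketched as follows; the station records and the lapse-rate value are assumptions made for the example:

import numpy as np

# Hypothetical stations: (x, y) position in km, elevation in m, air temperature in deg C.
stations = np.array([
    # x,    y,     z(m),  T(degC)
    [ 2.0,  3.0,  950.0, 12.4],
    [ 8.0,  1.5, 1420.0,  9.1],
    [ 5.5,  7.0, 2100.0,  4.8],
])
LAPSE_RATE = -0.0065  # deg C per m, an assumed constant environmental lapse rate

def interpolate_temperature(x, y, z, stations, power=2.0):
    """Inverse-distance interpolation of temperature reduced to sea level,
    then corrected back to the target elevation z."""
    t_sea = stations[:, 3] - LAPSE_RATE * stations[:, 2]     # reduce to sea level
    d = np.hypot(stations[:, 0] - x, stations[:, 1] - y)
    w = 1.0 / np.maximum(d, 1e-6) ** power
    t0 = np.sum(w * t_sea) / np.sum(w)                        # interpolated sea-level value
    return t0 + LAPSE_RATE * z                                # correct to target elevation

print(interpolate_temperature(4.0, 4.0, 1800.0, stations))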

Relevance:

30.00%

Publisher:

Abstract:

The aim of this work is to explore competitive dynamics through agent-based simulation. Building on a growing number of studies in the fields of strategy and organization theory that use simulation methods, a computational model was developed to simulate competition among firms and to observe the relative efficiency of the theorized methods of searching for performance improvement. The study also explores possible explanations for the persistence of superior or inferior firm performance, associated with conditions of competitive advantage or disadvantage.
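
A minimal agent-based sketch of this kind of setting (not the author's actual model) could let firms improve a scalar performance measure either by incremental local search or by random long jumps, and compare the two strategies after repeated rounds of competition; all parameters below are illustrative assumptions:

import random

random.seed(42)

class Firm:
    """A firm that tries to improve its performance by local or random search."""
    def __init__(self, strategy):
        self.strategy = strategy          # "local" or "random"
        self.performance = random.random()

    def search(self):
        if self.strategy == "local":
            candidate = self.performance + random.uniform(-0.05, 0.05)  # incremental change
        else:
            candidate = random.random()                                  # long jump
        candidate = min(max(candidate, 0.0), 1.0)
        if candidate > self.performance:       # adopt only improvements
            self.performance = candidate

firms = [Firm("local") for _ in range(10)] + [Firm("random") for _ in range(10)]
for _ in range(50):                            # 50 competitive rounds
    for firm in firms:
        firm.search()

for strategy in ("local", "random"):
    mean = sum(f.performance for f in firms if f.strategy == strategy) / 10
    print(strategy, round(mean, 3))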

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

The venom of Crotalus durissus terrificus snakes contains various substances, including a serine protease with thrombin-like activity, called gyroxin, that clots plasma fibrinogen and promotes fibrin formation. The aim of this study was to purify and structurally characterize the gyroxin enzyme from Crotalus durissus terrificus venom. For isolation and purification, the following methods were employed: gel filtration on a Sephadex G75 column and affinity chromatography on benzamidine Sepharose 6B; 12% SDS-PAGE under reducing conditions; N-terminal sequence analysis; cDNA cloning and expression through RT-PCR; and crystallization tests. Theoretical molecular modeling was performed using bioinformatics tools based on comparative analysis of other serine proteases deposited in the NCBI (National Center for Biotechnology Information) database. Protein N-terminal sequencing revealed a single chain with a molecular mass of approximately 30 kDa, while its full-length cDNA had 714 bp and encoded a mature protein of 238 amino acids. Crystals were obtained from solutions 2 and 5 of the Crystal Screen Kit® (two and one crystals, respectively), confirming the protein constitution of the sample. Multiple sequence alignment of gyroxin-like B2.1 with six other snake venom serine proteases (SVSPs) indicated the preservation of the cysteine residues and of the main structural elements (alpha-helices, beta-barrels and loops). The catalytic triad was located at His57, Asp102 and Ser198, and the S1 and S2 specificity sites at Thr193 and Gly215. The fibrinogen recognition and cleavage region of SVSPs was located, in the modeled gyroxin B2.1 sequence, at residues Arg60, Arg72, Gln75, Arg81, Arg82, Lys85, Glu86 and Lys87. Theoretical modeling of the gyroxin fraction generated a classical structure consisting of two alpha-helices, two beta-barrel structures, five disulfide bridges and loops at positions 37, 60, 70, 99, 148, 174 and 218. These results provide information about the functional structure of gyroxin, allowing its application in the design of new drugs.

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

The paper presents a new methodology to model material failure in two-dimensional reinforced concrete members using the Continuum Strong Discontinuity Approach (CSDA). Mixture theory is used as the methodological approach to model reinforced concrete as a composite material constituted by a plain concrete matrix reinforced with two embedded orthogonal long fiber bundles (rebars). Matrix failure is modeled on the basis of a continuum damage model equipped with strain softening, whereas the rebar effects are modeled by means of phenomenological constitutive models devised to reproduce the axial non-linear behavior as well as the bond-slip and dowel effects. The proposed methodology extends the fundamental ingredients of the standard Strong Discontinuity Approach, and of the embedded-discontinuity finite element formulations for homogeneous materials, to matrix/fiber composite materials such as reinforced concrete. The specific aspects of material failure modeling for these composites are also addressed. A number of available experimental tests are reproduced in order to illustrate the feasibility of the proposed methodology. (c) 2007 Elsevier B.V. All rights reserved.
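
For orientation, the matrix is described by a continuum damage model with strain softening; a generic isotropic form of such a model (not necessarily the exact formulation adopted in the paper) is

\sigma = (1 - d)\,\mathbf{C} : \boldsymbol{\varepsilon}, \qquad 0 \le d \le 1,

where d is a scalar damage variable driven by an equivalent-strain measure, and strain softening is obtained by letting d grow toward 1 (for instance exponentially) once the damage threshold is exceeded.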

Relevance:

30.00%

Publisher:

Abstract:

Formal methods should be used to specify and verify on-card software in Java Card applications. Furthermore, the Java Card programming style requires runtime verification of all input conditions for all on-card methods, where the main goal is to preserve the data in the card. Design by Contract, and in particular the JML language, is an option for this kind of development and verification, as runtime verification is part of the Design by Contract method implemented by JML. However, JML and its currently available tools for runtime verification were not designed with Java Card limitations in mind and are not Java Card compliant. In this thesis, we analyze how much of this situation is really intrinsic to Java Card limitations and how much is just a matter of completely redesigning JML and its tools. We propose the requirements for a new language which is Java Card compliant and indicate the lines along which a compiler for this language should be built. JCML strips from JML aspects not supported by Java Card, such as concurrency and unsupported types. This would not be enough, however, without a great effort to optimize the verification code generated by its compiler, as this verification code must run on the card. The JCML compiler, although much more restricted than the one for JML, is able to generate Java Card compliant verification code for some lightweight specifications. In conclusion, we present a Java Card compliant variant of JML, JCML (Java Card Modeling Language), together with a preliminary version of its compiler.
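
The core idea of runtime contract verification can be illustrated outside the JML/JCML syntax; the sketch below is a generic Design-by-Contract checker in Python (a stand-in for the Java Card verification code a JCML-style compiler would generate, with a hypothetical purse-debit operation as the example):

import functools

def contract(pre=None, post=None):
    """Generic Design-by-Contract decorator: check the precondition on the
    arguments and the postcondition on the result at runtime."""
    def wrap(fn):
        @functools.wraps(fn)
        def checked(*args, **kwargs):
            if pre is not None and not pre(*args, **kwargs):
                raise ValueError(f"precondition violated in {fn.__name__}")
            result = fn(*args, **kwargs)
            if post is not None and not post(result, *args, **kwargs):
                raise ValueError(f"postcondition violated in {fn.__name__}")
            return result
        return checked
    return wrap

# Hypothetical on-card-style operation: debit a purse balance.
@contract(pre=lambda balance, amount: 0 <= amount <= balance,
          post=lambda new_balance, balance, amount: new_balance == balance - amount)
def debit(balance, amount):
    return balance - amount

print(debit(100, 30))   # OK
# debit(100, 200)       # would raise: precondition violated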

Relevance:

30.00%

Publisher:

Abstract:

Objective. To determine the influence of cement thickness and ceramic/cement bonding on stresses and failure of CAD/CAM crowns, using both multi-physics finite element analysis (FEA) and monotonic testing. Methods. Axially symmetric FEA models were created for stress analysis of a stylized monolithic crown having resin cement thicknesses from 50 to 500 µm under occlusal loading. The ceramic-cement interface was modeled as bonded or non-bonded (the cement-dentin interface as bonded). Cement polymerization shrinkage was simulated as a thermal contraction. Loads necessary to reach stresses for radial cracking from the intaglio surface were calculated by FEA. Experimentally, feldspathic CAD/CAM crowns based on the FEA model were machined with different occlusal cementation spaces, etched, and cemented to dentin analogs. Non-bonding of the etched ceramic was achieved using a thin layer of poly(dimethylsiloxane). Crowns were loaded to failure at 5 N/s, with radial cracks detected acoustically. Results. Failure loads depended on the bonding condition and the cement thickness for both FEA and physical testing. Average fracture loads for bonded crowns were 673.5 N at 50 µm cement thickness and 300.6 N at 500 µm. FEA stresses due to polymerization shrinkage increased with cement thickness, overwhelming the protective effect of bonding, as was also seen experimentally. At 50 µm cement thickness, bonded crowns withstood at least twice the load of non-bonded crowns before failure. Significance. Occlusal "fit" can have structural implications for CAD/CAM crowns, with pre-cementation spaces of around 50-100 µm recommended from this study. Bonding benefits were lost at thicknesses approaching 450-500 µm due to polymerization shrinkage stresses. (C) 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
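
A common way to impose polymerization shrinkage in an FEA model (the paper's exact settings are not given here) is to convert the volumetric shrinkage of the cement into an equivalent linear thermal contraction,

\alpha \, \Delta T \approx \frac{1}{3} \, \frac{\Delta V}{V},

so that, for example, a hypothetical 3% volumetric shrinkage corresponds to a linear strain of about 1%, which can be applied as a unit temperature drop with an expansion coefficient of 0.01 assigned to the cement layer.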

Relevance:

30.00%

Publisher:

Abstract:

Purpose: The purpose of this study was to evaluate the cohesive strength of the composite when different resinous monomers are used to lubricate the instruments in the Restorative Dental Modeling Insertion Technique (RDMIT). Materials and Methods: The composite specimens were made using a prefabricated Teflon device. Different resinous monomers were used at the interface to lubricate the instruments, for a total of 72 specimens divided into 6 groups: 1. control group, no resinous monomer; 2. Composite Wetting Resin; 3. C & B Liquid; 4. Scotchbond Multi-Purpose Adhesive; 5. Adper Single Bond Adhesive; 6. Prime & Bond NT. Specimens were submitted to the circular-area tensile test to evaluate the cohesive strength at the composite interfaces. Data were analyzed using ANOVA and Tukey's test (alpha = 0.05). Results: ANOVA showed a value of p < 0.0001, indicating significant differences among the groups. The means (SD) for the different groups were: Adper Single Bond Adhesive: 26 (12) a; control group: 28 (3) ab; Prime & Bond NT: 32 (12) ab; Composite Wetting Resin: 36 (9) abc; C & B Liquid: 38 (7) bc; Scotchbond Multi-Purpose Adhesive: 46 (10) c. Groups denoted with the same letters were not significantly different. Only Scotchbond Multi-Purpose Adhesive, used for direct restorations, had a statistically significantly higher bond strength than the control group, Adper Single Bond Adhesive, and Prime & Bond NT. Adper Single Bond Adhesive showed a statistically significantly lower mean value than C & B Liquid. Conclusion: The results of this study indicate that the resinous monomers used to lubricate the instruments in the RDMIT did not alter the mechanical properties of the composite, and therefore did not reduce the cohesive bond strength at the composite interfaces.
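
The ANOVA-plus-Tukey analysis described above can be reproduced generically with scipy and statsmodels; the per-group measurements below are hypothetical placeholders (only the group means echo the reported values), not the study's raw data:

import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical cohesive-strength measurements per group (NOT the study's raw data).
rng = np.random.default_rng(1)
group_means = {"Control": 28, "CWR": 36, "C&B": 38, "SBMP": 46, "ASB": 26, "P&B NT": 32}
data = {name: rng.normal(mean, 8, size=12) for name, mean in group_means.items()}

# One-way ANOVA across the six groups (alpha = 0.05).
f_stat, p_value = f_oneway(*data.values())
print(f"ANOVA: F={f_stat:.2f}, p={p_value:.4g}")

# Tukey's HSD post-hoc test to identify which group means differ.
values = np.concatenate(list(data.values()))
labels = np.repeat(list(data.keys()), 12)
print(pairwise_tukeyhsd(values, labels, alpha=0.05))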

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

The aim of this paper is to apply methods from optimal control theory and from the theory of dynamical systems to the mathematical modeling of biological pest control. The linear feedback control problem for nonlinear systems is formulated in order to obtain the optimal pest control strategy solely through the introduction of natural enemies. Asymptotic stability of the closed-loop nonlinear Kolmogorov system is guaranteed by means of a Lyapunov function, which can be seen to be the solution of the Hamilton-Jacobi-Bellman equation, thus guaranteeing both stability and optimality. Numerical simulations for three possible scenarios of biological pest control based on Lotka-Volterra models are provided to show the effectiveness of this method. (c) 2007 Elsevier B.V. All rights reserved.
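
As a rough numerical illustration of the kind of scenario simulated (not the paper's own controller or parameter values), a Lotka-Volterra pest/natural-enemy model in which extra predators are released in proportion to the excess of the pest population over a target level can be integrated as follows:

import numpy as np
from scipy.integrate import solve_ivp

# Lotka-Volterra pest (x) / natural enemy (y) dynamics with a simple
# proportional release of natural enemies u = k * (x - x_ref).
a, b, c, d = 1.0, 0.5, 0.75, 0.25      # assumed model parameters
k, x_ref = 2.0, 0.4                    # assumed feedback gain and target pest level

def dynamics(t, state):
    x, y = state
    u = max(k * (x - x_ref), 0.0)      # release enemies only when pests exceed the target
    dx = a * x - b * x * y
    dy = -c * y + d * x * y + u
    return [dx, dy]

sol = solve_ivp(dynamics, (0.0, 40.0), [1.0, 0.5])
print("final pest level:", sol.y[0, -1])
print("final enemy level:", sol.y[1, -1])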