922 results for Modeling Development
Abstract:
This paper traces developments in credit risk modeling over the past 10 years. Our work can be divided into two parts: selecting articles and summarizing results. On the one hand, by constructing an ordered logit model on the historical Journal of Economic Literature (JEL) codes of articles about credit risk modeling, we sort out the articles most related to our topic. The result indicates that JEL codes have become the standard for classifying research in credit risk modeling. On the other hand, compared with the classical review of Altman and Saunders (1998), we observe some important changes in the research methods of credit risk. The main finding is that the focus of credit risk modeling has moved from static individual-level models to dynamic portfolio models.
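A minimal sketch of the article-selection step this abstract describes: fitting an ordered logit model on JEL-code indicators to score how related each article is to credit risk modeling. The JEL codes, feature construction, and data below are illustrative assumptions, not the paper's dataset.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    # 1 if an article carries the given JEL code, 0 otherwise (hypothetical codes)
    "jel_G21": rng.integers(0, 2, n),  # banks; depository institutions
    "jel_G32": rng.integers(0, 2, n),  # financing policy; financial risk
    "jel_C51": rng.integers(0, 2, n),  # model construction and estimation
})
# Simulated ordinal relevance label: 0 = unrelated, 1 = somewhat, 2 = highly related
latent = 1.2 * df["jel_G21"] + 1.5 * df["jel_G32"] + 0.8 * df["jel_C51"] + rng.normal(0, 1, n)
df["relevance"] = pd.cut(latent, bins=[-np.inf, 0.8, 2.0, np.inf], labels=False)

# Ordered logit: thresholds play the role of intercepts, so no constant is added.
model = OrderedModel(df["relevance"], df[["jel_G21", "jel_G32", "jel_C51"]], distr="logit")
res = model.fit(method="bfgs", disp=False)
print(res.summary())
```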
Abstract:
Objective: The aim of this study was to develop a model capable of predicting variability in the mental workload experienced by frontline operators under routine and nonroutine conditions. Background: Excess workload is a risk that needs to be managed in safety-critical industries. Predictive models are needed to manage this risk effectively yet are difficult to develop. Much of the difficulty stems from the fact that workload prediction is a multilevel problem. Method: A multilevel workload model was developed in Study 1 with data collected from an en route air traffic management center. Dynamic density metrics were used to predict variability in workload within and between work units while controlling for variability among raters. The model was cross-validated in Studies 2 and 3 with the use of a high-fidelity simulator. Results: Reported workload generally remained within the bounds of the 90% prediction interval in Studies 2 and 3. Workload crossed the upper bound of the prediction interval only under nonroutine conditions. Qualitative analyses suggest that nonroutine events caused workload to cross the upper bound of the prediction interval because the controllers could not manage their workload strategically. Conclusion: The model performed well under both routine and nonroutine conditions and over different patterns of workload variation. Application: Workload prediction models can be used to support both strategic and tactical workload management. Strategic uses include the analysis of historical and projected workflows and the assessment of staffing needs. Tactical uses include the dynamic reallocation of resources to meet changes in demand.
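A minimal sketch, on simulated data, of the multilevel idea this abstract describes: predicting workload from a dynamic-density-style traffic metric with grouping structure for work units and raters, then flagging observations outside a 90% prediction interval. The variable names, grouping scheme, and interval construction are assumptions for illustration, not the study's model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "sector": rng.integers(0, 6, n).astype(str),       # work unit
    "rater": rng.integers(0, 10, n).astype(str),       # controller giving the rating
    "traffic_density": rng.uniform(0, 30, n),          # aircraft-count style metric
})
df["workload"] = 1.0 + 0.15 * df["traffic_density"] + rng.normal(0, 0.8, n)

# Random intercepts for sector; rater entered as a variance component
# (in the real data raters are crossed with sectors; this is a simplification).
m = smf.mixedlm("workload ~ traffic_density", df, groups="sector",
                vc_formula={"rater": "0 + C(rater)"}).fit()

pred = m.predict(df)
resid_sd = np.sqrt(m.scale)
# Approximate 90% interval from the residual variance alone,
# ignoring uncertainty in the fixed and random effects.
lo, hi = pred - 1.645 * resid_sd, pred + 1.645 * resid_sd
print("share outside interval:", np.mean((df["workload"] < lo) | (df["workload"] > hi)))
```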
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
In the early stages of design and modeling, computers and computer applications are often considered an obstacle rather than a facilitator of the process. Most notably, brainstorming, process modeling with business experts, and development planning are often performed by a team in front of a whiteboard. While "whiteboarding" is recognized as an effective tool, low-tech solutions that allow remote participants to contribute are still not generally available. This is a striking observation, considering that the vast majority of teams in large organizations are distributed teams. This was also one of the key triggers behind the project described in this article, in which a team of corporate researchers set out to identify state-of-the-art technologies that could facilitate the scenario mentioned above. This paper is an account of a research project in the area of enterprise collaboration, with a strong focus on aspects of human-computer interaction in mixed-mode environments, especially in areas of collaboration where computers still play a secondary role. It describes a currently running corporate research project. © 2012 Springer-Verlag.
Abstract:
The removal of sulfur compounds from gasoline is a subject of great interest in the petroleum refining industry, given increasingly strict environmental restrictions on the maximum sulfur content of finished products. The most common solution for removing these contaminants is hydrotreating units, which operate at high pressure, carry high installation and operating costs, and also cause octane loss in the finished product. The use of membranes is a promising alternative for reducing the sulfur content of gasoline streams and has several advantages over conventional hydrotreating. A thorough understanding of the parameters that influence the sorption and diffusion steps is critical to developing this application. This work evaluated the selectivity and sorption of the system formed by n-heptane and thiophene in polymers using rigorous thermodynamic models based on group contributions. The UNIFAC-FV model, a variant of the traditional UNIFAC model for polymeric systems, was chosen to calculate the activity of the systems studied. The availability of parameters for developing the model was also assessed, and an approach offering alternatives for cases where UNIFAC parameters are unavailable was developed. In those cases, the residual term of the species' activity is calculated in the form proposed by Flory-Huggins, using solubility parameters that are themselves obtained by group contribution. Among the existing group-contribution methods for calculating solubility parameters, Hoy's method showed the smallest deviations for the systems studied. The approach used in this work ultimately allows an analysis of changes to the polymer backbone configuration so as to influence its selectivity and sorption for naphtha desulfurization.
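A minimal sketch of the fallback route this abstract describes for systems lacking UNIFAC parameters: estimating solvent activity in a polymer with the Flory-Huggins form, with the interaction parameter chi computed from solubility parameters (as obtained from a group-contribution method such as Hoy's). All numerical values below are illustrative assumptions, not fitted data from the thesis.

```python
import math

R = 8.314  # J/(mol K)

def chi_from_solubility(delta_solvent, delta_polymer, v_solvent, temp, entropic=0.34):
    """chi = entropic correction + V1*(delta1 - delta2)^2 / (R*T).
    deltas in MPa^0.5 (= (J/cm3)^0.5) and v_solvent in cm3/mol, so the ratio is dimensionless."""
    return entropic + v_solvent * (delta_solvent - delta_polymer) ** 2 / (R * temp)

def solvent_activity(phi_solvent, chi):
    """Flory-Huggins: ln a1 = ln(phi1) + (1 - 1/n)*phi2 + chi*phi2^2,
    taking 1/n -> 0 for a long-chain polymer."""
    phi_polymer = 1.0 - phi_solvent
    return math.exp(math.log(phi_solvent) + phi_polymer + chi * phi_polymer ** 2)

temp = 298.15
# Illustrative values: thiophene in a polar polymer (deltas in MPa^0.5, V in cm3/mol)
chi = chi_from_solubility(delta_solvent=20.0, delta_polymer=22.0, v_solvent=79.0, temp=temp)
print(f"chi = {chi:.3f}, thiophene activity at phi = 0.05: {solvent_activity(0.05, chi):.4f}")
```

A smaller chi (closer solubility parameters) means stronger polymer-solute affinity, which is the lever the backbone-configuration analysis in the abstract exploits.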
Abstract:
The Virtual Learning Environment (VLE) is one of the fastest growing areas in educational technology research and development. In order to achieve learning effectiveness, ideal VLEs should be able to identify learning needs and customize solutions, with or without an instructor to supplement instruction. These are called Personalized VLEs (PVLEs). For PVLEs to succeed, comprehensive conceptual models of PVLEs are essential. Such conceptual modeling development is important because it facilitates early detection and correction of system development errors. Therefore, in order to capture PVLE knowledge explicitly, this paper focuses on the development of conceptual models for PVLEs, including models of knowledge primitives in terms of learner, curriculum, and situational models; models of VLEs on general pedagogical bases; and, in particular, the definition of an ontology of PVLEs on the constructivist pedagogical principle. Based on these comprehensive conceptual models, a prototype multiagent-based PVLE has been implemented. A field experiment was conducted to investigate learning achievements by comparing personalized and non-personalized systems. The results indicate that the PVLE we developed under our comprehensive ontology successfully provides significant learning achievements. These comprehensive models also provide a solid knowledge representation framework for PVLE development practice, guiding the analysis, design, and development of PVLEs. (c) 2005 Elsevier Ltd. All rights reserved.
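A minimal sketch, in plain dataclasses, of how the three knowledge primitives this abstract names (learner, curriculum, and situational models) might be represented so a personalization step can match content to a learner. The fields and the selection rule are illustrative assumptions, not the paper's ontology.

```python
from dataclasses import dataclass, field

@dataclass
class LearnerModel:
    learner_id: str
    mastered_concepts: set[str] = field(default_factory=set)
    preferred_style: str = "visual"  # e.g. visual / verbal

@dataclass
class CurriculumUnit:
    concept: str
    prerequisites: set[str] = field(default_factory=set)

@dataclass
class SituationalModel:
    has_instructor: bool
    bandwidth: str = "high"

def next_units(learner: LearnerModel, curriculum: list[CurriculumUnit]) -> list[CurriculumUnit]:
    """Constructivist-style selection: units not yet mastered whose prerequisites are all mastered."""
    return [u for u in curriculum
            if u.concept not in learner.mastered_concepts
            and u.prerequisites <= learner.mastered_concepts]

units = [CurriculumUnit("fractions"), CurriculumUnit("ratios", {"fractions"})]
me = LearnerModel("s1", mastered_concepts={"fractions"})
print([u.concept for u in next_units(me, units)])  # -> ['ratios']
```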
Abstract:
To survive adverse or unpredictable conditions in the ontogenetic environment, many organisms retain a level of phenotypic plasticity that allows them to meet the challenges of rapidly changing conditions. Larval anurans are widely known for their ability to modify behaviour, morphology and physiological processes during development, making them an ideal model system for studies of environmental effects on phenotypic traits. Although temperature is one of the most important factors influencing the growth, development and metamorphic condition of larval anurans, many studies have failed to include ecologically relevant thermal fluctuations among their treatments. We compared the growth and age at metamorphosis of striped marsh frogs Limnodynastes peronii raised in a diurnally fluctuating thermal regime and a stable regime of the same mean temperature. We then assessed the long-term effects of the larval environment on the morphology and performance of post-metamorphic frogs. Larval L. peronii from the fluctuating treatment were significantly longer throughout development and metamorphosed about 5 days earlier. Frogs from the fluctuating group metamorphosed at a smaller mass and in poorer condition compared with the stable group, and had proportionally shorter legs. Frogs from the fluctuating group showed greater jumping performance at metamorphosis and less degradation in performance during a 10-week dormancy. Treatment differences in performance could not be explained by whole-animal morphological variation, suggesting improved contractile properties of the muscles in the fluctuating group.
Abstract:
Heuristics, simulation, artificial intelligence techniques, and combinations thereof have all been employed in the attempt to make computer systems adaptive, context-aware, reconfigurable, and self-managing. This paper complements such efforts by exploring the possibility of achieving runtime adaptiveness using mathematically based techniques from the area of formal methods. It is argued that formal methods @ runtime represents a feasible approach, and promising preliminary results are summarised to support this viewpoint. The survey of existing approaches to employing formal methods at runtime is accompanied by a discussion of their challenges and of the future research required to overcome them. © 2011 Springer-Verlag.
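A minimal sketch of the "formal methods at runtime" idea this abstract argues for: a monitor checks a simple safety property ("every request is answered within N events") over the running system's trace and reports a violation that an adaptation layer could react to. The property, event names, and bound are invented for illustration.

```python
from collections import deque

class ResponseMonitor:
    """Runtime check of: every 'request' is followed by a 'response' within max_lag events."""
    def __init__(self, max_lag: int):
        self.max_lag = max_lag
        self.pending = deque()  # event indices of requests still awaiting a response
        self.t = 0

    def observe(self, event: str) -> bool:
        """Feed one event; returns False once a request has gone unanswered too long."""
        if event == "request":
            self.pending.append(self.t)
        elif event == "response" and self.pending:
            self.pending.popleft()
        self.t += 1
        return not (self.pending and self.t - self.pending[0] > self.max_lag)

mon = ResponseMonitor(max_lag=3)
trace = ["request", "idle", "idle", "idle", "idle"]  # the response never arrives
print([mon.observe(e) for e in trace])  # the trailing False values flag the violation
```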
Abstract:
Process modeling is an emergent area of Information Systems research that is characterized by an abundance of conceptual work with little empirical research. To fill this gap, this paper reports on the development and validation of an instrument to measure user acceptance of process modeling grammars. We advance an extended model for a multi-stage measurement instrument development procedure, which incorporates feedback from both expert and user panels. We identify two main contributions: first, we provide a validated measurement instrument for the study of user acceptance of process modeling grammars, which can be used to assist further empirical studies that investigate phenomena associated with the business process modeling domain. Second, in doing so, we describe in detail a procedural model for developing measurement instruments that ensures high levels of reliability and validity, which may assist fellow scholars in executing their empirical research.
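A minimal sketch of one standard reliability check used when validating a measurement instrument like the one this abstract describes: Cronbach's alpha over the items of a survey scale. The item data are simulated for illustration; the paper's actual scales and thresholds are not reproduced here.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix. alpha = k/(k-1) * (1 - sum(item vars)/var(total))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(2)
trait = rng.normal(size=(200, 1))                      # latent "acceptance" score
items = trait + rng.normal(scale=0.6, size=(200, 5))   # five correlated Likert-style items
print(f"alpha = {cronbach_alpha(items):.2f}")           # high because items share one trait
```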
Abstract:
If the land sector is to make significant contributions to mitigating anthropogenic greenhouse gas (GHG) emissions in coming decades, it must do so while concurrently expanding production of food and fiber. In our view, mathematical modeling will be required to provide scientific guidance to meet this challenge. In order to be useful in GHG mitigation policy measures, models must simultaneously meet scientific, software engineering, and human capacity requirements. They can be used to understand GHG fluxes, to evaluate proposed GHG mitigation actions, and to predict and monitor the effects of specific actions; the latter applications require a change in mindset that has parallels with the shift from research modeling to decision support. We compare and contrast six agro-ecosystem models (FullCAM, DayCent, DNDC, APSIM, WNMM, and AgMod), chosen because they are used in Australian agriculture and forestry. Underlying structural similarities in the representations of carbon flows through plants and soils in these models are complemented by a diverse range of emphases and approaches to the subprocesses within the agro-ecosystem. None of these agro-ecosystem models handles all land sector GHG fluxes, and considerable model-based uncertainty exists for soil C fluxes and enteric methane emissions. The models also show diverse approaches to the initialisation of model simulations, software implementation, distribution, licensing, and software quality assurance; each of these will differentially affect their usefulness for policy-driven GHG mitigation prediction and monitoring. Specific requirements imposed on the use of models by Australian mitigation policy settings are discussed, and areas for further scientific development of agro-ecosystem models for use in GHG mitigation policy are proposed.
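A minimal sketch of the structural core this abstract says the compared models share: carbon flowing from plant inputs into soil pools that decay at first-order rates. The pool names, rate constants, and inputs are illustrative assumptions, not parameters from FullCAM, DayCent, DNDC, APSIM, WNMM, or AgMod.

```python
def step_soil_carbon(pools, litter_input, rates, dt=1.0):
    """One timestep of a multi-pool first-order soil C model.
    pools (t C/ha) and rates (1/yr) are dicts keyed by pool name; dt in years."""
    new = {}
    for name, c in pools.items():
        decay = rates[name] * c * dt                    # first-order loss (to CO2 here)
        new[name] = c - decay + litter_input.get(name, 0.0) * dt
    return new

pools = {"active": 1.2, "slow": 8.0, "passive": 25.0}    # t C/ha
rates = {"active": 0.7, "slow": 0.05, "passive": 0.004}  # 1/yr
inputs = {"active": 1.0}                                  # t C/ha/yr from plant residues

for year in range(5):
    pools = step_soil_carbon(pools, inputs, rates)
print({name: round(c, 2) for name, c in pools.items()})
```

Real agro-ecosystem models add transfers between pools and climate modifiers on the rates; the differences among those choices are exactly the "diverse emphases" the abstract refers to.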
Abstract:
Quantitative information regarding nitrogen (N) accumulation and its distribution to leaves, stems and grains under varying environmental and growth conditions is limited for chickpea (Cicer arietinum L.). This information is required for the development of crop growth models and also for assessing the contribution of chickpea to N balances in cropping systems. Accordingly, these processes were quantified in chickpea under different environmental and growth conditions (but without water or N deficit) using four field experiments and 1325 N measurements. N concentration ([N]) in green leaves was 50 mg g-1 up to the beginning of seed growth, and then declined linearly to 30 mg g-1 at the end of the seed growth phase. [N] in senesced leaves was 12 mg g-1. Stem [N] decreased from 30 mg g-1 early in the season to 8 mg g-1 in senesced stems at maturity. Pod [N] was constant (35 mg g-1), but grain [N] decreased from 60 mg g-1 early in seed growth to 43 mg g-1 at maturity. Total N accumulation ranged between 9 and 30 g m-2. N accumulation was closely linked to biomass accumulation until maturity. N accumulation efficiency (N accumulation relative to biomass accumulation) was 0.033 g g-1 during the early growth period, when total biomass was <218 g m-2, but it decreased to 0.0176 g g-1 during the later growth period, when total biomass was >218 g m-2. During vegetative growth (up to first-pod), 58% of N was partitioned to leaves and 42% to stems. Depending on growth conditions, 37-72% of leaf N and 12-56% of stem N was remobilized to the grains. The parameter estimates and functions obtained in this study can be used in chickpea simulation models to simulate N accumulation and distribution.
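A minimal sketch turning the concentration patterns reported in this abstract into functions of the kind a chickpea simulation model would call: green-leaf [N] constant at 50 mg/g until seed growth begins, then declining linearly to 30 mg/g; grain [N] declining from 60 to 43 mg/g over seed filling. The 0-to-1 seed-fill fraction used as the development scale is an assumption for illustration.

```python
def leaf_n_conc(seed_fill_frac: float) -> float:
    """Green-leaf [N] in mg/g; seed_fill_frac is 0 at the start and 1 at the end of seed growth."""
    f = min(max(seed_fill_frac, 0.0), 1.0)
    return 50.0 - (50.0 - 30.0) * f

def grain_n_conc(seed_fill_frac: float) -> float:
    """Grain [N] in mg/g, declining from 60 early in seed growth to 43 at maturity."""
    f = min(max(seed_fill_frac, 0.0), 1.0)
    return 60.0 - (60.0 - 43.0) * f

print(leaf_n_conc(0.0), leaf_n_conc(0.5), grain_n_conc(1.0))  # 50.0 40.0 43.0
```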
Continuum Mesomechanical Finite Element Modeling in Materials Development: A State-of-the-Art Review
Abstract:
Pronunciation is an important part of speech acquisition, but little attention has been given to the mechanism or mechanisms by which it develops. Speech sound qualities, for example, have just been assumed to develop by simple imitation. In most accounts this is then assumed to occur by acoustic matching, with the infant comparing his output to that of his caregiver. There are theoretical and empirical problems with both of these assumptions, and we present a computational model, Elija, that does not learn to pronounce speech sounds this way. Elija starts by exploring the sound-making capabilities of his vocal apparatus. Then he uses the natural responses he gets from a caregiver to learn equivalence relations between his vocal actions and his caregiver's speech. We show that Elija progresses from a babbling stage to learning the names of objects. This demonstrates the viability of a non-imitative mechanism in learning to pronounce.
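A minimal sketch of the non-imitative mechanism this abstract describes: the agent babbles vocal actions, a caregiver responds with a word, and the agent stores the equivalence between its own action and the caregiver's response, later reusing the action to "name" the object. The action and word inventories are invented for illustration, not Elija's actual articulatory model.

```python
import random

vocal_actions = ["ba", "da", "gu", "ma"]
caregiver_lexicon = {"ba": "ball", "da": "dad", "gu": "good", "ma": "mum"}

def caregiver_response(action: str) -> str:
    # The caregiver reformulates the infant's sound as a real word.
    return caregiver_lexicon[action]

# Babbling phase: learn word -> own-action equivalences from caregiver responses.
equivalences = {}
for _ in range(20):
    action = random.choice(vocal_actions)
    word = caregiver_response(action)
    equivalences[word] = action  # remember which of my actions evokes this word

# Naming phase: to "say" a word, reuse the action the caregiver paired with it.
def say(word: str) -> str:
    return equivalences[word]

print(say("ball"))  # -> 'ba'
```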