913 results for risk-based modeling
Abstract:
The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported and replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs and operational, and ultimately reputational, risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information with which to identify the risk and location of pending obsolescence and little money to apply to the solution. This paper presents a low-cost, structured method to identify obsolete software and the risk posed by its obsolescence: the structure of a business and its supporting IT resources are captured, modelled and analysed, and the risk to the business of technology obsolescence is identified so that remedial action can be taken using qualified obsolescence information. The technique is based on a structured modelling approach using enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in three consulting studies carried out by Capgemini involving four UK police forces. The generic technique could, however, be applied to any industry, and there are plans to improve it using ontology framework methods. This paper contains details of enterprise architecture meta-models and related modelling.
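The abstract does not include the heatmap algorithm itself; as a rough, purely illustrative sketch of the idea (scoring enterprise-architecture elements for obsolescence risk and bucketing them into heatmap bands), assuming hypothetical element attributes, weights and thresholds that are not taken from the paper:

```python
# Illustrative sketch (not the paper's algorithm): score EA elements for
# obsolescence risk and bucket them into heatmap bands.
from dataclasses import dataclass

@dataclass
class EAElement:
    name: str
    vendor_support_years_left: float  # years until vendor support ends (assumed attribute)
    business_criticality: int         # 1 (low) .. 5 (high), assumed scale
    dependent_services: int           # number of business services relying on it

def risk_score(e: EAElement) -> float:
    # Less remaining support, higher criticality and more dependants -> higher risk.
    support_factor = max(0.0, 1.0 - e.vendor_support_years_left / 5.0)
    return support_factor * e.business_criticality * (1 + e.dependent_services)

def heatmap_band(score: float) -> str:
    # Thresholds are arbitrary, chosen only for illustration.
    return "red" if score >= 10 else "amber" if score >= 4 else "green"

elements = [
    EAElement("Custody DB", 0.5, 5, 12),
    EAElement("HR portal", 4.0, 2, 3),
]
for e in elements:
    s = risk_score(e)
    print(f"{e.name}: score={s:.1f}, band={heatmap_band(s)}")
```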
Abstract:
This multicentric population-based study in Brazil is the first national effort to estimate the prevalence of hepatitis B (HBV) and associated risk factors in the capital cities of the Northeast and Central-West regions and the Federal District (2004-2005). Random multistage cluster sampling was used to select persons 13-69 years of age. Markers for HBV were tested by enzyme-linked immunosorbent assay. HBV genotypes were determined by sequencing the hepatitis B surface antigen (HBsAg). Multivariate analyses and a simple catalytic model were performed. Overall, 7,881 persons were included; < 70% were not vaccinated. Positivity for HBsAg was less than 1% among non-vaccinated persons, and genotypes A, D and F co-circulated. The incidence of infection increased with age, with a similar force of infection in all regions. Male sex and having initiated sexual activity were associated with HBV infection in the two settings; healthcare jobs and prior hospitalization were risk factors in the Federal District. Our survey classified these regions as areas with HBV endemicity and highlighted the differences in risk factors among the settings.
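For readers unfamiliar with the simple catalytic model referred to above: it assumes a constant force of infection λ, so the expected seroprevalence at age a is P(a) = 1 − exp(−λa). A minimal fitting sketch, using made-up age-stratified data rather than the survey's, could look like this:

```python
# Minimal sketch of fitting a simple catalytic model, P(a) = 1 - exp(-lambda * a),
# to age-stratified seroprevalence data. The data points below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def catalytic(age, lam):
    return 1.0 - np.exp(-lam * age)

ages = np.array([15, 25, 35, 45, 55, 65], dtype=float)       # midpoints of age bands
prevalence = np.array([0.05, 0.09, 0.14, 0.18, 0.22, 0.26])  # seropositive fraction (made up)

(lam_hat,), cov = curve_fit(catalytic, ages, prevalence, p0=[0.01])
print(f"Estimated force of infection: {lam_hat:.4f} per year")
```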
Abstract:
This paper presents a GIS-based multicriteria flood risk assessment and mapping approach applied to coastal drainage basins where hydrological data are not available. It addresses the risk associated with different types of processes: coastal inundation (storm surge), river, estuarine and flash floods, in either urban or natural areas, and fords. Based on the causes of these processes, several environmental indicators were selected to build up the risk assessment. Geoindicators include geological-geomorphologic properties of Quaternary sedimentary units, the water table, drainage basin morphometry, coastal dynamics, beach morphodynamics and microclimatic characteristics. Bioindicators involve coastal plain and low-slope native vegetation categories and two alteration states. Anthropogenic indicators encompass land use category properties such as type, occupation density, urban structure type and degree of occupation consolidation. The selected indicators were stored within an expert Geoenvironmental Information System developed for the State of Sao Paulo Coastal Zone (SIIGAL), whose attributes were mathematically classified through deterministic approaches in order to estimate natural susceptibilities (Sn), human-induced susceptibilities (Sa), the return period of rain events (Ri), potential damages (Dp) and the risk classification (R), according to the equation R = (Sn · Sa · Ri) · Dp. Thematic maps were automatically processed within the SIIGAL, in which automata cells ("geoenvironmental management units") aggregating geological-geomorphologic and land use/native vegetation categories were the units of classification. The method has been applied to the Northern Littoral of the State of Sao Paulo (Brazil) in 32 small drainage basins, proving to be very useful for coastal zone public policies, civil defense programs and flood management.
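The risk classification is a straightforward per-cell product of the classified indicators; a minimal sketch of evaluating R = (Sn · Sa · Ri) · Dp on a grid (with made-up class values, not SIIGAL data) is shown below.

```python
# Sketch of the per-cell risk classification R = (Sn * Sa * Ri) * Dp described above.
# The tiny grids below are made-up examples; real inputs come from the classified
# geoenvironmental management units.
import numpy as np

Sn = np.array([[1, 2], [3, 2]])  # natural susceptibility class per cell
Sa = np.array([[1, 1], [2, 3]])  # human-induced susceptibility class
Ri = np.array([[2, 2], [2, 2]])  # return-period class of rain events
Dp = np.array([[1, 3], [2, 3]])  # potential damage class

R = (Sn * Sa * Ri) * Dp          # element-wise product, as in the paper's equation
print(R)
```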
Abstract:
A comparison of dengue virus (DENV) antibody levels in paired serum samples collected from predominantly DENV-naive residents in an agricultural settlement in Brazilian Amazonia (baseline seroprevalence, 18.3%) showed a seroconversion rate of 3.67 episodes/100 person-years at risk during 12 months of follow-up. Multivariate analysis identified male sex, poverty, and migration from extra-Amazonian states as significant predictors of baseline DENV seropositivity, whereas male sex, a history of clinical diagnosis of dengue fever, and travel to an urban area predicted subsequent seroconversion. The laboratory surveillance of acute febrile illnesses implemented at the study site and in a nearby town between 2004 and 2006 confirmed 11 DENV infections among 102 episodes studied with DENV IgM detection, reverse transcriptase-polymerase chain reaction, and virus isolation; DENV-3 was isolated. Because DENV exposure is associated with migration or travel, personal protection measures when visiting high-risk urban areas may reduce the incidence of DENV infection in this rural population.
Abstract:
IgG antibodies to Toxoplasma gondii were detected, in March-April 2004, in 65.8% (95% confidence interval, 60.8-70.8%) of 342 systematically sampled subjects 5-90 years of age (87.5% of those eligible) living in a rural settlement in Amazonia, with a seroconversion rate of 9% over 1 year of follow-up of 99 seronegative subjects. Multiple logistic regression analysis identified age as the only significant independent predictor of seropositivity at baseline. Each additional year of age increases the odds of being seropositive by 6%, and 76.8% of the subjects are expected to be seropositive at 30 years of age. A single high-prevalence spatial cluster, comprising 11.9% of the seropositive subjects, was detected in the area; compared with households located outside the cluster, households in the cluster were less likely to have dogs as pets and their heads had a lower education level. The challenges for preventing human toxoplasmosis in tropical rural settings are discussed.
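The age effect can be read directly off the reported logistic model: the odds of seropositivity are multiplied by 1.06 per year of age, and the model predicts 76.8% seropositivity at age 30. The sketch below back-solves an implied intercept from those two published figures purely for illustration (the fitted intercept itself is not reported in the abstract):

```python
# Illustrative logistic-model arithmetic based on the two figures reported in the
# abstract: odds ratio 1.06 per year of age and 76.8% predicted seropositivity at 30.
import math

beta_age = math.log(1.06)                      # +6% odds per additional year of age
logit_at_30 = math.log(0.768 / (1 - 0.768))    # logit of the reported prediction at 30
intercept = logit_at_30 - 30 * beta_age        # implied intercept (not reported)

def p_seropositive(age):
    z = intercept + beta_age * age
    return 1 / (1 + math.exp(-z))

print(f"Predicted seroprevalence at age 30: {p_seropositive(30):.1%}")
print(f"Predicted seroprevalence at age 60: {p_seropositive(60):.1%}")
```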
Abstract:
Inhibition of microtubule function is an attractive rational approach to anticancer therapy. Although taxanes are the most prominent among the microtubule stabilizers, their clinical toxicity, poor pharmacokinetic properties, and resistance have stimulated the search for new antitumor agents having the same mechanism of action. Discodermolide is an example of a nontaxane natural product with the same mechanism of action, demonstrating superior antitumor efficacy and therapeutic index. Its extraordinary chemical and biological properties have qualified discodermolide as a lead structure for the design of novel anticancer agents with optimized therapeutic properties. In the present work, we have employed a specialized fragment-based method to develop robust quantitative structure-activity relationship (QSAR) models for a series of synthetic discodermolide analogs. The generated molecular recognition patterns were combined with three-dimensional molecular modeling studies as a fundamental step toward understanding the molecular basis of drug-receptor interactions within this important series of potent antitumor agents.
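The specialized fragment-based method used in the paper is not reproduced here; as a generic illustration of the fragment-descriptor QSAR pattern (a fragment occurrence matrix regressed against activity), a sketch with entirely synthetic data follows. The descriptor matrix, activities and PLS settings are all invented.

```python
# Generic fragment-descriptor QSAR sketch with synthetic data; it is NOT the
# specialized method of the paper, only the general regression pattern.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_analogs, n_fragments = 40, 120
X = rng.integers(0, 2, size=(n_analogs, n_fragments)).astype(float)  # fragment presence/absence
true_w = rng.normal(0, 1, n_fragments) * (rng.random(n_fragments) < 0.1)
y = X @ true_w + rng.normal(0, 0.3, n_analogs)                       # simulated activity values

model = PLSRegression(n_components=5)
q2 = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"Cross-validated R^2 per fold: {np.round(q2, 2)}")
```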
Abstract:
This paper traces the development of credit risk modeling over the past 10 years. Our work can be divided into two parts: selecting articles and summarizing results. On the one hand, by constructing an ordered logit model on the historical Journal of Economic Literature (JEL) codes of articles about credit risk modeling, we sort out the articles most related to our topic. The result indicates that JEL codes have become the standard for classifying research in credit risk modeling. On the other hand, comparing with the classical review of Altman and Saunders (1998), we observe some important changes in the research methods of credit risk. The main finding is that the focus of credit risk modeling has moved from static individual-level models to dynamic portfolio models.
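As a toy illustration of the article-selection step (an ordered logit over JEL-code indicators predicting a relevance category), one might proceed as below; the JEL dummies, relevance labels and data are all invented, not the authors' dataset.

```python
# Toy ordered-logit sketch: relevance category (0 = unrelated, 1 = related,
# 2 = highly related) regressed on JEL-code indicator variables. All data invented.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 200
X = pd.DataFrame({
    "jel_G21": rng.integers(0, 2, n),   # banks / depository institutions
    "jel_G32": rng.integers(0, 2, n),   # financing policy / financial risk
    "jel_C25": rng.integers(0, 2, n),   # discrete regression / qualitative choice models
})
latent = 1.2 * X["jel_G21"] + 1.5 * X["jel_G32"] + 0.8 * X["jel_C25"] + rng.logistic(size=n)
relevance = pd.cut(latent, bins=[-np.inf, 0.8, 2.0, np.inf], labels=[0, 1, 2]).astype(int)

res = OrderedModel(relevance, X, distr="logit").fit(method="bfgs", disp=False)
print(res.summary())
```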
Abstract:
A major problem in e-service development is the prioritization of the requirements of different stakeholders. The main stakeholders are governments and their citizens, all of whom have different and sometimes conflicting requirements. In this paper, the prioritization problem is addressed by combining a value-based approach with an illustration technique. This paper examines the following research question: how can multiple stakeholder requirements be illustrated from a value-based perspective in order to be prioritizable? We used an e-service development case taken from a Swedish municipality to elaborate on our approach. Our contributions are: 1) a model of the domains relevant to requirement prioritization: government, citizens, technology, finances, and laws and regulations; and 2) a requirement fulfillment analysis (RFA) tool that consists of a requirement-goal-value matrix (RGV) and a calculation and illustration module (CIM). The model reduces cognitive load, helps developers to focus on value fulfillment in e-service development and supports them in the formulation of requirements. It also offers an input to public policy makers, should they aim to target values in the design of e-services.
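The internal mechanics of the RGV and CIM are not given in the abstract; a hypothetical sketch of the kind of calculation involved (requirement fulfillment propagated through requirement-goal and goal-value weights to per-value scores) is shown below, with all weights invented.

```python
# Hypothetical sketch of a requirement fulfillment calculation: requirements map to
# goals, goals map to values, and fulfilled requirements propagate a score to values.
import numpy as np

req_to_goal = np.array([      # rows: requirements R1..R3, cols: goals G1..G2 (0..1 weights)
    [1.0, 0.0],
    [0.5, 0.5],
    [0.0, 1.0],
])
goal_to_value = np.array([    # rows: goals, cols: stakeholder values (assumed names)
    [0.8, 0.2],
    [0.3, 0.7],
])
req_fulfilled = np.array([1.0, 0.5, 0.0])   # degree to which each requirement is met

value_fulfillment = req_fulfilled @ req_to_goal @ goal_to_value
print(dict(zip(["transparency", "efficiency"], np.round(value_fulfillment, 2))))
```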
Abstract:
Distributed energy and water balance models require time-series surfaces of the meteorological variables involved in hydrological processes. Most hydrological GIS-based models apply simple interpolation techniques to extrapolate the point-scale values registered at weather stations to the watershed scale. In mountainous areas, where the monitoring network covers the complex terrain heterogeneity ineffectively, simple geostatistical methods for spatial interpolation are not always representative enough, and algorithms that explicitly or implicitly account for the features creating strong local gradients in the meteorological variables must be applied. Originally developed as a meteorological pre-processing tool for a complete hydrological model (WiMMed), MeteoMap has become an independent piece of software. The individual interpolation algorithms used to approximate the spatial distribution of each meteorological variable were carefully selected taking into account both the specific variable being mapped and the common lack of input data in Mediterranean mountainous areas. They include corrections with height for both rainfall and temperature (Herrero et al., 2007) and topographic corrections for solar radiation (Aguilar et al., 2010). MeteoMap is GIS-based freeware, available upon registration. Input data include weather station records and topographic data, and the output consists of tables and maps of the meteorological variables at hourly, daily, predefined rainfall-event-duration or annual scales. It offers its own pre- and post-processing tools, including video outlook, map printing and the possibility of exporting the maps to image or ASCII ArcGIS formats. This study presents the user-friendly interface of the software and shows some case studies with applications to hydrological modeling.
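The actual interpolation algorithms are those cited (Herrero et al., 2007; Aguilar et al., 2010) and are not reproduced here; as a generic illustration of the kind of height-corrected interpolation mentioned, the sketch below combines inverse-distance weighting with an assumed constant lapse rate.

```python
# Generic sketch (not MeteoMap's actual algorithm): interpolate station temperatures
# to a target cell with inverse-distance weighting after reducing them to sea level
# with an assumed lapse rate, then re-apply the correction at the cell elevation.
import numpy as np

LAPSE_RATE = -0.0065  # degC per metre, standard-atmosphere assumption

stations = np.array([          # x (m), y (m), elevation (m), temperature (degC)
    [0.0,    0.0,  400.0, 12.0],
    [5000.0, 0.0, 1200.0,  7.5],
    [0.0, 5000.0,  900.0,  9.0],
])
target_xy, target_z = np.array([2000.0, 2000.0]), 1500.0

t_sea_level = stations[:, 3] - LAPSE_RATE * stations[:, 2]   # reduce to sea level
dist = np.linalg.norm(stations[:, :2] - target_xy, axis=1)
weights = 1.0 / dist**2                                      # inverse-distance weights
t_target = np.average(t_sea_level, weights=weights) + LAPSE_RATE * target_z
print(f"Interpolated temperature at target cell: {t_target:.2f} degC")
```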
Abstract:
We consider a class of sampling-based decomposition methods to solve risk-averse multistage stochastic convex programs. We prove a formula for the computation of the cuts necessary to build the outer linearizations of the recourse functions. This formula can be used to obtain an efficient implementation of Stochastic Dual Dynamic Programming applied to convex nonlinear problems. We prove the almost sure convergence of these decomposition methods when the relatively complete recourse assumption holds. We also prove the almost sure convergence of these algorithms when applied to risk-averse multistage stochastic linear programs that do not satisfy the relatively complete recourse assumption. The analysis is first done assuming the underlying stochastic process is interstage independent and discrete, with a finite set of possible realizations at each stage. We then indicate two ways of extending the methods and convergence analysis to the case when the process is interstage dependent.
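The paper's contribution is the precise cut formula for the risk-averse recourse functions, which cannot be reproduced from the abstract alone; for orientation, the standard risk-neutral SDDP cut built at a trial point x_{t-1}^k has the familiar form below (notation assumed: realizations ξ_{t,j} with probabilities p_{t,j}, stage-t value function Q_t, and optimal dual multipliers π_{t,j}^k at the trial point).

\[
\mathcal{Q}_t(x_{t-1}) \;\ge\; \sum_{j} p_{t,j}\Big[\, Q_t\big(x_{t-1}^k,\xi_{t,j}\big) + \big\langle \pi_{t,j}^k,\; x_{t-1}-x_{t-1}^k \big\rangle \,\Big].
\]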
Abstract:
We consider risk-averse convex stochastic programs expressed in terms of extended polyhedral risk measures. We derive computable confidence intervals on the optimal value of such stochastic programs using the Robust Stochastic Approximation and the Stochastic Mirror Descent (SMD) algorithms. When the objective functions are uniformly convex, we also propose a multistep extension of the Stochastic Mirror Descent algorithm and obtain confidence intervals on both the optimal values and optimal solutions. Numerical simulations show that our confidence intervals are much less conservative and are quicker to compute than previously obtained confidence intervals for SMD and that the multistep Stochastic Mirror Descent algorithm can obtain a good approximate solution much quicker than its nonmultistep counterpart. Our confidence intervals are also more reliable than asymptotic confidence intervals when the sample size is not much larger than the problem size.
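For context, the Stochastic Mirror Descent iteration that these confidence intervals instrument is the standard prox-mapping step (notation assumed: stochastic subgradient G(x_k, ξ_k), step size γ_k, and Bregman divergence D_ω of the distance-generating function ω):

\[
x_{k+1} \;=\; \operatorname*{arg\,min}_{x \in X} \Big\{ \gamma_k \big\langle G(x_k,\xi_k),\, x \big\rangle + D_{\omega}(x, x_k) \Big\}.
\]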
Abstract:
In a world where organizations are ever more complex, knowledge of the organizational self is a growing necessity. The DEMO methodology sets the goal of specifying the organizational self, capturing the essence of the organization in a way that is independent of its implementation and is also coherent, consistent, complete, modular and objective. But such a notion of the organizational self is of little use if it is not shared by the organization's actors. To achieve this goal in a society that has grown attached to technology and where time is of utmost importance, using a tool such as a semantic Wikipedia may be the perfect way of making the information accessible. However, to establish the DEMO methodology on such a platform, bridges must be created between its modeling components and the semantic Wikipedia. Our thesis focuses on that aspect, trying to establish and implement, using a case study, the principles of a way of transforming the DEMO methodology diagrams into comprehensive pages on a semantic Wikipedia while keeping them as abstract as possible, to allow extensibility and generalization to all diagrams without losing any valuable information, so that, if desired, those diagrams may be recreated from the semantic pages, making the process a full cycle.
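The thesis' concrete mapping between DEMO diagram elements and semantic pages is not reproduced in the abstract; as an illustration of the general idea, a DEMO transaction kind could be rendered as a wiki page with Semantic MediaWiki-style property annotations, as sketched below (the property names and example values are hypothetical, not the thesis' actual ontology).

```python
# Illustrative generation of Semantic MediaWiki markup for a DEMO transaction kind.
# The property names (Has initiator, Has executor, ...) are hypothetical.
def transaction_page(tid: str, name: str, initiator: str, executor: str) -> str:
    return "\n".join([
        f"This page describes DEMO transaction kind {tid} ({name}).",
        f"* Initiator: [[Has initiator::{initiator}]]",
        f"* Executor: [[Has executor::{executor}]]",
        f"* Result kind: [[Has result kind::R-{tid}]]",
        "[[Category:DEMO transaction kind]]",
    ])

print(transaction_page("T01", "course enrolment", "A02 Student", "A01 Registrar"))
```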
Abstract:
The purpose of this study was to identify whether the activity modeling framework supports problem analysis and provides a traceable and tangible connection from problem identification to solution modeling. Methodology validation relied on a real problem from a Portuguese teaching syndicate (ASPE) regarding course development and management. The study was carried out with the aim of elaborating a complete tutorial on how to apply the activity modeling framework to a real-world problem. For each step of activity modeling, we provided a summary explanation of the relevant elements required to perform it, pointed out some improvements and applied it to ASPE's real problem. It was found that activity modeling enables well-structured problem analysis and provides a guiding thread between problem and solution modeling. It was concluded that activity-based task modeling is key to shortening the gap between problem and solution. The results revealed that the solution obtained using the activity modeling framework solved the core concerns of our customer and allowed them to enhance the quality of their course development and management. The principal conclusion was that activity modeling is a properly defined methodology that supports software engineers in problem analysis, maintaining a traceable guide between problem and solution.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The paper presents a methodology to model three-dimensional reinforced concrete members by means of embedded discontinuity elements based on the Continuum Strong Discontinuity Approach (CSDA). Mixture theory concepts are used to model reinforced concrete as a 3D composite material constituted of concrete with bundles of long fibers (rebars), oriented in different directions, embedded in it. The effects of the rebars are modeled by phenomenological constitutive models devised to reproduce the axial non-linear behavior, as well as the bond-slip and dowel action. The paper presents the constitutive models assumed for the components and the compatibility conditions chosen to constitute the composite. Numerical analyses of previously tested experimental reinforced concrete members are presented, illustrating the applicability of the proposed methodology.
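For context, the mixture-theory idea referenced above is commonly written as a volume-fraction-weighted sum of the component stresses, with each rebar bundle contributing uniaxially along its orientation e_i; here k_c and k_{f_i} denote the volume fractions of the concrete matrix and of each rebar family (this is the standard form, not necessarily the paper's exact notation):

\[
\boldsymbol{\sigma} \;=\; k_c\,\boldsymbol{\sigma}_c(\boldsymbol{\varepsilon}) \;+\; \sum_{i} k_{f_i}\,\sigma_{f_i}(\varepsilon_{f_i})\,\mathbf{e}_i \otimes \mathbf{e}_i,
\qquad k_c + \sum_{i} k_{f_i} = 1.
\]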