887 results for credit risk model.


Relevance: 30.00%

Abstract:

This paper reviews the literature concerning the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to maintain the contexts of the transactions whose data were captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in OLTP databases without the business rules that were used to process them. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture the data in the originating OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk creating semantic gaps between information captured by OLTP systems and information recalled through OLAP systems. Literature on modelling business transaction information as facts with context, as part of information systems modelling, was reviewed to identify design trends contributing to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP system design depends critically on capturing facts with associated context, encoding facts with context into data with business rules, storing and sourcing data with business rules, decoding data with business rules back into facts with context, and recalling facts with associated context.
The paper proposes UBIRQ, a design model to aid the co-design of data-with-business-rules storage for OLTP and OLAP purposes. The proposed model provides the opportunity to implement and use multi-purpose databases and business-rules stores for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of business rules, allowing both OLTP and OLAP systems to query data alongside the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the contexts of the transactions captured by the respective OLTP system.
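The abstract does not specify UBIRQ's schema, but its core idea, persisting each fact together with a reference to the business rule that encoded it so that recall can decode the fact with the same rule, can be sketched generically. All names, the rule text, and the schema below are invented for illustration:

```python
import json

# Generic sketch of 'data with business rules' storage (the UBIRQ model itself
# is not detailed in the abstract; this schema and rule ID are hypothetical).
rules_store = {
    "discount.v2": "order total >= 100 -> 10% discount applied at capture time",
}

def record_transaction(db, fact, rule_id):
    """OLTP write: persist the fact together with the rule that produced it."""
    db.append({"fact": fact, "rule_id": rule_id})

def recall(db, rules):
    """OLAP read: decode each fact with the SAME rule used at capture time."""
    return [(row["fact"], rules[row["rule_id"]]) for row in db]

db = []
record_transaction(db, {"order": 17, "total": 90.0, "list_price": 100.0}, "discount.v2")
for fact, rule in recall(db, rules_store):
    print(json.dumps(fact), "<-", rule)
```

Because the rule reference travels with the data, an analytical query can always recover why the stored total differs from the list price, which is the semantic gap the paper is concerned with.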

Relevance: 30.00%

Abstract:

1. Comparative analyses are used to address the key question of what makes a species more prone to extinction by exploring the links between vulnerability and intrinsic species traits and/or extrinsic factors. This approach requires comprehensive species data, but such information is rarely available for all species of interest. As a result, comparative analyses often rely on subsets of relatively few species that are assumed to be representative samples of the overall group studied.
2. Our study challenges this assumption and quantifies the taxonomic, spatial, and data-type biases associated with the quantity of data available for 5415 mammalian species in the freely available life-history database PanTHERIA.
3. Moreover, we explore how existing biases influence the results of comparative analyses of extinction risk by using subsets of data that attempt to correct for the detected biases. In particular, we focus on four species traits commonly linked to vulnerability (distribution range area, adult body mass, population density and gestation length) and conduct univariate and multivariate analyses to understand how biases affect model predictions.
4. Our results show important biases in data availability, with c. 22% of mammals completely lacking data. Missing data, which appear not to be missing at random, occur frequently in all traits (14–99% of cases missing). Data availability is explained by intrinsic traits, with larger mammals occupying bigger range areas being the best studied. Importantly, we find that existing biases affect the results of comparative analyses by overestimating the risk of extinction and by changing which traits are identified as important predictors.
5. Our results raise concerns over our ability to draw general conclusions regarding what makes a species more prone to extinction.
Missing data represent a prevalent problem in comparative analyses and, because the data are not missing at random, conventional approaches to filling data gaps are either invalid or present important challenges. These results show the importance of making appropriate inferences from comparative analyses by focusing on the subset of species for which data are available. Ultimately, addressing the data bias problem requires greater investment in data collection and dissemination, as well as the development of methodological approaches that effectively correct existing biases.
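The kind of bias audit the study describes can be sketched on synthetic data. The column names and the missingness mechanism below are invented stand-ins, not PanTHERIA's actual fields; the point is that when records go missing more often for small-bodied species, complete-case subsets skew toward large, wide-ranging species:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for a trait table like PanTHERIA (columns hypothetical).
rng = np.random.default_rng(0)
n = 1000
traits = pd.DataFrame({
    "body_mass_g": rng.lognormal(5, 2, n),
    "range_area_km2": rng.lognormal(10, 2, n),
    "gestation_d": rng.normal(120, 40, n),
})
# Make data 'not missing at random': small-bodied species lose records more often.
p_miss = 1 / (1 + traits["body_mass_g"] / traits["body_mass_g"].median())
for col in ["range_area_km2", "gestation_d"]:
    traits.loc[rng.random(n) < p_miss, col] = np.nan

pct_missing = traits.isna().mean() * 100          # per-trait missingness (%)
complete = traits.dropna()                        # complete-case subset
bias = complete["body_mass_g"].median() / traits["body_mass_g"].median()
print(pct_missing.round(1))
print(f"complete cases: {len(complete)} / {n}; body-mass median ratio: {bias:.2f}")
```

The median-mass ratio above 1 is exactly the kind of taxonomic skew that shifts which traits a comparative model identifies as predictors.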

Relevance: 30.00%

Abstract:

Background: Previous data support the benefits of reducing dietary saturated fatty acids (SFAs) on insulin resistance (IR) and other metabolic risk factors. However, whether the IR status of those suffering from metabolic syndrome (MetS) affects this response is not established. Objective: Our objective was to determine whether the degree of IR influences the effect of substituting high-saturated fatty acid (HSFA) diets with isoenergetic alterations in the quality and quantity of dietary fat on MetS risk factors. Design: In this single-blind, parallel, controlled dietary intervention study, MetS subjects (n = 472) from 8 European countries, classified by IR level according to the homeostasis model assessment of insulin resistance (HOMA-IR), were randomly assigned to 1 of 4 diets for 12 wk: an HSFA diet; a high-monounsaturated fatty acid (HMUFA) diet; a low-fat, high-complex carbohydrate (LFHCC) diet supplemented with long-chain n-3 polyunsaturated fatty acids (1.2 g/d); or an LFHCC diet supplemented with placebo (control). Anthropometric, lipid, inflammatory, and IR markers were determined. Results: Insulin-resistant MetS subjects with the highest HOMA-IR improved IR, with reduced insulin concentrations and HOMA-IR after consumption of the HMUFA and LFHCC n-3 diets (P < 0.05). In contrast, subjects with lower HOMA-IR showed reduced body mass index and waist circumference after consumption of the LFHCC control and LFHCC n-3 diets and increased HDL cholesterol concentrations after consumption of the HMUFA and HSFA diets (P < 0.05). MetS subjects with a low to medium HOMA-IR exhibited reduced blood pressure, triglyceride, and LDL cholesterol levels after the LFHCC n-3 diet and increased apolipoprotein A-I concentrations after consumption of the HMUFA and HSFA diets (all P < 0.05).
Conclusions: Insulin-resistant MetS subjects with more metabolic complications responded differently to dietary fat modification, being more responsive to the health benefits of replacing SFAs through the HMUFA and LFHCC n-3 diets. Conversely, MetS subjects without IR may be more sensitive to the detrimental effects of HSFA intake. The metabolic phenotype of subjects clearly determines the response to the quantity and quality of dietary fat on MetS risk factors, which suggests that targeted and personalized dietary therapies may be of value for the different metabolic features of MetS.

Relevance: 30.00%

Abstract:

Traditional approaches to the way people react to food risks often focus on ways in which the media distort information about risk, or on the deficiencies in people’s interpretation of this information. In this chapter Jones offers an alternative model which sees decisions regarding food risk as taking place at a complex nexus where different people, texts, objects and practices, each with their own histories, come together. Based on a case study of a food scandal involving a particular brand of Chinese candy, Jones argues that understanding why people respond the way they do to food risk requires tracing the itineraries along which different people, texts, objects and practices have traveled to converge at particular moments, and understanding the kinds of concrete social actions that these convergences make possible.

Relevance: 30.00%

Abstract:

Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging because of reinforcing feedbacks between multiple drivers. We conducted semi-structured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. Scenarios also considered a possible failure to respond in time to the emergent risk. This approach proved to have great potential to support decision-making for risk management. It helped identify key forcing variables and generated insights into the potential risks and trade-offs of different strategies. All scenarios showed increased wildfire risk in the event of more droughts. The 'Hands-off' scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production. The 'Fire management' scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production than the 'Fire suppression' scenario. The findings highlighted the importance of considering strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to managing wildfire risk. The FCM model could be used as a decision-support tool and serve as a 'boundary object' to facilitate collaboration and the integration of different forms of knowledge and perceptions of fire in the region. This approach also has the potential to support decisions in other dynamic frontier landscapes around the world that face an increased risk of large wildfires.
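A fuzzy cognitive map of the kind described is, at its core, a signed weighted digraph whose concept activations are iterated through a squashing function, with scenario drivers clamped. The toy concepts and weights below are invented, not the maps elicited in the focus groups:

```python
import numpy as np

# Minimal FCM sketch: W[i, j] is the causal influence of concept i on j in [-1, 1].
concepts = ["drought", "uncontrolled_fire_use", "controlled_burning", "wildfire_risk"]
W = np.array([
    [0.0,  0.0,  0.0,  0.8],    # drought raises wildfire risk
    [0.0,  0.0,  0.0,  0.6],    # uncontrolled fire use raises risk
    [0.0, -0.5,  0.0, -0.4],    # controlled burning curbs fire use and risk
    [0.0,  0.0,  0.0,  0.0],
])

def squash(x):
    return 1.0 / (1.0 + np.exp(-x))            # keep activations in (0, 1)

def run_fcm(state, W, clamp, steps=30):
    """Iterate a <- squash(a + a @ W); `clamp` pins the scenario's driver concepts."""
    a = np.asarray(state, dtype=float)
    for _ in range(steps):
        a = squash(a + a @ W)
        for i, v in clamp.items():
            a[i] = v
    return a

baseline = run_fcm([0.5] * 4, W, clamp={0: 0.5})
drier = run_fcm([0.5] * 4, W, clamp={0: 0.9})  # 'more droughts' scenario
print(f"wildfire risk: baseline {baseline[3]:.2f} -> drier {drier[3]:.2f}")
```

Scenario comparison then reduces to comparing the converged activation of the concept of interest (here, wildfire risk) under different clamped drivers.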

Relevance: 30.00%

Abstract:

Remotely sensed rainfall is increasingly being used to manage climate-related risk in gauge sparse regions. Applications based on such data must make maximal use of the skill of the methodology in order to avoid doing harm by providing misleading information. This is especially challenging in regions, such as Africa, which lack gauge data for validation. In this study, we show how calibrated ensembles of equally likely rainfall can be used to infer uncertainty in remotely sensed rainfall estimates, and subsequently in assessment of drought. We illustrate the methodology through a case study of weather index insurance (WII) in Zambia. Unlike traditional insurance, which compensates proven agricultural losses, WII pays out in the event that a weather index is breached. As remotely sensed rainfall is used to extend WII schemes to large numbers of farmers, it is crucial to ensure that the indices being insured are skillful representations of local environmental conditions. In our study we drive a land surface model with rainfall ensembles, in order to demonstrate how aggregation of rainfall estimates in space and time results in a clearer link with soil moisture, and hence a truer representation of agricultural drought. Although our study focuses on agricultural insurance, the methodological principles for application design are widely applicable in Africa and elsewhere.
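The ensemble logic the study relies on can be sketched with invented numbers: treat each calibrated ensemble member as one equally likely rainfall history, aggregate it in time, and read the index payout probability off the ensemble spread rather than off a single estimate. The gamma rainfall statistics and trigger below are hypothetical:

```python
import numpy as np

# Ensemble sketch (all numbers invented): 200 equally likely rainfall histories
# for one season, each made of 9 ten-day ('dekad') totals in millimetres.
rng = np.random.default_rng(11)
members, dekads = 200, 9
rain = rng.gamma(shape=2.0, scale=12.0, size=(members, dekads))

seasonal = rain.sum(axis=1)                    # aggregate each member in time
trigger = 150.0                                # drought-index payout trigger (mm)
p_payout = float((seasonal < trigger).mean())  # member spread = index uncertainty
print(f"P(seasonal rainfall < {trigger:.0f} mm) = {p_payout:.2f}")
```

Aggregating before thresholding is the step that, as in the soil-moisture experiment above, tightens the link between the insured index and actual conditions on the ground.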

Relevance: 30.00%

Abstract:

Sea-level rise (SLR) from global warming may have severe consequences for coastal cities, particularly when combined with predicted increases in the strength of tidal surges. Predicting the regional impact of SLR flooding depends strongly on the modelling approach and the accuracy of topographic data. Here, the areas at risk of seawater flooding in the London boroughs were quantified based on the SLR scenarios projected in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the UK Climate Projections 2009 (UKCP09), using a tidally adjusted bathtub modelling approach. Medium- to very-high-resolution digital elevation models (DEMs) are used to evaluate inundation extents as well as uncertainties. Depending on the SLR scenario and the DEMs used, it is estimated that 3%–8% of the area of Greater London could be inundated by 2100. The boroughs with the largest areas at risk of flooding are Newham, Southwark, and Greenwich. The differences in inundation areas estimated from a digital terrain model and a digital surface model are much greater than the root-mean-square error differences observed between the two data types, which may be attributed to processing levels. Flood models from SRTM data underestimate the inundation extent, so their results may not be reliable for constructing flood risk maps. This analysis provides a broad-scale estimate of the potential consequences of SLR and of the uncertainties in DEM-based bathtub-type flood inundation modelling for the London boroughs.
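A minimal bathtub model (without the tidal adjustment used in the study) is a threshold plus hydraulic connectivity: a cell floods only if it lies below the water level and is connected to the sea through other low cells. The 4x4 grid and 1.0 m level below are invented; a real run would use a LiDAR DEM:

```python
import numpy as np
from collections import deque

# Toy DEM (metres). The sea is taken to border the first row; the ridge in the
# third row isolates a low-lying inland basin that must NOT flood.
dem = np.array([
    [0.2, 0.3, 0.4, 0.5],
    [0.6, 0.8, 1.5, 0.7],
    [1.6, 1.8, 1.9, 1.7],
    [0.3, 0.4, 1.8, 0.2],
])
level = 1.0

def bathtub(dem, level, seeds):
    """Breadth-first flood fill from the seaward seed cells."""
    flooded = np.zeros(dem.shape, dtype=bool)
    q = deque(s for s in seeds if dem[s] < level)
    for s in q:
        flooded[s] = True
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < dem.shape[0] and 0 <= nc < dem.shape[1]
                    and not flooded[nr, nc] and dem[nr, nc] < level):
                flooded[nr, nc] = True
                q.append((nr, nc))
    return flooded

flooded = bathtub(dem, level, [(0, c) for c in range(dem.shape[1])])
naive = int((dem < level).sum())
print(f"cells below {level} m: {naive}; hydraulically connected: {int(flooded.sum())}")
```

The gap between the naive "below level" count and the connected count is one reason DEM quality matters: a spurious low cell in a coarse DEM can open a false pathway into an otherwise protected basin.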

Relevance: 30.00%

Abstract:

This multicentric population-based study in Brazil is the first national effort to estimate the prevalence of hepatitis B virus (HBV) infection and its risk factors in the capital cities of the Northeast and Central-West regions and in the Federal District (2004-2005). Random multistage cluster sampling was used to select persons 13-69 years of age. Markers of HBV infection were tested by enzyme-linked immunosorbent assay. HBV genotypes were determined by sequencing the hepatitis B surface antigen (HBsAg). Multivariate analyses and a simple catalytic model were performed. Overall, 7,881 persons were included, and less than 70% were not vaccinated. Positivity for HBsAg was less than 1% among non-vaccinated persons, and genotypes A, D, and F co-circulated. The incidence of infection increased with age, with a similar force of infection in all regions. Male sex and having initiated sexual activity were associated with HBV infection in both settings; healthcare jobs and prior hospitalization were risk factors in the Federal District. Our survey classified these regions with respect to HBV endemicity and highlighted the differences in risk factors among the settings.
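The "simple catalytic model" referred to assumes a constant force of infection λ, so that the proportion ever infected by age a is P(a) = 1 − exp(−λa). A sketch of recovering λ from age-stratified prevalence follows; the rate, age bands, and least-squares fit are invented for illustration (the survey itself would fit marker counts, e.g. by binomial maximum likelihood):

```python
import numpy as np

# Constant force-of-infection catalytic model: P(a) = 1 - exp(-lambda * a).
true_lam = 0.02                                # per year (hypothetical)
ages = np.array([15.0, 25.0, 35.0, 45.0, 55.0, 65.0])
prev = 1.0 - np.exp(-true_lam * ages)          # noise-free 'observed' prevalence

grid = np.linspace(0.001, 0.1, 991)            # candidate lambda values
sse = np.array([np.sum((1 - np.exp(-l * ages) - prev) ** 2) for l in grid])
lam_hat = float(grid[np.argmin(sse)])
print(f"estimated force of infection: {lam_hat:.4f} per year")
```

Fitting the same curve separately per region is what supports the study's statement that the force of infection was similar across regions.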

Relevance: 30.00%

Abstract:

The present study investigated the effects of exercise training on arterial pressure, baroreflex sensitivity, cardiovascular autonomic control and metabolic parameters in female LDL-receptor knockout ovariectomized mice. Mice were divided into two groups, sedentary and trained; the trained group underwent an exercise training protocol. Blood cholesterol was measured. Arterial pressure (AP) signals were recorded directly in conscious mice. Baroreflex sensitivity was evaluated by the tachycardic and bradycardic responses to AP changes. Cardiovascular autonomic modulation was measured in the frequency (FFT) and time domains. Maximal exercise capacity was increased in the trained group compared with the sedentary group. Blood cholesterol was lower in trained mice (191 +/- 8 mg/dL) than in sedentary mice (250 +/- 9 mg/dL, p<0.05). Mean AP and HR were reduced in the trained group (101 +/- 3 mmHg and 535 +/- 14 bpm, p<0.05) compared with the sedentary group (125 +/- 3 mmHg and 600 +/- 12 bpm). Exercise training improved the bradycardic reflex response in trained animals (-4.24 +/- 0.62 bpm/mmHg) relative to sedentary animals (-1.49 +/- 0.15 bpm/mmHg, p<0.01); tachycardic reflex responses were similar between the groups. Exercise training increased the variance (34 +/- 8 vs. 6.6 +/- 1.5 ms(2) in sedentary, p<0.005) and the high-frequency (HF) band of the pulse interval (53 +/- 7% vs. 26 +/- 6% in sedentary, p<0.01). It is tempting to speculate that the results of this experimental study might represent a rationale for this non-pharmacological intervention in the management of cardiovascular risk factors in dyslipidemic post-menopausal women.

Relevance: 30.00%

Abstract:

The incidence of melanoma is increasing worldwide. It is one of the leading cancers in pregnancy and the most common malignancy to metastasize to the placenta and fetus. There are no publications about experimental models of melanoma and pregnancy. We propose a new experimental murine model to study the effects of melanoma on pregnancy and its metastatic process. We tested several doses of melanoma cells until we arrived at the optimal dose, which produced tumor growth yet allowed animal survival to the end of pregnancy. Two control groups were used, control (C) and stress control (SC), and three routes of inoculation were compared: intravenous (IV), intraperitoneal (IP) and subcutaneous (SC). All fetuses and placentas were examined macroscopically and microscopically. The results suggest that melanoma is a risk factor for intrauterine growth restriction but does not affect placental weight. When inoculated by the SC route, the tumor grew only at the site of implantation. The IP route produced peritoneal tumor growth as well as ovarian and uterine metastases in 60% of the cases. The IV route produced pulmonary tumors. No placental or fetal metastases were obtained, regardless of the inoculation route. The injection of melanoma cells by any route did not increase the rate of fetal resorptions. Surprisingly, animals in the IV groups had no resorptions and a significantly higher number of fetuses. This finding may indicate that tumoral factors released in the host organism to favor tumor survival may also have a pro-gestational action and consequently improve the reproductive performance of these animals.

Relevance: 30.00%

Abstract:

In this paper we introduce a parametric model for handling lifetime data in which an early failure can be related either to infant-mortality failure or to wear processes, but we do not know which risk is responsible for the failure. The maximum likelihood approach and the sampling-based approach are used to obtain the inferences of interest. Some special cases of the proposed model are studied via Monte Carlo methods for the size and power of hypothesis tests. To illustrate the proposed methodology, we present an example based on a real data set.

Relevance: 30.00%

Abstract:

In this paper we deal with robust inference in heteroscedastic measurement error models. Rather than the normal distribution, we postulate a Student t distribution for the observed variables. Maximum likelihood estimates are computed numerically. Consistent estimation of the asymptotic covariance matrices of the maximum likelihood and generalized least squares estimators is also discussed. Three test statistics are proposed for testing hypotheses of interest, with an asymptotic chi-square distribution that guarantees correct asymptotic significance levels. Results of simulations and an application to a real data set are also reported.
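The robustness this approach buys can be illustrated in a deliberately simplified setting: a plain linear model with Student-t errors (the paper's heteroscedastic measurement-error structure is richer than this sketch). The t score equations weight each residual by w = (ν+1)/(ν + (e/s)²), so gross outliers receive weights near zero:

```python
import numpy as np
from scipy import optimize, stats

# Student-t likelihood fit of a line; data and the planted outlier are invented.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 60)
y = 1.0 + 2.0 * x + rng.standard_t(df=3, size=60)
y[5] += 40.0                                       # one gross outlier

nu = 3.0                                           # assumed degrees of freedom
def nll(p):
    a, b, log_s = p
    z = (y - a - b * x) / np.exp(log_s)
    return -stats.t.logpdf(z, df=nu).sum() + 60 * log_s

fit = optimize.minimize(nll, x0=[0.0, 1.0, 0.0], method="Nelder-Mead")
a_hat, b_hat, s_hat = fit.x[0], fit.x[1], float(np.exp(fit.x[2]))

# Implicit weights of the t score equations: near zero for the outlier.
w = (nu + 1) / (nu + ((y - a_hat - b_hat * x) / s_hat) ** 2)
print(f"slope {b_hat:.2f} (true 2.0); outlier weight {w[5]:.4f}")
```

Under a normal likelihood every observation has weight 1, which is why a single gross error can drag least squares arbitrarily far.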

Relevance: 30.00%

Abstract:

In this paper, we propose a new two-parameter lifetime distribution with increasing failure rate, the complementary exponential geometric distribution, which is complementary to the exponential geometric model proposed by Adamidis and Loukas (1998). The new distribution arises in a latent complementary risks scenario, in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its reliability and failure rate functions, moments (including the mean and variance), coefficient of variation, and modal value. Parameter estimation is based on the usual maximum likelihood approach. We report the results of a misspecification simulation study performed to assess the extent of misspecification errors when testing the exponential geometric distribution against our complementary one under different sample sizes and censoring percentages. The methodology is illustrated on four real datasets, and we also compare the two modelling approaches.
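The latent complementary-risks construction can be checked directly by simulation: draw N from a geometric distribution on {1, 2, ...}, take the maximum of N exponential lifetimes, and compare the empirical distribution against the closed-form CDF obtained by summing the geometric series over N. Parameter values below are invented:

```python
import numpy as np

# CEG construction: observed lifetime = max of N iid Exp(lam) latent lifetimes,
# N ~ Geometric(theta); the closed-form CDF follows from the geometric series.
rng = np.random.default_rng(7)
lam, theta, n = 0.5, 0.3, 20_000

N = rng.geometric(theta, n)                         # latent number of risks, >= 1
y = np.array([rng.exponential(1 / lam, k).max() for k in N])

def ceg_cdf(t, lam, theta):
    F = 1.0 - np.exp(-lam * t)                      # exponential CDF
    return theta * F / (1.0 - (1.0 - theta) * F)

t0 = 3.0
emp = float((y <= t0).mean())
print(f"empirical F({t0}) = {emp:.3f}; closed form = {ceg_cdf(t0, lam, theta):.3f}")
```

The agreement confirms that the maximum-over-risks mechanism and the algebraic form describe the same distribution, which is the scenario the paper builds on.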

Relevance: 30.00%

Abstract:

Steatosis is diagnosed on the basis of the macroscopic aspect of the liver evaluated by the surgeon at the time of organ extraction or by means of a frozen biopsy. In the present study, the applicability of laser-induced fluorescence (LIF) spectroscopy was investigated as a method for diagnosing different degrees of steatosis experimentally induced in rats. Rats received a high-lipid diet for different periods of time. The animals were divided into groups according to the degree of induced steatosis diagnosed by histology. The concentration of fat in the liver was correlated with LIF by means of the steatosis fluorescence factor (SFF). The histological classification according to liver fat concentration was: Severe Steatosis, Moderate Steatosis, Mild Steatosis and Control (no liver steatosis). Fluorescence intensity could be directly correlated with fat content, and it was possible to estimate the average fluorescence intensity with separate confidence intervals (P=95%) for each steatosis group. SFF was significantly higher in the Severe Steatosis group (P < 0.001) than in the Moderate Steatosis, Mild Steatosis and Control groups. The various degrees of steatosis could be directly correlated with SFF. LIF spectroscopy proved capable of identifying the degree of hepatic steatosis in this animal model and has potential for clinical application in the non-invasive evaluation of the degree of steatosis.

Relevance: 30.00%

Abstract:

The aim of this article is to discuss the estimation of systematic risk in capital asset pricing models with heavy-tailed error distributions used to explain asset returns. Diagnostic methods for assessing departures from the model assumptions, as well as the influence of observations on the parameter estimates, are also presented. It may be shown that outlying observations are downweighted in the maximum likelihood equations of linear models with heavy-tailed error distributions, such as the Student-t, power exponential, logistic II, and so on. This robustness aspect may also be extended to influential observations. An application in which the systematic risk estimate of Microsoft is compared under normal and heavy-tailed errors is presented for illustration.
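The normal-versus-heavy-tailed comparison the application makes can be sketched on simulated returns (not Microsoft's): fit the market-model beta once by least squares (the normal-errors estimate) and once by Student-t maximum likelihood, with a few planted outliers on extreme market days to exaggerate their pull on least squares:

```python
import numpy as np
from scipy import optimize, stats

# Market-model sketch of systematic risk (beta); all returns are simulated.
rng = np.random.default_rng(3)
rm = rng.normal(0.0, 1.0, 250)                      # market excess returns
ra = 0.1 + 1.2 * rm + rng.normal(0.0, 0.5, 250)     # asset returns, true beta 1.2
idx = np.argsort(np.abs(rm))[-5:]
ra[idx] += np.sign(rm[idx]) * 8.0                   # crash-day style outliers

beta_ols = np.polyfit(rm, ra, 1)[0]                 # normal-errors estimate (OLS)

def nll(p, df=4.0):
    a, b, log_s = p
    z = (ra - a - b * rm) / np.exp(log_s)
    return -stats.t.logpdf(z, df).sum() + 250 * log_s

beta_t = optimize.minimize(nll, x0=[0.0, 1.0, 0.0], method="Nelder-Mead").x[1]
print(f"beta: OLS {beta_ols:.2f} vs Student-t MLE {beta_t:.2f} (true 1.2)")
```

The t-likelihood estimate stays close to the true beta because the outlying days are downweighted, which is precisely the robustness property the article exploits.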