976 results for Explicit hazard model


Relevance:

80.00%

Publisher:

Abstract:

A wind catcher/tower natural ventilation system was installed in a seminar room in the building of the School of Construction Management and Engineering at the University of Reading in the UK. Performance was analysed by means of ventilation tracer gas measurements, indoor climate measurements (temperature, humidity, CO2) and occupant surveys. In addition, the potential of simple design tools was evaluated by comparing observed ventilation results with those predicted by an explicit ventilation model and the AIDA implicit ventilation model. To support this analysis, external climate parameters (wind speed and direction, solar radiation, external temperature and humidity) were also monitored. The results showed the chosen ventilation design provided a substantially greater ventilation rate than an equivalent area of openable window. Air quality parameters also stayed within accepted norms, while occupants expressed general satisfaction with the system and with comfort conditions. Night cooling was maximised by using the system in combination with openable windows. Comparisons of calculations with ventilation rate measurements showed that while AIDA correlated reasonably well with the monitored performance, the widely used industry explicit model overestimated the monitored ventilation rate.
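An explicit single-opening ventilation model of the kind referred to here can be sketched with the orifice equation; the discharge coefficient and pressure-coefficient difference below are illustrative assumptions, not values from the study.

```python
import math

def explicit_ventilation_rate(area_m2, wind_speed, cp_diff=0.6, cd=0.61, rho=1.2):
    """Wind-driven flow through a single opening (orifice equation).
    cp_diff (pressure-coefficient difference across the opening) and
    cd (discharge coefficient) are illustrative assumptions."""
    delta_p = 0.5 * rho * cp_diff * wind_speed ** 2       # wind pressure difference (Pa)
    return cd * area_m2 * math.sqrt(2.0 * delta_p / rho)  # volume flow rate (m^3/s)

# e.g. a 0.25 m^2 opening in a 4 m/s wind
q = explicit_ventilation_rate(area_m2=0.25, wind_speed=4.0)
```

Because the flow rate grows with the square root of the pressure difference, it scales linearly with wind speed, which is one reason such explicit models can overestimate performance when local sheltering reduces the effective wind pressure.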

Relevance:

80.00%

Publisher:

Abstract:

Valuation is the process of estimating price. The methods used to determine value attempt to model the thought processes of the market and thus estimate price by reference to observed historic data. This can be done using either an explicit model, which models the worth calculation of the most likely bidder, or an implicit model, which uses historic data, suitably adjusted, as a short cut to determine value by reference to previous similar sales. The former is generally referred to as the Discounted Cash Flow (DCF) model and the latter as the capitalisation (or All Risk Yield) model. However, regardless of the technique used, the valuation will be affected by uncertainties: uncertainty in the available comparable data, uncertainty in current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the estimate of price. In a previous paper, we considered the way in which uncertainty is allowed for in the capitalisation model in the UK. In this paper, we extend the analysis to look at the way in which uncertainty can be incorporated into the explicit DCF model. This is done by recognising that the input variables are uncertain and that each will have its own probability distribution. Thus, by utilising a probability-based valuation model (using Crystal Ball), it is possible to incorporate uncertainty into the analysis and address the shortcomings of the current model. Although the capitalisation model is discussed, the paper concentrates upon the application of Crystal Ball to the Discounted Cash Flow approach.
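A probability-based DCF of the kind described (Crystal Ball applies Monte Carlo sampling to uncertain inputs) can be sketched in a few lines; all figures and distributions below are illustrative assumptions, not data from the paper.

```python
import random
import statistics

def dcf_value(rent, growth, exit_yield, discount_rate, years=5):
    """Explicit DCF: discount each year's rent, then an exit value
    (final rent capitalised at the exit yield) at the end of the horizon."""
    pv = 0.0
    cash = rent
    for t in range(1, years + 1):
        cash *= (1 + growth)
        pv += cash / (1 + discount_rate) ** t
    pv += (cash / exit_yield) / (1 + discount_rate) ** years
    return pv

random.seed(42)
# Input distributions are illustrative assumptions only.
samples = [
    dcf_value(rent=100_000,
              growth=random.gauss(0.03, 0.01),       # uncertain rental growth
              exit_yield=random.gauss(0.06, 0.005),  # uncertain exit yield
              discount_rate=0.08)
    for _ in range(10_000)
]
mean_value = statistics.mean(samples)   # central estimate of price
spread = statistics.stdev(samples)      # uncertainty in the output figure
```

The point of the exercise is the `spread`: instead of a single point estimate, the valuer obtains a distribution of possible prices from which confidence intervals can be read off.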

Relevance:

80.00%

Publisher:

Abstract:

High-elevation forests represent a large fraction of potential carbon uptake in North America, but this uptake is not well constrained by observations. Additionally, forests in the Rocky Mountains have recently been severely damaged by drought, fire, and insect outbreaks, which have been quantified at local scales but not assessed in terms of carbon uptake at regional scales. The Airborne Carbon in the Mountains Experiment was carried out in 2007 partly to assess carbon uptake in western U.S. mountain ecosystems. The magnitude and seasonal change of carbon uptake were quantified by (1) paired upwind-downwind airborne CO2 observations applied in a boundary layer budget, (2) a spatially explicit ecosystem model constrained using remote sensing and flux tower observations, and (3) a downscaled global tracer transport inversion. Top-down approaches had mean carbon uptake equivalent to flux tower observations at a subalpine forest, while the ecosystem model showed less. The techniques disagreed on temporal evolution. Regional carbon uptake was greatest in the early summer immediately following snowmelt and tended to lessen as the region experienced dry summer conditions. This reduction was more pronounced in the airborne budget and inversion than in flux tower or upscaling, possibly related to lower snow water availability in forests sampled by the aircraft, which were lower in elevation than the tower site. Changes in vegetative greenness associated with insect outbreaks were detected using satellite reflectance observations, but impacts on regional carbon cycling were unclear, highlighting the need to better quantify this emerging disturbance effect on montane forest carbon cycling.

Relevance:

80.00%

Publisher:

Abstract:

Hazard models, also known as time-to-failure or duration models, are used to determine which independent variables have the greatest explanatory power in predicting corporate bankruptcy. They are an alternative approach to binary logit and probit models and to discriminant analysis. Duration models should be more efficient than discrete-choice models because they take survival time into account when estimating the instantaneous probability of failure for a set of observations on an independent variable. Discrete-choice models typically ignore the time-to-failure information and provide only an estimate of failure within a given time interval. The question addressed in this work is how to use hazard models to project default rates and to build migration matrices conditioned on the state of the economy. Conceptually, the model is closely analogous to the historical default and mortality rates used in the credit literature. The semiparametric Cox proportional hazards model is tested on Brazilian non-financial firms, and the probability of default is observed to decrease markedly after the third year from loan issuance. The mean and standard deviation of the default probabilities are also found to be affected by economic cycles. We discuss how the Cox proportional hazards model can be incorporated into the four best-known credit risk management models in use today, CreditRisk+, KMV, CreditPortfolio View and CreditMetrics, and the improvements resulting from this incorporation.
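The Cox partial likelihood that underlies the model tested here can be sketched for a single covariate; the firm data below are invented for illustration, and real applications would use a fitting library such as lifelines or R's survival package.

```python
import math

def cox_partial_loglik(beta, times, events, x):
    """Negative log partial likelihood of a one-covariate Cox model.
    times: observed durations; events: 1 = default observed, 0 = censored;
    x: covariate value per firm (data below are made up for illustration)."""
    ll = 0.0
    for i, (t_i, d_i) in enumerate(zip(times, events)):
        if d_i:  # only observed defaults contribute
            # risk set: all firms still "alive" just before t_i
            risk = sum(math.exp(beta * x[j])
                       for j, t_j in enumerate(times) if t_j >= t_i)
            ll += beta * x[i] - math.log(risk)
    return -ll

times  = [2.0, 3.5, 1.0, 5.0, 4.2, 0.8]   # years until default / censoring
events = [1,   0,   1,   0,   1,   1  ]
x      = [1.2, 0.3, 2.0, 0.1, 0.9, 1.8]   # e.g. a leverage ratio (hypothetical)

# crude grid search for the beta minimising the negative log-likelihood
beta_hat = min((b / 10 for b in range(-30, 31)),
               key=lambda b: cox_partial_loglik(b, times, events, x))
```

In this toy data the early defaulters carry the larger covariate values, so the fitted coefficient comes out positive, i.e. a higher leverage ratio raises the instantaneous default hazard.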

Relevance:

80.00%

Publisher:

Abstract:

The objective of this paper is to evaluate the effect of the 1985 “Employment Services for Ex-Offenders” (ESEO) program on recidivism. Initially, the sample was randomly split into a control group and a treatment group. However, the actual treatment (mainly job-related counseling) only takes place for those selected into the treatment group conditional on finding a job and not having been arrested. We use a multiple proportional hazard model with unobserved heterogeneity for job search and recidivism time which incorporates the conditional treatment effect. We find that the program helps to reduce criminal activity, contrary to the result of the previous analysis of this data set. This finding is important for crime prevention policy.

Relevance:

80.00%

Publisher:

Abstract:

An approach using straight lines as features to solve the photogrammetric space resection problem is presented. An explicit mathematical model relating straight lines in both object and image space is used. Based on this model, Kalman filtering is applied to solve the space resection problem. The recursive property of the filter is used in an iterative process in which the sequentially estimated camera location parameters are fed back to the feature extraction process in the image. This feedback leads to a gradual reduction of the image space searched for features, and consequently eliminates the bottleneck caused by the high computational cost of the image segmentation phase. It also enables feature extraction and the determination of feature correspondence between image and object space in an automatic way, i.e., without operator interference. Results obtained from simulated and real data show that highly accurate space resection parameters are obtained, along with a progressive reduction in processing time. The obtained accuracy, the automatic correspondence process, and the short processing time show that the proposed approach can be used in many real-time machine vision systems, making possible the implementation of applications not feasible until now.
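The recursive refinement idea can be illustrated with the simplest possible case, a scalar Kalman measurement update for a static state; the numbers are invented for illustration and are not taken from the experiments.

```python
def kalman_update(x, p, z, r):
    """One Kalman measurement update for a scalar static state
    (no prediction step needed): x = current estimate, p = its variance,
    z = new measurement, r = measurement noise variance."""
    k = p / (p + r)          # Kalman gain
    x_new = x + k * (z - x)  # corrected estimate
    p_new = (1 - k) * p      # reduced uncertainty
    return x_new, p_new

# Illustrative only: refine one camera-position coordinate from a
# sequence of noisy line-based observations.
x, p = 0.0, 100.0            # vague initial guess, large variance
for z in [9.8, 10.3, 9.9, 10.1]:
    x, p = kalman_update(x, p, z, r=0.5)
```

As the variance `p` shrinks with each observation, the region of the image worth searching for the next line feature shrinks with it, which is the mechanism behind the progressive processing-time reduction described above.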

Relevance:

80.00%

Publisher:

Abstract:

Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and of systems of interacting agents as fundamental abstractions for designing, developing and managing typically distributed software systems at runtime. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a basic point of the scientific activity. Currently most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage and still largely confined to "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models becomes fundamental for comparing and evaluating the methodologies. In fact, a meta-model specifies the concepts, rules and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e. the process to be followed, the work products to be generated and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that environment plays within agent systems; however, it is clear at least that all non-agent elements of a multi-agent system are typically considered to be part of the multi-agent system environment. The key role of environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent system community, so environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions (entities of the environment encapsulating some functions) and topology abstractions (entities of the environment that represent its spatial structure, either logical or physical). In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for supporting the management of the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which designers can use to provide different levels of abstraction over multi-agent systems. The research in these fields has led to the formulation of a new version of the SODA methodology, in which environment abstractions and layering principles are exploited for engineering multi-agent systems.

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this study was to assess the impact of body mass index (BMI) on clinical outcome of patients treated by percutaneous coronary intervention (PCI) using drug-eluting stents. Patients were stratified according to BMI as normal (<25 kg/m(2)), overweight (25 to 30 kg/m(2)), or obese (>30 kg/m(2)). At 5-year follow-up all-cause death, myocardial infarction, clinically justified target vessel revascularization (TVR), and definite stent thrombosis were assessed. A complete dataset was available in 7,427 patients, of which 45%, 22%, and 33% were classified according to BMI as overweight, obese, and normal, respectively. Patients with a normal BMI were significantly older (p <0.05). Incidence of diabetes mellitus, hypertension, and dyslipidemia increased as BMI increased (p <0.05). Significantly higher rates of TVR (15.3% vs 12.8%, p = 0.02) and early stent thrombosis (1.5% vs 0.9%, p = 0.04) were observed in the obese compared to the normal BMI group. No significant difference among the 3 BMI groups was observed for the composite of death/myocardial infarction/TVR or for definite stent thrombosis at 5 years, whereas the normal BMI group was at higher risk for all-cause death at 5 years (obese vs normal BMI, hazard ratio 0.74, confidence interval 0.53 to 0.99, p = 0.05; overweight vs normal BMI, hazard ratio 0.73, confidence interval 0.59 to 0.94, p = 0.01) in the multivariate Cox proportional hazard model. Age proved to be a covariate linearly dependent on BMI in the all-cause 5-year mortality multivariate model (p = 0.001). In conclusion, the "obesity paradox" observed in 5-year all-cause mortality could be explained by the higher proportion of elderly patients in the normal BMI group and the existence of collinearity between BMI and age. However, obese patients had a higher rate of TVR and early stent thrombosis and a higher rate of other risk factors such as diabetes mellitus, hypertension, and hypercholesterolemia.

Relevance:

80.00%

Publisher:

Abstract:

Objective: To investigate the predictive value of the Strauss and Carpenter Prognostic Scale (SCPS) for transition to a first psychotic episode in subjects clinically at high risk (CHR) of psychosis. Method: Two hundred and forty-four CHR subjects participating in the European Prediction of Psychosis Study were assessed with the SCPS, an instrument that has been shown to predict outcome in patients with schizophrenia reliably. Results: At 18-month follow-up, 37 participants had made the transition to psychosis. The SCPS total score was predictive of a first psychotic episode (P < 0.0001). SCPS items that remained as independent predictors in the Cox proportional hazard model were as follows: most usual quality of useful work in the past year (P = 0.006), quality of social relations (P = 0.006), presence of thought disorder, delusions or hallucinations in the past year (P = 0.001) and reported severity of subjective distress in past month (P = 0.003). Conclusion: The SCPS could make a valuable contribution to a more accurate prediction of psychosis in CHR subjects as a second-step tool. SCPS items assessing quality of useful work and social relations, positive symptoms and subjective distress have predictive value for transition. Further research should focus on investigating whether targeted early interventions directed at the predictive domains may improve outcomes.

Relevance:

80.00%

Publisher:

Abstract:

There is an emerging interest in modeling spatially correlated survival data in biomedical and epidemiological studies. In this paper, we propose a new class of semiparametric normal transformation models for right censored spatially correlated survival data. This class of models assumes that survival outcomes marginally follow a Cox proportional hazard model with unspecified baseline hazard, and their joint distribution is obtained by transforming survival outcomes to normal random variables, whose joint distribution is assumed to be multivariate normal with a spatial correlation structure. A key feature of the class of semiparametric normal transformation models is that it provides a rich class of spatial survival models where regression coefficients have population average interpretation and the spatial dependence of survival times is conveniently modeled using the transformed variables by flexible normal random fields. We study the relationship between the spatial correlation structure of the transformed normal variables and the dependence measures of the original survival times. Direct nonparametric maximum likelihood estimation in such models is practically prohibitive due to the high-dimensional intractable integration of the likelihood function and the infinite-dimensional nuisance baseline hazard parameter. We hence develop a class of spatial semiparametric estimating equations, which conveniently estimate the population-level regression coefficients and the dependence parameters simultaneously. We study the asymptotic properties of the proposed estimators, and show that they are consistent and asymptotically normal. The proposed method is illustrated with an analysis of data from the East Boston Asthma Study, and its performance is evaluated using simulations.
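The core normal-transformation step can be sketched as follows, assuming (purely for illustration) an exponential marginal survival function; in the actual model the marginal follows a Cox proportional hazards form with unspecified baseline hazard.

```python
import math
from statistics import NormalDist

def normal_transform(t, marginal_survival):
    """Map a survival time to a standard-normal score via its marginal law:
    y = Phi^{-1}(F(t)) with F(t) = 1 - S(t). Under the model, the joint
    distribution of these scores across locations is multivariate normal
    with a spatial correlation structure."""
    u = 1.0 - marginal_survival(t)      # marginal CDF value in (0, 1)
    return NormalDist().inv_cdf(u)      # standard-normal quantile

# Illustrative marginal only: exponential survival S(t) = exp(-0.1 t)
surv = lambda t: math.exp(-0.1 * t)
y = normal_transform(5.0, surv)         # score for a survival time of 5
```

Because the transformation is monotone, spatial dependence imposed on the normal scores induces a corresponding dependence among the original survival times, which is what makes the normal random field a convenient carrier of the spatial structure.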

Relevance:

80.00%

Publisher:

Abstract:

We compared spot drug-eluting stenting (DES) to full stent coverage for treatment of long coronary stenoses. Consecutive, consenting patients with a long (>20 mm) coronary lesion of nonuniform severity and indication for percutaneous coronary intervention were randomized to full stent coverage of the atherosclerotic lesion with multiple, overlapping stenting (full DES group, n = 90) or spot stenting of hemodynamically significant parts of the lesion only (defined as diameter stenosis >50%; spot DES group, n = 89). At 1-year follow-up, 14 patients with full DES (15.6%) and 5 patients (5.6%) with spot DES had a major adverse cardiac event (MACE; p = 0.031). At 3 years, MACEs occurred in 18 patients with full DES (20%) and 7 patients (7.8%) with spot DES (p = 0.019). Cox proportional hazard model showed that the risk for MACEs was almost 60% lower in patients with spot DES compared to those with full DES (hazard ratio 0.41, 95% confidence interval 0.17 to 0.98, p = 0.044). This association remained even after controlling for age, gender, lesion length, and type of stent used (hazard ratio 0.42, 95% confidence interval 0.17 to 1.00, p = 0.05). In conclusion, total lesion coverage with DES is not necessary in the presence of diffuse disease of nonuniform severity. Selective stenting of only the significantly stenosed parts of the lesion is an appropriate therapeutic alternative in this setting, offering a favorable clinical outcome.
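The "almost 60% lower" figure follows directly from the reported hazard ratio, since under proportional hazards the relative risk reduction is one minus the hazard ratio:

```python
hazard_ratio = 0.41              # spot DES vs full DES, as reported
risk_reduction = 1.0 - hazard_ratio
# 1 - 0.41 = 0.59, i.e. the roughly 60% lower risk quoted in the text
```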

Relevance:

80.00%

Publisher:

Abstract:

Much of the knowledge about software systems is implicit, and therefore difficult to recover by purely automated techniques. Architectural layers and the externally visible features of software systems are two examples of information that can be difficult to detect from source code alone, and that would benefit from additional human knowledge. Typical approaches to reasoning about data involve encoding an explicit meta-model and expressing analyses at that level. Due to its informal nature, however, human knowledge can be difficult to characterize up-front and integrate into such a meta-model. We propose a generic, annotation-based approach to capture such knowledge during the reverse engineering process. Annotation types can be iteratively defined, refined and transformed, without requiring a fixed meta-model to be defined in advance. We show how our approach supports reverse engineering by implementing it in a tool called Metanool and by applying it to (i) analyzing architectural layering, (ii) tracking reengineering tasks, (iii) detecting design flaws, and (iv) analyzing features.

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND: This study aimed to investigate the influence of deep sternal wound infection on long-term survival following cardiac surgery. MATERIAL AND METHODS: In our institutional database we retrospectively evaluated medical records of 4732 adult patients who received open-heart surgery from January 1995 through December 2005. The predictive factors for DSWI were determined using logistic regression analysis. Then, each patient with deep sternal wound infection (DSWI) was matched with 2 controls without DSWI, according to the risk factors identified previously. After checking balance resulting from matching, short-term mortality was compared between groups using a paired test, and long-term survival was compared using Kaplan-Meier analysis and a Cox proportional hazard model. RESULTS: Overall, 4732 records were analyzed. The mean age of the investigated population was 69.3±12.8 years. DSWI occurred in 74 (1.56%) patients. Significant independent predictive factors for deep sternal infections were active smoking (OR 2.19, CI95 1.35-3.53, p=0.001), obesity (OR 1.96, CI95 1.20-3.21, p=0.007), and insulin-dependent diabetes mellitus (OR 2.09, CI95 1.05-10.06, p=0.016). Mean follow-up in the matched set was 125 months, IQR 99-162. After matching, in-hospital mortality was higher in the DSWI group (8.1% vs. 2.7% p=0.03), but DSWI was not an independent predictor of long-term survival (adjusted HR 1.5, CI95 0.7-3.2, p=0.33). CONCLUSIONS: The results presented in this report clearly show that post-sternotomy deep wound infection does not influence long-term survival in an adult general cardio-surgical patient population.

Relevance:

80.00%

Publisher:

Abstract:

In this paper we address the issue of who is most likely to participate in further training, for what reasons and at what stage of the life course. Special emphasis is given to the impact of labour-market policies to encourage further education and to a person's individual or cohort-specific opportunities to participate in further education. We apply a Cox proportional hazard model to data from the West German Life History Study, separately for women and men, within and outside the firm. Younger cohorts show not only higher proportions of participation in further education and training at early stages of the life course, they also continue to participate in higher numbers during later stages of the life course. General labour-force participation reduces, and tenure with the same firm increases, the propensity to participate in further education and training. Contrary to expectations, in Germany labour-market segmentation has been enhanced rather than reduced by further education and training policies, since in the firm-specific labour-market segment, i.e. skilled jobs in large firms, and in the public sector both women and men had a higher probability of participation. Particularly favourable conditions for participation in further education outside the firm prevailed during the first years of the labour promotion act (Arbeitsförderungsgesetz) between 1969 and 1974, but women did not benefit to the same extent as men. Training policies are, therefore, in need of continuous assessment based on a goal-achievement evaluation to avoid any unintended effects of such policies.