954 results for Explicit hazard model
Abstract:
BACKGROUND: Controversies exist regarding the indications for unicompartmental knee arthroplasty. The objective of this study is to report the mid-term results and examine predictors of failure in a metal-backed unicompartmental knee arthroplasty design. METHODS: At a mean follow-up of 60 months, 80 medial unicompartmental knee arthroplasties (68 patients) were evaluated. Implant survivorship was analyzed using the Kaplan-Meier method. The Knee Society objective and functional scores and radiographic characteristics were compared before surgery and at final follow-up. A Cox proportional hazards model was used to examine the association of patient age, gender, obesity (body mass index > 30 kg/m2), diagnosis, Knee Society scores and patellar arthrosis with failure. RESULTS: There were 9 failures during the follow-up. The mean Knee Society objective and functional scores were respectively 49 and 48 points preoperatively and 95 and 92 points postoperatively. The survival rate was 92% at 5 years and 84% at 10 years. Mean age was lower in the failure group than in the non-failure group (p < 0.01). However, none of the factors assessed was independently associated with failure in the Cox proportional hazards model. CONCLUSION: Gender, preoperative diagnosis, preoperative objective and functional scores and patellar osteophytes were not independent predictors of failure of unicompartmental knee implants, although high body mass index trended toward significance. The findings suggest that the standard criteria for UKA may be expanded without compromising outcomes, although caution may be warranted in patients with very high body mass index pending additional data to confirm our results. LEVEL OF EVIDENCE: IV.
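A minimal sketch of the survivorship analysis described above, using the Python lifelines library on hypothetical data (the file and column names are invented placeholders, not the study's):

```python
# Sketch of Kaplan-Meier survivorship plus a Cox proportional hazards
# model, as in the abstract above. File and column names ('months',
# 'failed', 'age', 'bmi', 'male') are hypothetical placeholders.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("uka_followup.csv")  # hypothetical follow-up data

# Kaplan-Meier estimate of implant survival
kmf = KaplanMeierFitter()
kmf.fit(durations=df["months"], event_observed=df["failed"])
print(kmf.predict(60.0))  # estimated survival at 5 years (60 months)

# Cox proportional hazards model for candidate predictors of failure
cph = CoxPHFitter()
cph.fit(df[["months", "failed", "age", "bmi", "male"]],
        duration_col="months", event_col="failed")
cph.print_summary()  # hazard ratios and p-values per covariate
```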
Abstract:
Harris R. and Trainor M. (2007) Impact of government intervention on employment change and plant closure in Northern Ireland, 1983-97, Regional Studies 41, 51-63. Financial assistance to manufacturing industry is an important element of the industrial development policy in Northern Ireland. This paper uses the individual plant-level records of the Annual Respondents Database (ARD) for the Northern Ireland manufacturing sector (1983-97) matched to the plant-level details of financial support provided by the Industrial Development Board to examine the effect of selective financial assistance (SFA) on employment change and plant closure. It is found that SFA concentrated on protecting existing, rather than new, enterprises in terms of employment change. Using a hazard model, it is found that the receipt of SFA significantly reduced the probability of plant closure by, on average, between 15 and 24%.
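The reported reduction in closure probability maps directly to a hazard ratio below one; a tiny illustrative computation (the coefficient value is invented, not taken from the paper):

```python
# Illustrative only: converting a hazard-model coefficient on an
# SFA-receipt dummy into the percentage reduction in the closure hazard.
# The coefficient here is invented for the example.
import numpy as np

beta_sfa = -0.22                   # hypothetical estimated coefficient
hazard_ratio = np.exp(beta_sfa)    # ~0.80
reduction = 1.0 - hazard_ratio     # ~20% lower closure hazard
print(f"hazard ratio: {hazard_ratio:.2f}, reduction: {reduction:.0%}")
```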
Abstract:
Introduction
PET-computed tomography (PET-CT) is a useful staging imaging modality in colorectal liver metastases (CRLM). This study aimed to determine whether PET-CT parameters, standardized uptake value (SUV) and reconstructed tumour volume (RTV), are predictors of prognosis and survival.
Methods
A study of all resectable CRLM patients in the regional HPB unit from 2007–2009 was performed. Preoperative PET-CT scans were retrospectively reviewed; the SUV, diameter and RTV of each lesion were recorded. Correlations with other pathological and biochemical parameters were assessed using Pearson's correlation analysis. Survival analysis was performed using a Cox proportional hazards regression model. A P value of less than 0.05 was considered statistically significant.
Results
A total of 79 patients were included. SUV correlated moderately with tumour diameter on both PET-CT (r=0.4927; P<0.0001) and histology (r=0.4513; P=0.0003), and with RTV (r=0.4489; P<0.001), preoperative carcinoembryonic antigen (CEA) (r=0.4977; P=0.0001), and postoperative CEA (r=0.3727; P=0.004). Multivariate analysis found that preoperative CEA was an independent predictor of SUVmax (P=0.03). RTV correlated strongly with preoperative CEA (r=0.9389; P<0.0001). SUV and RTV had a negative effect on survival.
Conclusion
PET-CT, in the setting of CRLM, may have a prognostic role in assessing survival. Although no definite conclusions can be drawn regarding the prognostic role of SUV and RTV, this study reinforces the need for further prospective studies to validate these findings.
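A sketch of the two analyses named in the Methods, assuming hypothetical column names and the Python scipy/lifelines libraries rather than whatever software the study used:

```python
# Sketch of the two analyses in the Methods: Pearson correlation and a
# Cox proportional hazards model. Data and column names are hypothetical.
import pandas as pd
from scipy.stats import pearsonr
from lifelines import CoxPHFitter

df = pd.read_csv("crlm_cohort.csv")  # hypothetical cohort file

# Pearson correlation between SUVmax and preoperative CEA
r, p = pearsonr(df["suv_max"], df["preop_cea"])
print(f"r = {r:.4f}, P = {p:.4f}")

# Cox model: effect of SUV and RTV on survival
cph = CoxPHFitter()
cph.fit(df[["months", "died", "suv_max", "rtv"]],
        duration_col="months", event_col="died")
print(cph.hazard_ratios_)
```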
Abstract:
Thesis presented as a partial requirement for the degree of Doctor in Statistics and Information Management from the Instituto Superior de Estatística e Gestão de Informação of the Universidade Nova de Lisboa.
Abstract:
The present thesis examines the determinants of the bankruptcy protection duration for Canadian firms. Using a sample of Canadian firms that filed for bankruptcy protection between the calendar years 1992 and 2009, we find that firm age, the industry-adjusted operating margin, the default spread, the industrial production growth rate and the interest rate are influential factors in determining the length of the protection period. Older firms tend to stay longer under protection from creditors. As older firms have more complicated structures and issues to settle, their risk of exiting protection soon (the hazard rate) is small. We also find that firms that perform better than their benchmark, as measured by the industry they belong to, tend to leave the bankruptcy protection state quickly. We conclude that the fate of relatively successful companies is determined faster. Moreover, we report that it takes less time to reach a final solution for firms under bankruptcy protection when the default spread is low or when the appetite for risk is high. Conversely, during periods of high default spreads and flight to quality, it takes longer to resolve the bankruptcy issue. This last finding may suggest that troubled firms should place themselves under protection when spreads are low. However, this ignores the endogeneity issue: a high default spread may cause, and incidentally reflect, higher bankruptcy rates in the economy. Indeed, we find that bankruptcy protection is longer during economic downturns. We explain this relation by the natural increase in default rates among firms (and individuals) during economically troubled times. Default spreads are usually larger during these harsh periods as investors become more risk averse while their wealth shrinks. Using a log-logistic hazard model, we also find that firms that file under the Companies' Creditors Arrangement Act (CCAA) spend longer restructuring than firms that file under the Bankruptcy and Insolvency Act (BIA). As the BIA is more statutory and less flexible, solutions can be reached faster by court orders.
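A hedged sketch of a log-logistic duration model of this kind, using lifelines' log-logistic AFT parameterisation on hypothetical spell data (column names are invented):

```python
# Sketch of a log-logistic duration model for time under bankruptcy
# protection, in the spirit of the abstract above. Column names are
# hypothetical placeholders.
import pandas as pd
from lifelines import LogLogisticAFTFitter

df = pd.read_csv("protection_spells.csv")  # hypothetical spell data
# covariates: firm age, industry-adjusted operating margin, default
# spread, and a dummy for filing under CCAA rather than BIA
cols = ["duration_months", "exited", "firm_age",
        "adj_op_margin", "default_spread", "ccaa"]

aft = LogLogisticAFTFitter()
aft.fit(df[cols], duration_col="duration_months", event_col="exited")
aft.print_summary()  # positive coefficients lengthen expected spells
```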
Determinants of university dropout in the Faculty of Economics of the Universidad del Rosario
Abstract:
This paper analyzes the problem of student dropout in the Faculty of Economics of the Universidad del Rosario through the study of the individual, academic and socioeconomic factors that bear on the risk of dropping out. To this end, duration-model analysis is used. Specifically, a discrete-time proportional hazard model is estimated with and without unobserved heterogeneity (Prentice-Gloeckler, 1978 and Meyer, 1980). The results show that male students, students attached to the labor market and students coming from other regions have the highest risk of dropping out. In addition, the student's age increases the risk, although its effect decreases marginally as age increases. Keywords: student dropout, duration models, proportional hazard. JEL classification: C41, C13, I21.
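The Prentice-Gloeckler discrete-time proportional hazard model can be estimated as a binary GLM with a complementary log-log link on person-period data; a minimal sketch with hypothetical variable names, using statsmodels:

```python
# Sketch: discrete-time proportional hazard (Prentice-Gloeckler) as a
# complementary log-log GLM on a student-semester panel. One row per
# student per semester at risk; 'dropout' = 1 in the exit period.
# File and variable names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm

panel = pd.read_csv("student_semesters.csv")  # hypothetical panel

# period dummies trace out the baseline hazard (no constant term)
X = pd.get_dummies(panel["semester"], prefix="t", dtype=float)
X[["male", "works", "other_region", "age"]] = (
    panel[["male", "works", "other_region", "age"]])

model = sm.GLM(panel["dropout"], X,
               family=sm.families.Binomial(link=sm.families.links.CLogLog()))
res = model.fit()
print(res.summary())  # exp(coef) approximates the hazard ratio
```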
Abstract:
Globally there have been a number of concerns about the development of genetically modified crops, many of which relate to the implications of gene flow at various levels. In Europe these concerns have led the European Union (EU) to promote the concept of 'coexistence', allowing the freedom to plant conventional and genetically modified (GM) varieties while minimising the presence of transgenic material within conventional crops. Should a premium for non-GM varieties emerge on the market, the presence of transgenes would generate a 'negative externality' for conventional growers. The establishment of a maximum tolerance level for the adventitious presence of GM material in conventional crops produces a threshold effect in the external costs. The existing literature suggests that, apart from the biological characteristics of the plant under consideration (e.g. self-pollination rates, entomophilous species, anemophilous species, etc.), gene flow at the landscape level is affected by the relative size of the source and sink populations and the spatial arrangement of the fields in the landscape. In this paper, we take genetically modified herbicide-tolerant oilseed rape (GM HT OSR) as a model crop. Starting from an individual pollen dispersal function, we develop a spatially explicit numerical model in order to assess the effect of the size of the source/sink populations and the degree of spatial aggregation on the extent of gene flow into conventional OSR varieties under two alternative settings. We find that when the transgene presence in conventional produce is detected at the field level, the external cost will increase with the size of the source area and with the level of spatial disaggregation. On the other hand, when the transgene presence is averaged among all conventional fields in the landscape (e.g. because of grain mixing before detection), the external cost will depend only on the relative size of the source area. The model could readily be incorporated into an economic evaluation of policies to regulate adoption of GM HT OSR.
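A toy version of such a spatially explicit model, assuming an isotropic exponential dispersal kernel and invented parameters (the paper's actual dispersal function and landscape settings may differ):

```python
# Toy spatially explicit gene-flow model: GM source fields on a grid, an
# isotropic exponential pollen-dispersal kernel, and the resulting share
# of GM pollen arriving in conventional fields. All parameters invented.
import numpy as np
from scipy.signal import convolve2d

n = 50                                          # landscape of n x n fields
rng = np.random.default_rng(0)
gm = (rng.random((n, n)) < 0.10).astype(float)  # ~10% GM source fields

d = 2.0                                 # mean dispersal distance (grid units)
r = 6                                   # truncation radius for the kernel
y, x = np.mgrid[-r:r + 1, -r:r + 1]
kernel = np.exp(-np.hypot(x, y) / d)
kernel /= kernel.sum()                  # normalise to a probability kernel

presence = convolve2d(gm, kernel, mode="same", boundary="wrap")
conventional = gm == 0
print("mean GM pollen share in conventional fields:",
      presence[conventional].mean())
print("conventional fields above a 0.9% threshold:",
      (presence[conventional] > 0.009).mean())
```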
Abstract:
A wind catcher/tower natural ventilation system was installed in a seminar room in the building of the School of Construction Management and Engineering, the University of Reading, in the UK. Performance was analysed by means of ventilation tracer-gas measurements, indoor climate measurements (temperature, humidity, CO2) and occupant surveys. In addition, the potential of simple design tools was evaluated by comparing observed ventilation results with those predicted by an explicit ventilation model and the AIDA implicit ventilation model. To support this analysis, external climate parameters (wind speed and direction, solar radiation, external temperature and humidity) were also monitored. The results showed the chosen ventilation design provided a substantially greater ventilation rate than an equivalent area of openable window. Air quality parameters also stayed within accepted norms, while occupants expressed general satisfaction with the system and with comfort conditions. Night cooling was maximised by using the system in combination with openable windows. Comparisons of calculations with ventilation rate measurements showed that while AIDA gave results reasonably well correlated with the monitored performance, the widely used industry explicit model was found to overestimate the monitored ventilation rate.
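For orientation, a generic explicit envelope-flow calculation of the kind such design tools implement, combining wind and stack pressures through an orifice-flow equation; this is a textbook-style sketch with illustrative values, not the specific industry model evaluated in the paper:

```python
# Generic explicit ventilation estimate: orifice flow driven by combined
# wind and stack pressure. Textbook-style sketch; all values illustrative.
import math

rho = 1.2          # air density, kg/m^3
Cd = 0.61          # discharge coefficient
A = 0.25           # effective opening area, m^2
Cp = 0.3           # wind pressure coefficient (assumed)
U = 3.0            # wind speed at roof level, m/s
g = 9.81
H = 3.0            # stack height between openings, m
T_in, T_out = 296.0, 288.0   # indoor/outdoor temperature, K

dp_wind = 0.5 * rho * Cp * U**2
dp_stack = rho * g * H * (T_in - T_out) / T_in
dp = dp_wind + dp_stack                  # combined driving pressure, Pa
Q = Cd * A * math.sqrt(2.0 * dp / rho)   # volume flow, m^3/s
print(f"estimated ventilation rate: {Q:.2f} m^3/s")
```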
Abstract:
Valuation is the process of estimating price. The methods used to determine value attempt to model the thought processes of the market and thus estimate price by reference to observed historic data. This can be done using either an explicit model, which models the worth calculation of the most likely bidder, or an implicit model, which uses historic data, suitably adjusted, as a shortcut to determine value by reference to previous similar sales. The former is generally referred to as the Discounted Cash Flow (DCF) model and the latter as the capitalisation (or All Risk Yield) model. However, regardless of the technique used, the valuation will be affected by uncertainties: uncertainty in the comparable data available, uncertainty in current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the estimate of price. In a previous paper, we considered the way in which uncertainty is allowed for in the capitalisation model in the UK. In this paper, we extend the analysis to look at the way in which uncertainty can be incorporated into the explicit DCF model. This is done by recognising that the input variables are uncertain and that each will have a probability distribution pertaining to it. Thus, by utilising a probability-based valuation model (using Crystal Ball), it is possible to incorporate uncertainty into the analysis and address the shortcomings of the current model. Although the capitalisation model is discussed, the paper concentrates upon the application of Crystal Ball to the Discounted Cash Flow approach.
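In the same spirit as the Crystal Ball approach described, a probability-based DCF can be sketched with plain Monte Carlo sampling; the distributions and figures below are invented for illustration:

```python
# Monte Carlo DCF valuation sketch: uncertain inputs become probability
# distributions, and the output is a distribution of present values.
# All figures and distributions are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
years = 5

rent0 = rng.normal(100_000, 5_000, n)             # current rent, per year
growth = rng.normal(0.02, 0.01, n)                # annual rental growth
exit_yield = rng.triangular(0.05, 0.06, 0.08, n)  # resale capitalisation rate
discount = 0.08                                   # target rate of return

t = np.arange(1, years + 1)
rents = rent0[:, None] * (1 + growth[:, None]) ** t  # shape (n, years)
pv_rents = (rents / (1 + discount) ** t).sum(axis=1)
terminal = rents[:, -1] * (1 + growth) / exit_yield  # year-6 rent capitalised
pv = pv_rents + terminal / (1 + discount) ** years

print(f"mean value: {pv.mean():,.0f}")
print(f"5th-95th percentile: {np.percentile(pv, 5):,.0f} "
      f"to {np.percentile(pv, 95):,.0f}")
```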
Abstract:
High-elevation forests represent a large fraction of potential carbon uptake in North America, but this uptake is not well constrained by observations. Additionally, forests in the Rocky Mountains have recently been severely damaged by drought, fire, and insect outbreaks, which have been quantified at local scales but not assessed in terms of carbon uptake at regional scales. The Airborne Carbon in the Mountains Experiment was carried out in 2007 partly to assess carbon uptake in western U.S. mountain ecosystems. The magnitude and seasonal change of carbon uptake were quantified by (1) paired upwind-downwind airborne CO2 observations applied in a boundary layer budget, (2) a spatially explicit ecosystem model constrained using remote sensing and flux tower observations, and (3) a downscaled global tracer transport inversion. Top-down approaches had mean carbon uptake equivalent to flux tower observations at a subalpine forest, while the ecosystem model showed less. The techniques disagreed on temporal evolution. Regional carbon uptake was greatest in the early summer immediately following snowmelt and tended to lessen as the region experienced dry summer conditions. This reduction was more pronounced in the airborne budget and inversion than in flux tower or upscaling, possibly related to lower snow water availability in forests sampled by the aircraft, which were lower in elevation than the tower site. Changes in vegetative greenness associated with insect outbreaks were detected using satellite reflectance observations, but impacts on regional carbon cycling were unclear, highlighting the need to better quantify this emerging disturbance effect on montane forest carbon cycling.
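Technique (1) infers the regional surface flux from the CO2 difference between upwind and downwind transects; a back-of-envelope version of such a budget, with all values invented for illustration:

```python
# Back-of-envelope boundary-layer CO2 budget, in the spirit of the paired
# upwind-downwind airborne approach. All numbers are illustrative.
h = 1500.0        # boundary layer depth, m
u = 5.0           # mean wind speed, m/s
L = 100_000.0     # upwind-downwind fetch, m
dC = -2.0e-4      # downwind minus upwind CO2, mol/m^3 (~5 ppm drawdown)

# steady-state advective budget: surface flux per unit area
F = h * u * dC / L        # mol CO2 m^-2 s^-1; negative = uptake
print(f"inferred surface flux: {F:.2e} mol m^-2 s^-1")
```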
Abstract:
Hazard models, also known as time-to-failure or duration models, are used to determine which independent variables have the greatest explanatory power in predicting firm bankruptcy. They are an alternative approach to binary logit and probit models and to discriminant analysis. Duration models should be more efficient than discrete-choice models because they take survival time into account when estimating the instantaneous probability of failure for a set of observations on an independent variable. Discrete-choice models typically ignore the time-to-failure information and provide only an estimate of failing within a given time interval. The question discussed in this work is how to use hazard models to project default rates and to build migration matrices conditioned on the state of the economy. Conceptually, the model is quite analogous to the historical default and mortality rates used in the credit literature. The Cox semiparametric proportional hazards model is tested on Brazilian non-financial firms, and it is observed that the probability of default decreases markedly after the third year from loan issuance. It is also observed that the mean and standard deviation of default probabilities are affected by economic cycles. It is discussed how the Cox proportional hazards model can be incorporated into the four best-known credit risk management models in use today: CreditRisk+, KMV, CreditPortfolio View and CreditMetrics, and what improvements result from this incorporation.
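A sketch of how a fitted Cox model yields the year-by-year default probabilities discussed above, using the Python lifelines library on hypothetical loan-level data (file and column names are invented):

```python
# Sketch: turning a fitted Cox proportional hazards model into annual
# marginal default probabilities, in the spirit of the abstract above.
# File and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

loans = pd.read_csv("brazil_loans.csv")   # hypothetical loan-level data
cph = CoxPHFitter()
cph.fit(loans[["years", "defaulted", "leverage", "gdp_growth"]],
        duration_col="years", event_col="defaulted")

# survival curve for an average borrower, evaluated at year ends
surv = cph.predict_survival_function(
    loans[["leverage", "gdp_growth"]].mean().to_frame().T,
    times=[1, 2, 3, 4, 5]).iloc[:, 0]

# unconditional probability of defaulting in each year: S(k-1) - S(k)
annual_default = -surv.diff().fillna(1 - surv.iloc[0])
print(annual_default)   # expected to fall after year 3 per the abstract
```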
Abstract:
The objective of this paper is to evaluate the effect of the 1985 "Employment Services for Ex-Offenders" (ESEO) program on recidivism. Initially, the sample was split randomly into a control group and a treatment group. However, the actual treatment (mainly job-related counseling) only takes place conditional on finding a job, and on not having been arrested, for those selected into the treatment group. We use a multiple proportional hazard model with unobserved heterogeneity for job search and recidivism times which incorporates the conditional treatment effect. We find that the program helps to reduce criminal activity, contrary to the result of the previous analysis of this data set. This finding is important for crime prevention policy.
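A toy simulation of the key ingredient, a proportional hazard with gamma-distributed unobserved heterogeneity (frailty); parameters are invented and the sketch omits the paper's conditional treatment structure:

```python
# Toy simulation of a proportional hazard with gamma-distributed
# unobserved heterogeneity (frailty), the building block of the mixed
# proportional hazard model mentioned above. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
beta_treat = -0.3                    # hypothetical treatment effect
treated = rng.random(n) < 0.5
v = rng.gamma(shape=2.0, scale=0.5, size=n)   # frailty term, mean 1

lam0 = 0.1                           # constant baseline hazard
lam = v * lam0 * np.exp(beta_treat * treated)
t = rng.exponential(1.0 / lam)       # simulated recidivism times

# naive comparison of mean durations by treatment status
print("mean time, treated:", t[treated].mean())
print("mean time, control:", t[~treated].mean())
```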
Abstract:
An approach using straight lines as features to solve the photogrammetric space resection problem is presented. An explicit mathematical model relating straight lines, in both object and image space, is used. Based on this model, Kalman filtering is applied to solve the space resection problem. The recursive property of the filter is used in an iterative process in which the sequentially estimated camera location parameters are fed back into the feature extraction process in the image. This feedback process leads to a gradual reduction of the image space searched for features, and consequently eliminates the bottleneck due to the high computational cost of the image segmentation phase. It also enables feature extraction and the determination of feature correspondence in image and object space in an automatic way, i.e., without operator intervention. Results obtained from simulated and real data show that highly accurate space resection parameters are obtained, as well as a progressive reduction in processing time. The obtained accuracy, the automatic correspondence process, and the short related processing time show that the proposed approach can be used in many real-time machine vision systems, making possible the implementation of applications not feasible until now.
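The recursive estimator at the core of the approach is the standard Kalman filter predict/update cycle; a generic sketch follows, with placeholder state and measurement models rather than the paper's straight-line observation equations:

```python
# Generic Kalman filter predict/update step, the recursive estimator the
# approach applies to camera exterior-orientation parameters. The state
# and measurement models here are generic placeholders, not the paper's
# straight-line observation equations.
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict + update cycle of a linear Kalman filter."""
    x_pred = F @ x                       # state prediction
    P_pred = F @ P @ F.T + Q             # covariance prediction
    y = z - H @ x_pred                   # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# toy constant-state example: six exterior-orientation parameters
x = np.zeros(6); P = np.eye(6)
F = np.eye(6); Q = 1e-4 * np.eye(6)      # static state, small process noise
H = np.eye(6); R = 0.01 * np.eye(6)      # direct (placeholder) observations
z = np.full(6, 0.1)                      # one simulated measurement
x, P = kalman_step(x, P, z, F, Q, H, R)
```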
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons that the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and systems of interacting agents as fundamental abstractions for designing, developing and managing at runtime typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a basic point of the scientific activity. Currently most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage and still in the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models becomes fundamental for comparing and evaluating the methodologies. In fact, a meta-model specifies the concepts, rules and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e. the process to be followed, the work products to be generated and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; however, it is clear at least that all non-agent elements of a multi-agent system are typically considered to be part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent system community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions - entities of the environment encapsulating some functions - and topology abstractions - entities of the environment that represent its (either logical or physical) spatial structure. In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for supporting the management of the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which can be used by designers to provide different levels of abstraction over multi-agent systems.
The research in these fields has led to the formulation of a new version of the SODA methodology in which environment abstractions and layering principles are exploited for engineering multi-agent systems.
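A loose rendering of the abstractions named above - agents, environment abstractions, topology abstractions, and layers - as plain Python classes; the names are illustrative, not SODA's actual meta-model:

```python
# Toy rendering of the abstractions named above: agents, environment
# abstractions (entities encapsulating functions), topology abstractions
# (spatial structure), and a layered multi-agent system description.
# Names are illustrative, not SODA's actual meta-model.
from dataclasses import dataclass, field

@dataclass
class EnvironmentAbstraction:
    name: str          # e.g. a shared artifact or service
    function: str      # the function it encapsulates

@dataclass
class TopologyAbstraction:
    name: str                              # e.g. a room, a network zone
    contains: list = field(default_factory=list)

@dataclass
class Agent:
    name: str
    goals: list

@dataclass
class Layer:
    """One abstraction level over the multi-agent system."""
    agents: list
    environment: list
    topology: list

mas = [Layer(agents=[Agent("broker", ["match orders"])],
             environment=[EnvironmentAbstraction("order-book", "mediation")],
             topology=[TopologyAbstraction("market-zone")])]
```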