950 results for AFT Models for Crash Duration Survival Analysis


Relevance: 100.00%

Abstract:

A descriptive, retrospective study was carried out using the database of microbiological isolates documented in the ICUs of the Fundación Santa Fe de Bogotá for the year 2014. The prevalence of resistant bacteria in the FSFB isolates is not low, so accurate empirical therapy consistent with the local flora is required. Analytical studies are needed to evaluate factors associated with the development of multidrug-resistant organisms and with mortality from sepsis.

Relevance: 100.00%

Abstract:

Introduction: Beginning in the 1950s, the management of valvular heart disease changed significantly when mechanical and biological valve replacements were incorporated into the surgical treatment options (1). Biological valves were developed as an alternative intended to avoid the problems related to anticoagulation and to use a tissue that behaves haemodynamically like the native valve. This study aims to establish overall survival and freedom from valve reoperation at 1, 3, 5 and 10 years in patients undergoing biological aortic and mitral valve replacement at the Fundación Cardioinfantil - IC. Materials and methods: Retrospective cohort survival study of patients undergoing biological aortic and/or mitral valve replacement at the Fundación Cardioinfantil between 2005 and 2013. Results: A total of 919 patients were included in the general analysis, and 876 (95.3%) had effective follow-up for the survival analysis. Mean age was 64 years. Survival at 1, 3, 5 and 10 years was 95%, 90%, 85% and 69%, respectively. Effective follow-up for the reoperation outcome was 55%, with freedom from reoperation of 99%, 96%, 93% and 81% at 1, 3, 5 and 10 years. There were no significant differences by valve position or by type of aortic valve used. Conclusions: Survival of patients undergoing biological valve replacement in this study is comparable to large international cohorts. Survival of patients receiving biological prostheses in the mitral and aortic positions was similar at 1, 3, 5 and 10 years.
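The survival and freedom-from-reoperation figures quoted above are the kind of estimates a Kaplan-Meier product-limit analysis produces for a censored cohort. The sketch below is a minimal illustration, not the study's code: the follow-up times, censoring indicators and function name are hypothetical.

```python
import numpy as np

def kaplan_meier(durations, events):
    """Kaplan-Meier survival estimate.

    durations : follow-up time (years) for each patient
    events    : 1 if death observed, 0 if censored
    Returns the distinct event times and the estimated survival S(t).
    """
    durations = np.asarray(durations, dtype=float)
    events = np.asarray(events, dtype=int)
    times = np.unique(durations[events == 1])          # distinct event times
    surv, s = [], 1.0
    for t in times:
        at_risk = np.sum(durations >= t)               # patients still followed at t
        died = np.sum((durations == t) & (events == 1))
        s *= 1.0 - died / at_risk                      # product-limit step
        surv.append(s)
    return times, np.array(surv)

# Hypothetical follow-up data (years); 0 = censored, 1 = death
dur = [0.5, 1.2, 2.0, 3.1, 4.0, 5.5, 6.0, 7.5, 8.0, 10.0]
evt = [1,   0,   1,   0,   1,   0,   1,   0,   1,   0]
t, s = kaplan_meier(dur, evt)
for ti, si in zip(t, s):
    print(f"S({ti:.1f} y) = {si:.2f}")
```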

Relevance: 100.00%

Abstract:

Linear response functions are implemented for a vibrational configuration interaction state allowing accurate analytical calculations of pure vibrational contributions to dynamical polarizabilities. Sample calculations are presented for the pure vibrational contributions to the polarizabilities of water and formaldehyde. We discuss the convergence of the results with respect to various details of the vibrational wave function description as well as the potential and property surfaces. We also analyze the frequency dependence of the linear response function and the effect of accounting phenomenologically for the finite lifetime of the excited vibrational states. Finally, we compare the analytical response approach to a sum-over-states approach.
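For reference, a generic damped sum-over-states expression for the pure vibrational polarizability takes the form below. This is the standard textbook form, not necessarily the working equations of the VCI response implementation described; the symbols are the usual ones.

\[
\alpha^{\mathrm{pv}}(-\omega;\omega) \;=\; \sum_{n\neq 0}
\left[
\frac{\langle 0|\hat{\mu}|n\rangle\,\langle n|\hat{\mu}|0\rangle}{\omega_{n0}-\omega-i\gamma_{n}}
\;+\;
\frac{\langle 0|\hat{\mu}|n\rangle\,\langle n|\hat{\mu}|0\rangle}{\omega_{n0}+\omega+i\gamma_{n}}
\right],
\]

where the sum runs over the vibrational eigenstates \(|n\rangle\), \(\omega_{n0}\) are their excitation frequencies, \(\hat{\mu}\) is the dipole operator, and \(\gamma_{n}\) is the phenomenological inverse lifetime (damping) of state \(n\) that accounts for its finite lifetime.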

Relevance: 100.00%

Abstract:

Cloud radar and lidar can be used to evaluate the skill of numerical weather prediction models in forecasting the timing and placement of clouds, but care must be taken in choosing the appropriate metric of skill to use due to the non-Gaussian nature of cloud-fraction distributions. We compare the properties of a number of different verification measures and conclude that, of existing measures, the Log of Odds Ratio is the most suitable for cloud fraction. We also propose a new measure, the Symmetric Extreme Dependency Score, which has very attractive properties, being equitable (for large samples), difficult to hedge and independent of the frequency of occurrence of the quantity being verified. We then use data from five European ground-based sites and seven forecast models, processed using the ‘Cloudnet’ analysis system, to investigate the dependence of forecast skill on cloud fraction threshold (for binary skill scores), height, horizontal scale and (for the Met Office and German Weather Service models) forecast lead time. The models are found to be least skilful at predicting the timing and placement of boundary-layer clouds and most skilful at predicting mid-level clouds, although in the latter case they tend to underestimate mean cloud fraction when cloud is present. It is found that skill decreases approximately inverse-exponentially with forecast lead time, enabling a forecast ‘half-life’ to be estimated. When considering the skill of instantaneous model snapshots, we find typical half-life values ranging between 2.5 and 4.5 days.
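As an illustration of two of the quantities discussed, the sketch below (hypothetical counts and skill values, not the Cloudnet data) computes the Log of Odds Ratio from a 2x2 contingency table and a forecast 'half-life' under an assumed inverse-exponential decay of skill with lead time.

```python
import numpy as np

# Hypothetical 2x2 contingency table for a binary cloud-fraction threshold:
# a = hits, b = false alarms, c = misses, d = correct rejections
a, b, c, d = 320.0, 180.0, 150.0, 4350.0

log_odds_ratio = np.log((a * d) / (b * c))   # ln(OR); 0 indicates no skill
print(f"Log of Odds Ratio: {log_odds_ratio:.2f}")

# Hypothetical skill versus forecast lead time (days).
# Assuming S(t) ~ S0 * exp(-t / tau), the half-life is tau * ln 2.
lead_days = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
skill     = np.array([0.62, 0.52, 0.43, 0.36, 0.30, 0.25])

slope, intercept = np.polyfit(lead_days, np.log(skill), 1)  # ln S = ln S0 - t/tau
tau = -1.0 / slope
half_life = tau * np.log(2.0)
print(f"Forecast half-life: {half_life:.1f} days")
```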

Relevance: 100.00%

Abstract:

An underestimate of atmospheric blocking occurrence is a well-known limitation of many climate models. This article presents an analysis of Northern Hemisphere winter blocking in an atmospheric model with increased horizontal resolution. European blocking frequency increases with model resolution, and this results from an improvement in the atmospheric patterns of variability as well as a simple improvement in the mean state. There is some evidence that the transient eddy momentum forcing of European blocks is increased at high resolution, which could account for this. However, it is also shown that the increase in resolution of the orography is needed to realise the improvement in blocking, consistent with the increase in height of the Rocky Mountains acting to increase the tilt of the Atlantic jet stream and giving higher mean geopotential heights over northern Europe. Blocking frequencies in the Pacific sector are also increased with atmospheric resolution, but in this case the improvement in orography actually leads to a decrease in blocking.

Relevance: 100.00%

Abstract:

This thesis is an empirically based study of the European Union's Emissions Trading Scheme (EU ETS) and its implications for corporate environmental and financial performance. The novelty of this study includes the extended scope of the data coverage, as most previous studies have examined only the power sector. The use of verified emissions data of ETS-regulated firms as the environmental compliance measure, and as a potential differentiating criterion in the stock-market valuation of EU ETS-exposed firms, is also an original aspect of this study. The study begins in Chapter 2 by introducing the background of the emissions trading system (ETS), focusing on (i) the adoption of ETS as an environmental management instrument and (ii) its adoption by the European Union as one of its central climate policies. Chapter 3 surveys four databases that provide carbon emissions data in order to determine the most suitable source for the later empirical chapters. The first empirical chapter, Chapter 4, investigates the determinants of the emissions compliance performance of EU ETS-exposed firms by constructing the best possible performance ratio from verified emissions data and self-configuring models for a panel regression analysis. Chapter 5 examines the impact on EU ETS-exposed firms in terms of their equity valuation, using customised portfolios and multi-factor market models. The research design includes the emissions allowance (EUA) price as an additional factor to control for exposure, as it has the most direct association with the EU ETS. The final empirical chapter, Chapter 6, takes the investigation one step further by testing the degree of ETS exposure facing different sectors, using sector-based portfolios and an extended multi-factor market model. The findings from the emissions performance ratio analysis show that a firm's business model significantly influences emissions compliance, as capital intensity is positively associated with the emissions-to-cap ratio. Furthermore, different sectors show different degrees of sensitivity to the determining factors: the production factor influences the performance ratio of the Utilities sector but not the Energy or Materials sectors, and capital intensity has a more profound influence on the Utilities sector than on the Materials sector. With regard to financial performance, ETS-exposed firms as aggregate portfolios experienced substantial underperformance during the 2001–2004 period, but not in the operating period of 2005–2011. The results of the sector-based portfolios again show the differentiating effect of the EU ETS across sectors: one sector is priced indifferently against its benchmark, three sectors show persistent underperformance, and three sectors show varying outcomes.
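A minimal sketch of a market model augmented with an EUA return factor, in the spirit of the multi-factor models described above; the returns below are simulated and the thesis's actual factor set and portfolio construction are richer than this.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 84  # hypothetical monthly observations, roughly 2005-2011

# Hypothetical excess returns: an ETS-exposed portfolio, the market factor,
# and the EUA (emission allowance) return used as an additional regressor.
market = rng.normal(0.005, 0.04, n)
eua    = rng.normal(0.000, 0.08, n)
portfolio = 0.001 + 0.9 * market + 0.15 * eua + rng.normal(0, 0.02, n)

X = sm.add_constant(pd.DataFrame({"MKT": market, "EUA": eua}))
model = sm.OLS(portfolio, X).fit()
print(model.params)   # alpha (const), market beta, EUA beta
print(model.pvalues)
```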

Relevance: 100.00%

Abstract:

This study determined the sensory shelf life of a commercial brand of chocolate and carrot cupcakes, aiming to extend the current 120-day shelf life to 180 days. Appearance, texture, flavor and overall quality of cakes stored for six different storage times were evaluated by 102 consumers. The data were analyzed by analysis of variance and linear regression. For both flavors, texture showed the greatest loss of acceptance during storage, with a mean acceptance close to indifference on the hedonic scale at 120 days. Nevertheless, appearance, flavor and overall quality remained acceptable up to 150 days. The end of shelf life was estimated at about 161 days for chocolate cakes and 150 days for carrot cakes. This study showed that the current 120-day shelf life can be extended to 150 days for carrot cake and to 160 days for chocolate cake; however, the 180 days desired by the company were not achieved.

Practical applications: This research shows the adequacy of sensory acceptance tests for determining the shelf life of two food products (chocolate and carrot cupcakes). This is useful because precise determination of the shelf life of a food product is of vital importance for its commercial success. The maximum storage time should always be evaluated when developing or reformulating products, or when changing packaging or storage conditions. Once the physico-chemical and microbiological stability of a product is guaranteed, sensory changes that could affect consumer acceptance determine the end of its shelf life. Thus, the use of sensitive and reliable methods to estimate the sensory shelf life of a product is very important. The findings show the importance of determining the shelf life of each product separately and of not applying the shelf life estimated for one product to other, similar products.
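A minimal sketch of the regression-based shelf-life estimate described above, using hypothetical acceptance scores and an assumed acceptability cutoff; the study's own data and cutoff may differ.

```python
import numpy as np

# Hypothetical mean acceptance (9-point hedonic scale) at each storage time (days).
days   = np.array([0, 30, 60, 90, 120, 150])
scores = np.array([7.8, 7.5, 7.1, 6.8, 6.4, 6.1])

slope, intercept = np.polyfit(days, scores, 1)   # linear loss of acceptance over storage

cutoff = 6.0                                     # assumed acceptability limit on the hedonic scale
end_of_shelf_life = (cutoff - intercept) / slope
print(f"Estimated sensory shelf life: {end_of_shelf_life:.0f} days")
```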

Relevance: 100.00%

Abstract:

In this paper, we propose a new two-parameter lifetime distribution with increasing failure rate, the complementary exponential geometric distribution, which is complementary to the exponential geometric model proposed by Adamidis and Loukas (1998). The new distribution arises from a latent complementary risks scenario, in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its reliability and failure rate functions, moments (including the mean and variance), coefficient of variation, and modal value. Parameter estimation is based on the usual maximum likelihood approach. We report the results of a misspecification simulation study performed to assess the extent of misspecification errors when testing the exponential geometric distribution against our complementary one for different sample sizes and censoring percentages. The methodology is illustrated on four real datasets, and we compare the two modeling approaches.
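Under the usual parameterisation of this latent complementary-risks construction (a sketch; the notation may differ from the paper's), taking Y as the maximum of a geometric number N of iid exponential lifetimes gives

\[
Y=\max\{X_1,\dots,X_N\},\qquad X_i \overset{\text{iid}}{\sim}\operatorname{Exp}(\lambda),\qquad
P(N=n)=\theta(1-\theta)^{n-1},\quad n=1,2,\dots,
\]
\[
F_Y(y)=\sum_{n\ge 1}\theta(1-\theta)^{n-1}\bigl(1-e^{-\lambda y}\bigr)^{n}
=\frac{\theta\bigl(1-e^{-\lambda y}\bigr)}{1-(1-\theta)\bigl(1-e^{-\lambda y}\bigr)},
\]
\[
f_Y(y)=\frac{\theta\lambda e^{-\lambda y}}{\bigl[1-(1-\theta)\bigl(1-e^{-\lambda y}\bigr)\bigr]^{2}},
\qquad
h_Y(y)=\frac{f_Y(y)}{1-F_Y(y)}=\frac{\theta\lambda}{1-(1-\theta)\bigl(1-e^{-\lambda y}\bigr)},
\]

for \(y>0\), \(\lambda>0\) and \(0<\theta<1\). The hazard \(h_Y(y)\) increases in \(y\), consistent with the increasing failure rate noted above.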

Relevance: 100.00%

Abstract:

When missing data occur in studies designed to compare the accuracy of diagnostic tests, a common, though naive, practice is to base the comparison of sensitivity and specificity, as well as of positive and negative predictive values, on some subset of the data that fits the methods implemented in standard statistical packages. Such methods are usually valid only under the strong missing completely at random (MCAR) assumption and may generate biased and less precise estimates. We review models that use the dependence structure of the completely observed cases to incorporate the information in the partially categorized observations into the analysis, and show how they may be fitted via a two-stage hybrid process involving maximum likelihood in the first stage and weighted least squares in the second. We indicate how computational subroutines written in R may be used to fit the proposed models and illustrate the different analysis strategies with observational data collected to compare the accuracy of three distinct non-invasive diagnostic methods for endometriosis. The results indicate that even when the MCAR assumption is plausible, the naive partial analyses should be avoided.
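For concreteness, the four accuracy measures being compared are defined on a 2x2 table of test result versus gold standard. The sketch below (hypothetical counts) is exactly the naive complete-case calculation the abstract warns may be biased when the data are not MCAR.

```python
# Hypothetical complete-case 2x2 table for one diagnostic test vs. the gold standard.
tp, fp, fn, tn = 45, 10, 15, 80   # true/false positives, false/true negatives

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)              # positive predictive value
npv = tn / (tn + fn)              # negative predictive value
print(f"Se={sensitivity:.2f} Sp={specificity:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
```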

Relevance: 100.00%

Abstract:

In this article, the prevailing official view of supervision as a regulatory instrument is examined as it applies to the social services sector in Sweden. The study is based on a comparison of the views expressed on the design of supervision as a regulatory instrument by two government commissions, the Supervision Commission and the Commission on Supervision within the Social Services (UTIS), and on the positions taken by the Government regarding the definitions of the concept of supervision proposed by these commissions. The view of supervision as a regulatory instrument expressed in these policy documents is analysed with the help of a theoretical framework describing the components, their functions and the governance characteristics of control systems. In the framework, distinct but interrelated characteristics of the components are identified and summarized into two models of control systems. The analysis shows that supervision in the Swedish social services sector can be described in terms of both a disciplinary and a non-disciplinary system. Through its systems-theoretical basis and the identification of interrelated characteristics, the study contributes to a broader understanding of the construction and functions of supervision as a regulatory instrument and of how supervision within the Swedish social sector is meant to be designed.

Relevance: 100.00%

Abstract:

Background: The home-management of malaria (HMM) strategy improves early access to anti-malarial medicines for high-risk groups in remote areas of sub-Saharan Africa. However, limited data are available on the effectiveness of using artemisinin-based combination therapy (ACT) within the HMM strategy. The aim of this study was to assess the effectiveness of artemether-lumefantrine (AL), presently the most favoured ACT in Africa, in under-five children with uncomplicated Plasmodium falciparum malaria in Tanzania, when provided by community health workers (CHWs) and administered unsupervised by parents or guardians at home. Methods: An open-label, single-arm prospective study was conducted in two rural villages with high malaria transmission in Kibaha District, Tanzania. Children presenting to CHWs with uncomplicated fever and a positive rapid malaria diagnostic test (RDT) were provisionally enrolled and provided AL for unsupervised treatment at home. Patients with microscopy-confirmed P. falciparum parasitaemia were definitively enrolled and reviewed weekly by the CHWs over 42 days. The primary outcome measure was the PCR-corrected parasitological cure rate by day 42, estimated by Kaplan-Meier survival analysis. This trial is registered with ClinicalTrials.gov, number NCT00454961. Results: A total of 244 febrile children were enrolled between March and August 2007. Two patients were lost to follow-up on day 14, and one patient withdrew consent on day 21. In total, 141/241 (58.5%) patients had recurrent infection during follow-up, of whom 14 had recrudescence. The PCR-corrected cure rate by day 42 was 93.0% (95% CI 88.3%-95.9%). The median lumefantrine concentration was significantly lower in patients with recrudescence (97 ng/mL [IQR 0-234]; n = 10) than in those with reinfection (205 ng/mL [114-390]; n = 92) or no parasite reappearance (217 ng/mL [121-374]; n = 70; p <= 0.046). Conclusions: Provision of AL by CHWs for unsupervised malaria treatment at home was highly effective, which provides an evidence base for scaling up implementation of HMM with AL in Tanzania.
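The abstract does not state which test underlies the drug-concentration comparison; as an illustrative sketch, a nonparametric comparison of simulated day-7 lumefantrine concentrations between recrudescence and reinfection groups could look like the following (the simulated values only mimic the reported medians and group sizes).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical lumefantrine concentrations (ng/mL); the study reports lower
# medians with recrudescence (97 ng/mL, n=10) than with reinfection (205 ng/mL, n=92).
recrudescence = rng.lognormal(mean=np.log(97), sigma=0.8, size=10)
reinfection   = rng.lognormal(mean=np.log(205), sigma=0.7, size=92)

u, p = stats.mannwhitneyu(recrudescence, reinfection, alternative="two-sided")
print(f"Mann-Whitney U={u:.0f}, p={p:.3f}")
print(f"medians: {np.median(recrudescence):.0f} vs {np.median(reinfection):.0f} ng/mL")
```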

Relevance: 100.00%

Abstract:

This thesis develops and evaluates statistical methods for different types of genetic analyses, including quantitative trait loci (QTL) analysis, genome-wide association studies (GWAS), and genomic evaluation. The main contribution of the thesis is to provide novel insights into modeling genetic variance, especially via random effects models. In variance component QTL analysis, a full likelihood model accounting for uncertainty in the identity-by-descent (IBD) matrix was developed; it was found to correct the bias in genetic variance component estimation and to gain precision in QTL mapping. Double hierarchical generalized linear models, and a non-iterative simplified version, were implemented and applied to fit data for an entire genome. These whole-genome models were shown to perform well in both QTL mapping and genomic prediction. A re-analysis of a publicly available GWAS data set identified significant loci in Arabidopsis that control phenotypic variance rather than the mean, which validated the idea of variance-controlling genes. The work in the thesis is accompanied by R packages available online, including a general statistical tool for fitting random effects models (hglm), an efficient generalized ridge regression for high-dimensional data (bigRR), a double-layer mixed model for genomic data analysis (iQTL), a stochastic IBD matrix calculator (MCIBD), a computational interface for QTL mapping (qtl.outbred), and a GWAS analysis tool for mapping variance-controlling loci (vGWAS).
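The link between random effects models and ridge regression that packages such as hglm and bigRR exploit is the standard BLUP identity; the equations below are a sketch of that textbook result, not code or formulas taken from the packages themselves.

\[
y = X\beta + Zu + e,\qquad u \sim N\!\left(0,\sigma_u^2 I\right),\qquad e \sim N\!\left(0,\sigma_e^2 I\right),
\]
\[
\hat{u} = \left(Z^{\top}Z + \lambda I\right)^{-1} Z^{\top}\!\left(y - X\hat{\beta}\right),
\qquad \lambda = \frac{\sigma_e^2}{\sigma_u^2},
\]

so the BLUP of the random (for example, SNP) effects coincides with a ridge-regression estimate whose penalty is the residual-to-genetic variance ratio, with \(\hat{\beta}\) obtained jointly from the mixed model equations.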

Relevance: 100.00%

Abstract:

Next year will mark 40 years since the first attempt at liver transplantation (LTx) in humans. For almost 20 years, transplantation has been a real therapeutic option for patients with end-stage liver disease. LTx is currently the treatment of choice for several acute or chronic liver diseases. Of the transplants performed in Europe or in the USA, around 12% of patients are children and adolescents. In Brazil, 20.9% of liver transplant recipients in 2001 were up to 18 years of age and, of these, 60.7% were 5 years old or younger. The aim of LTx is to preserve the life of patients with irreversible liver disease, and the main measure of success is post-transplant survival. Despite the excellent progress of recent years, the first week after LTx remains the most critical period. Most deaths or graft losses occur in the first weeks, particularly in the first 7 days after transplantation. Several risk factors for the outcome of LTx can be identified in the literature, but there are few studies specific to paediatric transplantation. Small children have particular characteristics that distinguish their transplants from those of adults and older children. A case-control study was conducted to identify risk factors for death in the first 7 days after elective liver transplantation in 45 children and adolescents at the Hospital de Clínicas de Porto Alegre between March 1995 and August 2001. Characteristics related to the recipient, the donor and the surgical procedure, as well as prognostic models, were compared between the 6 cases (13.3%) and the 39 controls. Among the recipient-related variables, gender, weight-for-age and height-for-age Z scores, biliary atresia, previous abdominal surgery, Kasai procedure, history of ascites, spontaneous bacterial peritonitis, gastrointestinal bleeding and hepatopulmonary syndrome, serum albumin, INR, activated partial thromboplastin time and factor V were not associated with death in the first week. Early mortality was higher in children with lower age (p=0.0035), weight (p=0.0062) and height (p<0.0001), elevated total bilirubin (TB) (p=0.0083) and unconjugated bilirubin (UB) (p=0.0024), and reduced serum cholesterol (p=0.0385). Recipients younger than 3 years had a risk of death 25.5 times higher than older children (95% CI: 1.3–487.7). The odds of death after transplantation for patients with TB above 20 mg/dL and UB above 6 mg/dL were 7.8 (95% CI: 1.2–50.1) and 12.7 (95% CI: 1.3–121.7) times higher, respectively, than for those with lower levels. Among the donor- and transplant-related characteristics, donor gender, donor gender and ABO blood group not identical to those of the recipient, donor/recipient weight ratio, cause of donor death, reduced-size graft, time on the waiting list and programme experience were not associated with death in the first 7 days. Transplants with grafts from donors aged up to 3 years, or weighing up to 12 kg, carried a risk of recipient death 6.8 (95% CI: 1.1–43.5) and 19.3 (95% CI: 1.3–281.6) times higher, respectively. Total ischaemia time was on average 2 hours longer in transplants of non-surviving recipients (p=0.0316). The Child-Pugh, Rodeck and UNOS prognostic models were not predictive of death. Patients classified as high risk by the Malatack model had odds of death 18.0 times higher (95% CI: 1.2–262.7) than those classified as low risk. First-week mortality was associated with high PELD scores, with a risk of death 11.3 times higher (95% CI: 1.2–107.0) in children with a PELD score above 10. Small children with greater hepatic dysfunction had a higher risk of early death. Small donor size and prolonged ischaemia time were also associated with mortality. Only the Malatack and PELD models were predictive of survival.
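The odds ratios with 95% confidence intervals reported above are the standard case-control quantities; the sketch below uses a hypothetical 2x2 table (not the study's counts) and a Wald interval on the log scale.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table from a case-control design:
# rows = exposure (e.g. recipient age <= 3 years: yes/no), columns = outcome (death/survival).
a, b = 5, 10   # exposed: deaths, survivors
c, d = 1, 29   # unexposed: deaths, survivors

or_hat = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)          # Wald standard error of ln(OR)
z = stats.norm.ppf(0.975)
lo, hi = np.exp(np.log(or_hat) + np.array([-z, z]) * se_log_or)
print(f"OR = {or_hat:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```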

Relevance: 100.00%

Abstract:

This report presents a theoretical analysis of the various schools of thought in business strategy concerning the question of competitive advantage. The analysis is based on a two-dimensional model that classifies strategy theories into four groups: (1) Industry Analysis, (2) Market Processes, (3) the Resource-Based View and (4) Dynamic Capabilities. The notions of competitive advantage inherent in each view are described and compared. A second section presents the results of an analysis aimed at selecting a set of high-performing Brazilian companies, together with a control group, based on the calculation of average return on equity. An MS Access database is an integral part of this report.

Relevance: 100.00%

Abstract:

The objective of this work is to obtain a dynamic liquidity measure for Brazilian stocks, called VNET. High-frequency data were used to build a model capable of measuring the excess of buys or sells associated with a price movement. Because it varies over time, VNET can be interpreted as the variation in the proportion of informed traders in an asymmetric-information model. Once estimated, it can be used to forecast changes in a stock's liquidity. VNET has important practical implications: traders can use it as a stochastic measure to identify the best moments to trade, and risk managers can estimate the expected price deterioration when liquidating a position, compare their alternatives and use it as a basis for execution optimisation. In building this work, we computed the price durations of each stock and the various measures associated with them. The data show that depth varies with the bid-ask spread, traded volume, the number of trades, the conditional price duration and its forecast error. The residuals of the VNET regression were well behaved, which supports the hypothesis that the model was well specified. To estimate the market reaction curve, we varied the price intervals used to define the durations.
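A simplified, hypothetical sketch of the two ingredients described above, price durations and the net signed volume accumulated over each duration, using simulated tick data; this is not the paper's exact VNET specification, only an illustration of the construction.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)

# Hypothetical tick data: trade prices, signed direction (+1 buy, -1 sell) and volumes.
n = 2000
direction = rng.choice([1, -1], size=n)
price = 20.0 + np.cumsum(direction * 0.01 * rng.random(n))
volume = rng.integers(100, 1000, size=n)

# Close a "price duration" whenever the price has moved by at least `threshold`
# since the start of the spell, then record the log of the absolute net signed
# volume traded within that spell (a VNET-like quantity).
threshold = 0.05
vnet, start, spell_start_price = [], 0, price[0]
for i in range(1, n):
    if abs(price[i] - spell_start_price) >= threshold:
        net_volume = np.sum(direction[start:i + 1] * volume[start:i + 1])
        vnet.append(np.log(abs(net_volume) + 1.0))
        start, spell_start_price = i + 1, price[min(i + 1, n - 1)]

print(pd.Series(vnet).describe())
```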