869 results for PROPORTIONAL HAZARD AND ACCELERATED FAILURE MODELS
Abstract:
Despite compelling preclinical data in colorectal cancer (CRC), the efficacy of HDACIs has been disappointing in the clinic. The goal of this study was to evaluate the effectiveness of vorinostat and panobinostat in a dose- and exposure-dependent manner in order to better understand the dynamics of drug action and antitumor efficacy. In a standard 72 h drug exposure MTS assay, notable concentration-dependent antiproliferative effects were observed in the IC50 range of 1.2-2.8 μmol/L for vorinostat and 5.1-17.5 nmol/L for panobinostat. However, shorter clinically relevant exposures of 3 or 6 h failed to elicit any significant growth inhibition, and in most cases a >24 h exposure to vorinostat or panobinostat was required to induce a sigmoidal dose-response. Similar results were observed in colony formation assays, where ≥24 h of exposure was required to effectively reduce colony formation. Induction of acetyl-H3, acetyl-H4 and p21 by vorinostat was transient and rapidly reversed within 12 h of drug removal. In contrast, panobinostat-induced acetyl-H3, acetyl-H4, and p21 persisted for 48 h after an initial 3 h exposure. Treatment of HCT116 xenografts with panobinostat induced significant increases in acetyl-H3 and downregulation of thymidylate synthase. Although HDACIs exert both potent growth inhibition and cytotoxic effects when CRC cells are exposed to drug for ≥24 h, these cells demonstrate an inherent ability to survive HDACI concentrations and exposure times that exceed those clinically achievable. Continued efforts to develop novel HDACIs with improved pharmacokinetics/pharmacodynamics, enhanced intratumoral delivery and class/isoform specificity are needed to improve the therapeutic potential of HDACIs and HDACI-based combination regimens in solid tumors.
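The sigmoidal dose-responses and IC50 ranges reported above are conventionally obtained by fitting a four-parameter logistic (Hill) curve to viability readouts. The sketch below uses hypothetical viability values, not the study's data, to illustrate the fitting step:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ic50, slope):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

# Hypothetical % viability at increasing drug concentrations (umol/L)
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
viab = np.array([98.0, 92.0, 70.0, 35.0, 12.0, 5.0])

params, _ = curve_fit(
    hill, conc, viab,
    p0=[5.0, 100.0, 2.0, 1.0],
    bounds=([0.0, 50.0, 0.01, 0.1], [20.0, 120.0, 20.0, 5.0]),
)
bottom, top, ic50, slope = params
print(f"estimated IC50 ~ {ic50:.2f} umol/L")
```

The same fit applied at each exposure time would show the flat response at 3-6 h versus the full sigmoid at ≥24 h described in the abstract.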
Abstract:
Ria de Aveiro is a very complex shallow-water coastal lagoon located in the northwest of Portugal. Important issues would be left unanswered without a good understanding of the hydrodynamic and transport processes occurring in the lagoon. Calibration and validation of hydrodynamic, salt and heat transport models for the Ria de Aveiro lagoon are presented. The calibration of the hydrodynamic model was performed by adjusting the bottom friction coefficient, through comparison between measured and predicted time series of sea surface elevation (SSE) for 22 stations. Harmonic analysis was performed in order to evaluate the model's accuracy. To validate the hydrodynamic model, measured and predicted SSE values were compared for 11 stations, as well as main flow direction velocities for 10 stations. The salt and heat transport models were calibrated by comparing measured and predicted time series of salinity and water temperature for 7 stations, and the root mean square (RMS) of the difference between the series was determined. These models were validated by comparing the model results with an independent field data set. The hydrodynamic and the salt and heat transport models for Ria de Aveiro were successfully calibrated and validated. They accurately reproduce the barotropic flows and can therefore adequately represent the salt and heat transport processes occurring in Ria de Aveiro.
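The RMS-of-difference criterion used to calibrate the transport models is a simple statistic; a minimal sketch with made-up salinity series (not the lagoon data) shows the computation:

```python
import numpy as np

def rms_difference(measured, predicted):
    """Root mean square of the difference between two time series."""
    d = np.asarray(measured, dtype=float) - np.asarray(predicted, dtype=float)
    return float(np.sqrt(np.mean(d ** 2)))

# Hypothetical salinity series (psu) at one station
measured  = [30.1, 28.4, 25.9, 27.2, 29.8]
predicted = [29.7, 28.9, 26.3, 26.8, 30.2]
print(rms_difference(measured, predicted))
```

In a calibration loop, the friction coefficient (or a transport parameter) would be adjusted until this statistic is acceptably small at every station.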
Abstract:
The design of neuro-fuzzy models is still a complex problem, as it involves not only the determination of the model parameters but also of the model structure. Of special importance is the incorporation of a priori information into the design process. In this paper, two known design algorithms for B-spline models are updated to account for equality restrictions on the function and its derivatives, which are important when the neural model is used for performing single- or multi-objective optimization on-line.
Abstract:
Preference-based measures of health have become widely used in health economics for the measurement of Health-Related Quality of Life. Hence, the development of preference-based measures of health has been a major concern for researchers throughout the world. This study aims to model health state preference data using a new preference-based measure of health (the SF-6D) and to suggest alternative models for predicting health state utilities using fixed and random effects models. It also seeks to investigate the problems found in the SF-6D and to suggest possible changes to it.
Abstract:
In modern measurement and control systems, the available time and resources are often not only limited but may change during the operation of the system. In these cases, so-called anytime algorithms can be used advantageously. While different soft computing methods are widely used in system modeling, their usability in these cases is limited.
Abstract:
This article examines education reform under the first government of Northern Ireland (1921–5). This embryonic period offered the Ulster Unionist leadership a chance to construct a more inclusive society, one that might diminish sectarian animosities, and thereby secure the fledgling state through cooperation rather than coercion. Such aspirations were severely tested by the ruling party’s need to secure the state against insurgency, and, more lastingly, to assuage the concerns of its historic constituency. The former led to a draconian security policy, the latter to a dependency on populist strategies and rhetoric. It is argued here, however, that this dependency was not absolute until July 1925. Before that, the Belfast government withstood growing pressure from populist agitators to reverse controversial aspects of its education reforms, only relenting when Protestant disaffection threatened the unity of the governing party and the existence of the state.
Abstract:
In this paper we study a model for HIV and TB coinfection. We consider the integer order and the fractional order versions of the model. Let α∈[0.78,1.0] be the order of the fractional derivative; the integer order model is obtained for α=1.0. The model includes vertical transmission for HIV and treatment for both diseases. We compute the reproduction number of the integer order model and of the HIV and TB submodels, and the stability of the disease-free equilibrium. We sketch the bifurcation diagrams of the integer order model for variation of the average number of sexual partners per person and per unit time, and of the tuberculosis transmission rate. We analyze numerical results of the fractional order model for different values of α, including α=1.0. The results show distinct types of transients for variation of α. Moreover, we speculate, from observation of the numerical results, that the order of the fractional derivative may behave as a bifurcation parameter for the model. We conclude that the dynamics of the integer and the fractional order versions of the model are very rich and that together these versions may provide a better understanding of the dynamics of HIV and TB coinfection.
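The effect of the fractional order α on transients can be illustrated with a Grünwald-Letnikov discretization. The toy scalar equation, rate, and step size below are illustrative assumptions, not the paper's coinfection system (which couples several such equations):

```python
def gl_weights(alpha, n):
    """Grunwald-Letnikov weights c_j = (-1)^j * binom(alpha, j)."""
    w = [1.0]
    for j in range(1, n + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / j))
    return w

def solve_fractional_decay(alpha, lam=1.0, y0=1.0, h=0.01, steps=500):
    """Explicit GL scheme for D^alpha y = -lam * y (zero history assumed)."""
    w = gl_weights(alpha, steps)
    y = [y0]
    for n in range(1, steps + 1):
        hist = sum(w[j] * y[n - j] for j in range(1, n + 1))
        y.append(h ** alpha * (-lam * y[-1]) - hist)
    return y

for a in (0.78, 1.0):
    traj = solve_fractional_decay(a)
    print(a, traj[-1])
```

For α=1.0 the recursion collapses to the forward Euler step y_n = y_{n-1}(1 - hλ); for α<1 the full memory term produces the heavier-tailed transients that motivate viewing α as a possible bifurcation parameter.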
Abstract:
BACKGROUND & AIMS: Hepatitis C virus (HCV) induces chronic infection in 50% to 80% of infected persons; approximately 50% of these do not respond to therapy. We performed a genome-wide association study to screen for host genetic determinants of HCV persistence and response to therapy. METHODS: The analysis included 1362 individuals: 1015 with chronic hepatitis C and 347 who spontaneously cleared the virus (448 were coinfected with human immunodeficiency virus [HIV]). Responses to pegylated interferon alfa and ribavirin were assessed in 465 individuals. Associations between more than 500,000 single nucleotide polymorphisms (SNPs) and outcomes were assessed by multivariate logistic regression. RESULTS: Chronic hepatitis C was associated with SNPs in the IL28B locus, which encodes the antiviral cytokine interferon lambda. The rs8099917 minor allele was associated with progression to chronic HCV infection (odds ratio [OR], 2.31; 95% confidence interval [CI], 1.74-3.06; P = 6.07 × 10⁻⁹). The association was observed in HCV mono-infected (OR, 2.49; 95% CI, 1.64-3.79; P = 1.96 × 10⁻⁵) and HCV/HIV coinfected individuals (OR, 2.16; 95% CI, 1.47-3.18; P = 8.24 × 10⁻⁵). rs8099917 was also associated with failure to respond to therapy (OR, 5.19; 95% CI, 2.90-9.30; P = 3.11 × 10⁻⁸), with the strongest effects in patients with HCV genotype 1 or 4. This risk allele was identified in 24% of individuals with spontaneous HCV clearance, 32% of chronically infected patients who responded to therapy, and 58% who did not respond (P = 3.2 × 10⁻¹⁰). Resequencing of IL28B identified distinct haplotypes that were associated with the clinical phenotype. CONCLUSIONS: The association of the IL28B locus with natural and treatment-associated control of HCV indicates the importance of innate immunity and interferon lambda in the pathogenesis of HCV infection.
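The odds ratios and confidence intervals above follow the standard transformation of a fitted logistic-regression coefficient, OR = exp(β) with 95% CI exp(β ± 1.96·SE). A minimal sketch with illustrative numbers (not the study's actual regression output):

```python
import math

# Hypothetical logistic-regression output for a risk allele:
# log-odds coefficient and its standard error (illustrative values only).
beta, se = 0.84, 0.14

odds_ratio = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)
print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```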
Hydraulic and fluvial geomorphological models for a bedrock channel reach of the Twenty Mile Creek
Abstract:
Bedrock channels have been considered challenging geomorphic settings for the application of numerical models. Bedrock fluvial systems exhibit boundaries that are typically less mobile than alluvial systems, yet they are still dynamic systems with a high degree of spatial and temporal variability. To understand the variability of fluvial systems, numerical models have been developed to quantify flow magnitudes and patterns as the driving force for geomorphic change. Two types of numerical model were assessed for their efficacy in examining the bedrock channel system consisting of a high-gradient portion of the Twenty Mile Creek in the Niagara Region of Ontario, Canada. A one-dimensional (1-D) flow model that utilizes energy equations, HEC-RAS, was used to determine velocity distributions through the study reach for the mean annual flood (MAF), the 100-year return flood and the 1,000-year return flood. A two-dimensional (2-D) flow model that makes use of the Navier-Stokes equations, RMA2, was created with the same objectives. The 2-D modeling effort was not successful due to the spatial complexity of the system (high slope and high variance). The successful 1-D model runs were further extended using very high resolution geospatial interpolations inherent to the HEC-RAS extension, HEC-GeoRAS. The modeled velocity data then formed the basis for a geomorphological analysis that focused upon large particles (boulders) and the forces needed to mobilize them. Several existing boulders were examined by collecting detailed measurements to derive three-dimensional physical models for the application of fluid and solid mechanics to predict movement in the study reach. An imaginary unit cuboid (1 metre by 1 metre by 1 metre) boulder was also envisioned to determine the general propensity for the movement of such a boulder through the bedrock system.
The efforts and findings of this study provide a standardized means for the assessment of large particle movement in a bedrock fluvial system. Further efforts may expand upon this standardization by modeling differing boulder configurations (platy boulders, etc.) at a high level of resolution.
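The movement criterion for the imaginary unit cuboid can be sketched as a simple force balance: drag on the upstream face from the modeled velocity versus frictional resistance from the boulder's submerged weight. The parameter values below (drag coefficient, friction coefficient, rock density) are illustrative assumptions, not the study's calibrated values:

```python
# Sliding force balance for a submerged 1 m cube (illustrative sketch).
RHO_W = 1000.0   # water density, kg/m^3
RHO_S = 2650.0   # rock density, kg/m^3 (assumed)
G = 9.81         # gravitational acceleration, m/s^2
CD = 1.05        # drag coefficient for a cube (assumed)
MU = 0.6         # static friction coefficient (assumed)
SIDE = 1.0       # cube edge length, m

def will_slide(velocity):
    """Compare drag on the upstream face with friction from submerged weight."""
    drag = 0.5 * RHO_W * CD * SIDE ** 2 * velocity ** 2
    submerged_weight = (RHO_S - RHO_W) * G * SIDE ** 3
    return drag > MU * submerged_weight

for v in (2.0, 4.0, 6.0):
    print(v, will_slide(v))
```

Substituting the 1-D modeled velocities for the MAF and return floods into such a balance yields the propensity-for-movement assessment described above.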
Abstract:
The purpose of this study is to examine the impact of the choice of cut-off points, sampling procedures, and the business cycle on the accuracy of bankruptcy prediction models. Misclassification can result in erroneous predictions leading to prohibitive costs to firms, investors and the economy. To test the impact of the choice of cut-off points and sampling procedures, three bankruptcy prediction models are assessed: Bayesian, Hazard and Mixed Logit. A salient feature of the study is that the analysis includes both parametric and nonparametric bankruptcy prediction models. A sample of firms from the Lynn M. LoPucki Bankruptcy Research Database in the U.S. was used to evaluate the relative performance of the three models. The choice of cut-off point and sampling procedures were found to affect the rankings of the various models. In general, the results indicate that the empirical cut-off point estimated from the training sample resulted in the lowest misclassification costs for all three models. Although the Hazard and Mixed Logit models resulted in lower costs of misclassification in the randomly selected samples, the Mixed Logit model did not perform as well across varying business cycles. In general, the Hazard model has the highest predictive power. However, the higher predictive power of the Bayesian model, when the ratio of the cost of Type I errors to the cost of Type II errors is high, is relatively consistent across all sampling methods. Such an advantage of the Bayesian model may make it more attractive in the current economic environment. This study extends recent research comparing the performance of bankruptcy prediction models by identifying the conditions under which a model performs better. It also allays the concerns of a range of user groups, including auditors, shareholders, employees, suppliers, rating agencies, and creditors, with respect to assessing failure risk.
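Estimating an empirical cut-off point from a training sample amounts to scanning candidate probability thresholds and keeping the one that minimizes total misclassification cost. The sketch below uses made-up predicted probabilities and an assumed 10:1 Type I/Type II cost ratio, not the study's data:

```python
def best_cutoff(probs, labels, cost_type1=10.0, cost_type2=1.0):
    """Threshold minimizing total misclassification cost.

    labels: 1 = bankrupt, 0 = healthy. Type I error = bankrupt firm
    classified healthy; Type II error = healthy firm classified bankrupt.
    """
    best_t, best_cost = None, float("inf")
    for t in sorted(set(probs)):
        cost = 0.0
        for p, y in zip(probs, labels):
            pred = 1 if p >= t else 0
            if y == 1 and pred == 0:
                cost += cost_type1
            elif y == 0 and pred == 1:
                cost += cost_type2
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t, best_cost

# Hypothetical bankruptcy probabilities from a fitted model
probs = [0.05, 0.10, 0.20, 0.35, 0.60, 0.80, 0.90]
labels = [0, 0, 0, 1, 0, 1, 1]
print(best_cutoff(probs, labels))
```

Because the chosen threshold shifts with the assumed cost ratio and the sample composition, model rankings can change with cut-off choice and sampling procedure, which is the effect the study measures.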
Abstract:
Although insurers face adverse selection and moral hazard when they set insurance contracts, these two types of asymmetrical information have been given separate treatments so far in the economic literature. This paper is a first attempt to integrate both problems into a single model. We show how it is possible to use time in order to achieve a first-best allocation of risks when both problems are present simultaneously.
Abstract:
The GARCH and Stochastic Volatility paradigms are often brought into conflict as two competing views of the appropriate conditional variance concept: conditional variance given past values of the same series, or conditional variance given a larger past information set (possibly including unobservable state variables). The main thesis of this paper is that, since in general the econometrician has no idea about something like a structural level of disaggregation, a well-written volatility model should be specified in such a way that one is always allowed to reduce the information set without invalidating the model. In this respect, the debate between observable past information (in the GARCH spirit) and unobservable conditioning information (in the state-space spirit) is irrelevant. In this paper, we stress a square-root autoregressive stochastic volatility (SR-SARV) model which remains true to the GARCH paradigm of ARMA dynamics for squared innovations but weakens the GARCH structure in order to obtain the required robustness properties with respect to various kinds of aggregation. It is shown that the lack of robustness of the usual GARCH setting is due to two very restrictive assumptions: perfect linear correlation between squared innovations and conditional variance on the one hand, and a linear relationship between the conditional variance of the future conditional variance and the squared conditional variance on the other hand. By relaxing these assumptions, thanks to a state-space setting, we obtain aggregation results without renouncing the conditional variance concept (and related leverage effects), as is the case for the recently suggested weak GARCH model, which obtains aggregation results by replacing conditional expectations with linear projections on symmetric past innovations.
Moreover, unlike the weak GARCH literature, we are able to define multivariate models, including higher order dynamics and risk premiums (in the spirit of GARCH(p,p) and GARCH-in-mean), and to derive conditional moment restrictions well suited for statistical inference. Finally, we are able to characterize the exact relationships between our SR-SARV models (including higher order dynamics, leverage effect and in-mean effect), usual GARCH models and continuous time stochastic volatility models, so that previous results about aggregation of weak GARCH and continuous time GARCH modeling can be recovered in our framework.
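For reference, the benchmark recursion that this framework weakens, the standard GARCH(1,1) in which the conditional variance is an exact linear function of the lagged squared innovation and lagged variance, can be sketched as follows (parameter values are illustrative):

```python
import random

def simulate_garch11(n, omega=0.05, alpha=0.10, beta=0.85, seed=42):
    """Simulate eps_t with sigma_t^2 = omega + alpha*eps_{t-1}^2 + beta*sigma_{t-1}^2."""
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    eps_prev = var ** 0.5
    out = []
    for _ in range(n):
        var = omega + alpha * eps_prev ** 2 + beta * var
        eps_prev = rng.gauss(0.0, 1.0) * var ** 0.5
        out.append(eps_prev)
    return out

returns = simulate_garch11(1000)
print(sum(r * r for r in returns) / len(returns))  # near the unconditional variance
```

The SR-SARV construction replaces this deterministic variance update with a square-root autoregressive latent state, which is what buys the robustness to temporal and cross-sectional aggregation discussed above.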