993 results for Hazard Models
Abstract:
BACKGROUND Several lines of evidence indicate that the gut microbiota is involved in the control of host energy metabolism. OBJECTIVE To evaluate differences in the composition of the gut microbiota in rat models under different nutritional status and physical activity, and to identify their associations with serum leptin and ghrelin levels. METHODS In a case-control study, forty male rats were randomly assigned to one of four experimental groups: an ABA group with food restriction and free access to exercise; a control ABA group with food restriction and no access to exercise; an exercise group with free access to exercise and fed ad libitum; and an ad libitum group with no access to exercise and fed ad libitum. Fecal bacterial composition was investigated by PCR-denaturing gradient gel electrophoresis and real-time qPCR. RESULTS In restricted eaters, we found a significant increase in the numbers of Proteobacteria, Bacteroides, Clostridium, Enterococcus, Prevotella and M. smithii and a significant decrease in the quantities of Actinobacteria, Firmicutes, Bacteroidetes, the B. coccoides-E. rectale group, Lactobacillus and Bifidobacterium with respect to unrestricted eaters. Moreover, a significant increase in the numbers of Lactobacillus, Bifidobacterium and the B. coccoides-E. rectale group was observed in the exercise group with respect to the remaining groups. We also found a significant positive correlation between the quantities of Bifidobacterium and Lactobacillus and serum leptin levels, and a significant negative correlation between the numbers of Clostridium, Bacteroides and Prevotella and serum leptin levels in all experimental groups. Furthermore, serum ghrelin levels were negatively correlated with the quantities of Bifidobacterium, Lactobacillus and the B. coccoides-Eubacterium rectale group and positively correlated with the numbers of Bacteroides and Prevotella.
CONCLUSIONS Nutritional status and physical activity alter the composition of the gut microbiota, affecting its diversity and similarity. This study highlights associations between the gut microbiota and appetite-regulating hormones that may be important in terms of satiety and host metabolism.
Abstract:
A variety of host immunogenetic factors appear to influence both an individual's susceptibility to infection with Mycobacterium leprae and the pathological course of the disease. Animal models can contribute to a better understanding of the role of immunogenetics in leprosy through comparative studies, helping to confirm the significance of various identified traits and to decipher the underlying mechanisms that may be involved in the expression of different disease-related phenotypes. Genetically engineered mice, with specific immune or biochemical pathway defects, are particularly useful for investigating granuloma formation and resistance to infection, and are shedding new light on borderline areas of the leprosy spectrum that are clinically unstable and have a tendency toward immunological complications. Although the armadillo model is less developed in this regard, these animals are the only other natural hosts of M. leprae, and they present a unique opportunity for the comparative study of genetic markers and mechanisms associated with disease susceptibility or resistance, especially the neurological aspects of leprosy. In this paper, we review the recent contributions of genetically engineered mice and armadillos toward our understanding of the immunogenetics of leprosy.
Abstract:
BACKGROUND Observational studies implicate higher dietary energy density (DED) as a potential risk factor for weight gain and obesity. It has been hypothesized that DED may also be associated with risk of type 2 diabetes (T2D), but limited evidence exists. Therefore, we investigated the association between DED and risk of T2D in a large prospective study with heterogeneity of dietary intake. METHODOLOGY/PRINCIPAL FINDINGS A case-cohort study was nested within the European Prospective Investigation into Cancer (EPIC) study of 340,234 participants contributing 3.99 million person-years of follow-up, identifying 12,403 incident diabetes cases and a random subcohort of 16,835 individuals from 8 European countries. DED was calculated as energy (kcal) from foods (except beverages) divided by the weight (grams) of foods estimated from dietary questionnaires. Prentice-weighted Cox proportional hazards regression models were fitted by country. Risk estimates were pooled by random-effects meta-analysis and heterogeneity was evaluated. Estimated mean (SD) DED was 1.5 (0.3) kcal/g among cases and subcohort members, varying across countries (range 1.4-1.7 kcal/g). After adjustment for age, sex, smoking, physical activity, alcohol intake, energy intake from beverages and misreporting of dietary intake, no association was observed between DED and T2D (HR 1.02, 95% CI: 0.93-1.13), which was consistent across countries (I² = 2.9%). CONCLUSIONS/SIGNIFICANCE In this large European case-cohort study, no association between the DED of solid and semi-solid foods and risk of T2D was observed. However, although there is currently no conclusive evidence for an association between DED and T2D risk, choosing foods of low energy density should be promoted, as this supports current WHO recommendations to prevent chronic diseases.
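The quantities reported above follow directly from the study's definitions: DED is energy divided by food weight, and the hazard ratio with its confidence interval is the exponentiated Cox log-hazard coefficient. A minimal stdlib sketch, using illustrative numbers rather than the study's fitted coefficients:

```python
import math

def energy_density(kcal, grams):
    """Dietary energy density: energy from (non-beverage) foods per gram of food."""
    return kcal / grams

def hazard_ratio_ci(beta, se, z=1.96):
    """Turn a Cox log-hazard coefficient and its standard error into a
    hazard ratio with an approximate 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative values only (not taken from the study's model output):
ded = energy_density(450, 300)            # 450 kcal in 300 g of food -> 1.5 kcal/g
hr, lo, hi = hazard_ratio_ci(0.0198, 0.0497)
print(f"DED = {ded:.1f} kcal/g, HR = {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```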
Abstract:
OBJECTIVES: To assess the extent to which stage at diagnosis and adherence to treatment guidelines may explain the persistent differences in colorectal cancer survival between the USA and Europe. DESIGN: A high-resolution study using detailed clinical data on Dukes' stage, diagnostic procedures, treatment and follow-up, collected directly from medical records by trained abstractors under a single protocol, with standardised quality control and central statistical analysis. SETTING AND PARTICIPANTS: 21 population-based registries in seven US states and nine European countries provided data for random samples comprising 12 523 adults (15-99 years) diagnosed with colorectal cancer during 1996-1998. OUTCOME MEASURES: Logistic regression models were used to compare adherence to 'standard care' in the USA and Europe. Net survival and excess risk of death were estimated with flexible parametric models. RESULTS: The proportion of Dukes' A and B tumours was similar in the USA and Europe, while that of Dukes' C was more frequent in the USA (38% vs 21%) and of Dukes' D more frequent in Europe (22% vs 10%). Resection with curative intent was more frequent in the USA (85% vs 75%). Elderly patients (75-99 years) were 70-90% less likely to receive radiotherapy and chemotherapy. Age-standardised 5-year net survival was similar in the USA (58%) and Northern and Western Europe (54-56%) and lowest in Eastern Europe (42%). The mean excess hazard up to 5 years after diagnosis was highest in Eastern Europe, especially among elderly patients and those with Dukes' D tumours. CONCLUSIONS: The wide differences in colorectal cancer survival between Europe and the USA in the late 1990s are probably attributable to earlier stage and more extensive use of surgery and adjuvant treatment in the USA. Elderly patients with colorectal cancer received surgery, chemotherapy or radiotherapy less often than younger patients, despite evidence that they could also have benefited.
Abstract:
BACKGROUND The purpose of this study was to assess the incidence of neurological complications in patients with infective endocarditis, the risk factors for their development, their influence on the clinical outcome, and the impact of cardiac surgery. METHODS AND RESULTS This was a retrospective analysis of prospectively collected data on a multicenter cohort of 1345 consecutive episodes of left-sided infective endocarditis from 8 centers in Spain. Cox regression models were developed to analyze variables predictive of neurological complications and associated mortality. Three hundred forty patients (25%) experienced such complications: 192 (14%) had ischemic events, 86 (6%) had encephalopathy/meningitis, 60 (4%) had hemorrhages, and 2 (<1%) had brain abscesses. Independent risk factors associated with all neurological complications were vegetation size ≥3 cm (hazard ratio [HR] 1.91), Staphylococcus aureus as a cause (HR 2.47), mitral valve involvement (HR 1.29), and anticoagulant therapy (HR 1.31). This last variable was particularly related to a greater incidence of hemorrhagic events (HR 2.71). Overall mortality was 30%, and neurological complications had a negative impact on outcome (45% of deaths versus 24% in patients without these complications; P<0.01), although only moderate to severe ischemic stroke (HR 1.63) and brain hemorrhage (HR 1.73) were significantly associated with a poorer prognosis. Antimicrobial treatment reduced the risk of neurological complications by 33% to 75%. In patients with hemorrhage, mortality was higher when surgery was performed within 4 weeks of the hemorrhagic event (75% versus 40% with later surgery). CONCLUSIONS Moderate to severe ischemic stroke and brain hemorrhage were found to have a significant negative impact on the outcome of infective endocarditis. Early appropriate antimicrobial treatment is critical, and transitory discontinuation of anticoagulant therapy should be considered.
Abstract:
Background Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species, because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions of species. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors was predefined. We evaluated the effect of using (a) real absences, (b) pseudo-absences selected randomly from the background, and (c) two-step approaches: pseudo-absences selected from low-suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results Models built with true absences had the best predictive power, best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second-highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion If ecologists wish to build parsimonious GLMs that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences, and to perform model selection based on an information-theoretic approach. However, the resulting models can be expected to have limited fit.
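The recommended strategy (b) — drawing pseudo-absences at random from the background while excluding presence locations — can be sketched in a few lines. This is a toy illustration on a hypothetical grid landscape, not the study's actual sampling code:

```python
import random

def sample_pseudo_absences(background, presences, n, seed=0):
    """Draw n pseudo-absence cells uniformly at random from the background,
    excluding cells where the species is known to be present."""
    occupied = set(presences)
    candidates = [cell for cell in background if cell not in occupied]
    return random.Random(seed).sample(candidates, n)

# Toy landscape: a 10 x 10 grid of cells, 5 of which hold the species.
grid = [(x, y) for x in range(10) for y in range(10)]
present = [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
pseudo = sample_pseudo_absences(grid, present, n=20)
print(len(pseudo))
```

The sampled cells would then be coded as 0s against the presences' 1s and passed to a logistic regression (GLM) fit, followed by AIC-based model selection.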
Abstract:
Heart tissue inflammation, progressive fibrosis and electrocardiographic alterations occur in approximately 30% of patients infected by Trypanosoma cruzi, 10-30 years after infection. Further, plasma levels of tumour necrosis factor (TNF) and nitric oxide (NO) are associated with the degree of heart dysfunction in chronic chagasic cardiomyopathy (CCC). Thus, our aim was to establish experimental models that mimic a range of parasitological, pathological and cardiac alterations described in patients with chronic Chagas’ heart disease and evaluate whether heart disease severity was associated with increased TNF and NO levels in the serum. Our results show that C3H/He mice chronically infected with the Colombian T. cruzi strain have more severe cardiac parasitism and inflammation than C57BL/6 mice. In addition, connexin 43 disorganisation and fibronectin deposition in the heart tissue, increased levels of creatine kinase cardiac MB isoenzyme activity in the serum and more severe electrical abnormalities were observed in T. cruzi-infected C3H/He mice compared to C57BL/6 mice. Therefore, T. cruzi-infected C3H/He and C57BL/6 mice represent severe and mild models of CCC, respectively. Moreover, the CCC severity paralleled the TNF and NO levels in the serum. Therefore, these models are appropriate for studying the pathophysiology and biomarkers of CCC progression, as well as for testing therapeutic agents for patients with Chagas’ heart disease.
Abstract:
This is a retrospective study comparing mobility and scapulohumeral impingement between two different models of reverse shoulder prosthesis. These prostheses were implanted in patients with irreparable rotator cuff tears. This surgery is not free of complications, one of the most common being scapulohumeral impingement, or notching.
Abstract:
Cloud computing and its three facets (Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS)) are terms that denote new developments in the software industry. In particular, PaaS solutions, also referred to as cloud platforms, are changing the way software is being produced, distributed, consumed, and priced. Software vendors have started considering cloud platforms as a strategic option but are battling to redefine their offerings to embrace PaaS. In contrast to SaaS and IaaS, PaaS allows for value co-creation with partners to develop complementary components and applications. It thus requires multisided business models that bring together two or more distinct customer segments. Understanding how to design PaaS business models to establish a flourishing ecosystem is crucial for software vendors. This doctoral thesis aims to address this issue in three interrelated research parts. First, based on case study research, the thesis provides a deeper understanding of current PaaS business models and their evolution. Second, it analyses and simulates consumers' preferences regarding PaaS business models, using a conjoint approach to find out what determines the choice of cloud platforms. Finally, building on the previous research outcomes, the third part introduces a design theory for the emerging class of PaaS business models, which is grounded on an extensive action design research study with a large European software vendor. Understanding PaaS business models from a market as well as a consumer perspective will, together with the design theory, inform and guide decision makers in their business model innovation plans. It also closes gaps in the research related to PaaS business model design and more generally related to platform business models.
Abstract:
BACKGROUND: Histologic grade in breast cancer provides clinically important prognostic information. However, 30%-60% of tumors are classified as histologic grade 2. This grade is associated with an intermediate risk of recurrence and is thus not informative for clinical decision making. We examined whether histologic grade was associated with gene expression profiles of breast cancers and whether such profiles could be used to improve histologic grading. METHODS: We analyzed microarray data from 189 invasive breast carcinomas and from three published gene expression datasets from breast carcinomas. We identified differentially expressed genes in a training set of 64 estrogen receptor (ER)-positive tumor samples by comparing expression profiles between histologic grade 3 tumors and histologic grade 1 tumors and used the expression of these genes to define the gene expression grade index. Data from 597 independent tumors were used to evaluate the association between relapse-free survival and the gene expression grade index in a Kaplan-Meier analysis. All statistical tests were two-sided. RESULTS: We identified 97 genes in our training set that were associated with histologic grade; most of these genes were involved in cell cycle regulation and proliferation. In validation datasets, the gene expression grade index was strongly associated with histologic grade 1 and 3 status; however, among histologic grade 2 tumors, the index spanned the values for histologic grade 1-3 tumors. Among patients with histologic grade 2 tumors, a high gene expression grade index was associated with a higher risk of recurrence than a low gene expression grade index (hazard ratio = 3.61, 95% confidence interval = 2.25 to 5.78; P < .001, log-rank test). CONCLUSIONS: Gene expression grade index appeared to reclassify patients with histologic grade 2 tumors into two groups with high versus low risks of recurrence. 
This approach may improve the accuracy of tumor grading and thus its prognostic value.
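The relapse-free survival comparison described above rests on the Kaplan-Meier product-limit estimator. A minimal stdlib sketch with toy data (not the study's cohort):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times:  follow-up time for each patient
    events: 1 if relapse was observed at that time, 0 if censored.
    Returns (time, survival) pairs at each event time."""
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if deaths:
            surv *= 1 - deaths / at_risk       # multiply by conditional survival
            curve.append((t, surv))
        at_risk -= sum(1 for ti in times if ti == t)  # drop events and censorings
    return curve

# Toy relapse-free survival data: follow-up in months, event indicator.
print(kaplan_meier([6, 7, 10, 15, 19, 25], [1, 0, 1, 1, 0, 1]))
```

Separate curves for the high- and low-index groups, compared with a log-rank test, would reproduce the kind of analysis the abstract reports.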
Abstract:
Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the actual interaction of each gene is believed to play a key role in the stability of the structure. With advances in biology, some effort has been devoted to developing update functions in Boolean models that incorporate recent knowledge. We combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell cycle and the mouse embryonic stem cell, as the topological support for our system. On these structures, we substitute the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function. The results of this validation hint at the increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualize the phase transition between order and chaos through the critical regime using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, which also allows one to discriminate between regimes in a quantitative way. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model includes experimentally derived biological information and recent discoveries, which makes it potentially useful for guiding experimental research.
The update function confers additional realism to the model while reducing the complexity and the solution space, thus making it easier to investigate.
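One plausible form of the threshold-based update rule described above — activators contribute positively, repressors negatively, and the gene switches on a signed sum. The exact tie-breaking convention (holding the previous state) is an assumption, as conventions vary across Boolean-network models:

```python
def threshold_update(state, inputs):
    """Threshold-based Boolean update for one gene.
    state:  current 0/1 value of the target gene
    inputs: list of (regulator_state, sign) pairs; sign is +1 for an
            activating interaction, -1 for a repressing one."""
    total = sum(s * sign for s, sign in inputs)
    if total > 0:
        return 1          # net activation switches the gene on
    if total < 0:
        return 0          # net repression switches it off
    return state          # tie: gene keeps its previous state

# Two active activators and one active repressor: net input +1, gene turns on.
print(threshold_update(0, [(1, +1), (1, +1), (1, -1)]))
```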
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of one of the biggest financial markets, the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data, sampled at one-minute frequency, from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped under the so-called technical models and the latter under so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as a well-defined scientific predictor if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving. This is why we emphasize the calibration of the strategies' parameters to adapt to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
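The Ornstein-Uhlenbeck spread dynamics and a z-score entry rule of the kind used in pairs trading can be sketched as follows. This is a Python illustration with made-up parameter values (the thesis's own implementation was in MATLAB); the Euler-Maruyama step is the standard discretisation of the OU SDE dX = θ(μ − X)dt + σdW:

```python
import math
import random

def simulate_ou(theta, mu, sigma, x0, dt, n, seed=0):
    """Euler-Maruyama simulation of an Ornstein-Uhlenbeck process,
    a common model for a mean-reverting pairs-trading spread."""
    rng = random.Random(seed)
    x, path = x0, [x0]
    for _ in range(n):
        x += theta * (mu - x) * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
        path.append(x)
    return path

def zscore_signal(spread, mean, std, entry=2.0):
    """Market-neutral entry rule: short the spread when it sits `entry`
    standard deviations above its mean, long when below, flat otherwise."""
    z = (spread - mean) / std
    if z > entry:
        return -1   # short the spread
    if z < -entry:
        return 1    # long the spread
    return 0

# One year of daily steps on a spread that starts displaced from its mean.
path = simulate_ou(theta=2.0, mu=0.0, sigma=0.3, x0=1.0, dt=1/252, n=252)
print(zscore_signal(path[-1], mean=0.0, std=0.3))
```

In a real backtest, θ, μ and σ would be re-calibrated on a rolling window, which is exactly the calibration sensitivity the abstract discusses.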
Abstract:
In this paper we propose a parsimonious regime-switching approach to model the correlations between assets, the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006), but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid-search procedure. In addition, it is easy to guarantee a positive definite correlation matrix, because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds and major exchange rates, first separately and then jointly. We also test and allow for different parts of the correlation matrix to be governed by different transition variables. For this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano-type test. We conclude that threshold correlation modelling gives rise to a significant reduction in portfolio variance.
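The core of the estimator described above — within-regime sample correlations after splitting on a transition variable — can be illustrated in a toy two-asset, single-threshold form. Here the threshold c is fixed for clarity; in the paper it would be chosen by the grid search:

```python
def pearson(xs, ys):
    """Sample (Pearson) correlation of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def threshold_correlations(returns_a, returns_b, transition, c):
    """Split observations by threshold c on the transition variable and
    return the sample correlation of the two return series within each
    regime (each a valid correlation by construction)."""
    low  = [(a, b) for a, b, z in zip(returns_a, returns_b, transition) if z <= c]
    high = [(a, b) for a, b, z in zip(returns_a, returns_b, transition) if z > c]
    return pearson(*zip(*low)), pearson(*zip(*high))

# Toy data: perfectly positive comovement in regime 1, negative in regime 2.
a = [1, 2, 3, 4, 5, 6]
b = [1, 2, 3, 6, 5, 4]
z = [0, 0, 0, 1, 1, 1]
print(threshold_correlations(a, b, z, c=0.5))
```

A grid search over candidate values of c, scoring each split by a likelihood or loss criterion, would complete the estimation step the abstract refers to.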
Abstract:
This paper studies the limits of discrete-time repeated games with public monitoring. We solve and characterize the Abreu, Milgrom and Pearce (1991) problem. We find that in the "bad" ("good") news model, the lower (higher) magnitude events suggest cooperation, i.e., zero punishment probability, while the higher (lower) magnitude events suggest defection, i.e., punishment with probability one. Public correlation is used to connect these two sets of signals and to make the enforceability constraint bind. The dynamic and limit behavior of the punishment probabilities for variations in ... (the discount rate) and ... (the time interval) are characterized, as well as the limit payoffs for all these scenarios (we also introduce uncertainty in the time domain). The obtained ... limits are, to the best of my knowledge, new. The obtained ... limits coincide with Fudenberg and Levine (2007) and Fudenberg and Olszewski (2011), with the exception that we clearly state the precise informational conditions that cause the limit to converge from above, to converge from below, or to degenerate. JEL: C73, D82, D86. KEYWORDS: Repeated Games, Frequent Monitoring, Random Public Monitoring, Moral Hazard, Stochastic Processes.