806 results for Heterogeneous Stock (HS)
Abstract:
In this paper we present a Linguistic Meta-Model (LMM) that allows a semiotic-cognitive representation of knowledge. LMM is freely available and integrates the schemata of linguistic knowledge resources, such as WordNet and FrameNet, as well as foundational ontologies, such as DOLCE and its extensions. In addition, LMM can deal with multilinguality and represent individuals and facts from an open-domain perspective.
Abstract:
Etravirine (ETV) is recommended in combination with a boosted protease inhibitor plus an optimized background regimen for salvage therapy, but there is limited experience with its use in combination with two nucleos(t)ide reverse-transcriptase inhibitors (NRTIs). This multicenter study aimed to assess the efficacy of this combination in two scenarios: group A, subjects without virologic failure on, or no prior experience with, non-nucleoside reverse-transcriptase inhibitors (NNRTIs) who were switched due to adverse events; and group B, subjects switched after a virologic failure on an efavirenz- or nevirapine-based regimen. The primary endpoint was efficacy at 52 weeks analysed by intention-to-treat. Virologic failure was defined as the inability to suppress plasma HIV-RNA to <50 copies/mL after 24 weeks on treatment, or a confirmed viral load >200 copies/mL in patients who had previously achieved viral suppression or had an undetectable viral load at inclusion. Two hundred eighty-seven patients were included. Treatment efficacy rates in groups A and B were 88.0% (CI95, 83.9-92.1%) and 77.4% (CI95, 65.0-89.7%), respectively; the rates reached 97.2% (CI95, 95.1-99.3%) and 90.5% (CI95, 81.7-99.3%) by on-treatment analysis. Once-daily ETV was as effective as the twice-daily dosing regimen. Grade 1-2 adverse events motivated a treatment switch in 4.2% of the subjects. In conclusion, ETV (once- or twice-daily) plus two analogs is a suitable, well-tolerated combination both as a switching strategy and after failure with first-generation NNRTIs, ensuring full drug activity. TRIAL REGISTRATION: ClinicalTrials.gov NCT01437241.
Abstract:
C57BL/6J mice were fed a high-fat, carbohydrate-free diet (HFD) for 9 mo. Approximately 50% of the mice became obese and diabetic (ObD), approximately 10% lean and diabetic (LD), approximately 10% lean and nondiabetic (LnD), and approximately 30% displayed an intermediate phenotype. All of the HFD mice were insulin resistant. In the fasted state, whole body glucose clearance was reduced in ObD mice, unchanged in the LD mice, and increased in the LnD mice compared with the normal-chow mice. Because fasted ObD mice were hyperinsulinemic and the lean mice slightly insulinopenic, there was no correlation between insulin levels and increased glucose utilization. In vivo, tissue glucose uptake assessed by 2-[(14)C]deoxyglucose accumulation was reduced in most muscles in the ObD mice but increased in the LnD mice compared with the values of the control mice. In the LD mice, the glucose uptake rates were reduced in extensor digitorum longus (EDL) and total hindlimb but increased in soleus, diaphragm, and heart. When assessed in vitro, glucose utilization rates in the absence and presence of insulin were similar in diaphragm, soleus, and EDL muscles isolated from all groups of mice. Thus, in genetically homogeneous mice, HFD feeding led to different metabolic adaptations. Whereas all of the mice became insulin resistant, this was associated, in obese mice, with decreased glucose clearance and hyperinsulinemia and, in lean mice, with increased glucose clearance in the presence of mild insulinopenia. Therefore, increased glucose clearance in lean mice could not be explained by increased insulin levels, indicating that other in vivo mechanisms are triggered to control muscle glucose utilization. These adaptive mechanisms could participate in the protection against the development of obesity.
Abstract:
Copula theory was used to analyze contagion from the U.S. equity market to the BRIC (Brazil, Russia, India and China) and European Union stock markets. The market indexes used for the period between January 1, 2005 and February 27, 2010 are: MXBRIC (BRIC), MXEU (European Union) and MXUS (United States). This article evaluated the adequacy of the main copulas found in the financial literature using the log-likelihood, Akaike information and Bayesian information criteria. The study is novel in the contagion literature in its use of conditional copulas, which allow the increase in correlation between indexes to be calculated with a non-parametric approach. The conditional Symmetrized Joe-Clayton copula provided the best fit to the considered pairs of returns. Results indicate evidence of a contagion effect in both the European Union and BRIC markets, at a 5% significance level. Furthermore, there is also evidence that the contagion of the U.S. financial crisis was more pronounced in the European Union than in the BRIC markets, at a 5% significance level. Therefore, stock portfolios formed by equities from the BRIC countries were able to offer greater protection during the subprime crisis. The results are aligned with recent papers that report an increase in correlation between stock markets, especially in bear markets.
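The model-selection step described in this abstract (comparing candidate copulas by log-likelihood, AIC and BIC) can be sketched as follows. This is a minimal illustration under assumptions, not the paper's code: it fits a plain Clayton copula by maximum likelihood on toy pseudo-observations, whereas the study's best-fitting Symmetrized Joe-Clayton copula has a more involved density; all function names and data are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def clayton_loglik(theta, u, v):
    """Log-likelihood of the bivariate Clayton copula (theta > 0)."""
    return np.sum(
        np.log(1.0 + theta)
        - (1.0 + theta) * (np.log(u) + np.log(v))
        - (2.0 + 1.0 / theta) * np.log(u ** -theta + v ** -theta - 1.0)
    )

def fit_clayton(u, v):
    """Fit theta by MLE and return (theta, loglik, AIC, BIC)."""
    res = minimize_scalar(lambda t: -clayton_loglik(t, u, v),
                          bounds=(1e-4, 20.0), method="bounded")
    theta, ll = res.x, -res.fun
    k, n = 1, len(u)  # one copula parameter
    return theta, ll, 2 * k - 2 * ll, k * np.log(n) - 2 * ll

# toy pseudo-observations with positive dependence
rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=2000)
u, v = norm.cdf(z[:, 0]), norm.cdf(z[:, 1])
theta, ll, aic, bic = fit_clayton(u, v)
```

Each candidate copula family would be fitted this way to the same pseudo-observations, and the family minimizing AIC/BIC retained.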
Abstract:
A new analytical approach for measuring methane in tissues is presented. For the first time, the use of in situ-produced, stably labelled CDH3 provides reliable and precise methane quantification. This method was applied to postmortem samples obtained from two victims to help determine the origin of an explosion. There was evidence of methane in the adipose tissue (82 nmol/g) and cardiac blood (1.3 nmol/g) of one victim, which corresponded to a lethal methane outburst. These results are discussed in the context of the available literature to define an analysis protocol for application in the event of a gas explosion.
Abstract:
We present a study of the continuous-time equations governing the dynamics of a susceptible-infected-susceptible model on heterogeneous metapopulations. These equations have recently been proposed as an alternative formulation for the spread of infectious diseases in metapopulations in a continuous-time framework. Individual-based Monte Carlo simulations of epidemic spread in uncorrelated networks are also performed, revealing good agreement with analytical predictions under the assumption of simultaneous transmission or recovery and migration processes.
Abstract:
We present the derivation of the continuous-time equations governing the limit dynamics of discrete-time reaction-diffusion processes defined on heterogeneous metapopulations. We show that, when a rigorous time limit is performed, the lack of an epidemic threshold in the spread of infections is not limited to metapopulations with a scale-free architecture, as had been predicted from dynamical equations in which reaction and diffusion occur sequentially in time.
Abstract:
The front speed problem for nonuniform reaction rate and diffusion coefficient is studied using singular perturbation analysis, the geometric approach of Hamilton-Jacobi dynamics, and the local-speed approach. Exact and perturbed expressions for the front speed are obtained in the limit of large times. For linear and fractal heterogeneities, the analytic results have been compared with numerical results, showing good agreement. Finally, we derive a general expression for the speed of the front in the case of smooth and weak heterogeneities.
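For context, in the homogeneous limit such expressions reduce to the classical Fisher-KPP pulled-front speed, and for slowly varying coefficients the local-speed approach replaces the constants by their local values. This is a standard baseline stated under that assumption, not the paper's final heterogeneous formula:

```latex
% Fisher-KPP front speed for u_t = D\,u_{xx} + r\,u(1-u):
v = 2\sqrt{rD},
% local-speed approximation for slowly varying r(x), D(x):
v(x) \approx 2\sqrt{r(x)\,D(x)}.
```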
Abstract:
Using a numerical approach, we explore wave-induced fluid-flow effects in partially saturated porous rocks in which the gas-water saturation patterns are governed by mesoscopic heterogeneities associated with the dry frame properties. The link between the dry frame properties and the gas saturation is defined by the assumption of capillary pressure equilibrium, which in the presence of heterogeneity implies that neighbouring regions can exhibit different levels of saturation. To determine the equivalent attenuation and phase velocity of the synthetic rock samples considered in this study, we apply a numerical upscaling procedure, which makes it possible to take into account mesoscopic heterogeneities associated with the dry frame properties as well as spatially continuous variations of the pore fluid properties. The multiscale nature of the fluid saturation is taken into account by locally computing the physical properties of an effective fluid, which are then used for the larger-scale simulations. We consider two sets of numerical experiments to analyse such effects in heterogeneous partially saturated porous media, where the saturation field is determined by variations in porosity and clay content, respectively. In both cases we also evaluate the seismic responses of corresponding binary, patchy-type saturation patterns. Our results indicate that significant attenuation and modest velocity dispersion effects take place in such media for both binary patchy-type and spatially continuous gas saturation patterns, in particular in the presence of relatively small amounts of gas. The numerical experiments also show that the nature of the gas distribution patterns is a critical parameter controlling the seismic responses of these environments, since attenuation and velocity dispersion effects are much more significant and occur over a broader saturation range for binary patchy-type gas-water distributions.
This analysis therefore suggests that the physical mechanisms governing partial saturation should be accounted for when analysing seismic data in a poroelastic framework. In this context, heterogeneities associated with the dry frame properties, which do not play important roles in wave-induced fluid flow processes per se, should be taken into account since they may determine the kind of gas distribution pattern taking place in the porous rock.
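The "effective fluid" computation mentioned in this abstract is commonly done with Wood's (Reuss) average for a uniform gas-water mixture. The sketch below illustrates that standard step under assumed representative moduli and densities; it is not the authors' upscaling code.

```python
def wood_fluid(Sg, Kg, Kw, rho_g, rho_w):
    """Effective bulk modulus (Wood/Reuss average) and density of a
    gas-water mixture at gas saturation Sg, assuming uniform mixing."""
    Sw = 1.0 - Sg
    K = 1.0 / (Sg / Kg + Sw / Kw)   # isostress (Reuss) average of moduli
    rho = Sg * rho_g + Sw * rho_w   # volume-weighted density
    return K, rho

# representative values: water ~2.25 GPa, gas ~10 MPa (assumptions)
K_full, _ = wood_fluid(0.0, 1.0e7, 2.25e9, 100.0, 1000.0)
K_10pct, rho_10pct = wood_fluid(0.1, 1.0e7, 2.25e9, 100.0, 1000.0)
```

With these values, even 10% gas lowers the effective fluid modulus by more than an order of magnitude, which is consistent with the observation above that relatively small amounts of gas already produce significant attenuation.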
Abstract:
Gene expression patterns are a key feature in understanding gene function, notably in development. Comparing gene expression patterns between animals is a major step in the study of gene function as well as of animal evolution. It also provides a link between genes and phenotypes. Thus we have developed Bgee, a database designed to compare expression patterns between animals, by implementing ontologies describing anatomies and developmental stages of species, and then designing homology relationships between anatomies and comparison criteria between developmental stages. To define homology relationships between anatomical features we have developed the software Homolonto, which uses a modified ontology alignment approach to propose homology relationships between ontologies. Bgee then uses these aligned ontologies, onto which heterogeneous expression data types are mapped. These already include microarrays and ESTs.
Abstract:
We consider stock market contagion to be a significant increase in cross-market linkages after a shock to one country or group of countries. Under this definition we study whether contagion occurred from the U.S. financial crisis to the rest of the major stock markets in the world using the adjusted (unconditional) correlation coefficient approach (Forbes and Rigobon, 2002), which consists of testing whether average cross-market correlations increase significantly during the relevant period of turmoil. We do not reject the null hypothesis of interdependence in favour of contagion if the increase in correlation only suggests a continuation of high linkages in all states of the world. Moreover, if contagion occurs, this would justify the intervention of the IMF and the sudden portfolio restructuring during the period under study.
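The adjustment in Forbes and Rigobon (2002) corrects the turmoil-period correlation for the mechanical increase caused by higher source-market volatility; a minimal sketch (function and variable names assumed):

```python
import math

def adjusted_corr(rho, var_turmoil, var_calm):
    """Heteroskedasticity-adjusted (unconditional) correlation of
    Forbes & Rigobon (2002): rho* = rho / sqrt(1 + d*(1 - rho**2)),
    where d is the relative increase in source-market variance."""
    d = var_turmoil / var_calm - 1.0
    return rho / math.sqrt(1.0 + d * (1.0 - rho ** 2))

# doubling the source-market variance can inflate a conditional
# correlation of 0.80 from an adjusted linkage of about 0.69
adj = adjusted_corr(0.80, var_turmoil=2.0, var_calm=1.0)
```

Contagion is declared only if the adjusted turmoil-period correlation still significantly exceeds the calm-period correlation; otherwise the observed increase is interpreted as interdependence.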
Abstract:
Background: Systematic approaches for identifying proteins involved in different types of cancer are needed. Experimental techniques such as microarrays are being used to characterize cancer, but validating their results can be a laborious task. Computational approaches are used to prioritize genes putatively involved in cancer, usually based on further analysis of experimental data. Results: We implemented a systematic method using the PIANA software that predicts cancer involvement of genes by integrating heterogeneous datasets. Specifically, we produced lists of genes likely to be involved in cancer by relying on: (i) protein-protein interactions; (ii) differential expression data; and (iii) structural and functional properties of cancer genes. The integrative approach that combines multiple sources of data achieved positive predictive values ranging from 23% (on a list of 811 genes) to 73% (on a list of 22 genes), outperforming the use of any of the data sources alone. We analyze a list of 20 cancer gene predictions, finding that most of them have recently been linked to cancer in the literature. Conclusion: Our approach to identifying and prioritizing candidate cancer genes can be used to produce lists of genes likely to be involved in cancer. Our results suggest that differential expression studies yielding high numbers of candidate cancer genes can be filtered using protein interaction networks.
Abstract:
The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data present a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.
Abstract:
Based on accepted advances in the marketing, economics, consumer behavior, and satisfaction literatures, we develop a micro-foundations model of a firm that needs to manage the quality of a product that is inherently heterogeneous in the presence of varying customer tastes or expectations for quality. Our model blends elements of the returns-to-quality, customer lifetime value, and service profit chain approaches to marketing. The model is then used to explain several empirical results in the marketing literature by explicitly articulating the trade-offs between customer satisfaction and the costs (including opportunity costs) of quality. In this environment firms will find it optimal to allow some customers to go unsatisfied. We show that the expected number of repeat purchases by an individual customer is endogenous to the firm's choice of quality, indicating that the number of purchases cannot be chosen freely to estimate a customer's lifetime value.