948 results for HETEROGENEOUS ELASTOGRAPHY PHANTOMS
Abstract:
In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. How to multiplex two or more diverse traffic classes while providing different quality of service (QOS) requirements is a complicated open problem. Two distinct options are available: integration and segregation. In the integration approach, all the traffic from different connections is multiplexed onto one VP. This implies that the most restrictive QOS requirements must be applied to all services. Link utilization is therefore decreased, because unnecessarily stringent QOS is provided to some connections. In the segregation approach, the problem is much simplified by separating the different types of traffic and assigning each a VP with dedicated resources (buffers and links). Resources may then be utilized inefficiently, because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class using the new convolution approach. Numerical results are presented both to compare the required capacity and to identify the conditions under which each approach is preferred.
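The convolution idea in this abstract can be illustrated with a small sketch. Assuming simple on-off sources (an assumption of this sketch, not stated in the abstract), each connection demands its peak rate with probability mean/peak and zero otherwise; convolving the per-connection distributions yields the aggregate-demand distribution, and the PC is the tail mass above the link capacity. All names and parameter values below are illustrative.

```python
# Sketch of a convolution approach to the probability of congestion (PC):
# each on-off connection demands its peak rate with probability p
# (its activity factor) and 0 otherwise. Convolving these two-point
# distributions gives the aggregate-demand distribution;
# PC = P(aggregate demand > link capacity).
from collections import defaultdict

def convolve(dist_a, dist_b):
    """Convolve two discrete bandwidth distributions {rate: prob}."""
    out = defaultdict(float)
    for ra, pa in dist_a.items():
        for rb, pb in dist_b.items():
            out[ra + rb] += pa * pb
    return dict(out)

def probability_of_congestion(connections, capacity):
    """connections: list of (peak_rate, activity_prob); capacity in same units."""
    agg = {0: 1.0}  # empty link: demand 0 with probability 1
    for peak, p in connections:
        agg = convolve(agg, {0: 1.0 - p, peak: p})
    return sum(prob for rate, prob in agg.items() if rate > capacity)

# Ten bursty sources, peak 10 Mbit/s, active 30% of the time, 60 Mbit/s link:
pc = probability_of_congestion([(10, 0.3)] * 10, 60)
```

For identical sources this reduces to a binomial tail; the convolution form is what lets heterogeneous traffic classes be mixed on one VP.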
Abstract:
Given the very large amount of data obtained every day through population surveys, much new research could use this information instead of collecting new samples. Unfortunately, relevant data are often dispersed across different files obtained through different sampling designs. Data fusion is a set of methods used to combine information from different sources into a single dataset. In this article, we are interested in a specific problem: the fusion of two data files, one of which is quite small. We propose a model-based procedure combining logistic regression with an Expectation-Maximization algorithm. Results show that, despite the lack of data, this procedure can perform better than standard matching procedures.
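As a rough illustration of model-based fusion, here is a simplified sketch: it shows only a single logistic-imputation step, not the full logistic-plus-EM procedure proposed in the article, and all variable names, sizes and parameter values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def fit_logistic(X, z, iters=500, lr=0.5):
    """Plain logistic regression by gradient ascent; X includes a bias column."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        beta += lr * X.T @ (z - sigmoid(X @ beta)) / len(z)
    return beta

# File A (small): common variable x plus the target variable z.
# File B (large): x only; z must be imputed to "fuse" the two files.
n_a, n_b = 100, 5000
x_a = rng.normal(size=n_a)
x_b = rng.normal(size=n_b)
true_beta = np.array([-0.5, 1.5])          # ground truth for the simulation
X_a = np.column_stack([np.ones(n_a), x_a])
X_b = np.column_stack([np.ones(n_b), x_b])
z_a = (rng.random(n_a) < sigmoid(X_a @ true_beta)).astype(float)

beta_hat = fit_logistic(X_a, z_a)          # model fitted on the small file
z_b_prob = sigmoid(X_b @ beta_hat)         # fused dataset: x_b plus P(z=1|x)
```

The article's procedure additionally iterates this with an EM step to handle the missing-data structure; the sketch only conveys the model-based (rather than nearest-neighbour matching) flavour of the approach.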
Abstract:
Neurocysticercosis (NC) is a clinically and radiologically heterogeneous parasitic disease caused by the establishment of larval Taenia solium in the human central nervous system. Host and/or parasite variations may be related to this observed heterogeneity. Genetic differences between pig and human-derived T. solium cysticerci have been reported previously. In this study, 28 cysticerci were surgically removed from 12 human NC patients, the mitochondrial gene that encodes cytochrome b was amplified from the cysticerci and genetic variations that may be related to NC heterogeneity were characterised. Nine different haplotypes (Ht), which were clustered in four haplogroups (Hg), were identified. Hg 3 and 4 exhibited a tendency to associate with age and gender, respectively. However, no significant associations were found between NC heterogeneity and the different T. solium cysticerci Ht or Hg. Parasite variants obtained from patients with similar NC clinical or radiological features were genetically closer than those found in groups of patients with a different NC profile when using the Mantel test. Overall, this study establishes the presence of genetic differences in the Cytb gene of T. solium isolated from human cysticerci and suggests that parasite variation could contribute to NC heterogeneity.
Abstract:
In this paper we present a Linguistic Meta-Model (LMM) allowing a semiotic-cognitive representation of knowledge. LMM is freely available and integrates the schemata of linguistic knowledge resources, such as WordNet and FrameNet, as well as foundational ontologies, such as DOLCE and its extensions. In addition, LMM is able to deal with multilinguality and to represent individuals and facts in an open domain perspective.
Abstract:
Etravirine (ETV) is recommended in combination with a boosted protease inhibitor plus an optimized background regimen for salvage therapy, but there is limited experience with its use in combination with two nucleos(t)ide reverse-transcriptase inhibitors (NRTIs). This multicenter study aimed to assess the efficacy of this combination in two scenarios: group A) subjects without virologic failure on, or no experience with, non-nucleoside reverse-transcriptase inhibitors (NNRTIs) switched due to adverse events, and group B) subjects switched after a virologic failure on an efavirenz- or nevirapine-based regimen. The primary endpoint was efficacy at 52 weeks analysed by intention-to-treat. Virologic failure was defined as the inability to suppress plasma HIV-RNA to <50 copies/mL after 24 weeks on treatment, or a confirmed viral load >200 copies/mL in patients who had previously achieved viral suppression or had an undetectable viral load at inclusion. Two hundred and eighty-seven patients were included. Treatment efficacy rates in groups A and B were 88.0% (CI95, 83.9-92.1%) and 77.4% (CI95, 65.0-89.7%), respectively; the rates reached 97.2% (CI95, 95.1-99.3%) and 90.5% (CI95, 81.7-99.3%) by on-treatment analysis. Once-daily ETV treatment was as effective as the twice-daily dosing regimen. Grade 1-2 adverse events were observed, motivating a treatment switch in 4.2% of the subjects. In conclusion, ETV (once or twice daily) plus two analogs is a suitable, well-tolerated combination, both as a switching strategy and after failure with first-generation NNRTIs, ensuring full drug activity. Trial registration: ClinicalTrials.gov NCT01437241.
Abstract:
C57BL/6J mice were fed a high-fat, carbohydrate-free diet (HFD) for 9 mo. Approximately 50% of the mice became obese and diabetic (ObD), approximately 10% lean and diabetic (LD), approximately 10% lean and nondiabetic (LnD), and approximately 30% displayed an intermediate phenotype. All of the HFD mice were insulin resistant. In the fasted state, whole body glucose clearance was reduced in ObD mice, unchanged in the LD mice, and increased in the LnD mice compared with the normal-chow mice. Because fasted ObD mice were hyperinsulinemic and the lean mice slightly insulinopenic, there was no correlation between insulin levels and increased glucose utilization. In vivo, tissue glucose uptake assessed by 2-[(14)C]deoxyglucose accumulation was reduced in most muscles in the ObD mice but increased in the LnD mice compared with the values of the control mice. In the LD mice, the glucose uptake rates were reduced in extensor digitorum longus (EDL) and total hindlimb but increased in soleus, diaphragm, and heart. When assessed in vitro, glucose utilization rates in the absence and presence of insulin were similar in diaphragm, soleus, and EDL muscles isolated from all groups of mice. Thus, in genetically homogeneous mice, HFD feeding led to different metabolic adaptations. Whereas all of the mice became insulin resistant, this was associated, in obese mice, with decreased glucose clearance and hyperinsulinemia and, in lean mice, with increased glucose clearance in the presence of mild insulinopenia. Therefore, increased glucose clearance in lean mice could not be explained by increased insulin level, indicating that other in vivo mechanisms are triggered to control muscle glucose utilization. These adaptive mechanisms could participate in the protection against the development of obesity.
Abstract:
We present a study of the continuous-time equations governing the dynamics of a susceptible-infected-susceptible (SIS) model on heterogeneous metapopulations. These equations have recently been proposed as an alternative formulation for the spread of infectious diseases in metapopulations in a continuous-time framework. Individual-based Monte Carlo simulations of epidemic spread in uncorrelated networks are also performed, revealing good agreement with analytical predictions under the assumption of simultaneous transmission or recovery and migration processes.
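A minimal individual-based sketch of the simultaneous-processes assumption mentioned above. This is illustrative only: the binomial time-step discretization, the patch structure and all rate values are choices of this sketch, not of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_sis_metapop(adj, S, I, beta, delta, mu, steps, dt=0.01):
    """Stochastic SIS on a metapopulation. adj: neighbour lists; S, I:
    integer susceptible/infected counts per patch. In every small step dt,
    reaction (infection at rate beta*I/N per susceptible, recovery at rate
    delta per infected) and diffusion (migration at rate mu along edges)
    are applied together, mimicking simultaneous transmission/recovery
    and migration rather than sequential updating."""
    S = np.array(S, dtype=np.int64)
    I = np.array(I, dtype=np.int64)
    for _ in range(steps):
        N = np.maximum(S + I, 1)
        new_inf = rng.binomial(S, 1.0 - np.exp(-beta * I / N * dt))
        new_rec = rng.binomial(I, 1.0 - np.exp(-delta * dt))
        S += new_rec - new_inf
        I += new_inf - new_rec
        p_move = 1.0 - np.exp(-mu * dt)
        for X in (S, I):                      # migrate S and I alike
            movers = rng.binomial(X, p_move)
            for i, m in enumerate(movers):
                if m == 0 or not adj[i]:
                    continue                  # nobody leaves a dead-end patch
                X[i] -= m
                for j in rng.choice(adj[i], size=m):
                    X[j] += 1
    return S, I

# Two coupled patches, epidemic seeded in patch 0:
S, I = simulate_sis_metapop([[1], [0]], [99, 100], [1, 0],
                            beta=2.0, delta=1.0, mu=0.5, steps=500)
```

By construction, the total population is conserved and counts stay non-negative, which is a useful sanity check on any such simulation.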
Abstract:
We present the derivation of the continuous-time equations governing the limit dynamics of discrete-time reaction-diffusion processes defined on heterogeneous metapopulations. We show that, when a rigorous time limit is performed, the lack of an epidemic threshold in the spread of infections is not limited to metapopulations with a scale-free architecture, as had been predicted from dynamical equations in which reaction and diffusion occur sequentially in time.
Abstract:
Black-blood MR coronary vessel wall imaging may become a powerful tool for the quantitative and noninvasive assessment of atherosclerosis and positive arterial remodeling. Although dual-inversion recovery is currently the gold standard, optimal lumen-to-vessel wall contrast is sometimes difficult to obtain, and the time window available for imaging is limited due to competing requirements between the blood signal nulling time and the period of minimal myocardial motion. Further, atherosclerosis is a spatially heterogeneous disease, and imaging at multiple anatomic levels of the coronary circulation is mandatory. However, this requirement of enhanced volumetric coverage comes at the expense of scanning time. Phase-sensitive inversion recovery has been shown to be very valuable for enhancing tissue-tissue contrast and for making inversion recovery imaging less sensitive to the tissue signal nulling time. This work enables multislice black-blood coronary vessel wall imaging in a single breath hold by extending phase-sensitive inversion recovery to phase-sensitive dual-inversion recovery and combining it with spiral imaging, while relaxing constraints related to the blood signal nulling time and the period of minimal myocardial motion.
Abstract:
The front speed problem for nonuniform reaction rate and diffusion coefficient is studied by using singular perturbation analysis, the geometric approach of Hamilton-Jacobi dynamics, and the local speed approach. Exact and perturbed expressions for the front speed are obtained in the limit of large times. For linear and fractal heterogeneities, the analytic results are compared with numerical results, showing good agreement. Finally, we reach a general expression for the speed of the front in the case of smooth, weak heterogeneities.
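For context, a sketch of the homogeneous baseline that such results reduce to, together with the local-speed heuristic for slowly varying media. This is a standard textbook summary under the stated assumptions; the exact expressions derived in the paper are not reproduced here.

```latex
% Fisher-KPP equation with constant reaction rate r and diffusivity D;
% the Hamilton-Jacobi (large-time) approach recovers the classical speed:
\partial_t u = D\,\partial_{xx}u + r\,u(1-u),
\qquad
v = \min_{\lambda>0}\Bigl(D\lambda + \frac{r}{\lambda}\Bigr) = 2\sqrt{rD}.
% Local speed heuristic for smooth, weak heterogeneities r(x), D(x):
v(x) \approx 2\sqrt{r(x)\,D(x)}.
```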
Abstract:
Using a numerical approach, we explore wave-induced fluid flow effects in partially saturated porous rocks in which the gas-water saturation patterns are governed by mesoscopic heterogeneities associated with the dry frame properties. The link between the dry frame properties and the gas saturation is defined by the assumption of capillary pressure equilibrium, which in the presence of heterogeneity implies that neighbouring regions can exhibit different levels of saturation. To determine the equivalent attenuation and phase velocity of the synthetic rock samples considered in this study, we apply a numerical upscaling procedure, which makes it possible to take into account mesoscopic heterogeneities associated with the dry frame properties as well as spatially continuous variations of the pore fluid properties. The multiscale nature of the fluid saturation is taken into account by locally computing the physical properties of an effective fluid, which are then used for the larger-scale simulations. We consider two sets of numerical experiments to analyse such effects in heterogeneous partially saturated porous media, where the saturation field is determined by variations in porosity and clay content, respectively. In both cases we also evaluate the seismic responses of the corresponding binary, patchy-type saturation patterns. Our results indicate that significant attenuation and modest velocity dispersion effects take place in these media for both binary patchy-type and spatially continuous gas saturation patterns, in particular in the presence of relatively small amounts of gas. The numerical experiments also show that the nature of the gas distribution patterns is a critical parameter controlling the seismic responses of these environments, since attenuation and velocity dispersion effects are much more significant and occur over a broader saturation range for binary patchy-type gas-water distributions. This analysis therefore suggests that the physical mechanisms governing partial saturation should be accounted for when analysing seismic data in a poroelastic framework. In this context, heterogeneities associated with the dry frame properties, which do not play important roles in wave-induced fluid flow processes per se, should be taken into account since they may determine the kind of gas distribution pattern taking place in the porous rock.
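The "effective fluid" step described above is commonly implemented with Wood's (Reuss) average for a fluid mixture; a minimal sketch under that assumption follows. The paper's exact upscaling procedure is more involved, and the numbers below are only illustrative.

```python
def effective_fluid(s_gas, k_gas, k_water, rho_gas, rho_water):
    """Wood's (Reuss) average for the bulk modulus of a gas-water mixture,
    and a volume-weighted average for its density. Moduli in Pa,
    densities in kg/m^3, s_gas = gas saturation (fraction)."""
    k_fl = 1.0 / (s_gas / k_gas + (1.0 - s_gas) / k_water)
    rho_fl = s_gas * rho_gas + (1.0 - s_gas) * rho_water
    return k_fl, rho_fl

# Even a small amount of gas collapses the mixture modulus, which is why
# attenuation can be strong at relatively low gas saturations:
k_mix, rho_mix = effective_fluid(0.05, 0.01e9, 2.25e9, 100.0, 1000.0)  # 5% gas
```

Because the harmonic (Reuss) average is dominated by the softest constituent, the mixture modulus drops by an order of magnitude well before the density changes appreciably.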
Abstract:
Gene expression patterns are a key feature in understanding gene function, notably in development. Comparing gene expression patterns between animals is a major step in the study of gene function as well as of animal evolution. It also provides a link between genes and phenotypes. We have therefore developed Bgee, a database designed to compare expression patterns between animals, by implementing ontologies describing the anatomy and developmental stages of each species, and then defining homology relationships between anatomies and comparison criteria between developmental stages. To define homology relationships between anatomical features we have developed the software Homolonto, which uses a modified ontology alignment approach to propose homology relationships between ontologies. Bgee then uses these aligned ontologies, onto which heterogeneous expression data types are mapped; these already include microarrays and ESTs.
Abstract:
Background: Systematic approaches for identifying proteins involved in different types of cancer are needed. Experimental techniques such as microarrays are being used to characterize cancer, but validating their results can be a laborious task. Computational approaches are used to prioritize between genes putatively involved in cancer, usually based on further analyzing experimental data. Results: We implemented a systematic method using the PIANA software that predicts cancer involvement of genes by integrating heterogeneous datasets. Specifically, we produced lists of genes likely to be involved in cancer by relying on: (i) protein-protein interactions; (ii) differential expression data; and (iii) structural and functional properties of cancer genes. The integrative approach that combines multiple sources of data obtained positive predictive values ranging from 23% (on a list of 811 genes) to 73% (on a list of 22 genes), outperforming the use of any of the data sources alone. We analyze a list of 20 cancer gene predictions, finding that most of them have recently been linked to cancer in the literature. Conclusion: Our approach to identifying and prioritizing candidate cancer genes can be used to produce lists of genes likely to be involved in cancer. Our results suggest that differential expression studies yielding high numbers of candidate cancer genes can be filtered using protein interaction networks.
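A toy version of this kind of evidence integration, purely for illustration: PIANA's actual scoring and data sources are far richer than this, and all gene names and inputs below are made up.

```python
def prioritize(candidates, partners, known_cancer, diff_expr, cancer_like):
    """Score each candidate gene by three lines of evidence and rank by
    total: (i) interacts with a known cancer gene, (ii) differentially
    expressed, (iii) has structural/functional properties typical of
    cancer genes. Returns (ranked list, score dict)."""
    scores = {}
    for g in candidates:
        scores[g] = (bool(partners.get(g, set()) & known_cancer)  # (i)
                     + (g in diff_expr)                           # (ii)
                     + (g in cancer_like))                        # (iii)
    ranked = sorted(candidates, key=lambda g: (-scores[g], g))
    return ranked, scores

# Hypothetical inputs:
ranked, scores = prioritize(
    candidates=["geneA", "geneB", "geneC"],
    partners={"geneA": {"TP53"}, "geneB": {"geneC"}},
    known_cancer={"TP53"},
    diff_expr={"geneA", "geneB"},
    cancer_like={"geneA"},
)
```

The point of the abstract is exactly this intersection effect: a long differential-expression candidate list is filtered down by requiring corroborating interaction and property evidence.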
Abstract:
The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data present a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.
Abstract:
Based on accepted advances in the marketing, economics, consumer behavior, and satisfaction literatures, we develop a micro-foundations model of a firm that needs to manage the quality of a product that is inherently heterogeneous in the presence of varying customer tastes or expectations for quality. Our model blends elements of the returns-to-quality, customer lifetime value, and service profit chain approaches to marketing. The model is then used to explain several empirical results from the marketing literature by explicitly articulating the trade-offs between customer satisfaction and the costs (including opportunity costs) of quality. In this environment, firms will find it optimal to allow some customers to go unsatisfied. We show that the expected number of repeated purchases by an individual customer is endogenous to the firm's choice of quality, indicating that the number of purchases cannot be chosen freely to estimate a customer's lifetime value.