982 results for Inverse Approach


Relevance: 20.00%

Abstract:

High-throughput technologies are now used to generate more than one type of data from the same biological samples. To properly integrate such data, we propose using co-modules, which describe coherent patterns across paired data sets, and we conceive several modular methods for their identification. We first test these methods using in silico data, demonstrating that the integrative scheme of our Ping-Pong Algorithm uncovers drug-gene associations more accurately when considering noisy or complex data. Second, we provide an extensive comparative study using the gene-expression and drug-response data from the NCI-60 cell lines. Using information from the DrugBank and Connectivity Map databases, we show that the Ping-Pong Algorithm predicts drug-gene associations significantly better than other methods. Co-modules provide insights into possible mechanisms of action for a wide range of drugs and suggest new targets for therapy.
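The abstract does not spell out the Ping-Pong Algorithm itself; as an illustrative sketch of the general "ping-pong" idea, the following alternates a score vector between two toy matrices that share the sample dimension, reducing the iteration to a power-method-like alternation. The function name, matrix shapes, and the planted-signal demo are all assumptions, not the authors' implementation.

```python
import numpy as np

def ping_pong_scores(E, R, n_iter=100, seed=0):
    """Alternate ("ping-pong") between a gene-expression matrix E
    (genes x cell lines) and a drug-response matrix R (drugs x cell
    lines): project a gene score vector onto the shared cell-line
    dimension, over to drugs, and back.  Large |score| entries at the
    fixed point suggest a coherent gene-drug co-module."""
    rng = np.random.default_rng(seed)
    g = rng.standard_normal(E.shape[0])          # initial gene scores
    for _ in range(n_iter):
        d = R @ (E.T @ g)                        # genes -> cell lines -> drugs
        d /= np.linalg.norm(d) + 1e-12
        g = E @ (R.T @ d)                        # drugs -> cell lines -> genes
        g /= np.linalg.norm(g) + 1e-12
    return g, d

# Toy data with a planted co-module: genes 0-4 and drugs 0-2 respond
# coherently on cell lines 0-4, on top of background noise.
rng = np.random.default_rng(1)
E = 0.3 * rng.standard_normal((20, 10)); E[:5, :5] += 3.0
R = 0.3 * rng.standard_normal((8, 10));  R[:3, :5] += 3.0
g, d = ping_pong_scores(E, R)
top_genes = set(np.argsort(-np.abs(g))[:5])
top_drugs = set(np.argsort(-np.abs(d))[:3])
```

On this toy example the top-scoring genes and drugs recover the planted block, which is the behavior the co-module idea relies on.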

Relevance: 20.00%

Abstract:

Solving multi-stage oligopoly models by backward induction can easily become a complex task when firms are multi-product and demands are derived from a nested logit framework. This paper shows that, under the assumption that within-segment firm shares are equal across segments, the analytical expression for equilibrium profits can be substantially simplified. The size of the error arising when this condition does not hold perfectly is also computed. Through numerical examples, it is shown that the error is rather small in general. Therefore, this assumption makes it possible to gain analytical tractability in a class of models that has been used to approach relevant policy questions, such as firm entry in an industry or the relation between competition and location. The simplifying approach proposed in this paper is aimed at helping to improve these types of models so that they reach more accurate recommendations.
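The paper's derivation is not reproduced in the abstract; as a minimal numeric sketch, the standard nested logit share formulas below can be used to check the symmetry condition the paper exploits. The two-segment example and the value of the nesting parameter are assumptions chosen for illustration.

```python
import numpy as np

def nested_logit_shares(delta, nests, lam):
    """Market shares s_j = s_{j|g} * s_g in a nested logit without an
    outside good: delta[j] is product j's mean utility, nests[j] its
    segment, and lam in (0, 1] the nesting parameter."""
    delta = np.asarray(delta, float)
    nests = np.asarray(nests)
    shares = np.zeros_like(delta)
    inclusive = {}
    for g in np.unique(nests):
        idx = nests == g
        ev = np.exp(delta[idx] / lam)
        inclusive[g] = lam * np.log(ev.sum())      # inclusive value I_g
        shares[idx] = ev / ev.sum()                # within-segment s_{j|g}
    denom = sum(np.exp(I) for I in inclusive.values())
    for g, I in inclusive.items():
        shares[nests == g] *= np.exp(I) / denom    # segment share s_g
    return shares

# Two segments, two firms; firm 1's products have mean utility 1.0 in
# both segments, firm 2's have 0.5 -- so within-segment firm shares
# coincide across segments, which is the paper's simplifying condition.
delta = np.array([1.0, 0.5, 1.0, 0.5])
nests = np.array([0, 0, 1, 1])
s = nested_logit_shares(delta, nests, lam=0.7)
```

With symmetric mean utilities, firm 1's within-segment share is identical in both segments, so the simplified profit expression applies exactly; asymmetric utilities would produce the small error the paper quantifies.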

Relevance: 20.00%

Abstract:

With the availability of new-generation sequencing technologies, bacterial genome projects have undergone a major boost. Still, chromosome completion requires a costly and time-consuming gap-closure stage, especially when the genome contains highly repetitive elements. However, incomplete genome data may be sufficiently informative to derive the pursued information. For emerging pathogens, i.e. newly identified pathogens, withholding genome data during the gap-closure stage is clearly medically counterproductive. We thus investigated the feasibility of a dirty genome approach, i.e. the release of unfinished genome sequences to develop serological diagnostic tools. We showed that almost the whole genome sequence of the emerging pathogen Parachlamydia acanthamoebae was retrieved even with relatively short reads from Genome Sequencer 20 and Solexa. The bacterial proteome was analyzed to select immunogenic proteins, which were then expressed and used to elaborate the first steps of an ELISA. This work constitutes a proof of principle for the dirty genome approach, i.e. the use of unfinished genome sequences of pathogenic bacteria, coupled with proteomics, to rapidly identify new immunogenic proteins useful for developing specific diagnostic tests such as ELISA, immunohistochemistry and direct antigen detection. Although applied here to an emerging pathogen, this combined dirty-genome-sequencing/proteomics approach may be used for any pathogen for which better diagnostics are needed. These genome sequences may also be very useful for developing DNA-based diagnostic tests. All these diagnostic tools will allow further evaluation of the pathogenic potential of this obligate intracellular bacterium.

Relevance: 20.00%

Abstract:

The human organism is interpenetrated by the world of microorganisms from conception until death. This interpenetration involves different levels of interactions between the partners, including trophic exchanges, bi-directional cell signaling and gene activation, besides genetic and epigenetic phenomena, and tends towards mutual adaptation and coevolution. Since these processes are critical for the survival of individuals and species, they rely on the existence of a complex organization of adaptive systems aiming at two apparently conflicting purposes: the maintenance of the internal coherence of each partner, and a mutually advantageous coexistence and progressive adaptation between them. Humans possess three adaptive systems: the nervous, the endocrine and the immune system, each internally organized into subsystems functionally connected by intraconnections to maintain the internal coherence of the system. The three adaptive systems aim at the maintenance of the internal coherence of the organism and are functionally linked by interconnections, in such a way that what happens to one is immediately sensed by the others. The different communities of infectious agents that live within the organism are also organized into functional networks. The members of each community are linked by intraconnections, represented by mutual trophic, metabolic and other influences, while the different infectious communities affect each other through interconnections. Furthermore, by means of its adaptive systems, the organism influences and is influenced by the microbial communities through transconnections. It is proposed that these highly complex and dynamic networks, involving gene exchange and epigenetic phenomena, represent major coevolutionary forces for humans and microorganisms.

Relevance: 20.00%

Abstract:

In a distribution problem, and specifically in bankruptcy issues, the Proportional (P) and the Egalitarian (EA) divisions are two of the most popular ways to resolve the conflict. The Constrained Equal Awards rule (CEA) was introduced in the bankruptcy literature to ensure that no agent receives more than her claim, a problem that can arise when using the egalitarian division. We propose an alternative modification, by using a convex combination of P and EA. The recursive application of this new rule finishes at the CEA rule. Our solution concept ensures a minimum amount to each agent, and distributes the remaining estate in a proportional way. Keywords: Bankruptcy problems, Proportional rule, Equal Awards, Convex combination of rules, Lorenz dominance. JEL classification: C71, D63, D71.
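The rules named in the abstract are standard in the bankruptcy literature and can be sketched concretely. Below, P and CEA follow their textbook definitions, and `convex_step` shows one step of a theta-weighted combination of plain equal division with P; the weight theta and the example numbers are illustrative assumptions, not the paper's exact recursion.

```python
import numpy as np

def proportional(claims, estate):
    """P: split the estate in proportion to claims."""
    c = np.asarray(claims, float)
    return estate * c / c.sum()

def cea(claims, estate):
    """Constrained Equal Awards: a common award level lam, capped at
    each claim, with sum(min(c_i, lam)) == estate; found by bisection."""
    c = np.asarray(claims, float)
    lo, hi = 0.0, c.max()
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if np.minimum(c, mid).sum() < estate:
            lo = mid
        else:
            hi = mid
    return np.minimum(c, lo)

def convex_step(claims, estate, theta):
    """One step of theta*EA + (1-theta)*P, where EA is a plain equal
    division (which may overshoot small claims -- the problem the CEA
    cap, and the recursion described in the abstract, repair)."""
    c = np.asarray(claims, float)
    return theta * estate / len(c) + (1 - theta) * proportional(c, estate)
```

For claims (10, 30, 60) and an estate of 50, CEA awards (10, 20, 20): equal awards of 20 for everyone except agent 1, who is capped at her claim of 10.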

Relevance: 20.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high- resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. 
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
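The thesis's exact parameterization is not given in the abstract, but the classical gradual-deformation rule combines two independent zero-mean Gaussian realizations sharing the same covariance as m(theta) = m1*cos(theta) + m2*sin(theta), which preserves that covariance for every theta. A minimal sketch under that assumption, showing why theta works as a single perturbation-strength knob for an MCMC proposal:

```python
import numpy as np

def gradual_deformation(m1, m2, theta):
    """Combine two independent zero-mean Gaussian realizations m1, m2
    drawn from the same covariance model.  Because cos^2 + sin^2 = 1,
    the result keeps that covariance for any theta; a small theta gives
    a small perturbation of m1, so theta tunes the MCMC proposal
    strength through a single parameter."""
    return np.cos(theta) * m1 + np.sin(theta) * m2

# Empirical check of (co)variance preservation and perturbation size:
# the deformed field stays unit-variance and correlates with m1 at
# exactly cos(theta) in expectation.
rng = np.random.default_rng(0)
m1 = rng.standard_normal(200_000)
m2 = rng.standard_normal(200_000)
m = gradual_deformation(m1, m2, 0.3)
```

In an MCMC chain, m2 would be redrawn from the geostatistical model at each step, so every proposal is a valid realization regardless of how many model parameters there are.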

Relevance: 20.00%

Abstract:

The solution to the ‘Contested Garment Problem’ proposed in the Babylonian Talmud suggests that each agent should receive at least some part of the resources whenever the demand exceeds the available amount. In this context, we propose a new method to define lower bounds on awards, an idea that has underlain the theoretical analysis of bankruptcy problems from its beginning (O’Neill, 1982) to the present day (Dominguez and Thomson, 2006). Specifically, starting from the fact that a society establishes its own set of ‘Commonly Accepted Equity Principles’, our proposal guarantees each agent the smallest amount she obtains across all the admissible rules. As in general this new bound will not exhaust the estate, we analyze its recursive application for different sets of equity principles. Keywords: Bankruptcy problems, Bankruptcy rules, Lower bounds, Recursive process
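The abstract fixes neither the admissible set of equity principles nor the recursion's details; as a hedged sketch, take three classical rules (Proportional, Constrained Equal Awards, Constrained Equal Losses), award each agent the per-agent minimum across them, then shrink claims and estate by those awards and repeat. The particular three-rule set is an assumption for illustration.

```python
import numpy as np

def proportional(c, E):
    """P: divide the estate proportionally to claims."""
    c = np.asarray(c, float)
    return E * c / c.sum()

def cea(c, E):
    """Constrained Equal Awards: min(c_i, lam), award level lam found
    by bisection so the estate is exactly exhausted."""
    c = np.asarray(c, float)
    lo, hi = 0.0, c.max()
    for _ in range(100):
        mid = (lo + hi) / 2.0
        lo, hi = (mid, hi) if np.minimum(c, mid).sum() < E else (lo, mid)
    return np.minimum(c, lo)

def cel(c, E):
    """Constrained Equal Losses: max(c_i - lam, 0), lam by bisection."""
    c = np.asarray(c, float)
    lo, hi = 0.0, c.max()
    for _ in range(100):
        mid = (lo + hi) / 2.0
        lo, hi = (lo, mid) if np.maximum(c - mid, 0.0).sum() < E else (mid, hi)
    return np.maximum(c - lo, 0.0)

def recursive_lower_bound(c, E, rules, rounds=50):
    """Award every agent the minimum she receives across the admissible
    rules, shrink claims and estate accordingly, and repeat."""
    c = np.asarray(c, float)
    awarded = np.zeros_like(c)
    for _ in range(rounds):
        b = np.min([rule(c, E) for rule in rules], axis=0)
        awarded += b
        c = c - b
        E = E - b.sum()
        if E < 1e-12:
            break
    return awarded
```

As the abstract notes, a single round of this bound generally leaves part of the estate undistributed, which is exactly why the recursive application is studied.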

Relevance: 20.00%

Abstract:

BACKGROUND AND PURPOSE: Recent evidence suggests that there may be more than one Gilles de la Tourette syndrome (GTS)/tic disorder phenotype. However, little is known about the common patterns of these GTS/tic disorder-related comorbidities. In addition, sex-specific phenomenological data of GTS/tic disorder-affected adults are rare. Therefore, this community-based study used latent class analyses (LCA) to investigate sex-related and non-sex-related subtypes of GTS/tic disorders and their most common comorbidities. METHODS: The data were drawn from the PsyCoLaus study (n = 3691), a population-based survey conducted in Lausanne, Switzerland. LCA were performed on the data of 80 subjects manifesting motor/vocal tics during their childhood/adolescence. Comorbid attention-deficit hyperactivity disorder (ADHD), obsessive-compulsive disorder, depressive, phobia and panic symptoms/syndromes comprised the selected indicators. The resultant classes were characterized by psychosocial correlates. RESULTS: In LCA, four latent classes provided the best fit to the data. We identified two male-related classes. The first class exhibited both ADHD and depression. The second class comprised males with only depression. Class three was a female-related class depicting obsessive thoughts/compulsive acts, phobias and panic attacks. This class manifested high psychosocial impairment. Class four had a balanced sex proportion and comorbid symptoms/syndromes such as phobias and panic attacks. The complementary occurrence of comorbid obsessive thoughts/compulsive acts and ADHD impulsivity was remarkable. CONCLUSIONS: To the best of our knowledge, this is the first study applying LCA to community data of GTS symptoms/tic disorder-affected persons. Our findings support the utility of differentiating GTS/tic disorder subphenotypes on the basis of comorbid syndromes.
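The study's LCA software and exact indicator coding are not given in the abstract; the mechanics can be sketched with a minimal EM fit of a latent class model on binary symptom indicators. The synthetic two-class data, sample size, and item profiles below are hypothetical stand-ins for the comorbidity indicators described.

```python
import numpy as np

def lca_em(X, n_classes, n_iter=300, seed=0):
    """EM for a latent class model: binary indicator matrix X (n x d),
    class prevalences pi[k], item-endorsement probabilities p[k, j]."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)
    p = rng.uniform(0.25, 0.75, size=(n_classes, d))   # random start
    for _ in range(n_iter):
        # E-step: per-subject class responsibilities (row-stabilized)
        ll = X @ np.log(p).T + (1 - X) @ np.log(1 - p).T + np.log(pi)
        ll -= ll.max(axis=1, keepdims=True)
        resp = np.exp(ll)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update prevalences and item probabilities
        nk = resp.sum(axis=0)
        pi = nk / n
        p = np.clip((resp.T @ X) / nk[:, None], 1e-6, 1 - 1e-6)
    return pi, p, resp

# Synthetic two-class sample with distinct comorbidity profiles.
rng = np.random.default_rng(42)
true_p = np.array([[0.9, 0.9, 0.1, 0.1],
                   [0.1, 0.1, 0.9, 0.9]])
z = rng.integers(0, 2, size=2000)                      # true classes
X = (rng.random((2000, 4)) < true_p[z]).astype(float)
pi, p, resp = lca_em(X, n_classes=2)
pred = resp.argmax(axis=1)
acc = max(np.mean(pred == z), np.mean(pred == 1 - z))  # label switching
```

In practice the number of classes is chosen by comparing fit criteria (the "four latent classes provided the best fit" step in the study), which this sketch omits.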

Relevance: 20.00%

Abstract:

PURPOSE: Retinal detachment (RD) is a major complication of cataract surgery, which can be treated by either primary vitrectomy without indentation or the scleral buckling procedure. The aim of this study is to compare the results of these two techniques for the treatment of pseudophakic RD. PATIENTS AND METHODS: The charts of 40 patients (40 eyes) treated with scleral buckling for a primary pseudophakic RD were retrospectively studied and compared to the charts of 32 patients (32 eyes) treated with primary vitrectomy without scleral buckle during the same period by the same surgeons. To obtain comparable samples, patients with giant retinal tears, vitreous hemorrhage, and severe preoperative proliferative vitreoretinopathy (PVR) were not included. Minimal follow-up was 6 months. RESULTS: The primary success rate was 84% in the vitrectomy group and 82.5% in the ab-externo group. Final anatomical success was observed in 100% of cases in the vitrectomy group and in 95% of cases in the ab-externo group. Final visual acuity was 0.5 or better in 44% of cases in the vitrectomy group and 37.5% in the ab-externo group. The duration of surgery was significantly shorter in the ab-externo group, whereas the hospital stay tended to be shorter in the vitrectomy group. In the vitrectomy group, postoperative PVR developed in 3 eyes, and new or undetected breaks were responsible for failure of the initial procedure in 2 eyes. CONCLUSION: Primary vitrectomy appears to be as effective as scleral buckling procedures for the treatment of pseudophakic RD.

Relevance: 20.00%

Abstract:

The article is composed of two sections. The first is a critical review of the three main alternative indices to GDP proposed in recent decades – the Human Development Index (HDI), the Genuine Progress Indicator (GPI), and the Happy Planet Index (HPI) – conducted on the basis of their conceptual foundations, rather than on issues of statistical consistency or mathematical refinement as most of the literature does. The pars construens aims to propose an alternative measure, the composite wealth index, consistent with an approach to development based on the notion of composite wealth, which is in turn derived from an empirical common-sense criterion. Arguably, this approach lends itself to an easily understandable and coherent indicator, and is thus appropriate for tracking development in its various dimensions: simple in its formulation, the wealth approach can incorporate social and ecological goals without significant alterations to its conceptual foundations, while reducing arbitrary weighting to a minimum.
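The abstract does not give the composite wealth index formula; purely as an illustration of how composite indicators of this kind are typically assembled (the pattern the HDI also follows: per-dimension min-max normalization, then a weighted geometric mean), here is a generic sketch. All bounds, weights, and the function name are hypothetical.

```python
import numpy as np

def composite_index(values, mins, maxs, weights=None):
    """Generic composite indicator: min-max normalize each dimension to
    [0, 1], then aggregate with a weighted geometric mean (as the HDI
    does), so a collapse in one dimension cannot be fully offset by
    abundance in another."""
    v = (np.asarray(values, float) - np.asarray(mins, float)) / \
        (np.asarray(maxs, float) - np.asarray(mins, float))
    v = np.clip(v, 1e-9, 1.0)
    w = (np.full(v.size, 1.0 / v.size) if weights is None
         else np.asarray(weights, float))
    w = w / w.sum()
    return float(np.exp(np.sum(w * np.log(v))))
```

With two dimensions scoring 0.5 and 1.0 after normalization, the equally weighted index is sqrt(0.5) ≈ 0.707, below the arithmetic mean of 0.75: the geometric mean penalizes imbalance, which is one standard way to limit the arbitrariness of weighting.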