936 results for Vector Space Model


Relevance: 30.00%

Abstract:

Natural selection favors the survival and reproduction of organisms that are best adapted to their environment. The selection mechanism in evolutionary algorithms mimics this process, aiming to create environmental conditions in which artificial organisms can evolve to solve the problem at hand. This paper proposes a new selection scheme for evolutionary multiobjective optimization. The similarity measure that defines the concept of neighborhood is a key feature of the proposed selection. Contrary to commonly used approaches, which are usually defined on the basis of distances between either individuals or weight vectors, it is suggested to base similarity and neighborhood on the angle between individuals in the objective space: the smaller the angle, the more similar the individuals. This notion is exploited during the mating and environmental selections. Convergence is ensured by minimizing distances from individuals to a reference point, whereas diversity is preserved by maximizing angles between neighboring individuals. Experimental results reveal a highly competitive performance and useful characteristics of the proposed selection. Its strong diversity-preserving ability allows it to produce significantly better performance on some problems when compared with state-of-the-art algorithms.
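The angle-based neighborhood idea described above can be sketched as follows. This is an illustrative reconstruction, not the paper's actual algorithm: the function names and the removal heuristic in `environmental_selection` are simplifications for exposition.

```python
import numpy as np

def angle(u, v):
    """Angle (in radians) between two objective vectors: the smaller
    the angle, the more similar the individuals."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def environmental_selection(population, reference, k):
    """Toy environmental selection: repeatedly find the most crowded
    individuals (smallest angle to a neighbour) and discard the one
    that is farther from the reference point (worse convergence)."""
    pop = [np.asarray(p, dtype=float) for p in population]
    while len(pop) > k:
        # Angle from each individual to its closest neighbour in objective space.
        nearest = [min(angle(p, q) for j, q in enumerate(pop) if j != i)
                   for i, p in enumerate(pop)]
        # Smallest neighbour angle = most crowded; among ties, drop the
        # individual with the larger distance to the reference point.
        drop = min(range(len(pop)),
                   key=lambda i: (nearest[i], -np.linalg.norm(pop[i] - reference)))
        pop.pop(drop)
    return pop
```

With a reference (ideal) point at the origin, two nearly collinear individuals are treated as neighbors, and the one farther from the reference is eliminated first, which mirrors the abstract's combination of angle-based diversity and distance-based convergence.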

Relevance: 30.00%

Abstract:

This paper considers model spaces in an H^p setting. The existence of unbounded functions and the characterisation of maximal functions in a model space are studied, and decomposition results for Toeplitz kernels, in terms of model spaces, are established.
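For context, the standard objects the abstract refers to can be written as follows; these are textbook definitions, not results taken from the paper. For an inner function $\theta$, the model space in the $H^p$ setting, and the kernel of the Toeplitz operator $T_g$ with symbol $g$, are

```latex
K_\theta^{p} \;=\; H^{p} \cap \theta\,\overline{H^{p}_{0}}
  \qquad\bigl(\text{for } p = 2:\; K_\theta = H^{2} \ominus \theta H^{2}\bigr),
\qquad
\ker T_{g} \;=\; \{\, f \in H^{p} : P_{+}(g f) = 0 \,\},
```

where $P_{+}$ denotes the Riesz projection onto $H^p$. Decomposition results of the kind mentioned express $\ker T_g$ in terms of spaces $K_\theta$ for suitable inner functions $\theta$.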

Relevance: 30.00%

Abstract:

OBJECTIVE: To assess the transfection of the gene that encodes green fluorescent protein (GFP) through direct intramyocardial injection. METHODS: The pREGFP plasmid vector was used. The EGFP gene was inserted downstream from the constitutive promoter of the Rous sarcoma virus. Five male dogs were used (mean weight 13.5 kg), in which 0.5 mL of saline solution (n=1) or 0.5 mL of plasmid solution containing 0.5 µg of pREGFP/dog (n=4) were injected into the myocardium of the left ventricular lateral wall. The dogs were euthanized 1 week later, and cardiac biopsies were obtained. RESULTS: Fluorescence microscopy showed differences between the cells transfected and not transfected with pREGFP plasmid. Mild fluorescence was observed in the cardiac fibers that received saline solution; however, the myocardial cells transfected with pREGFP had overt EGFP expression. CONCLUSION: Transfection with the EGFP gene in healthy canine myocardium was effective. The reproduction of this efficacy using vascular endothelial growth factor (VEGF) instead of EGFP aims at developing gene therapy for ischemic heart disease.

Relevance: 30.00%

Abstract:

An algebraic decay rate is derived which bounds the time required for velocities to equilibrate in a spatially homogeneous flow-through model representing the continuum limit of a gas of particles interacting through slightly inelastic collisions. This rate is obtained by reformulating the dynamical problem as the gradient flow of a convex energy on an infinite-dimensional manifold. An abstract theory is developed for gradient flows in length spaces, which shows how degenerate convexity (or even non-convexity), if uniformly controlled, will quantify contractivity (limit expansivity) of the flow.
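The contraction estimates behind such gradient-flow arguments typically take the following generic form; this is a schematic sketch, not the paper's precise statement. If the energy $E$ is $\lambda$-uniformly convex along geodesics of the length space, any two gradient-flow trajectories $x_t$, $y_t$ satisfy

```latex
d(x_t, y_t) \;\le\; e^{-\lambda t}\, d(x_0, y_0),
```

so $\lambda > 0$ gives exponential contraction, $\lambda = 0$ non-expansion, and $\lambda < 0$ controlled (limited) expansion. When convexity degenerates near the equilibrium, e.g. $E(x) - E(x_\infty) \gtrsim d(x, x_\infty)^{2+\alpha}$ with $\alpha > 0$, the exponential rate is lost and one instead obtains an algebraic decay of the form $d(x_t, x_\infty) \lesssim t^{-1/\alpha}$, which is the kind of rate the abstract refers to.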

Relevance: 30.00%

Abstract:

We quantify the long-time behavior of a system of (partially) inelastic particles in a stochastic thermostat by means of the contractivity of a suitable metric in the set of probability measures. Existence, uniqueness, boundedness of moments and regularity of a steady state are derived from this basic property. The solutions of the kinetic model are proved to converge exponentially as t → ∞ to this diffusive equilibrium in this distance, which metrizes the weak convergence of measures. Then, we prove a uniform bound in time on Sobolev norms of the solution, provided the initial data has a finite norm in the corresponding Sobolev space. These results are then combined, using interpolation inequalities, to obtain exponential convergence to the diffusive equilibrium in the strong L¹-norm, as well as in various Sobolev norms.
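Schematically, the final step of the abstract combines the three ingredients it lists; the sketch below uses generic constants and exponents, not the paper's. Contraction in a distance $d$ metrizing weak convergence, a uniform Sobolev bound, and an interpolation inequality,

```latex
d(f_t, f_\infty) \le e^{-\lambda t}\, d(f_0, f_\infty),
\qquad
\sup_{t \ge 0} \|f_t\|_{H^{k}} \le C,
\qquad
\|f_t - f_\infty\|_{L^1} \le C\, d(f_t, f_\infty)^{\theta}
\;\; \text{for some } \theta \in (0,1),
```

together yield exponential convergence in the strong $L^1$-norm at the reduced rate $\theta\lambda$, and further interpolation between $L^1$ and $H^k$ extends this to Sobolev norms.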

Relevance: 30.00%

Abstract:

We analyze the two-dimensional parabolic-elliptic Patlak-Keller-Segel model in the whole Euclidean space R². Under the hypotheses of integrable initial data with finite second moment and entropy, we first show local-in-time existence for any mass of "free-energy solutions", namely weak solutions with some free-energy estimates. We also prove that the solution exists as long as the entropy is controlled from above. The main result of the paper is to show the global existence of free-energy solutions with initial data as before for the critical mass 8π/χ. Actually, we prove that solutions blow up as a Dirac delta at the center of mass as t → ∞, keeping their second moment constant at all times. Furthermore, all moments larger than 2 blow up as t → ∞ if initially bounded.
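For reference, the parabolic-elliptic Patlak-Keller-Segel system on R² and the mass thresholds quoted in the abstract are usually written as follows (standard formulation, with χ > 0 the chemosensitivity; not copied from the paper):

```latex
\partial_t \rho \;=\; \Delta \rho \;-\; \chi\, \nabla \!\cdot\! \bigl(\rho\, \nabla c\bigr),
\qquad
-\Delta c \;=\; \rho,
\qquad x \in \mathbb{R}^2,\; t > 0,
```

with global existence for total mass $M = \int \rho_0 < 8\pi/\chi$, finite-time blow-up for $M > 8\pi/\chi$, and the critical case $M = 8\pi/\chi$ studied here, where concentration to a Dirac delta occurs only in infinite time.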

Relevance: 30.00%

Abstract:

In search of a suitable vector species for xenodiagnosis of humans and animals with chronic Chagas' disease, we first investigated the reactions of different vector species to acute infection with Trypanosoma cruzi. The vector species utilized in this study were: Triatoma infestans, Rhodnius prolixus and Triatoma dimidiata, all well adapted to human habitats; Triatoma rubrovaria and Rhodnius neglectus, both considered totally wild species; Panstrongylus megistus, Triatoma sordida, Triatoma pseudomaculata and Triatoma brasiliensis, all essentially sylvatic but some with domiciliary tendencies and others restricted to peridomestic biotopes with incipient colonization of human houses after successful eradication of T. infestans. Results summarized in Table IV suggest the following order of infectivity among the 9 studied vector species: P. megistus with 97.8% of infected bugs; T. rubrovaria, a close second, with 95% of positive bugs; followed by T. pseudomaculata with 94.3% and R. neglectus with 93.8% of infected bugs, almost identical thirds. R. prolixus, T. infestans and T. dimidiata exhibited low infection rates of 53.1%, 51.6% and 38.2% respectively, coupled with sharp decreases occurring with aging of infection (Fig. 1). The situation was intermediate in T. brasiliensis and T. sordida, with infection rates of 76.9% and 80% respectively. Results also point to the existence of a close correlation between prevalence and intensity of infection, in that species with high infection rates ranging from 93.8% to 97.8% exhibited relatively large proportions of insects (27.3% - 33.5%) harbouring very dense populations of T. cruzi. In species with low infection rates ranging from 38.2% to 53.1%, the proportion of bugs demonstrating comparable parasite densities was at most 6%. No differences were attributable to blood-meal size or to greater susceptibility of indigenous vector species to parasites of their own geographical area, as suggested in earlier...

Relevance: 30.00%

Abstract:

In order to upgrade the reliability of xenodiagnosis, attention has been directed towards the population dynamics of the parasite, with particular interest in the following factors: 1. Parasite density, which by itself is not a research objective but, by giving an accurate portrayal of parasite development and multiplication, has been incorporated in the screening of bugs for xenodiagnosis. 2. On the assumption that food availability might increase parasite density, bugs from xenodiagnosis have been refed at biweekly intervals on chicken blood. 3. Infectivity rates and positives harbouring large parasite yields were based on gut infections, in which the parasite population, comprising all developmental forms, was more abundant and easier to detect than in fecal infections, thus minimizing the probability of recording false negatives. 4. Since parasite density, low in the first 15 days of infection, increases rapidly in the following 30 days, an interval of 45 days has been adopted for routine examination of bugs from xenodiagnosis. By following the enumerated measures, all aiming to reduce false negative cases, we are getting closer to a reliable xenodiagnostic procedure. Upgrading the efficacy of xenodiagnosis is also dependent on the xenodiagnostic agent. Of the 9 investigated vector species, Panstrongylus megistus deserves top priority as a xenodiagnostic agent. Its extraordinary capability to support fast development and vigorous multiplication of the few parasites ingested from a host with chronic Chagas' disease has been revealed by the strikingly close infectivity rates of 91.2% vs. 96.4% among bugs engorged from the same host in the chronic and acute phase of the disease respectively (Table V), the latter corresponding to an estimated 12.3 × 10³ parasites in the circulation at the time of xenodiagnosis, as reported previously by the authors (1982).

Relevance: 30.00%

Abstract:

In order to investigate the value of the rabbit as an experimental model for Chagas' disease, 72 animals were inoculated by the intraperitoneal and conjunctival routes with bloodstream forms, vector-derived metacyclic trypomastigotes and tissue culture trypomastigotes of Trypanosoma cruzi strains Y, CL and Ernane. In 95.6% of the animals, trypomastigotes were detected at the early stages of infection by fresh blood examination. The course of parasitemia at the acute phase was strongly influenced by the parasite strain and route of inoculation. At the chronic phase, parasites were recovered by xenodiagnosis and/or hemoculture in 40% of the examined animals. The xenodiagnosis studies showed selective interactions between the T. cruzi strains and the four species of vectors used, inducing significant variability in the results. The data presented herein are consistent with the parasitological requirements established for a suitable model of chronic Chagas' disease.

Relevance: 30.00%

Abstract:

Staphylococcus aureus harbors redundant adhesins mediating tissue colonization and infection. To evaluate their intrinsic role outside of the staphylococcal background, a system was designed to express them in Lactococcus lactis subsp. cremoris 1363. This bacterium is devoid of virulence factors and has a known genetic background. A new Escherichia coli-L. lactis shuttle and expression vector was constructed for this purpose. First, the high-copy-number lactococcal plasmid pIL253 was equipped with the oriColE1 origin, generating pOri253 that could replicate in E. coli. Second, the lactococcal promoters P23 or P59 were inserted at one end of the pOri253 multicloning site. Gene expression was assessed by a luciferase reporter system. The plasmid carrying P23 (named pOri23) expressed luciferase constitutively at a level 10,000 times greater than did the P59-containing plasmid. Transcription was absent in E. coli. The staphylococcal clumping factor A (clfA) gene was cloned into pOri23 and used as a model system. Lactococci carrying pOri23-clfA produced an unaltered and functional 130-kDa ClfA protein attached to their cell walls. This was indicated both by the presence of the protein in Western blots of solubilized cell walls and by the ability of ClfA-positive lactococci to clump in the presence of plasma. ClfA-positive lactococci had clumping titers (titer of 4,112) similar to those of S. aureus Newman in soluble fibrinogen and bound equally well to solid-phase fibrinogen. These experiments provide a new way to study individual staphylococcal pathogenic factors and might complement both classical knockout mutagenesis and modern in vivo expression technology and signature tag mutagenesis.

Relevance: 30.00%

Abstract:

The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming at providing the best possible generalization and predictive abilities instead of concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows for the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is here learned automatically from data, providing the optimum mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it can provide efficient means to model local anomalies that may typically arise at an early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge on the possible existence of such short-scale patterns. This is a possible limitation of the method for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of Cs-137 activity, given the measurements taken in the region of Briansk following the Chernobyl accident.
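As a simplified stand-in for the paper's multi-scale SVR (which learns the mixture of scales jointly within the support vector framework), the idea of superposing a short- and a long-scale kernel can be sketched with a plain kernel ridge regression; all function names and scale values below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def rbf(X, Y, scale):
    """Gaussian (RBF) kernel matrix between point sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * scale ** 2))

def fit_two_scale(X, y, short=0.1, long=1.0, lam=1e-3):
    """Kernel ridge fit with a sum of a short- and a long-scale RBF
    kernel, so the predictor can superpose local anomalies on a smooth
    background trend (simplified stand-in for multi-scale SVR)."""
    K = rbf(X, X, short) + rbf(X, X, long)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

    def predict(Xnew):
        return (rbf(Xnew, X, short) + rbf(Xnew, X, long)) @ alpha

    return predict
```

The sum of two positive-definite kernels is again positive definite, so a single fit yields a predictor whose short-scale component can capture local anomalies while the long-scale component models the regional trend.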

Relevance: 30.00%

Abstract:

This paper contributes to the on-going empirical debate regarding the role of the RBC model and in particular of technology shocks in explaining aggregate fluctuations. To this end we estimate the model’s posterior density using Markov-Chain Monte-Carlo (MCMC) methods. Within this framework we extend Ireland’s (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model’s errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model’s fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.

Relevance: 30.00%

Abstract:

We develop methods for Bayesian inference in vector error correction models which are subject to a variety of switches in regime (e.g. Markov switches in regime or structural breaks). An important aspect of our approach is that we allow both the cointegrating vectors and the number of cointegrating relationships to change when the regime changes. We show how Bayesian model averaging or model selection methods can be used to deal with the high-dimensional model space that results. Our methods are used in an empirical study of the Fisher effect.
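Once marginal likelihoods for the candidate specifications are available, the Bayesian model averaging step mentioned above reduces to normalizing posterior model probabilities. A minimal stand-alone sketch follows; computing the marginal likelihoods themselves is the hard part and is assumed done.

```python
import math

def model_weights(log_marglik, prior=None):
    """Posterior model probabilities from log marginal likelihoods,
    computed stably with the log-sum-exp trick; a uniform model
    prior is assumed when none is given."""
    n = len(log_marglik)
    prior = prior or [1.0 / n] * n
    scores = [lml + math.log(p) for lml, p in zip(log_marglik, prior)]
    m = max(scores)  # subtract the max before exponentiating for stability
    w = [math.exp(s - m) for s in scores]
    z = sum(w)
    return [x / z for x in w]
```

An averaged quantity of interest (e.g. a forecast across different cointegrating ranks or regime specifications) is then the weight-sum of the per-model quantities; model selection is the special case of keeping only the top-weight model.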


Relevance: 30.00%

Abstract:

Employing an endogenous growth model with human capital, this paper explores how productivity shocks in the goods and human capital producing sectors contribute to explaining aggregate fluctuations in output, consumption, investment and hours. Given the importance of accounting for both the dynamics and the trends in the data not captured by the theoretical growth model, we introduce a vector error correction model (VECM) of the measurement errors and estimate the model’s posterior density function using Bayesian methods. To contextualize our findings with those in the literature, we also assess whether the endogenous growth model or the standard real business cycle model better explains the observed variation in these aggregates. In addressing these issues we contribute to both the methods of analysis and the ongoing debate regarding the effects of innovations to productivity on macroeconomic activity.