981 results for large vector autoregression
Abstract:
BACKGROUND Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in Alzheimer's Disease (AD) diagnosis. However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. METHODS A novel combination of feature extraction techniques is proposed to improve the diagnosis of AD. First, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to lie within a predefined brain activation mask. To address the small-sample-size problem, the dimension of the feature space was further reduced by Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the latter two also analysed with an LMNN transformation). As classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and Energy-based metrics were compared. RESULTS Several experiments were conducted to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: i) a linear transformation of the PLS- or PCA-reduced data, ii) a feature reduction technique, and iii) a classifier (with Euclidean, Mahalanobis or Energy-based metrics). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when the NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. CONCLUSIONS All the proposed methods proved to be valid solutions for the presented problem.
One advance is the robustness of the LMNN algorithm, which not only provides a higher separation rate between the classes but also (in combination with NMSE and PLS) makes the variation of this rate more stable. Another advance is the generalisation ability of the methods, since the experiments were performed on two image modalities (SPECT and PET).
Abstract:
The sterile insect technique (SIT) is a promising pest control method in terms of efficacy and environmental compatibility. In this study, we determined the efficacy of thiotepa-sterilised males in reducing the target Aedes aegypti populations. Treated male pupae were released weekly into large laboratory cages at a constant ratio of either 5:1 or 2:1 sterile-to-fertile males. A two-to-one release ratio reduced the hatch rate of eggs laid in the cage by approximately a third and reduced the adult catch rate by approximately a quarter, but a 5:1 release drove the population to elimination after 15 weeks of release. These results indicate that thiotepa exposure is an effective means of sterilising Ae. aegypti and males thus treated are able to reduce the reproductive capacity of a stable population under laboratory conditions. Further testing of the method in semi-field enclosures is required to evaluate the mating competitiveness of sterile males when exposed to natural environmental conditions. If proven effective, SIT using thiotepa-sterilised males may be incorporated into an integrated programme of vector control to combat dengue in Cuba.
Abstract:
Due to their performance-enhancing properties, the use of anabolic steroids (e.g. testosterone, nandrolone) is banned in elite sports. Therefore, doping control laboratories accredited by the World Anti-Doping Agency (WADA) screen, among other substances, for these prohibited compounds in urine. It is particularly challenging to detect misuse of naturally occurring anabolic steroids such as testosterone (T), which is a popular ergogenic agent in sports and society. To screen for misuse of these compounds, drug testing laboratories monitor the urinary concentrations of endogenous steroid metabolites and their ratios, which constitute the steroid profile, and compare them with reference ranges to detect unnaturally high values. However, interpretation of the steroid profile is difficult due to large inter-individual variances, various confounding factors and the different marketed endogenous steroids that influence the profile in various ways. A support vector machine (SVM) algorithm was developed to statistically evaluate urinary steroid profiles composed of an extended range of steroid profile metabolites. This model makes the interpretation of the analytical data in the quest for deviating steroid profiles feasible and shows its versatility towards different kinds of misused endogenous steroids. The SVM model outperforms the current biomarkers with respect to detection sensitivity and accuracy, particularly when it is coupled to individual data as stored in the Athlete Biological Passport.
Abstract:
Chagas disease prevention remains mostly based on triatomine vector control to reduce or eliminate house infestation with these bugs. The level of adaptation of triatomines to human housing is a key part of vector competence and needs to be precisely evaluated to allow for the design of effective vector control strategies. In this review, we examine how the domiciliation/intrusion level of different triatomine species/populations has been defined and measured and discuss how these concepts may be improved for a better understanding of their ecology and evolution, as well as for the design of more effective control strategies against a large variety of triatomine species. We suggest that a major limitation of current criteria for classifying triatomines into sylvatic, intrusive, domiciliary and domestic species is that these are essentially qualitative and do not rely on quantitative variables measuring population sustainability and fitness in their different habitats. However, such assessments may be derived from further analysis and modelling of field data. Such approaches can shed new light on the domiciliation process of triatomines and may represent a key tool for decision-making and the design of vector control interventions.
Abstract:
Standard practice in Bayesian VARs is to formulate priors on the autoregressive parameters, but economists and policy makers actually have priors about the behavior of observable variables. We show how this kind of prior can be used in a VAR under strict probability theory principles. We state the inverse problem to be solved and we propose a numerical algorithm that works well in practical situations with a very large number of parameters. We prove various convergence theorems for the algorithm. As an application, we first show that the results in Christiano et al. (1999) are very sensitive to the introduction of various priors that are widely used. These priors turn out to be associated with undesirable priors on observables. But an empirical prior on observables helps clarify the relevance of these estimates: we find much higher persistence of output responses to monetary policy shocks than the one reported in Christiano et al. (1999) and a significantly larger total effect.
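The object under discussion is a reduced-form VAR whose autoregressive parameters receive a prior. A minimal sketch of that setup is shown below, with a ridge penalty standing in for a Gaussian parameter prior; the prior on observables proposed in the paper would be the distribution of future data that such a parameter prior induces. The data, lag order and tightness value here are all synthetic stand-ins, not anything from the paper.

```python
# Sketch of a reduced-form VAR(p) estimated by ridge-penalised least
# squares; the penalty lam plays the role of a Gaussian prior on the
# autoregressive parameters. Data and settings are illustrative.
import numpy as np

rng = np.random.default_rng(1)
T, n, p = 200, 3, 2                      # sample length, variables, lags
y = rng.normal(size=(T, n)).cumsum(0)    # synthetic persistent series

# Stack regressors x_t = [1, y_{t-1}, ..., y_{t-p}] for t = p..T-1.
X = np.hstack([np.ones((T - p, 1))] +
              [y[p - k - 1:T - k - 1] for k in range(p)])
Y = y[p:]

lam = 10.0                               # prior tightness (illustrative)
B = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)
print(B.shape)                           # (1 + n*p, n) coefficient matrix
```

The paper's point is that tightening lam (or any conventional parameter prior) has implications for the implied behaviour of the observables y_t, and that those implications, rather than the parameter prior itself, are what economists actually have beliefs about.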
Abstract:
Background: Two or three DNA primes have been used in previous smaller clinical trials, but the number required for optimal priming of viral vectors has never been assessed in adequately powered clinical trials. The EV03/ANRS Vac20 phase I/II trial investigated this issue using the DNA prime/poxvirus NYVAC boost combination, both expressing a common HIV-1 clade C immunogen consisting of Env and a Gag-Pol-Nef polypeptide. Methods: 147 healthy volunteers were randomly allocated through 8 European centres to either 3xDNA plus 1xNYVAC (weeks 0, 4, 8 plus 24; n=74) or to 2xDNA plus 2xNYVAC (weeks 0, 4 plus 20, 24; n=73), stratified by geographical region and sex. T cell responses were quantified using the interferon-γ ELISpot assay and 8 peptide pools; samples from weeks 0, 26 and 28 (time points for the primary immunogenicity endpoint), 48 and 72 were considered for this analysis. Results: 140 of 147 participants were evaluable at weeks 26 and/or 28. 64/70 (91%) in the 3xDNA arm compared to 56/70 (80%) in the 2xDNA arm developed a T cell response (P=0.053). 26 (37%) participants in the 3xDNA arm developed a broader T cell response (Env plus at least one of the Gag, Pol, Nef peptide pools) versus 15 (22%) in the 2xDNA arm (P=0.047). At week 26, the overall magnitude of responses was also higher in the 3xDNA arm than in the 2xDNA arm (similar at week 28), with a median of 545 versus 328 SFUs/10⁶ cells at week 26 (P<0.001). Preliminary overall evaluation showed that participants still displayed T-cell responses at weeks 48 (78%, n=67) and 72 (70%, n=66). Conclusion: This large clinical trial demonstrates that optimal priming of poxvirus-based vaccine regimens requires three DNA primes, and further confirms that the DNA/NYVAC prime-boost vaccine combination is highly immunogenic and induces durable T-cell responses.
Abstract:
Purpose: We previously demonstrated efficient retinal rescue of RPE65 mouse models (Rpe65-/- (Bemelmans et al, 2006) and Rpe65R91W/R91W mice) using an HIV-1-derived lentiviral vector encoding the mouse RPE65 cDNA. To optimize a lentiviral vector as an alternative tool for RPE65-related Leber Congenital Amaurosis clinical trials, we evaluated the efficiency of an integration-deficient lentiviral vector (IDLV) encoding the human RPE65 cDNA in restoring retinal function in Rpe65R91W/R91W mice. Methods: An HIV-1-derived lentiviral vector expressing either hrGFPII or the human RPE65 cDNA under the control of a 0.8 kb fragment of the human RPE65 promoter (R0.8) was produced by transient transfection of 293T cells. An LQ-integrase mutant was used to generate the IDLV vectors. IDLV-R0.8-hRPE65 or hrGFPII was injected subretinally into 1-month-old Rpe65R91W/R91W mice. Functional rescue was assessed by ERG (1 and 3 months post-injection) and cone survival by immunohistology. Results: Increased light sensitivity was detected by scotopic ERG in animals injected with IDLV-R0.8-hRPE65 compared to hrGFPII-treated or untreated mice. However, the improvement was delayed compared to the integration-proficient LV, being observed at 3 months but not 1 month post-injection. Immunolabelling of cone markers showed an increased number of cones in the transduced area compared to control groups. Conclusions: The IDLV-R0.8-hRPE65 vector allows retinal improvement in Rpe65R91W/R91W mice. Both rod function and cone survival benefits were demonstrated, even though the rescue was delayed as assessed by scotopic ERG. Integration-deficient vectors minimize insertional mutagenesis and are thus safer candidates for human application. Further experiments in large animals are now needed to validate correct gene transfer and expression of the RPE65 gene, as well as tolerance of the vector after subretinal injection, before envisaging a clinical trial application.
Abstract:
Ticks transmit more pathogens to humans and animals than any other arthropod. We describe the 2.1 Gbp nuclear genome of the tick, Ixodes scapularis (Say), which vectors pathogens that cause Lyme disease, human granulocytic anaplasmosis, babesiosis and other diseases. The large genome reflects accumulation of repetitive DNA, new lineages of retrotransposons, and gene architecture patterns resembling ancient metazoans rather than pancrustaceans. Annotation of scaffolds representing ∼57% of the genome reveals 20,486 protein-coding genes and expansions of gene families associated with tick-host interactions. We report insights from genome analyses into parasitic processes unique to ticks, including host 'questing', prolonged feeding, cuticle synthesis, blood meal concentration, novel methods of haemoglobin digestion, haem detoxification, vitellogenesis and prolonged off-host survival. We identify proteins associated with the agent of human granulocytic anaplasmosis, an emerging disease, and with the encephalitis-causing Langat virus, and a population structure correlated with life-history traits and transmission of the Lyme disease agent.
Abstract:
We derive a new representation of a function as a linear combination of local correlation kernels at optimal sparse locations and discuss its relation to PCA, regularization, sparsity principles and Support Vector Machines. We first review previous results for the approximation of a function from discrete data (Girosi, 1998) in the context of Vapnik's feature space and dual representation (Vapnik, 1995). We apply them to show 1) that a standard regularization functional with a stabilizer defined in terms of the correlation function induces a regression function in the span of the feature space of classical Principal Components and 2) that there exists a dual representation of the regression function in terms of a regularization network with a kernel equal to a generalized correlation function. We then describe the main observation of the paper: the dual representation in terms of the correlation function can be sparsified using the Support Vector Machines technique (Vapnik, 1982), and this operation is equivalent to sparsifying a large dictionary of basis functions adapted to the task, using a variation of Basis Pursuit De-Noising (Chen, Donoho and Saunders, 1995; see also related work by Donahue and Geiger, 1994; Olshausen and Field, 1995; Lewicki and Sejnowski, 1998). In addition to extending the close relations between regularization, Support Vector Machines and sparsity, our work also illuminates and formalizes the LFA concept of Penev and Atick (1996). We discuss the relation between our results, which concern regression, and the different problem of pattern classification.
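The regularization network at the centre of this abstract has a closed form: f(x) = Σ_i c_i K(x, x_i) with c = (K + λI)⁻¹ y, where every data point contributes a coefficient (SVM regression would instead drive most c_i to zero). A toy version, with a Gaussian kernel standing in for the paper's generalized correlation function and all data synthetic:

```python
# Sketch of a regularization network: f(x) = sum_i c_i K(x, x_i),
# with dense coefficients c = (K + lam*I)^{-1} y. The Gaussian kernel
# is an illustrative stand-in for the generalized correlation function.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 50)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=50)

def k(a, b, sigma=0.1):
    """Gaussian kernel between two sets of 1-D points."""
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * sigma ** 2))

lam = 1e-3
K = k(x, x)
c = np.linalg.solve(K + lam * np.eye(len(x)), y)  # dense: one c_i per point

f = K @ c                                         # fitted values at the data
print(np.abs(f - y).max())
```

Sparsification in the SVM/Basis Pursuit sense replaces this dense solve with an optimisation whose solution keeps kernels only at a sparse set of "support" locations.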
Abstract:
The Support Vector Machine (SVM) is a new and very promising classification technique developed by Vapnik and his group at AT&T Bell Labs. This new learning algorithm can be seen as an alternative training technique for Polynomial, Radial Basis Function and Multi-Layer Perceptron classifiers. An interesting property of this approach is that it is an approximate implementation of the Structural Risk Minimization (SRM) induction principle. The derivation of Support Vector Machines, their relationship with SRM, and their geometrical insight are discussed in this paper. Training an SVM is equivalent to solving a quadratic programming problem with linear and box constraints in a number of variables equal to the number of data points. When the number of data points exceeds a few thousand, the problem is very challenging because the quadratic form is completely dense, so the memory needed to store the problem grows with the square of the number of data points. Therefore, training problems arising in some real applications with large data sets are impossible to load into memory and cannot be solved using standard non-linear constrained optimization algorithms. We present a decomposition algorithm that can be used to train SVMs over large data sets. The main idea behind the decomposition is the iterative solution of sub-problems and the evaluation of optimality conditions, which are also used to establish the stopping criteria for the algorithm. We present previous approaches, as well as results and important details of our implementation of the algorithm using a second-order variant of the Reduced Gradient Method as the solver of the sub-problems. As an application of SVMs, we present preliminary results obtained by applying SVMs to the problem of detecting frontal human faces in real images.
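The decomposition idea described above can be caricatured in a few lines: rather than solving one dense QP over all points, repeatedly solve a small sub-problem over a working set consisting of a chunk of data plus the support vectors found so far. This toy version uses scikit-learn's SVC as the sub-problem solver on synthetic separable data; the chunk size and working-set rule are arbitrary simplifications, not the paper's algorithm.

```python
# Toy chunking/decomposition loop for SVM training: each iteration
# solves a small QP over (previous support vectors + next data chunk).
# Data, chunk size and the selection rule are illustrative only.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # linearly separable labels

sv_X = np.empty((0, 2))
sv_y = np.empty(0, dtype=int)
clf = SVC(kernel="linear", C=1.0)

for start in range(0, len(X), 200):       # 200-point chunks
    work_X = np.vstack([sv_X, X[start:start + 200]])
    work_y = np.concatenate([sv_y, y[start:start + 200]])
    clf.fit(work_X, work_y)               # small QP sub-problem
    sv_X = clf.support_vectors_           # carry only the SVs forward
    sv_y = work_y[clf.support_]

print(clf.score(X, y))                    # fit quality of the final model
```

The memory point in the abstract is concrete: a dense kernel matrix for 100,000 points in double precision needs 100,000² × 8 bytes = 80 GB, while each sub-problem above stores only a few hundred rows.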
Abstract:
The growth of databases containing ever more difficult images and an ever larger number of categories is forcing the development of image representation techniques that remain discriminative when working with multiple classes, and of algorithms that are efficient in learning and classification. This thesis explores the problem of classifying images according to the object they contain when a large number of categories is involved. We first investigate how a hybrid system composed of a generative model and a discriminative model can benefit image classification tasks where the level of human annotation is minimal. For this task we introduce a new vocabulary using a dense representation of colour-SIFT descriptors, and then investigate how the different parameters affect the final classification. We then propose a method to incorporate spatial information into the hybrid system, showing that context information is of great help for image classification. Next we introduce a new shape descriptor that represents the image by its local shape and its spatial layout, together with a kernel that incorporates this spatial information in a pyramidal form. Shape is represented by a compact vector, yielding a descriptor well suited to kernel-based learning algorithms. The experiments performed show that this shape information achieves results similar to (and sometimes better than) appearance-based descriptors. We also investigate how different features can be combined for image classification, and show that the proposed shape descriptor together with an appearance descriptor substantially improves classification. Finally, we describe an algorithm that detects regions of interest automatically during training and classification.
This provides a way to suppress the image background and adds invariance to the position of objects within images. We show that using shape and appearance over this region of interest, together with random forest classifiers, improves classification and computational time. We compare our results with results from the literature using the same databases and the same training and classification protocols as their authors, and show that all the innovations introduced increase the final image classification performance.
Abstract:
Most studies involving statistical time series analysis rely on assumptions of linearity, whose simplicity facilitates parameter interpretation and estimation. However, the linearity assumption may be too restrictive for many practical applications. The implementation of nonlinear models in time series analysis involves the estimation of a large set of parameters, frequently leading to overfitting problems. In this article, a predictability coefficient is estimated using a combination of nonlinear autoregressive models, and the use of support vector regression in this setting is explored. We illustrate the usefulness and interpretability of the results using electroencephalographic records of an epileptic patient.
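The nonlinear autoregression described here amounts to predicting x_t from its p previous values with a kernel regressor. A minimal sketch with support vector regression follows; the signal is a synthetic stand-in for an EEG record, and the lag order, hyper-parameters and the simple predictability coefficient (one minus normalised prediction error) are illustrative assumptions.

```python
# Sketch of nonlinear autoregression via support vector regression:
# predict x_t from [x_{t-p}, ..., x_{t-1}] with an RBF-kernel SVR.
# Signal, lag order p and hyper-parameters are illustrative only.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(4)
t = np.arange(500)
x = np.sin(0.2 * t) + 0.05 * rng.normal(size=500)   # toy "EEG" signal

p = 5                                 # autoregressive order
X = np.column_stack([x[k:len(x) - p + k] for k in range(p)])
y = x[p:]                             # one-step-ahead targets

model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:400], y[:400])
pred = model.predict(X[400:])

# A simple predictability coefficient: 1 - MSE / variance of the target.
r2 = 1 - np.mean((pred - y[400:]) ** 2) / np.var(y[400:])
print(round(r2, 3))
```

A value of the coefficient near 1 indicates the series is highly predictable from its own recent past; near 0, the nonlinear model does no better than predicting the mean.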
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
We discuss the consistency of the traditional vector meson dominance (VMD) model for photons coupling to matter with the vanishing of vector meson-meson and meson-photon mixing self-energies at q² = 0. This vanishing of vector mixing has been demonstrated in the context of rho-omega mixing for a large class of effective theories. As a further constraint on such models, we here apply them to a study of photon-meson mixing and VMD. As an example, we compare the predicted momentum dependence of one such model with a momentum-dependent version of VMD discussed by Sakurai in the 1960s. We find that it produces a result consistent with the traditional VMD phenomenology. We conclude that comparison with VMD phenomenology can provide a useful constraint on such models.