61 results for Amplification Techniques
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Background: MLPA is a potentially useful semi-quantitative method to detect copy number alterations in targeted regions. In this paper, we propose a normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random error variability observed in each test sample. Results: Through simulation studies we have shown that our proposed method outperforms two existing methods based on simple threshold rules or iterative regression. We have illustrated the method using a controlled MLPA assay in which targeted regions vary in copy number in individuals suffering from disorders such as Prader-Willi, DiGeorge or autism, showing the best performance. Conclusion: Using the proposed mixed model, we are able to determine thresholds to decide whether a region is altered. These thresholds are specific to each individual and incorporate experimental variability, resulting in improved sensitivity and specificity, as the examples with real data have revealed.
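The per-sample thresholding idea can be illustrated with a deliberately simplified sketch: an ordinary mean ± k·sd interval standing in for the paper's mixed-model tolerance intervals. The function name and the constant k are illustrative, not taken from the paper.

```python
import numpy as np

def altered_probes(ratios, k=3.0):
    """Flag probes whose normalized ratio falls outside the
    per-sample interval mean +/- k*sd (a crude stand-in for the
    paper's mixed-model tolerance intervals)."""
    ratios = np.asarray(ratios, dtype=float)
    mu, sd = ratios.mean(), ratios.std(ddof=1)
    lower, upper = mu - k * sd, mu + k * sd
    return [i for i, r in enumerate(ratios) if r < lower or r > upper]
```

The point of the paper is precisely that a fixed k is too crude: the interval should widen or narrow with the random-error variability estimated for each sample.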
Abstract:
This paper describes the procedures used to register two images geometrically and automatically, taking the first as the reference image. The results obtained with three methods are compared. The first method is classical registration in the spatial domain by maximizing the cross-correlation (MCC) [1]. The second applies MCC registration together with a multiscale analysis based on wavelet transforms [2]. The third is a variant of the second that lies halfway between the other two. For each method, an estimate of the coefficients of the transformation relating the two images is obtained. The second image is then transformed in each case and georeferenced with respect to the first. Finally, quantitative measures are proposed that allow the results obtained with each method to be discussed and compared.
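For the purely translational case, the MCC criterion of the first method can be sketched with a minimal FFT-based implementation. This assumes integer shifts and periodic boundaries; the affine coefficient estimation and the wavelet multiscale analysis of the other two methods are not covered.

```python
import numpy as np

def shift_by_mcc(ref, img):
    """Estimate the integer translation maximizing the circular
    cross-correlation between ref and img, computed via FFTs.
    Returns (dy, dx) such that np.roll(img, (dy, dx), axis=(0, 1))
    best matches ref."""
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    # map the peak position to signed shifts
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

Computing the correlation in the Fourier domain is what makes the exhaustive search over all shifts affordable, which is also why the multiscale wavelet variant pays off mainly for larger transformations.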
Abstract:
Study based on a research stay at Dr Alcamí's AIDS Immunopathology Laboratory at the Instituto de Salud Carlos III-Centro Nacional de Microbiología, between late December 2006 and March 2007. The objective was to improve the characterization of the HIV-1 envelope by obtaining recombinant viruses, since this allows the viral envelope to be studied both genetically and phenotypically. In this case, the viral envelope was studied in patients undergoing therapeutic vaccination with dendritic cells pulsed with autologous virus. During the stay, in-depth training was acquired in the techniques for amplifying and cloning the complete HIV-1 envelope gene (env), as well as in obtaining recombinant viruses carrying the patient's envelope and in the corresponding viral tropism and serum neutralization assays. This methodology uses the chimeric virus pNL4.3 delta_env Renilla, built from the reference virus NL4.3, which has two important features: the first is that it contains a Renilla marker gene, which has luciferase activity inside infected cells. The use of the pNL4.3 delta_env Renilla virus in neutralization assays offers several advantages over more conventional assays, in terms of sensitivity and specificity as well as time savings.
Abstract:
Landscape classification tackles issues related to the representation and analysis of continuous and variable ecological data. In this study, a methodology is created in order to define topo-climatic landscapes (TCL) in the north-west of Catalonia (north-east of the Iberian Peninsula). TCLs relate the ecological behaviour of a landscape in terms of topography, physiognomy and climate, which comprise the main drivers of an ecosystem. Selected variables are derived from different sources such as remote sensing and climatic atlases. The proposed methodology combines unsupervised iterative cluster classification with a supervised fuzzy classification. As a result, 28 TCLs have been found for the study area, which may be differentiated in terms of vegetation physiognomy and vegetation altitudinal range type. Furthermore, a hierarchy among TCLs is set, enabling the merging of clusters and allowing for changes of scale. Through the topo-climatic landscape map, managers may identify patches with similar environmental conditions and assess at the same time the uncertainty involved.
Abstract:
DNA-based techniques have proved to be very useful methods to study trophic relationships between pests and their natural enemies. However, most predators are best defined as omnivores, and the identification of plant-specific DNA should also allow the identification of the plant species the predators have been feeding on. In this study, a PCR approach based on the development of specific primers was developed as a self-marking technique to detect plant DNA within the gut of one heteropteran omnivorous predator (Macrolophus pygmaeus) and two lepidopteran pest species (Helicoverpa armigera and Tuta absoluta). Specific tomato primers were designed from the ITS 1-2 region, which allowed the amplification of a 332 bp tomato DNA fragment within the three insect species tested in all cases (100% detection at t = 0) and did not detect DNA of other plants nor of the starved insects. Plant DNA half-lives at 25ºC ranged from 5.8 h to 27.7 h and 28.7 h within M. pygmaeus, H. armigera and T. absoluta, respectively. Tomato DNA detection within field-collected M. pygmaeus suggests dietary mixing in this omnivorous predator and showed a higher detection of tomato DNA in females and nymphs than in males. This study provides a useful tool to detect and identify plant food sources of arthropods and to evaluate crop colonization from surrounding vegetation in conservation biological control programs.
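Half-lives like the 5.8 h, 27.7 h and 28.7 h reported above come from fitting an exponential decay to detection rates over time; under that standard model the computation reduces to a log-linear least-squares fit. The sketch below uses hypothetical data, and the paper's actual fitting procedure may differ.

```python
import math

def half_life(times, rates):
    """Fit ln(rate) = ln(r0) - k*t by least squares and return
    the half-life ln(2)/k of the exponential-decay model."""
    ys = [math.log(r) for r in rates]
    n = len(times)
    tbar, ybar = sum(times) / n, sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times, ys))
             / sum((t - tbar) ** 2 for t in times))
    return math.log(2) / -slope
```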
Abstract:
The molecular study of the signalling pathway activated by TGF-β has a notable impact on the current research landscape, given its involvement in autoimmune and carcinogenic processes. On the other hand, the structural elucidation of the molecular mechanisms that allow E3 ubiquitin ligases to specifically mark their targets for proteasomal degradation, among other fates, is fundamental given its importance for the control of intracellular protein turnover. This project aims to elucidate the activation and catalysis mechanisms of the HECT-type E3 ubiquitin ligases Smurf1 and Nedd4L, while also studying their involvement in the regulation of the messenger agents of the TGF-β pathway. The work is divided into three sub-projects, which focus on (a) the study of the interaction of these ligases with their targets; (b) the elucidation of their activation mechanism; and (c) of their catalytic mechanism. To achieve these objectives we rely mainly on the advantages offered by Nuclear Magnetic Resonance and other biophysical techniques, chiefly ITC. During the time I held the FI fellowship, I focused mainly on the preparation of peptides by SPPS, their purification by HPLC and their characterization by MS and NMR. These peptides represent different phosphorylation patterns of certain targets of the aforementioned ligases, and they have been used to study protein-protein interactions by ITC and NMR; I have thus been introduced to the use of these techniques. I have also prepared protein samples using E. coli-based bacterial expression systems, including the amplification and cloning of the gene encoding the protein of interest as well as its expression, purification and characterization by MS and NMR.
Abstract:
Given the urgency of a new paradigm in wireless digital transmission that should allow for higher bit rates, lower latency and tighter delay constraints, it has been proposed to investigate the fundamental building blocks that, at the circuit/device level, will drive the change towards a more efficient network architecture, with high capacity, higher bandwidth and a more satisfactory end-user experience. At the core of each transceiver there are inherently analog devices capable of providing the carrier signal: the oscillators. It is strongly believed that many limitations in today's communication protocols could be relieved by permitting radio transmission at high carrier frequencies and by providing some degree of reconfigurability. This led us to study distributed oscillator architectures which work in the microwave range and possess wideband tuning capability. As microwave oscillators are essentially nonlinear devices, a full nonlinear analysis, synthesis and optimization had to be considered for their implementation. Consequently, the most widely used nonlinear numerical techniques in commercial EDA software have been reviewed. An application of the aforementioned techniques has been shown for a system of three coupled oscillators (a "triple push" oscillator), in which the stability of the various oscillating modes has been studied. Provided that a certain phase distribution is maintained among the oscillating elements, this topology permits a rise in the output power of the third harmonic; nevertheless, due to circuit symmetry, "unwanted" oscillating modes coexist with the intended one. Starting with the necessary background on distributed amplification and distributed oscillator theory, the design of a four-stage reverse-mode distributed voltage-controlled oscillator (DVCO) using lumped elements has been presented.
All the design steps have been reported and, for the first time, a method for an optimized design with reduced variations in the output power has been presented. Ongoing work is devoted to modelling a wideband DVCO and to implementing a frequency divider.
Abstract:
In order to obtain a high-resolution Pleistocene stratigraphy, eleven continuously cored boreholes, 100 to 220 m deep, were drilled in the northern part of the Po Plain by Regione Lombardia in the last five years. Quantitative provenance analysis (QPA; Weltje and von Eynatten, 2004) of Pleistocene sands was carried out using multivariate statistical analysis (principal component analysis, PCA, and similarity analysis) on an integrated data set, including high-resolution bulk petrography and heavy-mineral analyses of Pleistocene sands and of 250 major and minor modern rivers draining the southern flank of the Alps from West to East (Garzanti et al, 2004; 2006). Prior to the onset of major Alpine glaciations, metamorphic and quartzofeldspathic detritus from the Western and Central Alps was carried from the axial belt to the Po basin longitudinally, parallel to the Southalpine belt, by a trunk river (Vezzoli and Garzanti, 2008). This scenario changed rapidly during marine isotope stage 22 (0.87 Ma), with the onset of the first major Pleistocene glaciation in the Alps (Muttoni et al, 2003). PCA and similarity analysis of core samples show that the longitudinal trunk river was at this time shifted southward by the rapid southward and westward progradation of transverse alluvial river systems fed from the Central and Southern Alps. Sediments were transported southward by braided river systems, and glacial sediments transported by Alpine valley glaciers invaded the alluvial plain.
Key words: Detrital modes; Modern sands; Provenance; Principal Components Analysis; Similarity; Canberra Distance; palaeodrainage
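The Canberra distance named in the keywords, used here as the similarity measure between detrital modes, is simple to state. A minimal sketch, following the usual convention of skipping coordinates where both entries are zero:

```python
def canberra(u, v):
    """Canberra distance: sum of |u_i - v_i| / (|u_i| + |v_i|),
    skipping coordinates where both entries are zero."""
    return sum(abs(a - b) / (abs(a) + abs(b))
               for a, b in zip(u, v) if a or b)
```

Because each coordinate's contribution is normalized by its own magnitude, rare heavy-mineral species count as much as abundant framework grains, which is why this distance is popular for compositional provenance data.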
Abstract:
The main instrument used in psychological measurement is the self-report questionnaire. One of its major drawbacks, however, is its susceptibility to response biases. A known strategy to control these biases has been the use of so-called ipsative items. Ipsative items are items that require the respondent to make between-scale comparisons within each item. The selected option determines to which scale the weight of the answer is attributed. Consequently, in questionnaires consisting only of ipsative items every respondent is allotted an equal amount, i.e. the total score, which each can distribute differently over the scales. Therefore this type of response format yields data that can be considered compositional from its inception. Methodologically oriented psychologists have heavily criticized this type of item format, since the resulting data are also marked by the associated unfavourable statistical properties. Nevertheless, clinicians have kept using these questionnaires to their satisfaction. This investigation therefore aims to evaluate both positions and addresses the similarities and differences between the two data collection methods. The ultimate objective is to formulate a guideline for when to use which type of item format. The comparison is based on data obtained with both an ipsative and a normative version of three psychological questionnaires, which were administered to 502 first-year students in psychology according to a balanced within-subjects design. Previous research only compared the direct ipsative scale scores with the derived ipsative scale scores. The use of compositional data analysis techniques also enables one to compare derived normative score ratios with direct normative score ratios. The addition of the second comparison not only offers the advantage of a better-balanced research strategy; in principle it also allows for parametric testing in the evaluation.
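The compositional treatment of constant-sum (ipsative) scores typically starts from a log-ratio transform. A minimal sketch of the centred log-ratio (clr), assuming strictly positive scores:

```python
import math

def clr(x):
    """Centred log-ratio transform log(x_i / g(x)), where g(x) is
    the geometric mean of the composition; requires strictly
    positive parts."""
    log_gmean = sum(math.log(v) for v in x) / len(x)
    return [math.log(v) - log_gmean for v in x]
```

After this transform the constant-sum constraint no longer distorts covariances, which is what makes score-ratio comparisons and parametric testing defensible on such data.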
Abstract:
This work provides a general description of the multi-sensor data fusion concept, along with a new classification of currently used sensor fusion techniques for unmanned underwater vehicles (UUVs). Unlike previous proposals that base the classification on the sensors involved in the fusion, we propose a synthetic approach that focuses on the techniques involved in the fusion and their applications in UUV navigation. We believe that our approach is better oriented towards the development of sensor fusion systems, since a sensor fusion architecture should first of all be focused on its goals and only then on the fused sensors.
Abstract:
Obtaining the automatic 3D profile of objects is one of the most important issues in computer vision. With this information, a large number of applications become feasible: from visual inspection of industrial parts to 3D reconstruction of the environment for mobile robots. In order to achieve 3D data, range finders can be used. The coded structured light approach is one of the most widely used techniques to retrieve 3D information of an unknown surface. An overview of the existing techniques, as well as a new classification of patterns for structured light sensors, is presented. This kind of system belongs to the group of active triangulation methods, which are based on projecting a light pattern and imaging the illuminated scene from one or more points of view. Since the patterns are coded, correspondences between points of the image(s) and points of the projected pattern can be easily found. Once correspondences are found, a classical triangulation strategy between camera(s) and projector device leads to the reconstruction of the surface. Advantages and constraints of the different patterns are discussed.
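Once a correspondence is decoded, the classical triangulation step can be sketched as a midpoint (closest-point) intersection of the camera ray and the projector ray. This is illustrative only; calibrated systems usually fold the same computation into projection matrices.

```python
import numpy as np

def triangulate(c1, d1, c2, d2):
    """Midpoint triangulation: the point halfway between the closest
    points of ray c1 + t*d1 (camera) and ray c2 + s*d2 (projector)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # normal equations of min over (t, s) of |(c1 + t*d1) - (c2 + s*d2)|^2
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    t, s = np.linalg.solve(A, b)
    return ((c1 + t * d1) + (c2 + s * d2)) / 2.0
```

With noisy correspondences the two rays rarely intersect exactly; taking the midpoint of the closest points is the simplest way to get a single 3D estimate.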
Abstract:
The absolute necessity of obtaining 3D information of structured and unknown environments in autonomous navigation considerably reduces the set of sensors that can be used. The need to know, at every moment, the position of the mobile robot with respect to the scene is indispensable. Furthermore, this information must be obtained in the least computing time. Stereo vision is an attractive and widely used method, but it is rather limited for making fast 3D surface maps, due to the correspondence problem. The spatial and temporal correspondence among images can be alleviated using a method based on structured light. This relationship can be found directly by codifying the projected light, so that each imaged region of the projected pattern carries the information needed to solve the correspondence problem. We present the most significant techniques, used in recent years, concerning the coded structured light method.
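A common temporal codification for structured light projects a sequence of Gray-coded stripe patterns; decoding the bit sequence observed at a pixel back to its stripe index takes only a few lines. This is a generic sketch of Gray-code decoding, not tied to any specific system discussed in the text.

```python
def gray_decode(bits):
    """Decode a most-significant-bit-first Gray-code sequence
    (one bit per projected pattern) into the stripe index."""
    n = 0
    for b in bits:
        # each binary bit is the Gray bit XORed with the previous binary bit
        n = (n << 1) | (b ^ (n & 1))
    return n
```

Gray codes are preferred over plain binary stripes because adjacent stripes differ in a single bit, so a decoding error at a stripe boundary shifts the index by at most one.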
Abstract:
A study of how the machine learning technique known as GentleBoost could improve different digital watermarking methods such as LSB, DWT, DCT2 and histogram shifting.
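Of the four watermarking schemes named, LSB is the simplest to illustrate: a toy embed/extract pair on a flat list of 8-bit pixel values. The names are illustrative, and the boosting stage itself is not shown.

```python
def lsb_embed(pixels, bits):
    """Replace the least-significant bit of each pixel with a
    watermark bit (changes each pixel value by at most 1)."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def lsb_extract(pixels):
    """Read the watermark back from the least-significant bits."""
    return [p & 1 for p in pixels]
```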
Abstract:
Cognitive radio systems are a solution to the inefficient allocation of the wireless frequency spectrum. Using dynamic spectrum access, secondary users can communicate on available frequency channels while the licensed users are not using them. A good control messaging system is needed so that secondary users do not interfere with primary users in cognitive radio networks. For networks where users are frequency-heterogeneous, i.e. they do not share the same frequency channels for communication, the set of channels used to transmit control information must be chosen carefully. For this reason, this thesis studies the basic ideas behind the control messaging schemes used in cognitive radio networks and presents a scheme suited to the control of frequency-heterogeneous users. To that end, a new taxonomy for classifying control messaging strategies is first presented, identifying the main characteristics that a control scheme for frequency-heterogeneous systems must satisfy. Then, several mathematical techniques for choosing the minimum number of channels over which control information is transmitted are reviewed. Next, a model of a control messaging scheme is introduced that uses the minimum number of channels and exploits the characteristics of frequency-heterogeneous systems. Finally, several control messaging schemes are compared in terms of transmission efficiency.
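Choosing a minimum set of control channels that reaches every frequency-heterogeneous user is an instance of set cover; one of the standard approximations the thesis's problem admits is the greedy heuristic sketched below. The data structures are hypothetical; the thesis reviews several mathematical techniques, not necessarily this one.

```python
def control_channels(user_channels):
    """Greedy set cover: repeatedly pick the channel available to the
    most still-uncovered users, until every user can hear at least
    one chosen control channel. user_channels[u] is the set of
    channels available to secondary user u."""
    uncovered = set(range(len(user_channels)))
    chosen = []
    while uncovered:
        candidates = {c for u in uncovered for c in user_channels[u]}
        best = max(candidates,
                   key=lambda c: sum(c in user_channels[u] for u in uncovered))
        chosen.append(best)
        uncovered = {u for u in uncovered if best not in user_channels[u]}
    return chosen
```

Greedy set cover is not optimal in general, but its logarithmic approximation guarantee makes it a common baseline when exact minimization is too costly.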
Abstract:
In 2000 the European Statistical Office published the guidelines for developing the Harmonized European Time Use Surveys system. Under such a unified framework, the first Time Use Survey of national scope was conducted in Spain during 2002–03. The aim of these surveys is to understand human behavior and the lifestyle of people. Time allocation data are of compositional nature in origin, that is, they are subject to non-negativity and constant-sum constraints. Thus, standard multivariate techniques cannot be directly applied to analyze them. The goal of this work is to identify homogeneous Spanish Autonomous Communities with regard to the typical activity pattern of their respective populations. To this end, a fuzzy clustering approach is followed. Rather than the hard partitioning of classical clustering, where objects are allocated to only a single group, fuzzy methods identify overlapping groups of objects by allowing them to belong to more than one group. Concretely, the probabilistic fuzzy c-means algorithm is conveniently adapted to deal with the Spanish Time Use Survey microdata. As a result, a map distinguishing Autonomous Communities with similar activity patterns is drawn.
Key words: Time use data; Fuzzy clustering; FCM; simplex space; Aitchison distance
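The membership update at the heart of probabilistic fuzzy c-means, for one observation given its distances to the c cluster centres, can be sketched in a few lines. This is a generic FCM formula; the adaptation described above would plug the Aitchison distance in as the distance measure.

```python
def fcm_memberships(distances, m=2.0):
    """Membership degrees of one object given its distances to the
    cluster centres: u_k = 1 / sum_j (d_k / d_j)^(2/(m-1)),
    with fuzzifier m > 1."""
    p = 2.0 / (m - 1.0)
    return [1.0 / sum((dk / dj) ** p for dj in distances)
            for dk in distances]
```

The memberships always sum to one, which is what "probabilistic" refers to here: each Autonomous Community's activity pattern is spread across clusters rather than forced into a single group.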