952 results for signal analysis


Relevance: 30.00%

Abstract:

Abstract Background The search for enriched (also called over-represented or enhanced) ontology terms in a list of genes obtained from microarray experiments is becoming a standard procedure for system-level analysis. This procedure summarizes the information by focusing on classification schemes such as Gene Ontology and KEGG pathways, instead of on individual genes. Although it is well known in statistics that association and significance are distinct concepts, only the latter approach has been used to deal with the ontology term enrichment problem. Results BayGO implements a Bayesian approach to search for enriched terms from microarray data. The R source code is freely available at http://blasto.iq.usp.br/~tkoide/BayGO in three versions: Linux, which can be easily incorporated into pre-existing pipelines; Windows, to be controlled interactively; and a web tool. The software was validated using a bacterial heat shock response dataset, since this stress triggers known system-level responses. Conclusion The Bayesian model accounts for the fact that not all the genes from a given category are necessarily observable in microarray data, owing to low signal intensity, quality filters, genes that were not spotted, and so on. Moreover, BayGO allows one to measure the statistical association between generic ontology terms and differential expression, instead of working only with the usual significance analysis.
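The distinction the abstract draws between significance and association can be illustrated on a 2x2 gene/term contingency table. The sketch below is illustrative only (BayGO itself uses a Bayesian model, not this frequentist test): it contrasts the hypergeometric enrichment p-value (significance) with the sample odds ratio (association), using invented counts.

```python
from math import comb

def term_enrichment(k, n, K, N):
    """Contrast significance and association for one ontology term.
    k: differentially expressed (DE) genes annotated with the term
    n: total DE genes; K: genes on the array with the term; N: total genes."""
    # Significance: hypergeometric upper-tail p-value (the classic enrichment test)
    p = sum(comb(K, i) * comb(N - K, n - i)
            for i in range(k, min(n, K) + 1)) / comb(N, n)
    # Association: sample odds ratio of the 2x2 table (DE/non-DE vs term/no-term)
    a, b = k, n - k
    c, d = K - k, (N - K) - (n - k)
    odds = (a * d) / (b * c) if b * c else float("inf")
    return p, odds

p_val, odds = term_enrichment(8, 50, 40, 1000)
```

With small categories the two measures can disagree, which is the point the abstract makes: a term can show a strong odds ratio yet a non-significant p-value, and vice versa.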

Relevance: 30.00%

Abstract:

Abstract Background The ongoing efforts to sequence the honey bee genome require additional initiatives to define its transcriptome. Towards this end, we employed the Open Reading frame ESTs (ORESTES) strategy to generate profiles for the life cycle of Apis mellifera workers. Results Of the 5,021 ORESTES, 35.2% matched previously deposited Apis ESTs. The analysis of the remaining sequences defined a set of putative orthologs, most of which had their best-match hits with Anopheles and Drosophila genes. CAP3 assembly of the Apis ORESTES with the already existing 15,500 Apis ESTs generated 3,408 contigs. BLASTX comparison of these contigs with protein sets of organisms representing distinct phylogenetic clades revealed a total of 1,629 contigs that Apis mellifera shares with different taxa. Most (41%) represent genes that are common to all taxa, another 21% are shared among metazoans (Bilateria), and 16% are shared only within the Insecta clade. A set of 23 putative genes presented a best match with human genes, many of which encode factors related to cell signaling/signal transduction. 1,779 contigs (52%) did not match any known sequence. Applying a correction factor deduced from a parallel analysis performed with Drosophila melanogaster ORESTES, we estimate that approximately half of these no-match contigs (22%) should represent Apis-specific genes. Conclusions The versatile and cost-efficient ORESTES approach produced minilibraries for honey bee life cycle stages. Such information on central gene regions contributes to genome annotation and also lends itself to cross-transcriptome comparisons that reveal evolutionary trends in insect genomes.
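The breakdown of contigs into sharing classes can be sketched as a set comparison. The taxon names and clade definitions below are hypothetical placeholders, not the actual species panel used in the study; the nesting of Insecta inside Bilateria mirrors the abstract's categories.

```python
def sharing_class(taxa, all_taxa, bilateria, insecta):
    """Assign a contig to a sharing class given the set of taxa with BLASTX hits."""
    if not taxa:
        return "no-match"
    if taxa == all_taxa:
        return "shared-by-all"
    if taxa <= insecta:          # hits confined to the Insecta clade
        return "Insecta-only"
    if taxa <= bilateria:        # hits confined to Bilateria
        return "Bilateria"
    return "other"

# Hypothetical panel: yeast is the outgroup, Insecta is nested inside Bilateria
ALL = {"yeast", "worm", "human", "fly", "mosquito"}
BILATERIA = {"worm", "human", "fly", "mosquito"}
INSECTA = {"fly", "mosquito"}

labels = [sharing_class(t, ALL, BILATERIA, INSECTA)
          for t in (ALL, {"fly", "mosquito"}, {"worm", "human", "fly"}, set())]
```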

Relevance: 30.00%

Abstract:

Abstract Background Sugarcane is an increasingly important C4 grass, economically and environmentally, used for the production of sugar and bioethanol, a low-carbon-emission fuel. Sugarcane originated from crosses of Saccharum species and is noted for its unique capacity to accumulate high amounts of sucrose in its stems. Environmental stresses severely limit sugarcane productivity worldwide. To investigate transcriptome changes in response to environmental inputs that alter yield, we used cDNA microarrays to profile the expression of 1,545 genes in plants subjected to drought, phosphate starvation, herbivory and N2-fixing endophytic bacteria. We also investigated the response to phytohormones (abscisic acid and methyl jasmonate). The arrayed elements correspond mostly to genes involved in signal transduction, hormone biosynthesis, transcription factors, novel genes and genes encoding unknown proteins. Results Using an outlier-search method, 179 genes with strikingly different expression levels were identified as differentially expressed in at least one of the treatments analysed. Self-Organizing Maps were used to cluster the expression profiles of 695 genes that showed a highly correlated expression pattern among replicates. The expression data for 22 genes were evaluated at 36 experimental data points by quantitative RT-PCR, indicating a validation rate of 80.5% using three biological replicates. The SUCAST Database was created to provide public access to the data described in this work, linked to tissue expression profiling and to the SUCAST gene categories and sequence analysis. The SUCAST database also includes a categorization of the sugarcane kinome based on a phylogenetic grouping that included 182 undefined kinases. Conclusion An extensive study of the sugarcane transcriptome was performed. Sugarcane genes responsive to phytohormones and to challenges sugarcane commonly faces in the field were identified.
Additionally, the protein kinases were annotated based on a phylogenetic approach. The experimental design and statistical analysis applied proved robust for unravelling genes associated with a diverse array of conditions, attributing novel functions to previously unknown or undefined genes. The data consolidated in the SUCAST database can guide further studies and be useful for the development of improved sugarcane varieties.
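An outlier-search step of the kind described can be sketched with a robust z-score per treatment. The MAD-based scale and the threshold of 3 are illustrative choices, not the study's exact method, and the gene names and values below are invented.

```python
import statistics

def outlier_genes(expression, k=3.0):
    """Flag genes whose value in any treatment lies more than k robust standard
    deviations (1.4826 * MAD) from that treatment's median."""
    n_treat = len(next(iter(expression.values())))
    flagged = set()
    for t in range(n_treat):
        col = [v[t] for v in expression.values()]
        med = statistics.median(col)
        mad = statistics.median([abs(x - med) for x in col]) or 1e-9
        for gene, v in expression.items():
            if abs(v[t] - med) / (1.4826 * mad) > k:
                flagged.add(gene)
    return flagged

# Invented log-ratio profiles over two treatments; one gene responds strongly
profiles = {"g1": [0.10, 0.0], "g2": [-0.10, 0.1], "g3": [0.20, -0.1],
            "g4": [-0.20, 0.0], "g5": [0.00, 0.1], "g6": [0.05, -0.1],
            "drought_responsive": [5.00, 0.0]}
hits = outlier_genes(profiles)
```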

Relevance: 30.00%

Abstract:

Membrane proteins are a large and important class of proteins. They are responsible for several key functions in a living cell, e.g. transport of nutrients and ions, cell-cell signaling, and cell-cell adhesion. Despite their importance, it has not been possible to study their structure and organization in much detail because of the difficulty of obtaining 3D structures. In this thesis, theoretical studies of membrane protein sequences and structures have been carried out by analyzing existing experimental data. The data come from several sources, including sequence databases, genome sequencing projects, and 3D structures. Prediction of the membrane-spanning regions by hydrophobicity analysis is a key technique used in several of the studies. A novel method for this is also presented and compared to other methods. The primary questions addressed in the thesis are: What properties are common to all membrane proteins? What is the overall architecture of a membrane protein? What properties govern the integration into the membrane? How many membrane proteins are there and how are they distributed in different organisms? Several of the findings have since been backed up by experiments. An analysis of the large family of G-protein coupled receptors pinpoints differences in length and amino acid composition of loops between proteins with and without a signal peptide, and also differences between extra- and intracellular loops. Known 3D structures of membrane proteins have been studied in terms of hydrophobicity, distribution of secondary structure and amino acid types, position-specific residue variability, and differences between loops and membrane-spanning regions. An analysis of several fully and partially sequenced genomes from eukaryotes, prokaryotes, and archaea has been carried out.
Several differences in membrane protein content between organisms were found, the most important being the total number of membrane proteins and the distribution of membrane proteins with a given number of transmembrane segments. Of the properties that were found to be similar in all organisms, the most obvious is the bias in the distribution of positive charges between the extra- and intracellular loops. Finally, an analysis of homologues of membrane proteins with known topology uncovered two related, multi-spanning proteins with opposite predicted orientations. The predicted topologies were verified experimentally, providing a first example of "divergent topology evolution".
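Hydrophobicity analysis of the kind used throughout the thesis can be sketched with the standard Kyte-Doolittle scale and a sliding window. The 19-residue window and the conventional ~1.6 threshold are textbook choices, not necessarily those of the novel method mentioned above, and the toy sequence is invented.

```python
# Kyte-Doolittle hydropathy scale (Kyte & Doolittle, 1982)
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def hydropathy_profile(seq, window=19):
    """Mean hydropathy over a sliding window; sustained peaks above roughly
    1.6 are candidate transmembrane helices in the classic scheme."""
    half = window // 2
    return [sum(KD[a] for a in seq[i - half:i + half + 1]) / window
            for i in range(half, len(seq) - half)]

# Toy sequence: a 19-residue leucine stretch flanked by charged aspartates
seq = "D" * 10 + "L" * 19 + "D" * 10
profile = hydropathy_profile(seq)
```

The profile peaks over the hydrophobic stretch and goes negative in the charged flanks, which is exactly the signal a membrane-spanning-region predictor thresholds on.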

Relevance: 30.00%

Abstract:

In this thesis, some multivariate spectroscopic methods for the analysis of solutions are proposed. Spectroscopy and multivariate data analysis form a powerful combination for obtaining both quantitative and qualitative information, and it is shown how spectroscopic techniques combined with chemometric data evaluation can yield rapid, simple and efficient analytical methods. These methods, consisting of spectroscopic analysis, a high level of automation and chemometric data evaluation, can lead to a high analytical capacity, for which the term high-capacity analysis (HCA) is suggested. It is further shown how chemometric evaluation of the multivariate data in chromatographic analyses decreases the need for baseline separation. The thesis is based on six papers, and the chemometric tools used are experimental design, principal component analysis (PCA), soft independent modelling of class analogy (SIMCA), partial least squares regression (PLS) and parallel factor analysis (PARAFAC). The analytical techniques utilised are scanning ultraviolet-visible (UV-Vis) spectroscopy, diode array detection (DAD) used in non-column chromatographic diode array UV spectroscopy, high-performance liquid chromatography with diode array detection (HPLC-DAD) and fluorescence spectroscopy. The proposed methods are exemplified in the analysis of pharmaceutical solutions and serum proteins. In Paper I, a method is proposed for determining the content and identity of the active compound in pharmaceutical solutions by means of UV-Vis spectroscopy, orthogonal signal correction and multivariate calibration with PLS and SIMCA classification. Paper II proposes a new method for the rapid determination of pharmaceutical solutions by the use of non-column chromatographic diode array UV spectroscopy, i.e. a conventional HPLC-DAD system without any chromatographic column connected.
In Paper III, the ability of a control sample of known content and identity to diagnose and correct errors in multivariate predictions is investigated; together with the use of multivariate residuals, this can make it possible to use the same calibration model over time. In Paper IV, a method is proposed for the simultaneous determination of serum proteins with fluorescence spectroscopy and multivariate calibration. Paper V proposes a method for determining chromatographic peak purity by means of PCA of HPLC-DAD data. In Paper VI, PARAFAC is applied to decompose DAD data of some partially separated peaks into the pure chromatographic, spectral and concentration profiles.
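The idea behind a Paper V-style peak-purity check can be sketched on simulated diode-array data: PCA (here via the SVD) of the absorbance matrix reveals how many independent spectral components hide under a peak cluster. All profiles, noise levels and the rank threshold below are invented for illustration, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 80)      # chromatographic time axis
w = np.linspace(0.0, 1.0, 60)      # wavelength axis

# Two partially separated peaks (invented profiles) with distinct UV spectra
conc = np.stack([np.exp(-((t - 0.45) / 0.08) ** 2),
                 np.exp(-((t - 0.60) / 0.08) ** 2)])   # concentration profiles
spec = np.stack([np.exp(-((w - 0.30) / 0.15) ** 2),
                 np.exp(-((w - 0.70) / 0.15) ** 2)])   # pure spectra
D = conc.T @ spec + 1e-3 * rng.standard_normal((t.size, w.size))

# PCA via the SVD: singular values that stand clear of the noise floor count
# the independent chemical components under the peak cluster.
sv = np.linalg.svd(D, compute_uv=False)
n_components = int(np.sum(sv > 10.0 * sv[2:].mean()))
```

A "pure" peak would leave a single singular value above the noise floor; a co-eluting impurity adds a second, which is the chemometric signal that baseline separation failed.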

Relevance: 30.00%

Abstract:

In the present study we use multivariate analysis techniques to discriminate signal from background in the fully hadronic decay channel of ttbar events. We give a brief introduction to the role of the top quark in the Standard Model and a general description of the CMS experiment at the LHC. We used the CMS computing and software infrastructure to generate and prepare the data samples used in this analysis. We tested the performance of three different classifiers applied to our data samples and used the selection obtained with the Multi-Layer Perceptron classifier to estimate the statistical and systematic uncertainty on the cross-section measurement.
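The role of the Multi-Layer Perceptron can be illustrated on a minimal two-feature toy problem. The network below is a from-scratch sketch, not the classifier implementation used in the analysis, and the Gaussian "signal" and "background" samples stand in for real kinematic event features.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy stand-in for kinematic event features: signal and background Gaussians
X = np.vstack([rng.normal(+1.0, 1.0, (500, 2)),    # signal events
               rng.normal(-1.0, 1.0, (500, 2))])   # background events
y = np.concatenate([np.ones(500), np.zeros(500)])

# Minimal MLP (2 -> 8 -> 1), sigmoid output, batch gradient descent on cross-entropy
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.2
for _ in range(500):
    h = np.tanh(X @ W1 + b1)                         # hidden activations
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)[:, 0]))   # P(signal | x)
    g = (p - y) / len(y)                             # dLoss/dlogit, averaged
    gh = g[:, None] @ W2.T * (1.0 - h ** 2)          # backprop through tanh
    W2 -= lr * (h.T @ g[:, None]); b2 -= lr * g.sum(keepdims=True)
    W1 -= lr * (X.T @ gh);         b1 -= lr * gh.sum(axis=0)

h = np.tanh(X @ W1 + b1)
p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)[:, 0]))
accuracy = float(((p > 0.5) == (y == 1)).mean())     # training accuracy of the sketch
```

In a real analysis the network output would be cut on to define the selection, and the cut would be optimized against the expected statistical and systematic uncertainties.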

Relevance: 30.00%

Abstract:

Every seismic event produces seismic waves which travel throughout the Earth. Seismology is the science of interpreting measurements of these waves to derive information about the structure of the Earth. Seismic tomography is the most powerful tool for determining the 3D structure of the Earth's deep interior. Tomographic models obtained at the global and regional scales are a fundamental tool for determining the geodynamical state of the Earth, showing clear correlations with other geophysical and geological characteristics. Global tomographic images of the Earth can be written as linear combinations of basis functions from a specifically chosen set, defining the model parameterization. A number of different parameterizations are commonly seen in the literature: seismic velocities in the Earth have been expressed, for example, as combinations of spherical harmonics or by means of the simpler characteristic functions of discrete cells. In this work we focus on this aspect, evaluating a new type of parameterization based on wavelet functions. It is known from classical Fourier theory that a signal can be expressed as the sum of a, possibly infinite, series of sines and cosines. This sum is often referred to as a Fourier expansion. The big disadvantage of a Fourier expansion is that it has only frequency resolution and no time resolution. Wavelet analysis (or the wavelet transform) is probably the most recent solution to overcome this shortcoming of Fourier analysis. The fundamental idea behind this analysis is to study a signal according to scale. Wavelets are mathematical functions that cut up data into different frequency components and then study each component with a resolution matched to its scale, so they are especially useful in the analysis of non-stationary processes that contain multi-scale features, discontinuities and sharp spikes.
Wavelets are essentially used in two ways when applied to the study of geophysical processes or signals: 1) as a basis for the representation or characterization of a process; 2) as an integration kernel to extract information about the process. These two types of application of wavelets in geophysics are the object of study of this work. First, we use wavelets as a basis to represent and solve the tomographic inverse problem. After a brief introduction to seismic tomography theory, we assess the power of wavelet analysis in the representation of two different types of synthetic models; we then apply it to real data, obtaining surface wave phase velocity maps and evaluating its abilities by comparison with another type of parameterization (i.e., block parameterization). For the second type of wavelet application, we analyze the ability of the continuous wavelet transform in spectral analysis, starting again with some synthetic tests to evaluate its sensitivity and capability, and then applying the same analysis to real data to obtain local correlation maps between different models at the same depth or between different profiles of the same model.
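The appeal of a wavelet parameterization (localization in both position and scale) can be seen in a small sketch: a piecewise-constant profile, a crude stand-in for a blocky velocity anomaly, becomes very sparse in the orthonormal Haar basis. The transform below is a generic textbook construction, not the parameterization developed in the thesis.

```python
import numpy as np

def haar_dwt(x):
    """Full multi-level orthonormal Haar transform of a length-2**k signal."""
    x = np.asarray(x, dtype=float)
    details = []
    while x.size > 1:
        details.append((x[0::2] - x[1::2]) / np.sqrt(2))  # detail (high-pass)
        x = (x[0::2] + x[1::2]) / np.sqrt(2)              # approximation (low-pass)
    details.append(x)                                     # final approximation
    return details

# A piecewise-constant profile: two constant blocks of length 8
signal = np.array([2.0] * 8 + [5.0] * 8)
coeffs = np.concatenate(haar_dwt(signal))
n_nonzero = int(np.sum(np.abs(coeffs) > 1e-9))   # sparsity in the Haar basis
```

Sixteen samples collapse to two non-zero coefficients, which is why such a basis can represent blocky structure with far fewer model parameters than a cell-by-cell description; a Fourier expansion of the same profile would spread energy over many coefficients.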

Relevance: 30.00%

Abstract:

Sigma (σ) receptors are well established as a non-opioid, non-phencyclidine, haloperidol-sensitive receptor family with its own binding profile and a characteristic distribution in the central nervous system (CNS) as well as in endocrine, immune, and some peripheral tissues. Two σ receptor subtypes, termed σ1 and σ2, have been pharmacologically characterized, but, to date, only the σ1 has been cloned. Activation of σ1 receptors alters several neurotransmitter systems, and dopamine (DA) neurotransmission has often been shown to constitute an important target of σ receptors in different experimental models; however, the exact role of the σ1 receptor in dopaminergic neurotransmission remains unclear. The DA transporter (DAT) modulates the spatial and temporal aspects of dopaminergic synaptic transmission and represents the primary mechanism by which dopaminergic neurons terminate signal transmission. For this reason, the present studies focused on understanding whether, in cell models, the human subtype of the σ1 (hσ1) receptor is able to directly modulate the human DA transporter (hDAT). In the first part of this thesis, HEK-293 and SH-SY5Y cells were stably transfected with the hσ1 receptor. Subsequently, they were transfected with another plasmid to transiently express the hDAT. The hDAT activity was estimated using a [3H]DA uptake assay, and the effects of σ ligands were evaluated by measuring the [3H]DA taken up after treating the cells with known σ agonists and antagonists. The results illustrated in this thesis demonstrate that activation of overexpressed hσ1 receptors by (+)-pentazocine, the prototypical σ1 agonist, produces a 40% increase in extracellular [3H]DA uptake in comparison to non-treated controls, and that the σ1 antagonists BD-1047 and NE-100 prevent the positive effect of (+)-pentazocine on DA reuptake. DA itself can be considered a neurotoxic molecule.
In fact, when intracellular DA levels rise abnormally, vesicles cannot sequester all the DA, which is then metabolized by MAO (A and B) and COMT, with the consequent overproduction of reactive oxygen species and toxic catabolites. The stress induced by these molecules leads cells to death. Thus, for the second part of this thesis, experiments were performed to investigate the functional alterations caused by the (+)-pentazocine-mediated increase of DA uptake; in particular, whether the increase of intracellular [DA] could affect cell viability. The results of this study demonstrate that (+)-pentazocine alone increases DA cell toxicity in a concentration-dependent manner only in cells co-expressing hσ1 and hDAT, and that σ1 antagonists are able to revert the (+)-pentazocine-induced increase of cell susceptibility to DA toxicity. In the last part of this thesis, the functional cross-talk between the hσ1 receptor and hDAT was further investigated using confocal microscopy. The acquired data suggest that, following exposure to (+)-pentazocine, hσ1 receptors massively translocate towards the plasma membrane and colocalize with hDATs; however, any physical interaction between the two proteins remains to be proved. In conclusion, the presented study shows for the first time that, in cell models, hσ1 receptors directly modulate hDAT activity. The facilitation of DA uptake induced by (+)-pentazocine is reflected in an increased cell susceptibility to DA toxicity, and these effects are prevented by selective σ1 antagonists. Since numerous compounds, including several drugs of abuse, bind to σ1 receptors, and their activation could facilitate damage to dopaminergic neurons, the protective effect shown by σ1 antagonists provides the pharmacological basis for testing these compounds in experimental models of dopaminergic neurodegenerative diseases (e.g. Parkinson's disease).

Relevance: 30.00%

Abstract:

This thesis introduces new processing techniques for the computer-aided interpretation of ultrasound images, with the purpose of supporting medical diagnosis. In terms of practical application, the goal of this work is the improvement of current prostate biopsy protocols by providing physicians with a visual map, overlaid on ultrasound images, marking regions potentially affected by disease. As far as analysis techniques are concerned, the main contribution of this work to the state of the art is the introduction of deconvolution as a pre-processing step in the standard ultrasonic tissue characterization procedure, to improve the diagnostic significance of ultrasonic features. This thesis also includes some innovations in ultrasound modeling, in particular the employment of a continuous-time autoregressive moving-average (CARMA) model for ultrasound signals, a new maximum-likelihood CARMA estimator based on exponential splines, and the definition of CARMA parameters as new ultrasonic features able to capture scatterer concentration. Finally, concerning the clinical usefulness of the developed techniques, the main contribution of this research is showing, through a study based on medical ground truth, that a reduction in the number of sampled cores in standard prostate biopsy is possible while preserving the diagnostic power of the current clinical protocol.
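Deconvolution as a pre-processing step can be sketched with a classical Wiener filter on a synthetic radio-frequency trace. The pulse shape, noise level and regularization constant below are invented, and the thesis's own approach (built around CARMA modeling) is more sophisticated than this sketch.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 256
# Synthetic tissue reflectivity (what we want to recover): three point scatterers
reflectivity = np.zeros(n)
reflectivity[[40, 90, 160]] = [1.0, -0.7, 0.5]

# Invented modulated ultrasound pulse, centred so the convolution adds no delay
t = np.arange(-16, 16)
pulse = np.exp(-(t / 4.0) ** 2) * np.cos(2 * np.pi * t / 6.0)
psf = np.zeros(n)
psf[:32] = pulse
psf = np.roll(psf, -16)                       # put the pulse centre at sample 0

rf = np.real(np.fft.ifft(np.fft.fft(reflectivity) * np.fft.fft(psf)))
rf += 0.01 * rng.standard_normal(n)           # measured radio-frequency trace

# Wiener deconvolution: regularized spectral division
H = np.fft.fft(psf)
G = np.conj(H) / (np.abs(H) ** 2 + 1e-2)      # 1e-2 plays the role of noise/signal
estimate = np.real(np.fft.ifft(np.fft.fft(rf) * G))
```

The estimate sharpens the smeared echoes back towards point reflectors, which is the sense in which deconvolution improves the diagnostic significance of features computed afterwards.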

Relevance: 30.00%

Abstract:

Monitoring foetal health is a very important task in clinical practice, necessary to plan pregnancy management and delivery appropriately. In the third trimester of pregnancy, ultrasound cardiotocography is the most commonly employed diagnostic technique: foetal heart rate and uterine contraction signals are simultaneously recorded and analysed in order to ascertain foetal health. Because ultrasound cardiotocography interpretation still lacks complete reliability, new parameters and methods of interpretation, or alternative methodologies, are necessary to further support physicians' decisions. To this aim, in this thesis, foetal phonocardiography and electrocardiography are considered as alternative techniques. Further, the variability of foetal heart rate is thoroughly studied. Frequency components and their modifications can be analysed by applying a time-frequency approach, for a distinct understanding of the spectral components and their change over time related to foetal reactions to internal and external stimuli (such as uterine contractions). Such modifications of the power spectrum can be a sign of autonomic nervous system reactions and therefore represent additional, objective information about foetal reactivity and health. However, some limits of ultrasonic cardiotocography remain, such as in long-term foetal surveillance, which is often recommended mainly in high-risk pregnancies. In these cases, the fully non-invasive acoustic recording through the maternal abdomen, foetal phonocardiography, represents a valuable alternative to ultrasonic cardiotocography. Unfortunately, the recorded foetal heart sound signal is heavily corrupted by noise, so the determination of the foetal heart rate raises serious signal processing issues. A new algorithm for foetal heart rate estimation from foetal phonocardiographic recordings is presented in this thesis.
Different filtering and enhancement techniques for emphasising the first foetal heart sounds were implemented, evaluated and compared, identifying the strategy characterized, on average, by the best results. In particular, phonocardiographic signals were recorded simultaneously with ultrasonic cardiotocographic signals in order to compare the two foetal heart rate series (the one estimated by the developed algorithm and the one provided by the cardiotocographic device). The algorithm's performance was tested on phonocardiographic signals recorded from pregnant women, yielding reliable foetal heart rate signals very close to the ultrasound cardiotocographic recordings, which were considered as the reference. The algorithm was also tested using a foetal phonocardiographic recording simulator developed and presented in this thesis. The aim was to provide software for simulating recordings corresponding to different foetal conditions and recording situations, and to use it as a test tool for comparing and assessing different foetal heart rate extraction algorithms. Since there are few studies on the time characteristics and frequency content of foetal heart sounds, and the available literature in this area is sparse and not rigorous, a data collection pilot study was also conducted with the purpose of specifically characterising both foetal and maternal heart sounds. Finally, in this thesis, the use of foetal phonocardiographic and electrocardiographic methodologies, and their combination, is presented in order to detect foetal heart rate and other functional anomalies. The developed methodologies, suitable for longer-term assessment, were able to detect heart beat events, such as first and second heart sounds and QRS waves, correctly. The detection of such events provides reliable measures of foetal heart rate and, potentially, information about the systolic time intervals and foetal circulatory impedance.
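The core of such an estimation algorithm can be sketched as envelope extraction followed by an autocorrelation search for the beat period. The synthetic "S1-like" burst, the noise level and the 100-200 bpm search range below are invented for illustration and are far simpler than a real foetal phonocardiogram.

```python
import numpy as np

fs = 1000.0                          # sampling rate, Hz (invented)
true_hr = 140.0                      # a plausible foetal heart rate, bpm
period = int(fs * 60.0 / true_hr)    # samples per beat
rng = np.random.default_rng(3)

# Crude synthetic phonocardiogram: a short S1-like burst at every beat, plus noise
burst = np.exp(-np.arange(60) / 10.0) * np.sin(2 * np.pi * 0.05 * np.arange(60))
pcg = np.zeros(10 * period)
for k in range(10):
    pcg[k * period:k * period + burst.size] += burst
pcg += 0.2 * rng.standard_normal(pcg.size)

# Step 1: envelope extraction (rectify + moving average)
env = np.convolve(np.abs(pcg), np.ones(25) / 25, mode="same")
env -= env.mean()

# Step 2: autocorrelation peak inside a physiological 100-200 bpm search range
ac = np.correlate(env, env, mode="full")[env.size - 1:]
lo, hi = int(fs * 60 / 200), int(fs * 60 / 100)
lag = lo + int(np.argmax(ac[lo:hi]))
hr_estimate = 60.0 * fs / lag        # estimated foetal heart rate, bpm
```

Constraining the lag search to a physiological range is what keeps the estimator from locking onto noise or onto harmonics of the true beat period.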

Relevance: 30.00%

Abstract:

Sequence-specific biomolecular analysis methods are proving extremely useful, particularly in view of the Human Genome Project, for the detection of single nucleotide polymorphisms (SNPs) and for the identification of genes. Owing to the large number of base pairs to be analysed, sensitive and efficient screening methods are needed that are capable of processing DNA samples in a suitable manner. Most detection schemes rely on the interaction of an immobilized probe and the corresponding target with the surface. The analysis of the kinetic behaviour of the oligonucleotides at the sensor surface is consequently of the utmost importance for improving known detection schemes. Recently, surface plasmon field-enhanced fluorescence spectroscopy (SPFS) has been developed. It constitutes a kinetic analysis and detection method that operates with a dual read-out of the interfacial phenomena, i.e. the change in reflectivity and the fluorescence signal. Using SPFS, kinetic measurements can be carried out for the hybridization at the sensor surface between peptide nucleic acid (PNA), a synthetic nucleic acid that mimics DNA and forms a more stable double helix, and DNA. By means of single, global and titration experiments with both a fully complementary sequence and a mismatch sequence, the rate constants for the binding reaction of oligomer DNA targets and PCR targets to the PNA can be determined on the basis of the Langmuir model. In addition, the influence of ionic strength and temperature on PNA/DNA hybridization was characterized in a kinetic analysis.
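The Langmuir model underlying the rate-constant analysis can be written down directly. The rate constants and target concentration below are illustrative round numbers, not values measured in the work.

```python
import numpy as np

# Langmuir kinetics for probe-target hybridization at the sensor surface:
#   d(theta)/dt = k_on * c * (1 - theta) - k_off * theta
# with the closed-form solution, for constant target concentration c,
#   theta(t) = theta_eq * (1 - exp(-(k_on * c + k_off) * t))
k_on, k_off = 2.0e5, 1.0e-3     # association (1/(M*s)) and dissociation (1/s) rates
c = 1.0e-7                      # target concentration in M; all values illustrative

theta_eq = k_on * c / (k_on * c + k_off)    # equilibrium surface coverage
t = np.linspace(0.0, 300.0, 301)            # seconds
theta = theta_eq * (1.0 - np.exp(-(k_on * c + k_off) * t))
K_D = k_off / k_on                          # equilibrium dissociation constant
```

Fitting the observed exponential rate k_on * c + k_off at several concentrations (a titration experiment) separates the two rate constants, which is exactly how the global analysis extracts k_on and k_off from the SPFS signal.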

Relevance: 30.00%

Abstract:

The purpose of this thesis is to investigate the strength and structure of the magnetized medium surrounding radio galaxies via observations of the Faraday effect. This study is based on an analysis of the polarization properties of radio galaxies selected to have a range of morphologies (elongated tails, or lobes with small axial ratios) and to be located in a variety of environments (from rich cluster core to small group). The targets include famous objects like M84 and M87. A key aspect of this work is the combination of accurate radio imaging with high-quality X-ray data for the gas surrounding the sources. Although the focus of this thesis is primarily observational, I developed analytical models and performed two- and three-dimensional numerical simulations of magnetic fields. The steps of the thesis are: (a) to analyze new and archival observations of Faraday rotation measure (RM) across radio galaxies and (b) to interpret these and existing RM images using two- and three-dimensional Monte Carlo simulations. The approach has been to select a few bright, very extended and highly polarized radio galaxies. This is essential to have high signal-to-noise in polarization over large enough areas to allow the computation of spatial statistics such as the structure function (and hence the power spectrum) of the rotation measure, which requires a large number of independent measurements. New and archival Very Large Array observations of the target sources have been analyzed in combination with high-quality X-ray data from the Chandra, XMM-Newton and ROSAT satellites. The work has been carried out by making use of: 1) Analytical predictions of the RM structure functions to quantify the RM statistics and to constrain the power spectra of the RM and magnetic field. 2) Two-dimensional Monte Carlo simulations to address the effect of an incomplete sampling of the RM distribution and so to determine errors for the power spectra.
3) Methods to combine measurements of RM and depolarization in order to constrain the magnetic-field power spectrum on small scales. 4) Three-dimensional models of the group/cluster environments, including different magnetic field power spectra and gas density distributions. This thesis has shown that the magnetized medium surrounding radio galaxies appears more complicated than was apparent from earlier work. Three distinct types of magnetic-field structure are identified: an isotropic component with large-scale fluctuations, plausibly associated with the intergalactic medium not affected by the presence of a radio source; a well-ordered field draped around the front ends of the radio lobes; and a field with small-scale fluctuations in rims of compressed gas surrounding the inner lobes, perhaps associated with a mixing layer.
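The structure-function statistic of step 1 can be sketched as follows. The white-noise RM screen is a toy input chosen so the expected answer is known (for uncorrelated pixels of unit variance, S(r) saturates at twice the variance), not a model of a real source.

```python
import numpy as np

def structure_function(rm, max_lag):
    """Second-order structure function S(r) = <[RM(x + r) - RM(x)]**2> of a 2D
    rotation-measure image, averaged over horizontal and vertical separations."""
    s = []
    for r in range(1, max_lag + 1):
        dx = rm[:, r:] - rm[:, :-r]
        dy = rm[r:, :] - rm[:-r, :]
        s.append((np.mean(dx ** 2) + np.mean(dy ** 2)) / 2.0)
    return np.array(s)

# Toy input with a known answer: an uncorrelated (white) RM screen
rng = np.random.default_rng(4)
rm_map = rng.standard_normal((128, 128))
S = structure_function(rm_map, 10)
```

On a real RM image, S(r) rises with separation up to the outer correlation scale of the field, and its slope constrains the magnetic-field power spectrum; the flat result here is the degenerate no-correlation limit.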

Relevance: 30.00%

Abstract:

The surface electrocardiogram (ECG) is an established diagnostic tool for detecting abnormalities in the electrical activity of the heart. Interest in the ECG, however, extends beyond diagnosis. In recent years, studies in cognitive psychophysiology have related heart rate variability (HRV) to memory performance and mental workload. The aim of this thesis was to analyze the variability of surface-ECG-derived rhythms at two different time scales: the discrete-event time scale, typical of beat-related features (Objective I), and the "continuous" time scale of separated sources in the ECG (Objective II), in selected scenarios relevant to psychophysiological and clinical research, respectively. Objective I) Joint time-frequency and non-linear analysis of HRV was carried out with the goal of assessing psychophysiological workload (PPW) in response to working-memory-engaging tasks. Results from fourteen healthy young subjects suggest the potential use of the proposed indices in discriminating PPW levels in response to varying memory-search task difficulty. Objective II) A novel source-cancellation method based on morphology clustering was proposed for the estimation of the atrial wavefront in atrial fibrillation (AF) from body surface potential maps. A strong direct correlation between the spectral concentration (SC) of the atrial wavefront and the temporal variability of the spectral distribution was shown in persistent AF patients, suggesting that with higher SC a shorter observation time is required to collect the spectral distribution from which the fibrillatory rate is estimated. This could be time- and cost-effective in clinical decision-making. The results held for reduced lead sets, suggesting that a simplified setup could also be considered, further reducing the costs. In designing the methods of this thesis, an online signal processing approach was adopted, with the goal of contributing to real-world applicability.
An algorithm for the automatic assessment of ambulatory ECG quality and an automatic ECG delineation algorithm were designed and validated.
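As one concrete example of a non-linear HRV descriptor of the kind used for Objective I, the Poincare-plot indices SD1/SD2 can be computed directly from an RR-interval series. The synthetic series below (a slow oscillation plus small beat-to-beat jitter) is invented for illustration; it is not data from the study.

```python
import numpy as np

def poincare_sd(rr):
    """Poincare-plot HRV descriptors: SD1 (short-term, beat-to-beat variability)
    and SD2 (long-term variability), in the same units as the RR series."""
    x, y = rr[:-1], rr[1:]
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)   # spread across the identity line
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)   # spread along the identity line
    return sd1, sd2

# Invented RR series (ms): slow oscillation dominates over beat-to-beat jitter
rng = np.random.default_rng(5)
n = 500
rr = (800.0 + 50.0 * np.sin(2 * np.pi * np.arange(n) / 60.0)
      + 5.0 * rng.standard_normal(n))
sd1, sd2 = poincare_sd(rr)
```

A shift in the SD1/SD2 balance between task conditions is the kind of signal such indices are designed to capture when discriminating workload levels.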

Relevance: 30.00%

Abstract:

This thesis presents several data processing and compression techniques capable of addressing the strict requirements of wireless sensor networks. After a general overview of sensor networks, the energy problem is introduced, dividing the different energy-reduction approaches according to the subsystem they try to optimize. To manage the complexity brought by these techniques, a quick overview of the most common middlewares for WSNs is given, describing in detail SPINE2, a framework for data processing in the node environment. The focus then shifts to in-network aggregation techniques, used to reduce the data sent by the network nodes in order to prolong the network lifetime as much as possible. Among the several techniques, the most promising approach is Compressive Sensing (CS). To investigate this technique, a practical implementation of the algorithm is compared against a simpler aggregation scheme, deriving a mixed algorithm able to successfully reduce the power consumption. The analysis then moves from compression implemented on single nodes to CS for signal ensembles, exploiting the correlations among sensors and nodes to improve compression and reconstruction quality. The two main techniques for signal ensembles, Distributed CS (DCS) and Kronecker CS (KCS), are introduced and compared against a common set of data gathered from real deployments. The best trade-off between reconstruction quality and power consumption is then investigated. The use of CS is also addressed when the signal of interest is sampled at a sub-Nyquist rate, evaluating the reconstruction performance. Finally, group-sparsity CS (GS-CS) is compared to another well-known technique for the reconstruction of signals from a highly sub-sampled version. These two frameworks are compared against a real data set, and an insightful analysis of the trade-off between reconstruction quality and lifetime is given.
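The reconstruction side of CS can be sketched with Orthogonal Matching Pursuit, a standard greedy solver. The thesis compares richer schemes (DCS, KCS, GS-CS); OMP is only a baseline illustration, and the problem sizes, sensing matrix and sparse signal below are all invented.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse x from y = A @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # atom most correlated with residual
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef          # re-fit, then update residual
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(6)
n, m, k = 256, 128, 5                                # ambient dim, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)         # random sensing matrix
x = np.zeros(n)
idx = rng.choice(n, size=k, replace=False)
x[idx] = rng.choice([-1.0, 1.0], size=k) * rng.uniform(1.0, 3.0, size=k)
y = A @ x                                            # noiseless compressive measurements
x_hat = omp(A, y, k)
```

The energy saving in a WSN setting comes from the left-hand side: each node transmits only the m compressed measurements y instead of the n raw samples, and the costly reconstruction runs at the sink.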