178 results for linear projector arrays
Abstract:
In cooperative communication networks, owing to the nodes' arbitrary geographical locations and individual oscillators, the system is fundamentally asynchronous. This damages some of the key properties of space-time codes and can lead to substantial performance degradation. In this paper, we study the design of linear dispersion codes (LDCs) for such asynchronous cooperative communication networks. First, the concept of conventional LDCs is extended to a delay-tolerant version and new design criteria are discussed. We then propose a new design method that yields delay-tolerant LDCs achieving the optimal Jensen's upper bound on ergodic capacity as well as minimum average pairwise error probability. The proposed design employs a stochastic gradient algorithm to approach a local optimum, and is further improved with simulated-annealing-type optimization to increase the likelihood of reaching the global optimum. The method allows for a flexible number of nodes, receive antennas and modulated symbols, and a flexible codeword length. Simulation results confirm the performance of the newly proposed delay-tolerant LDCs.
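The design procedure itself is not reproduced in the abstract, but the optimisation loop it refers to can be illustrated schematically. The Python sketch below runs a simulated-annealing search over a set of complex dispersion matrices, using a Monte Carlo ergodic-capacity surrogate as the objective. The matrix dimensions, the surrogate objective and the perturbation scheme are illustrative assumptions, and asynchronous delays are not modelled here.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_dispersion(Q, T, M):
    """Random complex dispersion matrices A_q (one T x M matrix per symbol)."""
    A = rng.normal(size=(Q, T, M)) + 1j * rng.normal(size=(Q, T, M))
    return A / np.sqrt(2 * T)                      # rough power normalisation

def surrogate_capacity(A, n_trials=200, snr=10.0):
    """Monte Carlo stand-in for the capacity/PEP criteria of the paper:
    average log det(I + SNR/M * (Xh)(Xh)^H) over random symbols and channels."""
    Q, T, M = A.shape
    total = 0.0
    for _ in range(n_trials):
        s = (rng.choice([-1, 1], Q) + 1j * rng.choice([-1, 1], Q)) / np.sqrt(2)
        X = np.tensordot(s, A, axes=1)             # T x M codeword
        h = (rng.normal(size=(M, 1)) + 1j * rng.normal(size=(M, 1))) / np.sqrt(2)
        y_cov = np.eye(T) + (snr / M) * (X @ h) @ (X @ h).conj().T
        total += np.log2(np.linalg.det(y_cov).real)
    return total / n_trials

def anneal(Q=4, T=4, M=2, iters=300, T0=1.0, cooling=0.99, step=0.05):
    """Simulated-annealing search over dispersion matrices (illustrative only)."""
    A = random_dispersion(Q, T, M)
    f = surrogate_capacity(A)
    temp = T0
    for _ in range(iters):
        cand = A + step * (rng.normal(size=A.shape) + 1j * rng.normal(size=A.shape))
        cand *= np.linalg.norm(A) / np.linalg.norm(cand)   # keep total power fixed
        f_cand = surrogate_capacity(cand)
        if f_cand > f or rng.random() < np.exp((f_cand - f) / temp):
            A, f = cand, f_cand                    # Metropolis acceptance rule
        temp *= cooling
    return A, f

best_A, best_obj = anneal()
print(f"surrogate objective after annealing: {best_obj:.3f}")
```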
Abstract:
We give a characterisation of the spectral properties of linear differential operators with constant coefficients, acting on functions defined on a bounded interval and determined by general linear boundary conditions. The boundary conditions may be such that the resulting operator is not self-adjoint. We associate the spectral properties of such an operator $S$ with the properties of the solution of a corresponding boundary value problem for the partial differential equation $\partial_t q \pm iSq = 0$. Namely, we establish an explicit correspondence between the properties of the family of eigenfunctions of the operator, in particular whether this family is a basis, and the existence and properties of the unique solution of the associated boundary value problem. When such a unique solution exists, we consider its representation as a complex contour integral, obtained using a transform method recently proposed by Fokas and one of the authors. The analyticity properties of the integrand in this representation are crucial for studying the spectral theory of the associated operator.
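As a purely illustrative companion to the abstract, the LaTeX schematic below shows the kind of operator, boundary value problem and contour-integral representation being referred to; the particular operator, the contour $\Gamma$, the dispersion relation $\omega(\lambda)$ and the spectral function $\rho(\lambda)$ are placeholders rather than the objects analysed in the paper.

```latex
% Illustrative constant-coefficient operator of order n with two-point boundary conditions
\[
  S\phi = \Bigl(-i\tfrac{\mathrm{d}}{\mathrm{d}x}\Bigr)^{n}\phi,\qquad x\in(0,1),\qquad
  \sum_{j=0}^{n-1}\bigl(\alpha_{kj}\,\phi^{(j)}(0)+\beta_{kj}\,\phi^{(j)}(1)\bigr)=0,\quad k=1,\dots,n,
\]
% which need not be self-adjoint. The associated initial-boundary value problem is
\[
  \partial_t q \pm i S q = 0,\qquad q(x,0)=q_0(x),
\]
% and, when a unique solution exists, a transform-method representation takes the schematic form
\[
  q(x,t)=\frac{1}{2\pi}\int_{\Gamma} e^{i\lambda x-\omega(\lambda)t}\,\rho(\lambda)\,\mathrm{d}\lambda,
\]
% whose integrand's analyticity in \lambda carries the spectral information about S.
```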
Abstract:
Background: The auditory brainstem response (ABR) is of fundamental importance to the investigation of auditory system behavior, though its interpretation has a subjective nature because of the manual process employed in its study and the clinical experience required for its analysis. When analyzing the ABR, clinicians are often interested in the identification of ABR signal components referred to as Jewett waves. In particular, the detection and study of the time at which these waves occur (i.e., the wave latency) is a practical tool for the diagnosis of disorders affecting the auditory system. In this context, the aim of this research is to compare ABR manual/visual analyses provided by different examiners. Methods: The ABR data were collected from 10 normal-hearing subjects (5 men and 5 women, aged 20 to 52 years). A total of 160 data samples were analyzed and a pairwise comparison between four distinct examiners was executed. We carried out a statistical study aiming to identify significant differences between assessments provided by the examiners, using linear regression in conjunction with the bootstrap as a method for evaluating the relation between the responses given by the examiners. Results: The analysis suggests agreement among examiners, but reveals differences between assessments of the variability of the waves. We quantified the magnitude of the obtained wave-latency differences: 18% of the investigated waves presented substantial (large or moderate) differences, and of these 3.79% were considered not acceptable for clinical practice. Conclusions: Our results characterize the variability of the manual analysis of ABR data and the necessity of establishing unified standards and protocols for the analysis of these data. These results may also contribute to the validation and development of automatic systems employed in the early diagnosis of hearing loss.
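To make the statistical procedure concrete, here is a minimal Python sketch of a pairwise examiner comparison by linear regression with a percentile bootstrap; the latency values, sample size and noise levels are synthetic placeholders and do not come from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical wave-latency readings (ms) from two examiners for the same ABR records.
examiner_a = rng.normal(5.6, 0.25, size=160)
examiner_b = examiner_a + rng.normal(0.0, 0.1, size=160)   # examiner B re-reads the same waves

def ols(x, y):
    """Slope and intercept of y = b0 + b1*x by least squares."""
    b1, b0 = np.polyfit(x, y, 1)
    return b0, b1

def bootstrap_ci(x, y, n_boot=2000, alpha=0.05):
    """Percentile bootstrap confidence intervals for the regression coefficients."""
    n = len(x)
    stats = np.empty((n_boot, 2))
    for i in range(n_boot):
        idx = rng.integers(0, n, n)                # resample records with replacement
        stats[i] = ols(x[idx], y[idx])
    lo, hi = 100 * alpha / 2, 100 * (1 - alpha / 2)
    return np.percentile(stats, [lo, hi], axis=0)

b0, b1 = ols(examiner_a, examiner_b)
ci_lo, ci_hi = bootstrap_ci(examiner_a, examiner_b)
print(f"intercept={b0:.3f}  slope={b1:.3f}")
print(f"95% CI intercept: [{ci_lo[0]:.3f}, {ci_hi[0]:.3f}]  slope: [{ci_lo[1]:.3f}, {ci_hi[1]:.3f}]")
# Agreement between examiners would correspond to a slope near 1 and an intercept near 0.
```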
Abstract:
Novel acid-terminated hyperbranched polymers (HBPs) containing adipic acid and oxazoline monomers derived from oleic and linoleic acid have been synthesized via a bulk polymerization procedure. Branching was achieved as a consequence of an acid-catalyzed opening of the oxazoline ring to produce a trifunctional monomer in situ, which delivered branching levels of >45% as determined by ¹H and ¹³C NMR spectroscopy. The HBPs were soluble in common solvents, such as CHCl3, acetone, tetrahydrofuran, dimethylformamide, and dimethyl sulfoxide, and were further functionalized by addition of citronellol to afford white-spirit-soluble materials that could be used in coating formulations. During end-group modification, a reduction in the branching levels of the HBPs (down to 12–24%) was observed, predominantly on account of oxazoline ring reformation and transesterification processes under the reaction conditions used. In comparison to commercial alkyd resin paint coatings, formulations of the citronellol-functionalized hyperbranched materials blended with a commercial alkyd resin exhibited dramatic decreases in blend viscosity as the HBP content was increased. The curing characteristics of the HBP/alkyd blend formulations were studied by dynamic mechanical analysis, which revealed that the new coatings cured more quickly and produced tougher materials than otherwise identical coatings prepared from only the commercial alkyd resins.
Abstract:
Criteria are proposed for evaluating sea surface temperature (SST) retrieved from satellite infra-red imagery: bias should be small on regional scales; sensitivity to atmospheric humidity should be small; and sensitivity of retrieved SST to surface temperature should be close to 1 K K⁻¹. Their application is illustrated for non-linear sea surface temperature (NLSST) estimates. 233 929 observations from the Advanced Very High Resolution Radiometer (AVHRR) on Metop-A are matched with in situ data and numerical weather prediction (NWP) fields. NLSST coefficients derived from these matches have regional biases from −0.5 to +0.3 K. Using radiative transfer modelling, we find that a 10% increase in humidity alone can change the retrieved NLSST by between −0.5 K and +0.1 K. A 1 K increase in SST changes the NLSST by <0.5 K in extreme cases. The validity of estimates of sensitivity by radiative transfer modelling is confirmed empirically.
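For readers unfamiliar with the retrieval, the Python sketch below shows one common textbook form of the NLSST estimator and how a sensitivity of retrieved SST to true SST (in K K⁻¹) can be derived from it; the coefficients and per-kelvin channel responses used here are illustrative values, not those derived in the paper.

```python
import numpy as np

def nlsst(t11, t12, sst_guess, sec_theta, coeffs):
    """One common form of the non-linear SST (NLSST) estimator.

    t11, t12   : brightness temperatures (K) in the 11 and 12 micron channels
    sst_guess  : first-guess SST (deg C) from climatology or a prior retrieval
    sec_theta  : secant of the satellite zenith angle
    coeffs     : (a0, a1, a2, a3) regression coefficients fitted to matchups
    """
    a0, a1, a2, a3 = coeffs
    return a0 + a1 * t11 + a2 * (t11 - t12) * sst_guess + a3 * (t11 - t12) * (sec_theta - 1.0)

def sensitivity_to_true_sst(t11_per_k, t12_per_k, sst_guess, sec_theta, coeffs):
    """Sensitivity d(retrieved SST)/d(true SST) in K/K, given how much each
    brightness temperature changes per 1 K of true SST (values that would come
    from radiative-transfer modelling); the first guess is held fixed."""
    a0, a1, a2, a3 = coeffs
    d_diff = t11_per_k - t12_per_k
    return a1 * t11_per_k + a2 * d_diff * sst_guess + a3 * d_diff * (sec_theta - 1.0)

# Purely illustrative coefficients and channel responses (not from the paper):
coeffs = (-260.0, 0.97, 0.08, 0.7)
print(nlsst(t11=290.0, t12=288.5, sst_guess=17.0, sec_theta=1.1, coeffs=coeffs))      # ~23.4 deg C
print(sensitivity_to_true_sst(0.9, 0.8, 17.0, 1.1, coeffs))                           # ~1.0 K/K
```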
Abstract:
Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log2 units (6% of the mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and of a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
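The published pipeline is implemented as an R function; as an illustration of the generic steps (log transformation, between-array normalisation, and estimation of replicate SD), here is a minimal Python sketch on synthetic data. The simulated intensities, the quantile-style normalisation and the summary statistic are assumptions for demonstration only, not the paper's exact method.

```python
import numpy as np

def normalise_arrays(raw, floor=1.0):
    """Toy version of the extract/transform/normalise steps described above.

    raw : (n_genes, n_arrays) matrix of raw signal intensities.
    Returns log2 signals after a simple quantile normalisation across arrays."""
    log_sig = np.log2(np.maximum(raw, floor))            # floor avoids log of zero
    order = np.argsort(log_sig, axis=0)
    ranks = np.argsort(order, axis=0)                    # rank of each gene within its array
    mean_quantiles = np.sort(log_sig, axis=0).mean(axis=1)
    return mean_quantiles[ranks]                         # map each array onto the mean distribution

def replicate_sd(norm):
    """Per-gene SD across replicate arrays, summarised by its median."""
    return np.median(norm.std(axis=1, ddof=1))

rng = np.random.default_rng(2)
truth = rng.gamma(2.0, 200.0, size=(1000, 1))            # hypothetical 'true' expression levels
raw = truth * rng.lognormal(0.0, 0.4, size=(1000, 4))    # four replicate arrays with technical noise
norm = normalise_arrays(raw)
print(f"median inter-array SD: {replicate_sd(norm):.2f} log2 units")
```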
Abstract:
Background: Affymetrix GeneChip arrays are widely used for transcriptomic studies in a diverse range of species. Each gene is represented on a GeneChip array by a probe-set, consisting of up to 16 probe-pairs. Signal intensities across probe-pairs within a probe-set vary in part due to different physical hybridisation characteristics of individual probes with their target labelled transcripts. We have previously developed a technique to study the transcriptomes of heterologous species based on hybridising genomic DNA (gDNA) to a GeneChip array designed for a different species, and subsequently using only those probes with good homology. Results: Here we have investigated the effects of hybridising homologous-species gDNA to study the transcriptomes of species for which the arrays have been designed. Genomic DNA from Arabidopsis thaliana and rice (Oryza sativa) was hybridised to the Affymetrix Arabidopsis ATH1 and Rice Genome GeneChip arrays, respectively. Probe selection based on gDNA hybridisation intensity increased the number of genes identified as significantly differentially expressed in two published studies of Arabidopsis development, and optimised the analysis of technical replicates obtained from pooled samples of RNA from rice. Conclusion: This mixed physical and bioinformatics approach can be used to optimise estimates of gene expression when using GeneChip arrays.
Abstract:
High-density oligonucleotide (oligo) arrays are a powerful tool for transcript profiling. Arrays based on GeneChip® technology are amongst the most widely used, although GeneChip® arrays are currently available for only a small number of plant and animal species. Thus, we have developed a method to improve the sensitivity of high-density oligonucleotide arrays when applied to heterologous species, and tested the method by analysing the transcriptome of Brassica oleracea L., a species for which no GeneChip® array is available, using a GeneChip® array designed for Arabidopsis thaliana (L.) Heynh. Genomic DNA from B. oleracea was labelled and hybridised to the ATH1-121501 GeneChip® array. Arabidopsis thaliana probe-pairs that hybridised to the B. oleracea genomic DNA on the basis of the perfect-match (PM) probe signal were then selected for subsequent B. oleracea transcriptome analysis using a .cel file parser script to generate probe mask files. The transcriptional response of B. oleracea to a mineral nutrient (phosphorus; P) stress was quantified using probe mask files generated for a wide range of gDNA hybridisation intensity thresholds. An example probe mask file generated with a gDNA hybridisation intensity threshold of 400 removed >68% of the available PM probes from the analysis but retained >96% of available A. thaliana probe-sets. Ninety-nine of these genes were then identified as significantly regulated under P stress in B. oleracea, including homologues of P-stress-responsive genes in A. thaliana. Increasing the gDNA hybridisation intensity threshold up to 500 for probe selection increased the sensitivity of the GeneChip® array to detect regulation of gene expression in B. oleracea under P stress by up to 13-fold. Our open-source software to create probe mask files is freely available at http://affymetrix.arabidopsis.info/xspecies/ and may be used to facilitate transcriptomic analyses of a wide range of plant and animal species in the absence of custom arrays.
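The probe-selection rule itself is simple to state, even though the released tool works directly on .cel files. The Python sketch below applies the same idea on plain arrays, keeping PM probes whose gDNA hybridisation signal exceeds a chosen threshold (e.g. 400) and retaining probe-sets that keep at least one probe; the signal values, probe-set layout and `min_probes` parameter are hypothetical.

```python
import numpy as np

def build_probe_mask(gdna_pm_signal, probe_set_ids, threshold=400.0, min_probes=1):
    """Schematic cross-species probe-selection rule.

    gdna_pm_signal : per-probe perfect-match signal from the gDNA hybridisation
    probe_set_ids  : probe-set identifier for each probe
    Keeps a PM probe if its gDNA signal exceeds `threshold`, and keeps a
    probe-set if at least `min_probes` of its probes survive."""
    keep_probe = gdna_pm_signal > threshold
    retained_sets = {}
    for ps in np.unique(probe_set_ids):
        n_kept = int(keep_probe[probe_set_ids == ps].sum())
        if n_kept >= min_probes:
            retained_sets[ps] = n_kept
    return keep_probe, retained_sets

# Hypothetical data: 3 probe-sets of 11 probes each.
rng = np.random.default_rng(3)
signal = rng.lognormal(mean=6.0, sigma=1.0, size=33)
sets = np.repeat(["PS1", "PS2", "PS3"], 11)
mask, retained = build_probe_mask(signal, sets, threshold=400.0)
print(f"probes kept: {mask.sum()}/{mask.size}; probe-sets retained: {sorted(retained)}")
```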
Abstract:
We report here a unique chiral hybrid gallium sulfide, [NC2H8]2[Ga10S16(N2C12H12)(NC2H7)2] 1, consisting of helical chains of organically-functionalised supertetrahedral clusters which form quadruple-stranded helical nanotubes of ca. 3 nm diameter. This material therefore consists of discrete metal-organic nanotubes which, to the best of our knowledge, are extremely rare. Whilst solvothermal reactions involving 1,2-di(4-pyridyl)ethylene (DPE) resulted in the formation of such single-walled chiral nanotubes, the use of longer 4,4’-trimethylenedipyridine (TMP) ligands resulted in the synthesis of a two-dimensional hybrid gallium sulfide, [C5H6N]3[Ga10S16(OH)(N2C13H14)] 2 in which, for the first time, inorganic and organic linkages between supertetrahedral clusters coexist.
Abstract:
The optically stimulated luminescence (OSL) from quartz is known to be the sum of several components with different rates of charge loss, originating from different trap types. The OSL components are clearly distinguished using the linear modulation (LM OSL) technique. A variety of pre-treatment and measurement conditions have been used on sedimentary samples in conjunction with linearly modulated optical stimulation to study in detail the behaviour of the OSL components of quartz. Single aliquots of different quartz samples have been found typically to contain five or six common LM OSL components. The components have been parameterised in terms of thermal stability (i.e. E and s), photoionisation cross-section energy dependence and dose response. The results of studies concerning applications of component-resolved LM OSL measurements on quartz are also presented. These include the detection of partial bleaching in young samples, the use of ‘stepped wavelength’ stimulation to observe OSL from single components, and attempts to extend the age range of quartz OSL dating.
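A component-resolved LM OSL analysis of the kind described typically fits the measured curve with a sum of first-order, peak-shaped components. The Python sketch below fits such a model to a synthetic curve; the component count, ramp time, trap parameters and starting values are illustrative assumptions rather than values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def lm_osl_component(t, n0, b, P):
    """First-order LM OSL component under a linear ramp of stimulation power
    over total ramp time P: peak-shaped curve n0*b*(t/P)*exp(-b*t^2/(2P))."""
    return n0 * b * (t / P) * np.exp(-b * t**2 / (2.0 * P))

def lm_osl_model(t, *params, P=3600.0):
    """Sum of first-order components; params = (n0_1, b_1, n0_2, b_2, ...)."""
    out = np.zeros_like(t)
    for i in range(0, len(params), 2):
        out += lm_osl_component(t, params[i], params[i + 1], P)
    return out

# Hypothetical measured curve: three components with different detrapping rates.
rng = np.random.default_rng(4)
t = np.linspace(1.0, 3600.0, 720)
true = lm_osl_model(t, 1e5, 2.0, 5e4, 0.2, 2e4, 0.02)
obs = rng.poisson(true).astype(float)                 # counting noise

p0 = [8e4, 1.0, 4e4, 0.1, 1e4, 0.01]                  # rough starting values
popt, _ = curve_fit(lambda tt, *p: lm_osl_model(tt, *p), t, obs, p0=p0, maxfev=20000)
for i in range(0, len(popt), 2):
    print(f"component {i//2 + 1}: n0 = {popt[i]:.3g}, b = {popt[i+1]:.3g} s^-1")
```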
Abstract:
The objective of this paper is to apply the mis-specification (M-S) encompassing perspective to the problem of choosing between linear and log-linear unit-root models. A simple M-S encompassing test, based on an auxiliary regression stemming from the conditional second moment, is proposed and its empirical size and power are investigated using Monte Carlo simulations. It is shown that by focusing on the conditional process the sampling distributions of the relevant statistics are well behaved under both the null and alternative hypotheses. The proposed M-S encompassing test is illustrated using US total disposable income quarterly data.
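The exact auxiliary regression is not reproduced in the abstract, but its flavour can be sketched: regress the squared residuals of the linear-in-levels unit-root model on a function of the lagged level, and use an LM-type statistic to judge whether the conditional second moment behaves as the linear or the log-linear specification predicts. The Python sketch below is a toy version under those assumptions; the simulated series, regressors and statistic are illustrative only, and the statistic's null distribution is precisely the kind of question the paper studies by simulation.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a log-linear unit-root (geometric random walk) series, one of the two candidates.
n = 400
log_y = np.cumsum(rng.normal(0.005, 0.02, n)) + np.log(100.0)
y = np.exp(log_y)

def ols(X, target):
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return beta, target - X @ beta

# Linear unit-root model in levels: Delta y_t = c + u_t
dy = np.diff(y)
X_lin = np.ones((n - 1, 1))
_, resid_lin = ols(X_lin, dy)

# Auxiliary regression from the conditional second moment: do the squared residuals
# of the linear model vary systematically with the (lagged) squared level of the series?
# Under the linear specification they should not; under the log-linear one they should.
u2 = resid_lin**2
X_aux = np.column_stack([np.ones(n - 1), y[:-1] ** 2])
_, resid_aux = ols(X_aux, u2)
r2 = 1.0 - resid_aux.var() / u2.var()
lm_stat = (n - 1) * r2                      # LM-type statistic; reject linearity when large
print(f"auxiliary-regression LM-type statistic: {lm_stat:.2f}")
```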
Abstract:
We compare a number of models of post-war US output growth in terms of the degree and pattern of non-linearity they impart to the conditional mean, where we condition on either the previous period's growth rate or the previous two periods' growth rates. The conditional means are estimated non-parametrically using a nearest-neighbour technique on data simulated from the models. In this way, we condense the complex dynamic responses that may be present into graphical displays of the implied conditional mean.
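As an illustration of the nearest-neighbour step, the Python sketch below estimates E[growth_t | growth_{t-1}] on a grid of conditioning values from data simulated by a simple AR(1); the AR(1) stand-in, the neighbourhood size k and the grid are assumptions, and conditioning on two lags would use distances in two dimensions in the same way.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical quarterly growth rates simulated from a simple AR(1)
# (a stand-in for data simulated from the competing models of the paper).
n = 2000
g = np.zeros(n)
for t in range(1, n):
    g[t] = 0.35 * g[t - 1] + rng.normal(0.0, 0.9)

def knn_conditional_mean(x_cond, y, grid, k=50):
    """Nearest-neighbour estimate of E[y | x_cond = g0] over a grid of g0 values."""
    est = np.empty(len(grid))
    for i, g0 in enumerate(grid):
        idx = np.argsort(np.abs(x_cond - g0))[:k]   # k nearest neighbours in the conditioning variable
        est[i] = y[idx].mean()
    return est

# Condition on the previous period's growth rate.
x_prev, y_next = g[:-1], g[1:]
grid = np.linspace(np.percentile(x_prev, 5), np.percentile(x_prev, 95), 25)
cond_mean = knn_conditional_mean(x_prev, y_next, grid)
for g0, m in zip(grid[::6], cond_mean[::6]):
    print(f"E[growth_t | growth_(t-1) = {g0:+.2f}] ~ {m:+.2f}")
```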
Abstract:
We test whether there are nonlinearities in the response of short- and long-term interest rates to the spread in interest rates, and assess the out-of-sample predictability of interest rates using linear and nonlinear models. We find strong evidence of nonlinearities in the response of interest rates to the spread. Nonlinearities are shown to result in more accurate short-horizon forecasts, especially of the spread.
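The abstract does not state which nonlinear specification is used, so the Python sketch below illustrates one generic possibility: a threshold model in which the short rate adjusts to the spread at different speeds above and below a cutoff. The simulated series, cutoff and coefficients are invented for illustration and are not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy short rate and spread with a threshold-type adjustment:
# the short rate closes the spread faster when the spread is large.
n = 600
r_short = np.zeros(n)
spread = np.zeros(n)
r_short[0], spread[0] = 4.0, 1.0
for t in range(1, n):
    speed = 0.4 if abs(spread[t - 1]) > 1.5 else 0.05
    d_r = speed * spread[t - 1] + rng.normal(0, 0.1)
    r_short[t] = r_short[t - 1] + d_r
    spread[t] = 0.9 * spread[t - 1] - 0.5 * d_r + rng.normal(0, 0.2)

def fit_threshold(dy, s_lag, tau):
    """OLS fit of dy_t = a1*s_(t-1)*1{|s|<=tau} + a2*s_(t-1)*1{|s|>tau}."""
    below = (np.abs(s_lag) <= tau).astype(float)
    X = np.column_stack([s_lag * below, s_lag * (1 - below)])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    return beta

dy = np.diff(r_short)
beta = fit_threshold(dy, spread[:-1], tau=1.5)
print(f"response below threshold: {beta[0]:.2f}, above threshold: {beta[1]:.2f}")
```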
Abstract:
In this paper we discuss the current state-of-the-art in estimating, evaluating, and selecting among non-linear forecasting models for economic and financial time series. We review theoretical and empirical issues, including predictive density, interval and point evaluation and model selection, loss functions, data-mining, and aggregation. In addition, we argue that although the evidence in favor of constructing forecasts using non-linear models is rather sparse, there is reason to be optimistic. However, much remains to be done. Finally, we outline a variety of topics for future research, and discuss a number of areas which have received considerable attention in the recent literature, but where many questions remain.