180 results for Matrices.


Relevance:

10.00%

Publisher:

Abstract:

Molecular phylogenetic studies of homologous sequences of nucleotides often assume that the underlying evolutionary process was globally stationary, reversible, and homogeneous (SRH), and that a model of evolution with one or more site-specific and time-reversible rate matrices (e.g., the GTR rate matrix) is enough to accurately model the evolution of data over the whole tree. However, an increasing body of data suggests that evolution under these conditions is the exception rather than the norm. To address this issue, several non-SRH models of molecular evolution have been proposed, but they either ignore heterogeneity in the substitution process across sites (HAS) or assume it can be modeled accurately using the Γ distribution. As an alternative to these models of evolution, we introduce a family of mixture models that approximate HAS without the assumption of an underlying predefined statistical distribution. This family of mixture models is combined with non-SRH models of evolution that account for heterogeneity in the substitution process across lineages (HAL). We also present two algorithms for searching model space and identifying an optimal model of evolution that is less likely to over- or underparameterize the data. The performance of the two new algorithms was evaluated using alignments of nucleotides with 10 000 sites simulated under complex non-SRH conditions on a 25-tipped tree. The algorithms were found to be very successful, identifying the correct HAL model with a 75% success rate (the average success rate for assigning rate matrices to the tree's 48 edges was 99.25%) and, for the correct HAL model, identifying the correct HAS model with a 98% success rate. Finally, parameter estimates obtained under the correct HAL-HAS model were found to be accurate and precise. The merits of our new algorithms were illustrated with an analysis of 42 337 second codon sites extracted from a concatenation of 106 alignments of orthologous genes encoded by the nuclear genomes of Saccharomyces cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, S. castellii, S. kluyveri, S. bayanus, and Candida albicans. Our results show that second codon sites in the ancestral genome of these species contained 49.1% invariable sites, 39.6% variable sites belonging to one rate category (V1), and 11.3% variable sites belonging to a second rate category (V2). The ancestral nucleotide content was found to differ markedly across these three sets of sites, and the evolutionary processes operating at the variable sites were found to be non-SRH and best modeled by a combination of eight edge-specific rate matrices (four for V1 and four for V2). The number of substitutions per site at the variable sites also differed markedly, with sites belonging to V1 evolving more slowly than those belonging to V2 along the lineages separating the seven species of Saccharomyces. Finally, sites belonging to V1 appeared to have ceased evolving along the lineages separating S. cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, and S. bayanus, implying that they might have become so selectively constrained that they could be considered invariable sites in these species.
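
For readers unfamiliar with time-reversible rate matrices, the following minimal Python/SciPy sketch (with purely illustrative base frequencies and exchangeability values) shows how a single GTR rate matrix is assembled, normalised, and turned into substitution probabilities along one branch; the HAL-HAS mixture machinery described above layers several such matrices across sites and edges.

```python
import numpy as np
from scipy.linalg import expm

# Stationary base frequencies (A, C, G, T) and GTR exchangeabilities for the
# six unordered base pairs; all values are illustrative, not estimates.
pi = np.array([0.30, 0.20, 0.25, 0.25])
exch = {"AC": 1.0, "AG": 4.0, "AT": 0.8, "CG": 1.2, "CT": 4.5, "GT": 1.0}

S = np.zeros((4, 4))
for key, i, j in [("AC", 0, 1), ("AG", 0, 2), ("AT", 0, 3),
                  ("CG", 1, 2), ("CT", 1, 3), ("GT", 2, 3)]:
    S[i, j] = S[j, i] = exch[key]

Q = S * pi[None, :]                  # q_ij = s_ij * pi_j  (time reversible)
np.fill_diagonal(Q, -Q.sum(axis=1))  # rows sum to zero
Q /= -(pi * np.diag(Q)).sum()        # normalise so the expected rate is 1

P = expm(Q * 0.1)                    # substitution probabilities for branch length 0.1
assert np.allclose(P.sum(axis=1), 1.0)
```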

Relevance:

10.00%

Publisher:

Abstract:

Structural equation modeling (SEM) is a powerful statistical approach for the testing of networks of direct and indirect theoretical causal relationships in complex data sets with intercorrelated dependent and independent variables. SEM is commonly applied in ecology, but the spatial information commonly found in ecological data remains difficult to model in a SEM framework. Here we propose a simple method for spatially explicit SEM (SE-SEM) based on the analysis of variance/covariance matrices calculated across a range of lag distances. This method provides readily interpretable plots of the change in path coefficients across scale and can be implemented using any standard SEM software package. We demonstrate the application of this method using three studies examining the relationships between environmental factors, plant community structure, nitrogen fixation, and plant competition. By design, these data sets had a spatial component, but were previously analyzed using standard SEM models. Using these data sets, we demonstrate the application of SE-SEM to regularly spaced, irregularly spaced, and ad hoc spatial sampling designs and discuss the increased inferential capability of this approach compared with standard SEM. We provide an R package, sesem, to easily implement spatial structural equation modeling.
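
To make the lag-distance idea concrete, here is a rough Python sketch of one way to build variance/covariance matrices over lag-distance bins from spatially referenced variables; the variable names, bin edges and pair-averaging scheme are illustrative assumptions, and the sesem package's implementation may differ in detail.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

# Toy spatial data set: coordinates plus three intercorrelated variables
# (names and effect sizes are invented for illustration).
rng = np.random.default_rng(2)
n = 120
coords = rng.uniform(0, 100, size=(n, 2))
env = rng.normal(size=n)
plants = 0.6 * env + rng.normal(scale=0.8, size=n)
nfix = 0.4 * plants + rng.normal(scale=0.9, size=n)
X = np.column_stack([env, plants, nfix])
Xc = X - X.mean(axis=0)

dist = squareform(pdist(coords))
lag_bins = [(0, 10), (10, 25), (25, 50)]   # lag-distance classes (map units)

lag_covs = []
for lo, hi in lag_bins:
    i, j = np.where((dist > lo) & (dist <= hi))   # ordered pairs -> symmetric matrix
    cov_h = (Xc[i].T @ Xc[j]) / len(i)            # pairwise cross-products in this bin
    lag_covs.append(cov_h)

# Each matrix in lag_covs could then be supplied to an SEM fit, yielding path
# coefficients as a function of lag distance.
print(lag_covs[0].round(2))
```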

Relevance:

10.00%

Publisher:

Abstract:

In vitro pre-vascularization is one of the main vascularization strategies in the tissue engineering field. Culturing cells within a tissue-engineered construct (TEC) prior to implantation provides researchers with a greater degree of control over the fate of the cells. However, balancing the diverse range of cell culture parameters in vitro is seldom easy, and in most cases, especially in highly vascularized tissues, more than one cell type will reside within the cell culture system. Culturing multiple cell types in the same construct presents its own unique challenges and pitfalls. The following review examines endothelial-driven vascularization and evaluates the direct and indirect roles other cell types play in vessel and capillary formation. The article then analyses the different parameters that researchers can modulate in a co-culture system in order to design optimal tissue-engineered constructs matched to the desired clinical application.

Relevance:

10.00%

Publisher:

Abstract:

The application of decellularized extracellular matrices to aid tissue regeneration in reconstructive surgery and regenerative medicine has shown promise. Several decellularization protocols for removing cellular materials from natural tissues such as heart valves are currently in use. This paper evaluates the feasibility of extending this methodology to load-bearing joint tissues, with reference to the properties they require to function physiologically, such as stiffness, porosity and the ability to recover adequately after deformation. Two decellularization protocols, trypsin and Triton X-100, were evaluated for their effects on bovine articular cartilage using biomechanical, biochemical and microstructural techniques. These analyses revealed that decellularization with trypsin resulted in a severe loss of mechanical stiffness, including a deleterious collapse of the collagen architecture that in turn significantly compromised the porosity of the construct. In contrast, Triton X-100 detergent treatment yielded samples that retained mechanical stiffness comparable to that of normal intact cartilage, but the resulting construct contained remnant cellular constituents. We conclude that both of these common decellularization protocols are inadequate for producing constructs that can serve as effective replacements or scaffolds for regenerating articular joint tissue.

Relevance:

10.00%

Publisher:

Abstract:

A wet-milling protocol was employed to produce pressed powder tablets with excellent cohesion and homogeneity, suitable for laser ablation (LA) analysis of volatile and refractory elements in sediment. The influence of sample preparation on analytical performance was also investigated, including sample homogeneity, accuracy and limits of detection. Milling in a volatile solvent for 40 min ensured the sample was well mixed and allowed reasonable recovery of both volatile (Hg) and refractory (Zr) elements. With the exception of Cr (−52%) and Nb (+26%), major, minor and trace elements in STSD-1 and MESS-3 could be analysed to within ±20% of the certified values. The method was compared with a total digestion method using HF by analysing 10 different sediment samples. The laser method recovers significantly higher amounts of analytes such as Ag, Cd and Sn than the total digestion method, making it a more robust method for elements across the periodic table. LA-ICP-MS also eliminates the interferences from chemical reagents as well as the health and safety risks associated with digestion processes. Therefore, it can be considered an enhanced method for the analysis of heterogeneous matrices such as river sediments.
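
Agreement with certified reference values of the kind quoted above is usually expressed as a relative bias; a small sketch follows, with made-up numbers that are not the actual certified or measured values for STSD-1 or MESS-3.

```python
# Relative bias of measured vs. certified concentrations (mg/kg); the numbers
# below are invented for illustration only.
certified = {"Cr": 67.0, "Nb": 20.0, "Zn": 159.0, "Pb": 21.0}
measured = {"Cr": 32.2, "Nb": 25.2, "Zn": 150.0, "Pb": 22.5}

for element, cert in certified.items():
    bias = 100.0 * (measured[element] - cert) / cert
    flag = "OK" if abs(bias) <= 20 else "outside +/-20%"
    print(f"{element}: {bias:+.0f}% ({flag})")
```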

Relevance:

10.00%

Publisher:

Abstract:

We consider rank regression for clustered data analysis and investigate the induced smoothing method for obtaining the asymptotic covariance matrices of the parameter estimators. We prove that the induced estimating functions are asymptotically unbiased and the resulting estimators are strongly consistent and asymptotically normal. The induced smoothing approach provides an effective way of obtaining asymptotic covariance matrices for between- and within-cluster estimators and for a combined estimator that takes account of within-cluster correlations. We also carry out extensive simulation studies to assess the performance of different estimators. The proposed methodology is substantially faster in computation and numerically more stable than existing methods. We apply the proposed methodology to a dataset from a randomized clinical trial.
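
As a toy illustration of induced smoothing, the sketch below replaces the indicator in a Gehan-type rank estimating function with a normal CDF and solves the smoothed equation for independent (unclustered) data; the fixed bandwidth and the simple setup are assumptions for illustration, whereas the method above handles clustering and ties the smoothing to the estimator's covariance.

```python
import numpy as np
from scipy.optimize import root
from scipy.stats import norm

# Simulated linear-model data with heavy-tailed errors (independent observations).
rng = np.random.default_rng(0)
n, p = 200, 2
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -0.5])
y = X @ beta_true + rng.standard_t(df=3, size=n)

beta0 = np.linalg.lstsq(X, y, rcond=None)[0]   # least-squares starting value
h = np.std(y - X @ beta0) / np.sqrt(n)         # fixed O(n^{-1/2}) bandwidth (simplification)

def smoothed_rank_score(beta):
    """Gehan-type rank score with the indicator I(e_i <= e_j) replaced by a normal CDF."""
    e = y - X @ beta
    w = norm.cdf((e[None, :] - e[:, None]) / h)    # smooth surrogate for I(e_i <= e_j)
    diff_x = X[:, None, :] - X[None, :, :]         # x_i - x_j
    return (diff_x * w[:, :, None]).sum(axis=(0, 1)) / n**2

beta_hat = root(smoothed_rank_score, beta0).x
print(beta_hat)
```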

Relevance:

10.00%

Publisher:

Abstract:

We consider the analysis of longitudinal data when the covariance function is modeled by parameters in addition to the mean parameters. In general, inconsistent estimators of the covariance (variance/correlation) parameters will be produced when the "working" correlation matrix is misspecified, which may result in a great loss of efficiency of the mean parameter estimators (although their consistency is preserved). We consider using different "working" correlation models for the variance and the mean parameters. In particular, we find that an independence working model should be used for estimating the variance parameters to ensure their consistency when the correlation structure is misspecified. The designated "working" correlation matrices should be used for estimating the mean and the correlation parameters to attain high efficiency for estimating the mean parameters. Simulation studies indicate that the proposed algorithm performs very well. We also applied different estimation procedures to a data set from a clinical trial for illustration.
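
A minimal sketch of fitting the same clustered data under different working correlation models, using the GEE implementation in statsmodels; the abstract's specific hybrid scheme (an independence working model reserved for the variance parameters) is not something this sketch reproduces.

```python
import numpy as np
from statsmodels.genmod.generalized_estimating_equations import GEE
from statsmodels.genmod.cov_struct import Independence, Exchangeable
from statsmodels.genmod.families import Gaussian

# Simulated clustered data: 100 clusters of size 5 with a shared cluster effect.
rng = np.random.default_rng(5)
n_clusters, m = 100, 5
groups = np.repeat(np.arange(n_clusters), m)
x = rng.normal(size=n_clusters * m)
cluster_effect = np.repeat(rng.normal(size=n_clusters), m)
y = 1.0 + 2.0 * x + cluster_effect + rng.normal(size=n_clusters * m)
exog = np.column_stack([np.ones_like(x), x])

for label, cs in [("independence", Independence()), ("exchangeable", Exchangeable())]:
    res = GEE(y, exog, groups=groups, family=Gaussian(), cov_struct=cs).fit()
    print(label, np.round(res.params, 3), np.round(res.bse, 3))
```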

Relevance:

10.00%

Publisher:

Abstract:

The method of generalised estimating equations for regression modelling of clustered outcomes allows for specification of a working correlation matrix that is intended to approximate the true correlation matrix of the observations. We investigate the asymptotic relative efficiency of the generalised estimating equation estimator of the mean parameters when the correlation parameters are estimated by various methods. The asymptotic relative efficiency depends on three features of the analysis, namely (i) the discrepancy between the working correlation structure and the unobservable true correlation structure, (ii) the method by which the correlation parameters are estimated and (iii) the 'design', by which we refer to both the structures of the predictor matrices within clusters and the distribution of cluster sizes. Analytical and numerical studies of realistic data-analysis scenarios show that the choice of working covariance model has a substantial impact on regression estimator efficiency. Protection against avoidable loss of efficiency associated with covariance misspecification is obtained when a 'Gaussian estimation' pseudolikelihood procedure is used with an AR(1) structure.
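
The efficiency point can be illustrated with a small Monte Carlo experiment: simulate clusters with AR(1)-correlated errors and compare the empirical variance of the slope estimator under independence versus exchangeable working correlations (the exchangeable structure is used here as a simple stand-in for the AR(1) Gaussian-estimation procedure recommended above); sample sizes and parameters are arbitrary.

```python
import numpy as np
from statsmodels.genmod.generalized_estimating_equations import GEE
from statsmodels.genmod.cov_struct import Independence, Exchangeable
from statsmodels.genmod.families import Gaussian

rng = np.random.default_rng(7)
n_clusters, m, rho, n_rep = 60, 5, 0.6, 200
ar1_cov = rho ** np.abs(np.subtract.outer(np.arange(m), np.arange(m)))  # true AR(1) correlation
groups = np.repeat(np.arange(n_clusters), m)

slopes = {"independence": [], "exchangeable": []}
for _ in range(n_rep):
    x = rng.normal(size=(n_clusters, m))
    e = rng.multivariate_normal(np.zeros(m), ar1_cov, size=n_clusters)
    y = (0.5 + 1.5 * x + e).ravel()
    exog = np.column_stack([np.ones(n_clusters * m), x.ravel()])
    for label, cs in [("independence", Independence()), ("exchangeable", Exchangeable())]:
        fit = GEE(y, exog, groups=groups, family=Gaussian(), cov_struct=cs).fit()
        slopes[label].append(fit.params[1])

var_ind = np.var(slopes["independence"], ddof=1)
var_exc = np.var(slopes["exchangeable"], ddof=1)
print(f"slope variance, independence working model:  {var_ind:.5f}")
print(f"slope variance, exchangeable working model:  {var_exc:.5f}")
print(f"variance ratio (exchangeable / independence): {var_exc / var_ind:.2f}")
```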

Relevance:

10.00%

Publisher:

Abstract:

Recovering the motion of a non-rigid body from a set of monocular images permits the analysis of dynamic scenes in uncontrolled environments. However, the extension of factorisation algorithms for rigid structure from motion to the low-rank non-rigid case has proved challenging. This stems from the comparatively hard problem of finding a linear "corrective transform" which recovers the projection and structure matrices from an ambiguous factorisation. We show that this greater difficulty is due to the need to find multiple solutions to a non-trivial problem, and cast a number of previous approaches as alleviating this issue either by (a) introducing constraints on the basis, making the problems non-identical, or (b) incorporating heuristics to encourage a diverse set of solutions, making the problems inter-dependent. While it has previously been recognised that finding a single solution to this problem is sufficient to estimate cameras, we show that it is possible to bootstrap this partial solution to find the complete transform in closed form. However, we acknowledge that our method minimises an algebraic error and is thus inherently sensitive to deviation from the low-rank model. We compare our closed-form solution for non-rigid structure with known cameras to the closed-form solution of Dai et al. [1], which we find produces only coplanar reconstructions. We therefore recommend that 3D reconstruction error always be measured relative to a trivial reconstruction, such as a planar one.
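
For intuition about the factorisation ambiguity and the corrective transform, here is a NumPy sketch of the rigid, orthographic case with synthetic data; the non-rigid low-rank setting discussed above is substantially harder because several such transforms must be recovered.

```python
import numpy as np

# Synthetic rigid scene under orthographic projection (sizes are arbitrary).
rng = np.random.default_rng(1)
F, P = 6, 40                          # frames, points
S = rng.normal(size=(3, P))           # 3D structure
M = np.zeros((2 * F, 3))
for f in range(F):
    R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    M[2 * f:2 * f + 2] = R[:2]        # two orthographic projection rows per frame
W = M @ S                             # centred measurement matrix (2F x P), rank 3

# Rank-3 factorisation: any invertible G gives another valid factorisation,
# which is exactly the ambiguity a "corrective transform" must resolve.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
M_hat = U[:, :3] * np.sqrt(s[:3])
S_hat = np.sqrt(s[:3])[:, None] * Vt[:3]
G = rng.normal(size=(3, 3))
assert np.allclose(M_hat @ S_hat, (M_hat @ G) @ (np.linalg.inv(G) @ S_hat))

# In the rigid case, G is fixed (up to a rotation) by requiring the rows of
# M_hat @ G to behave like orthonormal camera axes; the non-rigid case needs
# several such transforms, which is what makes it hard.
```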

Relevance:

10.00%

Publisher:

Abstract:

Isolating, purifying, and identifying proteins in complex biological matrices is often difficult, time-consuming, and unreliable. Herein we describe a rapid screening technique for proteins in biological matrices that combines selective protein isolation with direct surface-enhanced Raman spectroscopy (SERS) detection. Magnetic-core gold nanoparticles were synthesised, characterised, and subsequently functionalised with a recombinant human erythropoietin (rHuEPO)-specific antibody. The functionalised nanoparticles were used to capture rHuEPO from horse blood plasma within 15 minutes. The selective binding between the protein and the functionalised nanoparticles was monitored by SERS. The purified protein was then released from the nanoparticle surface and directly identified spectroscopically on a commercial nanopillar SERS substrate. ELISA independently confirmed the SERS identification and quantified the released rHuEPO. Finally, direct SERS detection of the extracted protein was successfully demonstrated for in-field screening with a handheld Raman spectrometer and a sample measurement time of 1 minute.

Relevance:

10.00%

Publisher:

Abstract:

In this report, we describe a simple correction for multiple testing of single-nucleotide polymorphisms (SNPs) in linkage disequilibrium (LD) with each other, on the basis of the spectral decomposition (SpD) of matrices of pairwise LD between SNPs. This method provides a useful alternative to more computationally intensive permutation tests. Additionally, output from SNPSpD includes eigenvalues, principal-component coefficients, and factor "loadings" after varimax rotation, enabling the selection of a subset of SNPs that optimize the information in a genomic region.
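
A sketch of the eigenvalue-based correction, assuming a Nyholt-style formula for the effective number of independent tests together with a Šidák-type adjusted threshold; the toy LD matrix is invented for illustration.

```python
import numpy as np

def effective_number_of_tests(ld_corr):
    """Nyholt-style effective number of independent tests from a SNP correlation (LD) matrix."""
    m = ld_corr.shape[0]
    eigvals = np.linalg.eigvalsh(ld_corr)
    return 1 + (m - 1) * (1 - np.var(eigvals, ddof=1) / m)

# Toy LD matrix for four SNPs, the first two in strong LD with each other.
r = np.array([[1.0, 0.9, 0.1, 0.0],
              [0.9, 1.0, 0.1, 0.0],
              [0.1, 0.1, 1.0, 0.2],
              [0.0, 0.0, 0.2, 1.0]])

m_eff = effective_number_of_tests(r)
alpha_adj = 1 - (1 - 0.05) ** (1 / m_eff)   # Sidak-style adjusted significance threshold
print(f"M_eff = {m_eff:.2f}, adjusted alpha = {alpha_adj:.4f}")
```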

Relevance:

10.00%

Publisher:

Abstract:

Quantification of pyridoxal-5´-phosphate (PLP) in biological samples is challenging due to the presence of endogenous PLP in matrices used for preparation of calibrators and quality control samples (QCs). Hence, we have developed an LC-MS/MS method for accurate and precise measurement of the concentrations of PLP in samples (20 µL) of human whole blood that addresses this issue by using a surrogate matrix and minimizing the matrix effect. We used a surrogate matrix comprising 2% bovine serum albumin (BSA) in phosphate-buffered saline (PBS) to prepare calibrators and QCs, and the concentrations were adjusted to account for the endogenous PLP concentration in the surrogate matrix according to the method of standard addition. PLP was separated from the other components of the sample matrix using protein precipitation with 10% w/v trichloroacetic acid. After centrifugation, the supernatant was injected directly into the LC-MS/MS system. Calibration curves were linear and recovery was > 92%. QCs were accurate, precise, and stable for four freeze-thaw cycles, following storage at room temperature for 17 h, and at -80 °C for 3 months. There was no significant matrix effect across 9 different individual human blood samples. Our novel LC-MS/MS method has satisfied all of the criteria specified in the 2012 EMEA guideline on bioanalytical method validation.
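
The method of standard addition mentioned above amounts to regressing instrument response on the amount of analyte added and reading the endogenous level off the fitted line; a minimal sketch with invented numbers (not data from this study) follows.

```python
import numpy as np

# Standard addition: regress response on the amount of PLP added to the surrogate
# matrix; the endogenous concentration is the x-axis offset of the fitted line.
added = np.array([0.0, 10.0, 20.0, 40.0, 80.0])        # nmol/L added (illustrative)
response = np.array([0.52, 1.01, 1.55, 2.58, 4.61])    # analyte/internal-standard ratio

slope, intercept = np.polyfit(added, response, 1)
endogenous = intercept / slope
print(f"estimated endogenous PLP in the surrogate matrix: {endogenous:.1f} nmol/L")
```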

Relevance:

10.00%

Publisher:

Abstract:

The paper presents a geometry-free approach to assess the variation of covariance matrices of undifferenced triple-frequency GNSS measurements and its impact on positioning solutions. Four independent geometry-free/ionosphere-free (GFIF) models formed from the original triple-frequency code and phase signals allow effective computation of variance-covariance matrices from real data. Variance Component Estimation (VCE) algorithms are implemented to obtain the covariance matrices for three pseudorange and three carrier-phase signals epoch by epoch. Covariance results from triple-frequency Beidou System (BDS) and GPS data sets demonstrate that the estimated standard deviations vary consistently with the amplitude of the actual GFIF error time series. The single point positioning (SPP) results from BDS ionosphere-free measurements at four MGEX stations demonstrate an improvement of up to about 50% in the Up direction relative to results based on mean-square statistics. Additionally, a more extensive SPP analysis at 95 global MGEX stations based on GPS ionosphere-free measurements shows an average improvement of about 10% relative to the traditional results. This finding provides a preliminary confirmation that adequate consideration of the variation of covariance leads to improved GNSS state solutions.
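
As background, here is a sketch of the classic dual-frequency geometry-free and ionosphere-free combinations together with a crude epoch-level noise estimate; all numbers are synthetic, and the paper's triple-frequency GFIF combinations and formal variance component estimation are considerably more involved.

```python
import numpy as np

# Dual-frequency GPS example (L1/L2, Hz); p1 and p2 are synthetic pseudoranges (m).
f1, f2 = 1575.42e6, 1227.60e6
rng = np.random.default_rng(3)
n_epochs = 200
geometry = 2.0e7 + 50.0 * np.arange(n_epochs)        # toy receiver-satellite range
iono_l1 = 5.0 + 0.01 * np.arange(n_epochs)           # toy ionospheric delay on L1
p1 = geometry + iono_l1 + rng.normal(0, 0.3, n_epochs)
p2 = geometry + iono_l1 * (f1 / f2) ** 2 + rng.normal(0, 0.3, n_epochs)

p_gf = p1 - p2                                        # geometry-free combination
p_if = (f1**2 * p1 - f2**2 * p2) / (f1**2 - f2**2)    # ionosphere-free combination

# Crude epoch-level noise estimate: differencing the geometry-free series removes
# the slowly varying ionosphere, leaving (scaled) measurement noise.
sigma_gf = np.diff(p_gf).std(ddof=1) / np.sqrt(2)
print(f"empirical GF noise level: {sigma_gf:.2f} m")
```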

Relevance:

10.00%

Publisher:

Abstract:

Stationary processes are random variables whose value is a signal and whose distribution is invariant to translation in the domain of the signal. They are intimately connected to convolution, and therefore to the Fourier transform, since the covariance matrix of a stationary process is a Toeplitz matrix, and Toeplitz matrices are the expression of convolution as a linear operator. This thesis utilises this connection in the study of i) efficient training algorithms for object detection and ii) trajectory-based non-rigid structure-from-motion.
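
A small NumPy/SciPy sketch of the connection: the covariance matrix built from an autocovariance sequence is Toeplitz, and applying it to a vector is a convolution that can be evaluated with the FFT after circulant embedding (the AR(1)-style autocovariance is just an example).

```python
import numpy as np
from scipy.linalg import toeplitz

# Autocovariance sequence of a toy stationary process (AR(1)-style decay).
n = 8
acov = 0.7 ** np.arange(n)
C = toeplitz(acov)            # covariance matrix of a stationary process is Toeplitz

# Applying the Toeplitz matrix is a convolution with its first column, so it can
# be evaluated with the FFT after embedding the matrix in a circulant one.
x = np.random.default_rng(4).normal(size=n)
direct = C @ x

kernel = np.concatenate([acov, acov[1:][::-1]])       # circulant embedding (length 2n-1)
m = kernel.size
via_fft = np.fft.ifft(np.fft.fft(kernel) * np.fft.fft(x, m)).real[:n]
assert np.allclose(direct, via_fft)
```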

Relevance:

10.00%

Publisher:

Abstract:

A test for time-varying correlation is developed within the framework of a dynamic conditional score (DCS) model for both Gaussian and Student's t distributions. The test may be interpreted as a Lagrange multiplier test and modified to allow for the estimation of models for time-varying volatility in the individual series. Unlike standard moment-based tests, the score-based test statistic includes information on the level of correlation under the null hypothesis, and local power arguments indicate the benefits of doing so. A simulation study shows that the performance of the score-based test is strong relative to existing tests across a range of data-generating processes. An application to the Hong Kong and South Korean equity markets shows that the new test reveals changes in correlation that are not detected by the standard moment-based test.
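
For contrast, a crude moment-based check of correlation constancy (not the score-based DCS test described above) can be sketched as follows: simulate two series whose correlation shifts mid-sample, form the product of the standardised series, and test it for serial dependence; with real data one would first filter each series for time-varying volatility.

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

# Two toy series whose correlation jumps from 0.2 to 0.7 halfway through the sample.
rng = np.random.default_rng(11)
n = 1000
rho = np.where(np.arange(n) < n // 2, 0.2, 0.7)
z1 = rng.normal(size=n)
z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.normal(size=n)

# Under constant correlation (and no volatility dynamics) the product of the
# standardised series should show no serial dependence; a shift in correlation
# induces autocorrelation that a Ljung-Box test can pick up.
s1 = (z1 - z1.mean()) / z1.std(ddof=1)
s2 = (z2 - z2.mean()) / z2.std(ddof=1)
prod = s1 * s2
print(acorr_ljungbox(prod - prod.mean(), lags=[10]))
```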