88 results for Mathematical Techniques--Error Analysis
Abstract:
This work is part of a research effort under way since 2000, whose main objective is to measure small dynamic displacements using L1 GPS receivers. A very sensitive way to detect millimetric periodic displacements is the Phase Residual Method (PRM), which is based on the frequency-domain analysis of the phase residuals resulting from L1 double-difference static processing of data from two satellites at nearly orthogonal elevation angles. In this article, we propose to obtain the phase residuals directly from the raw phase observable collected on a short baseline during a limited time span, instead of obtaining the residual data file from regular GPS processing programs, which do not always allow the choice of the desired satellites. To improve the ability to detect millimetric oscillations, two filtering techniques are introduced: auto-correlation, which reduces phase noise with random time behavior, and a running mean, which separates low-frequency from high-frequency phase sources. Two trials were carried out to verify the proposed method and filtering techniques: one simulates a 2.5-millimeter vertical antenna displacement, and the other uses GPS data collected during a bridge load test. The results show good consistency in detecting millimetric oscillations.
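The running-mean step described above can be sketched as a low/high-frequency split of a synthetic series. This is an illustration only, not the authors' implementation: the window length, sampling, drift and oscillation values are assumptions, and the auto-correlation step is omitted.

```python
import numpy as np

def running_mean_split(x, window):
    """Split a series into low- and high-frequency parts via a running mean.

    Illustrative only: window length and edge handling are assumptions,
    not the authors' implementation.
    """
    kernel = np.ones(window) / window
    low = np.convolve(x, kernel, mode="same")   # low-frequency trend
    high = x - low                              # high-frequency residual
    return low, high

# Toy series: a slow drift plus a small millimetric oscillation (metres).
t = np.arange(0.0, 600.0, 1.0)                  # epochs, seconds
drift = 0.01 * t                                # slow low-frequency trend
osc = 2.5e-3 * np.sin(2 * np.pi * t / 30.0)     # 2.5 mm oscillation, 30 s period
x = drift + osc

low, high = running_mean_split(x, window=101)
# Away from the edges, `high` retains the oscillation while the drift is removed.
```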
Abstract:
Background: The post-genomic era has brought new challenges regarding the understanding of the organization and function of the human genome. Many of these challenges center on the meaning of differential gene regulation under distinct biological conditions and can be addressed by analyzing the Multiple Differential Expression (MDE) of genes associated with normal and abnormal biological processes. Currently, MDE analyses are limited to the usual methods of differential expression, initially designed for paired analysis. Results: We propose a web platform named ProbFAST for MDE analysis, which uses Bayesian inference to identify key genes that are intuitively prioritized by means of probabilities. A simulation study revealed that our method performs better than other approaches, and when applied to public expression data, it demonstrated the flexibility to obtain relevant genes biologically associated with normal and abnormal biological processes. Conclusions: ProbFAST is a freely accessible web-based application that enables MDE analysis on a global scale. It offers an efficient methodological approach for MDE analysis of a set of genes that are turned on and off, in relation to functional information, during the evolution of a tumor or during tissue differentiation. The ProbFAST server can be accessed at http://gdm.fmrp.usp.br/probfast.
Abstract:
Background: Microarray techniques have become an important tool for the investigation of genetic relationships and the assignment of different phenotypes. Since microarrays are still very expensive, most experiments are performed with small samples. This paper introduces a method to quantify dependency between data series composed of few sample points. The method is used to construct gene co-expression subnetworks of highly significant edges. Results: The results shown here are for an adapted subset of a Saccharomyces cerevisiae gene expression data set with low temporal resolution and poor statistics. The method reveals common transcription factors with a high confidence level and allows the construction of subnetworks with high biological relevance that reveal characteristic features of the processes driving the organism's adaptation to specific environmental conditions. Conclusion: Our method allows a reliable and sophisticated analysis of microarray data even under severe constraints. The use of systems biology improves biologists' ability to elucidate the mechanisms underlying cellular processes and to formulate new hypotheses.
Abstract:
Aims: Surgical staple line dehiscence usually leads to severe complications. Several techniques and materials have been used to reinforce the staple line and thus reduce the related complications. The objective was to compare the safety of two types of anastomotic reinforcement in open gastric bypass. Methods: A prospective, randomized study comparing an extraluminal suture, fibrin glue, and a nonpermanent buttressing material, Seamguard (R), for staple line reinforcement. Fibrin glue was excluded from the study and analysis after two leaks that required surgical reintervention, antibiotic therapy, and prolonged patient hospitalization. Results: Twenty patients were assigned to the suture and Seamguard reinforcement groups. The groups were similar in terms of preoperative characteristics. No staple line dehiscence occurred in these two groups, whereas two cases of dehiscence occurred in the fibrin glue group. No mortality occurred, and surgical time was statistically similar for both techniques. Seamguard made the surgery more expensive. Conclusion: In our service, staple line reinforcement in open bariatric surgery with oversewing or Seamguard was considered safe. Seamguard application was considered easier than oversewing, but more expensive.
Abstract:
Background: High-throughput molecular approaches for gene expression profiling, such as Serial Analysis of Gene Expression (SAGE), Massively Parallel Signature Sequencing (MPSS) or Sequencing-by-Synthesis (SBS), are powerful techniques that provide global transcription profiles of different cell types through the sequencing of short fragments of transcripts, denominated sequence tags. These techniques have improved our understanding of the relationships between expression profiles and cellular phenotypes. Despite this, more reliable datasets are still necessary. In this work, we present a web-based tool named S3T: Score System for Sequence Tags, which indexes sequenced tags according to their reliability. This is done through a series of evaluations based on a defined rule set. S3T allows the identification and selection of tags considered more reliable for further gene expression analysis. Results: This methodology was applied to a public SAGE dataset. In order to compare the data before and after filtering, a hierarchical clustering analysis was performed on samples from the same type of tissue, in distinct biological conditions, using these two datasets. Our results provide evidence suggesting that it is possible to find more congruous clusters after using the S3T scoring system. Conclusion: These results substantiate the proposed application to generate more reliable data, a significant contribution to the determination of global gene expression profiles. The library analysis with S3T is freely available at http://gdm.fmrp.usp.br/s3t/. S3T source code and datasets can also be downloaded from the aforementioned website.
Abstract:
Background: Detailed analysis of the dynamic interactions among the biological, environmental, social, and economic factors that favour the spread of certain diseases is extremely useful for designing effective control strategies. Diseases like tuberculosis, which kills someone every 15 seconds worldwide, require methods that take the disease dynamics into account when designing truly efficient control and surveillance strategies. The usual, well-established statistical approaches provide insights into the cause-effect relationships that favour disease transmission, but they only estimate risk areas and spatial or temporal trends. Here we introduce a novel approach for determining the dynamical behaviour of disease spreading. This information can subsequently be used to validate mathematical models of the dissemination process, from which the underlying mechanisms responsible for the spreading can be inferred. Methodology/Principal Findings: The method presented here is based on the analysis of the spread of tuberculosis in a Brazilian endemic city over five consecutive years. The detailed analysis of the spatio-temporal correlation of the yearly geo-referenced data, using different characteristic times of the disease evolution, allowed us to trace the temporal path of the aetiological agent, to locate the sources of infection, and to characterize the dynamics of disease spreading. Consequently, the method also allowed the identification of socio-economic factors that influence the process. Conclusions/Significance: The information obtained can contribute to more effective budget allocation, drug distribution, and recruitment of skilled human resources, as well as guide the design of vaccination programs. We propose that this novel strategy can also be applied to the evaluation of other diseases as well as other social processes.
Abstract:
Background: Without intensive selection, the majority of bovine oocytes submitted to in vitro embryo production (IVP) fail to develop to the blastocyst stage. This is attributed partly to their maturation status and competence. Using the Affymetrix GeneChip Bovine Genome Array, global mRNA expression analysis of immature (GV) and in vitro matured (IVM) bovine oocytes was carried out to characterize the transcriptome of bovine oocytes and to determine whether the observed transcriptional changes during IVM were real or an artifact of the techniques used during analysis. Results: 8489 transcripts were detected across the two oocyte groups, of which ~25.0% (2117 transcripts) were differentially expressed (p < 0.001), corresponding to 589 over-expressed and 1528 under-expressed transcripts in the IVM oocytes compared to their immature counterparts. Over-expression of transcripts by IVM oocytes is particularly interesting; therefore, a variety of approaches were employed to determine whether the observed transcriptional changes during IVM were real or an artifact of the techniques used during analysis, including the analysis of transcript abundance in oocytes matured in vitro in the presence of α-amanitin. Subsets of the differentially expressed genes were also validated by quantitative real-time PCR (qPCR), and the gene expression data were classified according to gene ontology and pathway enrichment. Numerous cell-cycle-linked (CDC2, CDK5, CDK8, HSPA2, MAPK14, TXNL4B), molecular transport (STX5, STX17, SEC22A, SEC22B), and differentiation-related (NACA) genes were found among the over-expressed transcripts in GV oocytes compared to their matured counterparts, while ANXA1, PLAU, STC1, and LUM were among the genes over-expressed after oocyte maturation. Conclusion: Using sequential experiments, we have shown and confirmed transcriptional changes during oocyte maturation.
This dataset provides a unique reference resource for studies concerned with the molecular mechanisms controlling oocyte meiotic maturation in cattle, addresses the existing conflict over transcription during meiotic maturation, and contributes to the global goal of improving assisted reproductive technology.
Abstract:
Context. Stellar activity makes the mass determination of CoRoT-7b and CoRoT-7c uncertain. Investigators of the CoRoT team have proposed several solutions, but all but one of them are larger than the initial determinations of 4.8 +/- 0.8 M_Earth for CoRoT-7b and 8.4 +/- 0.9 M_Earth for CoRoT-7c. Aims. This investigation uses the excellent HARPS radial velocity measurements of CoRoT-7 to redetermine the planet masses and to explore techniques for determining the masses and orbital elements of planets discovered around active stars, when the variation in radial velocity due to stellar activity cannot be treated as mere noise and can exceed the variation due to the planets. Methods. The main technique used here is a self-consistent version of the high-pass filter used by Queloz et al. (2009, A&A, 506, 303) in the first mass determination of CoRoT-7b and CoRoT-7c. The results are compared to those given by two alternative techniques: (1) the approach proposed by Hatzes et al. (2010, A&A, 520, A93), using only those nights in which two or three observations were made; and (2) a pure Fourier analysis. In all cases, the eccentricities are taken equal to zero, as indicated by the study of the tidal evolution of the system. The periods are also kept fixed at the values given by Queloz et al. Only the observations made in the time interval BJD 2 454 847-873 are used, because they include many nights with multiple observations; otherwise, it is not possible to separate the effects of the fourth harmonic of the rotation (5.91 d = P_rot/4) from the alias of the orbital period of CoRoT-7b (0.853585 d). Results. The results of the various approaches are combined to give planet masses of 8.0 +/- 1.2 M_Earth for CoRoT-7b and 13.6 +/- 1.4 M_Earth for CoRoT-7c. An estimate of the variation in the star's radial velocity due to its activity is also given. Conclusions.
The results obtained with the three different approaches agree in giving higher masses than previous determinations. Together with existing internal structure models, they indicate that CoRoT-7b is a much denser super-Earth. The bulk density is 11 +/- 3.5 g cm^-3, so CoRoT-7b may be rocky with a large iron core.
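With eccentricities fixed at zero and periods held fixed, each planet contributes a pure sinusoid to the radial velocity, so its semi-amplitude can be recovered by linear least squares once the activity signal has been removed. The sketch below illustrates this on synthetic data; it is not the paper's self-consistent filter, and the amplitudes, noise level, sampling and the period used for CoRoT-7c are illustrative assumptions.

```python
import numpy as np

# Sketch: with circular orbits (e = 0) and fixed periods, each planet's
# radial-velocity contribution is a sinusoid, so the semi-amplitudes K
# follow from a linear least-squares fit. P_b is from the abstract; P_c,
# the amplitudes, the sampling and the noise level are assumptions.
rng = np.random.default_rng(0)
periods = np.array([0.853585, 3.698])     # days (P_c assumed)
K_true = np.array([5.0, 6.0])             # m/s, illustrative
t = np.sort(rng.uniform(0.0, 26.0, 200))  # observation epochs, days

rv = sum(K * np.sin(2 * np.pi * t / P) for K, P in zip(K_true, periods))
rv = rv + rng.normal(0.0, 0.5, t.size)    # residual noise after filtering

# Design matrix: sine and cosine terms at each fixed period.
cols = []
for P in periods:
    w = 2 * np.pi / P
    cols += [np.sin(w * t), np.cos(w * t)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, rv, rcond=None)
K_fit = np.hypot(coef[0::2], coef[1::2])  # recovered semi-amplitudes
```

Because the fit is linear in the sine/cosine coefficients, no nonlinear optimizer is needed once periods and eccentricities are fixed.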
Abstract:
Three-dimensional spectroscopy techniques are becoming increasingly popular, producing a growing number of large data cubes. The challenge of extracting information from these cubes requires the development of new techniques for data processing and analysis. We apply the recently developed technique of principal component analysis (PCA) tomography to a data cube from the center of the elliptical galaxy NGC 7097 and show that this technique is effective in decomposing the data into physically interpretable information. We find that the first five principal components of our data are associated with distinct physical characteristics. In particular, we detect a low-ionization nuclear emission-line region (LINER) with a weak broad component in the Balmer lines. Two images of the LINER are present in our data: one seen through a disk of gas and dust, and the other after scattering by free electrons and/or dust particles in the ionization cone. Furthermore, we extract the spectrum of the LINER, decontaminated from stellar and extended nebular emission, using only PCA tomography. We anticipate that the scattered image is polarized, owing to its scattered nature.
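PCA tomography treats each spaxel of the cube as an observation and each wavelength as a variable: the eigenvectors are "eigenspectra", and the projections, reshaped back to images, are "tomograms". A minimal sketch on a synthetic cube follows; the cube contents, dimensions and noise level are assumptions for illustration, not the NGC 7097 data.

```python
import numpy as np

# Sketch of PCA tomography on a toy data cube (ny, nx, n_wavelengths).
rng = np.random.default_rng(1)
ny, nx, nw = 20, 20, 50
wave = np.linspace(0.0, 1.0, nw)

# Two spatially distinct components: a continuum brightness gradient and
# a compact emission-line source, plus noise (all synthetic).
continuum = np.linspace(0, 1, ny)[:, None, None] * np.ones((1, nx, nw))
line = np.exp(-((wave - 0.5) / 0.02) ** 2)          # narrow emission line
psf = np.zeros((ny, nx)); psf[10, 10] = 5.0         # point source position
cube = continuum + psf[:, :, None] * line + rng.normal(0, 0.05, (ny, nx, nw))

X = cube.reshape(ny * nx, nw)                       # spaxels x wavelengths
X = X - X.mean(axis=0)                              # mean-subtract per wavelength
U, S, Vt = np.linalg.svd(X, full_matrices=False)
eigenspectra = Vt                                   # principal components
tomograms = (X @ Vt.T).reshape(ny, nx, nw)          # one image per component
variance_fraction = S**2 / np.sum(S**2)
```

In this toy cube the first component captures the continuum gradient, while the second isolates the compact line emitter, its tomogram peaking at the source position.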
Abstract:
In this work, an iterative strategy is developed to tackle the problem of coupling dimensionally-heterogeneous models in the context of fluid mechanics. The procedure proposed here reinterprets the original problem as a nonlinear interface problem, to which classical nonlinear solvers can be applied. Strong coupling of the partitions is achieved while running a different code for each partition, each in black-box mode. The main application for which this procedure is envisaged arises when modeling hydraulic networks in which complex and simple subsystems are treated using detailed and simplified models, respectively. The potential and performance of the strategy are assessed through several examples involving transient flows and complex network configurations.
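The interface reformulation can be illustrated with a toy example: two black-box "codes" exchange an interface pressure and flow, and the coupled state is the root of an interface residual, found here with a classical secant solver. The model equations below are invented stand-ins, not the paper's hydraulic models.

```python
# Sketch of the nonlinear interface formulation: the coupled problem is
# posed as R(p) = 0 on the interface variable and solved with a classical
# (secant) method, treating each partition as a black box.

def detailed_model(p_interface):
    # black box 1: given an interface pressure, return the flow it delivers
    return (10.0 - p_interface) / 2.0      # toy linear resistance law

def simplified_model(q_interface):
    # black box 2: given an interface flow, return the pressure it requires
    return 1.0 + 0.5 * q_interface ** 2    # toy nonlinear head loss

def interface_residual(p):
    return p - simplified_model(detailed_model(p))

def secant(f, x0, x1, tol=1e-10, maxit=50):
    """Classical secant iteration; only black-box evaluations of f."""
    f0, f1 = f(x0), f(x1)
    for _ in range(maxit):
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
    return x1

p_star = secant(interface_residual, 0.0, 5.0)
q_star = detailed_model(p_star)
# At convergence both partitions agree on the interface state (p_star, q_star),
# which is the strong-coupling condition.
```

Any classical root-finder (fixed point with relaxation, Broyden, Newton with finite differences) can replace the secant step without opening either code.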
Abstract:
In this work, we investigate knowledge acquisition as performed by multiple agents that interact as they infer, in the presence of observation errors, respective models of a complex system. We focus on the specific case in which, at each time step, each agent takes into account its current observation as well as the average of its neighbors' models. The agents are connected by an interaction network of Erdos-Renyi or Barabasi-Albert type. First, we investigate situations in which one of the agents has a different (higher or lower) probability of observation error. It is shown that the influence of this special agent on the quality of the models inferred by the rest of the network can be substantial, varying linearly with the degree of the agent with the different estimation error. When the degree of this agent is taken as a fitness parameter, the effect of the different estimation error is even more pronounced, becoming superlinear. To complement our analysis, we provide an analytical solution for the overall performance of the system. We also investigate the knowledge acquisition dynamics when the agents are grouped into communities. We verify that the inclusion of edges between agents (within a community) having a higher probability of observation error promotes a loss of quality in the estimates of the agents in the other communities.
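A minimal simulation of the update rule described above can be sketched as follows, assuming an Erdos-Renyi network, a scalar quantity to be estimated, Gaussian observation errors, and equal weighting of observation and neighborhood average; all of these are illustrative choices, and one agent is given a larger error to mimic the "special agent" scenario.

```python
import numpy as np

# Toy sketch: each agent combines its own noisy observation of a fixed
# quantity with the average model of its network neighbours. Graph size,
# noise levels and the 0.5/0.5 weighting are illustrative assumptions.
rng = np.random.default_rng(2)
n, p_edge, truth = 50, 0.1, 1.0

M = rng.random((n, n)) < p_edge
M = np.triu(M, 1)
A = (M | M.T).astype(float)                     # Erdos-Renyi adjacency
deg = A.sum(axis=1).clip(min=1)

sigma = np.full(n, 0.5)                         # observation-error scale
sigma[0] = 2.0                                  # one agent observes much worse

models = np.zeros(n)
for _ in range(200):
    obs = truth + rng.normal(0.0, sigma)        # current noisy observations
    neigh_avg = (A @ models) / deg              # average of neighbours' models
    models = 0.5 * (obs + neigh_avg)            # the update rule from the text

errors = np.abs(models - truth)                 # per-agent model quality
```

Rewiring the bad agent's edges or varying its degree in this sketch is one way to probe the degree dependence the abstract describes.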
Abstract:
Thanks to recent advances in molecular biology, allied to an ever-increasing amount of experimental data, the functional state of thousands of genes can now be extracted simultaneously using methods such as cDNA microarrays and RNA-Seq. Particularly important related investigations are the modeling and identification of gene regulatory networks from expression data sets. Such knowledge is fundamental for many applications, such as disease treatment, therapeutic intervention strategies and drug design, as well as for planning new high-throughput experiments. Methods have been developed for gene network modeling and identification from expression profiles. However, an important open problem is how to validate such approaches and their results. This work presents an objective approach for the validation of gene network modeling and identification, comprising three main aspects: (1) Artificial Gene Network (AGN) model generation through theoretical models of complex networks, which are used to simulate temporal expression data; (2) a computational method for gene network identification from the simulated data, founded on a feature selection approach in which a target gene is fixed and the expression profiles of all other genes are examined in order to identify a relevant subset of predictors; and (3) validation of the identified AGN-based network through comparison with the original network. The proposed framework allows several types of AGNs to be generated and used to simulate temporal expression data. The results of the network identification method can then be compared with the original network in order to estimate its properties and accuracy. Some of the most important theoretical models of complex networks have been assessed: the uniformly-random Erdos-Renyi (ER), the small-world Watts-Strogatz (WS), the scale-free Barabasi-Albert (BA), and geographical networks (GG).
The experimental results indicate that the inference method was sensitive to variation of the average degree k, its network recovery rate decreasing as k increases. The signal size was important for the inference method to achieve better accuracy in network identification, with very good results obtained from small expression profiles. However, the adopted inference method was not able to recognize distinct structures of interaction among genes, presenting similar behavior when applied to different network topologies. In summary, the proposed framework, though simple, was adequate for the validation of the inferred networks, identifying some properties of the evaluated method, and it can be extended to other inference methods.
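The three-step validation loop (generate an AGN, simulate expression, infer predictors and compare with the known topology) can be sketched as follows. Boolean single-parent dynamics and a mutual-information criterion are simplifying assumptions for illustration, not the framework's actual network models or feature-selection method.

```python
import numpy as np
from itertools import product

# Minimal sketch of the validation loop: (1) generate an artificial gene
# network, (2) simulate expression, (3) infer a predictor per target gene,
# (4) score against the known topology.
rng = np.random.default_rng(3)
n_genes, n_steps = 12, 300

# (1) random network: each gene is driven by exactly one other gene.
parent = np.array([rng.choice([g for g in range(n_genes) if g != i])
                   for i in range(n_genes)])
negate = rng.random(n_genes) < 0.5              # activation or inhibition

# (2) simulate a noisy Boolean time series.
x = np.zeros((n_steps, n_genes), dtype=int)
x[0] = rng.integers(0, 2, n_genes)
for t in range(1, n_steps):
    x[t] = np.where(negate, 1 - x[t - 1, parent], x[t - 1, parent])
    flip = rng.random(n_genes) < 0.05           # observation/dynamics noise
    x[t] = np.where(flip, 1 - x[t], x[t])

def mutual_info(a, b):
    """Mutual information (nats) between two binary series."""
    mi = 0.0
    for va, vb in product([0, 1], repeat=2):
        p_ab = np.mean((a == va) & (b == vb))
        p_a, p_b = np.mean(a == va), np.mean(b == vb)
        if p_ab > 0:
            mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

# (3) infer each target's predictor: the gene maximizing MI one step back.
inferred = np.array([
    max((g for g in range(n_genes) if g != i),
        key=lambda g: mutual_info(x[:-1, g], x[1:, i]))
    for i in range(n_genes)])

# (4) recovery rate against the known ground-truth network.
recovery = np.mean(inferred == parent)
```

Replacing the toy generator with ER, WS, BA or geographical topologies, and the MI criterion with another feature-selection score, reproduces the framework's comparison loop.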
Abstract:
Background: Identifying local similarity between two or more sequences, or identifying repeats occurring at least twice in a sequence, is an essential part of the analysis of biological sequences and of their phylogenetic relationships. Finding such fragments while allowing for a certain number of insertions, deletions, and substitutions is, however, known to be a computationally expensive task, and consequently exact methods usually cannot be applied in practice. Results: The filter TUIUIU that we introduce in this paper provides a possible solution to this problem. It can be used as a preprocessing step for any multiple alignment or repeat inference method, eliminating a possibly large fraction of the input that is guaranteed not to contain any approximate repeat. It consists in the verification of several strong necessary conditions that can be checked quickly. We implemented three versions of the filter. The first is a straightforward extension, to the case of multiple sequences, of conditions already existing in the literature. The second uses a stronger condition which, as our results show, enables considerably more filtering at negligible (if any) additional time cost. The third version uses an additional condition and pushes the sensitivity of the filter even further, at a non-negligible additional time cost in many circumstances; our experiments show that it is particularly useful at large error rates. The latter version was applied as a preprocessing step for a multiple alignment tool, yielding an overall time (filter plus alignment) on average 63 and at best 530 times smaller than before (direct alignment), with in most cases a better-quality alignment. Conclusion: To the best of our knowledge, TUIUIU is the first filter designed for multiple repeats and for dealing with error rates greater than 10% of the repeat length.
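TUIUIU's exact necessary conditions are given in the paper; the sketch below only illustrates the general flavor of such filters using the classical q-gram lemma: a region containing an approximate occurrence of a length-m fragment with at most e differences must share at least m - q + 1 - q*e q-grams with it. The parameters, window handling and example sequences are simplifying assumptions.

```python
from collections import Counter

# Generic q-gram filter sketch (not TUIUIU's conditions): windows that
# cannot satisfy the q-gram lemma are discarded without alignment.

def qgram_counts(s, q):
    """Multiset of the overlapping q-grams of s."""
    return Counter(s[i:i + q] for i in range(len(s) - q + 1))

def may_contain_repeat(window, fragment, q, e):
    """Necessary condition only: False means 'certainly no occurrence'."""
    threshold = len(fragment) - q + 1 - q * e
    shared = sum((qgram_counts(window, q) & qgram_counts(fragment, q)).values())
    return shared >= threshold

# A window unrelated to the fragment is filtered out; a near-copy survives.
frag = "ACGTACGGTACCA"
win_far = "T" * 20                                   # unrelated region
win_near = "GG" + frag[:6] + "A" + frag[7:] + "CC"   # one substitution inside
keep_far = may_contain_repeat(win_far, frag, q=3, e=1)
keep_near = may_contain_repeat(win_near, frag, q=3, e=1)
```

Because the condition is only necessary, surviving windows still go to the (expensive) verification or alignment step, exactly as in a preprocessing filter.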
Abstract:
In 2003-2004, several food items were purchased from large commercial outlets in Coimbra, Portugal. The items included meats (chicken, pork, beef), eggs, rice, beans and vegetables (tomato, carrot, potato, cabbage, broccoli, lettuce). Elemental analysis was carried out through INAA at the Technological and Nuclear Institute (ITN, Portugal), the Nuclear Energy Centre for Agriculture (CENA, Brazil), and the Nuclear Engineering Teaching Lab of the University of Texas at Austin (NETL, USA). At the latter two, INAA was also combined with Compton suppression. It can be concluded that by applying Compton suppression (1) the detection limits for arsenic, copper and potassium improved; (2) the counting-statistics error for molybdenum diminished; and (3) the long-lived zinc had its 1115-keV photopeak better defined. In general, however, the improvement sought by introducing Compton suppression in foodstuff analysis was not significant. Lettuce, cabbage and chicken (liver, stomach, heart) were the richest items in terms of human nutrients.
Abstract:
This work proposes an association between musical analysis techniques developed during the twentieth and twenty-first centuries, presented by authors such as Felix Salzer and Joseph Straus, and the music theory concepts presented by Olivier Messiaen, applied to the analysis of Prélude no. 1, La Colombe. The analysis helps to broaden the theoretical concepts presented by the composer. In the conclusion, we trace the outlines of Olivier Messiaen's authorial sonority.