970 results for Univariate Analysis box-jenkins methodology
Abstract:
Background: Microarray data are frequently used to characterize the expression profile of a whole genome and to compare the characteristics of that genome under several conditions. Geneset analysis methods have been described previously to analyze the expression values of several genes related by known biological criteria (metabolic pathway, pathology signature, co-regulation by a common factor, etc.) at the same time, and these methods allow many values to be used together to help discover the underlying biological mechanisms. Results: As several methods assume different null hypotheses, we propose to reformulate the main question that biologists seek to answer. To determine which genesets are associated with expression values that differ between two experiments, we focused on three ad hoc criteria: expression levels, the direction of individual gene expression changes (up- or down-regulation), and correlations between genes. We introduce the FAERI methodology, tailored from a two-way ANOVA, to examine these criteria. The significance of the results was evaluated according to the self-contained null hypothesis, using label sampling or by inferring the null distribution from normally distributed random data. Evaluations performed on simulated data revealed that FAERI outperforms currently available methods for each type of set tested. We then applied the FAERI method to analyze three real-world datasets on hypoxia response. FAERI was able to detect more genesets than other methodologies, and the genesets selected were coherent with current knowledge of the cellular response to hypoxia. Moreover, the genesets selected by FAERI were confirmed when the analysis was repeated on two additional related datasets. Conclusions: The expression values of genesets are associated with several biological effects. The underlying mathematical structure of the genesets allows for analysis of data from several genes at the same time.
Focusing on expression levels, the direction of the expression changes, and correlations, we showed that two-step data reduction allowed us to significantly improve the performance of geneset analysis using a modified two-way ANOVA procedure, and to detect genesets that current methods fail to detect.
Abstract:
The availability of high-resolution Digital Elevation Models (DEMs) at a regional scale enables the analysis of topography with a high level of detail. Hence, a DEM-based geomorphometric approach becomes more accurate for detecting potential rockfall sources. Potential rockfall source areas are identified according to the slope angle distribution deduced from the high-resolution DEM, crossed with other information extracted from geological and topographic maps in GIS format. The slope angle distribution can be decomposed into several Gaussian distributions that can be considered characteristic of morphological units: rock cliffs, steep slopes, footslopes and plains. Terrain is considered a potential rockfall source when its slope angle lies above an angle threshold, which is defined where the Gaussian distribution of the morphological unit "Rock cliffs" becomes dominant over that of "Steep slopes". In addition to this analysis, the cliff outcrops indicated by the topographic maps were added. However, these contain "flat areas", so only slope angle values above the mode of the Gaussian distribution of the morphological unit "Steep slopes" were considered. An application of this method is presented over the entire Canton of Vaud (3,200 km²), Switzerland. The results were compared with rockfall sources observed in the field and from orthophoto analysis in order to validate the method. Finally, the influence of the cell size of the DEM is examined by applying the methodology to six different DEM resolutions.
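The thresholding step described in the abstract above can be sketched numerically. The sketch below fits nothing to real data; it simply locates the angle at which a hypothetical "Rock cliffs" Gaussian component becomes dominant over a "Steep slopes" one, using illustrative weights, means and standard deviations that are assumptions, not the study's fitted values.

```python
import math

def gaussian_pdf(x, mu, sigma, weight=1.0):
    """Weighted normal density for one morphological-unit component."""
    return weight / (sigma * math.sqrt(2 * math.pi)) * math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def dominance_threshold(steep, cliff, lo=0.0, hi=90.0, step=0.01):
    """Smallest slope angle (degrees) at which the 'Rock cliffs' component
    dominates the 'Steep slopes' one. Each component is (mu, sigma, weight).
    A simple grid scan; the actual method works on a decomposition fitted
    to the DEM-derived slope angle distribution."""
    x = lo
    while x <= hi:
        if gaussian_pdf(x, *cliff) > gaussian_pdf(x, *steep):
            return x
        x += step
    return None

# Hypothetical components: steep slopes ~ N(35 deg, 8 deg), rock cliffs ~ N(60 deg, 10 deg)
threshold = dominance_threshold(steep=(35.0, 8.0, 0.7), cliff=(60.0, 10.0, 0.3))
```

With these illustrative parameters the crossover falls between the two component means, which is the qualitative behavior the method relies on.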
Abstract:
A procedure based on quantum molecular similarity measures (QMSM) has been used to compare electron densities obtained from conventional ab initio and density functional methodologies at their respective optimized geometries. This method has been applied to a series of small molecules with experimentally known properties and molecular bonds of diverse degrees of ionicity and covalency. Results show that in most cases the electron densities obtained from density functional methodologies are of similar quality to post-Hartree-Fock generalized densities. For molecules where the Hartree-Fock methodology yields erroneous results, the density functional methodology is shown to usually yield more accurate densities than those provided by second-order Møller-Plesset perturbation theory.
Abstract:
Several superstructure design methodologies have been developed for low-volume road bridges by the Iowa State University Bridge Engineering Center. However, to date no standard abutment designs have been developed. Thus, there was a need to establish an easy-to-use design methodology in addition to generating generic abutment standards and other design aids for the more common substructure systems used in Iowa. The final report for this project consists of three volumes. The first volume (this volume) summarizes the research completed in this project. A survey of Iowa county engineers was conducted, from which it was determined that while most counties use similar types of abutments, only 17 percent use some type of standard abutment designs or plans. A literature review revealed several possible alternative abutment systems for future use on low-volume road bridges, in addition to two separate substructure lateral load analysis methods. These consisted of a linear and a non-linear method. The linear analysis method was used for this project due to its relative simplicity and the relative accuracy of the maximum pile moment when compared to values obtained from the more complex non-linear analysis method. The resulting design methodology was developed for single-span stub abutments supported on steel or timber piles with a bridge span length ranging from 20 to 90 ft and roadway widths of 24 and 30 ft. However, other roadway widths can be designed using the foundation design template provided. The backwall height is limited to a range of 6 to 12 ft, and the soil type is classified as cohesive or cohesionless. The design methodology was developed using the guidelines specified by the American Association of State Highway and Transportation Officials Standard Specifications, the Iowa Department of Transportation Bridge Design Manual, and the National Design Specification for Wood Construction.
The second volume introduces and outlines the use of the various design aids developed for this project. Charts for determining dead and live gravity loads based on the roadway width, span length, and superstructure type are provided. A foundation design template was developed in which the engineer can check a substructure design by inputting basic bridge site information. Tables published by the Iowa Department of Transportation that provide values for estimating pile friction and end bearing for different combinations of soils and pile types are also included. Generic standard abutment plans were developed for which the engineer can provide necessary bridge site information in the spaces provided. These tools enable engineers to design and detail county bridge substructures more efficiently. The third volume provides two sets of calculations that demonstrate the application of the substructure design methodology developed in this project. These calculations also verify the accuracy of the foundation design template. The printouts from the foundation design template are provided at the end of each example. Also several tables provide various foundation details for a pre-cast double tee superstructure with different combinations of soil type, backwall height, and pile type.
Abstract:
Several superstructure design methodologies have been developed for low-volume road bridges by the Iowa State University Bridge Engineering Center. However, to date no standard abutment designs have been developed. Thus, there was a need to establish an easy-to-use design methodology in addition to generating generic abutment standards and other design aids for the more common substructure systems used in Iowa. The final report for this project consists of three volumes. The first volume summarizes the research completed in this project. A survey of Iowa county engineers was conducted, from which it was determined that while most counties use similar types of abutments, only 17 percent use some type of standard abutment designs or plans. A literature review revealed several possible alternative abutment systems for future use on low-volume road bridges, in addition to two separate substructure lateral load analysis methods. These consisted of a linear and a non-linear method. The linear analysis method was used for this project due to its relative simplicity and the relative accuracy of the maximum pile moment when compared to values obtained from the more complex non-linear analysis method. The resulting design methodology was developed for single-span stub abutments supported on steel or timber piles with a bridge span length ranging from 20 to 90 ft and roadway widths of 24 and 30 ft. However, other roadway widths can be designed using the foundation design template provided. The backwall height is limited to a range of 6 to 12 ft, and the soil type is classified as cohesive or cohesionless. The design methodology was developed using the guidelines specified by the American Association of State Highway and Transportation Officials Standard Specifications, the Iowa Department of Transportation Bridge Design Manual, and the National Design Specification for Wood Construction.
The second volume introduces and outlines the use of the various design aids developed for this project. Charts for determining dead and live gravity loads based on the roadway width, span length, and superstructure type are provided. A foundation design template was developed in which the engineer can check a substructure design by inputting basic bridge site information. Tables published by the Iowa Department of Transportation that provide values for estimating pile friction and end bearing for different combinations of soils and pile types are also included. Generic standard abutment plans were developed for which the engineer can provide necessary bridge site information in the spaces provided. These tools enable engineers to design and detail county bridge substructures more efficiently. The third volume (this volume) provides two sets of calculations that demonstrate the application of the substructure design methodology developed in this project. These calculations also verify the accuracy of the foundation design template. The printouts from the foundation design template are provided at the end of each example. Also several tables provide various foundation details for a pre-cast double tee superstructure with different combinations of soil type, backwall height, and pile type.
Abstract:
The alignment between competences, teaching-learning methodologies and assessment is a key element of the European Higher Education Area. This paper presents the efforts carried out by six Telematics, Computer Science and Electronic Engineering Education teachers towards achieving this alignment in their subjects. In joint work with pedagogues, a set of recommended actions was identified. A selection of these actions was applied and evaluated in the six subjects. The cross-analysis of the results indicates that the actions allow students to better understand the methodologies and assessment planned for the subjects, facilitate (self-)regulation and increase students’ involvement in the subjects.
Abstract:
User-generated content shared in online communities is often described using collaborative tagging systems, where users assign labels to content resources. As a result, a folksonomy emerges that relates a number of tags with the resources they label and the users that have used them. In this paper we analyze the folksonomy of Freesound, an online audio clip sharing site which contains more than two million users and 150,000 user-contributed sound samples covering a wide variety of sounds. By following methodologies taken from similar studies, we compute some metrics that characterize the folksonomy both at the global level and at the tag level. In this manner, we are able to better understand the behavior of the folksonomy as a whole, and also obtain some indicators that can be used as metadata for describing tags themselves. We expect that such a methodology for characterizing folksonomies can be useful to support processes such as tag recommendation or automatic annotation of online resources.
Abstract:
Background: To replicate, retroviruses must insert DNA copies of their RNA genomes into the host genome. This integration process is catalyzed by the viral integrase protein. The site of viral integration has been shown to be non-random and retrovirus-specific. LEDGF/p75, a splice variant encoded by the PSIP1 gene and described as a general transcription coactivator, was identified as a tethering factor binding both to chromatin and to lentiviral integrases, thereby affecting integration efficiency as well as integration site selection. LEDGF/p75 is still a poorly characterized protein, and its endogenous cellular function has yet to be fully determined. In order to start unveiling the roles of LEDGF/p75 in the cell, we investigated the mechanisms involved in the regulation of LEDGF/p75 expression. Materials and methods: To identify the PSIP1 minimal promoter and associated regulatory elements, we cloned a region starting 5 kb upstream of the transcription start site (TSS, +1 reference position) to the ATG start codon (+816), as well as systematic truncations, in a plasmid containing the firefly luciferase reporter gene. These constructs were co-transfected into HEK293 cells with a plasmid encoding the Renilla luciferase under the pTK promoter as an internal control for transfection efficiency. Both luciferase activities were assessed by luminescence as an indicator of promoter activity. Results: Luciferase assays identified regions -76 to +1 and +1 to +94 as two independent minimal promoters, showing respectively a 3.7x and a 2.3x increase in luciferase activity. These two independent minimal promoters worked synergistically, increasing luciferase activity up to 16.3x as compared to background. Moreover, we identified five regulatory blocks which modulated luciferase activity depending on the DNA region tested: three enhancers (-2007 to -1159, -284 to -171 and +94 to +644) and two silencers (-171 to -76 and +796 to +816).
However, the silencing effect of the region -171 to -76 is dependent on the presence of the +94 to +644 region, ruling out the enhancer activity of the latter. Computational analysis of PSIP1 promoter revealed the absence of TATA box and initiator (INR) sequences, classifying this promoter as nonconventional. TATA-less and INR-less promoters are characterized by multiple Sp1 binding sites, involved in the recruitment of the RNA pol II complex. Consistent with this, PSIP1 promoter contains multiple putative Sp1 binding sequences in regions -76 to +1 and +1 to +94.
Abstract:
In insects, the steroid hormone 20-hydroxyecdysone (20E) coordinates major developmental transitions. While the first and final steps of 20E biosynthesis are characterized, the pathway from 7-dehydrocholesterol to 5β-ketodiol, commonly referred to as the "black box", remains hypothetical, and whether there are still unidentified enzymes is unknown. The black box is thought to include some oxidative steps, which are believed to be mediated by P450 enzymes. To identify new enzyme(s) involved in steroid synthesis, we analyzed by small-scale microarray the expression of all the genes encoding P450 enzymes of the malaria mosquito Anopheles gambiae in active steroidogenic organs of adults (ovaries from blood-fed females and male reproductive tracts), compared to inactive steroidogenic organs (ovaries from non-blood-fed females). Some genes encoding P450 enzymes were specifically overexpressed in female ovaries after a blood meal or in male reproductive tracts, but only three genes were found to be overexpressed in the active steroidogenic organs of both females and males: cyp307a1, cyp4g16 and cyp6n1. Among these genes, only cyp307a1 has an expression pattern similar to other mosquito steroidogenic genes. Moreover, loss-of-function by transient RNAi targeting cyp307a1 disrupted ecdysteroid production, demonstrating that this gene is required for ecdysteroid biosynthesis in Anopheles gambiae.
Abstract:
General Introduction: This thesis can be divided into two main parts: the first, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part, the fourth chapter, is concerned with Anti-Dumping (AD) measures. Despite wide-ranging preferential access granted to developing countries by industrial ones under North-South Trade Agreements, whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA, or EBA, it has been claimed that the benefits from improved market access keep falling short of the full potential benefits. RoOs are largely regarded as a primary cause of the under-utilization of the improved market access of PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access in a statistically significant and quantitatively large proportion. Part I: In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Prof. Olivier Cadot, Céline Carrère and Prof. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying it and extending it for the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows.
First, it shows, in the case of PANEURO, that the R-index is useful to summarize how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, it is shown that the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs. The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also contribute to moving regionalism toward more openness and hence to making it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs with a single instrument: Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Therefore, changing all instruments into an MFC would bring improved transparency, pretty much like the "tariffication" of NTBs. The methodology for this exercise is as follows: in step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs.
In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system and explore the possibility of setting this unique instrument at a uniform rate across lines. This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of good value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests. The third chapter of the thesis considers whether the Europe Agreements of the EU, with the current sets of RoOs, could be the potential model for future EU-centered PTAs. First, I studied and coded at the six-digit level of the Harmonised System (HS) both the old RoOs, used before 1997, and the "Single List" RoOs, used since 1997. Second, using a Constant Elasticity of Transformation function in which CEEC exporters smoothly mix sales between the EU and the rest of the world by comparing producer prices on each market, I estimated the trade effects of the EU RoOs.
The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession. Part II: The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument having the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset review" clause (Article 11.3 ADA) under which anti-dumping measures should be reviewed no later than five years from their imposition and terminated unless there is a serious risk of resumption of injurious dumping. The last chapter, written with Prof. Olivier Cadot and Prof. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count-data analysis and survival analysis. First, using Poisson and Negative Binomial regressions, the count of AD measure revocations is regressed on (inter alia) the count of initiations lagged five years. The analysis yields a coefficient on measure initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; and (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases.
Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
Abstract:
Objective: To assess the effectiveness of Problem-Solving Therapy (PST) on family caregivers through the use of scales measuring anxiety, depression and emotional distress, and to explore facilitating factors and obstacles for its use based on the narrative of nurses. Method: A clinical trial and an exploratory focus group, using a mixed-methods analysis. The study was conducted in a primary health care center in Tarragona, Spain, and the sample consisted of 122 family caregivers included in the home care service and 10 nurses who participated in the intervention group. Family caregivers with evident symptoms of anxiety, depression and emotional distress received PST in the intervention group. The focus group consisted of a discussion with eight nurses, which was transcribed and submitted to content analysis. Conclusion: Problem-Solving Therapy proved to be effective in reducing perceived anxiety, depression and emotional distress. We identified its strong points and obstacles as described by the nurses.
Abstract:
Flow cytometry (FCM) is emerging as an important tool in environmental microbiology. Although flow cytometry applications have to date largely been restricted to certain specialized fields of microbiology, such as the bacterial cell cycle and marine phytoplankton communities, technical advances in instrumentation and methodology are leading to its increased popularity and extending its range of applications. Here we focus on a number of recent flow cytometry developments important for addressing questions in environmental microbiology. These include (i) the study of microbial physiology under environmentally relevant conditions, (ii) new methods to identify active microbial populations and to isolate previously uncultured microorganisms, and (iii) the development of high-throughput autofluorescence bioreporter assays.
Abstract:
The mathematical representation of Brunswik's lens model has been used extensively to study human judgment and provides a unique opportunity to conduct a meta-analysis of studies that covers roughly five decades. Specifically, we analyze statistics of the lens model equation (Tucker, 1964) associated with 259 different task environments obtained from 78 papers. In short, we find on average fairly high levels of judgmental achievement and note that people can achieve similar levels of cognitive performance in both noisy and predictable environments. Although overall performance varies little between laboratory and field studies, both differ in terms of components of performance and types of environments (numbers of cues and redundancy). An analysis of learning studies reveals that the most effective form of feedback is information about the task. We also analyze empirically when bootstrapping is more likely to occur. We conclude by indicating shortcomings of the kinds of studies conducted to date, limitations in the lens model methodology, and possibilities for future research.
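For reference, the lens model equation (Tucker, 1964) whose statistics this meta-analysis collects is usually written as:

```latex
% Lens model equation (Tucker, 1964)
% r_a : achievement (correlation between judgment and criterion)
% R_s : environmental predictability (criterion vs. environment model)
% R_e : consistency of the judge (judgment vs. judge's model)
% G   : matching of the two linear models, C : correlation of the residuals
r_a = G\,R_s\,R_e + C\,\sqrt{1 - R_s^{2}}\,\sqrt{1 - R_e^{2}}
```

The first term captures the linearly modelable part of performance; the second term, weighted by C, captures whatever systematic variance the linear models miss.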
Abstract:
BACKGROUND: Three non-synonymous single nucleotide polymorphisms (Q223R, K109R and K656N) of the leptin receptor gene (LEPR) have been tested for association with obesity-related outcomes in multiple studies, showing inconclusive results. We performed a systematic review and meta-analysis of the association of the three LEPR variants with BMI. In addition, we analysed 15 SNPs within the LEPR gene in the CoLaus study, assessing the interaction of the variants with sex. METHODOLOGY/PRINCIPAL FINDINGS: We searched electronic databases, including population-based studies that investigated the association between the LEPR variants Q223R, K109R and K656N and obesity-related phenotypes in healthy, unrelated subjects. We furthermore performed meta-analyses of the genotype and allele frequencies in case-control studies. Results were stratified by SNP and by potential effect modifiers. CoLaus data were analysed by logistic and linear regressions and tested for interaction with sex. The meta-analysis of published data did not show an overall association between any of the tested LEPR variants and overweight. However, the choice of a BMI cut-off value to distinguish cases from controls was crucial to explain heterogeneity in Q223R. Differences in allele frequencies across ethnic groups are compatible with natural selection of the derived alleles in Q223R and K109R and of the ancestral allele in K656N in Asians. In CoLaus, the rs10128072, rs3790438 and rs3790437 variants showed interaction with sex in their association with overweight, waist circumference and fat mass in linear regressions. CONCLUSIONS: Our systematic review and analysis of primary data from the CoLaus study did not show an overall association between LEPR SNPs and overweight. Most studies were underpowered to detect small effect sizes. A potential effect modification by sex, population stratification, as well as the role of natural selection should be addressed in future genetic association studies.
Abstract:
Power transformations of positive data tables, prior to applying the correspondence analysis algorithm, are shown to open up a family of methods with direct connections to the analysis of log-ratios. Two variations of this idea are illustrated. The first approach is simply to power the original data and perform a correspondence analysis; this method is shown to converge to unweighted log-ratio analysis as the power parameter tends to zero. The second approach is to apply the power transformation to the contingency ratios, that is, the values in the table relative to expected values based on the marginals; this method converges to weighted log-ratio analysis, or the spectral map. Two applications are described: first, a matrix of population genetic data which is inherently two-dimensional, and second, a larger cross-tabulation with higher dimensionality, from a linguistic analysis of several books.
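The convergence to log-ratios described above rests on a standard limit: a Box-Cox-style power transform tends to the logarithm as the power parameter goes to zero. A minimal numeric sketch of that limit (the function name and the sample value are illustrative, not taken from the paper):

```python
import math

def power_transform(x, alpha):
    """Box-Cox-style transform of a positive value; tends to log(x) as alpha -> 0."""
    if alpha == 0:
        return math.log(x)
    return (x ** alpha - 1.0) / alpha

# For a fixed positive ratio, shrinking the power parameter moves the
# transformed value monotonically toward its logarithm.
ratio = 2.5
vals = [power_transform(ratio, a) for a in (1.0, 0.5, 0.1, 0.01)]
```

Applied entry-wise to a table of contingency ratios, this is the mechanism by which the powered correspondence analysis approaches the (weighted) log-ratio analysis in the limit.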