842 results for MULTIVARIATE FACTORIAL ANALYSIS
Abstract:
The ecological sciences have experienced immense growth over the course of this century, and chances are that they will continue to grow well into the next millennium. There are good reasons for this – ecology encompasses some of the most pressing concerns facing humanity. With recent advances in data collection technology and ambitious field research, ecologists are increasingly calling upon multivariate statistics to explore and test for patterns in their data. The goal of FISH 560 (Applied Multivariate Statistics for Ecologists) at the University of Washington is to introduce graduate students to the multivariate statistical techniques necessary to carry out sophisticated analyses and to critically evaluate scientific papers that use these approaches. It is a practical, hands-on course emphasizing the analysis and interpretation of multivariate data, and it covers the majority of approaches in common use by ecologists. To celebrate the hard work of past students, I am pleased to announce the creation of the Electronic Journal of Applied Multivariate Statistics (EJAMS). Each year, students in FISH 560 are required to write a final paper consisting of a statistical analysis of their own multivariate data set. These papers are submitted to EJAMS at the end of the quarter and are peer reviewed by two other class members. A decision on publication is based on the reviewers' recommendations and my own reading of the paper. In closing, there is a need for the rapid dissemination of ecological research using multivariate statistics at the University of Washington. EJAMS is committed to this challenge.
Abstract:
We propose a graphical method to visualize possible time-varying correlations among fifteen stock markets. The method is useful for observing stable or emerging clusters of stock markets with similar behaviour. The graphs, generated by applying multidimensional scaling (MDS) techniques, may also guide the construction of multivariate econometric models.
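A minimal sketch of the kind of MDS computation described, assuming simulated returns, a common sqrt(2(1 − ρ)) correlation-to-distance mapping, and scikit-learn's MDS; none of these choices are taken from the paper itself.

```python
# Hypothetical illustration of MDS applied to stock-market correlations.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
returns = rng.normal(size=(500, 15))        # daily returns, 15 markets (simulated)
corr = np.corrcoef(returns, rowvar=False)   # 15 x 15 correlation matrix

# One common correlation-to-distance mapping: d_ij = sqrt(2 * (1 - rho_ij)).
dist = np.sqrt(2.0 * (1.0 - corr))
np.fill_diagonal(dist, 0.0)

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dist)            # one 2-D point per market
print(coords.shape)                         # (15, 2)
```

Markets that plot close together in the resulting map have behaved similarly over the sample window, which is what makes emerging clusters visible.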
Abstract:
Beyond the classical statistical approaches (determination of basic statistics, regression analysis, ANOVA, etc.), a new set of applications of different statistical techniques has increasingly gained relevance in the analysis, processing and interpretation of data concerning the characteristics of forest soils. This can be seen in some recent publications in the context of Multivariate Statistics. These new methods require additional care that is not always taken or mentioned in some approaches. In the particular case of geostatistical applications it is necessary, besides geo-referencing all data acquisition, to collect the samples on regular grids and in sufficient quantity so that the variograms can reflect the spatial distribution of soil properties in a representative manner. As for the great majority of Multivariate Statistics techniques (Principal Component Analysis, Correspondence Analysis, Cluster Analysis, etc.), although in most cases they do not require the assumption of a normal distribution, they nevertheless demand a proper and rigorous strategy for their use. In this work we present some reflections on these methodologies and, in particular, on the main constraints that often arise during the data collection process and on the various ways of linking these different techniques. Finally, illustrations of some particular cases of the application of these statistical methods are also presented.
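As a small illustration of one of the techniques named above (Principal Component Analysis), the following hedged sketch standardizes a hypothetical table of soil properties before extracting components; the variables and data are placeholders, not the authors'.

```python
# Hypothetical PCA on standardized soil-property measurements.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 6))   # 60 samples x 6 soil properties (e.g. pH, C, N, ...)

X_std = StandardScaler().fit_transform(X)   # PCA is scale-sensitive: standardize first
pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)           # sample coordinates on the first 2 PCs

print(pca.explained_variance_ratio_)        # share of variance per component
```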
Abstract:
Glass fibre-reinforced plastics (GFRP), nowadays commonly used in the construction, transportation and automobile sectors, have been considered inherently difficult to recycle due both to the cross-linked nature of thermoset resins, which cannot be remolded, and to the complex composition of the composite itself, which includes glass fibres, matrix and different types of inorganic fillers. At present, most GFRP waste is landfilled, leading to negative environmental impacts and added costs. With increasing awareness of environmental matters and the consequent desire to save resources, recycling would convert an expensive waste disposal problem into a profitable reusable material. There are several methods to recycle GFR thermoset materials: (a) incineration, with partial energy recovery due to the heat generated during combustion of the organic part; (b) thermal and/or chemical recycling, such as solvolysis, pyrolysis and similar thermal decomposition processes, with recovery of the glass fibres; and (c) mechanical recycling or size reduction, in which the material is subjected to a milling process to obtain a specific grain size that makes it suitable as reinforcement in new formulations. This last method has important advantages over the previous ones: there is no atmospheric pollution from gas emissions, much simpler equipment is required compared with the ovens needed for thermal recycling processes, and no chemical solvents, with their attendant environmental impacts, are used. In this study, the effect of incorporating recycled GFRP waste materials, obtained by milling, on the mechanical behavior of polyester polymer mortars was assessed. For this purpose, different contents of recycled GFRP waste, with distinct size gradings, were incorporated into polyester polymer mortars as replacements for sand aggregates and filler. The effect of treating the GFRP waste with a silane coupling agent was also assessed. The design of experiments and the data treatment were accomplished by means of factorial design and analysis of variance (ANOVA). The use of a factorial experimental design, instead of the one-factor-at-a-time method, is efficient in allowing the evaluation of the effects, and possible interactions, of the different material factors involved. Experimental results were promising regarding the recyclability of GFRP waste materials as polymer mortar aggregates, without significant loss of mechanical properties with respect to non-modified polymer mortars.
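The following is a minimal sketch of a two-factor factorial ANOVA of the type described, using statsmodels; the factor names (waste content, size grading), levels and response values are hypothetical stand-ins for the study's actual design.

```python
# Hypothetical two-factor factorial ANOVA: waste content x size grading.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "waste":    np.repeat(["0%", "4%", "8%"], 8),              # GFRP waste content
    "grading":  np.tile(np.repeat(["fine", "coarse"], 4), 3),  # size grading
    "strength": rng.normal(30, 2, size=24),                    # flexural strength, MPa (simulated)
})

model = ols("strength ~ C(waste) * C(grading)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects and the interaction term
```

The interaction row in the ANOVA table is what the one-factor-at-a-time approach cannot provide, which is the efficiency argument made above.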
Abstract:
Trihalomethanes (THMs) are widely referred to and studied as disinfection by-products (DBPs). The THMs most commonly detected are chloroform (TCM), bromodichloromethane (BDCM), chlorodibromomethane (CDBM), and bromoform (TBM). Several studies on the determination of THMs in swimming pool water and air samples have been published. This paper reviews the most recent work in this field, with a special focus on water and air sampling, sample preparation and analytical determination methods. An experimental study was carried out to optimize the headspace solid-phase microextraction (HS-SPME) conditions for TCM, BDCM, CDBM and TBM in water samples using a 2³ factorial design. An extraction temperature of 45 °C, for 25 min, and a desorption time of 5 min were found to be the best conditions. Analysis was performed by gas chromatography with an electron capture detector (GC-ECD). The method was successfully applied to a set of 27 swimming pool water samples collected in the Oporto area (Portugal). TCM was the only THM detected, with levels between 4.5 and 406.5 μg L−1. Four of the samples exceeded the guideline value for total THMs in swimming pool water (100 μg L−1) indicated by the Portuguese Health Authority.
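A short sketch of what a 2³ full factorial design and its main-effect estimates look like in code; the coded factors mirror the three optimized variables above (extraction temperature, extraction time, desorption time), but the responses are simulated, not the study's measurements.

```python
# Hypothetical 2^3 full factorial design with coded levels -1 / +1.
import itertools
import numpy as np

# 8 runs x 3 factors: temperature, extraction time, desorption time.
design = np.array(list(itertools.product([-1, 1], repeat=3)))

rng = np.random.default_rng(3)
response = rng.normal(100, 5, size=8)   # e.g. peak area for one THM (simulated)

# Main effect of factor k: mean response at +1 minus mean response at -1.
for k, name in enumerate(["temperature", "extraction time", "desorption time"]):
    effect = response[design[:, k] == 1].mean() - response[design[:, k] == -1].mean()
    print(f"{name}: {effect:+.2f}")
```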
Abstract:
Dissertation presented as a partial requirement for obtaining the Master's degree in Statistics and Information Management
Abstract:
One hundred twenty-two early-stage anal canal cancer patients (median age: 69 years) were treated with curative radiotherapy with (70 patients) or without (52 patients) concomitant chemotherapy. Median follow-up was 65 months (range: 4-238). On multivariate analysis, concomitant chemotherapy significantly improved local control (p = .007). Local control significantly influenced all the considered endpoints except metastasis-free survival. The overall rates of G3-G4 acute and late toxicity were 13.1% and 8.2%, respectively, and they were not increased by concomitant chemotherapy. In conclusion, concomitant chemotherapy is efficacious and safe in the treatment of T1-2N0 anal canal cancer patients and should be studied prospectively.
Abstract:
BACKGROUND: The impact of early valve surgery (EVS) on the outcome of Staphylococcus aureus (SA) prosthetic valve infective endocarditis (PVIE) is unresolved. The objective of this study was to evaluate the association between EVS, performed within the first 60 days of hospitalization, and the outcome of SA PVIE within the International Collaboration on Endocarditis-Prospective Cohort Study. METHODS: Participants were enrolled between June 2000 and December 2006. Cox proportional hazards modeling that included surgery as a time-dependent covariate, with propensity adjustment for the likelihood of receiving cardiac surgery, was used to evaluate the association between EVS and 1-year all-cause mortality in patients with definite left-sided S. aureus PVIE and no history of injection drug use. RESULTS: EVS was performed in 74 of the 168 (44.3%) patients. One-year mortality was significantly higher among patients with S. aureus PVIE than in patients with non-S. aureus PVIE (48.2% vs 32.9%; P = .003). Staphylococcus aureus PVIE patients who underwent EVS had a significantly lower 1-year mortality rate (33.8% vs 59.1%; P = .001). In multivariate, propensity-adjusted models, however, EVS was not associated with 1-year mortality (risk ratio, 0.67 [95% confidence interval, .39-1.15]; P = .15). CONCLUSIONS: In this prospective, multinational cohort of patients with S. aureus PVIE, EVS was not associated with reduced 1-year mortality. The decision to pursue EVS should be individualized for each patient, based upon infection-specific characteristics rather than solely upon the microbiology of the infection causing PVIE.
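As a hedged illustration of a Cox model with surgery as a time-dependent covariate, the sketch below uses the lifelines package and its bundled Stanford heart transplant dataset, which has an analogous long-format structure (an indicator that switches on mid-follow-up); this is not the study's code, data, or propensity adjustment.

```python
# Illustrative time-varying Cox regression on an analogous public dataset.
from lifelines import CoxTimeVaryingFitter
from lifelines.datasets import load_stanford_heart_transplants

# Long format: one row per patient-interval, with start/stop times; the
# 'transplant' column plays the role of a surgery indicator that can
# switch from 0 to 1 during follow-up.
df = load_stanford_heart_transplants()

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratios for the time-varying covariates
```

Treating surgery as time-dependent avoids immortal-time bias: patients must survive long enough to be operated on, so coding surgery as a fixed baseline covariate would flatter the surgical group.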
Abstract:
This study examines the efficiency of search engine advertising strategies employed by firms. The research setting is the online retailing industry, which is characterized by extensive use of Web technologies and high competition for market share and profitability. For Internet retailers, search engines are increasingly serving as an information gateway for many decision-making tasks. In particular, search engine advertising (SEA) has opened a new marketing channel for retailers to attract new customers and improve their performance. In addition to natural (organic) search marketing strategies, search engine advertisers compete for top advertisement slots provided by search brokers such as Google and Yahoo! through keyword auctions. The rationale is that greater visibility on a search engine during a keyword search will capture customers' interest in a business and its product or service offerings. Search engines account for most online activities today. Compared with the slow growth of traditional marketing channels, online search volumes continue to grow at a steady rate. According to the Search Engine Marketing Professional Organization, spending on search engine marketing by North American firms in 2008 was estimated at $13.5 billion. Despite the significant role SEA plays in Web retailing, scholarly research on the topic is limited. Prior studies in SEA have focused on search engine auction mechanism design. In contrast, research on the business value of SEA has been limited by the lack of empirical data on search advertising practices. Recent advances in search and retail technologies have created data-rich environments that enable new research opportunities at the interface of marketing and information technology. This research uses extensive data from Web retailing and Google-based search advertising to evaluate Web retailers' use of resources, search advertising techniques, and other relevant factors that contribute to business performance across different metrics. The methods used include Data Envelopment Analysis (DEA), data mining, and multivariate statistics. This research contributes to the empirical literature by analyzing several Web retail firms in different industry sectors and product categories. One of the key findings is that the dynamics of sponsored search advertising vary between multi-channel and Web-only retailers. While the key performance metrics for multi-channel retailers include measures such as online sales, conversion rate (CR), click-through rate (CTR), and impressions, the key performance metrics for Web-only retailers focus on organic and sponsored ad ranks. These results provide a useful contribution to our organizational-level understanding of search engine advertising strategies, both for multi-channel and Web-only retailers. They also contribute to current knowledge of technology-driven marketing strategies and provide managers with a better understanding of sponsored search advertising and its impact on various performance metrics in Web retailing.
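A rough sketch of an input-oriented CCR DEA model, one of the methods named above, posed as a linear program per retailer (decision-making unit, DMU); the inputs, outputs, and data here are simulated placeholders rather than the study's variables.

```python
# Illustrative input-oriented CCR DEA efficiency scores via linear programming.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)
n, m, s = 8, 2, 2                     # 8 retailers, 2 inputs, 2 outputs
X = rng.uniform(1, 10, size=(m, n))   # inputs, e.g. ad spend, headcount (simulated)
Y = rng.uniform(1, 10, size=(s, n))   # outputs, e.g. sales, click-throughs (simulated)

def ccr_efficiency(o):
    # Decision variables z = [theta, lambda_1 .. lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun                    # theta = 1.0 means the DMU is efficient

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```

Each retailer's score is the largest radial contraction of its inputs that a convex combination of peers could match while producing at least its outputs.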
Abstract:
Latent variable models in finance originate both from asset pricing theory and from time series analysis. These two strands of literature appeal to two different concepts of latent structures, both of which are useful in reducing the dimension of a statistical model specified for a multivariate time series of asset prices. In the CAPM or APT beta pricing models, the dimension reduction is cross-sectional in nature, while in time-series state-space models, dimension is reduced longitudinally by assuming conditional independence between consecutive returns, given a small number of state variables. In this paper, we use the concept of the Stochastic Discount Factor (SDF), or pricing kernel, as a unifying principle to integrate these two concepts of latent variables. Beta pricing relations amount to characterizing the factors as a basis of a vector space for the SDF. The coefficients of the SDF with respect to the factors are specified as deterministic functions of some state variables which summarize their dynamics. In beta pricing models, it is often said that only factor risk is compensated, since the remaining idiosyncratic risk is diversifiable. Implicitly, this argument can be interpreted as a conditional cross-sectional factor structure, that is, a conditional independence between contemporaneous returns of a large number of assets, given a small number of factors, as in standard Factor Analysis. We provide this unifying analysis in the context of conditional equilibrium beta pricing as well as asset pricing with stochastic volatility, stochastic interest rates and other state variables. We address the general issue of econometric specification of dynamic asset pricing models, which covers the modern literature on conditionally heteroskedastic factor models as well as equilibrium-based asset pricing models with an intertemporal specification of preferences and market fundamentals. We interpret various instantaneous causality relationships between state variables and market fundamentals as leverage effects and discuss their central role in the validity of standard CAPM-like stock pricing and preference-free option pricing.
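In symbols, and with notation that is assumed here rather than taken from the paper, the unifying SDF idea sketched above can be written as: the pricing kernel $m_{t+1}$ prices all returns and is spanned by the factors with state-dependent coefficients,

$$E\!\left[m_{t+1} R_{i,t+1} \mid J_t\right] = 1, \qquad m_{t+1} = \sum_{k=1}^{K} \phi_k(s_t)\, F_{k,t+1},$$

which delivers a conditional beta pricing relation of the familiar form

$$E\!\left[R_{i,t+1} \mid J_t\right] - r_{f,t} = \sum_{k=1}^{K} \beta_{ik,t}\, \lambda_{k,t},$$

where the $\beta_{ik,t}$ are conditional regression coefficients of returns on the factors and the $\lambda_{k,t}$ are factor risk premia.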
Abstract:
This thesis, entitled "Reliability Modelling and Analysis in Discrete Time", develops some concepts and models useful in the analysis of discrete lifetime data. The present study consists of five chapters. In Chapter II we take up the derivation of some general results useful in reliability modelling that involve two-component mixtures. Expressions for the failure rate, mean residual life and second moment of residual life of the mixture distributions, in terms of the corresponding quantities of the component distributions, are derived. Some applications of these results are also pointed out. The role of the geometric, Waring and negative hypergeometric distributions as models of life lengths in the discrete time domain has already been discussed. While describing various reliability characteristics, it was found that they can often be considered as a class. The applicability of these models in single populations naturally extends to populations composed of sub-populations, making mixtures of these distributions worth investigating. Accordingly, the general properties, various reliability characteristics and characterizations of these models are discussed in Chapter III. Inference of parameters in a mixture distribution is usually a difficult problem because the mass function of the mixture is a linear function of the component masses, which makes manipulation of the likelihood equations, least-squares function, etc., and the resulting computations very difficult. We show that one of our characterizations helps in inferring the parameters of the geometric mixture without computational hazards. As mentioned in the review of results in the previous sections, partial moments have not been studied extensively in the literature, especially in the case of discrete distributions. Chapters IV and V deal with descending and ascending partial factorial moments. Apart from studying their properties, we prove characterizations of distributions by functional forms of partial moments and establish recurrence relations between successive moments for some well-known families. It is further demonstrated that partial moments are equally efficient and convenient, compared with many of the conventional tools, for resolving practical problems in reliability modelling and analysis. The study concludes by indicating some new problems that surfaced during the course of the present investigation and that could be the subject of future work in this area.
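For concreteness, the kind of mixture identity Chapter II refers to can be stated as follows (the notation is assumed here, not the thesis's): if a discrete lifetime $T$ has mixture survival function $S(x) = p\,S_1(x) + (1-p)\,S_2(x)$ with $S_i(x) = P(T_i \ge x)$, then its failure rate $r(x) = P(T = x)/P(T \ge x)$ is a pointwise weighted average of the component failure rates,

$$r(x) = w(x)\, r_1(x) + \bigl(1 - w(x)\bigr)\, r_2(x), \qquad w(x) = \frac{p\, S_1(x)}{p\, S_1(x) + (1-p)\, S_2(x)},$$

so the mixture's reliability characteristics are expressible entirely in terms of the corresponding component quantities.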
Abstract:
Soil fertility constraints to crop production have been widely recognized as a major obstacle to food security and agro-ecosystem sustainability in sub-Saharan West Africa. As such, they have led to a multitude of research projects and policy debates on how best to overcome them. Conclusions based on long-term multi-site experiments are lacking with respect to a regional assessment of the effects of phosphorus and nitrogen fertilizer, surface-mulched crop residues, and legume rotations on the total dry matter of cereals in this region. A mixed-model time-trend analysis was used to investigate the effects of four nitrogen and phosphorus rates, annually applied crop residue dry matter at 500 and 2000 kg ha⁻¹, and cereal-legume rotation versus continuous cereal cropping on the total dry matter of cereals and legumes. The multi-factorial experiment was conducted over four years at eight locations, with annual rainfall ranging from 510 to 1300 mm, in Niger, Burkina Faso, and Togo. With the exception of phosphorus, treatment effects on legume growth were marginal. At most locations, except for typical Sudanian sites with very low base saturation and high rainfall, phosphorus effects on cereal total dry matter were much lower with rock phosphate than with soluble phosphorus, unless the rock phosphate was combined with an annual seed-placement of 4 kg ha⁻¹ of phosphorus. Across all other treatments, nitrogen effects were negligible at 500 mm annual rainfall, but at 900 mm the highest nitrogen rate led to total dry matter increases of up to 77% and, at 1300 mm, of up to 183%. Mulch-induced increases in cereal total dry matter were larger at lower base saturation, reaching 45% on typical acid sandy Sahelian soils. Legume rotation effects tended to increase over time but were strongly species-dependent.
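A heavily simplified sketch of a mixed-model time-trend analysis of this general kind, with a random intercept per location; the formula, variable names, and data below are placeholders and do not reproduce the study's model.

```python
# Hypothetical mixed model: fixed year trend and N-rate effect, random location.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 8 * 4 * 4   # 8 locations x 4 years x 4 N rates (simplified)
df = pd.DataFrame({
    "location": np.repeat([f"site{i}" for i in range(8)], 16),
    "year":     np.tile(np.repeat([1, 2, 3, 4], 4), 8),
    "n_rate":   np.tile([0, 30, 60, 120], 32),
    "tdm":      rng.normal(3000, 400, size=n),   # cereal total dry matter, kg/ha (simulated)
})

model = smf.mixedlm("tdm ~ year * n_rate", df, groups=df["location"]).fit()
print(model.summary())
```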
Abstract:
These notes have been prepared as support for a short course on compositional data analysis. Their aim is to convey the basic concepts and skills for simple applications, thus setting the stage for more advanced projects.
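As a taste of the basic skills such a course covers, here is a small sketch of the centred log-ratio (clr) transform, a standard first tool in compositional data analysis; the example composition is made up.

```python
# Centred log-ratio (clr) transform of a composition (parts of a whole).
import numpy as np

def clr(x):
    """Log of each part over the geometric mean of all parts."""
    x = np.asarray(x, dtype=float)
    g = np.exp(np.mean(np.log(x)))   # geometric mean
    return np.log(x / g)

composition = np.array([0.60, 0.25, 0.15])   # made-up composition, sums to 1
z = clr(composition)
print(z, z.sum())   # clr coordinates sum to zero by construction
```

Working in clr coordinates moves the data off the simplex, so standard multivariate methods can be applied without the spurious-correlation problems of raw proportions.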
Abstract:
One of the disadvantages of old age is that there is more past than future: this, however, may be turned into an advantage if the wealth of experience and, hopefully, wisdom gained in the past can be reflected upon and throw some light on possible future trends. To an extent, then, this talk is necessarily personal, certainly nostalgic, but also self-critical and inquisitive about our understanding of the discipline of statistics. A number of almost philosophical themes will run through the talk: the search for appropriate modelling in relation to the real problem envisaged, emphasis on sensible balances between simplicity and complexity, the relative roles of theory and practice, the nature of the communication of inferential ideas to the statistical layman, and the inter-related roles of teaching, consultation and research. A list of keywords might be: identification of the sample space and its mathematical structure, choices between transform and stay, the role of parametric modelling, the role of a sample space metric, the underused hypothesis lattice, and the nature of compositional change, particularly in relation to the modelling of processes. While the main theme will be relevance to compositional data analysis, we shall point to substantial implications for general multivariate analysis arising from experience of the development of compositional data analysis…