925 results for Models and Methods


Relevance: 100.00%

Abstract:

This paper proposes solutions to three issues pertaining to the estimation of finite mixture models with an unknown number of components: the non-identifiability induced by overfitting the number of components, the mixing limitations of standard Markov chain Monte Carlo (MCMC) sampling techniques, and the related label-switching problem. An overfitting approach is used to estimate the number of components in a finite mixture model via the Zmix algorithm. Zmix provides a bridge between multidimensional samplers and test-based estimation methods, whereby priors are chosen to encourage extra groups to have weights approaching zero. MCMC sampling is made possible by the implementation of prior parallel tempering, an extension of parallel tempering. Given a sufficiently large sample, Zmix accurately estimates the number of components, the posterior parameters and the allocation probabilities. The results reflect uncertainty in the final model, reporting the range of candidate models and their respective estimated probabilities from a single run. Label switching is resolved with a computationally lightweight method, Zswitch, developed for overfitted mixtures by exploiting the intuitiveness of allocation-based relabelling algorithms and the precision of label-invariant loss functions. Four simulation studies and three case studies from the literature illustrate Zmix and Zswitch. All methods are available as part of the R package Zmix, which can currently be applied to univariate Gaussian mixture models.
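The mechanism behind the overfitting prior, namely that a Dirichlet concentration well below one drives the weights of superfluous components toward zero, can be sketched in a few lines of plain Python. This illustrates only the prior's behaviour, not the Zmix sampler itself; the concentration values and the 0.05 "active component" threshold are arbitrary illustrative choices:

```python
import random

def sample_dirichlet(alpha, k, rng):
    """Draw one sample from a symmetric Dirichlet(alpha) over k components."""
    g = [rng.gammavariate(alpha, 1.0) for _ in range(k)]
    total = sum(g) or 1.0  # guard against underflow to zero at tiny alpha
    return [x / total for x in g]

def effective_components(weights, threshold=0.05):
    """Count components whose mixing weight exceeds a small threshold."""
    return sum(1 for w in weights if w > threshold)

rng = random.Random(1)
k = 10  # deliberately overfitted number of components

# Sparse prior (alpha << 1): mass piles onto a few components.
sparse = [effective_components(sample_dirichlet(0.01, k, rng)) for _ in range(200)]
# Flat prior (alpha = 1): weights spread over many components.
flat = [effective_components(sample_dirichlet(1.0, k, rng)) for _ in range(200)]

avg_sparse = sum(sparse) / len(sparse)
avg_flat = sum(flat) / len(flat)
print(avg_sparse, avg_flat)
```

Under the sparse prior the average number of non-negligible weights is far below that of the flat prior, which is exactly what lets an overfitted mixture report its "true" number of components.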

Relevance: 100.00%

Abstract:

This thesis explores melodic and harmonic features of heavy metal and, in doing so, examines various methods of music analysis and their applicability and limitations for the study of heavy metal music. The study is built on three general hypotheses: 1) acoustic characteristics play a significant role in chord construction in heavy metal, 2) heavy metal has strong ties and similarities with other Western musical styles, and 3) theories and analytical methods of Western art music may be applied to heavy metal. It seems evident that in heavy metal some chord structures appear far more frequently than others. It is suggested here that the fundamental reason for this is the use of the guitar distortion effect. Consequently, theories of how and on what principles heavy metal is constructed need to be reconsidered; analytical models for the classification of consonance and dissonance and for chord categorization are revised here to meet the common practices of this music. It is evident that heavy metal is not an isolated style of music; it is seen here as a cultural fusion of various musical styles. Moreover, it is suggested that the theoretical background to the construction and analysis of Western music can offer invaluable insights into heavy metal. However, the analytical methods need to be reformed to some extent to match the characteristics of the music. This reformation includes an accommodation of linear and functional theories that has so far been rare in music theory and musicology.

Relevance: 100.00%

Abstract:

Objectives: Head and neck squamous cell carcinoma (HNSCC) is a heterogeneous tumour type, which necessitates multiple in vitro models to attain an appreciation of its subtypes. The phenomenon of epithelial-mesenchymal transition (EMT) is important to the development of a metastatic cancer cell phenotype, being relevant to the ability of cancer cells to intravasate into vasculature and to invade tissues. The role of EMT in human papillomavirus (HPV) positive HNSCC is not well understood. This paper aims to characterize seven HNSCC cell lines (FaDu, SCC-25, SCC-15, CAL27, RPMI2650), including two new HPV-16 positive HNSCC cell lines (UD-SCC2, 93-VU-147T), for their epithelial and mesenchymal properties.

Materials and methods: A panel of HNSCC cell lines from multiple head and neck anatomical sites was profiled for basal expression of epithelial and mesenchymal characteristics at the mRNA, protein and functional levels (proliferative, migratory and invasive properties). Furthermore, 3D spheroid-forming capabilities were investigated.

Results: We found that the HPV-16 positive cell lines, in particular UD-SCC2, demonstrated a more invasive and mesenchymal phenotype at the molecular and functional levels, suggesting that HPV infection may mediate some of these cellular properties. Moreover, HPV-negative cell lines were not strictly epithelial, presenting a dynamic range of expression.

Conclusions: This study presents the molecular and phenotypic diversity of HNSCC cell lines. It highlights the need for more studies in this field and for a scoring system in which HNSCC cell lines are ranked according to their respective epithelial and mesenchymal nature. These data will be useful to anyone modelling HNSCC behaviour, providing a molecular context that will enable them to decipher cell phenotypes and to develop therapies that block EMT progression.

Relevance: 100.00%

Abstract:

This article presents a method for checking the conformance between an event log capturing the actual execution of a business process, and a model capturing its expected or normative execution. Given a business process model and an event log, the method returns a set of statements in natural language describing the behavior allowed by the process model but not observed in the log and vice versa. The method relies on a unified representation of process models and event logs based on a well-known model of concurrency, namely event structures. Specifically, the problem of conformance checking is approached by folding the input event log into an event structure, unfolding the process model into another event structure, and comparing the two event structures via an error-correcting synchronized product. Each behavioral difference detected in the synchronized product is then verbalized as a natural language statement. An empirical evaluation shows that the proposed method scales up to real-life datasets while producing more concise and higher-level difference descriptions than state-of-the-art conformance checking methods.
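The flavour of the method's output can be conveyed with a toy sketch that skips the event-structure machinery entirely and simply diffs two finite sets of traces. The statement templates and the loan-handling process are invented for illustration; the actual method compares unfoldings via an error-correcting synchronized product rather than enumerating traces:

```python
def verbalize_differences(model_traces, log_traces):
    """Report behavior allowed by the model but absent from the log, and
    vice versa, as plain-English statements (a toy stand-in for the
    event-structure-based comparison described in the article)."""
    model, log = set(model_traces), set(log_traces)
    statements = []
    for trace in sorted(model - log):
        statements.append(
            "The model allows the sequence '%s', but it was never observed in the log."
            % " -> ".join(trace))
    for trace in sorted(log - model):
        statements.append(
            "The log contains the sequence '%s', which the model does not allow."
            % " -> ".join(trace))
    return statements

# Hypothetical loan-handling process: traces are tuples of activity labels.
model = [("receive", "check", "approve"), ("receive", "check", "reject")]
log = [("receive", "check", "approve"), ("receive", "approve")]

for s in verbalize_differences(model, log):
    print(s)
```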

Relevance: 100.00%

Abstract:

Understanding the effects of different types and quality of data on bioclimatic modeling predictions is vital to ascertaining the value of existing models and to improving future models. Bioclimatic models were constructed with the CLIMEX program using different data types – seasonal dynamics, geographic (overseas) distribution, and a combination of the two – for two biological control agents of the major weed Lantana camara L. in Australia. The models for one agent, Teleonemia scrupulosa Stål (Hemiptera: Tingidae), were based on a higher quality and quantity of data than the models for the other agent, Octotoma scabripennis Guérin-Méneville (Coleoptera: Chrysomelidae). Predictions of the geographic distribution for Australia showed that the T. scrupulosa models were more accurate, with a progressive improvement from the seasonal dynamics model, to the model based on overseas distribution, and finally to the model combining the two data types. In contrast, the O. scabripennis models were of low accuracy and showed no clear trends across the various model types. These case studies demonstrate the importance of high-quality data for developing models, and of supplementing distributional data with species seasonal dynamics data wherever possible. Seasonal dynamics data allow the modeller to focus on the species' response to climatic trends, while distributional data enable easier fitting of stress parameters by restricting the species envelope to the described distribution. It is apparent that CLIMEX models based on low-quality seasonal dynamics data, together with a small quantity of distributional data, are of minimal value in predicting the spatial extent of a species' distribution.
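The structure of a CLIMEX-style model, an annual growth index scaled down by accumulated stress, can be caricatured in a few lines. All thresholds and rates below are invented for illustration and are not fitted CLIMEX parameters for either agent:

```python
def growth_index(t, t_low=10.0, t_opt1=20.0, t_opt2=28.0, t_high=35.0):
    """Trapezoidal weekly temperature growth index in [0, 1]
    (illustrative thresholds, not fitted CLIMEX parameters)."""
    if t <= t_low or t >= t_high:
        return 0.0
    if t < t_opt1:
        return (t - t_low) / (t_opt1 - t_low)
    if t <= t_opt2:
        return 1.0
    return (t_high - t) / (t_high - t_opt2)

def ecoclimatic_index(weekly_temps, cold_stress_threshold=2.0, stress_rate=0.05):
    """Annual growth potential scaled by accumulated cold stress, loosely
    following CLIMEX's pattern of growth index times stress index."""
    gi = sum(growth_index(t) for t in weekly_temps) / len(weekly_temps)
    cold_weeks = sum(1 for t in weekly_temps if t < cold_stress_threshold)
    stress = max(0.0, 1.0 - stress_rate * cold_weeks)
    return 100.0 * gi * stress

tropical = [24.0] * 52                              # warm all year
temperate = [5.0] * 20 + [22.0] * 22 + [0.0] * 10   # includes a cold season
print(ecoclimatic_index(tropical), ecoclimatic_index(temperate))
```

Fitting the stress parameters to a described distribution, as the abstract notes, amounts to tuning values like `cold_stress_threshold` so that the index is low outside the known range.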

Relevance: 100.00%

Abstract:

It is shown, in the composite fermion models studied by 't Hooft and others, that the requirements of Adler-Bell-Jackiw anomaly matching and n-independence are sufficient to fix the indices of composite representations. The third requirement, namely that of decoupling relations, follows from these two constraints in such models and hence is inessential.
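In schematic form, the 't Hooft matching condition states that the anomaly of the massless composite spectrum must equal that of the underlying constituent fermions; the notation below is generic rather than that of any specific model in the paper:

```latex
% Anomaly matching: the index \ell_r (multiplicity, counted with chirality)
% of each massless composite representation r, weighted by its anomaly
% coefficient A(r), must reproduce the anomaly of the constituents:
\sum_{r} \ell_r \, A(r) \;=\; A_{\mathrm{constituents}} .
% Requiring this to hold identically in the number of flavours n
% (n-independence) gives a second set of linear constraints on the \ell_r;
% the paper's point is that these two requirements already fix the \ell_r,
% so the decoupling conditions impose nothing new.
```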

Relevance: 100.00%

Abstract:

A forest simulator is a computerized model for predicting forest growth and future development, as well as the effects of forest harvests and treatments. A forest planning system is a decision support tool, usually comprising a forest simulator and an optimisation model, for finding the optimal forest management actions. The information produced by forest simulators and forest planning systems is used for various analytical purposes and in support of decision making. However, the quality and reliability of this information can often be questioned. Natural variation in forest growth and estimation errors in forest inventory, among other things, cause uncertainty in predictions of forest growth and development. This uncertainty, stemming from different sources, has various undesirable effects: in many cases the outcomes of decisions based on uncertain information are not those desired. The objective of this thesis was to study various sources of uncertainty and their effects in forest simulators and forest planning systems. The study focused on three notable sources of uncertainty: errors in forest growth predictions, errors in forest inventory data, and stochastic fluctuation of timber assortment prices. The effects of uncertainty were studied using two types of forest growth models, individual-tree-level models and stand-level models, and various error simulation methods. A new method for simulating more realistic forest inventory errors was introduced and tested. In addition, the three sources of uncertainty were combined and their joint effects on stand-level net present value estimates were simulated. According to the results, the various sources of uncertainty can have distinct effects in different forest growth simulators. The new forest inventory error simulation method proved to produce more realistic errors, and the analysis of the joint effects of the various sources of uncertainty provided new insight into the overall uncertainty in forest simulators.
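The joint-effects analysis can be caricatured with a minimal Monte Carlo sketch in which inventory error, growth-prediction error and price fluctuation are drawn independently. The distributions and magnitudes are invented for illustration and are not calibrated to any simulator in the thesis:

```python
import random
import statistics

def simulate_npv(n_runs=2000, seed=7):
    """Monte Carlo distribution of a stand-level net present value (NPV)
    under inventory error, growth-prediction error and price fluctuation.
    All magnitudes are illustrative."""
    rng = random.Random(seed)
    npvs = []
    for _ in range(n_runs):
        volume = 200.0 * (1 + rng.gauss(0, 0.15))   # inventory error, m3/ha
        volume *= 1.4 * (1 + rng.gauss(0, 0.10))    # 20-year growth, with error
        price = 55.0 * (1 + rng.gauss(0, 0.20))     # EUR/m3, price fluctuation
        npvs.append(volume * price / (1.03 ** 20))  # discount at 3 %
    return npvs

npvs = simulate_npv()
mean_npv = statistics.mean(npvs)
sd_npv = statistics.stdev(npvs)
print(round(mean_npv), round(sd_npv))
```

The spread of the resulting NPV distribution shows how the three error sources compound even when each is individually moderate.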

Relevance: 100.00%

Abstract:

Remote sensing provides methods to infer land cover information over large geographical areas at a variety of spatial and temporal resolutions. Land cover is input data for a range of environmental models, and information on land cover dynamics is required for monitoring the implications of global change. Such data are also essential in support of environmental management and policymaking. Boreal forests are a key component of the global climate and a major sink of carbon. The northern latitudes are expected to experience disproportionate and rapid warming, which can have a major impact on vegetation at forest limits. This thesis examines the use of optical remote sensing for estimating aboveground biomass, leaf area index (LAI), tree cover and tree height in the boreal forests and the tundra-taiga transition zone in Finland. Continuous fields of forest attributes are required, for example, to improve the mapping of forest extent. The thesis focuses on studying the feasibility of satellite data at multiple spatial resolutions, assessing the potential of multispectral, multiangular and multitemporal information, and providing a regional evaluation of global land cover data. Preprocessed ASTER, MISR and MODIS products are the principal satellite data. The reference data consist of field measurements, forest inventory data and fine-resolution land cover maps. The fine-resolution studies demonstrate that statistical relationships between biomass and satellite data are relatively strong in single-species, low-biomass mountain birch forests in comparison to higher-biomass coniferous stands. The combination of forest stand data and fine-resolution ASTER images provides a method for biomass estimation with medium-resolution MODIS data. Multiangular data improve the accuracy of land cover mapping in the sparsely forested tundra-taiga transition zone, particularly in mires. 
Similarly, multitemporal data improve the accuracy of coarse-resolution tree cover estimates in comparison to single-date data. Furthermore, the peak of the growing season is not necessarily the optimal time for land cover mapping in the northern boreal regions. The evaluated coarse-resolution land cover data sets have considerable shortcomings in northernmost Finland and should be used with caution in similar regions. Quantitative reference data and upscaling methods for integrating multiresolution data are required for the calibration of statistical models and the evaluation of land cover data sets. The preprocessed image products have potential for wider use, as they can considerably reduce the time and effort spent on data processing.
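The statistical relationship between a spectral predictor and field-measured biomass is typically captured with simple regression. A minimal ordinary-least-squares sketch follows; the vegetation-index values and biomass figures are hypothetical plot data, not measurements from the thesis:

```python
def ols_fit(x, y):
    """Ordinary least-squares slope and intercept for y = a + b*x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical plots: a vegetation-index value per plot and the
# field-measured biomass (t/ha); low-biomass birch stands cluster
# tightly, mimicking the stronger relationship reported for them.
vi      = [0.30, 0.35, 0.40, 0.45, 0.50, 0.55]
biomass = [12.0, 17.0, 22.0, 27.0, 32.0, 37.0]

a, b = ols_fit(vi, biomass)
print(round(a, 2), round(b, 2))  # intercept -18.0, slope 100.0 for this toy data
```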

Relevance: 100.00%

Abstract:

This thesis addresses the modeling of financial time series, especially stock market returns and daily price ranges. Modeling data of this kind can be approached with so-called multiplicative error models (MEM). These models nest several well-known time series models, such as GARCH, ACD and CARR models, and are able to capture many well-established features of financial time series, including volatility clustering and leptokurtosis. In contrast to these phenomena, the various kinds of asymmetries have received relatively little attention in the existing literature. In this thesis asymmetries arise from various sources: they are observed in both conditional and unconditional distributions, for variables with non-negative values and for variables taking values on the real line. In the multivariate context, asymmetries can be observed in the marginal distributions as well as in the relationships between the variables modeled. New methods for all these cases are proposed. Chapter 2 considers GARCH models and the modeling of the returns of two stock market indices. The chapter introduces the generalized hyperbolic (GH) GARCH model to account for asymmetries in both the conditional and the unconditional distribution. In particular, two special cases of the GARCH-GH model which describe the data most accurately are proposed; they are found to improve the fit of the model when compared to symmetric GARCH models. The advantages of accounting for asymmetries are also demonstrated in Value-at-Risk applications. Both theoretical and empirical contributions are provided in Chapter 3 of the thesis. In this chapter the mixture conditional autoregressive range (MCARR) model is introduced, examined and applied to daily price ranges of the Hang Seng Index. The conditions for the strict and weak stationarity of the model, as well as an expression for the autocorrelation function, are obtained by writing the MCARR model as a first-order autoregressive process with random coefficients. 
The chapter also introduces the inverse gamma (IG) distribution to CARR models. The advantages of the CARR-IG and MCARR-IG specifications over conventional CARR models are demonstrated in an empirical application, both in- and out-of-sample. Chapter 4 discusses the simultaneous modeling of absolute returns and daily price ranges. In this part of the thesis a vector multiplicative error model (VMEM) with an asymmetric Gumbel copula is found to provide substantial benefits over the existing VMEM models based on elliptical copulas. The proposed specification is able to capture the highly asymmetric dependence of the modeled variables, thereby improving the performance of the model considerably. The economic significance of the results is established by examining the information content of the volatility forecasts derived.
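Volatility clustering, the central stylized fact these models capture, can be reproduced with a plain GARCH(1,1) simulation: squared returns are autocorrelated even though the returns themselves are nearly uncorrelated. The parameter values are illustrative textbook choices, not estimates from the thesis:

```python
import random

def simulate_garch11(n=2000, omega=0.05, alpha=0.08, beta=0.90, seed=3):
    """Simulate GARCH(1,1) returns r_t = sigma_t * z_t with
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2,
    z_t standard normal (illustrative parameters)."""
    rng = random.Random(seed)
    var = omega / (1 - alpha - beta)  # start at the unconditional variance
    returns = []
    for _ in range(n):
        r = (var ** 0.5) * rng.gauss(0, 1)
        returns.append(r)
        var = omega + alpha * r * r + beta * var
    return returns

def autocorr(xs, lag):
    """Sample autocorrelation at a given lag."""
    n = len(xs)
    m = sum(xs) / n
    denom = sum((x - m) ** 2 for x in xs)
    num = sum((xs[i] - m) * (xs[i + lag] - m) for i in range(n - lag))
    return num / denom

rets = simulate_garch11()
sq = [r * r for r in rets]
# Clustering: squared returns carry autocorrelation, raw returns do not.
print(round(autocorr(rets, 1), 3), round(autocorr(sq, 1), 3))
```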

Relevance: 100.00%

Abstract:

Ubiquitous computing is about making computers and computerized artefacts a pervasive part of our everyday lives, bringing more and more activities into the realm of information. This computationalization and informationalization of everyday activities increases not only our reach, efficiency and capabilities but also the amount and kinds of data gathered about us and our activities. In this thesis, I explore how information systems can be constructed so that they handle this personal data in a reasonable manner. The thesis provides two kinds of results: on the one hand, tools and methods for both the construction and the evaluation of ubiquitous and mobile systems; on the other hand, an evaluation of the privacy aspects of a ubiquitous social awareness system. The work emphasises real-world experiments as the most important way to study privacy. Additionally, the state of current information systems with regard to data protection is studied. The tools and methods in this thesis consist of three distinct contributions. An algorithm for locationing in cellular networks is proposed that does not require the location information to be revealed beyond the user's terminal. A prototyping platform for the creation of context-aware ubiquitous applications, called ContextPhone, is described and released as open source. Finally, a set of methodological findings on the use of smartphones in social-scientific field research is reported. A central contribution of this thesis is the set of pragmatic tools that allow other researchers to carry out experiments. The evaluation of the ubiquitous social awareness application ContextContacts covers both the usage of the system in general and an analysis of its privacy implications. Based on several long-term field studies, the usage of the system is analyzed in the light of how users make inferences about others from real-time contextual cues mediated by the system. 
The analysis of privacy implications draws together the social-psychological theory of self-presentation and research on privacy in ubiquitous computing, deriving a set of design guidelines for such systems. The main findings from these studies can be summarized as follows. The fact that ubiquitous computing systems gather more data about users can be exploited not only to study the use of such systems in an effort to create better systems, but more generally to study phenomena previously unstudied, such as the dynamic change of social networks. Systems that let people create new ways of presenting themselves to others can be fun for the users, but self-presentation requires several thoughtful design decisions that allow the manipulation of the image mediated by the system. Finally, the growing amount of computational resources available to users can be used to allow them to use the data themselves, rather than remaining passive subjects of data gathering.
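The privacy idea behind the terminal-side locationing contribution, namely computing position locally so nothing leaves the device, can be sketched as a client-side centroid over locally stored tower coordinates. The cell IDs, coordinates, and the centroid method itself are hypothetical illustrations, not the thesis's actual algorithm:

```python
def locate_on_terminal(observed_cell_ids, cell_map):
    """Estimate position as the centroid of the towers the phone currently
    hears. Runs entirely on the user's device: only the locally stored
    cell map is consulted and no position is sent to the network."""
    points = [cell_map[c] for c in observed_cell_ids if c in cell_map]
    if not points:
        return None
    lat = sum(p[0] for p in points) / len(points)
    lon = sum(p[1] for p in points) / len(points)
    return (lat, lon)

# Hypothetical local database of tower coordinates (lat, lon).
cell_map = {
    "244-91-1001": (60.170, 24.940),
    "244-91-1002": (60.172, 24.950),
    "244-91-1003": (60.168, 24.946),
}

print(locate_on_terminal(["244-91-1001", "244-91-1003"], cell_map))
```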

Relevance: 100.00%

Abstract:

Many species inhabit fragmented landscapes, resulting either from anthropogenic or from natural processes. The ecological and evolutionary dynamics of spatially structured populations are affected by a complex interplay between endogenous and exogenous factors. The metapopulation approach, simplifying the landscape to a discrete set of patches of breeding habitat surrounded by unsuitable matrix, has become a widely applied paradigm for the study of species inhabiting highly fragmented landscapes. In this thesis, I focus on the construction of biologically realistic models and their parameterization with empirical data, with the general objective of understanding how the interactions between individuals and their spatially structured environment affect ecological and evolutionary processes in fragmented landscapes. I study two hierarchically structured model systems, which are the Glanville fritillary butterfly in the Åland Islands, and a system of two interacting aphid species in the Tvärminne archipelago, both being located in South-Western Finland. The interesting and challenging feature of both study systems is that the population dynamics occur over multiple spatial scales that are linked by various processes. My main emphasis is in the development of mathematical and statistical methodologies. For the Glanville fritillary case study, I first build a Bayesian framework for the estimation of death rates and capture probabilities from mark-recapture data, with the novelty of accounting for variation among individuals in capture probabilities and survival. I then characterize the dispersal phase of the butterflies by deriving a mathematical approximation of a diffusion-based movement model applied to a network of patches. I use the movement model as a building block to construct an individual-based evolutionary model for the Glanville fritillary butterfly metapopulation. 
I parameterize the evolutionary model using a pattern-oriented approach, and use it to study how the landscape structure affects the evolution of dispersal. For the aphid case study, I develop a Bayesian model of hierarchical multi-scale metapopulation dynamics, where the observed extinction and colonization rates are decomposed into intrinsic rates operating specifically at each spatial scale. In summary, I show how analytical approaches, hierarchical Bayesian methods and individual-based simulations can be used individually or in combination to tackle complex problems from many different viewpoints. In particular, hierarchical Bayesian methods provide a useful tool for decomposing ecological complexity into more tractable components.
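The extinction-colonization dynamics at the heart of the metapopulation approach can be illustrated with a Levins-type stochastic patch occupancy simulation. The rates are illustrative, the patches are unrealistically identical, and no spatial structure is modelled, in contrast to the hierarchical models developed in the thesis:

```python
import random

def simulate_spom(n_patches=100, colonization=0.3, extinction=0.1,
                  steps=200, seed=11):
    """Stochastic patch occupancy model: each occupied patch goes extinct
    with probability `extinction` per step; each empty patch is colonized
    with probability proportional to the occupied fraction."""
    rng = random.Random(seed)
    occupied = [True] * (n_patches // 2) + [False] * (n_patches - n_patches // 2)
    history = []
    for _ in range(steps):
        frac = sum(occupied) / n_patches
        occupied = [
            (rng.random() > extinction) if occ
            else (rng.random() < colonization * frac)
            for occ in occupied
        ]
        history.append(sum(occupied) / n_patches)
    return history

history = simulate_spom()
# The deterministic Levins model predicts equilibrium occupancy
# p* = 1 - e/c; the stochastic run fluctuates around it.
print(round(sum(history[-50:]) / 50, 2))
```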

Relevance: 100.00%

Abstract:

Regular electrical activation waves in cardiac tissue lead to the rhythmic contraction and expansion of the heart that ensures blood supply to the whole body. Irregularities in the propagation of these activation waves can result in cardiac arrhythmias, like ventricular tachycardia (VT) and ventricular fibrillation (VF), which are major causes of death in the industrialised world. Indeed there is growing consensus that spiral or scroll waves of electrical activation in cardiac tissue are associated with VT, whereas, when these waves break to yield spiral- or scroll-wave turbulence, VT develops into life-threatening VF: in the absence of medical intervention, this makes the heart incapable of pumping blood and a patient dies in roughly two-and-a-half minutes after the initiation of VF. Thus studies of spiral- and scroll-wave dynamics in cardiac tissue pose important challenges for in vivo and in vitro experimental studies and for in silico numerical studies of mathematical models for cardiac tissue. A major goal here is to develop low-amplitude defibrillation schemes for the elimination of VT and VF, especially in the presence of inhomogeneities that occur commonly in cardiac tissue. We present a detailed and systematic study of spiral- and scroll-wave turbulence and spatiotemporal chaos in four mathematical models for cardiac tissue, namely, the Panfilov, Luo-Rudy phase 1 (LRI), reduced Priebe-Beuckelmann (RPB) models, and the model of ten Tusscher, Noble, Noble, and Panfilov (TNNP). In particular, we use extensive numerical simulations to elucidate the interaction of spiral and scroll waves in these models with conduction and ionic inhomogeneities; we also examine the suppression of spiral- and scroll-wave turbulence by low-amplitude control pulses. Our central qualitative result is that, in all these models, the dynamics of such spiral waves depends very sensitively on such inhomogeneities. 
We also study two types of control schemes that have been suggested for the control of spiral turbulence, via low-amplitude current pulses, in such mathematical models for cardiac tissue; our investigations here are designed to examine the efficacy of such control schemes in the presence of inhomogeneities. We find that a local pulsing scheme does not suppress spiral turbulence in the presence of inhomogeneities, but a scheme that uses control pulses on a spatially extended mesh is more successful in eliminating spiral turbulence. We discuss the theoretical and experimental implications of our study that have a direct bearing on defibrillation, i.e., the control of life-threatening cardiac arrhythmias such as ventricular fibrillation.
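The kind of wave propagation these models describe can be demonstrated with a one-dimensional FitzHugh-Nagumo cable, a generic excitable medium far simpler than the Panfilov, LRI, RPB or TNNP models, but enough to show an activation pulse travelling through tissue. The parameters are standard textbook values, and the discretization is a plain explicit Euler scheme:

```python
def simulate_fhn_cable(n=100, dx=0.5, dt=0.05, steps=2000):
    """Explicit-Euler FitzHugh-Nagumo cable. Returns the peak value of the
    membrane variable reached near the far end of the cable, so a value
    well above rest (-1.2) means the pulse propagated."""
    a, b, eps, diff = 0.7, 0.8, 0.08, 1.0
    v = [-1.2] * n           # membrane variable, near its resting value
    w = [-0.62] * n          # recovery variable, near its resting value
    for i in range(5):       # suprathreshold stimulus at the left end
        v[i] = 1.5
    peak_far = v[n - 10]     # track excitation near the far end
    for _ in range(steps):
        lap = []
        for i in range(n):   # no-flux (mirrored) boundaries
            left = v[i - 1] if i > 0 else v[1]
            right = v[i + 1] if i < n - 1 else v[n - 2]
            lap.append((left - 2.0 * v[i] + right) / dx ** 2)
        v, w = (
            [v[i] + dt * (v[i] - v[i] ** 3 / 3.0 - w[i] + diff * lap[i])
             for i in range(n)],
            [w[i] + dt * eps * (v[i] + a - b * w[i]) for i in range(n)],
        )
        peak_far = max(peak_far, v[n - 10])
    return peak_far

# A pulse launched at the left end reaches the far end of the cable,
# lifting the membrane variable well above its resting value.
print(simulate_fhn_cable())
```

Spiral and scroll waves arise when such pulses break in two or three spatial dimensions; the one-dimensional sketch only shows the underlying excitability and conduction.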

Relevance: 100.00%

Abstract:

Background and Purpose: Subarachnoid hemorrhage (SAH) caused by the rupture of a saccular cerebral artery aneurysm (SCAA) is often fatal (in 35-50% of cases) and affects mainly the working-age population. The incidence of SAH is 10-11 / 100 000 in Western countries and twice as high in Finland and Japan. The estimated prevalence of SCAAs is around 2%, and many of them never rupture. Currently, however, there are no diagnostic methods to distinguish rupture-prone SCAAs from quiescent (dormant) ones. Finding diagnostic markers for rupture-prone SCAAs is of primary importance, since a SCAA rupture has such a sinister outcome and all current treatment modalities are associated with morbidity and mortality. The therapies that prevent SCAA rupture also need to be made as minimally invasive as possible. Although the clinical risk factors for SCAA rupture have been extensively studied and documented in large patient series, the cellular and molecular mechanisms by which these risk factors lead to SCAA wall rupture remain incompletely known. Elucidation of the molecular and cellular pathobiology of the SCAA wall is needed in order to develop i) novel diagnostic tools that could identify rupture-prone SCAAs or patients at risk of SAH, and ii) novel biological therapies that prevent SCAA wall rupture.

Materials and Methods: In this study, histological samples from unruptured and ruptured SCAAs and plasma samples from SCAA carriers were compared in order to identify structural changes, cell populations, growth factor receptors, or other molecular markers that associate with SCAA wall rupture. In addition, experimental saccular aneurysm models and experimental models of mechanical vascular injury were used to study the cellular mechanisms of scar formation in the arterial wall and the adaptation of the arterial wall to increased mechanical stress.

Results and Interpretation: Inflammation and degeneration of the SCAA wall, namely loss of mural cells and degradation of the wall matrix, were found to associate with rupture. Unruptured SCAA walls bore a structural resemblance to pads of myointimal hyperplasia, the so-called neointima that characterizes early atherosclerotic lesions and is the repair and adaptation mechanism of the arterial wall after injury or increased mechanical stress. As in pads of myointimal hyperplasia elsewhere in the vasculature, oxidized LDL (OxLDL) was found in the SCAA walls. Immunity against OxLDL was demonstrated in SAH patients by the detection of circulating anti-OxLDL antibodies, which were significantly associated with the risk of rupture in patients with solitary SCAAs. Growth factor receptors associated with arterial wall remodeling and angiogenesis were more strongly expressed in ruptured SCAA walls. In the experimental saccular aneurysm models, capillary growth, arterial wall remodeling and neointima formation were found. The neointimal cells were shown to originate from the experimental aneurysm wall, with a minor contribution from the adjacent artery and a negligible contribution from bone marrow-derived cells. Since loss of mural cells characterizes ruptured human SCAAs and likely impairs the adaptation and repair mechanism of ruptured or rupture-prone SCAAs, we also investigated the hypothesis that bone marrow-derived or circulating neointimal precursor cells could be used to enhance neointima formation and compensate for the impaired repair capacity of ruptured SCAA walls. However, no significant contribution of bone marrow cells or circulating mononuclear cells to neointima formation was found.

Relevance: 100.00%

Abstract:

This thesis studies binary time series models and their applications in empirical macroeconomics and finance. In addition to previously suggested models, new dynamic extensions are proposed to the static probit model commonly used in the previous literature. In particular, we are interested in probit models with an autoregressive model structure. In Chapter 2, the main objective is to compare the predictive performance of static and dynamic probit models in forecasting U.S. and German business cycle recession periods. Financial variables, such as interest rates and stock market returns, are used as predictors. The empirical results suggest that the recession periods are predictable, and that dynamic probit models, especially those with an autoregressive structure, outperform the static model. Chapter 3 proposes a Lagrange Multiplier (LM) test for the usefulness of the autoregressive structure of the probit model. The finite-sample properties of the LM test are examined with simulation experiments. The results indicate that the two alternative LM test statistics have reasonable size and power in large samples; in small samples, a parametric bootstrap method is suggested to obtain approximately correct size. In Chapter 4, the power of dynamic probit models in predicting the direction of stock market returns is examined. The novel idea is to use the recession forecast (see Chapter 2) as a predictor of the sign of the stock return. The evidence suggests that the signs of U.S. excess stock returns over the risk-free return are predictable both in and out of sample. The new "error correction" probit model yields the best forecasts, and it also outperforms other predictive models, such as ARMAX models, in terms of statistical and economic goodness-of-fit measures. Chapter 5 generalizes the analysis of the univariate models considered in Chapters 2-4 to the case of a bivariate model. 
A new bivariate autoregressive probit model is applied to predict the current state of the U.S. business cycle and growth rate cycle periods. Evidence of predictability of both cycle indicators is obtained and the bivariate model is found to outperform the univariate models in terms of predictive power.
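The autoregressive probit structure can be sketched in a few lines: a latent index pi_t follows an autoregression driven by a predictor, and the recession probability is the standard normal CDF of that index. The coefficients and the term-spread series below are invented for illustration, not estimates from the thesis:

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def dynamic_probit_probs(x, omega=-0.8, alpha=0.6, beta=-1.5):
    """Recession probabilities from an autoregressive probit:
    pi_t = omega + alpha * pi_{t-1} + beta * x_t, P(y_t = 1) = Phi(pi_t).
    Illustrative coefficients: a negative beta means a falling predictor
    (e.g. the term spread) raises the recession probability."""
    pi = omega / (1 - alpha)  # start from the unconditional mean
    probs = []
    for xt in x:
        pi = omega + alpha * pi + beta * xt
        probs.append(phi(pi))
    return probs

# Hypothetical term spread turning negative, signalling recession risk.
spread = [0.5, 0.4, 0.1, -0.3, -0.6, -0.4, 0.0, 0.3]
probs = dynamic_probit_probs(spread)
# The autoregressive term makes the probability rise smoothly and
# persistently once the predictor turns negative.
print([round(p, 2) for p in probs])
```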

Relevance: 100.00%

Abstract:

Background and aims: Low stage and curative surgery are established factors for improved survival in gastric cancer. However, not all low-stage patients have a good prognosis. Cyclooxygenase-2 (COX-2) is known to associate with reduced survival in several cancers and has been shown to play an important role in gastric carcinogenesis. Since new and better prognostic markers are needed for gastric cancer, we studied the prognostic significance of COX-2 and of markers that associate with COX-2 expression. We also studied markers reflecting proliferation and apoptosis and evaluated their association with COX-2. Our purpose was to construct an accurate prognostic model by combining tissue markers and clinicopathological factors.

Materials and methods: Of 342 consecutive patients who underwent surgery for gastric cancer at Meilahti Hospital, Helsinki University Central Hospital, 337 were included in this study. Low stages I to II were represented by 141 (42%) patients, and high stages III to IV by 196 (58%). Curative surgery was performed on 176 (52%) patients. Survival data were obtained from the national registers. Slides from archival tissue blocks were prepared for immunohistochemistry with COX-2, human antigen R (HuR), cyclin A, matrix metalloproteinases 2 and 9 (MMP-2, MMP-9), and Ki-67 antibodies. Immunostainings were scored by microscopy, and the scores were entered into a database. Associations of the tumor markers with clinicopathological factors were calculated, as were associations with p53, p21, and flow cytometry results from earlier studies. Survival analysis was performed by the Kaplan-Meier method, and Cox multivariate models were constructed. Cell culture experiments were performed to explore the effect of small interfering RNA (siRNA) against HuR on COX-2 expression in the TMK-1 gastric cancer cell line.

Results: Overall 5-year survival was 35.1%. Study I showed that COX-2 was an independent prognostic factor and that its prognostic impact was more pronounced in low-stage patients. Cytoplasmic HuR expression also associated with reduced survival in gastric cancer patients, though not independently. Cell culture experiments showed that HuR can regulate COX-2 expression in TMK-1 cells in vitro, and an association between COX-2 and HuR tissue expression was also observed in the clinical material. In Study II, cyclin A was an independent prognostic factor and was associated with HuR expression in the gastric cancer material. The results of Study III showed that epithelial MMP-2 associated with survival in univariate, but not in multivariate, analysis, whereas MMP-9 showed no prognostic value. MMP-2 expression was associated with COX-2 expression. In Study IV, the prognostic power of COX-2 was compared with that of all markers associated with survival in Studies I to III, as well as with p21, p53, and the flow cytometry results. COX-2 and p53 were independent prognostic factors, and COX-2 expression was associated with that of p53 and Ki-67, and also with aneuploidy.

Conclusions: COX-2 is an independent prognostic factor in gastric cancer, and its prognostic power emerges especially in low-stage cancer. COX-2 is regulated by HuR and is associated with factors reflecting invasion, proliferation, and apoptosis. In an extended multivariate model, COX-2 retained its position as an independent prognosticator. COX-2 can be considered a promising new prognostic marker in gastric cancer.
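The Kaplan-Meier method used for the survival analysis can be written from scratch in a few lines: at each event time the survival estimate is multiplied by (1 - deaths / number at risk), with censored patients contributing to the risk set until their censoring time. The follow-up times below are hypothetical, not data from this study:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier curve: S(t) is the product over event times t_i <= t
    of (1 - d_i / n_i), where d_i deaths occur among n_i still at risk.
    `events[i]` is 1 for a death, 0 for a censored observation."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []  # (time, survival probability) recorded at each death time
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        at_this_time = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at_this_time
        while i < len(data) and data[i][0] == t:
            i += 1
    return curve

# Hypothetical follow-up times in months (1 = death, 0 = censored).
times  = [6, 12, 12, 18, 24, 30, 36, 60]
events = [1,  1,  0,  1,  0,  1,  0,  0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

Comparing such curves between marker-positive and marker-negative groups (e.g. by a log-rank test) is the usual first step before the Cox multivariate modelling mentioned above.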