945 results for "Business model matrix"


Relevance: 30.00%

Abstract:

Interest in Enterprise Architecture (EA) has been increasing during the last few years. EA has been found to be a crucial aspect of business survival, and thus the success of EA implementation is also crucial. The current literature, however, lacks a tool for measuring the success of EA implementation. In this paper, a tentative model for measuring success is presented and empirically validated in the EA context. Results show that the success of EA implementation can be measured indirectly, by measuring the achievement of the objectives set for the implementation. Results also imply that achieving individuals' objectives does not necessarily mean that the organisation's objectives are achieved. The presented Success Measurement Model can be used as a basis for developing measurement metrics.

Relevance: 30.00%

Abstract:

A class of identification algorithms is introduced for Gaussian process (GP) models. The fundamental approach is to propose a new kernel function which leads to a covariance matrix with low rank, a property that is then exploited for computational efficiency in both model parameter estimation and model prediction. The objective of maximizing either the marginal likelihood or the Kullback–Leibler (K–L) divergence between the estimated output probability density function (pdf) and the true pdf has been used as the respective cost function. For each cost function, an efficient coordinate descent algorithm is proposed to estimate the kernel parameters using a one-dimensional derivative-free search, and the noise variance using a fast gradient descent algorithm. Numerical examples are included to demonstrate the effectiveness of the new identification approaches.
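
The computational benefit of a low-rank covariance matrix can be sketched with the Woodbury identity: if the covariance is K = UUᵀ + σ²I with U of shape (n, m) and m ≪ n, solves against K cost O(nm²) rather than the O(n³) of a dense factorisation. The NumPy sketch below is only an illustration of this mechanism, not the paper's kernel or its estimation code:

```python
import numpy as np

def lowrank_solve(U, noise_var, y):
    """Solve (U @ U.T + noise_var * I) x = y via the Woodbury identity.

    For U of shape (n, m) with m << n this costs O(n m^2) instead of the
    O(n^3) of a dense solve -- the mechanism by which a low-rank
    covariance makes GP parameter estimation and prediction cheap.
    """
    m = U.shape[1]
    small = noise_var * np.eye(m) + U.T @ U      # (m, m) inner system
    return (y - U @ np.linalg.solve(small, U.T @ y)) / noise_var
```

The same identity applies to the marginal-likelihood evaluation, where the matrix determinant lemma gives the log-determinant at the same reduced cost.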

Relevance: 30.00%

Abstract:

Purpose – The purpose of this paper is to examine the critical assumptions lying behind the Anglo-American model of corporate governance. Design/methodology/approach – A literature review examining the concept of a nexus of contracts underpinning agency theory which, it is argued, acts as the platform for neo-liberal corporate governance focused on shareholder wealth creation. Findings – The paper highlights the unaddressed critical challenge of why eighteenth-century ownership structures are readily adopted in the twenty-first century. Social implications – A re-examination of wealth creation and wealth redistribution. Originality/value – The paper is highly original, as few contributions have been made in the area of rethinking shareholder value.

Relevance: 30.00%

Abstract:

The 2008-2009 financial crisis and related organizational and economic failures have meant that financial organizations are faced with a ‘tsunami’ of new regulatory obligations. This environment presents new managerial challenges, as organizations are forced to engage in complex and costly remediation projects with short deadlines. Drawing from a longitudinal study conducted with nine financial institutions over twelve years, this paper identifies nine IS capabilities which underpin activities for managing regulatory-themed governance, risk and compliance efforts. The research shows that many firms are now focused on meeting regulators' deadlines at the expense of developing a strategic, enterprise-wide, connected approach to compliance. Consequently, executives are in danger of implementing siloed compliance solutions within business functions. By evaluating the maturity of the IS capabilities which underpin regulatory adherence, managers have an opportunity to develop robust operational architectures and so are better positioned to face the challenges derived from shifting regulatory landscapes.

Relevance: 30.00%

Abstract:

We systematically compare the performance of ETKF-4DVAR, 4DVAR-BEN and 4DENVAR with respect to two traditional methods (4DVAR and ETKF) and an ensemble transform Kalman smoother (ETKS) on the Lorenz 1963 model. We specifically investigate this performance with increasing nonlinearity, using a quasi-static variational assimilation algorithm as a comparison. Using the analysis root-mean-square error (RMSE) as a metric, the methods are compared considering (1) assimilation window length and observation interval size and (2) ensemble size, in order to investigate the influence of hybrid background error covariance matrices and nonlinearity on the performance of the methods. For short assimilation windows with close to linear dynamics, all hybrid methods show an improvement in RMSE compared to the traditional methods. For long assimilation windows in which nonlinear dynamics are substantial, the variational framework can have difficulties finding the global minimum of the cost function, so we explore a quasi-static variational assimilation (QSVA) framework. Among the hybrid methods, under certain parameters, those which do not use a climatological background error covariance do not need QSVA to perform accurately. Generally, results show that the ETKS, and hybrid methods that do not use a climatological background error covariance matrix with QSVA, outperform all other methods, due to the full flow dependency of the background error covariance matrix, which also allows for the most nonlinearity.
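
For readers unfamiliar with the testbed, the Lorenz 1963 model and the RMSE metric are small enough to sketch directly. This is a generic illustration with the standard parameter values and fourth-order Runge–Kutta time stepping, not the authors' experimental code:

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz 1963 system (standard parameters)."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(f, state, dt):
    """One fourth-order Runge-Kutta step of size dt."""
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

def rmse(analysis, truth):
    """Analysis root-mean-square error, the comparison metric."""
    return float(np.sqrt(np.mean((np.asarray(analysis) - np.asarray(truth)) ** 2)))
```

In such twin experiments the model is run from a perturbed state, observations are drawn from a "truth" run, and the RMSE of each method's analysis against that truth is compared.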

Relevance: 30.00%

Abstract:

The disadvantage of the majority of data assimilation schemes is the assumption that the conditional probability density function of the state of the system given the observations [posterior probability density function (PDF)] is distributed either locally or globally as a Gaussian. The advantage, however, is that through various different mechanisms they ensure initial conditions that are predominantly in linear balance, and therefore spurious gravity wave generation is suppressed. The equivalent-weights particle filter is a data assimilation scheme that allows for a representation of a potentially multimodal posterior PDF. It does this via proposal densities that lead to extra terms being added to the model equations, which means that the advantage of the traditional data assimilation schemes, in generating predominantly balanced initial conditions, is no longer guaranteed. This paper looks in detail at the impact the equivalent-weights particle filter has on dynamical balance and gravity wave generation in a primitive equation model. The primary conclusions are that (i) provided the model error covariance matrix imposes geostrophic balance, then each additional term required by the equivalent-weights particle filter is also geostrophically balanced; (ii) the relaxation term required to ensure the particles are in the locality of the observations has little effect on gravity waves and actually induces a reduction in gravity wave energy if sufficiently large; and (iii) the equivalent-weights term, which leads to the particles having equivalent significance in the posterior PDF, produces a change in gravity wave energy comparable to the stochastic model error. Thus, the scheme does not produce significant spurious gravity wave energy and so has potential for application in real high-dimensional geophysical applications.
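
The basic importance-weighting step that any particle filter builds on can be sketched as follows. This is a minimal generic update with a Gaussian observation likelihood and an identity observation operator, not the equivalent-weights scheme itself, which additionally modifies the model equations through its proposal density:

```python
import numpy as np

def update_weights(particles, obs, obs_var, weights):
    """One importance-weight update with a Gaussian observation
    likelihood, observing the state directly.

    particles: (n_particles, n_state); obs: (n_state,).
    Returns normalised weights.
    """
    innov = obs - particles                      # innovation for each particle
    loglik = -0.5 * np.sum(innov ** 2, axis=1) / obs_var
    w = weights * np.exp(loglik - loglik.max())  # subtract max for stability
    return w / w.sum()
```

In a plain particle filter the weights quickly degenerate in high dimensions; the relaxation and equivalent-weights terms discussed above exist precisely to keep the particles near the observations with comparable weights.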

Relevance: 30.00%

Abstract:

Customers will not continue to pay for a service that they perceive to be of poor quality and/or of no value. With a paradigm shift towards business dependence on service-oriented IS solutions [1], alignment between service definition, delivery, and customer expectation is critical if businesses are to ensure customer satisfaction. Service and micro-service development offer businesses a flexible structure for solution innovation; however, constant change in technology and in business and societal expectations means that an iterative analysis solution is required i) to determine whether provider services adequately meet customer segment needs and expectations, and ii) to help guide business service innovation and development. In this paper, by incorporating multiple models, we propose a series of steps to help identify and prioritise service gaps. Moreover, the authors propose the Dual Semiosis Analysis Model, a tool that highlights where, within the symbiotic customer/provider semiosis process, requirements misinterpretation and/or service provision deficiencies occur. This paper offers the reader a powerful customer-centric tool, designed to help business managers highlight both which services are critical to customer quality perception, and where future innovation effort should be directed.

Relevance: 30.00%

Abstract:

Cell migration is a highly coordinated process, and any aberration in its regulatory mechanisms can result in pathological conditions such as cancer. The ability of cancer cells to disseminate to distant sites within the body has made the disease difficult to treat. Cancer cells also exhibit plasticity that enables them to interconvert from an elongated, mesenchymal morphology to an amoeboid blebbing form under different physiological conditions. Blebs are spherical membrane protrusions formed by actomyosin-mediated contractility of cortical actin, resulting in increased hydrostatic pressure and subsequent detachment of the membrane from the cortex. Tumour cells use blebbing as an alternative mode of migration by squeezing through pre-existing gaps in the ECM, and bleb formation is believed to be mediated by the Rho-ROCK signaling pathway. However, the involvement of transmembrane water and ion channels in cell blebbing has not been examined. In the present study, the roles of transmembrane water channels (aquaporins), transmembrane ion transporters and lipid signaling enzymes in the regulation of blebbing were investigated. Using a 3D Matrigel matrix as an in vitro model to mimic normal extracellular matrix, and a combination of confocal and time-lapse microscopy, it was found that AQP1 knockdown by siRNA ablated blebbing of HT1080 and ACHN cells, and that overexpression of AQP1-GFP not only significantly increased bleb size, with a corresponding decrease in bleb numbers, but also induced bleb formation in non-blebbing cell lines. Importantly, AQP1 overexpression reduced bleb lifespan due to faster bleb retraction. This novel AQP1-facilitated bleb retraction requires the activity of the Na+/H+ pump, as inhibition of the ion transporter, which was found localized to intracellular vesicles, blocked bleb retraction in both cell lines.
This study also demonstrated that AQP isoforms differentially regulate cell blebbing, as knockdown of AQP5 had no effect on bleb formation. Data from this study also demonstrate that the lipid signaling enzyme PLD2 signals through PA in the LPA-LPAR-Rho-ROCK axis to positively regulate bleb formation in both cell lines. Taken together, this work reveals a novel role for AQP1 and the Na+/H+ pump in the regulation of cell blebbing, which could be exploited in the development of new therapies to treat cancer.

Relevance: 30.00%

Abstract:

With the development of convection-permitting numerical weather prediction, the efficient use of high-resolution observations in data assimilation is becoming increasingly important. The operational assimilation of these observations, such as Doppler radar radial winds, is now common, though to avoid violating the assumption of uncorrelated observation errors the observation density is severely reduced. Improving the quantity of observations used, and the impact that they have on the forecast, will require the introduction of the full, potentially correlated, error statistics. In this work, observation error statistics are calculated for the Doppler radar radial winds that are assimilated into the Met Office high-resolution UK model, using a diagnostic that makes use of statistical averages of observation-minus-background and observation-minus-analysis residuals. This is the first in-depth study using the diagnostic to estimate both horizontal and along-beam correlated observation errors. The results show that the Doppler radar radial wind error standard deviations are similar to those used operationally and increase as the observation height increases. Surprisingly, the estimated observation error correlation length scales are longer than the operational thinning distance. They depend both on the height of the observation and on the distance of the observation from the radar. Further tests show that the long correlations cannot be attributed to the use of superobservations or to the background error covariance matrix used in the assimilation. The large horizontal correlation length scales are, however, in part a result of using a simplified observation operator.
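
The diagnostic based on observation-minus-background and observation-minus-analysis residuals can be sketched as a Desroziers-type estimate, R ≈ E[d_a d_bᵀ], computed from sample averages. The operational calculation additionally bins residuals by observation height and distance from the radar, which this minimal sketch omits:

```python
import numpy as np

def desroziers_R(omb, oma):
    """Estimate the observation-error covariance matrix from paired
    observation-minus-background (omb) and observation-minus-analysis
    (oma) residuals, R ~ E[ d_a d_b^T ], using sample averages.

    omb, oma: arrays of shape (n_samples, n_obs).
    """
    omb = omb - omb.mean(axis=0)    # remove any residual bias
    oma = oma - oma.mean(axis=0)
    return oma.T @ omb / omb.shape[0]
```

The off-diagonal entries of the estimated R, plotted against observation separation, are what yield the horizontal and along-beam correlation length scales discussed above.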

Relevance: 30.00%

Abstract:

Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, it is important to quantify how uncertainties in the data affect a model's output. To do so, a spatial distribution of possible land cover values is required to propagate through the model's simulation. However, at large scales, such as those required for climate models, such spatial modelling can be difficult. Also, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution implied by a Bayesian analysis combining spatial information in the land cover map and its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. Among vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
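
A stripped-down, non-spatial version of such a sampling scheme can be sketched as follows. The paper's method is Bayesian and exploits spatial information in the map, so treat this only as an illustration of how a confusion matrix induces a distribution over class proportions at a site; the function and variable names are ours:

```python
import numpy as np

def sample_proportions(mapped, confusion, n_draws=200, rng=None):
    """Monte Carlo realisations of land-cover class proportions for one site.

    mapped:    1-D array of mapped class labels (integers) for the site's pixels.
    confusion: confusion[i, j] = number of validation pixels mapped as class i
               whose reference class is j; each row is normalised to give
               P(reference class | mapped class).
    Returns an (n_draws, n_classes) array of sampled proportions.
    """
    rng = np.random.default_rng() if rng is None else rng
    confusion = np.asarray(confusion, float)
    p_true = confusion / confusion.sum(axis=1, keepdims=True)
    n_classes = confusion.shape[1]
    out = np.empty((n_draws, n_classes))
    for k in range(n_draws):
        # draw a plausible "reference" class for every mapped pixel
        draws = np.array([rng.choice(n_classes, p=p_true[m]) for m in mapped])
        out[k] = np.bincount(draws, minlength=n_classes) / len(mapped)
    return out
```

The spread of the sampled proportions across draws is the proportion uncertainty the article discusses, and the difference between their mean and the mapped proportions is the bias.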

Relevance: 30.00%

Abstract:

Purpose – Recognizing the heterogeneity of services, this paper aims to clarify the characteristics of forward and the corresponding reverse supply chains of different services. Design/methodology/approach – The paper develops a two-dimensional typology matrix, representing four main clusters of services according to the degree of input standardization and the degree of output tangibility. Based on this matrix, the paper develops a typology and parsimonious conceptual models illustrating the characteristics of forward and the corresponding reverse supply chains of each cluster of services. Findings – The four main clusters of service supply chains have different characteristics. This provides the basis for the identification, presentation and explanation of the different characteristics of their corresponding reverse service supply chains. Research limitations/implications – The findings of this research can help future researchers to analyse, map and model forward and reverse service supply chains, and to identify potential research gaps in the area. Practical implications – The findings of the research can help managers of service firms to gain better visibility of their forward and reverse supply chains, and refine their business models to help extend their reverse/closed-loop activities. Furthermore, the findings can help managers to better optimize their service operations to reduce service gaps and potentially secure new value-adding opportunities. Originality/value – This paper is the first, to the authors' knowledge, to conceptualize the basic structure of the forward and reverse service supply chains while dealing with the high level of heterogeneity of services.

Relevance: 30.00%

Abstract:

The matrix-tolerance hypothesis suggests that the species most abundant in the inter-habitat matrix are less vulnerable to fragmentation of their habitat. This model was tested with leaf-litter frogs in the Atlantic Forest, where the fragmentation process is older and more severe than in the Amazon, where the model was first developed. Frog abundance data from the agricultural matrix, forest fragments and continuous forest localities were used. We found the expected negative correlation between the abundance of frogs in the matrix and their vulnerability to fragmentation; however, results varied with fragment size and species traits. Smaller fragments exhibited a stronger matrix-vulnerability correlation than intermediate fragments, while no significant relationship was observed for large fragments. Moreover, some species that avoid the matrix were not sensitive to a decrease in patch size, and the opposite was also true, indicating significant differences from what the model predicts. Most of the species that use the matrix were forest species with aquatic larval development, but those species do not necessarily respond to fragmentation or fragment size, and thus more strongly affect the strength of the expected relationship. Therefore, the main relationship expected by the matrix-tolerance hypothesis was observed in the Atlantic Forest; however, we note that the prediction of this hypothesis can be substantially affected by the size of the fragments and by species traits. We propose that the matrix-tolerance model should be broadened to become a more effective model, including other patch characteristics, particularly fragment size, and individual species traits (e.g., reproductive mode and habitat preference).
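
The central test, a negative correlation between matrix abundance and vulnerability to fragmentation, can be computed with a rank correlation. A minimal sketch (ties are not handled, and the variable names are illustrative, not the study's data):

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation between two samples (assumes no ties)."""
    rx = np.argsort(np.argsort(x)).astype(float)  # ranks of x
    ry = np.argsort(np.argsort(y)).astype(float)  # ranks of y
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))
```

Under the matrix-tolerance hypothesis this coefficient, computed per fragment-size class, should be negative, which is exactly the pattern the study found to weaken as fragment size grows.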

Relevance: 30.00%

Abstract:

Information to guide decision making is especially urgent in human-dominated landscapes in the tropics, where urban and agricultural frontiers are still expanding in an unplanned manner. Nevertheless, most studies that have investigated the influence of landscape structure on species distribution have not considered the heterogeneity of the altered habitats of the matrix, which is usually high in human-dominated landscapes. Using the distribution of small mammals in forest remnants and in the four main altered habitats in an Atlantic forest landscape, we investigated 1) how the explanatory power of models describing species distribution in forest remnants varies between landscape structure variables that do or do not incorporate matrix quality, and 2) the importance of spatial scale for analyzing the influence of landscape structure. We used standardized sampling in remnants and altered habitats to generate two indices of habitat quality, corresponding to the abundance and to the occurrence of small mammals. For each remnant, we calculated habitat quantity and connectivity at different spatial scales, considering or not the quality of surrounding habitats. The incorporation of matrix quality increased model explanatory power across all spatial scales for half the species that occurred in the matrix, but only when taking into account the distance between habitat patches (connectivity). These connectivity models were also less affected by spatial scale than habitat quantity models. The few consistent responses to the variation in spatial scales indicate that, despite their small size, small mammals perceive landscape features at large spatial scales. The matrix quality index based on species occurrence performed similarly to or better than that based on species abundance.
Results indicate the importance of the matrix for the dynamics of fragmented landscapes and suggest that relatively simple indices can improve our understanding of species distribution and could be applied in modeling, monitoring and managing complex tropical landscapes.
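
A connectivity measure that optionally weights surrounding habitat by its quality can be sketched in the spirit of standard incidence-function (Hanski-style) indices. This is an illustrative simplification, not the authors' exact index:

```python
import numpy as np

def connectivity(areas, dist, alpha, quality=None):
    """Incidence-function style connectivity:
    S_i = sum_{j != i} exp(-alpha * d_ij) * A_j * q_j,
    where q_j optionally down-weights low-quality surrounding habitat.

    areas: (n,) patch areas; dist: (n, n) inter-patch distances;
    alpha: inverse of the species' mean dispersal distance.
    """
    areas = np.asarray(areas, float)
    q = np.ones_like(areas) if quality is None else np.asarray(quality, float)
    w = np.exp(-alpha * np.asarray(dist, float)) * (areas * q)[None, :]
    np.fill_diagonal(w, 0.0)   # a patch does not contribute to its own index
    return w.sum(axis=1)
```

Comparing model fits with `quality=None` against fits with a habitat-quality vector mirrors the study's contrast between variables that do or do not incorporate matrix quality.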

Relevance: 30.00%

Abstract:

Background: Gamma-linolenic acid is a known inhibitor of tumour cell proliferation and migration in both in vitro and in vivo conditions. The aim of the present study was to determine the mechanisms by which gamma-linolenic acid (GLA) osmotic pump infusion alters glioma cell proliferation, and whether it affects cell cycle control and angiogenesis in the C6 glioma in vivo. Methods: Established C6 rat gliomas were treated for 14 days with 5 mM GLA in CSF or CSF alone. Tumour size was estimated, microvessel density (MVD) counted, and protein and mRNA expression measured by immunohistochemistry, western blotting and RT-PCR. Results: GLA caused a significant decrease in tumour size (75 +/- 8.8%) and reduced MVD by 44 +/- 5.4%. These changes were associated with reduced expression of vascular endothelial growth factor (VEGF) (71 +/- 16%) and the VEGF receptor Flt1 (57 +/- 5.8%) but not Flk1. Expression of ERK1 and ERK2 was also reduced, by 27 +/- 7.7% and 31 +/- 8.7% respectively. mRNA expression of matrix metalloproteinase-2 (MMP2) was reduced by 35 +/- 6.8%, and zymography showed MMP2 proteolytic activity was reduced by 32 +/- 8.5%. GLA altered the expression of several proteins involved in cell cycle control. pRb protein expression was decreased (62 +/- 18%) while E2F1 remained unchanged. Cyclin D1 protein expression was increased by 42 +/- 12% in the presence of GLA. The cyclin-dependent kinase inhibitors p21 and p27 responded differently to GLA: p27 expression was increased (27 +/- 7.3%) while p21 remained unchanged. The expression of p53 was increased (44 +/- 16%) by GLA. Finally, the BrdU incorporation studies found a significant inhibition (32 +/- 11%) of BrdU incorporation into the tumour in vivo. Conclusion: Overall, the findings reported in the present study lend further support to the potential of GLA as an inhibitor of glioma cell proliferation in vivo and show it has direct effects upon cell cycle control and angiogenesis.
These effects involve changes in protein expression of VEGF, Flt1, ERK1, ERK2, MMP2, Cyclin D1, pRb, p53 and p27. Combination therapy using GLA together with drugs that have other, complementary targets could lead to gains in treatment efficacy in this notoriously difficult-to-treat tumour.

Relevance: 30.00%

Abstract:

Considering the Wald, score, and likelihood ratio asymptotic test statistics, we analyze a multivariate null-intercept errors-in-variables regression model in which both the explanatory and the response variables are subject to measurement errors, and a possible structure of dependency between the measurements taken within the same individual is incorporated, representing a longitudinal structure. This model was proposed by Aoki et al. (2003b) and analyzed under the Bayesian approach. In this article, under the classical approach, we analyze the asymptotic test statistics and present a simulation study comparing the behavior of the three test statistics for different sample sizes, parameter values and nominal levels of the test. Closed-form expressions for the score function and the Fisher information matrix are also presented. We consider two real numerical illustrations: the odontological data set from Hadgu and Koch (1999), and a quality control data set.
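
For readers who want the three test statistics in their simplest setting, here is a textbook Bernoulli example, not the errors-in-variables model of the paper. All three statistics are asymptotically chi-squared with 1 degree of freedom under the null hypothesis:

```python
import math

def wald_score_lr(successes, n, p0):
    """Wald, score, and likelihood-ratio statistics for H0: p = p0,
    given `successes` out of `n` Bernoulli trials.

    The three differ only in where the variance (or likelihood) is
    evaluated: at the MLE (Wald), at the null (score), or at both (LR).
    """
    phat = successes / n
    wald = (phat - p0) ** 2 / (phat * (1.0 - phat) / n)   # variance at the MLE
    score = (phat - p0) ** 2 / (p0 * (1.0 - p0) / n)      # variance at the null

    def loglik(p):
        return successes * math.log(p) + (n - successes) * math.log(1.0 - p)

    lr = 2.0 * (loglik(phat) - loglik(p0))
    return wald, score, lr
```

The simulation study in the paper makes the analogous comparison for the errors-in-variables model, where the three statistics can behave quite differently at small sample sizes even though they agree asymptotically.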