11 results for Competitive analysis

in CentAUR: Central Archive at the University of Reading - UK


Relevance:

30.00%

Publisher:

Abstract:

Context: Learning can be regarded as knowledge construction, in which prior knowledge and experience serve as the basis for learners to expand their knowledge base. Such knowledge construction has to take place continuously in order to enhance the learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision that meets their own specific purposes, goals, and expectations. Objectives: Current methods in requirements engineering are capable of modelling the common user's behaviour in the domain of knowledge construction. The user's requirements can be represented as a case in a defined structure, which can be reasoned over to enable requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled; however, suitable modelling methods to achieve this end are lacking. This paper presents a new ontological method for capturing an individual user's requirements and transforming them into personalised information provision specifications, so that the right information can be provided to the right user for the right purpose. Method: An experiment was conducted based on a qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulation, mapping, configuration, and learning content. The results were used as feedback for improvement. Result: The research has produced an ontology model with a set of techniques supporting functions for profiling user requirements, reasoning over requirements patterns, generating workflows from norms, and formulating information provision specifications. Conclusion: Current requirements engineering approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance RE approaches for modelling an individual user's needs and discovering the user's requirements.
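The idea of profiling a user's requirements as a structured case and deriving an information-provision specification from it can be sketched roughly as follows. All class names, field names, and the matching rule are our own illustrative assumptions, not the paper's ontology or norm-based reasoning:

```python
from dataclasses import dataclass, field

@dataclass
class RequirementCase:
    """A user's requirements captured as a structured case (hypothetical fields)."""
    user: str
    purpose: str
    goals: list = field(default_factory=list)
    expectations: list = field(default_factory=list)

def provision_spec(case, catalogue):
    """Match catalogue items against the user's goals: a crude stand-in for
    the paper's reasoning over requirements patterns and workflow generation."""
    return [item for item, topics in catalogue.items()
            if any(goal in topics for goal in case.goals)]

case = RequirementCase("alice", "upskill", goals=["python", "statistics"])
catalogue = {"intro-python": ["python"], "art-history": ["painting"]}
spec = provision_spec(case, catalogue)   # -> ["intro-python"]
```

The point of the sketch is only the pipeline shape: a case structure in, a personalised provision specification out.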

Relevance:

30.00%

Publisher:

Abstract:

Elevated levels of low-density-lipoprotein cholesterol (LDL-C) in the plasma are a well-established risk factor for the development of coronary heart disease. Plasma LDL-C levels are in part determined by the rate at which LDL particles are removed from the bloodstream by hepatic uptake. The uptake of LDL by mammalian liver cells occurs mainly via receptor-mediated endocytosis, a process which entails the binding of these particles to specific receptors in specialised areas of the cell surface, the subsequent internalization of the receptor-lipoprotein complex, and ultimately the degradation and release of the ingested lipoproteins' constituent parts. We formulate a mathematical model to study the binding and internalization (endocytosis) of LDL and VLDL particles by hepatocytes in culture. The system of ordinary differential equations, which includes a cholesterol-dependent pit production term representing feedback regulation of surface receptors in response to intracellular cholesterol levels, is analysed using numerical simulations and steady-state analysis. Our numerical results show good agreement with in vitro experimental data describing LDL uptake by cultured hepatocytes following delivery of a single bolus of lipoprotein. Our model is adapted in order to reflect the in vivo situation, in which lipoproteins are continuously delivered to the hepatocyte. In this case, our model suggests that the competition between the LDL and VLDL particles for binding to the pits on the cell surface affects the intracellular cholesterol concentration. In particular, we predict that when there is continuous delivery of low levels of lipoproteins to the cell surface, more VLDL than LDL occupies the pit, since VLDL are better competitors for receptor binding. 
VLDL have a cholesterol content comparable to LDL particles; however, due to the larger size of VLDL, one pit-bound VLDL particle blocks the binding of several LDLs, and there is a resultant drop in the intracellular cholesterol level. When there is continuous delivery of lipoprotein at high levels to the hepatocytes, VLDL particles still out-compete LDL particles for receptor binding, and consequently more VLDL than LDL particles occupy the pit. Although the maximum intracellular cholesterol level is similar for high and low levels of lipoprotein delivery, the maximum is reached more rapidly when the lipoprotein delivery rates are high. The implications of these results for the design of in vitro experiments are discussed.
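The competitive-binding behaviour described above can be illustrated with a toy ODE sketch integrated by the Euler method. All rate constants, the pit-production term, and the "one VLDL blocks several LDL sites" factor are illustrative assumptions, not the parameters of the paper's model:

```python
def simulate(l_ext=1.0, v_ext=1.0, t_end=50.0, dt=0.01):
    """Euler integration of a minimal competitive-uptake model.

    State: r  = free surface receptors (pits)
           bl = receptor-bound LDL, bv = receptor-bound VLDL
           c  = intracellular cholesterol
    """
    k_on_l, k_on_v = 1.0, 3.0        # VLDL assumed the better competitor
    k_int = 0.5                       # internalisation rate of bound particles
    chol_l, chol_v = 1.0, 1.0         # cholesterol delivered per particle
    occupancy_v = 4.0                 # one bound VLDL blocks several LDL sites
    prod = lambda c: 0.2 / (1.0 + c)  # cholesterol-dependent pit production
    r, bl, bv, c = 1.0, 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        bind_l = k_on_l * r * l_ext
        bind_v = k_on_v * r * v_ext
        r = max(r + dt * (prod(c) - bind_l - occupancy_v * bind_v), 0.0)
        bl += dt * (bind_l - k_int * bl)
        bv += dt * (bind_v - k_int * bv)
        c += dt * k_int * (chol_l * bl + chol_v * bv)
    return bl, bv, c

bl, bv, c = simulate()
# With equal external concentrations, more VLDL than LDL occupies the pits.
```

Even in this crude form, the VLDL advantage in binding affinity plus its larger pit footprint reproduces the qualitative outcome described in the abstract.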

Relevance:

30.00%

Publisher:

Abstract:

A case study on the tendering process and cost/time performance of a public building project in Ghana is conducted. Competitive bids submitted by five contractors for the project, in which contractors were required to prepare their own quantities, were analyzed to compare differences in their pricing levels and risk/requirement perceptions. Queries sent to the consultants at the tender stage were also analyzed to identify the significant areas of concern to contractors in relation to the tender documentation. The five bidding prices were significantly different. The queries submitted for clarifications were significantly different, although a few were similar. Using a before-and-after experiment, the expected cost/time estimate at the start of the project was compared to the actual cost/time values, i.e. what happened in the actual construction phase. The analysis showed that the project exceeded its expected cost by 18% and its planned time by 210%. Variations and inadequate design were the major reasons. Following an exploration of these issues, an alternative tendering mechanism is recommended to clients. A shift away from the conventional approach of awarding work based on price, and serious consideration of alternative procurement routes can help clients in Ghana obtain better value for money on their projects.
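The quoted overruns are simple percentage deviations of actual from expected values. With hypothetical base figures chosen only to reproduce the quoted percentages:

```python
def overrun_pct(expected, actual):
    """Percentage by which the actual value exceeds the expected value."""
    return 100.0 * (actual - expected) / expected

# Hypothetical base figures, chosen only to reproduce the quoted percentages.
cost_overrun = overrun_pct(100.0, 118.0)  # cost 18% over the expected estimate
time_overrun = overrun_pct(10.0, 31.0)    # duration 210% over the planned time
```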

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present a connectionist search technique, Stochastic Diffusion Search (SDS), capable of rapidly locating a specified pattern in a noisy search space. In operation, SDS finds the position of the pre-specified pattern or, if it does not exist, its best instantiation in the search space. This is achieved via parallel exploration of the whole search space by an ensemble of agents searching in a competitive and cooperative manner. We prove mathematically the convergence of Stochastic Diffusion Search: SDS converges to a statistical equilibrium when it locates the best instantiation of the object in the search space. Experiments presented in this paper indicate the high robustness of SDS and show good scalability with problem size. The convergence characteristics of SDS make it a fully adaptive algorithm and suggest applications in dynamically changing environments.
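The test-and-diffuse cycle of SDS can be sketched on a string-matching task. The population size, iteration count, and re-seeding rule below are simplifying assumptions; the essential structure is that each agent tests one randomly chosen component of its hypothesis, and inactive agents are recruited to active agents' hypotheses:

```python
import random

def sds_search(text, pattern, n_agents=100, iters=200, seed=0):
    """Minimal Stochastic Diffusion Search sketch: locate the best
    instantiation of `pattern` in `text` via test and diffusion phases."""
    rng = random.Random(seed)
    positions = list(range(len(text) - len(pattern) + 1))
    hyps = [rng.choice(positions) for _ in range(n_agents)]
    active = [False] * n_agents
    for _ in range(iters):
        # Test phase: each agent checks one randomly chosen pattern element
        # at its hypothesised position.
        for i, h in enumerate(hyps):
            j = rng.randrange(len(pattern))
            active[i] = text[h + j] == pattern[j]
        # Diffusion phase: an inactive agent copies a randomly chosen agent's
        # hypothesis if that agent is active, otherwise re-seeds at random.
        for i in range(n_agents):
            if not active[i]:
                k = rng.randrange(n_agents)
                hyps[i] = hyps[k] if active[k] else rng.choice(positions)
    # Report the hypothesis held by the largest cluster of agents.
    return max(set(hyps), key=hyps.count)

best = sds_search("xxxyzhellqxxxhelloxxx", "hello")
```

Note how the exact match at one position forms an absorbing cluster (its agents always pass the test), while the partial match "hellq" loses agents at a steady rate: this cluster asymmetry is what drives convergence to the best instantiation.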

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to convert existing faba bean (Vicia faba L.) single nucleotide polymorphism (SNP) markers from the cleaved amplified polymorphic sequence (CAPS) and SNaPshot® formats, which are expensive and time-consuming, to the more convenient KBiosciences competitive allele-specific PCR (KASP) assay format. Of the 80 assays designed, 75 were validated, though a core set of the 67 most robust markers is recommended for further use. The 67 best KASP SNP assays were used across two generations of single seed descent to detect unintended outcrossing and to track and quantify loss of heterozygosity, a capability that will significantly increase the efficiency and performance of pure-line production and maintenance. The same set of assays was also used to examine genetic relationships between the 67 members of the partly inbred panel, and should prove useful for line identification and diversity studies in the future.
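Quantifying loss of heterozygosity across generations of single seed descent amounts to counting heterozygous calls per line. A minimal sketch, with hypothetical genotype codings ("AA"/"BB" homozygous, "AB" heterozygous) rather than real KASP output:

```python
def heterozygosity(calls):
    """Fraction of scored markers that are heterozygous."""
    scored = [c for c in calls if c in ("AA", "BB", "AB")]
    return sum(c == "AB" for c in scored) / len(scored)

# Hypothetical calls for one line at six SNP markers, two SSD generations apart.
gen1 = ["AB", "AA", "AB", "BB", "AB", "AA"]   # earlier generation
gen2 = ["AA", "AA", "AB", "BB", "BB", "AA"]   # after a round of selfing

# Selfing is expected to roughly halve heterozygosity each generation; an
# *increase* at many markers would instead flag unintended outcrossing.
```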

Relevance:

30.00%

Publisher:

Abstract:

Toxic or allelopathic compounds liberated by toxin-producing phytoplankton (TPP) act as a strong mediator in plankton dynamics. Based on an analysis of a set of phytoplankton biomass data collected by our group in the northwest part of the Bay of Bengal, and on the analysis of a three-component mathematical model under both a constant and a stochastic environment, we explore the role of toxin-allelopathy in determining the dynamic behavior of the competing phytoplankton species. The overall results, based on analytical and numerical analyses, demonstrate that toxin-allelopathy due to the TPP promotes a stable co-existence of competing phytoplankton that would otherwise exhibit competitive exclusion of the weak species. Our study suggests that TPP might be a potential candidate for maintaining the co-existence and diversity of competing phytoplankton species.
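The qualitative result, allelopathy turning competitive exclusion into coexistence, can be illustrated with a two-species Lotka-Volterra sketch. This is not the paper's three-component model: the toxin load is crudely folded into the non-toxic competitor's growth rate, and all parameter values are our own assumptions:

```python
def simulate(toxin=0.0, t_end=200.0, dt=0.01):
    """Euler integration of Lotka-Volterra competition between a
    toxin-producing species p1 (the weaker competitor) and a non-toxic
    species p2; `toxin` crudely lowers p2's effective growth rate."""
    r1, r2 = 1.0, 1.5 - toxin                  # per-capita growth rates
    a11, a12, a21, a22 = 1.0, 1.2, 0.5, 1.0    # illustrative competition terms
    p1, p2 = 0.5, 0.5
    for _ in range(int(t_end / dt)):
        dp1 = p1 * (r1 - a11 * p1 - a12 * p2)
        dp2 = p2 * (r2 - a21 * p1 - a22 * p2)
        p1 = max(p1 + dt * dp1, 0.0)
        p2 = max(p2 + dt * dp2, 0.0)
    return p1, p2

p1_no, p2_no = simulate(toxin=0.0)   # p1 is competitively excluded
p1_tx, p2_tx = simulate(toxin=0.9)   # both species persist
```

With the chosen values the toxin makes mutual invasion possible (each species can grow when rare against the other's equilibrium), which is the textbook condition for stable coexistence.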

Relevance:

30.00%

Publisher:

Abstract:

The network paradigm has been highly influential in spatial analysis in the globalisation era. As economies across the world have become increasingly integrated, so-called global cities have come to play a growing role as central nodes in the networked global economy. The idea that a city's position in global networks benefits its economic performance has resulted in a competitive policy focus on promoting the economic growth of cities by improving their network connectivity. However, in spite of the attention being given to boosting city connectivity, little is known about whether this directly translates into improved city economic performance and, if so, how well connected a city needs to be in order to benefit. In this paper we test the relationship between network connectivity and economic performance between 2000 and 2008 for cities with over 500,000 inhabitants in Europe and the USA, in order to inform European policy.

Relevance:

30.00%

Publisher:

Abstract:

The role played by viral marketing has received considerable academic and digital media attention recently. Key issues in viral marketing have been examined through the lens of the mode of marketing message transmission, including self-replication on the basis of quality differences, individuals' emotional needs, and how users are connected across various social networks. This paper presents a review and analysis of viral marketing studies from 2001 to the present day. It investigates how viral marketing facilitates the diffusion of social media products, and the relationship between marketers and these products' users, by examining the implementation of viral marketing in two European online game firms, Jagex Games Studio and Rovio Entertainment. The results of this review and analysis indicate that viral marketing plays an important role in accelerating the interaction between marketers and users (as well as user groups) in the field of digital media and high-tech consumption. It is therefore evident that firms should understand the social contagion process and purposefully target well-connected users in order to create a competitive advantage.

Relevance:

30.00%

Publisher:

Abstract:

The present work describes a new tool that helps bidders improve their competitive bidding strategies. It is an easy-to-use graphical tool that makes more complex decision analysis techniques usable in the field of competitive bidding. The tool described here moves away from previous bidding models, which attempt to describe the result of an auction or tender process by studying each possible bidder with probability density functions. As an illustration, the tool is applied to three practical cases. Theoretical and practical conclusions on the great potential breadth of application of the tool are also presented.
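The older modelling style the tool departs from, describing each rival bidder with a probability density function, can be sketched with a simple Monte Carlo estimate of the chance that a given bid undercuts every rival. All bid distributions below are hypothetical; this is an illustration of that classical approach, not of the paper's graphical tool:

```python
import random

def win_probability(our_bid, competitors, n=20000, seed=1):
    """Monte Carlo estimate of the probability that `our_bid` undercuts every
    competitor, each modelled by a (mean, std) normal bid distribution."""
    rng = random.Random(seed)
    wins = sum(
        all(our_bid < rng.gauss(mu, sd) for mu, sd in competitors)
        for _ in range(n))
    return wins / n

rivals = [(100.0, 8.0), (105.0, 10.0), (98.0, 6.0)]  # hypothetical bidders
p_low = win_probability(90.0, rivals)    # aggressive bid: high win chance
p_high = win_probability(110.0, rivals)  # conservative bid: low win chance
```

Plotting win probability against bid level (and hence against margin) is the kind of trade-off such density-based models expose, and which the paper's graphical tool aims to make easier to explore.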

Relevance:

30.00%

Publisher:

Abstract:

A new sparse kernel density estimator is introduced, based on the minimum integrated square error criterion combined with local component analysis for the finite mixture model. We start with a Parzen window estimator, which has Gaussian kernels with a common covariance matrix; local component analysis is first applied to find this covariance matrix using the expectation-maximization algorithm. Since the constraint on the mixing coefficients of a finite mixture model places them on the multinomial manifold, we then use the well-known Riemannian trust-region algorithm to find a set of sparse mixing coefficients, utilizing the first- and second-order Riemannian geometry of the multinomial manifold. Numerical examples demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with accuracy competitive with existing kernel density estimators.
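The starting point, a Parzen window estimator with a common bandwidth, can be sketched in one dimension, together with a naive pruning stand-in for the sparsification step. The paper instead minimises the integrated square error with a Riemannian trust-region search over the multinomial manifold; everything below is an illustrative simplification:

```python
import math

def parzen_kde(data, sigma):
    """Parzen window estimate: equal-weight Gaussian kernels, one per data
    point, sharing a common bandwidth `sigma` (1-D for simplicity)."""
    norm = 1.0 / (len(data) * sigma * math.sqrt(2.0 * math.pi))
    return lambda x: norm * sum(
        math.exp(-0.5 * ((x - c) / sigma) ** 2) for c in data)

def sparsify(centres, weights, tol=1e-3):
    """Naive sparsification stand-in: drop kernels whose mixing coefficient
    fell below `tol` and renormalise so the weights stay on the simplex."""
    kept = [(c, w) for c, w in zip(centres, weights) if w > tol]
    total = sum(w for _, w in kept)
    return [c for c, _ in kept], [w / total for _, w in kept]

data = [0.1, 0.2, 0.25, 1.9, 2.0, 2.1]   # two clusters of sample points
pdf = parzen_kde(data, sigma=0.3)
# After optimisation most mixing coefficients are driven to (near) zero;
# here we simply prune a hypothetical weight vector.
centres, weights = sparsify(data, [0.3, 0.3, 0.0005, 0.2, 0.1995, 0.0])
```

The sparse model keeps only the kernels that still carry weight, which is what makes the final estimator cheap to evaluate while (in the paper's formulation) preserving accuracy.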