930 results for Competitive Analysis
Abstract:
We run a standard income convergence analysis for the last decade and confirm an already established finding in the growth economics literature: EU countries are converging, and regions in Europe are converging as well, but within countries regional disparities are on the rise. At the same time, there is probably no reason for EU Cohesion Policy to be concerned with what happens inside countries. Ultimately, our data show that national governments redistribute well across regions, whether they are fiscally centralised or decentralised. It is difficult to establish whether Structural and Cohesion Funds play any role in recent growth convergence patterns in Europe; macroeconomic simulations generally produce better results than empirical tests. It is thus possible that Structural Funds do not fully realise their potential because they are inefficiently allocated, badly managed, or used for the wrong investments, or a combination of all three. The approach used to assess the effectiveness of EU funds should be consistent with the rationale behind the post-1988 EU Cohesion Policy. Standard income convergence analysis is certainly not sufficient: it should be accompanied by an assessment of changes in the efficiency of the capital stock in the recipient countries or regions, as well as by a more qualitative assessment. EU funds for competitiveness and employment should be allocated by looking at each region's capital efficiency to maximise growth-generating effects, or on a purely competitive basis.
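As a rough illustration of what a standard beta-convergence test looks like (toy numbers, not the paper's data), one regresses average growth on log initial income and checks the sign of the slope:

```python
import numpy as np

# Illustrative beta-convergence regression: a negative slope on log
# initial income means poorer economies grew faster (convergence).
log_y0 = np.log(np.array([12_000, 18_000, 25_000, 31_000, 40_000]))
growth = np.array([0.045, 0.035, 0.028, 0.022, 0.018])  # avg annual growth
beta, alpha = np.polyfit(log_y0, growth, 1)
print(f"beta = {beta:.4f}")  # beta < 0 here, consistent with convergence
```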
Abstract:
Context: Learning can be regarded as knowledge construction in which prior knowledge and experience serve as the basis for learners to expand their knowledge base. Such knowledge construction has to take place continuously in order to enhance the learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision that meets their own specific purposes, goals, and expectations. Objectives: The current methods in requirements engineering are capable of modelling the common user's behaviour in the domain of knowledge construction. The users' requirements can be represented as a case in a defined structure, which can be reasoned over to enable requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled; however, there is a lack of suitable modelling methods to achieve this end. This paper presents a new ontological method for capturing an individual user's requirements and transforming them into personalised information provision specifications, so that the right information can be provided to the right user for the right purpose. Method: An experiment was conducted based on a qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulation, mapping, configuration, and learning content, and the results were used as feedback for improvement. Result: The research work has produced an ontology model with a set of techniques which support the functions of profiling users' requirements, reasoning over requirements patterns, generating workflows from norms, and formulating information provision specifications. Conclusion: Current requirements engineering approaches provide the methodological capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance RE approaches for modelling an individual user's needs and discovering the user's requirements.
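As a purely hypothetical sketch of the underlying idea, matching a user's requirements case against available content, the following Python snippet uses illustrative field names and a toy catalogue, not the paper's ontology:

```python
from dataclasses import dataclass, field

@dataclass
class RequirementCase:
    """Hypothetical case structure for one user's requirements
    (field names are illustrative, not the paper's ontology)."""
    user: str
    goals: set = field(default_factory=set)
    prior_knowledge: set = field(default_factory=set)

def provision_spec(case, catalogue):
    """Select catalogue items covering goals the user cannot yet meet."""
    needed = case.goals - case.prior_knowledge
    return [item for item, covers in catalogue.items() if covers & needed]

catalogue = {'intro-RE': {'requirements'}, 'onto-101': {'ontology'}}
case = RequirementCase('alice', goals={'ontology', 'requirements'},
                       prior_knowledge={'requirements'})
print(provision_spec(case, catalogue))  # ['onto-101']
```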
Abstract:
Elevated levels of low-density-lipoprotein cholesterol (LDL-C) in the plasma are a well-established risk factor for the development of coronary heart disease. Plasma LDL-C levels are in part determined by the rate at which LDL particles are removed from the bloodstream by hepatic uptake. The uptake of LDL by mammalian liver cells occurs mainly via receptor-mediated endocytosis, a process which entails the binding of these particles to specific receptors in specialised areas of the cell surface, the subsequent internalisation of the receptor-lipoprotein complex, and ultimately the degradation and release of the ingested lipoproteins' constituent parts. We formulate a mathematical model to study the binding and internalisation (endocytosis) of LDL and VLDL particles by hepatocytes in culture. The system of ordinary differential equations, which includes a cholesterol-dependent pit production term representing feedback regulation of surface receptors in response to intracellular cholesterol levels, is analysed using numerical simulations and steady-state analysis. Our numerical results show good agreement with in vitro experimental data describing LDL uptake by cultured hepatocytes following delivery of a single bolus of lipoprotein. Our model is adapted in order to reflect the in vivo situation, in which lipoproteins are continuously delivered to the hepatocyte. In this case, our model suggests that the competition between the LDL and VLDL particles for binding to the pits on the cell surface affects the intracellular cholesterol concentration. In particular, we predict that when there is continuous delivery of low levels of lipoproteins to the cell surface, more VLDL than LDL occupies the pit, since VLDL are better competitors for receptor binding. VLDL have a cholesterol content comparable to that of LDL particles; however, due to the larger size of VLDL, one pit-bound VLDL particle blocks binding of several LDLs, and there is a resultant drop in the intracellular cholesterol level. When there is continuous delivery of lipoprotein at high levels to the hepatocytes, VLDL particles still out-compete LDL particles for receptor binding, and consequently more VLDL than LDL particles occupy the pit. Although the maximum intracellular cholesterol level is similar for high and low levels of lipoprotein delivery, the maximum is reached more rapidly when the lipoprotein delivery rates are high. The implications of these results for the design of in vitro experiments are discussed.
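A minimal sketch of how such a competitive binding/endocytosis model can be set up and integrated (a simplified system with assumed rate constants, not the paper's exact equations):

```python
from scipy.integrate import solve_ivp

# Simplified sketch of receptor-mediated uptake with LDL/VLDL competition:
# free pits P, pit-bound LDL and VLDL (BL, BV), intracellular cholesterol C.
# Extracellular levels L and V are held constant (continuous delivery); pit
# production is down-regulated by intracellular cholesterol. All parameter
# values are assumed and purely illustrative.
L, V = 1.0, 1.0              # extracellular lipoprotein levels
kL, kV = 1.0, 3.0            # binding rates: VLDL is the better competitor
ki, kdeg = 0.5, 0.1          # internalisation and cholesterol turnover rates
k0, Km = 1.0, 2.0            # pit production and feedback constants
cL, cV = 1.0, 1.2            # cholesterol delivered per internalised particle

def rhs(t, y):
    P, BL, BV, C = y
    prod = k0 / (1 + C / Km)          # cholesterol-dependent pit production
    dP = prod - kL * L * P - kV * V * P
    dBL = kL * L * P - ki * BL
    dBV = kV * V * P - ki * BV
    dC = ki * (cL * BL + cV * BV) - kdeg * C
    return [dP, dBL, dBV, dC]

sol = solve_ivp(rhs, (0, 100), [1.0, 0.0, 0.0, 0.0])
P, BL, BV, C = sol.y[:, -1]
print(f"bound LDL {BL:.3f} vs bound VLDL {BV:.3f}, cholesterol {C:.3f}")
```

With kV > kL the steady state has more pit-bound VLDL than LDL, echoing the competition effect the abstract describes.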
Abstract:
A case study of the tendering process and cost/time performance of a public building project in Ghana is conducted. Competitive bids submitted by five contractors for the project, in which contractors were required to prepare their own quantities, were analyzed to compare differences in their pricing levels and risk/requirement perceptions. Queries sent to the consultants at the tender stage were also analyzed to identify the significant areas of concern to contractors in relation to the tender documentation. The five bidding prices were significantly different, and the queries submitted for clarification were largely dissimilar, with only a few overlaps. In a before-and-after comparison, the expected cost/time estimates at the start of the project were compared with the actual cost/time outcomes of the construction phase. The analysis showed that the project exceeded its expected cost by 18% and its planned time by 210%; variations and inadequate design were the major reasons. Following an exploration of these issues, an alternative tendering mechanism is recommended to clients. A shift away from the conventional approach of awarding work based on price, together with serious consideration of alternative procurement routes, can help clients in Ghana obtain better value for money on their projects.
Abstract:
In this paper we present a connectionist search technique, Stochastic Diffusion Search (SDS), capable of rapidly locating a specified pattern in a noisy search space. In operation, SDS finds the position of the pre-specified pattern or, if it does not exist, of its best instantiation in the search space. This is achieved via parallel exploration of the whole search space by an ensemble of agents searching in a competitive-cooperative manner. We prove mathematically the convergence of SDS: it converges to a statistical equilibrium when it locates the best instantiation of the object in the search space. Experiments presented in this paper indicate the high robustness of SDS and show good scalability with problem size. The convergence characteristics of SDS make it a fully adaptive algorithm and suggest applications in dynamically changing environments.
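A compact sketch of the SDS test/diffusion loop on a string-search toy problem (agent count, iteration budget, and the largest-cluster readout are illustrative choices, not prescribed by the paper):

```python
import random

def sds_search(text, pattern, n_agents=50, iters=100, seed=0):
    """Locate `pattern` (or its best instantiation) in `text` via SDS."""
    rng = random.Random(seed)
    n = len(text) - len(pattern)          # last valid start position
    hyp = [rng.randrange(n + 1) for _ in range(n_agents)]
    active = [False] * n_agents
    for _ in range(iters):
        # Test phase: each agent checks one random component of its hypothesis.
        for i in range(n_agents):
            j = rng.randrange(len(pattern))
            active[i] = text[hyp[i] + j] == pattern[j]
        # Diffusion phase: an inactive agent polls a random agent and copies
        # its hypothesis if that agent is active, else re-seeds at random.
        for i in range(n_agents):
            if not active[i]:
                k = rng.randrange(n_agents)
                hyp[i] = hyp[k] if active[k] else rng.randrange(n + 1)
    return max(set(hyp), key=hyp.count)   # position backed by the largest cluster

print(sds_search("the quick brovn fox", "brown"))  # -> 10, the best
# instantiation despite the noisy 'v' in the search space
```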
Abstract:
The aim of this study was to convert existing faba bean (Vicia faba L.) single nucleotide polymorphism (SNP) markers from cleaved amplified polymorphic sequence (CAPS) and SNaPshot® formats, which are expensive and time-consuming, to the more convenient KBiosciences competitive allele-specific PCR (KASP) assay format. Out of 80 assays designed, 75 were validated, though a core set of the 67 most robust markers is recommended for further use. The 67 best KASP SNP assays were used across two generations of single seed descent to detect unintended outcrossing and to track and quantify loss of heterozygosity, a capability that will significantly increase the efficiency and performance of pure line production and maintenance. The same set of assays was also used to examine genetic relationships between the 67 members of the partly inbred panel, and should prove useful for line identification and diversity studies in the future.
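A tiny helper of the sort one might use to quantify loss of heterozygosity from such marker calls (the genotype coding is assumed, not taken from the paper):

```python
def heterozygosity(calls):
    """Fraction of heterozygous calls among scored biallelic SNP markers.
    `calls` holds two-letter genotypes such as 'AA' or 'AG'; '--' is missing."""
    scored = [c for c in calls if '-' not in c]
    return sum(c[0] != c[1] for c in scored) / len(scored)

f3_line = ['AA', 'AG', 'GG', 'AT', '--', 'CC']
print(heterozygosity(f3_line))  # 0.4; expect roughly half of this after
                                # one further generation of selfing
```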
Abstract:
Toxic or allelopathic compounds liberated by toxin-producing phytoplankton (TPP) act as a strong mediator in plankton dynamics. Based on an analysis of phytoplankton biomass data collected by our group in the northwestern Bay of Bengal, and on the analysis of a three-component mathematical model under both constant and stochastic environments, we explore the role of toxin-allelopathy in determining the dynamic behaviour of the competing phytoplankton species. The overall results, based on analytical and numerical findings, demonstrate that toxin-allelopathy due to the TPP promotes stable co-existence of competing phytoplankton that would otherwise exhibit competitive exclusion of the weaker species. Our study suggests that TPP might be a potential candidate for maintaining the co-existence and diversity of competing phytoplankton species.
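One common way to write down toxin-allelopathy between competitors is a Maynard Smith-style allelopathic mortality term; the reduced two-species sketch below uses assumed parameters (the paper analyses a three-component model):

```python
from scipy.integrate import solve_ivp

# P1: toxin-producing phytoplankton (TPP), P2: competitor. Without the
# toxin term (g = 0) these parameters give competitive exclusion of P1;
# the allelopathic mortality -g*P1*P2**2 opens a stable interior state.
r1, r2, K1, K2 = 1.0, 1.0, 1.0, 1.0
a12, a21, g = 1.2, 0.5, 1.5   # assumed, illustrative values

def rhs(t, y):
    P1, P2 = y
    dP1 = r1 * P1 * (1 - (P1 + a12 * P2) / K1)
    dP2 = r2 * P2 * (1 - (P2 + a21 * P1) / K2) - g * P1 * P2**2
    return [dP1, dP2]

sol = solve_ivp(rhs, (0, 500), [0.4, 0.4])
print(sol.y[:, -1])  # settles near (0.4, 0.5) with these values:
                     # both species persist instead of exclusion
```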
Abstract:
The network paradigm has been highly influential in spatial analysis in the globalisation era. As economies across the world have become increasingly integrated, so-called global cities have come to play a growing role as central nodes in the networked global economy. The idea that a city's position in global networks benefits its economic performance has resulted in a competitive policy focus on promoting the economic growth of cities by improving their network connectivity. However, in spite of the attention given to boosting city connectivity, little is known about whether it directly translates into improved economic performance and, if so, how well connected a city needs to be in order to benefit. In this paper we test the relationship between network connectivity and economic performance between 2000 and 2008 for cities with over 500,000 inhabitants in Europe and the USA, in order to inform European policy.
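A toy version of such a test (made-up numbers; the paper's actual specification and controls will differ) regresses city growth on a connectivity index while controlling for initial size:

```python
import numpy as np

# Toy regression: does connectivity predict growth, controlling for size?
conn = np.array([0.2, 0.5, 0.9, 1.4, 2.0, 3.1])            # connectivity index
size = np.log([0.6e6, 0.8e6, 1.1e6, 1.9e6, 3.2e6, 7.5e6])  # log population
growth = np.array([0.011, 0.014, 0.020, 0.019, 0.024, 0.027])
X = np.column_stack([np.ones_like(conn), conn, size])
coef, *_ = np.linalg.lstsq(X, growth, rcond=None)
print(coef[1])  # marginal effect of connectivity on annual growth
```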
Abstract:
The role played by viral marketing has received considerable academic and digital media attention recently. Key issues in viral marketing have been examined through the lens of the mode of marketing message transmission, including self-replication based on quality differences, individuals' emotional needs, and how users are connected across various social networks. This paper presents a review and analysis of viral marketing studies from 2001 to the present day. It investigates how viral marketing facilitates the diffusion of social media products and the relationship between marketers and these products' users by examining the implementation of viral marketing at two European online game firms, Jagex Games Studio and Rovio Entertainment. The results of this review and analysis indicate that viral marketing plays an important role in accelerating interaction between marketers and users (as well as user groups) in the field of digital media and high-tech consumption. It is therefore evident that firms should understand the social contagion process and purposefully target well-connected users in order to create competitive advantage.
Abstract:
The present work describes a new tool that helps bidders improve their competitive bidding strategies. It consists of an easy-to-use graphical interface that makes more complex decision analysis tools usable in the field of competitive bidding. The tool described here moves away from previous bidding models, which attempt to describe the result of an auction or tender process by studying each possible bidder with probability density functions. As an illustration, the tool is applied to three practical cases. Theoretical and practical conclusions on the tool's broad potential range of application are also presented.
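For context, a minimal sketch of the classical PDF-based approach the tool moves away from: each competitor's historical bid/cost ratios are fitted with a density, and the win probability of a candidate bid is the chance of undercutting all of them (Friedman-style; all numbers illustrative):

```python
import numpy as np
from scipy import stats

def win_probability(bid_ratio, competitor_histories):
    """P(our bid/cost ratio undercuts every competitor), each competitor
    modelled by a normal PDF fitted to historical bid/cost ratios."""
    p = 1.0
    for ratios in competitor_histories:
        dist = stats.norm(np.mean(ratios), np.std(ratios, ddof=1))
        p *= 1.0 - dist.cdf(bid_ratio)   # competitor bids above us
    return p

history = [[1.05, 1.12, 0.98, 1.08], [1.10, 1.22, 1.15, 1.18]]
for b in (0.95, 1.00, 1.05):
    print(b, round(win_probability(b, history), 3))
```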
Abstract:
A new sparse kernel density estimator is introduced, based on the minimum integrated square error criterion combined with local component analysis for the finite mixture model. We start with a Parzen window estimator whose Gaussian kernels share a common covariance matrix; local component analysis is first applied to find this covariance matrix using the expectation-maximization algorithm. Since the constraint on the mixing coefficients of a finite mixture model places them on the multinomial manifold, we then use the well-known Riemannian trust-region algorithm to find the set of sparse mixing coefficients, exploiting the first- and second-order Riemannian geometry of the multinomial manifold. Numerical examples demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with accuracy competitive with existing kernel density estimators.
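The flavour of the weight-optimisation step can be sketched as follows, with the caveat that this toy uses a plain projected-gradient step on the simplex in place of the paper's Riemannian trust-region algorithm (1-D data, fixed bandwidth, assumed values):

```python
import numpy as np

def gauss(d2, s2):
    """Gaussian pdf of squared distance d2 with variance s2."""
    return np.exp(-d2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)

def sparse_kde_weights(x, h, iters=500, lr=0.1):
    """Minimise the integrated squared error of a weighted Parzen mixture
    over the probability simplex; many weights are driven to zero."""
    n = len(x)
    d2 = (x[:, None] - x[None, :]) ** 2
    Q = gauss(d2, 2 * h**2)            # ISE cross terms: kernel convolutions
    P = gauss(d2, h**2)
    np.fill_diagonal(P, 0.0)
    p = P.sum(axis=1) / (n - 1)        # leave-one-out density at each point
    w = np.full(n, 1.0 / n)
    for _ in range(iters):
        grad = Q @ w - p               # gradient of 0.5*w'Qw - w'p
        w = np.maximum(w - lr * grad, 0.0)
        w /= w.sum()                   # crude projection back to the simplex
    return w

x = np.sort(np.random.default_rng(0).normal(size=50))
w = sparse_kde_weights(x, h=0.4)
print((w > 1e-3).sum(), "of", len(x), "kernels retained")
```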
Abstract:
The reaction of cis-[RuCl(2)(P-P)(N-N)] type complexes (P-P = 1,4-bis(diphenylphosphino)butane or 1,1'-bis(diphenylphosphino)ferrocene; N-N = 2,2'-bipyridine or 1,10-phenanthroline) with monodentate ligands (L), such as 4-methylpyridine, 4-phenylpyridine and benzonitrile, forms [RuCl(L)(P-P)(N-N)](+) species. Upon characterization of the isolated compounds by elemental analysis, (31)P{(1)H} NMR and X-ray crystallography, it was found that the type of the L ligand determines its position in relation to the phosphorus atom. While pyridine derivatives such as 4-methylpyridine and 4-phenylpyridine coordinate trans to the phosphorus atom, the benzonitrile ligand (bzCN), a good pi acceptor, coordinates trans to the nitrogen atom. A (31)P{(1)H} NMR experiment following the reaction of the precursor cis-[RuCl(2)(dppb)(phen)] with the benzonitrile ligand shows that the final position of the entering ligand in the complex is determined by the competitive effect between the phosphorus atom and the cyano group of the benzonitrile moiety, rather than by the trans effect. In this case, the benzonitrile group is stabilized trans to one of the nitrogen atoms of the N-N ligand; a differential pulse voltammetry experiment confirms this. In both experiments the [RuCl(bzCN)(dppb)(phen)]PF(6) species with the bzCN ligand positioned trans to a phosphorus atom of the dppb ligand was detected as an intermediate complex.
Structural and thermodynamic analysis of thrombin:suramin interaction in solution and crystal phases
Abstract:
Suramin is a hexasulfonated naphthylurea which has recently been characterized as a non-competitive inhibitor of human alpha-thrombin activity toward fibrinogen, although its binding site and mode of interaction with the enzyme remain elusive. Here, we determined the X-ray structure of the thrombin:suramin complex, refined at 2.4 angstrom resolution. While a single thrombin:suramin complex was found in the asymmetric unit cell of the crystal, some of the crystallographic contacts with symmetrically related molecules are mediated by both the enzyme and the ligand. Molecular dynamics simulations of the 1:1 complex demonstrate a large rearrangement of suramin in the complex, while the protein scaffold and the most extensive protein-ligand contact regions remain unchanged. Small-angle X-ray scattering measurements at high micromolar concentrations demonstrate a suramin-induced dimerization of the enzyme. These data indicate dissimilar binding modes in the monomeric and oligomeric states, with the monomeric 1:1 complex more likely to exist in thrombin's physiological, nanomolar concentration range. Collectively, these results provide a close understanding of the structural basis of the interaction, which might establish a basis for the design of suramin analogues targeting thrombin.
Abstract:
The crystal structures of an aspartic proteinase from Trichoderma reesei (TrAsP) and of its complex with a competitive inhibitor, pepstatin A, were solved and refined to crystallographic R-factors of 17.9% (R(free) = 21.2%) at 1.70 angstrom resolution and 15.81% (R(free) = 19.2%) at 1.85 angstrom resolution, respectively. The three-dimensional structure of TrAsP is similar to the structures of other members of the pepsin-like family of aspartic proteinases. Each molecule is folded in a predominantly beta-sheet bilobal structure with N-terminal and C-terminal domains of about the same size. Structural comparison of the native structure and the TrAsP-pepstatin complex reveals that the enzyme undergoes an induced-fit, rigid-body movement upon inhibitor binding, with the N-terminal and C-terminal lobes tightly enclosing the inhibitor. Upon recognition and binding of pepstatin A, amino acid residues of the enzyme active site form a number of short hydrogen bonds to the inhibitor that may play an important role in the mechanism of catalysis and inhibition. The structures of TrAsP were used as a template for performing statistical coupling analysis of the aspartic protease family. This approach permitted, for the first time, the identification of a network of structurally linked residues putatively mediating conformational changes relevant to the function of this family of enzymes. Statistical coupling analysis reveals coevolved continuous clusters of amino acid residues that extend from the active site into the hydrophobic cores of each of the two domains and include residues from the flap regions, highlighting the importance of these parts of the protein for its enzymatic activity.
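In spirit, statistical coupling analysis scores how strongly amino acid frequencies at pairs of alignment positions co-vary; a much-reduced sketch (plain frequency covariance, without full SCA's conservation weighting) could look like:

```python
import numpy as np

AA = 'ACDEFGHIKLMNPQRSTVWY'

def coupling_matrix(msa):
    """Frobenius norm of the 20x20 amino acid covariance for each
    pair of alignment columns (reduced stand-in for SCA scores)."""
    arr = np.array([list(s) for s in msa])
    X = np.stack([(arr == a).astype(float) for a in AA], axis=-1)
    f = X.mean(axis=0)                       # per-position frequencies
    n_pos = arr.shape[1]
    C = np.zeros((n_pos, n_pos))
    for i in range(n_pos):
        for j in range(n_pos):
            cov = (X[:, i, :, None] * X[:, j, None, :]).mean(axis=0) \
                  - np.outer(f[i], f[j])
            C[i, j] = np.linalg.norm(cov)
    return C

msa = ["ACDKA", "ACEKA", "GCDKT", "GCEKT"]  # toy alignment
print(coupling_matrix(msa).round(2))        # columns 0 and 4 co-vary strongly
```

Clusters of coupled positions are then typically read off the matrix, e.g. by eigen-decomposition or clustering of its rows.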
Abstract:
A cluster fosters stronger commercial relationships among the companies that comprise it, encouraging them to adopt competitive structures that allow them to solve problems they could hardly tackle alone (Lubeck et al., 2011). This paper therefore aims to describe the coopetition between companies operating in a planned commercial cluster, from the retailers' point of view, based on the theoretical models proposed by Bengtsson and Kock (1999) and Leon (2005) and operationalized by means of Social Network Analysis (SNA). Data collection consisted of two phases: the first, exploratory, to identify the actors; the second, descriptive, to characterize the coopetition among the enterprises. As a result, we identified companies that cooperate and compete simultaneously (coopetition), firms that only compete, companies that only cooperate, and businesses that neither compete nor cooperate (coexistence).
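The pairwise classification itself is straightforward to operationalise over two relation layers; a small illustrative sketch (made-up firms and ties, not the study's data):

```python
import networkx as nx

coop = nx.Graph([('A', 'B'), ('B', 'C')])               # cooperation ties
comp = nx.Graph([('A', 'B'), ('A', 'C'), ('C', 'D')])   # competition ties

firms = sorted(set(coop) | set(comp))
for i, u in enumerate(firms):
    for v in firms[i + 1:]:
        c1, c2 = coop.has_edge(u, v), comp.has_edge(u, v)
        label = ('coopetition' if c1 and c2 else
                 'cooperation' if c1 else
                 'competition' if c2 else 'coexistence')
        print(u, v, label)
```

Each firm pair falls into exactly one of the four categories the study distinguishes: coopetition, cooperation only, competition only, or coexistence.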