44 results for Go-to-market strategy
Abstract:
Given the significance of forecasting in real estate investment decisions, this paper investigates forecast uncertainty and disagreement in real estate market forecasts and compares the performance of real estate forecasters with that of non-real estate forecasters. Using the Investment Property Forum (IPF) quarterly survey of UK independent real estate forecasters and a similar survey of macro-economic and capital market forecasters, these forecasts are compared with actual performance to assess a number of forecasting issues in the UK over 1999-2004, including forecast error, bias and consensus. The results suggest that both groups are biased, that their forecasts are less volatile than actual market returns, and that they are inefficient in that forecast errors tend to persist. The strongest finding is that forecasters display the characteristics associated with consensus, indicating herding.
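A minimal sketch, not taken from the paper, of how the reported quantities (forecast error, bias, relative volatility, disagreement and error persistence) could be computed from a panel of survey forecasts; the DataFrame layout and column names ('quarter', 'forecaster', 'forecast', 'actual') are hypothetical assumptions.

```python
# Illustrative sketch (not the paper's code): forecast error, bias,
# relative volatility, disagreement and error persistence from a panel
# of survey forecasts. Assumed layout: one row per forecaster per quarter.
import pandas as pd
from scipy import stats

def evaluate_forecasts(survey: pd.DataFrame) -> dict:
    survey = survey.copy()
    survey["error"] = survey["forecast"] - survey["actual"]

    # Bias: is the mean forecast error significantly different from zero?
    t_stat, p_value = stats.ttest_1samp(survey["error"], 0.0)

    # Volatility: consensus (mean) forecast vs. realised returns.
    consensus = survey.groupby("quarter")[["forecast", "actual"]].mean()
    vol_ratio = consensus["forecast"].std() / consensus["actual"].std()

    # Disagreement: cross-sectional dispersion of forecasts each quarter.
    disagreement = survey.groupby("quarter")["forecast"].std().mean()

    # Inefficiency: persistence of the consensus error (lag-1 autocorrelation).
    persistence = survey.groupby("quarter")["error"].mean().autocorr(lag=1)

    return {"bias_t": t_stat, "bias_p": p_value, "vol_ratio": vol_ratio,
            "disagreement": disagreement, "error_persistence": persistence}
```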
Abstract:
In recent years, there has been a drive to save development costs and shorten the time-to-market of new therapies. Research into novel trial designs to facilitate this goal has led, amongst other approaches, to the development of methodology for seamless phase II/III designs. Such designs allow treatment or dose selection at an interim analysis and comparative evaluation of efficacy against control in the same study. These methods have gained much attention because of their potential advantages over conventional drug development programmes with separate trials for individual phases. In this article, we review the various approaches to seamless phase II/III designs based upon the group-sequential approach, the combination test approach and the adaptive Dunnett method. The objective is to describe the approaches in a unified framework and to highlight their similarities and differences, allowing a trialist considering such a trial to choose an appropriate methodology.
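As an illustration of the combination test approach mentioned above, the sketch below implements the standard inverse-normal combination of two stage-wise p-values for a single treatment-control comparison; the equal weights and example p-values are assumptions, and the multiplicity adjustment required when a treatment is selected at the interim (e.g. via closed testing) is omitted.

```python
# Illustrative sketch of the inverse-normal combination test used in many
# seamless phase II/III designs: stage-wise p-values are combined with
# pre-specified weights, and the null is rejected if the combined p-value
# falls below the overall significance level. Weights and alpha are assumed.
from math import sqrt
from scipy.stats import norm

def inverse_normal_combination(p1: float, p2: float,
                               w1: float = sqrt(0.5),
                               w2: float = sqrt(0.5)) -> float:
    """Combined p-value for stage-wise p-values p1 and p2."""
    assert abs(w1**2 + w2**2 - 1.0) < 1e-9, "weights must satisfy w1^2 + w2^2 = 1"
    z = w1 * norm.ppf(1 - p1) + w2 * norm.ppf(1 - p2)
    return 1 - norm.cdf(z)

# Example: stage-1 p-value 0.08 for the selected dose, stage-2 p-value 0.03.
combined = inverse_normal_combination(0.08, 0.03)
print(f"combined p-value: {combined:.4f}")  # compare with, e.g., alpha = 0.025
```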
Abstract:
This paper investigates the extent to which clients were able to influence performance measurement appraisals during the downturn in commercial property markets that began in the UK during the second half of 2007. The sharp change in market sentiment produced speculation that different client categories were attempting to influence their appraisers in different ways. In particular, it was recognised that the requirement for open-ended funds to meet redemptions gave them strong incentives to ensure that their asset values were marked down to market. Using data supplied by Investment Property Databank, we demonstrate that unlisted open-ended funds did indeed experience sharper falls in capital values than other fund types in the second half of 2007, after the market turning point. These differences are statistically significant and cannot simply be explained by differences in portfolio composition. Client influence on appraisal is one possible explanation of the results observed: different pressures on fund managers resulting in different appraisal outcomes.
Abstract:
Background: Microarray-based comparative genomic hybridisation (CGH) experiments have been used to study numerous biological problems, including understanding genome plasticity in pathogenic bacteria. Typically such experiments produce large data sets that are difficult for biologists to handle. Although there are some programmes available for interpretation of bacterial transcriptomics data and of CGH microarray data for examining genetic stability in oncogenes, there are none specifically designed to understand the mosaic nature of bacterial genomes. Consequently a bottleneck still persists in the accurate processing and mathematical analysis of these data. To address this shortfall we have produced a simple and robust CGH microarray data analysis process, which may be automated in the future, to understand bacterial genomic diversity. Results: The process involves five steps: cleaning, normalisation, estimating gene presence and absence or divergence, validation, and analysis of data from test strains against three reference strains simultaneously. Each stage of the process is described, and we have compared a number of methods available for characterising bacterial genomic diversity and for calculating the cut-off between gene presence and absence or divergence, showing that a simple dynamic approach using a kernel density estimator performed better than both established methods and a more sophisticated mixture modelling technique. We have also shown that current methods commonly used for CGH microarray analysis in tumour and cancer cell lines are not appropriate for analysing our data. Conclusion: After carrying out the analysis and validation for three sequenced Escherichia coli strains, CGH microarray data from 19 E. coli O157 pathogenic test strains were used to demonstrate the benefits of applying this simple and robust process to CGH microarray studies using bacterial genomes.
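A minimal sketch of the kernel-density-estimator idea described in the Results: the (typically bimodal) distribution of log-ratios for a test strain is smoothed with a Gaussian kernel and the present versus absent-or-divergent cut-off is placed at the density minimum between the two main modes. This is a generic reconstruction, not the authors' code, and the function name and inputs are assumptions.

```python
# Illustrative sketch of a KDE-based cut-off between "present" and
# "absent/divergent" genes in CGH microarray log-ratios.
import numpy as np
from scipy.stats import gaussian_kde
from scipy.signal import find_peaks

def kde_cutoff(log_ratios: np.ndarray, grid_size: int = 512) -> float:
    """Cut-off log-ratio separating the two modes of a bimodal distribution."""
    kde = gaussian_kde(log_ratios)
    grid = np.linspace(log_ratios.min(), log_ratios.max(), grid_size)
    density = kde(grid)

    # Locate the two most prominent modes of the smoothed density.
    peaks, props = find_peaks(density, prominence=0.0)
    top_two = peaks[np.argsort(props["prominences"])[-2:]]
    left, right = np.sort(top_two)

    # Cut-off = point of minimum density between the two modes.
    valley = left + np.argmin(density[left:right + 1])
    return float(grid[valley])

# Genes with log-ratios below the cut-off would be called absent/divergent,
# those above it present (assuming test signal is in the numerator).
```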
Abstract:
A semi-quantitative cloacal-swab method was used as an indirect measure of caecal colonisation of one-day-old and five-day-old chicks after oral dosing with wild-type Salmonella enterica serovar Enteritidis PT4 and genetically defined isogenic derivatives lacking the ability to elaborate flagella or fimbriae. Birds of both ages were readily and persistently colonised by all strains, although there was a decline in shedding by the older birds after about 21 days. There were no significant differences in shedding of wild-type or mutants in single-dose experiments. In competition experiments, in which five-day-old birds were dosed orally with wild-type and mutants together, shedding of non-motile derivatives was significantly lower than that of the wild-type. At 35 days post-infection, birds were sacrificed and direct counts of mutants and wild-type from each caecum were determined. Whilst there appeared to be poor correlation between direct counts and the indirect swab method, the overall trends shown by these methods of assessment indicated that flagella, and not fimbriae, were important in caecal colonisation in these models.
Abstract:
This paper considers the attitudes of students in Years 11, 12 and 13 towards French and, in particular, how they view the reasons behind their level of achievement. It reports findings from a small-scale pilot study, conducted in four schools and colleges, involving 83 students in Year 11, 26 in Year 12 and 14 in Year 13. The findings indicate that French is perceived by many Year 11 students to be difficult and uninteresting. These students, furthermore, do not consider that French is of much benefit in terms of their future career. The data suggest that there is a tendency among students in all three year groups to attribute their lack of success in French to their own low ability and to the difficulty of tasks set, which, it is argued, may affect their levels of motivation in a negative way. Few students in the study have any insight into the importance of learning strategies in overcoming difficulties experienced in language learning. Students' attitudes are then discussed in relation to learning strategy training. It is argued that if learners are encouraged to explore the possibility that their achievement in French may be related to the efficacy of the learning strategies they use, rather than to factors such as low ability or task difficulty, their self-concept, motivation and language learning achievements can be enhanced. A brief outline is given of a planned research project which proposes to address these issues further.
Abstract:
Numerous studies have attempted to develop strategic alignment mechanisms. The strategic alignment mechanism is broken down into two categories, namely strategy process and strategy content. Our review shows that alignment research has been carried out in isolation, with strategy process and strategy content studied separately. We see this as having limited the extent to which executives can understand the elements of performance. We concur with a number of researchers in postulating that using a mechanism such as multilevel learning to combine strategy content and strategy process under one metaphor can greatly facilitate, through exploration and exploitation, the understanding not only of human interactions within a firm, but also of the interaction between a firm and its environment. The findings in this study further support the idea of integrating strategy process and content to gain a better understanding of alignment maturity and its impact on business performance. The study also elaborates on the effect of misalignment on company performance.
Abstract:
The concept of slow vortical dynamics and its role in theoretical understanding is central to geophysical fluid dynamics. It leads, for example, to “potential vorticity thinking” (Hoskins et al. 1985). Mathematically, one imagines an invariant manifold within the phase space of solutions, called the slow manifold (Leith 1980; Lorenz 1980), to which the dynamics are constrained. Whether this slow manifold truly exists has been a major subject of inquiry over the past 20 years. It has become clear that an exact slow manifold is an exceptional case, restricted to steady or perhaps temporally periodic flows (Warn 1997). Thus the concept of a “fuzzy slow manifold” (Warn and Ménard 1986) has been suggested. The idea is that nearly slow dynamics will occur in a stochastic layer about the putative slow manifold. The natural question then is, how thick is this layer? In a recent paper, Ford et al. (2000) argue that Lighthill emission—the spontaneous emission of freely propagating acoustic waves by unsteady vortical flows—is applicable to the problem of balance, with the Mach number Ma replaced by the Froude number F, and that it is a fundamental mechanism for this fuzziness. They consider the rotating shallow-water equations and find emission of inertia–gravity waves at O(F²). This is rather surprising at first sight, because several studies of balanced dynamics with the rotating shallow-water equations have gone beyond second order in F, and found only an exponentially small unbalanced component (Warn and Ménard 1986; Lorenz and Krishnamurthy 1987; Bokhove and Shepherd 1996; Wirosoetisno and Shepherd 2000). We have no technical objection to the analysis of Ford et al. (2000), but wish to point out that it depends crucially on R ≳ 1, where R is the Rossby number. This condition requires the ratio of the characteristic length scale of the flow L to the Rossby deformation radius LR to go to zero in the limit F → 0. This is the low Froude number scaling of Charney (1963), which, while originally designed for the Tropics, has been argued to be also relevant to mesoscale dynamics (Riley et al. 1981). If L/LR is fixed, however, then F → 0 implies R → 0, which is the standard quasigeostrophic scaling of Charney (1948; see, e.g., Pedlosky 1987). In this limit there is reason to expect the fuzziness of the slow manifold to be “exponentially thin,” and balance to be much more accurate than is consistent with (algebraic) Lighthill emission.
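The distinction between the two limits can be made explicit with the standard shallow-water definitions of the Froude and Rossby numbers (a textbook reconstruction, not taken from the papers cited):

```latex
% Standard shallow-water scalings: U = velocity scale, L = length scale,
% f = Coriolis parameter, c = \sqrt{gH} = gravity-wave speed, L_R = c/f.
F = \frac{U}{c}, \qquad R = \frac{U}{fL}, \qquad L_R = \frac{c}{f}
\quad\Longrightarrow\quad F = R\,\frac{L}{L_R}.
% With R \gtrsim 1 held fixed, F \to 0 forces L/L_R \to 0
% (the Charney 1963 low Froude number scaling).
% With L/L_R held fixed, F \to 0 forces R \to 0
% (the Charney 1948 quasigeostrophic scaling).
```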
Abstract:
Evidence suggests that rational, periodically collapsing speculative bubbles may be pervasive in stock markets globally, but no research has considered them at the individual stock level. In this study we develop and test an empirical asset pricing model that allows for speculative bubbles to affect stock returns. We show that stocks incorporating larger bubbles yield higher returns. The bubble deviation, at the stock level as opposed to the industry or market level, is a priced source of risk that is separate from the standard market risk, size and value factors. We demonstrate that much of the common variation in stock returns that is attributable to market risk is due to the co-movement of bubbles rather than being driven by fundamentals.
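A minimal sketch of the kind of factor regression implied above, augmenting the standard market, size and value factors with a bubble-deviation factor; the DataFrame and column names are hypothetical and this is not the authors' exact specification.

```python
# Illustrative sketch: time-series regression of a stock's excess returns on
# market, size and value factors plus a bubble-deviation factor, in the
# spirit of the model described above. Column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

def bubble_factor_regression(df: pd.DataFrame):
    """df columns (hypothetical): 'excess_ret', 'mkt_rf', 'smb', 'hml', 'bubble'."""
    X = sm.add_constant(df[["mkt_rf", "smb", "hml", "bubble"]])
    return sm.OLS(df["excess_ret"], X).fit()

# A significant coefficient on 'bubble' would indicate that the bubble
# deviation is priced over and above the market, size and value factors.
```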
Abstract:
Despite expectations about its benefits, the adoption of Electronic Commerce (EC) by small and medium-sized firms in the Italian agro-food sector is still infrequent; nevertheless, understanding the opportunities it could create and how they can be exploited remains a relevant issue. This study, carried out in the Emilia-Romagna region during 2002, presents the results of a survey of 208 firms at all stages of the agro-food chain aimed at understanding their use of the Internet and the strategies adopted for EC implementation. The results show a low level of implementation of the instrument and a limited variety of adoption strategies. Agro-food firms actually invest very little in EC, focusing their efforts on the Internet as a promotion tool, while web-based direct selling is confined to market niches. The view that the Internet would reverse the disadvantages of small firms now appears unrealistic, even if interesting opportunities for further development are still present.
Abstract:
Trading commercial real estate involves a process of exchange that is costly and which occurs over an extended and uncertain period of time. This has consequences for the performance and risk of real estate investments. Most research on transaction times has occurred for residential rather than commercial real estate. We study the time taken to transact commercial real estate assets in the UK using a sample of 578 transactions over the period 2004 to 2013. We measure average time to transact from a buyer and seller perspective, distinguishing the search and due diligence phases of the process, and we conduct econometric analysis to explain variation in due diligence times between assets. The median time for purchase of real estate from introduction to completion was 104 days and the median time for sale from marketing to completion was 135 days. There is considerable variation around these times and results suggest that some of this variation is related to market state, type and quality of asset, and type of participants involved in the transaction. Our findings shed light on the drivers of liquidity at an individual asset level and can inform models that quantify the impact of uncertain time on market on real estate investment risk.
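A sketch of the sort of cross-sectional model that could be used to explain variation in due diligence times; the regressors and column names are illustrative assumptions rather than the paper's specification.

```python
# Illustrative sketch: regressing log due-diligence time on asset and market
# characteristics, as a stand-in for the econometric analysis described above.
# All column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def due_diligence_model(deals: pd.DataFrame):
    """deals columns (hypothetical): 'dd_days', 'market_state', 'asset_type',
    'lot_size', 'buyer_type'."""
    deals = deals.assign(log_dd=np.log(deals["dd_days"]))
    return smf.ols(
        "log_dd ~ market_state + C(asset_type) + np.log(lot_size) + C(buyer_type)",
        data=deals,
    ).fit()
```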
Abstract:
The paper examines the process of bank internationalisation and explores how banks become international organisations and what this involves. It also assesses the significance of their international operations and determines whether banks are truly global organisations. The empirical data are based on the 60 largest banks in the world, and content analysis is used to categorise the information into the eight international strategies of Atamer, Calori, Gustavsson, and Menguzzato-Boulard [Internationalisation strategies. In R. Calori, T. Atamer, & P. Nunes (Eds.), The dynamics of international competition – from practice to theory, strategy series (pp. 162–206). London: Sage (2000)] and Bryan, Fraser, Oppenheim, and Rall [Race for the World: Strategies to build a great global firm. Boston, MA: Harvard Business School Press (1999)]. The findings suggest that the majority of banks focus on countries or geographic regions with which they have some sort of cultural or economic affinity. Moreover, apart from a relatively small number of very large banks, they are international rather than truly global organisations.