869 results for 150507 Pricing (incl. Consumer Value Estimation)
Abstract:
The aim of this research was to identify the role of brand reputation in encouraging consumer willingness to provide personal data online in exchange for the benefits of personalisation. This study extends Malhotra, Kim and Agarwal's (2004) Internet Users' Information Privacy Concerns model and uses the theoretical underpinning of Social Contract Theory to assess how brand reputation moderates the relationships of trusting beliefs and perceived value (the Privacy Calculus framework) with willingness to give personal information. The research is highly relevant, as most privacy research undertaken to date focuses on consumer-related concerns; very little research examines the role of brand reputation in online privacy. Practical implications of this research include knowledge of how to minimise online privacy concerns and improve brand reputation, as well as insight into how to reduce consumer resistance to the collection of personal information and encourage consumer opt-in.
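A minimal sketch of how a moderation effect of this kind is commonly tested, via an interaction term in a regression; the variable names and simulated data below are hypothetical illustrations, not the study's measures or results:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "trust": rng.normal(0, 1, n),        # trusting beliefs (hypothetical scale)
    "reputation": rng.normal(0, 1, n),   # brand reputation (hypothetical scale)
})
# Simulated outcome with a built-in moderation effect for illustration
df["willingness"] = (0.4 * df.trust + 0.2 * df.reputation
                     + 0.3 * df.trust * df.reputation + rng.normal(0, 1, n))

# A significant trust:reputation coefficient indicates moderation.
model = smf.ols("willingness ~ trust * reputation", data=df).fit()
print(model.params[["trust", "reputation", "trust:reputation"]])
```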
Abstract:
Purpose: This study aims to use opportunity as a theoretical lens to investigate how the spatio-temporal and social dimensions of the consumption environment create perceived opportunities for consumers to misbehave. Design/methodology/approach: Drawing on routine activity theory and social impact theory, the authors use two experiments to demonstrate that spatio-temporal and social dimensions can explain consumer theft in retail settings. Findings: Study 1 reveals mixed empirical support for the basic dimensions of routine activity theory, which posits that the opportunity to thieve is optimised when a motivated offender, a suitable target and the absence of a capable formal guardian converge in time and space. Extending the notion of guardianship, Study 2 tests social impact theory and shows that informal guardianship affects the likelihood of theft under optimal routine activity conditions. Originality/value: The findings highlight important implications for academics and retail managers: rather than focusing on the uncontrollable characteristics of thieving offenders, the more controllable spatio-temporal and social factors of the retail environment can be actively monitored and manipulated to reduce perceived opportunities for consumer misbehaviour.
Abstract:
This report provides an evaluation of the behaviours and purchasing drivers of key sweetpotato consumers, defined by Nielsen consumer research as Established Couples (two or more adults with no children 17 and under, and head of house 35-59), Senior Couples (two or more adults with no children 17 or under, and head of house 60 or over), and Independent Singles (one-person household 35 or over, no children 17 or under). The research was qualitative in nature; methods used included focus groups, depth interviews and shop-a-longs.

The report found that preferences for sweetpotato amongst these groups varied. In general, a smaller torpedo-shaped vegetable was valued for ease of preparation and the convenience of being of sufficient size for a meal for two. Satisfaction with sweetpotato was high, with negative comments on quality exceedingly rare within discussions. However, shop-a-longs revealed that some quality issues were apparent at retail, such as withered product, pitting and occasional damage. A display with stock resting in any amount of water was a barrier to purchase for consumers, and this was apparent on two out of 15 occasions. A high-quality sweetpotato was of a deep orange/red colour, had a smooth skin and was extremely dense and hard. An inferior sweetpotato was wrinkly, spongy, pitted and damaged.

Awareness of sweetpotato was a relatively recent phenomenon amongst the respondents of this study, with most recalling first eating the vegetable in the last five to 10 years. Life-time eating patterns emerged as a consequence of childhood food experiences, such as growing up with a 'meat and three veg' philosophy and traditional Australian meals. However, this was dependent on cultural background, and those with ties to diverse cultures were more likely to have always known of the vegetable. Sweetpotato trial and consumption coincided with a breaking away from these traditional patterns, or was integrated into conventional meals, for example as a baked vegetable to accompany roasts. Increased health consciousness also led to awareness of the vegetable.

A primary catalyst for consumption within the Established and Senior Couples groups was the health benefits associated with sweetpotato. Consumers had very little knowledge of the specific health properties of the vegetable and were surprised at the number of benefits consumption provided. Sweetpotato was important for diabetics for its low Glycemic Index status; top-of-mind awareness of the vegetable resulted from the onset of the disease. Increasing fibre was a key motive for this demographic, and this provided a significant link between consumption and preventing bowel cancer. For those on a weight-loss regime, sweetpotato was perceived as a tasty, satisfying food that was low in carbohydrates. Swapping behaviours, where white potato was replaced by sweetpotato, were often a response to these health concerns. Other health properties mentioned by participants through the course of the research included the Vitamin A precursor β-carotene and Vitamins A and C.

The sweetpotato was also appreciated for its hedonic and time-saving qualities. For consumers with a high involvement in food, the vegetable was valued for its versatility in meals. These consumers took pride in cooking, and the flavour and texture of sweetpotato lent themselves to a variety of meals such as soups, salads, roasts, curries and tagines.
Participants who had little time or desire to prepare and cook meals valued sweetpotato because it was an easy way to add colour and variety to the plate, and because including an orange vegetable in meals is a shortcut to ensuring vitamin intake.

Several recommendations are made to the sweetpotato industry:
• Vigorously promote the distinct nutritional and health properties of sweetpotatoes, particularly where they compare favourably with other vegetables or foods
• Promote the salient properties to specific targets such as diabetics, those at risk of bowel cancer, and those embarking on a weight-loss regime. Utilise specialist channels of communication such as diabetic magazines and websites
• Promote styles of cooking sweetpotato that would appeal to traditionalists, such as roasts and BBQs
• Promote sweetpotato as a low-maintenance vegetable, easy to store and easy to cook, focusing particularly on it as a simple way to boost the appearance and nutritional value of meals
• Promote the vegetable to high-food-involvement consumers through exotic recipes, linking it to feelings of accomplishment in cooking
• Promote the versatility of the vegetable
• Devise promotions that link the images and tone of communications with enjoying life to the fullest, having time to enjoy family and grandchildren, and partaking in social activities
• Educate retailers on consumer perceptions of quality, ensuring moisture and mould are not present at displays

Qualitative information, while providing a wealth of detail, cannot be extrapolated to the overall target population, and this may be considered a limitation of the research. However, within research theory, effective quantitative design is believed to stem from the insights developed in qualitative studies. Accordingly, a final recommendation is to:
• Develop and implement a quantitative study on sweetpotato attitudes and behaviours based on the results of this study.
Abstract:
Abstract of Macbeth, G. M., Broderick, D., Buckworth, R. & Ovenden, J. R. (In press, Feb 2013). Linkage disequilibrium estimation of effective population size with immigrants from divergent populations: a case study on Spanish mackerel (Scomberomorus commerson). G3: Genes, Genomes and Genetics. Estimates of genetic effective population size (Ne) using molecular markers are a potentially useful tool for the management of species ranging from endangered to commercial. However, pitfalls are predicted when the effective size is large, as estimates then require large numbers of samples from wild populations for statistical validity. Our simulations showed that linkage disequilibrium estimates of Ne up to 10,000 with finite confidence limits can be achieved with sample sizes around 5000. This was deduced from empirical allele frequencies of seven polymorphic microsatellite loci in a commercially harvested fisheries species, the narrow-barred Spanish mackerel (Scomberomorus commerson). As expected, the smallest standard deviation of Ne estimates occurred when low-frequency alleles were excluded. Additional simulations indicated that the linkage disequilibrium method was sensitive to small numbers of genotypes from cryptic species or conspecific immigrants. A correspondence analysis algorithm was developed to detect and remove outlier genotypes that could have been inadvertently sampled from cryptic species or from non-breeding immigrants from genetically separate populations. Simulations demonstrated the value of this approach in Spanish mackerel data. When putative immigrants were removed from the empirical data, 95% of the Ne estimates from jackknife resampling were above 24,000.
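As a hedged illustration of the linkage disequilibrium method referred to above (the approximation E[r²] ≈ 1/(3Ne) + 1/S, often attributed to Hill 1981), not the authors' actual pipeline; the r² values and sample size below are synthetic:

```python
import numpy as np

def ld_ne(r2_mean: float, s: int) -> float:
    """Point estimate of Ne from mean squared gametic correlation r^2,
    using E[r^2] ~ 1/(3*Ne) + 1/S for a sample of S individuals."""
    adj = r2_mean - 1.0 / s          # remove sampling-induced LD
    return np.inf if adj <= 0 else 1.0 / (3.0 * adj)

# Jackknife over locus pairs for a rough confidence interval (the resampling
# idea mirrors the abstract; the numbers are made up for illustration).
rng = np.random.default_rng(1)
true_ne, s = 10_000, 5000
r2_pairs = rng.normal(1/s + 1/(3*true_ne), 2e-5, size=21)  # 7 loci -> 21 pairs
estimates = [ld_ne(np.delete(r2_pairs, i).mean(), s)
             for i in range(len(r2_pairs))]
print(f"jackknife median Ne ~ {np.median(estimates):,.0f}")
```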
Abstract:
Background: Australian policy mandates consumer and carer participation in mental health services at all levels, including research. Inspired by a UK model, Service Users Group Advising on Research (SUGAR), we conducted a scoping project in 2013 with a view to creating a consumer- and carer-led research process that moves beyond stigma and tokenism, values the unique knowledge of lived experience and leads to people being treated better when accessing services. This poster presents the initial findings. Aims: The project's purpose was to explore with consumers, consumer companions and carers at Metro North Mental Health-RBWH their interest in and views about research partnerships with academic and clinical colleagues. Methods: This poster overviews the initial findings from three audio-recorded focus groups conducted with a total of 14 consumers, carers and consumer companions at the Brisbane site. Analysis: Our work was guided by framework analysis (Gale et al. 2013), which defines five steps for analysing narrative data: familiarising, development of categories, indexing, charting and interpretation. Eight main ideas were initially developed and were divided between the authors for further indexing. This process identified 37 related analytic ideas. The authors integrated these by combining, removing and redefining them by consensus through a mapping process. The final step is the return of the analysis to the participants for feedback and input into the interpretation of the focus group discussions. Results: 1. Value & Respect: feeling valued and respected, tokenism, stigma, governance, valuing prior knowledge and background. 2. Pathways to Knowledge and Involvement in Research: 'where to begin', support, unity and partnership, communication, co-ordination, flexibility due to fluctuating capacity. 3. Personal Context: barriers regarding commitments and the nature of mental illness, wellbeing needs, prior experience of research, motivators, attributes. 4. What is Research?: developing knowledge; what to do research on, how and why. Conclusion and Discussion: Initial analysis suggests that participants saw potential for 'amazing things' in mental health research, such as reflecting their priorities and moving beyond stigma and tokenism. The main needs identified were education, mentoring, funding support and research processes that fit consumers' and carers' limitations and fluctuating capacities. Participants identified maintaining motivation and interest as an issue, since research processes are often extended by ethics and funding applications. They felt that consumer- and carer-led research would value the unique knowledge that the lived experience of consumers and carers brings, and would lead to people being treated better when accessing services.
Abstract:
It is common to model the dynamics of fisheries using natural and fishing mortality rates estimated independently in two separate analyses. Fishing mortality is routinely estimated from widely available logbook data, whereas estimates of natural mortality have often required more specific, less frequently available data. However, in the case of the fishery for brown tiger prawn (Penaeus esculentus) in Moreton Bay, both fishing and natural mortality rates have been estimated from logbook data. The present work extended the fishing mortality model to incorporate an eco-physiological response of tiger prawn to temperature, and allowed recruitment timing to vary from year to year. These ecological characteristics of the dynamics of this fishery were ignored in the separate model that estimated natural mortality. Therefore, we propose to estimate both natural and fishing mortality rates within a single model using a consistent set of hypotheses. This approach was applied to Moreton Bay brown tiger prawn data collected between 1990 and 2010. Natural mortality was estimated by maximum likelihood to be 0.032 ± 0.002 week⁻¹, approximately 30% lower than the fixed value used in previous models of this fishery (0.045 week⁻¹).
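A minimal sketch of the single-model idea: fitting natural mortality M and catchability q jointly to logbook-style catch-and-effort data by maximum likelihood. The dynamics, effort series and observation model below are simplified assumptions for illustration, not the authors' specification:

```python
import numpy as np
from scipy.optimize import minimize

# Simulated weekly logbook-style data (illustrative only)
rng = np.random.default_rng(0)
weeks, M_true, q_true, N0 = 40, 0.032, 5e-4, 1e6
effort = rng.uniform(50, 150, weeks)
N = np.empty(weeks); N[0] = N0
for t in range(weeks - 1):
    N[t + 1] = N[t] * np.exp(-(q_true * effort[t] + M_true))   # F_t = q * E_t
catch = q_true * effort * N * np.exp(rng.normal(0, 0.05, weeks))

def neg_log_lik(params):
    """Gaussian likelihood on log-catch residuals; M, q, N0 on log scale."""
    log_q, log_M, log_N0 = params
    q, M, n = np.exp(log_q), np.exp(log_M), np.exp(log_N0)
    pred = np.empty(weeks)
    for t in range(weeks):
        pred[t] = q * effort[t] * n
        n *= np.exp(-(q * effort[t] + M))
    resid = np.log(catch) - np.log(pred)
    return 0.5 * np.sum(resid**2)

fit = minimize(neg_log_lik, x0=np.log([1e-3, 0.05, 5e5]), method="Nelder-Mead")
print("estimated M per week:", np.exp(fit.x[1]))
```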
Abstract:
We provide analytical models for capacity evaluation of an infrastructure IEEE 802.11-based network carrying TCP-controlled file downloads or full-duplex packet telephone calls. In each case the analytical models utilise the attempt probabilities from a well-known fixed-point-based saturation analysis. For TCP-controlled file downloads, following Bruno et al. (In Networking '04, LNCS 2042, pp. 626-637), we model the number of wireless stations (STAs) with ACKs as a Markov renewal process embedded at packet success instants. In our work, the evolution between the embedded instants is analysed by using saturation analysis to provide state-dependent attempt probabilities. We show that, in spite of its simplicity, our model works well, by comparing various simulated quantities, such as collision probability, with values predicted from our model. Next we consider N constant-bit-rate VoIP calls terminating at N STAs. We model the number of STAs that have an up-link voice packet as a Markov renewal process embedded at so-called channel slot boundaries. The evolution over a channel slot is analysed using saturation analysis as before. We find that the AP is again the bottleneck, and the system can support (in the sense of a bound on the probability of delay exceeding a given value) a number of calls less than that at which the arrival rate into the AP exceeds the average service rate applied to the AP. Finally, we extend the analytical model for VoIP calls to determine the call capacity of an 802.11b WLAN in a situation where VoIP calls originate from two different types of coders: N1 calls from Type 1 codecs and N2 calls from Type 2 codecs. For G.711 and G.729 voice coders, we show that the analytical model again provides accurate results in comparison with simulations.
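The fixed-point saturation analysis referred to above is typically the well-known Bianchi-style calculation; a minimal sketch follows, where the backoff parameters are illustrative defaults, not the paper's settings:

```python
def attempt_probability(n_stations, cw_min=32, m=5, iters=200):
    """Solve the saturation fixed point tau = f(p), p = 1 - (1 - tau)^(n-1)
    by iteration; returns the per-slot attempt and collision probabilities."""
    tau = 0.1
    for _ in range(iters):
        p = 1.0 - (1.0 - tau) ** (n_stations - 1)   # conditional collision prob.
        # Bianchi's attempt probability under binary exponential backoff
        tau = (2.0 * (1 - 2 * p)
               / ((1 - 2 * p) * (cw_min + 1) + p * cw_min * (1 - (2 * p) ** m)))
    return tau, p

# In the paper's setting, n would be the number of currently backlogged STAs,
# giving the state-dependent attempt probabilities used between embedded instants.
tau, p = attempt_probability(n_stations=10)
print(f"attempt prob tau = {tau:.4f}, collision prob p = {p:.4f}")
```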
Abstract:
Purpose: The research purpose was to identify both the inspiration sources used by fast fashion designers and the ways designers sort information from those sources during the product development process. Design/methodology/approach: This is a qualitative study, drawing on semi-structured interviews conducted with the members of the in-house design teams of three Australian fast fashion companies. Findings: Australian fast fashion designers rely on a combination of trend data, sales data, product analysis and travel for design development ideas. The designers then use the consensus and embodiment methods to interpret and synthesise information from those inspiration sources. Research limitations/implications: The empirical data used in the analysis are limited, being drawn from interviews with fashion designers at only three Australian companies. Originality/value: This research augments knowledge of fast fashion product development, in particular designers' methods and approaches to product design within a volatile and competitive market.
Abstract:
Asian elephants (Elephas maximus), a prominent "flagship species", are listed as endangered (EN - A2c, ver. 3.1, IUCN Red List 2009), and there is a need for their conservation. This requires understanding the demographic and reproductive dynamics of the species. Monitoring the reproductive status of any species has traditionally been carried out through invasive blood sampling, which is restrictive for large animals such as wild or semi-captive elephants for legal, ethical and practical reasons. Hence, there is a need for a non-invasive technique to assess reproductive cyclicity profiles of elephants, which will help in the species' conservation strategies. In this study, we developed an indirect competitive enzyme-linked immunosorbent assay (ELISA) to estimate the concentration of one of the progesterone metabolites, allopregnanolone (5α-P-3OH), in fecal samples of Asian elephants. We validated the assay, which had a sensitivity of 0.25 µM at 90% binding and an EC50 value of 1.37 µM. Using female elephants kept under semi-captive conditions in the forest camps of Mudumalai Wildlife Sanctuary, Tamil Nadu and Bandipur National Park, Karnataka, India, we measured fecal progesterone-metabolite (5α-P-3OH) concentrations in six animals and showed their clear correlation with serum progesterone measured by a standard radioimmunoassay. Statistical analyses using a linear mixed-effects model showed a positive correlation (P < 0.1) between the profiles of fecal 5α-P-3OH (range: 0.5-10 µg/g) and serum progesterone (range: 0.1-1.8 ng/mL). Therefore, our studies show, for the first time, that the fecal progesterone-metabolite assay could be exploited to predict estrus cyclicity and to potentially assess the reproductive status of captive and free-ranging female Asian elephants, thereby helping to plan their breeding strategy. (C) 2010 Elsevier Inc. All rights reserved.
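Competitive ELISA standard curves of this kind are conventionally fitted with a four-parameter logistic model to recover EC50; a hedged sketch with hypothetical standard-curve data (the 1.37 µM above is the reported value, not a property of this toy data):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    """Four-parameter logistic commonly used for competitive ELISA curves."""
    return bottom + (top - bottom) / (1.0 + (x / ec50) ** hill)

# Hypothetical standard curve: % binding (B/B0) vs analyte concentration (uM)
conc = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0, 5.0, 10.0])
binding = np.array([98, 95, 90, 78, 58, 40, 20, 10])

params, _ = curve_fit(four_pl, conc, binding, p0=[5, 100, 1.0, 1.0])
print(f"fitted EC50 = {params[2]:.2f} uM")
```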
Abstract:
Modeling and forecasting implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging and risk management activities, all of which require accurate volatility estimates. It has, however, become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options display two patterns, the volatility smirk (skew) and the volatility term structure, which, examined together, form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS. It extends the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variation in the rich IVS. Three factors are found to explain about 69-88% of the variance in the IVS: on average, 56% is explained by the level factor, 15% by the term-structure factor, and a further 7% by the jump-fear factor. The second essay proposes a quantile regression model of the contemporaneous asymmetric return-volatility relationship, generalising the Hibbert et al. (2008) model. The results show a strongly negative asymmetric return-volatility relationship at various quantiles of the IV distributions, monotonically increasing from the median quantile to the uppermost quantile (i.e., 95%); OLS therefore underestimates this relationship at upper quantiles. Additionally, the asymmetric relationship is more pronounced with the smirk-(skew-)adjusted volatility index measure than with the old volatility index measure. The volatility indices are ranked in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX and VXN. The third essay examines the information content of the new-VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are backtested over 1992-2009 using unconditional, independence, conditional coverage and quadratic-score tests. The VDAX is found to subsume almost all information required for daily VaR forecasts for a portfolio on the DAX30 index, and implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. Three factors are found to explain 94-97% of the variation in each of the EUR, USD and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, string market model calibration results show that the model can efficiently reproduce (or forecast) the volatility surface for each of the swaption markets.
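A hedged sketch of the kind of factor extraction behind the variance decompositions reported above: PCA on a panel of implied volatilities. The panel and factor structure below are synthetic, purely for illustration of the method:

```python
import numpy as np

rng = np.random.default_rng(42)
T, K = 500, 12                          # days x IV grid points (moneyness/maturity)
level = rng.normal(0, 1, (T, 1))        # synthetic level factor
term = rng.normal(0, 0.5, (T, 1))       # synthetic term-structure factor
grid = np.linspace(-1, 1, K)
iv_panel = 0.2 + 0.05 * level + 0.02 * term * grid + rng.normal(0, 0.002, (T, K))

X = iv_panel - iv_panel.mean(axis=0)            # de-mean each grid point
eigval, eigvec = np.linalg.eigh(X.T @ X / T)    # PCA via covariance eigendecomposition
explained = eigval[::-1] / eigval.sum()         # descending variance shares
print("variance explained by first 3 factors:", explained[:3].round(3))
```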
Abstract:
Financial time series tend not to behave as draws from a normal distribution. Asymmetries and nonlinearities are usually present, and these characteristics need to be taken into account, which makes forecasting future return and risk rather complicated. The existing models for predicting risk help to a certain degree, but the complexity of financial time series data makes the task difficult. The essays in this dissertation support the introduction of nonlinearities and asymmetries for the purpose of better models and forecasts of both mean and variance, and both linear and nonlinear models are consequently introduced. The advantage of nonlinear models is that they can accommodate asymmetries. Asymmetric patterns usually mean that large negative returns appear more often than positive returns of the same magnitude; this goes hand in hand with the fact that negative returns are associated with higher risk than positive returns of the same magnitude. These models are important because of their ability to produce the best possible estimates and predictions of future returns and risk.
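As a hedged illustration of an asymmetric (nonlinear) variance model of the kind the dissertation advocates, a GJR-GARCH fit using the third-party `arch` package; the package choice and simulated returns are assumptions for illustration, not the dissertation's own code or data:

```python
import numpy as np
from arch import arch_model

# Simulated fat-tailed daily returns stand in for real data; o=1 adds the GJR
# threshold term so negative shocks raise conditional variance more than
# positive shocks of the same magnitude (the asymmetry discussed above).
rng = np.random.default_rng(7)
returns = rng.standard_t(df=6, size=2000) * 0.8

model = arch_model(returns, vol="GARCH", p=1, o=1, q=1, dist="skewt")
res = model.fit(disp="off")
print(res.params[["alpha[1]", "gamma[1]", "beta[1]"]])  # gamma = asymmetry term
```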
Abstract:
This paper uses the Value-at-Risk approach to define the risk in both long and short trading positions. The investigation covers some major market indices (Japanese, UK, German and US). The performance of models that take skewness and fat tails into account is compared with that of symmetric models, with respect to both the specific model for estimating the variance and the distribution of the variance estimate used as input in the VaR estimation. The results indicate that more flexible models do not necessarily perform better in predicting the VaR forecast; the most probable reason is the complexity of these models. A general result is that different methods for estimating the variance are needed at different confidence levels of the VaR and for the different indices. Also, different models are to be used for the left and the right tail of the distribution, respectively.
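A minimal sketch of the long/short-position VaR idea: the long position's risk lives in the left tail, the short position's in the right tail, and a fat-tailed Student-t distribution can replace the normal. The data and figures below are synthetic, not the paper's results:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
returns = rng.standard_t(df=5, size=2500) * 0.01    # synthetic daily returns

alpha = 0.01
mu, sigma = returns.mean(), returns.std(ddof=1)
df_fit, loc, scale = stats.t.fit(returns)           # fat-tailed alternative

var_long_normal = -stats.norm.ppf(alpha, loc=mu, scale=sigma)   # left tail
var_long_t = -stats.t.ppf(alpha, df_fit, loc=loc, scale=scale)  # left tail
var_short_t = stats.t.ppf(1 - alpha, df_fit, loc=loc, scale=scale)  # right tail

print(f"1% VaR long (normal) {var_long_normal:.4f}, "
      f"long (t) {var_long_t:.4f}, short (t) {var_short_t:.4f}")
```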
Abstract:
The objective of this paper is to investigate pricing accuracy under stochastic volatility where the volatility follows a square-root process. The theoretical prices are compared with market price data (from the German DAX index options market) using two different techniques of parameter estimation: the method of moments and implicit estimation by inversion. Standard Black & Scholes pricing is used as a benchmark. The results indicate that the stochastic volatility model with parameters estimated by inversion, using the available prices on the preceding day, is the most accurate pricing method of the three in this study and can be considered satisfactory. However, as the same model with parameters estimated using a rolling window (the method of moments) proved inferior to the benchmark, the importance of stable and correct parameter estimation is evident.
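For reference, a minimal sketch of the Black & Scholes benchmark price for a European call; the inputs are illustrative, not the DAX option data used in the paper:

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black & Scholes (1973) European call price."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

print(f"call price: {bs_call(S=100, K=105, T=0.5, r=0.03, sigma=0.2):.4f}")
```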
Abstract:
Activity systems are the cognitively linked groups of activities that consumers carry out as part of their daily life. The aim of this paper is to investigate how consumers experience value through their activities, and how services fit into the context of activity systems. A new technique for illustrating consumers' activity systems is introduced. The technique consists of identifying a consumer's activities through an interview, then quantitatively measuring how the consumer evaluates the identified activities on three dimensions: experienced benefits, sacrifices and frequency. This information is used to create a graphical representation of the consumer's activity system, an "activityscape map". Activity systems work as infrastructure for the individual consumer's value experience. The paper contributes to the value and service literature, in which there are currently no clearly described standardized techniques for visually mapping individual consumer activity. Existing approaches are service- or relationship-focused and are mostly used to identify activities, not to understand them. The activityscape representation provides an overview of consumers' perceptions of their activity patterns and of the position of one or several services within those patterns. Comparing different consumers' activityscapes shows the differences between consumers' activity structures and provides insight into how services are used to create value within them. The paper is conceptual; an empirical illustration indicates the potential of further empirical studies. The technique can be used by businesses to understand the contexts of service use, which may uncover potential for business reconfiguration and customer segmentation.
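A hedged sketch of what an activityscape map along the three stated dimensions could look like in code; the activities, rating scales and chart design are hypothetical, as the paper describes the dimensions rather than a specific implementation:

```python
import matplotlib.pyplot as plt

# Hypothetical interview output: (activity, benefit 1-7, sacrifice 1-7, times/week)
activities = [
    ("commuting", 2, 5, 10),
    ("cooking", 5, 3, 7),
    ("gym", 6, 4, 3),
    ("streaming", 4, 1, 6),
]

names, benefit, sacrifice, freq = zip(*activities)
plt.scatter(sacrifice, benefit, s=[f * 40 for f in freq], alpha=0.5)  # size = frequency
for name, x, y in zip(names, sacrifice, benefit):
    plt.annotate(name, (x, y))
plt.xlabel("experienced sacrifice")
plt.ylabel("experienced benefit")
plt.title("Activityscape map (illustrative)")
plt.show()
```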
Abstract:
A diffusion/replacement model for new consumer durables, designed to be used as a long-term forecasting tool, is developed. The model, called DEMSIM, simulates new demand as well as replacement demand over time. It is built upon a counteractive adoption model specifying the basic forces affecting the adoption behaviour of individual consumers: promoting forces and resisting forces, with the promoting forces further divided into internal and external influences. These influences are operationalized within a multi-segmental diffusion model generating the adoption behaviour of the consumers in each segment as an expected value. This diffusion model is combined with a replacement model built upon the same segmental structure, which in turn generates the expected replacement behaviour in each segment. To use DEMSIM as a forecasting tool in the early stages of a diffusion process, estimates of the model parameters are needed as soon as possible after product launch. However, traditional statistical techniques are not very helpful for estimating such parameters early in a diffusion process. To enable early parameter calibration, an optimization algorithm is developed by which the main parameters of the diffusion model can be estimated on the basis of very few sales observations, with the optimization carried out in iterative simulation runs. Empirical validations using the optimization algorithm reveal that the diffusion model performs well in early long-term sales forecasts, especially with regard to the timing of future sales peaks.
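DEMSIM itself is a multi-segmental simulation model; as a hedged single-segment illustration of the early-calibration idea, the sketch below fits the external (p) and internal (q) influence parameters of a Bass-type diffusion curve to a handful of post-launch observations by least squares. All numbers are synthetic:

```python
import numpy as np
from scipy.optimize import least_squares

def bass_adopters(params, t, market_size):
    """Per-period adopters from the Bass cumulative adoption curve."""
    p, q = params                      # external (p) and internal (q) influence
    F = (1 - np.exp(-(p + q) * t)) / (1 + (q / p) * np.exp(-(p + q) * t))
    return market_size * np.diff(F, prepend=0.0)

t = np.arange(1, 7)                    # only six early sales observations
observed = np.array([120, 180, 260, 350, 470, 600])   # synthetic sales

fit = least_squares(
    lambda theta: bass_adopters(theta, t, market_size=50_000) - observed,
    x0=[0.01, 0.3], bounds=([1e-4, 1e-3], [0.5, 1.0]))
print("estimated p, q:", fit.x.round(4))
```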