894 results for Liquidity premium
Abstract:
The paper investigates whether Big-Four affiliated (B4A) firms earn audit premiums in an emerging economy context, using Bangladesh as a case. The joint determination of audit and non-audit service fees is also examined, using a sample of 122 companies listed on the Dhaka Stock Exchange. Our findings reveal that although the B4A firms do not generally earn a fee premium in Bangladesh, they charge higher audit fees to clients that do not purchase non-audit services. This suggests that the B4A firms may lower audit fees to attract non-audit services, cross-subsidizing audit fees through non-audit service fees. The lack of a B4A premium implies a lack of audit quality in emerging markets. We also document that audit and non-audit service fees are jointly determined in Bangladesh. Thus, we provide evidence of the joint determination of audit and non-audit service fees in an emerging economy context.
Abstract:
Purpose – There is limited evidence on how differences in economic environments affect the demand for and supply of auditing. Research on audit pricing has mainly focused on large client markets in developed economies; in contrast, the purpose of this paper is to focus on the small client segment in the emerging economy of Thailand, which offers a choice between auditors of two different qualities. Design/methodology/approach – This paper is based on a random stratified sample of small clients in Thailand qualifying for audit exemption. The final sample consists of 1,950 firm-year observations for 2002-2006. Findings – The authors find evidence of product differentiation in the small client market, suggesting that small firms view certified public accountants as superior and pay a premium for their services. The authors also find that audit fees have a significant positive association with leverage, metropolitan location and client size. Audit risk and audit opinion are not, however, significantly associated with audit fees. Furthermore, the authors find no evidence that clients whose financial year ends in the auditors' busy period pay significantly higher audit fees, or that auditors engage in low-balling on initial engagements to attract audit clients. Research limitations/implications – The research shows the importance of exploring actual decisions regarding audit practice and audit pricing in different institutional and organizational settings. Originality/value – The paper extends the literature from the developed-economy, large/listed client setting to the emerging-economy, small client setting. As far as the authors are aware, this is the first paper to examine audit pricing in the small client market in an emerging economy.
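Audit-pricing studies of this kind typically estimate a log-linear fee model. As a minimal sketch, with entirely synthetic data and illustrative coefficients (not the paper's sample or estimates), an OLS fit of log audit fees on client size, leverage and a metropolitan-location dummy might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical small-client firm-years (not the paper's sample)

log_assets = rng.normal(14.0, 1.5, n)        # client size: log total assets
leverage = rng.uniform(0.1, 0.9, n)          # debt-to-assets ratio
metro = rng.integers(0, 2, n).astype(float)  # 1 = metropolitan location

# Assumed data-generating process mirroring the reported signs:
# fees rise with size, leverage and metropolitan location.
log_fee = (2.0 + 0.45 * log_assets + 0.8 * leverage + 0.3 * metro
           + rng.normal(0.0, 0.4, n))

# OLS via least squares: log_fee = b0 + b1*size + b2*leverage + b3*metro
X = np.column_stack([np.ones(n), log_assets, leverage, metro])
beta, *_ = np.linalg.lstsq(X, log_fee, rcond=None)
print(dict(zip(["const", "log_assets", "leverage", "metro"], beta.round(2))))
```

On data generated with positive true coefficients, the fitted slopes on size, leverage and location come back positive, which is the pattern of association the paper reports for its actual sample.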
Abstract:
Digital innovation is transforming the media and entertainment industries. The professionalization of YouTube's platform is paradigmatic of that change. The 100 original channel initiative launched in late 2011 was designed to transform YouTube's brand through production of a high volume of quality premium video content that would more deeply engage its audience base and, in the process, attract big advertisers. An unanticipated by-product has been the rapid growth of a wave of aspiring next-generation digital media companies from within the YouTube ecosystem. Fuelled by early venture capital, some have ambitious goals to become global media corporations in the online video space. A number of larger MCNs (Multi-Channel Networks) - BigFrame, Machinima, Fullscreen, AwesomenessTV, Maker Studios, Revision3 and DanceOn - have attracted interest from media incumbents like Warner Brothers, DreamWorks, Discovery, Bertelsmann, Comcast and AMC, and two larger MCNs, Alloy and Break Media, have merged. This indicates that a shakeout is underway in these new online supply chains after rapid initial growth. The higher-profile MCNs seek to rapidly develop scale economies in online distribution and facilitate audience growth for their member channels, helping channels optimize monetization, develop sustainable business models and foster producer collaboration within a growing online community of like-minded content creators. Some MCNs already attract far larger online audiences than any national TV network. The speed with which these developments have occurred is reminiscent of the 1910s, when Hollywood studios first emerged and within only a few years replaced the incumbent film studios as the dominant force within the film industry.
Abstract:
This paper investigates quality of service (QoS) and resource productivity implications of transit route passenger loading and travel time. It highlights the value of occupancy load factor as a direct passenger comfort QoS measure. Automatic Fare Collection (AFC) data for a premium radial bus route in Brisbane, Australia, is used to investigate time series correlation between occupancy load factor and passenger average travel time. Correlation is strong across the entire span of service in both directions. Passengers tend to be making longer, peak-direction commuter trips under significantly less comfortable conditions than off-peak. The Transit Capacity and Quality of Service Manual uses segment-based load factor as a measure of onboard loading comfort QoS. This paper provides additional insight into QoS by relating the two route-based dimensions of occupancy load factor and passenger average travel time together in a two-dimensional format, from both the passenger's and the operator's perspectives. Future research will apply Value of Time to QoS measurement, reflecting perceived passenger comfort through crowding and average time spent onboard. This would also assist in transit service quality econometric modeling. The methodology can be readily applied in a practical setting where AFC data for fixed-schedule routes is available. The study outcomes also provide valuable research and development directions.
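The core calculation here, correlating hourly occupancy load factor with average on-board travel time, is simple to sketch. The hourly profile below is entirely synthetic (the service span, seat count and peak shape are assumptions, not the Brisbane data):

```python
import numpy as np

rng = np.random.default_rng(1)
hours = np.arange(5, 24)  # assumed span of service, 05:00-23:00

# Synthetic hourly occupancy load factor (passengers on board / seats)
# with AM and PM commuter peaks -- illustrative values only
load_factor = (0.3
               + 0.6 * np.exp(-0.5 * ((hours - 8) / 1.5) ** 2)
               + 0.5 * np.exp(-0.5 * ((hours - 17) / 1.5) ** 2))

# Assume peak-direction commuters travel further: average on-board travel
# time (minutes) rises with loading, plus noise
avg_travel_time = 12 + 18 * load_factor + rng.normal(0.0, 1.0, hours.size)

r = np.corrcoef(load_factor, avg_travel_time)[0, 1]
print(f"Pearson r (load factor vs avg travel time): {r:.2f}")
```

A strongly positive r on real AFC data is what supports the paper's observation that the most crowded hours are also the hours when passengers spend longest on board.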
Abstract:
This paper investigates stochastic analysis of transit segment hourly passenger load factor variation for transit capacity and quality of service (QoS) analysis, using Automatic Fare Collection data for a premium radial bus route in Brisbane, Australia. It compares stochastic analysis to traditional peak hour factor (PHF) analysis to gain further insight into the variability of transit route segments' passenger loading during a study hour. It demonstrates that hourly design load factor is a useful method of modeling a route segment's capacity and QoS time history across the study weekday. This analysis method is readily adaptable to different passenger load standards by adjusting the design percentile, reflecting either a more relaxed or a more stringent condition. This paper also considers hourly coefficient of variation of load factor as a capacity and QoS assessment measure, in particular through its relationships with hourly average and design load factors. A smaller value reflects uniform passenger loading, which is generally indicative of well dispersed passenger boarding demands and good schedule maintenance. Conversely, a higher value may be indicative of pulsed or uneven passenger boarding demands, poor schedule maintenance, and/or bus bunching. An assessment table based on hourly coefficient of variation of load factor is developed and applied to this case study. Inferences are drawn for a selection of study hours across the weekday studied.
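The two hourly measures described, a design (percentile) load factor and the coefficient of variation of load factor, reduce to simple statistics over per-trip loads. The sketch below uses entirely synthetic per-trip load factors; the 88th design percentile and the uniform-vs-bunched contrast are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic per-trip load factors (passengers on board / seats) observed on
# one route segment in one study hour -- values are illustrative only
loads_uniform = rng.normal(0.80, 0.05, 12)                  # even loading
loads_bunched = np.concatenate([rng.normal(1.20, 0.10, 4),  # bus bunching:
                                rng.normal(0.40, 0.10, 8)]) # uneven loading

def hourly_measures(loads, design_pct=88):
    """Hourly average, design-percentile and CV of load factor."""
    avg = loads.mean()
    design = np.percentile(loads, design_pct)  # hourly design load factor
    cv = loads.std(ddof=1) / avg               # hourly coefficient of variation
    return avg, design, cv

for name, loads in [("uniform", loads_uniform), ("bunched", loads_bunched)]:
    avg, design, cv = hourly_measures(loads)
    print(f"{name}: avg={avg:.2f}  design(P88)={design:.2f}  cv={cv:.2f}")
```

The bunched hour produces a much larger CV than the uniform hour for a similar amount of demand, which is the diagnostic the assessment table exploits; raising or lowering `design_pct` tightens or relaxes the load standard.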
Abstract:
This study uses weekday Automatic Fare Collection (AFC) data on a premium bus line in Brisbane, Australia.
• Stochastic analysis is compared to peak hour factor (PHF) analysis for insight into passenger loading variability
• Hourly design load factor (e.g. 88th percentile) is found to be a useful method of modeling a segment's passenger demand time-history across a study weekday, for capacity and QoS assessment
• Hourly coefficient of variation of load factor is found to be a useful QoS and operational assessment measure, particularly through its relationship with hourly average load factor and with design load factor
• An assessment table based on hourly coefficient of variation of load factor is developed from the case study
Abstract:
We explore the impact of delisting on the performance of the momentum trading strategy in Australia. We employ a new dataset of hand-collected delisting returns for all Australian stocks and provide the first study outside the U.S. to jointly examine the effects of delisting and missing returns on the magnitude of momentum profits. In the sample of all stocks, we find that the profitability of momentum strategies depends crucially on the returns of delisted stocks, especially on bankrupt firms. In the sample of large stocks, however, the momentum effect remains strong after controlling for the effect of delisted stocks, in contrast to the U.S. evidence, in which delisting returns can explain 40% of momentum profits. As these large stocks are less exposed to liquidity risks, the momentum effect in Australia is even more puzzling than in the U.S.
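How the treatment of delisting returns moves measured momentum profits can be illustrated with a toy backtest. Everything below is synthetic; the -55% substitute is a convention used in some U.S. studies for bankruptcy delistings, whereas the paper hand-collects actual Australian delisting returns:

```python
import numpy as np

rng = np.random.default_rng(3)
n_stocks = 50

# Synthetic monthly returns: 12 formation months + 1 holding month
rets = rng.normal(0.01, 0.08, (n_stocks, 13))
formation = rets[:, :12].sum(axis=1)  # past 12-month performance

# Assume distressed past losers are the stocks that delist, so their
# holding-month return is missing from the database
delisted = formation < np.quantile(formation, 0.08)
hold = rets[:, 12].copy()
hold[delisted] = np.nan

winners = formation >= np.quantile(formation, 0.9)  # top decile
losers = formation <= np.quantile(formation, 0.1)   # bottom decile

# (a) drop missing returns vs (b) substitute an assumed -55% delisting return
mom_drop = np.nanmean(hold[winners]) - np.nanmean(hold[losers])
hold_sub = np.where(np.isnan(hold), -0.55, hold)
mom_sub = hold_sub[winners].mean() - hold_sub[losers].mean()
print(f"momentum profit, delisted dropped:  {mom_drop:.3f}")
print(f"momentum profit, -55% substituted:  {mom_sub:.3f}")
```

Because delistings cluster in the loser portfolio, dropping them versus imputing a severe delisting return produces very different momentum profits from identical data, which is why hand-collected delisting returns matter for the paper's estimates.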
Abstract:
In life cycle assessment studies, greenhouse gas (GHG) emissions from direct land-use change have been estimated to make a significant contribution to the global warming potential of agricultural products. However, these estimates have high uncertainty due to the complexity of data requirements and the difficulty of attributing land-use change. This paper presents estimates of GHG emissions from direct land-use change from native woodland to grazing land for two beef production regions in eastern Australia, which were the subject of a multi-impact life cycle assessment study for premium beef production. Spatially and temporally consistent datasets were derived for areas of forest cover and biomass carbon stocks using published remotely sensed tree-cover data and regionally applicable allometric equations consistent with Australia's national GHG inventory report. Standard life cycle assessment methodology was used to estimate GHG emissions and removals from direct land-use change attributed to beef production. For the northern-central New South Wales region of Australia, estimates ranged from a net emission of 0.03 t CO2-e/ha/year to a net removal of 0.12 t CO2-e/ha/year under low and high scenarios, respectively, for sequestration in regrowing forests. For the same period (1990-2010), the study region in southern-central Queensland was estimated to have net emissions from land-use change in the range of 0.45-0.25 t CO2-e/ha/year. The difference between regions reflects the continuation of higher rates of deforestation in Queensland until strict regulation in 2006, whereas native vegetation protection laws were introduced earlier in New South Wales. On the basis of liveweight produced at the farm-gate, emissions from direct land-use change for 1990-2010 were comparable in magnitude to those from other on-farm sources, which were dominated by enteric methane. However, calculating land-use change impacts for the Queensland region for a period starting in 2006 gave a range from net emissions of 0.11 t CO2-e/ha/year to net removals of 0.07 t CO2-e/ha/year. This study demonstrated a method for deriving spatially and temporally consistent datasets to improve estimates of direct land-use change impacts in life cycle assessment. It identified areas of uncertainty, including rates of sequestration in woody regrowth and impacts of land-use change on soil carbon stocks in grazed woodlands, but also showed the potential for direct land-use change to represent a net GHG sink.
Abstract:
Cost estimating has been acknowledged as a crucial component of construction projects. Depending on available information and project requirements, cost estimates evolve in tandem with the project lifecycle stages: conceptualisation, design development, execution and facility management. The premium placed on the accuracy of cost estimates is crucial to producing project tenders and, eventually, to budget management. Notwithstanding the initial slow pace of its adoption, Building Information Modelling (BIM) has successfully addressed a number of challenges previously characteristic of traditional approaches in the AEC industry, including poor communication, the prevalence of islands of information and frequent rework. It is therefore conceivable that BIM can be leveraged to address specific shortcomings of cost estimation. The impetus for leveraging BIM models for accurate cost estimation is to align budgeted and actual cost. This paper hypothesises that the accuracy of BIM-based estimation, which mirrors traditional cost estimation methods in a more efficient process, can be enhanced by simulating traditional cost estimation factor variables. Through literature reviews and preliminary expert interviews, this paper explores the factors that could potentially lead to more accurate cost estimates for construction projects. The findings show numerous factors that affect cost estimates, ranging from project information and its characteristics, the project team, clients and contractual matters to other external influences. This paper makes a particular contribution to the early phase of BIM-based project estimation.
Abstract:
This article describes a maximum likelihood method for estimating the parameters of the standard square-root stochastic volatility model and a variant of the model that includes jumps in equity prices. The model is fitted to data on the S&P 500 Index and the prices of vanilla options written on the index, for the period 1990 to 2011. The method is able to estimate both the parameters of the physical measure (associated with the index) and the parameters of the risk-neutral measure (associated with the options), including the volatility and jump risk premia. The estimation is implemented using a particle filter whose efficacy is demonstrated under simulation. The computational load of this estimation method, which previously has been prohibitive, is managed by the effective use of parallel computing using graphics processing units (GPUs). The empirical results indicate that the parameters of the models are reliably estimated and consistent with values reported in previous work. In particular, both the volatility risk premium and the jump risk premium are found to be significant.
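A bootstrap particle filter for the latent variance of a square-root stochastic-volatility model can be sketched in a few lines. The sketch omits the jump component and uses illustrative parameter values rather than the paper's estimates; evaluating this log-likelihood at many candidate parameter vectors is the load that makes GPU parallelism attractive:

```python
import numpy as np

rng = np.random.default_rng(4)

# Euler-discretised square-root (CIR) stochastic-volatility model;
# parameters are illustrative, not estimates from the paper
mu, kappa, theta, sigma_v, dt = 0.05, 3.0, 0.04, 0.4, 1 / 252

def simulate(T):
    """Simulate T daily returns from the model."""
    r = np.empty(T)
    v = theta
    for t in range(T):
        r[t] = mu * dt + np.sqrt(v * dt) * rng.normal()
        v = max(1e-8, v + kappa * (theta - v) * dt
                + sigma_v * np.sqrt(v * dt) * rng.normal())
    return r

def pf_loglik(returns, n_particles=2000):
    """Bootstrap particle filter log-likelihood over the latent variance."""
    v = np.full(n_particles, theta)
    loglik = 0.0
    for r_t in returns:
        # propagate each particle through the variance dynamics
        v = np.maximum(1e-8, v + kappa * (theta - v) * dt
                       + sigma_v * np.sqrt(v * dt) * rng.normal(size=n_particles))
        # weight by the Gaussian return density given each particle's variance
        w = np.exp(-0.5 * (r_t - mu * dt) ** 2 / (v * dt)) \
            / np.sqrt(2 * np.pi * v * dt)
        loglik += np.log(w.mean() + 1e-300)
        v = rng.choice(v, size=n_particles, p=w / w.sum())  # resample
    return loglik

ll = pf_loglik(simulate(252))
print(f"particle-filter log-likelihood: {ll:.1f}")
```

The propagate-weight-resample loop is embarrassingly parallel across particles, which is what the paper exploits on GPUs; maximum likelihood estimation then wraps an optimizer around `pf_loglik` over the model parameters.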
Abstract:
Ginger autotetraploids were produced by immersing shoot tips in a 0.5% w/v colchicine, 2% v/v dimethyl sulfoxide solution for 2 h. Stomatal measurements were used as an early indicator of ploidy differences in culture, with the mean stomatal length of tetraploids (49.2 μm) being significantly larger than that of the diploid (38.8 μm). Of the 500 shoot tips treated, 2% were characterised as stable autotetraploid lines following field evaluation over several seasons. Results were confirmed with flow cytometry and, of the 7 lines evaluated for distinctness and uniformity, 6 were solid tetraploid mutants and 1 was a periclinal chimera. Significant differences were noted between individual tetraploid lines in terms of shoot length, leaf length, leaf width, size of rhizome sections (knob weight) and fibre content. The solid autotetraploid lines had significantly wider, greener leaves than the diploids; they had significantly fewer but thicker shoots and, although 'Queensland' (the diploid parent from which the tetraploids were derived) had a greater total rhizome mass at harvest, its knob size was significantly smaller. From the autotetraploid lines, one line was selected for commercial release as 'Buderim Gold'. It compared most favourably with 'Queensland' in terms of the aroma/flavour profile and fibre content at early harvest, and had consistently good rhizome yield. More importantly, it produced large rhizome sections, resulting in a higher recovery of premium grade confectionery ginger and a more attractive fresh market product.
Abstract:
A strategy comprising a winter/spring protein supplement, rumen modifier and hormonal growth promotant (Compudose 400) was used in either the first year (T1), second year (T2), or both years (T1+2) following weaning in Brahman cross steers as a means of increasing liveweight gain up to 2.5 years of age. T2 produced the heaviest final liveweight (544.7 kg) and highest overall liveweight gain (366.7 kg), but these were not significantly different from T1 (538.6 kg; 360.9 kg) or T1+2 (528.7 kg; 349.3 kg). However, final liveweights and overall liveweight gains of T1 and T2, but not T1+2, were significantly greater than for untreated (C) steers (504.9 kg; 325.2 kg, both P < 0.05). Regardless of the strategy imposed, liveweight and liveweight gain were enhanced; however, final liveweights in each treatment were below the preferred minimum target liveweight (570-580 kg) for premium export markets. Treatment in both years gave no benefit over treatment in 1 year only. 19th Biennial Conference. 5-9 July 1992. La Trobe University, Melbourne.
Abstract:
Maize is a highly important crop in many countries around the world, both through the sale of the crop to domestic processors for the production of maize products and as a staple food on subsistence farms in developing countries. In many countries, there have been long-term research efforts to develop a suitable hardness method that could assist the maize industry in improving processing efficiency as well as possibly providing a quality specification for maize growers, which could attract a premium. This paper focuses specifically on hardness and reviews a number of methodologies used internationally, as well as important biochemical aspects of maize that contribute to maize hardness. Numerous foods are produced from maize, and hardness has been described as having an impact on food quality. However, the basis of hardness and its measurement are very general and would apply to any use of maize from any country. From the published literature, it would appear that one of the simpler methods used to measure hardness is a grinding step followed by a sieving step, using multiple sieve sizes. This would allow the range in hardness within a sample, as well as average particle size and/or coarse/fine ratio, to be calculated. Any of these parameters could easily be used as reference values for the development of near-infrared (NIR) spectroscopy calibrations. The development of precise NIR calibrations will provide an excellent tool for breeders, handlers, and processors to deliver specific cultivars in the case of growers and bulk loads in the case of handlers, thereby ensuring the most efficient use of maize by domestic and international processors. This paper also considers previous research describing the biochemical aspects of maize that have been related to maize hardness. Both starch and protein affect hardness, with most research focusing on the storage proteins (zeins). Both the content and composition of the zein fractions affect hardness. Genotypes and growing environment influence the final protein and starch content and, to a lesser extent, composition. However, hardness is a highly heritable trait and, hence, once a desirable level of hardness is agreed upon, breeders will quickly be able to produce material with the hardness levels required by the industry.
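The grind-and-sieve method described above reduces to simple arithmetic on the mass retained per sieve. The sketch below uses made-up sieve apertures and masses (not a published standard), taking each sieve's aperture as the representative particle size of the fraction it retains:

```python
# Hypothetical sieve stack: aperture (um) of each sieve, and the mass of
# ground maize (g) retained on each level -- illustrative values only
sieve_um = [2000, 1000, 500, 250, 0]  # 0 = collection pan
mass_g = [5.0, 30.0, 40.0, 15.0, 10.0]

total = sum(mass_g)

# Mass-weighted average particle size, using each sieve's aperture as the
# representative size of the material retained on it (a common simplification)
avg_particle_size = sum(a * m for a, m in zip(sieve_um, mass_g)) / total

# Coarse/fine ratio: mass retained at or above a chosen cut (500 um here)
coarse = sum(m for a, m in zip(sieve_um, mass_g) if a >= 500)
fine = total - coarse
print(f"average particle size: {avg_particle_size:.0f} um")
print(f"coarse/fine ratio: {coarse / fine:.1f}")
```

Harder maize grinds coarser, so a larger average particle size or coarse/fine ratio indicates a harder sample; either value could then serve as the reference measurement for an NIR calibration.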