507 results for 150507 Pricing (incl. Consumer Value Estimation)
Abstract:
For the 2005 season, Mackay Sugar and its growers agreed to implement a new cane payment system. The aim of the new system was to better align the business drivers of the mill and its growers and, as a result, improve business decision making. The technical basis of the new cane payment system included a fixed sharing of the revenue from sugar cane between the mill and growers. Further, the new system replaced the CCS formula with a new estimate of recoverable sugar (PRS) and introduced near-infrared (NIR) analysis for payment purposes. Significant mill and grower consultation processes led to the agreement to implement the new system in 2005, and this consultative approach has been reflected in two seasons of successful operation.
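The mechanics of a fixed revenue-sharing payment can be sketched in a few lines. The abstract does not give Mackay Sugar's actual sharing ratio or the PRS formula, so the `grower_share` value and all numbers below are hypothetical, for illustration only.

```python
def split_revenue(sugar_price_per_tonne, tonnes_cane, prs_percent,
                  grower_share=0.62):
    """Split cane revenue between growers and mill under a fixed-share scheme.

    PRS (an estimate of recoverable sugar) converts cane tonnage into sugar
    tonnage; revenue is then divided in a fixed ratio. The 0.62 share is a
    hypothetical value, not Mackay Sugar's actual figure.
    """
    tonnes_sugar = tonnes_cane * prs_percent / 100.0
    revenue = tonnes_sugar * sugar_price_per_tonne
    grower = revenue * grower_share
    mill = revenue - grower
    return grower, mill

grower, mill = split_revenue(sugar_price_per_tonne=400.0,
                             tonnes_cane=1000.0, prs_percent=14.0)
print(round(grower, 2), round(mill, 2))  # → 34720.0 21280.0
```

Because both parties receive a fixed fraction of the same revenue pool, decisions that increase recoverable sugar benefit mill and growers alike, which is the alignment of business drivers the abstract describes.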
Abstract:
This thesis investigates profiling and differentiating customers through the use of statistical data mining techniques. The business application of our work centres on examining individuals’ seldom-studied yet critical consumption behaviour over an extensive time period within the context of the wireless telecommunication industry; consumption behaviour (as opposed to purchasing behaviour) is behaviour that has been performed so frequently that it has become habitual and involves minimal intention or decision making. Key variables investigated are the activity-initiation timestamp and cell tower location, as well as the activity type and usage quantity (e.g., a voice call with duration in seconds); the research focuses on customers’ spatial and temporal usage behaviour. The main methodological emphasis is on the development of clustering models based on Gaussian mixture models (GMMs), which are fitted using the recently developed variational Bayesian (VB) method. VB is an efficient deterministic alternative to the popular but computationally demanding Markov chain Monte Carlo (MCMC) methods. The standard VB-GMM algorithm is extended by allowing component splitting, so that it is robust to initial parameter choices and can automatically and efficiently determine the number of components. The new algorithm we propose allows more effective modelling of individuals’ highly heterogeneous and spiky spatial usage behaviour, or more generally human mobility patterns; the term spiky describes data patterns with large areas of low probability mixed with small areas of high probability. Customers are then characterised and segmented based on the fitted GMM, which corresponds to how each of them uses the products/services spatially in their daily lives; this essentially reflects their likely lifestyle and occupational traits.
Other significant research contributions include fitting GMMs using VB to circular data (i.e., the temporal usage behaviour) and developing clustering algorithms suitable for high-dimensional data based on the use of VB-GMM.
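A standard VB-GMM fit of the kind the thesis builds on can be sketched with scikit-learn's `BayesianGaussianMixture`, which is a conventional variational Bayesian GMM. It determines the effective number of components by pruning weights rather than by the splitting scheme the thesis proposes, so this is an analogue of the approach, not the thesis's algorithm. The synthetic "spiky" data below (tight home/work clusters plus diffuse background) is invented for illustration.

```python
# VB-GMM sketch: fit synthetic "spiky" spatial usage data and read off the
# effective number of components via the posterior mixture weights.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Two tight clusters (e.g. home and work cell areas) plus diffuse background.
X = np.vstack([
    rng.normal([0.0, 0.0], 0.05, size=(200, 2)),   # "home" spike
    rng.normal([5.0, 5.0], 0.05, size=(150, 2)),   # "work" spike
    rng.normal([2.5, 2.5], 2.00, size=(50, 2)),    # background movement
])

vb = BayesianGaussianMixture(n_components=10,
                             weight_concentration_prior=0.01,
                             max_iter=500, random_state=0).fit(X)

# Components with non-negligible posterior weight approximate the effective
# number of clusters; surplus components are driven towards zero weight.
effective = int((vb.weights_ > 0.05).sum())
print(effective)
```

The small `weight_concentration_prior` encourages the variational posterior to switch off unneeded components, which is the behaviour the thesis's split-based extension achieves from the opposite direction (growing components as needed).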
Abstract:
Travel time is an important network performance measure that quantifies congestion in a manner easily understood by all transport users. In urban networks, travel time estimation is challenging for a number of reasons, such as fluctuations in traffic flow due to traffic signals and significant flow to/from mid-link sinks/sources. The classical analytical procedure utilises cumulative plots at upstream and downstream locations to estimate travel time between the two locations. In this paper, we discuss the issues and challenges with the classical analytical procedure, such as its vulnerability to non-conservation of flow between the two locations, and the complexity of estimating exit-movement-specific travel time. Recently, we developed a methodology utilising the classical procedure to estimate average travel time and its statistics on urban links (Bhaskar, Chung et al. 2010), in which detector, signal and probe vehicle data are fused. In this paper we extend the methodology to route travel time estimation and test its performance using simulation. The originality lies in defining cumulative plots for each exit turning movement utilising a historical database that is self-updated after each estimation. The performance is also compared with a method based solely on probe data (probe-only). The performance of the proposed methodology is found to be insensitive to different route flows, with an average accuracy of more than 94% given one probe per estimation interval, an improvement of more than 5% in accuracy over the probe-only method.
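The classical cumulative-plot procedure the paper builds on can be sketched directly: the travel time of the N-th vehicle is the horizontal distance between the upstream and downstream cumulative count curves at height N, which is valid only when flow is conserved between the two detectors (the vulnerability the paper discusses). The detector counts below are invented for illustration.

```python
# Classical cumulative-plot travel time: horizontal distance between the
# upstream and downstream cumulative count curves at a given vehicle rank.
import bisect

def crossing_time(times, counts, n):
    """Linearly interpolate the time at which the cumulative count reaches n."""
    i = bisect.bisect_left(counts, n)
    if i == 0:
        return times[0]
    t0, t1 = times[i - 1], times[i]
    c0, c1 = counts[i - 1], counts[i]
    return t0 + (n - c0) * (t1 - t0) / (c1 - c0)

def travel_time(up_t, up_c, down_t, down_c, n):
    """Travel time of the n-th vehicle, assuming conservation of flow."""
    return crossing_time(down_t, down_c, n) - crossing_time(up_t, up_c, n)

# Toy data: the downstream detector sees the same counts 30 s later.
up_t,   up_c   = [0, 10, 20, 30, 40], [0, 5, 10, 15, 20]
down_t, down_c = [30, 40, 50, 60, 70], [0, 5, 10, 15, 20]
print(travel_time(up_t, up_c, down_t, down_c, 10))  # → 30.0
```

Mid-link sinks/sources break the assumption that vehicle N upstream is vehicle N downstream, which is why the paper's methodology fuses signal and probe data to repair the plots.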
Abstract:
Transmission smart grids will use a digital platform for the automation of high voltage substations. The IEC 61850 series of standards, released in parts over the last ten years, provides a specification for substation communications networks and systems. These standards, along with IEEE Std 1588-2008 Precision Time Protocol version 2 (PTPv2) for precision timing, are recommended by both the IEC Smart Grid Strategy Group and the NIST Framework and Roadmap for Smart Grid Interoperability Standards for substation automation. IEC 61850, PTPv2 and Ethernet are three complementary protocol families that together define the future of sampled value digital process connections for smart substation automation. A time synchronisation system is required for a sampled value process bus; however, the details are not defined in IEC 61850-9-2. PTPv2 provides the greatest accuracy of network-based time transfer systems, with timing errors of less than 100 ns achievable. The suitability of PTPv2 to synchronise sampling in a digital process bus is evaluated, with preliminary results indicating that steady state performance of low cost clocks is an acceptable ±300 ns, but that corrections issued by grandmaster clocks can introduce significant transients. Extremely stable grandmaster oscillators are required to ensure any corrections are sufficiently small that time synchronising performance is not degraded.
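The core of PTPv2 time transfer is the offset and mean path delay computation from the four message timestamps exchanged between master and slave; it assumes a symmetric network path, which is why asymmetry and grandmaster corrections matter for the transients the abstract reports. A minimal sketch (timestamps in nanoseconds, values invented):

```python
# PTPv2 (IEEE 1588-2008) delay request-response offset calculation:
#   t1: master sends Sync         t2: slave receives Sync
#   t3: slave sends Delay_Req     t4: master receives Delay_Req
def ptp_offset_and_delay(t1, t2, t3, t4):
    offset = ((t2 - t1) - (t4 - t3)) / 2           # slave clock minus master
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2  # assumes symmetric path
    return offset, mean_path_delay

# Example: slave runs 250 ns ahead of the master; one-way delay is 1000 ns.
offset, delay = ptp_offset_and_delay(t1=0, t2=1250, t3=5000, t4=5750)
print(offset, delay)  # → 250.0 1000.0
```

The slave steers its clock by the computed offset; any error in the grandmaster's own time (or a sudden correction to it) propagates straight into this calculation, producing the synchronisation transients under evaluation.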
Abstract:
This report provides an evaluation of the current available evidence-base for identification and surveillance of product-related injuries in children in Queensland. While the focal population was children in Queensland, the identification of information needs and data sources for product safety surveillance has applicability nationally for all age groups. The report firstly summarises the data needs of product safety regulators regarding product-related injury in children, describing the current sources of information informing product safety policy and practice, and documenting the priority product surveillance areas affecting children which have been a focus over recent years in Queensland. Health data sources in Queensland which have the potential to inform product safety surveillance initiatives were evaluated in terms of their ability to address the information needs of product safety regulators. Patterns in product-related injuries in children were analysed using routinely available health data to identify areas for future intervention, and the patterns in product-related injuries in children identified in health data were compared to those identified by product safety regulators. Recommendations were made for information system improvements and improved access to and utilisation of health data for more proactive approaches to product safety surveillance in the future.
Abstract:
In this paper, we seek to expand the use of direct methods in real-time applications by proposing a vision-based strategy for pose estimation of aerial vehicles. The vast majority of approaches use features to estimate motion. Conversely, the strategy we propose is based on a multi-resolution (MR) implementation of an image registration technique (Inverse Compositional Image Alignment, ICIA) using direct methods. An on-board camera in a downwards-looking configuration, and the assumption of planar scenes, are the bases of the algorithm. The motion between frames (rotation and translation) is recovered by decomposing the frame-to-frame homography obtained by applying the ICIA algorithm to a patch covering around 80% of the image. When visual estimation is required (e.g. during a GPS drop-out), this motion is integrated with the last known estimate of the vehicle’s state obtained from the on-board sensors (GPS/IMU), and subsequent estimates are based only on the vision-based motion estimation. The proposed strategy is tested with real flight data in representative stages of a flight: cruise, landing, and take-off, the latter two considered critical. The performance of the pose estimation strategy is analysed by comparing it with the GPS/IMU estimates. Results show correlation between the visual estimates obtained with the MR-ICIA and the GPS/IMU data, demonstrating that visual estimation can provide a good approximation of the vehicle’s state when required (e.g. during GPS drop-outs). In terms of performance, the proposed strategy is able to maintain an estimate of the vehicle’s state for more than one minute, at real-time frame rates, based only on visual information.
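The dead-reckoning step, where frame-to-frame motion increments are composed onto the last known GPS/IMU state, can be sketched in the plane. This is a simplified planar (x, y, yaw) version, not the paper's full 3-D pipeline: the increments below stand in for the rotation/translation recovered upstream from the homography decomposition.

```python
# Dead reckoning: compose body-frame motion increments onto a world-frame
# state (x, y, yaw), as done after a GPS drop-out.
import math

def integrate(state, d_yaw, dx_body, dy_body):
    """Advance (x, y, yaw) by one body-frame motion increment."""
    x, y, yaw = state
    # Rotate the body-frame translation into the world frame, then add yaw.
    xw = x + dx_body * math.cos(yaw) - dy_body * math.sin(yaw)
    yw = y + dx_body * math.sin(yaw) + dy_body * math.cos(yaw)
    return (xw, yw, yaw + d_yaw)

state = (0.0, 0.0, 0.0)            # last known GPS/IMU fix
for _ in range(4):                 # four vision-derived increments:
    state = integrate(state, math.pi / 2, 1.0, 0.0)   # 1 m forward, 90° turn
print([round(v, 6) for v in state])  # traces a 1 m square back to the origin
```

Because each increment is composed onto the previous estimate, per-frame registration errors accumulate over time, which is why the one-minute horizon reported in the paper is the meaningful performance figure.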
Abstract:
One of the impediments to large-scale use of wind generation within power systems is its variable and uncertain real-time availability. Due to the low marginal cost of wind power, its output changes the merit order of power markets and influences the Locational Marginal Price (LMP). With large-scale wind power, LMP calculation cannot ignore the essentially variable and uncertain nature of wind power. This paper proposes an algorithm to estimate LMP. The estimation result of a conventional Monte Carlo simulation is taken as the benchmark to examine accuracy. A case study is conducted on a simplified SE Australian power system, and the simulation results show the feasibility of the proposed method.
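The conventional Monte Carlo benchmark the paper compares against amounts to: sample wind output, dispatch a merit-order stack, record the marginal unit's cost, and average. The toy generator stack, demand level, and wind distribution below are invented for illustration and are far simpler than an actual LMP (no network constraints, so this is a single-node energy price).

```python
# Toy Monte Carlo estimate of the expected marginal price under uncertain
# wind: zero-cost wind shifts the merit order, changing which unit is marginal.
import random

# (capacity MW, marginal cost $/MWh) in merit order; wind bids at ~zero cost.
thermal_stack = [(300, 20.0), (300, 45.0), (200, 90.0)]
demand = 800.0

def marginal_price(wind_mw):
    """Dispatch the stack against residual demand; return the marginal cost."""
    residual = max(demand - wind_mw, 0.0)
    for cap, cost in thermal_stack:
        if residual <= cap:
            return cost
        residual -= cap
    return thermal_stack[-1][1]    # stack exhausted: last unit sets the price

random.seed(1)
samples = [marginal_price(random.uniform(0.0, 400.0)) for _ in range(20000)]
expected_lmp = sum(samples) / len(samples)
print(round(expected_lmp, 1))
```

With wind uniform on [0, 400] MW, roughly half the draws leave the 45 $/MWh unit marginal and half push dispatch into the 90 $/MWh peaker, so the expectation sits near 67.5 $/MWh; the paper's contribution is estimating such quantities without the full sampling cost.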
Abstract:
Research into complaints handling in the health care system has predominantly focused on examining the processes that underpin the organisational systems. An understanding of the cognitive decisions made by patients that influence whether they are satisfied or dissatisfied with the care they are receiving has had limited attention thus far. This study explored the lived experiences of Queensland acute care patients who complained about some aspect of their inpatient stay. A purposive sample of sixteen participants was recruited and interviewed about their experience of making a complaint. The qualitative data gathered through the interview process were subjected to an Interpretative Phenomenological Analysis (IPA) approach, guided by the philosophical influences of Heidegger (1889-1976). As part of the interpretive endeavour of this study, Lazarus’ cognitive emotive model with situational challenge was drawn on to provide a contextual understanding of the emotions experienced by the study participants. Analysis of the research data, aided by Leximancer™ software, revealed a series of relational themes that supported the interpretative data analysis process undertaken. The superordinate thematic statements that emerged from the narratives via the hermeneutic process were ineffective communication, standards of care were not consistent, being treated with disrespect, information on how to complain was not clear, and perceptions of negligence. This study’s goal was to provide health services with information about complaints handling that can help them develop service improvements. The study patients articulated the need for health care system reform; they want to be listened to, to be acknowledged, to be believed, for people to take ownership if they had made a mistake, for mistakes not to occur again, and to receive an apology.
For these initiatives to be fully realised, the paradigm shift must go beyond regurgitating complaints data metrics in percentages per patient contact, towards a concerted effort to evaluate what the qualitative complaints data is really saying. An opportunity to identify a more positive and proactive approach in encouraging our patients to complain when they are dissatisfied has the potential to influence improvements.
Abstract:
In this chapter I position the iPhone as a “moment” in the history of cultural technologies. Drawing predominantly on advertising materials and public conversations about other "moments" in the history of personal computing and focusing on Apple’s role in this history, I argue that the design philosophy, marketing, and business models behind the iPhone (and now the iPad) have decisively reframed the values of usability that underpin software and interface design in the consumer technology industry, marking a distinctive shift in the history and contested futures of digital culture.
Abstract:
Introduction Buildings, which account for approximately half of all annual energy and greenhouse gas emissions, are an important target area for any strategy addressing climate change. Whilst new commercial buildings increasingly address sustainability considerations, incorporating green technology in the refurbishment process of older buildings is technically, financially and socially challenging. This research explores the expectations and experiences of commercial office building tenants whose building was undergoing green refurbishment. Methodology Semi-structured in-depth interviews were conducted with seven residents and neighbours of a large case-study building undergoing green refurbishment in Melbourne, Australia. Built in 1979, the 7,008 m² ‘B’ grade building consists of 11 upper levels of office accommodation, ground floor retail, and a basement area leased as a licensed restaurant. After refurbishment, which included the installation of chilled water pumps, solar water heating, waterless urinals, insulation, disabled toilets, and automatic dimming lights, it was expected that the environmental performance of the building would move from a non-existent zero ABGR (Australian Building Greenhouse Rating) star rating to 3.5 stars, with a 40% reduction in water consumption and a 20% reduction in energy consumption. Interviews were transcribed, with responses analysed using a thematic approach, identifying categories, themes and patterns. Results Commercial property tenants are on a journey to sustainability - they are interested and willing to engage in discussions about sustainability initiatives, but the process, costs and benefits need to be clear.
Critically, whilst sustainability was an essential and non-negotiable criterion in building selection for government and larger corporate tenants, sustainability was not yet a core business value for smaller organisations – whilst they could see it as an emerging issue, they wanted detailed cost-benefit analyses, pay-back calculations of proposed technologies and, ideally, wished they could trial the technology first-hand in some way. Although extremely interested in learning more, most participants reported relatively minimal knowledge of specific sustainability features, designs or products. In discussions about different sustainable technologies (e.g., waterless urinals, green-rated carpets), participants frequently commented that they knew little about the technology, had not heard of it or were not sure exactly how it worked. Whilst participants viewed sustainable commercial buildings as the future, they had varied expectations about the fate of existing older buildings – most felt that they would have to be retrofitted at some point to meet market expectations and predicted the emergence of a ‘non-sustainability discount’ for residing in a building without sustainable features. Discussion This research offers a beginning point for understanding the difficulty of integrating green technology in older commercial buildings. Tenants currently have limited understandings of technology and potential building performance outcomes, which ultimately could impede the implementation of sustainable initiatives in older buildings. Whilst the commercial property market is interested in learning about sustainability in the built environment, the findings highlight the importance of developing a strong business case, communication and transition plan for implementing sustainability retrofits in existing commercial buildings.
Abstract:
This paper describes a lead project currently underway through Australia’s Sustainable Built Environment National Research Centre evaluating diffusion mechanisms and impacts of R&D investment in the Australian built environment. Through a retrospective analysis of R&D investment trends and industry outcomes, and a prospective assessment of industry futures using strategic foresighting, a future-focussed industry R&D roadmap and accompanying policy guidelines will be developed. This research aims to build new understandings and knowledge relevant to R&D funding strategies, research team formation and management, dissemination of outcomes and industry uptake. Each of these issues is critical due to: the disaggregated nature of the built environment industry; intense competition; limited R&D investment; and new challenges (e.g. IT, increased environmental expectations). This paper details the context within which this project is being undertaken and the research design. Findings of the retrospective analysis of past R&D investment in Australia will be presented at this conference.
Abstract:
Social networks have proven to be an attractive avenue of investigation for researchers, since humans are social creatures. A substantial body of literature has explored the term “social networks” from different perspectives and in diverse research fields. With the popularity of the Internet, social networking has taken on a new dimension, and online social communities have become an emerging social avenue for people to communicate in today’s information age. People use online social communities to share their interests, maintain friendships, and extend their so-called circle of “friends”. Social capital is likewise an important theory in sociology, and researchers usually draw on social capital theory when investigating topics relating to social networks. However, little of the literature offers explicit and strong conclusions in this area, owing to the complexity of social capital. This thesis therefore focuses on providing a better understanding of the relationship between social capital and online social communities. To this end, an online survey was conducted to examine the effects of the dimensions of social capital (relational capital, structural capital, and cognitive capital) in determining the intensity of use of online social communities. The data were derived from a total of 350 self-selected respondents who completed an online survey during the research period. The main results indicate that social capital exists in online social communities under normal circumstances. Finally, this thesis presents three contributions for both theory and practice in Chapter 5. First, the main results contribute to the understanding of connectivity in the interrelationships of individual social capital exchange within online social networks.
Second, social trust was found to have a weak effect in influencing the intensity with which individuals use online social communities. Third, the perpetual role of information sharing has an indirect influence on individual users’ participation in online social communities. This study also benefits online marketing consultants: marketers can not only gain consumer information more easily from online social communities, but this understanding also assists in designing effective communication within them. The cross-sectional design, the reliability of Internet survey data, and sampling issues are the three major limitations of this research. The thesis provides a new research model and recommends that the mediating effects, the privacy paradox, and social trust in online social communities be further explored in future research.
Abstract:
This paper describes modelling, estimation and control of the horizontal translational motion of an open-source and cost effective quadcopter — the MikroKopter. We determine the dynamics of its roll and pitch attitude controller, system latencies, and the units associated with the values exchanged with the vehicle over its serial port. Using this we create a horizontal-plane velocity estimator that uses data from the built-in inertial sensors and an onboard laser scanner, and implement translational control using a nested control loop architecture. We present experimental results for the model and estimator, as well as closed-loop positioning.
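The velocity estimator described above fuses high-rate inertial data with lower-rate position-derived measurements from the laser scanner. A minimal sketch of that kind of fusion is a complementary filter; the rates, gain `ALPHA`, and the noiseless laser velocity below are hypothetical, and the paper's actual estimator and nested control loops are not reproduced here.

```python
# Complementary-filter sketch: blend integrated accelerometer velocity
# (fast, drifting) with laser-derived velocity (slow, drift-free).
DT = 0.01          # 100 Hz inertial update
ALPHA = 0.98       # weight on the integrated estimate at each laser fix

def predict(v_est, accel):
    """High-rate step: integrate body acceleration over one sample."""
    return v_est + accel * DT

def correct(v_est, v_laser):
    """Low-rate step: blend in the scan-matched velocity from the laser."""
    return ALPHA * v_est + (1.0 - ALPHA) * v_laser

# Constant 1 m/s^2 acceleration; the estimator starts with a 0.5 m/s bias.
v_true, v_est = 0.0, 0.5
for k in range(1000):                  # 10 s of flight
    v_true += 1.0 * DT
    v_est = predict(v_est, 1.0)
    if k % 10 == 9:                    # 10 Hz laser correction
        v_est = correct(v_est, v_true)
print(round(abs(v_est - v_true), 3))   # initial bias largely washed out
```

Each laser correction shrinks the estimation error by the factor `ALPHA`, so bias decays geometrically while the accelerometer still supplies the fast dynamics between fixes; a Kalman filter would choose the blend weight from the noise statistics instead of fixing it.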
Abstract:
Value Management (VM), which originated in the US manufacturing industry in the early 1940s, has become increasingly popular within the international construction community. It has been widely accepted as an important tool in the management of projects. Its structured, systematic and multi-disciplinary approach to decision making gives VM a niche in delivering better value for money on the client's investment. It appears to be gaining momentum as an essential management tool in the Malaysian construction sector, especially in quantity surveying practice. Quantity surveyors' increasing involvement in VM provides an opportunity for the profession to re-model some of its traditional services in a more positive light, develop leading-edge skills and promote the profession. Internationally, VM has become part of the services offered by quantity surveying practices; in the UK especially, it has proven to be a natural progression for the QS profession. The introduction of VM in Malaysia as early as the 1980s, combined with increasing demand for construction projects to support national progress, presents an opportunity for the quantity surveying profession to take the lead in developing VM as one of its niche areas. The profession is thus well placed to lead this service, which reflects its traditional role of providing the best value-for-money advice to the client. This paper discusses the development of VM in Malaysia and the challenges QS firms face in offering VM services while remaining ahead of their competitors.
Abstract:
The research objectives of this thesis were to contribute to Bayesian statistical methodology by contributing to risk assessment statistical methodology, and to spatial and spatio-temporal methodology, by modelling error structures using complex hierarchical models. Specifically, I hoped to consider two applied areas, and use these applications as a springboard for developing new statistical methods as well as undertaking analyses which might give answers to particular applied questions. Thus, this thesis considers a series of models, firstly in the context of risk assessments for recycled water, and secondly in the context of water usage by crops. The research objective was to model error structures using hierarchical models in two problems, namely risk assessment analyses for wastewater, and secondly, in a four dimensional dataset, assessing differences between cropping systems over time and over three spatial dimensions. The aim was to use the simplicity and insight afforded by Bayesian networks to develop appropriate models for risk scenarios, and again to use Bayesian hierarchical models to explore the necessarily complex modelling of four dimensional agricultural data. The specific objectives of the research were to develop a method for the calculation of credible intervals for the point estimates of Bayesian networks; to develop a model structure to incorporate all the experimental uncertainty associated with various constants thereby allowing the calculation of more credible credible intervals for a risk assessment; to model a single day’s data from the agricultural dataset which satisfactorily captured the complexities of the data; to build a model for several days’ data, in order to consider how the full data might be modelled; and finally to build a model for the full four dimensional dataset and to consider the time-varying nature of the contrast of interest, having satisfactorily accounted for possible spatial and temporal autocorrelations.
This work forms five papers, two of which have been published, with two submitted, and the final paper still in draft. The first two objectives were met by recasting the risk assessments as directed, acyclic graphs (DAGs). In the first case, we elicited uncertainty for the conditional probabilities needed by the Bayesian net, incorporated these into a corresponding DAG, and used Markov chain Monte Carlo (MCMC) to find credible intervals, for all the scenarios and outcomes of interest. In the second case, we incorporated the experimental data underlying the risk assessment constants into the DAG, and also treated some of that data as needing to be modelled as an ‘errors-in-variables’ problem [Fuller, 1987]. This illustrated a simple method for the incorporation of experimental error into risk assessments. In considering one day of the three-dimensional agricultural data, it became clear that geostatistical models or conditional autoregressive (CAR) models over the three dimensions were not the best way to approach the data. Instead, CAR models were used with neighbours only in the same depth layer. This gave flexibility to the model, allowing both the spatially structured and non-structured variances to differ at all depths. We call this model the CAR layered model. Given the experimental design, the fixed part of the model could have been modelled as a set of means by treatment and by depth, but doing so allows little insight into how the treatment effects vary with depth. Hence, a number of essentially non-parametric approaches were taken to see the effects of depth on treatment, with the model of choice incorporating an errors-in-variables approach for depth in addition to a non-parametric smooth. The statistical contribution here was the introduction of the CAR layered model, the applied contribution the analysis of moisture over depth and estimation of the contrast of interest together with its credible intervals.
These models were fitted using WinBUGS [Lunn et al., 2000]. The work in the fifth paper deals with the fact that with large datasets, the use of WinBUGS becomes more problematic because of its highly correlated term by term updating. In this work, we introduce a Gibbs sampler with block updating for the CAR layered model. The Gibbs sampler was implemented by Chris Strickland using pyMCMC [Strickland, 2010]. This framework is then used to consider five days' data, and we show that moisture in the soil for all the various treatments reaches levels particular to each treatment at a depth of 200 cm and thereafter stays constant, albeit with increasing variances with depth. In an analysis across three spatial dimensions and across time, there are many interactions of time and the spatial dimensions to be considered. Hence, we chose to use a daily model and to repeat the analysis at all time points, effectively creating an interaction model of time by the daily model. Such an approach allows great flexibility. However, this approach does not allow insight into the way in which the parameter of interest varies over time. Hence, a two-stage approach was also used, with estimates from the first stage being analysed as a set of time series. We see this spatio-temporal interaction model as being a useful approach to data measured across three spatial dimensions and time, since it does not assume additivity of the random spatial or temporal effects.
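The defining feature of the CAR layered model, neighbours only within the same depth layer, can be illustrated by building the corresponding CAR precision matrix. The sketch below uses the common proper-CAR form Q = tau * (D - rho * W); grid size, tau and rho are arbitrary illustrative values, and the thesis's actual parameterisation may differ.

```python
# CAR "layered" precision matrix: sites are neighbours only if horizontally
# adjacent AND in the same depth layer, so Q is block-diagonal across layers
# and each depth can carry its own variance parameters.

def car_layered_precision(nx, ny, nlayers, tau, rho):
    n = nx * ny * nlayers
    idx = lambda i, j, l: l * nx * ny + j * nx + i
    W = [[0.0] * n for _ in range(n)]
    for l in range(nlayers):                      # no edges between layers
        for j in range(ny):
            for i in range(nx):
                for di, dj in ((1, 0), (0, 1)):   # rook adjacency in-plane
                    if i + di < nx and j + dj < ny:
                        a, b = idx(i, j, l), idx(i + di, j + dj, l)
                        W[a][b] = W[b][a] = 1.0
    # Proper CAR precision: Q = tau * (D - rho * W), D = diag(neighbour counts).
    Q = [[tau * ((sum(W[r]) if r == c else 0.0) - rho * W[r][c])
          for c in range(n)] for r in range(n)]
    return Q

Q = car_layered_precision(nx=2, ny=2, nlayers=2, tau=1.0, rho=0.9)
# Entries linking different layers are zero: the layers decouple in the prior.
print(Q[0][4])  # → 0.0  (site (0,0) in layer 0 vs site (0,0) in layer 1)
```

The zero cross-layer blocks are exactly what lets the spatially structured and unstructured variances differ by depth, and they also make block updating in the Gibbs sampler natural, since each layer's block can be updated jointly.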