848 results for Conditional sales


Relevance:

10.00%

Publisher:

Abstract:

Sustainability Declarations were introduced by the Queensland State Government on 1 January 2010 as a compulsory measure for all dwelling sales. The purpose of this policy decision was to increase the relevance of sustainability in the home ownership decision-making process. This paper assesses the initial impact of the initiative over its first year in operation. In partnership with the Real Estate Institute of Queensland, real estate agents and salespeople in Queensland were surveyed to determine what impact the Sustainability Declaration has had on home buyer decision making. The level of compliance by the real estate industry was also reviewed. These preliminary findings indicate a high level of compliance from the real estate industry; however, the results confirm that sustainability is yet to become a criterion of relevance to the majority of home buyers in Queensland. The Sustainability Declarations are a first step in raising home owners' awareness of the importance of sustainability in housing. Further monitoring of this impact will be carried out over time.

Relevance:

10.00%

Publisher:

Abstract:

The launch of the Apple iPad in January 2010 has seen considerable interest from the newspaper and publishing industry in developing content and business models for the tablet PC device that can address the limits of both print and online news and information media products. It is early days in the iPad's evolution, and we have yet to see what competitor devices will emerge in the near future. It is apparent, however, that it has become a significant "niche" product, with considerable potential for mass market expansion over the next few years, possibly at the expense of netbook sales. The scope for the iPad and tablet PCs to become a "fourth screen" for users, alongside the TV, PC and mobile phone, is in the early stages of evolution. The study used five criteria to assess iPad apps:

• Content: timeliness; archive; personalisation; content depth; advertisements; the use of multimedia; and the extent to which the content was in sync with the provider brand.
• Useability: degree of static content; ability to control multimedia; file size; page clutter; resolution; signposts; and customisation.
• Interactivity: hyperlinks; ability to contribute content or provide feedback to news items; depth of multimedia; search function; ability to use plug-ins and linking; ability to highlight, rate and/or save items; and functions that may facilitate a community of users.
• Transaction capabilities: ecommerce functionality; purchase and download process; and user privacy and transaction security.
• Openness: degree of linking to outside sources; reader contribution processes; anonymity measures; and application code ownership.

Relevance:

10.00%

Publisher:

Abstract:

In many product categories of durable goods, such as TVs, PCs, and DVD players, the largest component of sales is generated by consumers replacing existing units. Aggregate sales models proposed by diffusion-of-innovation researchers for the replacement component of sales have incorporated several different replacement distributions, such as the Rayleigh, Weibull, Truncated Normal and Gamma. Although these alternative replacement distributions have been tested using both time series sales data and individual-level actuarial "life-tables" of replacement ages, there is no consensus on which distributions are more appropriate for modelling replacement behaviour. In the current study we are motivated to develop a new "modified gamma" distribution for two reasons. First, we recognise that replacements have two fundamentally different drivers: those forced by failure, and early, discretionary replacements. The replacement distribution for each of these drivers is expected to be quite different. Second, we observed a poor fit of other distributions to our empirical data. We conducted a survey of 8,077 households to empirically examine models of replacement sales for six electronic consumer durables: TVs, VCRs, DVD players, digital cameras, personal computers and notebook computers. These data allow us to construct individual-level "life-tables" for replacement ages. We demonstrate that the new modified gamma model fits the empirical data better than existing models for all six products, using both a primary and a hold-out sample.
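For illustration, a minimal sketch of the two-driver idea follows: a two-component gamma mixture (failure-forced versus early discretionary replacements) fitted by maximum likelihood to synthetic replacement ages. The component forms, parameter values and data are assumptions for illustration only, not the authors' modified gamma specification.

```python
# Minimal sketch: a two-component replacement-age mixture, a stand-in for
# the paper's "modified gamma" distribution. All parameters and data are
# illustrative assumptions.
import numpy as np
from scipy import stats, optimize

def neg_log_lik(params, ages):
    """Mixture of two gammas: discretionary (early) and failure-forced."""
    w, k1, th1, k2, th2 = params
    if not (0 < w < 1) or min(k1, th1, k2, th2) <= 0:
        return np.inf
    pdf = (w * stats.gamma.pdf(ages, a=k1, scale=th1)
           + (1 - w) * stats.gamma.pdf(ages, a=k2, scale=th2))
    return -np.sum(np.log(pdf + 1e-300))

# Synthetic replacement ages (years): early discretionary vs. failure-forced.
rng = np.random.default_rng(0)
ages = np.concatenate([rng.gamma(2.0, 1.5, 300),   # discretionary, younger
                       rng.gamma(9.0, 1.0, 700)])  # failure-driven, older

res = optimize.minimize(neg_log_lik, x0=[0.3, 2, 1, 8, 1],
                        args=(ages,), method="Nelder-Mead")
print(res.x)  # fitted mixture weight and shape/scale pairs
```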

Relevance:

10.00%

Publisher:

Abstract:

Plant biosecurity requires statistical tools to interpret field surveillance data in order to manage pest incursions that threaten crop production and trade. Ultimately, management decisions need to be based on the probability that an area is infested or free of a pest. Current informal approaches to delimiting pest extent rely upon expert ecological interpretation of presence/absence data over space and time. Hierarchical Bayesian models provide a cohesive statistical framework that can formally integrate the available information on both pest ecology and data. The overarching method involves constructing an observation model for the surveillance data, conditional on the hidden extent of the pest and uncertain detection sensitivity. The extent of the pest is then modelled as a dynamic invasion process that includes uncertainty in ecological parameters.

Modelling approaches to assimilate this information are explored through case studies on spiralling whitefly, Aleurodicus dispersus, and red banded mango caterpillar, Deanolis sublimbalis. Markov chain Monte Carlo simulation is used to estimate the probable extent of pests, given the observation and process model conditioned by surveillance data. Statistical methods, based on time-to-event models, are developed to apply hierarchical Bayesian models to early detection programs and to demonstrate area freedom from pests. The value of early detection surveillance programs is demonstrated through an application to interpret surveillance data for exotic plant pests with uncertain spread rates. The model suggests that typical early detection programs provide a moderate reduction in the probability of an area being infested but a dramatic reduction in the expected area of incursions at a given time.

Estimates of spiralling whitefly extent are examined at local, district and state-wide scales. The local model estimates the rate of natural spread and the influence of host architecture, host suitability and inspector efficiency. These parameter estimates can support the development of robust surveillance programs. Hierarchical Bayesian models for the human-mediated spread of spiralling whitefly are developed for the colonisation of discrete cells connected by a modified gravity model. By estimating dispersal parameters, the model can be used to predict the extent of the pest over time. An extended model predicts the climate-restricted distribution of the pest in Queensland. These novel human-mediated movement models are well suited to demonstrating area freedom at coarse spatio-temporal scales. At finer scales, and in the presence of ecological complexity, exploratory models are developed to investigate the capacity for surveillance information to estimate the extent of red banded mango caterpillar. It is apparent that excessive uncertainty about observation and ecological parameters can impose limits on inference at the scales required for effective management of response programs.

The thesis contributes novel statistical approaches to estimating the extent of pests and develops applications to assist decision-making across a range of plant biosecurity surveillance activities. Hierarchical Bayesian modelling is demonstrated as both a useful analytical tool for estimating pest extent and a natural investigative paradigm for developing and focussing biosecurity programs.
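A minimal sketch of the core inferential step described above: the posterior probability that an area is infested after repeated negative surveys, with uncertain detection sensitivity integrated out by Monte Carlo. The prior and parameter values are illustrative assumptions, not those used in the thesis.

```python
# Minimal sketch: Bayes' rule for area freedom under imperfect detection.
# Priors and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
prior_infested = 0.10           # assumed prior probability of infestation
sens = rng.beta(8, 2, 100_000)  # uncertain per-survey detection sensitivity
n_negative = 5                  # number of surveys, all negative

# Likelihood of n negative results if infested, averaged over sensitivity.
p_neg_given_infested = np.mean((1 - sens) ** n_negative)
p_neg_given_free = 1.0          # no false positives assumed

posterior = (prior_infested * p_neg_given_infested /
             (prior_infested * p_neg_given_infested
              + (1 - prior_infested) * p_neg_given_free))
print(f"P(infested | {n_negative} negative surveys) = {posterior:.4f}")
```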

Relevance:

10.00%

Publisher:

Abstract:

With the emergence of multi-core processors into the mainstream, parallel programming is no longer the specialized domain it once was. There is a growing need for systems to allow programmers to more easily reason about data dependencies and inherent parallelism in general purpose programs. Many of these programs are written in popular imperative programming languages like Java and C#. In this thesis I present a system for reasoning about side-effects of evaluation in an abstract and composable manner that is suitable for use by both programmers and automated tools such as compilers. The goal of developing such a system is both to facilitate the automatic exploitation of the inherent parallelism present in imperative programs and to allow programmers to reason about dependencies which may be limiting the parallelism available for exploitation in their applications.

Previous work on languages and type systems for parallel computing has tended to focus on providing the programmer with tools to facilitate the manual parallelization of programs; programmers must decide when and where it is safe to employ parallelism without the assistance of the compiler or other automated tools. None of the existing systems combine abstraction and composition with parallelization and correctness checking to produce a framework which helps both programmers and automated tools to reason about inherent parallelism.

In this work I present a system for abstractly reasoning about side-effects and data dependencies in modern, imperative, object-oriented languages using a type and effect system based on ideas from Ownership Types. I have developed sufficient conditions for the safe, automated detection and exploitation of a number of task, data and loop parallelism patterns in terms of ownership relationships. To validate my work, I have applied my ideas to the C# version 3.0 language to produce a language extension called Zal. I have implemented a compiler for the Zal language as an extension of the GPC# research compiler as a proof of concept of my system. I have used it to parallelize a number of real-world applications to demonstrate the feasibility of my proposed approach. In addition to this empirical validation, I present an argument for the correctness of the type system and language semantics I have proposed, as well as sketches of proofs for the correctness of the sufficient conditions for parallelization.
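The general intuition behind effect-based parallelisation can be sketched as follows: two computations may safely run in parallel when their abstract read/write effect sets are disjoint. This toy check is a stand-in for intuition only; it is not the thesis's ownership-based type and effect system, nor the Zal language.

```python
# Toy sketch of effect-disjointness as a parallelisation condition. The
# effect representation is an illustrative assumption, not the thesis's
# ownership-based system.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Effect:
    reads: frozenset = field(default_factory=frozenset)
    writes: frozenset = field(default_factory=frozenset)

def may_parallelise(a: Effect, b: Effect) -> bool:
    """Safe to overlap iff neither's writes conflict with the other's accesses."""
    return (a.writes.isdisjoint(b.reads | b.writes)
            and b.writes.isdisjoint(a.reads | a.writes))

# Two loop bodies touching disjoint ownership contexts can run in parallel.
left  = Effect(reads=frozenset({"tree.left"}),  writes=frozenset({"tree.left"}))
right = Effect(reads=frozenset({"tree.right"}), writes=frozenset({"tree.right"}))
print(may_parallelise(left, right))   # True: disjoint regions
shared = Effect(writes=frozenset({"tree.left"}))
print(may_parallelise(left, shared))  # False: write-write conflict
```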

Relevance:

10.00%

Publisher:

Abstract:

Traffic oscillations are typical features of congested traffic flow, characterized by recurring decelerations followed by accelerations (stop-and-go driving). The negative environmental impacts of these oscillations are widely accepted, but their impact on traffic safety has been debated. This paper describes the impact of freeway traffic oscillations on traffic safety. The study employs a matched case-control design using high-resolution traffic and crash data from a freeway segment. Traffic conditions prior to each crash were taken as cases, while traffic conditions during the same periods on days without crashes were taken as controls; cases and controls were also matched by the presence of congestion, geometry and weather. A total of 82 cases and about 80,000 candidate controls were extracted from more than three years of data, from 2004 to 2007. Conditional logistic regression models were developed based on the case-control samples. To verify consistency in the results, 20 different sets of controls were randomly extracted from the candidate pool for varying control-case ratios. The results reveal that the standard deviation of speed (and thus oscillations) is a significant variable, with an average odds ratio of about 1.08. This implies that the likelihood of a (rear-end) crash increases by about 8% with each additional unit increase in the standard deviation of speed. The average traffic states prior to crashes were less significant than the speed variations in congestion.
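For intuition, a minimal sketch of how a conditional-logistic coefficient maps to the reported odds ratio; the coefficient is back-derived from the paper's average odds ratio of about 1.08 purely for illustration.

```python
# Minimal sketch: odds ratio = exp(beta) in (conditional) logistic
# regression. Beta is back-derived from the reported ~1.08 odds ratio.
import math

beta = math.log(1.08)  # coefficient per unit of speed standard deviation
print(f"odds ratio per unit:    {math.exp(beta):.2f}")      # ~1.08 (+8% odds)
print(f"odds ratio per 5 units: {math.exp(5 * beta):.2f}")  # ~1.47
```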

Relevance:

10.00%

Publisher:

Abstract:

The decision of the High Court in Butcher v Lachlan Elder Realty Pty Ltd [2004] HCA 60 involves issues that affect every person who is induced to buy real estate in Australia by statements in sales brochures distributed by real estate agents. One of these issues is the extent to which estate agents unwittingly engage in misleading or deceptive conduct under s 52 of the Trade Practices Act 1974 (Cth) (‘the Act’) when they distribute sales brochures that contain untrue or misleading statements prepared by others. A further issue is the extent to which agents can escape liability by relying on disclaimers about the authenticity of false statements contained in brochures prepared by them.

Relevance:

10.00%

Publisher:

Abstract:

A pervasive and puzzling feature of banks' Value-at-Risk (VaR) is its abnormally high level, which leads to excessive regulatory capital. A possible explanation for the tendency of commercial banks to overstate their VaR is that they incompletely account for the diversification effect among broad risk categories (e.g., equity, interest rate, commodity, credit spread, and foreign exchange). By underestimating the diversification effect, banks' proprietary VaR models produce overly prudent market risk assessments. In this paper, we examine empirically the validity of this hypothesis using actual VaR data from major US commercial banks. In contrast to the VaR diversification hypothesis, we find that US banks show no sign of systematic underestimation of the diversification effect. In particular, the diversification effects used by banks are very close to (and quite often larger than) our empirical diversification estimates. A direct implication of this finding is that individual VaRs for each broad risk category, just like aggregate VaRs, are biased risk assessments.
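A minimal worked example of the diversification effect under discussion: the gap between the sum of stand-alone VaRs across risk categories and the aggregate VaR. The figures are illustrative assumptions, not the paper's data.

```python
# Minimal sketch: diversification effect = sum of stand-alone VaRs minus
# aggregate VaR. Figures are illustrative assumptions.
import numpy as np

# Stand-alone VaRs ($M) by broad risk category:
# equity, interest rate, commodity, credit spread, foreign exchange.
standalone = np.array([40.0, 25.0, 10.0, 15.0, 10.0])
aggregate_var = 70.0                       # bank-reported aggregate VaR

diversification = standalone.sum() - aggregate_var
share = 100 * diversification / standalone.sum()
print(f"diversification effect: ${diversification:.0f}M "
      f"({share:.0f}% of summed VaRs)")
# Understating this effect (reporting aggregate VaR close to the sum)
# would inflate regulatory capital -- the hypothesis the paper rejects.
```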

Relevance:

10.00%

Publisher:

Abstract:

Despite considerable success in the treatment of early stage localized prostate cancer (PC), the acute inadequacy of late stage PC treatment and its inherent heterogeneity pose a formidable challenge. Clearly, an improved understanding of PC genesis and progression, along with the development of new targeted therapies, is warranted. Animal models, especially transgenic immunocompetent mouse models, have proven to be the best ally in this respect. A series of models has been developed by modulating the expression of genes implicated in cancer genesis and progression; mainly, the expression of oncogenes, steroid hormone receptors, growth factors and their receptors, cell cycle and apoptosis regulators, and tumor suppressor genes has been modulated. Such models have contributed significantly to our understanding of the molecular and pathological aspects of PC initiation and progression. In particular, transgenic mouse models based on multiple genetic alterations can more accurately address the inherent complexity of PC, not only in revealing the mechanisms of tumorigenesis and progression but also for clinically relevant evaluation of new therapies. Further, with advances in conditional knockout technologies, otherwise embryonically lethal gene changes can be incorporated, leading to the development of new-generation transgenics and thus adding significantly to our existing knowledge base. Different models and their relevance to PC research are discussed.

Relevance:

10.00%

Publisher:

Abstract:

We consider the problem of binary classification where the classifier can, for a particular cost, choose not to classify an observation. Just as in the conventional classification problem, minimization of the sample average of the cost is a difficult optimization problem. As an alternative, we propose the optimization of a certain convex loss function φ, analogous to the hinge loss used in support vector machines (SVMs). Its convexity ensures that the sample average of this surrogate loss can be efficiently minimized. We study its statistical properties. We show that minimizing the expected surrogate loss—the φ-risk—also minimizes the risk. We also study the rate at which the φ-risk approaches its minimum value. We show that fast rates are possible when the conditional probability P(Y=1|X) is unlikely to be close to certain critical values.
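For intuition, a minimal sketch of the underlying plug-in decision rule (Chow's rule) that the convex surrogate approach relaxes: abstain whenever the estimated conditional probability is too close to 1/2 relative to the rejection cost d. The model, data and threshold are illustrative assumptions; the paper's contribution, the convex surrogate φ and its risk bounds, is not implemented here.

```python
# Minimal sketch of the plug-in reject rule with rejection cost d:
# classify with the more likely label unless its error probability
# min(p, 1-p) exceeds d, in which case abstain. Data and model are
# illustrative assumptions; the paper's convex surrogate phi is not
# implemented here.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 2))
y = (X[:, 0] + 0.5 * rng.normal(size=500) > 0).astype(int)

clf = LogisticRegression().fit(X, y)
p = clf.predict_proba(X)[:, 1]          # estimate of P(Y=1|X)

d = 0.2                                 # cost of abstaining (0 < d < 1/2)
# Reject when both classes are too uncertain: min(p, 1-p) > d.
decision = np.where(np.minimum(p, 1 - p) > d, -1, (p >= 0.5).astype(int))
print(f"abstained on {np.mean(decision == -1):.1%} of observations")
```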

Relevance:

10.00%

Publisher:

Abstract:

In just under three months, worldwide sales of Apple's iPad tablet device exceeded 3 million units. The iPad, along with rival products, signifies a shift in the way print and other media products are purchased and consumed by users. Despite initial skepticism about the uptake of the device, numerous industries have been quick to adapt it to their specific needs. Based around a newly developed six-point typology of "post-PC" device utility, this project undertook a significant review of publicly available material to identify worldwide trends in iPad adoption and use within the tertiary sector.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a method of spatial sampling based on stratification by Local Moran's I_i calculated using auxiliary information. The sampling technique is compared to other design-based approaches, including simple random sampling, systematic sampling on a regular grid, conditional Latin Hypercube sampling and stratified sampling based on auxiliary information, and is illustrated using two different spatial data sets. Each of the samples for the two data sets is interpolated using regression kriging to form a geostatistical map for their respective areas. The proposed technique is shown to be competitive in reproducing specific areas of interest with high accuracy.
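A minimal sketch of the stratification step: computing Local Moran's I_i on a regular grid with rook-contiguity weights and drawing a stratified sample by its value. The weights, data and strata cut-points are illustrative assumptions, not the paper's design.

```python
# Minimal sketch: Local Moran's I_i with rook-contiguity, row-standardised
# weights, followed by stratified sampling on its value. Data, weights and
# cut-points are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
z = rng.normal(size=(20, 20))          # auxiliary variable on a 20x20 grid
x = z - z.mean()                       # centred values
m2 = (x ** 2).mean()                   # sum((x_k - xbar)^2) / n

# Sum of the four orthogonal neighbours, with edge-aware counts.
lag = np.zeros_like(x)
n_nb = np.zeros_like(x)
lag[1:, :] += x[:-1, :]; n_nb[1:, :] += 1    # north neighbour
lag[:-1, :] += x[1:, :]; n_nb[:-1, :] += 1   # south neighbour
lag[:, 1:] += x[:, :-1]; n_nb[:, 1:] += 1    # west neighbour
lag[:, :-1] += x[:, 1:]; n_nb[:, :-1] += 1   # east neighbour

local_moran = x * (lag / n_nb) / m2    # I_i = (x_i / m2) * sum_j w_ij x_j

# Stratify cells by I_i (clusters vs. outliers) and sample within strata.
strata = np.digitize(local_moran.ravel(), bins=[-0.5, 0.5])
for s in np.unique(strata):
    cells = np.flatnonzero(strata == s)
    picked = rng.choice(cells, size=min(5, cells.size), replace=False)
    print(f"stratum {s}: {cells.size} cells, sampled {sorted(picked.tolist())}")
```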

Relevance:

10.00%

Publisher:

Abstract:

A Networked Control System (NCS) is a feedback-driven control system wherein the control loops are closed through a real-time network. Control and feedback signals in an NCS are exchanged among the system's components in the form of information packets via the network. Nowadays, wireless technologies such as IEEE 802.11 are being introduced to modern NCSs as they offer better scalability, larger bandwidth and lower costs. However, this type of network is not designed for NCSs, because it introduces a large amount of dropped data and unpredictable, long transmission latencies due to the characteristics of wireless channels, which are not acceptable for real-time control systems. Real-time control is a class of time-critical application which requires lossless data transmission, and small and deterministic delays and jitter. For a real-time control system, network-introduced problems may degrade the system's performance significantly or even cause system instability. It is therefore important to develop solutions to satisfy real-time requirements in terms of delays, jitter and data losses, and to guarantee high levels of performance for time-critical communications in Wireless Networked Control Systems (WNCSs).

To improve or even guarantee real-time performance in wireless control systems, this thesis presents several network layout strategies and a new transport layer protocol. Firstly, real-time performance in regard to data transmission delays and the reliability of IEEE 802.11b-based UDP/IP NCSs is evaluated through simulations. After analysis of the simulation results, some network layout strategies are presented to achieve relatively small and deterministic network-introduced latencies and to reduce data loss rates. These are effective in providing better network performance without degrading the performance of other services.

After the investigation into the layout strategies, the thesis presents a new transport protocol which is more efficient than UDP and TCP for guaranteeing reliable and time-critical communications in WNCSs. From the networking perspective, introducing appropriate communication schemes, modifying existing network protocols and devising new protocols have been the most effective and popular ways to improve or even guarantee real-time performance to a certain extent. Most previously proposed schemes and protocols were designed for real-time multimedia communication and are not suitable for real-time control systems. Therefore, devising a new network protocol that is able to satisfy real-time requirements in WNCSs is the main objective of this research project.

The Conditional Retransmission Enabled Transport Protocol (CRETP) is the new network protocol presented in this thesis. Retransmitting unacknowledged data packets is effective in compensating for data losses. However, every data packet in real-time control systems has a deadline, and data is assumed invalid or even harmful when its deadline expires. CRETP performs data retransmission only when the data is still valid, which guarantees data timeliness and saves memory and network resources. A trade-off between delivery reliability, transmission latency and network resources can be achieved by the conditional retransmission mechanism.

Evaluation of protocol performance was conducted through extensive simulations, including comparative studies between CRETP, UDP and TCP. The results showed that CRETP significantly: 1) improved the reliability of communication; 2) guaranteed the validity of received data; 3) reduced transmission latency to an acceptable value; and 4) made delays relatively deterministic and predictable. Furthermore, CRETP achieved the best overall performance in the comparative studies, which makes it the most suitable of the three transport protocols for real-time communications in a WNCS.
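A minimal sketch of the conditional retransmission idea attributed to CRETP: retransmit an unacknowledged packet only while its deadline has not expired. The packet format, timings and API are illustrative assumptions, not the protocol's specification.

```python
# Minimal sketch of deadline-conditional retransmission. Packet format,
# timings and API are illustrative assumptions, not CRETP's specification.
import time
from dataclasses import dataclass
from typing import Callable, Set

@dataclass
class Packet:
    seq: int
    payload: bytes
    deadline: float    # absolute monotonic time; data is useless afterwards

def maybe_retransmit(pkt: Packet, acked: Set[int],
                     send: Callable[[Packet], None]) -> bool:
    """Retransmit only if the packet is unacknowledged AND not yet stale."""
    if pkt.seq in acked:
        return False   # already delivered; nothing to do
    if time.monotonic() >= pkt.deadline:
        return False   # deadline expired: drop rather than waste bandwidth
    send(pkt)
    return True

# Usage: a control sample that is only valid for 20 ms.
pkt = Packet(seq=42, payload=b"setpoint=1.25",
             deadline=time.monotonic() + 0.020)
print("retransmitted" if maybe_retransmit(pkt, set(), lambda p: None)
      else "dropped")
```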