Abstract:
Background: Adaptations and assistive technology (AT) have an important role in enabling older people to remain in their own homes. Objective: To measure the feasibility and cost of adaptations and AT, and the scope for these to substitute for and supplement formal care. Design: Detailed design studies to benchmark the adaptability of 82 properties against the needs of seven notional users. Setting: Social rented housing sector. Main outcome measures: Measures of the adaptability of properties, costs of care, adaptations and AT, and relationships between these costs. Results: The adaptability of properties varies according to many design factors and the needs of occupiers. The most adaptable properties were ground floor flats and bungalows; the least were houses, maisonettes and flats in converted houses. Purpose-built sheltered properties were generally more adaptable than corresponding mainstream properties but the opposite was the case for bungalows. Adaptations and AT can substitute for and supplement formal care, and in most cases the initial investment in adaptations and AT is recouped through subsequently lower care costs within the average life expectancy of a user. Conclusion: Appropriately selected adaptations and AT can make a significant contribution to the provision of living environments which facilitate independence. They can both substitute for traditional formal care services and supplement these services in a cost-effective way.
Abstract:
Purpose – The purpose of this paper is to show the extent to which clients amend standard form contracts in practice, the locus of the amendments, and how contractors respond to the amendments when putting together a bid. Design/methodology/approach – Four live observational case studies were carried out in two of the top 20 UK construction firms. The whole process used to review the proposed terms and conditions of the contract was shadowed using participant observation, interview and documentary analysis. Findings – All four cases showed strong evidence of amendments relating mostly to payment and contractual aspects: 83 amendments in Case Study 1 (CS1), 80 in CS2, 15 in CS3 and 29 in CS4. This comprised clauses that were modified (37 per cent), substituted (23 per cent), deleted (7 per cent) and new additions (33 per cent). Risks inherent in the amendments were mostly addressed through contractual rather than price mechanisms, to reflect commercial imperatives. “Qualifications” and “clarifications” were included in the tender submissions for post-tender negotiations. Thus, the amendments did not necessarily influence price. There was no evidence of a “standard-form contract” being used as such, although clients may draw on published “standard-form contracts” to derive the forms of contract actually used in practice. Practical implications – Contractors should pay attention to clauses relating to contractual and financial aspects when reviewing tender documents. Clients should draft equitable payment and contractual terms and conditions to reduce risk of dispute. Indeed, it is prudent for clients not to pass on inestimable risks. Originality/value – A better understanding of the extent and locus of amendments in standard form contracts, and how contractors respond, is provided.
Abstract:
Since the first PFI hospital was established in 1994, much debate has centred on value for money and risk transfer in PFIs, but little attention has been paid to PFI hospitals’ performance in delivering healthcare. Exploratory research was carried out to compare PFI with non-PFI hospital performance. Five performance indicators were analysed to compare differences between PFI and non-PFI hospitals, namely waiting times, length of stay, MRSA infection rate, C. difficile infection rate and patient experience. Data were collected from various government bodies. The results show that only some of the indices measuring patient experience are statistically significant. This leads to the conclusion that PFI hospitals may not perform better than non-PFI hospitals, but neither do they perform worse in the delivery of services. However, future research needs to pay attention to the reliability and validity of the data sets currently available for such comparisons.
Abstract:
A 'mapping task' was used to explore the networks available to head teachers, school coordinators and local authority staff. Beginning from an ego-centred perspective on networks, we illustrate a number of key analytic categories, including brokerage, formality, and strength and weakness of links with reference to a single UK primary school. We describe how teachers differentiate between the strength of network links and their value, which is characteristically related to their potential impact on classroom practice.
Abstract:
The coding of body part location may depend upon both visual and proprioceptive information, and allows targets to be localized with respect to the body. The present study investigates the interaction between visual and proprioceptive localization systems under conditions of multisensory conflict induced by optokinetic stimulation (OKS). Healthy subjects were asked to estimate the apparent motion speed of a visual target (LED) that could be located either in the extrapersonal space (visual encoding only, V), or at the same distance, but stuck on the subject's right index finger-tip (visual and proprioceptive encoding, V-P). Additionally, the multisensory condition was performed with the index finger kept in position both passively (V-P passive) and actively (V-P active). Results showed that the visual stimulus was always perceived to move, irrespective of its out- or on-the-body location. Moreover, this apparent motion speed varied consistently with the speed of the moving OKS background in all conditions. Surprisingly, no differences were found between V-P active and V-P passive conditions in the speed of apparent motion. The persistence of the visual illusion during the active posture maintenance reveals a novel condition in which vision totally dominates over proprioceptive information, suggesting that the hand-held visual stimulus was perceived as a purely visual, external object despite its contact with the hand.
Abstract:
This paper presents an efficient construction algorithm for obtaining sparse kernel density estimates based on a regression approach that directly optimizes model generalization capability. Computational efficiency of the density construction is ensured using an orthogonal forward regression, and the algorithm incrementally minimizes the leave-one-out test score. A local regularization method is incorporated naturally into the density construction process to further enforce sparsity. An additional advantage of the proposed algorithm is that it is fully automatic and the user is not required to specify any criterion to terminate the density construction procedure. This is in contrast to an existing state-of-the-art kernel density estimation method using the support vector machine (SVM), where the user is required to specify some critical algorithm parameter. Several examples are included to demonstrate the ability of the proposed algorithm to effectively construct a very sparse kernel density estimate with comparable accuracy to that of the full-sample optimized Parzen window density estimate. Our experimental results also demonstrate that the proposed algorithm compares favorably with the SVM method, in terms of both test accuracy and sparsity, for constructing kernel density estimates.
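As a rough illustration of the idea summarized above (not the paper's algorithm: the orthogonal forward regression and local regularization are replaced by a plain greedy search, and the function names and LOO-style score are illustrative), a minimal forward-selection construction of a sparse Gaussian kernel density estimate could look like this:

```python
# Hypothetical sketch: greedily pick kernel centres that most improve a
# leave-one-out-style log-likelihood, and compare against the full Parzen
# window estimate. A simplification of the abstract's method, for intuition only.
import numpy as np

def gauss_kernel(x, centres, h):
    """N x M matrix of Gaussian kernels evaluated at the points in x."""
    d2 = (x[:, None] - centres[None, :]) ** 2
    return np.exp(-0.5 * d2 / h ** 2) / (np.sqrt(2.0 * np.pi) * h)

def parzen_loo_loglik(x, h):
    """Average leave-one-out log-likelihood of the full Parzen window estimate."""
    K = gauss_kernel(x, x, h)
    np.fill_diagonal(K, 0.0)                    # exclude the self-kernel
    return np.mean(np.log(K.sum(axis=1) / (len(x) - 1)))

def sparse_kde_forward(x, h, max_kernels=8):
    """Greedily add kernel centres that most improve a LOO-style log-likelihood."""
    K = gauss_kernel(x, x, h)
    np.fill_diagonal(K, 0.0)                    # leave-one-out flavour of the score
    chosen, remaining = [], list(range(len(x)))
    for _ in range(max_kernels):
        best_j, best_score = None, -np.inf
        for j in remaining:
            dens = K[:, chosen + [j]].mean(axis=1) + 1e-300
            score = np.mean(np.log(dens))
            if score > best_score:
                best_j, best_score = j, score
        chosen.append(best_j)
        remaining.remove(best_j)
    return x[chosen], np.full(len(chosen), 1.0 / len(chosen))

rng = np.random.default_rng(0)
sample = np.concatenate([rng.normal(-2.0, 0.5, 100), rng.normal(2.0, 1.0, 100)])
centres, weights = sparse_kde_forward(sample, h=0.4)
print("full Parzen LOO log-lik:", round(parzen_loo_loglik(sample, 0.4), 3))
print("selected centres:", np.round(np.sort(centres), 2))
```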
Abstract:
We propose a simple yet computationally efficient construction algorithm for two-class kernel classifiers. In order to optimise the classifier's generalisation capability, an orthogonal forward selection procedure is used to select kernels one by one by minimising the leave-one-out (LOO) misclassification rate directly. It is shown that the computation of the LOO misclassification rate is very efficient owing to orthogonalisation. Examples are used to demonstrate that the proposed algorithm is a viable alternative for constructing sparse two-class kernel classifiers in terms of performance and computational efficiency.
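The cheap LOO computation referred to above can be illustrated with the standard leave-one-out identity for ridge-regularised kernel models, where the LOO residual equals the training residual divided by one minus the leverage. This is a generic sketch under that assumption, not the paper's orthogonalised procedure; `rbf_gram` and `loo_error_rate` are illustrative names:

```python
# Illustrative only: LOO misclassification rate of a kernel ridge classifier
# on +/-1 labels, computed from a single fit via e_loo_i = e_i / (1 - h_ii).
import numpy as np

def rbf_gram(X, width=1.0):
    """Gaussian (RBF) Gram matrix of the training inputs."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def loo_error_rate(K, y, ridge=1e-3):
    """Analytic LOO misclassification rate without retraining n times."""
    n = len(y)
    H = K @ np.linalg.inv(K + ridge * np.eye(n))   # hat matrix: y -> fitted values
    e = y - H @ y                                   # ordinary training residuals
    e_loo = e / (1.0 - np.diag(H))                  # analytic leave-one-out residuals
    y_loo = y - e_loo                               # leave-one-out predictions
    return np.mean(np.sign(y_loo) != y)

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1.0, 1.0, (30, 2)), rng.normal(1.0, 1.0, (30, 2))])
y = np.concatenate([-np.ones(30), np.ones(30)])
print("LOO error rate:", loo_error_rate(rbf_gram(X), y))
```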
Abstract:
We propose a simple and computationally efficient construction algorithm for two-class linear-in-the-parameters classifiers. In order to optimize model generalization, an orthogonal forward selection (OFS) procedure is used to minimize the leave-one-out (LOO) misclassification rate directly. An analytic formula and a set of forward recursive updating formulas for the LOO misclassification rate are developed and applied in the proposed algorithm. Numerical examples are used to demonstrate that the proposed algorithm is an excellent alternative approach for constructing sparse two-class classifiers in terms of performance and computational efficiency.
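For reference, the standard analytic LOO quantities for a linear-in-the-parameters model fitted by least squares on labels y_i in {-1, +1} can be written as follows (generic notation; the paper's forward recursive updating formulas are not reproduced here):

```latex
\hat{\theta} = (\Phi^{\top}\Phi)^{-1}\Phi^{\top} y, \qquad
\hat{y} = \Phi\hat{\theta}, \qquad
H = \Phi(\Phi^{\top}\Phi)^{-1}\Phi^{\top}, \qquad
e_i^{(-i)} = \frac{y_i - \hat{y}_i}{1 - h_{ii}}, \qquad
J_{\mathrm{LOO}} = \frac{1}{N}\sum_{i=1}^{N}
  \mathbb{I}\!\left[\, y_i \left( y_i - e_i^{(-i)} \right) < 0 \,\right]
```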
Abstract:
A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross validation is often used to estimate generalization errors when choosing amongst different network architectures (M. Stone, "Cross validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon the minimization of LOO criteria, either the mean square of the LOO errors or the LOO misclassification rate respectively, we present two backward elimination algorithms as model post-processing procedures for regression and classification problems. The proposed backward elimination procedures exploit an orthogonalization procedure to ensure orthogonality between the subspace spanned by the pruned model and the deleted regressor. Subsequently, it is shown that the LOO criteria used in both algorithms can be calculated via analytic recursive formulae, as derived in this contribution, without actually splitting the estimation data set, so as to reduce computational expense. Compared to most other model construction methods, the proposed algorithms are advantageous in several respects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic without an additional stopping criterion; and (iii) the model structure selection is directly based on model generalization performance. Illustrative examples on regression and classification are used to demonstrate that the proposed algorithms are viable post-processing methods for pruning a model to gain extra sparsity and improved generalization.
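A simplified sketch of the backward elimination idea for the regression case, with the orthogonalised recursive updates replaced by direct recomputation of the analytic LOO mean-square error (all function names illustrative), is:

```python
# Simplified sketch: repeatedly delete the regressor whose removal yields the
# lowest analytic LOO mean-square error, stopping when deletion no longer helps.
import numpy as np

def loo_mse(Phi, y):
    """Analytic LOO mean-square error of a least-squares fit y ~ Phi @ theta."""
    H = Phi @ np.linalg.pinv(Phi)               # hat matrix Phi (Phi^T Phi)^-1 Phi^T
    e = y - H @ y
    return np.mean((e / (1.0 - np.diag(H))) ** 2)

def backward_eliminate(Phi, y, min_terms=1):
    """Prune columns of Phi while the analytic LOO MSE keeps improving."""
    cols = list(range(Phi.shape[1]))
    best = loo_mse(Phi, y)
    while len(cols) > min_terms:
        trial = {c: loo_mse(Phi[:, [k for k in cols if k != c]], y) for c in cols}
        c_star = min(trial, key=trial.get)
        if trial[c_star] >= best:               # stop when deletion no longer helps
            break
        cols.remove(c_star)
        best = trial[c_star]
    return cols, best

rng = np.random.default_rng(2)
X = rng.normal(size=(80, 8))
y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + 0.1 * rng.normal(size=80)   # only two useful terms
kept, score = backward_eliminate(X, y)
print("kept regressors:", kept, "LOO MSE:", round(score, 4))
```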
Abstract:
This essay explores how The Truman Show, Peter Weir’s film about a television show, deserves more sustained analysis than it has received since its release in 1998. I will argue that The Truman Show problematizes the binary oppositions of cinema/television, disruption/stability, reality/simulation and outside/inside that structure it. The Truman Show proposes that binary oppositions such as outside/inside exist in a mutually implicating relationship. This deconstructionist strategy not only questions the film’s critical position, but also enables a reflection on the very status of film analysis itself.
Abstract:
Demand for local food in the United States has significantly increased over the last decade. In an attempt to understand the drivers of this demand and how they have changed over time, we investigate the literature on organic and local foods over the last few decades. We focus our review on studies that allow comparison of characteristics now associated with both local and organic food. We summarize the major findings of these studies and their implications for understanding drivers of local food demand. Prior to the late 1990s, most studies failed to consider factors now associated with local food, and the few that included these factors found very little support for them. In many cases, the lines between local and organic were blurred. Coincident with the development of federal organic food standards, studies began to find comparatively more support for local food as distinct and separate from organic food. Our review uncovers a distinct turn in the demand for local and organic food. Before the federal organic standards, organic food was linked to small farms, animal welfare, deep sustainability, community support, and many other factors that are not associated with most organic foods today. Based on our review, we argue that demand for local food arose largely in response to corporate cooptation of the organic food market and the arrival of “organic lite.” This important shift in consumer preferences away from organic and toward local food has broad implications for the environment and society. If these patterns of consumer preferences prove to be sustainable, producers, activists, and others should be aware of the implications that these trends have for the food system at large.