16 results for Non-commercial film distribution
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
A field trial was undertaken to determine the influence of four commercially available film-forming polymers (Bond [alkyl phenyl hydroxyl polyoxyethylene], Newman Crop Spray 11E™ [paraffinic oil], Nu-Film P [poly-1-p menthene], and Spray Gard [di-1-p menthene]) on reducing salt spray injury in two woody species, evergreen oak (Quercus ilex L.) and laurel (Prunus laurocerasus L.). Irrespective of species or concentration applied (1% or 2%), the film-forming polymers Nu-Film P and Spray Gard did not provide any significant degree of protection against salt spray damage, as measured by leaf chlorophyll concentrations, photosynthetic efficiency, visual leaf necrosis, foliar sodium and chloride content, and growth (height, leaf area). The film-forming polymer Newman Crop Spray 11E™ provided only 1 week of protection against salt spray injury. The film-forming polymer Bond provided a significant (P < 0.05) degree of protection against salt spray injury 3 months after application, as manifested by higher leaf chlorophyll content, photosynthetic efficiency, height and leaf area, and lower visual leaf necrosis and foliar Na and Cl content compared with nontreated controls. In conclusion, the results indicate that application of a suitable film-forming polymer can provide a significant degree of protection, lasting up to 3 months, against salt spray injury in evergreen oak and laurel. Results also indicate that, when applied as 1% or 2% solutions, the polymers caused no problems of phytotoxicity or rapid degradation on the leaf surface.
Abstract:
Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in parallel data mining algorithms and, in particular, in the k-means algorithm for cluster analysis. In the straightforward parallel formulation of the k-means algorithm, data and computation loads are uniformly distributed over the processing nodes. This approach has excellent load balancing characteristics that may suggest it could scale up to large and extreme-scale parallel computing systems. However, at each iteration step the algorithm requires a global reduction operation, which hinders the scalability of the approach. This work studies a different parallel formulation of the algorithm in which the requirement of global communication is removed, while maintaining the same deterministic nature as the centralised algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
Abstract:
We consider the comparison of two formulations in terms of average bioequivalence using the 2 × 2 cross-over design. In a bioequivalence study, the primary outcome is a pharmacokinetic measure, such as the area under the plasma concentration by time curve, which is usually assumed to have a lognormal distribution. The criterion typically used for claiming bioequivalence is that the 90% confidence interval for the ratio of the means should lie within the interval (0.80, 1.25), or equivalently that the 90% confidence interval for the difference in the means on the natural log scale should be within the interval (-0.2231, 0.2231). We compare the gold standard method for calculation of the sample size based on the non-central t distribution with those based on the central t and normal distributions. In practice, the differences between the various approaches are likely to be small. Further approximations to the power function are sometimes used to simplify the calculations. These approximations should be used with caution, because the sample size required for a desirable level of power might be under- or overestimated compared to the gold standard method. However, in some situations the approximate methods produce very similar sample sizes to the gold standard method. Copyright © 2005 John Wiley & Sons, Ltd.
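To make the power calculation concrete, the following is a minimal sketch of a non-central-t computation for the two one-sided tests in a 2 × 2 cross-over. It is not taken from the paper: the within-subject CV, the geometric mean ratio and the target power are illustrative assumptions, and this form treats the two test statistics separately, so it is itself only an approximation to the exact power.

```python
from math import log, sqrt
from scipy import stats

def tost_power(n_total, cv, gmr=0.95, alpha=0.05):
    """Approximate power of the two one-sided tests for a 2x2 cross-over."""
    lower, upper = log(0.80), log(1.25)      # +/- 0.2231 on the natural log scale
    sigma_w = sqrt(log(cv ** 2 + 1.0))       # within-subject SD on the log scale
    se = sigma_w * sqrt(2.0 / n_total)       # SE of the treatment difference
    df = n_total - 2
    delta = log(gmr)                         # assumed true difference on the log scale
    tcrit = stats.t.ppf(1.0 - alpha, df)
    ncp_lo = (delta - lower) / se
    ncp_hi = (delta - upper) / se
    # non-central-t approximation to the probability that both tests reject
    power = stats.nct.cdf(-tcrit, df, ncp_hi) - stats.nct.cdf(tcrit, df, ncp_lo)
    return max(power, 0.0)

# e.g. smallest even total sample size giving at least 80% power at CV = 25%
n = 4
while tost_power(n, 0.25) < 0.80:
    n += 2
print(n, round(tost_power(n, 0.25), 3))
```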
Abstract:
Rainfall can be modeled as a spatially correlated random field superimposed on a background mean value; therefore, geostatistical methods are appropriate for the analysis of rain gauge data. Nevertheless, there are certain typical features of these data that must be taken into account to produce useful results, including the generally non-Gaussian mixed distribution, the inhomogeneity and low density of observations, and the temporal and spatial variability of spatial correlation patterns. Many studies show that rigorous geostatistical analysis performs better than other available interpolation techniques for rain gauge data. Important elements are the use of climatological variograms and the appropriate treatment of rainy and nonrainy areas. Benefits of geostatistical analysis for rainfall include ease of estimating areal averages, estimation of uncertainties, and the possibility of using secondary information (e.g., topography). Geostatistical analysis also facilitates the generation of ensembles of rainfall fields that are consistent with a given set of observations, allowing for a more realistic exploration of errors and their propagation in downstream models, such as those used for agricultural or hydrological forecasting. This article provides a review of geostatistical methods used for kriging, exemplified where appropriate by daily rain gauge data from Ethiopia.
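As an illustration of the kriging step discussed above, the following is a minimal ordinary-kriging sketch for estimating rainfall at an ungauged location from surrounding gauges. It is not taken from the article: the exponential variogram and its nugget, sill and range values are illustrative assumptions, and in practice these parameters would be estimated from the gauge data (for example via a climatological variogram).

```python
import numpy as np

def exp_variogram(h, nugget=0.1, sill=1.0, rng=50.0):
    # exponential variogram model; parameter values are illustrative only
    return nugget + (sill - nugget) * (1.0 - np.exp(-h / rng))

def ordinary_kriging(xy_obs, z_obs, xy_new):
    """Estimate the value at xy_new from gauges at xy_obs with values z_obs."""
    n = len(z_obs)
    # gauge-to-gauge distances and semivariances
    d = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d)
    np.fill_diagonal(A[:n, :n], 0.0)   # gamma(0) = 0 by definition
    A[:n, n] = 1.0                     # unbiasedness constraint
    A[n, :n] = 1.0
    # gauge-to-target semivariances
    b = np.append(exp_variogram(np.linalg.norm(xy_obs - xy_new, axis=-1)), 1.0)
    w = np.linalg.solve(A, b)          # kriging weights plus Lagrange multiplier
    estimate = w[:n] @ z_obs
    variance = w @ b                   # kriging (estimation) variance
    return estimate, variance

# e.g. three gauges around a target location (coordinates in km, rainfall in mm)
xy = np.array([[0.0, 0.0], [10.0, 2.0], [3.0, 12.0]])
z = np.array([5.2, 0.0, 12.7])
est, var = ordinary_kriging(xy, z, np.array([5.0, 5.0]))
```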
Abstract:
There are competing theoretical expectations and conflicting empirical results concerning the impact of partisanship on spending on active labour market policies (ALMPs). This paper argues that one should distinguish between different ALMPs. Employment incentives and rehabilitation programmes incentivize the unemployed to accept jobs. Direct job creation reduces the supply of labour by creating non-commercial jobs. Training schemes raise the human capital of the unemployed. Using regression analysis this paper shows that the positions of political parties towards these three types of ALMPs are different. Party preferences also depend on the welfare regime in which parties are located. In Scandinavia, left-wing parties support neither employment incentives nor direct job creation schemes. In continental and Liberal welfare regimes, left-wing parties oppose employment incentives and rehabilitation programmes to a lesser extent and they support direct job creation. There is no impact of partisanship on training. These results reconcile the previously contradictory findings concerning the impact of the Left on ALMPs.
Abstract:
Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in iterative parallel data mining algorithms. In particular, the analysis focuses on one of the most influential and popular data mining methods, the k-means algorithm for cluster analysis. The straightforward parallel formulation of the k-means algorithm requires a global reduction operation at each iteration step, which hinders its scalability. This work studies a different parallel formulation of the algorithm in which the requirement of global communication can be relaxed while still providing the exact solution of the centralised k-means algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error which allows a further reduction of the communication costs.
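For orientation, the following is a minimal sketch of the straightforward parallel formulation referred to above, with the per-iteration global reduction made explicit. The use of mpi4py and the function names are illustrative assumptions; this is not the authors' implementation, and it does not include the non-uniform, tree-based decomposition they propose.

```python
import numpy as np
from mpi4py import MPI

def parallel_kmeans(local_points, k, iters=20):
    comm = MPI.COMM_WORLD
    # rank 0 chooses the initial centroids and broadcasts them to all nodes
    centroids = comm.bcast(local_points[:k].copy() if comm.rank == 0 else None, root=0)
    for _ in range(iters):
        # assignment step: nearest centroid for each locally held point
        dists = np.linalg.norm(local_points[:, None, :] - centroids[None, :, :], axis=-1)
        labels = dists.argmin(axis=1)
        # local partial sums and counts per cluster
        sums = np.zeros_like(centroids)
        counts = np.zeros(k)
        for j in range(k):
            members = local_points[labels == j]
            sums[j] = members.sum(axis=0)
            counts[j] = len(members)
        # the global reduction that limits scalability in this formulation
        tot_sums = np.empty_like(sums)
        tot_counts = np.empty_like(counts)
        comm.Allreduce(sums, tot_sums, op=MPI.SUM)
        comm.Allreduce(counts, tot_counts, op=MPI.SUM)
        centroids = tot_sums / np.maximum(tot_counts, 1)[:, None]
    return centroids
```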
Abstract:
By the mid-1930s the major Hollywood studios had developed extensive networks of distribution subsidiaries across five continents. This article focuses on the operation of American film distributors in Australia – one of Hollywood's largest foreign markets. Drawing on two unique primary datasets, the article compares and investigates film distribution in Sydney's first-run and suburban-run markets. It finds that the subsidiaries of US film companies faced a greater liability of foreignness in the city centre market than in the suburban one. Our data support the argument that film audiences in local or suburban cinema markets were more receptive to Hollywood entertainment than those in metropolitan centres.
Abstract:
Massive Open Online Courses (MOOCs) are a new addition to open educational provision. They are offered mainly by prestigious universities on various commercial and non-commercial MOOC platforms, allowing anyone who is interested to experience the world-class teaching practiced in these universities. MOOCs have attracted wide interest from around the world. However, learner demographics in MOOCs suggest that some demographic groups are underrepresented. At present, MOOCs seem to be better serving the continuous professional development sector.
Abstract:
The abundance and distribution of coccinellids in non-crop habitats were studied using removal sampling and visual observation. Coccinellids were most frequently found in grassland habitats. Coccinellid abundance appeared to be most strongly correlated with the percentage ground cover of thistle, grasses and nettles. The most commonly collected coccinellids were Coccinella septempunctata and Adalia bipunctata, comprising 60% and 35% of the catches respectively. Most coccinellids were found on Rubus spp., with nettles (Urtica dioica) and grasses being the next most favoured plant species. Adalia bipunctata was the most commonly found coccinellid species on nettles and birch (Betula spp.), whereas C. septempunctata was the most commonly found species on grasses, Rubus spp. and oak (Quercus spp.). These results are discussed in light of current thinking on the importance of "island" habitats as part of an integrated pest management programme.
Abstract:
The objective of this study was to determine the concentration of total selenium (Se), and the proportions of total Se present as selenomethionine (SeMet) and selenocysteine (SeCys), in the tissues of female turkeys offered diets containing graded additions of selenium-enriched yeast (SY) or sodium selenite (SS). Oxidative stability and tissue glutathione peroxidase (GSH-Px) activity of breast and thigh muscle were assessed at 0 and 10 days post mortem. A total of 216 female turkey poults were enrolled in the study. A total of 24 birds were euthanized at the start of the study and samples of blood, breast, thigh, heart, liver, kidney and gizzard were collected for determination of total Se. Remaining birds were blocked by live weight and randomly allocated to one of four dietary treatments (n = 48 birds/treatment) that differed either in Se source (SY v. SS) or dose (Con [0.2 mg/kg total Se], SY-L and SS-L [0.3 mg/kg total Se as SY and SS, respectively] and SY-H [0.45 mg total Se/kg]). Following 42 and 84 days of treatment, 24 birds per treatment were euthanized and samples of blood, breast, thigh, heart, liver, kidney and gizzard were retained for determination of total Se and the proportion of total Se present as SeMet or SeCys. Whole blood GSH-Px activity was determined at each time point. Tissue GSH-Px activity and thiobarbituric acid reactive substances were determined in breast and thigh tissue at the end of the study. There were responses (P < 0.001) in all tissues to the graded addition of dietary Se, although rates of accumulation were highest in birds offered SY. There were notable differences between tissue types and treatments in the distribution of SeMet and SeCys, and in the activity of tissue and erythrocyte GSH-Px (P < 0.05). SeCys was the predominant form of Se in visceral tissue and SeMet the predominant form in breast tissue. SeCys contents were greater in thigh than in breast tissue. Muscle tissue GSH-Px activities mirrored SeCys contents. Despite treatment differences in tissue GSH-Px activity, there were no effects of treatment on any meat quality parameter.
Abstract:
The rapid growth of non-listed real estate funds over the last several years has contributed towards establishing this sector as a major investment vehicle for gaining exposure to commercial real estate. Academic research has not kept up with this development, however, as there are still only a few published studies on non-listed real estate funds. This paper aims to identify the factors driving total returns over a seven-year period. Influential factors tested in our analysis include the weighted underlying direct property returns in each country and sector, as well as fund size, investment style, gearing and the distribution yield. Furthermore, we analyze the interaction of non-listed real estate funds with the performance of the overall economy and that of competing asset classes, and find that lagged GDP growth and stock market returns, as well as contemporaneous government bond rates, are significant and positive predictors of annual fund performance.
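A minimal sketch of the kind of regression described above is given below; the column names, the one-year lag and the plain OLS specification are illustrative assumptions rather than the paper's exact model.

```python
import pandas as pd
import statsmodels.api as sm

def fund_return_regression(df):
    # df columns assumed: fund_return, gdp_growth, stock_return, bond_rate (annual data)
    X = pd.DataFrame({
        "gdp_growth_lag1": df["gdp_growth"].shift(1),     # lagged GDP growth
        "stock_return_lag1": df["stock_return"].shift(1), # lagged stock market return
        "bond_rate": df["bond_rate"],                     # contemporaneous bond rate
    })
    X = sm.add_constant(X)
    y = df["fund_return"]
    return sm.OLS(y, X, missing="drop").fit()
```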
Abstract:
This paper reviews the literature on the distribution of commercial real estate returns. There is growing evidence that the assumption of normality in returns is not safe. Distributions are found to be peaked, fat-tailed and, tentatively, skewed. There is some evidence of compound distributions and non-linearity. Publicly traded real estate assets (such as property company or REIT shares) behave in a fashion more similar to other common stocks. However, as in equity markets, it would be unwise to assume normality uncritically. Empirical evidence for UK real estate markets is obtained by applying distribution-fitting routines to IPD Monthly Index data for the aggregate index and selected sub-sectors. It is clear that normality is rejected in most cases. It is often argued that observed differences in real estate returns are a measurement issue resulting from appraiser behaviour. However, unsmoothing the series does not assist in modelling returns. A large proportion of returns are close to zero. This would be characteristic of a thinly traded market where new information arrives infrequently. Analysis of quarterly data suggests that, over longer trading periods, return distributions may conform more closely to those found in other asset markets. These results have implications for the formulation and implementation of a multi-asset portfolio allocation strategy.
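As a concrete illustration of the kind of distribution check reported above, the following minimal sketch computes skewness, excess kurtosis and a Jarque-Bera normality test for a monthly return series. It is an assumed example, not the paper's fitting routine, and the IPD index data themselves are not reproduced here.

```python
import numpy as np
from scipy import stats

def describe_returns(returns):
    """Summary statistics and a normality test for a monthly total-return series."""
    r = np.asarray(returns, dtype=float)
    jb_stat, jb_p = stats.jarque_bera(r)
    return {
        "mean": r.mean(),
        "std": r.std(ddof=1),
        "skewness": stats.skew(r),
        "excess_kurtosis": stats.kurtosis(r),  # > 0 indicates peaked, fat-tailed returns
        "jarque_bera_p": jb_p,                 # a small p-value rejects normality
    }
```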
Abstract:
Although financial theory rests heavily upon the assumption that asset returns are normally distributed, value indices of commercial real estate display significant departures from normality. In this paper, we apply and compare the properties of two recently proposed regime switching models for value indices of commercial real estate in the US and the UK, both of which relax the assumption that observations are drawn from a single distribution with constant mean and variance. Statistical tests of the models' specification indicate that the Markov switching model is better able to capture the non-stationary features of the data than the threshold autoregressive model, although both provide superior descriptions of the data compared with models that allow for only one state. Our results have several implications for theoretical models and empirical research in finance.
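The following is a minimal sketch of fitting a two-regime Markov switching autoregression with regime-dependent mean and variance to a value-index return series using statsmodels; the specification (AR order, switching terms) is an illustrative assumption rather than the authors' exact model.

```python
import numpy as np
import statsmodels.api as sm

def fit_markov_switching(returns, k_regimes=2):
    """Two-regime Markov switching AR(1) with switching mean and variance."""
    mod = sm.tsa.MarkovAutoregression(
        np.asarray(returns, dtype=float),
        k_regimes=k_regimes,
        order=1,
        switching_ar=True,
        switching_variance=True,
    )
    return mod.fit()

# res = fit_markov_switching(index_returns)   # index_returns: assumed return series
# res.summary()                               # regime-specific parameter estimates
# res.smoothed_marginal_probabilities         # probability of each regime over time
```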