890 results for phi value analysis


Relevance:

30.00%

Publisher:

Abstract:

The issue of levels of participation in post-compulsory education has been emphasised by the current policy initiatives to increase the age to which some form of participation is compulsory. One of the acknowledged weaknesses of research in the field of children's intentions with regard to participation is the lack of longitudinal data. This paper offers a longitudinal analysis using the Youth Survey from the British Household Panel Survey. The results show that most children can express intentions with regard to future participation very early in their secondary school careers and that these intentions are good predictors of actual behaviour five years later. Intentions to stay on are more consistent than intentions to leave and most children who finally leave at 16 have at some point said they want to remain in education post-16. The strongest association with participation levels is attainment at GCSE. However, there are also influences of gender and parental background and these remain, even after attainment is held constant. The results show the value of focusing on intentions for participation at a very early stage of children's school careers and also the importance of current attempts to reform curriculum and assessment for the 14-19 age group.

Relevance:

30.00%

Publisher:

Abstract:

Background: Cannabinoids from cannabis (Cannabis sativa) are anti-inflammatory and have inhibitory effects on the proliferation of a number of tumorigenic cell lines, some of which are mediated via cannabinoid receptors. Cannabinoid (CB) receptors are present in human skin and anandamide, an endogenous CB receptor ligand, inhibits epidermal keratinocyte differentiation. Psoriasis is an inflammatory disease also characterised in part by epidermal keratinocyte hyper-proliferation. Objective: We investigated the plant cannabinoids Delta-9 tetrahydrocannabinol, cannabidiol, cannabinol and cannabigerol for their ability to inhibit the proliferation of a hyper-proliferating human keratinocyte cell line and for any involvement of cannabinoid receptors. Methods: A keratinocyte proliferation assay was used to assess the effect of treatment with cannabinoids. Cell integrity and metabolic competence were confirmed using lactate dehydrogenase and adenosine triphosphate assays. To determine the involvement of the receptors, specific agonists and antagonists were used in conjunction with some phytocannabinoids. Western blot and RT-PCR analysis confirmed the presence of CB1 and CB2 receptors. Results: The cannabinoids tested all inhibited keratinocyte proliferation in a concentration-dependent manner. The selective CB2 receptor agonists JWH015 and BML190 elicited only partial inhibition; the non-selective CB agonist HU210 produced a concentration-dependent response; and the activity of these agonists was not blocked by either CB1 or CB2 antagonists. Conclusion: The results indicate that while CB receptors may have a circumstantial role in keratinocyte proliferation, they do not contribute significantly to this process. Our results show that cannabinoids inhibit keratinocyte proliferation, and therefore support a potential role for cannabinoids in the treatment of psoriasis. (c) 2006 Japanese Society for Investigative Dermatology. Published by Elsevier Ireland Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Robot-mediated therapies offer a new approach to neurorehabilitation. This paper analyses the Fugl-Meyer data from the Gentle/S project and finds that the two intervention phases (sling suspension and robot-mediated therapy) have approximately equal value to the further recovery of chronic stroke subjects (on average 27 months post-stroke). Both sling suspension and robot-mediated interventions show a recovery over baseline, and further work is needed to establish the common factors in treatment, and to establish intervention protocols for each that will give individual subjects a maximum level of recovery.

Relevance:

30.00%

Publisher:

Abstract:

We introduce transreal analysis as a generalisation of real analysis. We find that the generalisation of the real exponential and logarithmic functions is well defined for all transreal numbers. Hence, we derive well defined values of all transreal powers of all non-negative transreal numbers. In particular, we find a well defined value for zero to the power of zero. We also note that the computation of products via the transreal logarithm is identical to the transreal product, as expected. We then generalise all of the common, real, trigonometric functions to transreal functions and show that transreal (sin x)/x is well defined everywhere. This raises the possibility that transreal analysis is total, in other words, that every function and every limit is everywhere well defined. If so, transreal analysis should be an adequate mathematical basis for analysing the perspex machine - a theoretical, super-Turing machine that operates on a total geometry. We go on to dispel all of the standard counter "proofs" that purport to show that division by zero is impossible. This is done simply by carrying the proof through in transreal arithmetic or transreal analysis. We find that either the supposed counter proof has no content or else that it supports the contention that division by zero is possible. The supposed counter proofs rely on extending the standard systems in arbitrary and inconsistent ways and then showing, tautologously, that the chosen extensions are not consistent. This shows only that the chosen extensions are inconsistent and does not bear on the question of whether division by zero is logically possible. By contrast, transreal arithmetic is total and consistent so it defeats any possible "straw man" argument. Finally, we show how to arrange that a function has finite or else unmeasurable (nullity) values, but no infinite values. This arithmetical arrangement might prove useful in mathematical physics because it outlaws naked singularities in all equations.
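The total transreal division described above can be sketched in Python. This is an illustrative toy, not the paper's formalism: representing nullity by a sentinel string and the transreal infinities by IEEE floating-point infinities are my assumptions.

```python
# Sketch of total (everywhere-defined) transreal division, assuming the
# usual transreal conventions: a/0 = +infinity for a > 0, a/0 = -infinity
# for a < 0, and 0/0 = nullity (Phi, the unmeasurable transreal value).
INF = float("inf")
NULLITY = "Phi"  # sentinel for nullity; a real system would use a proper type

def transreal_div(a, b):
    """Division defined for every pair of transreal inputs."""
    if a == NULLITY or b == NULLITY:
        return NULLITY                  # nullity absorbs all operations
    if b == 0:
        if a == 0:
            return NULLITY              # 0/0 = nullity
        return INF if a > 0 else -INF   # a/0 = signed infinity
    return a / b                        # ordinary real division otherwise
```

Under these conventions every call returns a value, so no input raises a ZeroDivisionError; this is the sense in which the arithmetic is total.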

Relevance:

30.00%

Publisher:

Abstract:

Objective: This paper presents a detailed study of fractal-based methods for texture characterization of mammographic mass lesions and architectural distortion. The purpose of this study is to explore the use of fractal and lacunarity analysis for the characterization and classification of both tumor lesions and normal breast parenchyma in mammography. Materials and methods: We conducted comparative evaluations of five popular fractal dimension estimation methods for the characterization of the texture of mass lesions and architectural distortion. We applied the concept of lacunarity to the description of the spatial distribution of the pixel intensities in mammographic images. These methods were tested with a set of 57 breast masses and 60 normal breast parenchyma (dataset1), and with another set of 19 architectural distortions and 41 normal breast parenchyma (dataset2). Support vector machines (SVM) were used as a pattern classification method for tumor classification. Results: Experimental results showed that the fractal dimension of regions of interest (ROIs) depicting mass lesions and architectural distortion was statistically significantly lower than that of normal breast parenchyma for all five methods. Receiver operating characteristic (ROC) analysis showed that the fractional Brownian motion (FBM) method generated the highest area under the ROC curve (Az = 0.839 for dataset1 and 0.828 for dataset2) of the five methods for both datasets. Lacunarity analysis showed that the ROIs depicting mass lesions and architectural distortion had higher lacunarities than ROIs depicting normal breast parenchyma. The combination of FBM fractal dimension and lacunarity yielded higher Az values (0.903 and 0.875, respectively) than those based on any single feature alone for both datasets.
The application of the SVM improved the performance of the fractal-based features in differentiating tumor lesions from normal breast parenchyma by generating higher Az values. Conclusion: The FBM texture model is the most appropriate model for characterizing mammographic images because its self-affinity assumption is a better approximation. Lacunarity is an effective counterpart measure to the fractal dimension in texture feature extraction from mammographic images. The classification results obtained in this work suggest that the SVM is an effective method with great potential for classification in mammographic image analysis.
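The paper's preferred estimator is the FBM model, which is not reproduced here; as a generic illustration of fractal-dimension estimation, a box-counting sketch for a binary ROI mask might look like the following (the box sizes and the binary-mask input are simplifying assumptions):

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary image by box counting:
    the slope of log(count) versus log(1/size) approximates the dimension."""
    counts = []
    for s in sizes:
        # count boxes of side s containing at least one foreground pixel
        c = 0
        for i in range(0, mask.shape[0], s):
            for j in range(0, mask.shape[1], s):
                if mask[i:i + s, j:j + s].any():
                    c += 1
        counts.append(c)
    # linear fit in log-log space; the slope is the dimension estimate
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```

For a completely filled square the estimate is 2, as expected for a non-fractal planar region; lower values indicate sparser, more irregular texture.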

Relevance:

30.00%

Publisher:

Abstract:

A case study on the tendering process and cost/time performance of a public building project in Ghana is conducted. Competitive bids submitted by five contractors for the project, in which contractors were required to prepare their own quantities, were analyzed to compare differences in their pricing levels and risk/requirement perceptions. Queries sent to the consultants at the tender stage were also analyzed to identify the significant areas of concern to contractors in relation to the tender documentation. The five bidding prices were significantly different. The queries submitted for clarifications were significantly different, although a few were similar. Using a before-and-after experiment, the expected cost/time estimate at the start of the project was compared to the actual cost/time values, i.e. what happened in the actual construction phase. The analysis showed that the project exceeded its expected cost by 18% and its planned time by 210%. Variations and inadequate design were the major reasons. Following an exploration of these issues, an alternative tendering mechanism is recommended to clients. A shift away from the conventional approach of awarding work based on price, and serious consideration of alternative procurement routes can help clients in Ghana obtain better value for money on their projects.

Relevance:

30.00%

Publisher:

Abstract:

This study investigates the price effects of environmental certification on commercial real estate assets. It is argued that there are likely to be three main drivers of price differences between certified and noncertified buildings. These are additional occupier benefits, lower holding costs for investors and a lower risk premium. Drawing upon the CoStar database of U.S. commercial real estate assets, hedonic regression analysis is used to measure the effect of certification on both rent and price. The results suggest that, compared to buildings in the same submarkets, eco-certified buildings have both a rental and sale price premium.
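A minimal sketch of the hedonic approach described above, assuming a toy specification with only floor area and a certification dummy (the actual CoStar models control for many more building and submarket attributes):

```python
import numpy as np

def hedonic_premium(log_price, size, certified):
    """OLS of log price on size and a 0/1 certification dummy; the dummy's
    coefficient is the approximate percentage premium for certification."""
    X = np.column_stack([np.ones(len(size)), size, certified])
    beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
    return beta[2]  # coefficient on the certification dummy
```

With log prices on the left-hand side, a coefficient of, say, 0.1 on the dummy corresponds to roughly a 10% price premium for certified buildings, holding size constant.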

Relevance:

30.00%

Publisher:

Abstract:

We present an extensive thermodynamic analysis of a hysteresis experiment performed on a simplified yet Earth-like climate model. We slowly vary the solar constant by 20% around the present value and detect that for a large range of values of the solar constant the realization of snowball or of regular climate conditions depends on the history of the system. Using recent results on the global climate thermodynamics, we show that the two regimes feature radically different properties. The efficiency of the climate machine monotonically increases with decreasing solar constant in present climate conditions, whereas the opposite takes place in snowball conditions. Instead, entropy production is monotonically increasing with the solar constant in both branches of climate conditions, and its value is about four times larger in the warm branch than in the corresponding cold state. Finally, the degree of irreversibility of the system, measured as the fraction of excess entropy production due to irreversible heat transport processes, is much higher in the warm climate conditions, with an explosive growth in the upper range of the considered values of solar constants. Whereas in the cold climate regime a dominating role is played by changes in the meridional albedo contrast, in the warm climate regime changes in the intensity of latent heat fluxes are crucial for determining the observed properties. This substantiates the importance of addressing correctly the variations of the hydrological cycle in a changing climate. An interpretation of the climate transitions at the tipping points based upon macro-scale thermodynamic properties is also proposed. Our results support the adoption of a new generation of diagnostic tools based on the second law of thermodynamics for auditing climate models and outline a set of parametrizations to be used in conceptual and intermediate-complexity models or for the reconstruction of the past climate conditions. 
Copyright © 2010 Royal Meteorological Society

Relevance:

30.00%

Publisher:

Abstract:

Rainfall can be modeled as a spatially correlated random field superimposed on a background mean value; therefore, geostatistical methods are appropriate for the analysis of rain gauge data. Nevertheless, there are certain typical features of these data that must be taken into account to produce useful results, including the generally non-Gaussian mixed distribution, the inhomogeneity and low density of observations, and the temporal and spatial variability of spatial correlation patterns. Many studies show that rigorous geostatistical analysis performs better than other available interpolation techniques for rain gauge data. Important elements are the use of climatological variograms and the appropriate treatment of rainy and nonrainy areas. Benefits of geostatistical analysis for rainfall include ease of estimating areal averages, estimation of uncertainties, and the possibility of using secondary information (e.g., topography). Geostatistical analysis also facilitates the generation of ensembles of rainfall fields that are consistent with a given set of observations, allowing for a more realistic exploration of errors and their propagation in downstream models, such as those used for agricultural or hydrological forecasting. This article provides a review of geostatistical methods used for kriging, exemplified where appropriate by daily rain gauge data from Ethiopia.
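A minimal ordinary-kriging sketch under an assumed exponential variogram model (the sill and range values are placeholders; in practice a climatological variogram would be fitted to the gauge data, and rainy/nonrainy areas treated separately, as the review discusses):

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, sill=1.0, rng=50.0):
    """Ordinary kriging of one target point from scattered gauges,
    using an assumed exponential variogram model."""
    def gamma(h):  # exponential variogram: sill * (1 - exp(-h/range))
        return sill * (1.0 - np.exp(-h / rng))
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    # kriging system, augmented with a Lagrange multiplier row/column
    # to enforce that the weights sum to one (unbiasedness)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - xy0, axis=1))
    w = np.linalg.solve(A, b)
    return float(w[:n] @ z)  # kriged estimate at xy0
```

By construction, two gauges equidistant from the target receive equal weights, so the estimate reduces to their average; the same system also yields the kriging variance, which is the basis for the uncertainty estimates mentioned above.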

Relevance:

30.00%

Publisher:

Abstract:

The overall operation and internal complexity of a particular piece of production machinery can be depicted in terms of clusters of multidimensional points which describe the process states, the value in each point dimension representing a measured variable from the machinery. The paper describes a new cluster analysis technique for use with manufacturing processes, to illustrate how machine behaviour can be categorised and how regions of good and poor machine behaviour can be identified. The cluster algorithm presented is the novel mean-tracking algorithm, capable of locating N-dimensional clusters in a large data space in which a considerable amount of noise is present. Implementation of the algorithm on a real-world high-speed machinery application is described, with clusters being formed from machinery data to indicate machinery error regions and error-free regions. This analysis is seen to provide a promising step forward in the field of multivariable control of manufacturing systems.
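The mean-tracking algorithm itself is not specified in this summary; the following mean-shift-style sketch (my own simplification, with an assumed fixed search radius) illustrates the general idea of moving a candidate cluster centre to the local mean of nearby points in a noisy point cloud:

```python
import numpy as np

def mean_track(points, start, radius=1.0, iters=50):
    """Move a candidate centre to the mean of the points within `radius`,
    repeating until it stops moving; distant noise points are ignored."""
    centre = np.asarray(start, float)
    for _ in range(iters):
        near = points[np.linalg.norm(points - centre, axis=1) < radius]
        if len(near) == 0:
            break                    # no support: leave the centre as-is
        new = near.mean(axis=0)
        if np.allclose(new, centre):
            break                    # converged on a cluster centre
        centre = new
    return centre
```

Run from several seed points, converged centres can then be grouped to label regions of the state space as error-prone or error-free, in the spirit of the analysis described above.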

Relevance:

30.00%

Publisher:

Abstract:

The mean state, variability and extreme variability of the stratospheric polar vortices, with an emphasis on the Northern Hemisphere vortex, are examined using 2-dimensional moment analysis and Extreme Value Theory (EVT). The use of moments as an analysis tool gives rise to information about the vortex area, centroid latitude, aspect ratio and kurtosis. The application of EVT to these moment-derived quantities allows the extreme variability of the vortex to be assessed. The data used for this study are ECMWF ERA-40 potential vorticity fields on interpolated isentropic surfaces that range from 450K-1450K. Analyses show that the most extreme vortex variability occurs most commonly in late January and early February, consistent with when most planetary wave driving from the troposphere is observed. Composites around sudden stratospheric warming (SSW) events reveal that the moment diagnostics evolve in statistically different ways between vortex splitting events and vortex displacement events, in contrast to the traditional diagnostics. Histograms of the vortex diagnostics on the 850K (∼10hPa) surface over the 1958-2001 period are fitted with parametric distributions, and show that SSW events comprise the majority of data in the tails of the distributions. The distribution of each diagnostic is computed on various surfaces throughout the depth of the stratosphere, and shows that in general the vortex becomes more circular with higher filamentation at the upper levels. The Northern Hemisphere (NH) and Southern Hemisphere (SH) vortices are also compared through the analysis of their respective vortex diagnostics, and confirm that the SH vortex is less variable and lacks extreme events compared to the NH vortex. Finally, extreme value theory is used to statistically model the vortex diagnostics and make inferences about the underlying dynamics of the polar vortices.
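A sketch of the 2-dimensional moment diagnostics named above (centroid and aspect ratio from the first and second spatial moments of a vortex field); the Cartesian grid and non-negative field used here are simplifying assumptions, not the study's spherical-geometry treatment:

```python
import numpy as np

def moment_diagnostics(q, x, y):
    """Centroid and aspect ratio of a 2-D non-negative field q(y, x),
    from its zeroth, first and central second spatial moments."""
    X, Y = np.meshgrid(x, y)
    m00 = q.sum()
    xc, yc = (q * X).sum() / m00, (q * Y).sum() / m00   # centroid
    # central second moments
    mxx = (q * (X - xc) ** 2).sum() / m00
    myy = (q * (Y - yc) ** 2).sum() / m00
    mxy = (q * (X - xc) * (Y - yc)).sum() / m00
    # eigenvalues of the moment tensor give the principal axis lengths
    t, d = mxx + myy, mxx * myy - mxy ** 2
    lam1 = t / 2 + np.sqrt(t * t / 4 - d)
    lam2 = t / 2 - np.sqrt(t * t / 4 - d)
    return (xc, yc), np.sqrt(lam1 / lam2)  # aspect ratio >= 1
```

An aspect ratio near 1 indicates a circular vortex; large values flag the elongated shapes characteristic of splitting events, which is why this diagnostic separates splits from displacements.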

Relevance:

30.00%

Publisher:

Abstract:

The differential phase (ΦDP) measured by polarimetric radars is recognized to be a very good indicator of the path-integrated attenuation by rain. Moreover, if a linear relationship is assumed between the specific differential phase (KDP) and the specific attenuation (AH) and specific differential attenuation (ADP), then attenuation can easily be corrected. The coefficients of proportionality, γH and γDP, are, however, known to depend in rain upon drop temperature, drop shapes, drop size distribution, and the presence of large drops causing Mie scattering. In this paper, the authors extensively apply a physically based method, often referred to as the "Smyth and Illingworth constraint," which uses the constraint that the value of the differential reflectivity ZDR on the far side of the storm should be low to retrieve the γDP coefficient. More than 30 convective episodes observed by the French operational C-band polarimetric Trappes radar during two summers (2005 and 2006) are used to document the variability of γDP with respect to the intrinsic three-dimensional characteristics of the attenuating cells. The Smyth and Illingworth constraint could be applied to only 20% of all attenuated rays of the 2-yr dataset, so it cannot be considered the unique solution for attenuation correction in an operational setting, but it is useful for characterizing the properties of the strongly attenuating cells. The range of variation of γDP is shown to be extremely large, with minimal, maximal, and mean values equal to 0.01, 0.11, and 0.025 dB °−1, respectively. Coefficient γDP appears to be almost linearly correlated with the horizontal reflectivity (ZH), differential reflectivity (ZDR), specific differential phase (KDP), and correlation coefficient (ρHV) of the attenuating cells. The temperature effect is negligible with respect to that of the microphysical properties of the attenuating cells.
Unusually large values of γDP, above 0.06 dB °−1, often referred to as "hot spots," are reported for 15% (a nonnegligible figure) of the rays presenting a significant total differential phase shift (ΔϕDP > 30°). The corresponding strongly attenuating cells are shown to have extremely high ZDR (above 4 dB) and ZH (above 55 dBZ), very low ρHV (below 0.94), and high KDP (above 4° km−1). Analysis of 4 yr of observed raindrop spectra does not reproduce such low values of ρHV, suggesting that (wet) ice is likely to be present in the precipitation medium and responsible for the attenuation and high phase shifts. Furthermore, if melting ice is responsible for the high phase shifts, this suggests that KDP may not be uniquely related to rainfall rate but can result from the presence of wet ice. This hypothesis is supported by the analysis of the vertical profiles of horizontal reflectivity and the values of conventional probability of hail indexes.
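The linear correction implied by the assumed KDP-AH relationship can be sketched simply: since AH = γH KDP, the path-integrated attenuation up to any gate is γH ΦDP, so reflectivity is corrected by adding that amount. The default γH here is an assumed illustrative C-band figure, not a value from the paper, which stresses that these coefficients vary strongly between cells:

```python
def correct_attenuation(zh_dbz, phidp_deg, gamma_h=0.08):
    """Linear Phi_DP-based attenuation correction along one ray.
    A_H = gamma_H * K_DP implies a path-integrated attenuation of
    gamma_H * Phi_DP (dB) at each gate; gamma_h is in dB per degree."""
    return [z + gamma_h * p for z, p in zip(zh_dbz, phidp_deg)]
```

The same form with γDP and ZDR corrects differential reflectivity; the paper's contribution is precisely the retrieval and documented variability of these coefficients.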

Relevance:

30.00%

Publisher:

Abstract:

The principal aim of this research is to elucidate the factors driving the total rate of return of non-listed funds using a panel data analytical framework. In line with previous results, we find that core funds exhibit lower yet more stable returns than value-added and, in particular, opportunistic funds, both cross-sectionally and over time. After taking into account overall market exposure, as measured by weighted market returns, the excess returns of value-added and opportunity funds are likely to stem from: high leverage, high exposure to development, active asset management and investment in specialized property sectors. A random effects estimation of the panel data model largely confirms the findings obtained from the fixed effects model. Again, the country and sector property effect shows the strongest significance in explaining total returns. The stock market variable is negative, which hints at switching effects between competing asset classes. For opportunity funds, on average, the returns attributable to gearing are three times higher than those for value-added funds and over five times higher than for core funds. Overall, there is relatively strong evidence indicating that country and sector allocation, style, gearing and fund size combinations impact on the performance of unlisted real estate funds.
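The fixed-effects (within) estimator used in the panel framework can be sketched as follows for a single regressor; the variable names are illustrative, with each group standing in for one fund:

```python
import numpy as np

def within_estimator(y, x, group):
    """Fixed-effects (within) estimator: demean y and x inside each group
    (fund), removing fund-specific intercepts, then run pooled OLS on the
    demeaned data."""
    yd, xd = y.astype(float).copy(), x.astype(float).copy()
    for g in np.unique(group):
        m = group == g
        yd[m] -= yd[m].mean()
        xd[m] -= xd[m].mean()
    return float((xd @ yd) / (xd @ xd))  # slope on the demeaned data
```

Because demeaning absorbs any time-invariant fund characteristic, the slope is identified purely from within-fund variation; the random-effects estimator mentioned above instead treats the fund intercepts as draws from a distribution, and agreement between the two supports the model specification.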

Relevance:

30.00%

Publisher:

Abstract:

It is widely accepted that equity return volatility increases more following negative shocks than following positive shocks. However, much of value-at-risk (VaR) analysis relies on the assumption that returns are normally distributed (a symmetric distribution). This article considers the effect of asymmetries on the evaluation and accuracy of VaR by comparing estimates based on various models.
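The contrast between the normality assumption and an asymmetry-respecting alternative can be sketched by comparing parametric VaR with historical-simulation VaR; the specific models compared in the article are not named here, so this is a generic illustration:

```python
import numpy as np
from statistics import NormalDist

def var_normal(returns, alpha=0.05):
    """Parametric VaR assuming normally distributed (symmetric) returns:
    the alpha-quantile of a fitted normal, reported as a positive loss."""
    mu, sigma = np.mean(returns), np.std(returns)
    return -(mu + NormalDist().inv_cdf(alpha) * sigma)

def var_historical(returns, alpha=0.05):
    """Historical-simulation VaR: the empirical alpha-quantile, with no
    distributional assumption, so asymmetry in the data is preserved."""
    return -float(np.quantile(returns, alpha))
```

When the return distribution is negatively skewed, as the asymmetric-volatility evidence suggests, the two estimates diverge, which is the kind of discrepancy the article's model comparison is designed to expose.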

Relevance:

30.00%

Publisher:

Abstract:

Classical risk assessment approaches for animal diseases are influenced by the probability of release, exposure and consequences of a hazard affecting a livestock population. Once a pathogen enters into domestic livestock, potential risks of exposure and infection both to animals and people extend through a chain of economic activities related to producing, buying and selling of animals and products. Therefore, in order to understand the economic drivers of animal diseases in different ecosystems and to come up with effective and efficient measures to manage disease risks for a country or region, the entire value chain and related markets for animals and products need to be analysed to arrive at practical and cost-effective risk management options agreed by the actors and players on those value chains. Value chain analysis enriches disease risk assessment by providing a framework for interdisciplinary collaboration, which seems to be in increasing demand for problems concerning infectious livestock diseases. The best way to achieve this is to ensure that veterinary epidemiologists and social scientists work together throughout the process at all levels.