85 results for Latent Threshold


Relevance: 20.00%

Abstract:

In this paper we address issues relating to vulnerability to economic exclusion and levels of economic exclusion in Europe. We do so by applying latent class models to data from the European Community Household Panel for thirteen countries. This approach allows us to distinguish between vulnerability to economic exclusion and exposure to multiple deprivation at a particular point in time. The results of our analysis confirm that in every country it is possible to distinguish between a vulnerable and a non-vulnerable class. Association between income poverty, life-style deprivation and subjective economic strain is accounted for by allocating individuals to the categories of this latent variable. The size of the vulnerable class varies across countries in line with expectations derived from welfare regime theory. Between-class differentiation is weakest in social democratic regimes, but otherwise the pattern of differentiation is remarkably similar. The key discriminatory factor is life-style deprivation, followed by income and economic strain. Social class and employment status are powerful predictors of latent class membership in all countries, but the strength of these relationships varies across welfare regimes. Individual biography and life events are also related to vulnerability to economic exclusion. However, there is no evidence that they account for any significant part of the socio-economic structuring of vulnerability, and no support is found for the hypothesis that social exclusion has come to transcend class boundaries and become a matter of individual biography. The extent of socio-economic structuring does, however, vary substantially across welfare regimes. Levels of economic exclusion, in the sense of current exposure to multiple deprivation, also vary systematically by welfare regime and social class. Taking both vulnerability to economic exclusion and levels of exclusion into account suggests that care should be exercised in moving from evidence on the dynamic nature of poverty and economic exclusion to arguments relating to the superiority of selective over universal social policies.
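A minimal sketch of the latent class idea used here: fitting a two-class model to three binary indicators (income poverty, life-style deprivation, economic strain) with the EM algorithm. All data, class shares and item probabilities below are simulated; this is not the authors' ECHP analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated binary indicators: income poverty, deprivation, strain
# (a hypothetical stand-in for the ECHP items used in the paper).
n = 5000
true_class = rng.random(n) < 0.25                       # vulnerable share
p_vuln, p_safe = np.array([0.6, 0.7, 0.5]), np.array([0.1, 0.15, 0.1])
X = (rng.random((n, 3)) < np.where(true_class[:, None], p_vuln, p_safe)).astype(int)

# EM for a 2-class latent class model assuming conditional independence.
# Note: which class ends up "vulnerable" depends on the initialization.
pi = 0.5                                                # P(class 0)
theta = np.array([[0.5, 0.5, 0.5], [0.2, 0.2, 0.2]])    # item probs per class
for _ in range(200):
    # E-step: posterior class membership given the responses
    like = np.stack(
        [(theta[k] ** X * (1 - theta[k]) ** (1 - X)).prod(axis=1) for k in (0, 1)],
        axis=1,
    ) * np.array([pi, 1 - pi])
    resp = like / like.sum(axis=1, keepdims=True)
    # M-step: update class share and item probabilities
    pi = resp[:, 0].mean()
    theta = (resp.T @ X) / resp.sum(axis=0)[:, None]

print(f"estimated vulnerable-class share: {pi:.3f}")
print("item probabilities by class:\n", theta.round(3))
```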

Relevance: 20.00%

Abstract:

A criterion is derived for delamination onset in transversely isotropic laminated plates under small mass, high velocity impact. The resulting delamination threshold load is about 21% higher than the corresponding quasi-static threshold load. A closed form approximation for the peak impact load is then used to predict the delamination threshold velocity. The theory is validated for a range of test cases by comparison with 3D finite element simulation using LS-DYNA and a newly developed interface element to model delamination onset and growth. The predicted delamination threshold loads and velocities are in very good agreement with the finite element simulations. Good agreement is also shown in a comparison with published experimental results. In contrast to quasi-static impacts, delamination growth occurs under a rapidly decreasing load. Inclusion of finite thickness effects and a proper description of the contact stiffness are found to be vital for accurate prediction of the delamination threshold velocity.
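For orientation only: a widely cited closed-form quasi-static delamination threshold for isotropic plates is P_c = sqrt(8 pi^2 E h^3 G_IIc / (9 (1 - nu^2))) (the Davies-Zhang form). Assuming that expression as the quasi-static reference, the abstract's small-mass criterion (about 21% higher) can be sketched numerically; the material values below are placeholders, not taken from the paper.

```python
import math

# Illustrative properties (placeholder values, not from the paper)
E = 70e9        # Young's modulus, Pa
nu = 0.3        # Poisson's ratio
h = 2e-3        # laminate thickness, m
G_IIc = 800.0   # mode II fracture toughness, J/m^2

# Quasi-static delamination threshold load (Davies-Zhang form,
# assumed here as the quasi-static reference criterion)
P_qs = math.sqrt(8 * math.pi**2 * E * h**3 * G_IIc / (9 * (1 - nu**2)))

# Per the abstract: small-mass/high-velocity threshold is ~21% higher
P_dyn = 1.21 * P_qs

print(f"quasi-static threshold: {P_qs/1e3:.2f} kN")
print(f"dynamic threshold (~21% higher): {P_dyn/1e3:.2f} kN")
```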

Relevance: 20.00%

Abstract:

BACKGROUND: To compare the ability of Glaucoma Progression Analysis (GPA) and Threshold Noiseless Trend (TNT) programs to detect visual-field deterioration.

METHODS: Patients with open-angle glaucoma followed for a minimum of 2 years and with a minimum of seven reliable visual fields were included. Progression was assessed subjectively by four masked glaucoma experts and compared with GPA and TNT results. Each case was judged to be stable, deteriorated, or suspicious of deterioration.

RESULTS: A total of 56 eyes of 42 patients were followed with a mean of 7.8 (SD 1.0) tests over an average of 5.5 (1.04) years. Interobserver agreement on progression was good (mean kappa = 0.57). Progression was detected in 10-19 eyes by the individual experts, in six by GPA and in 24 by TNT. Using the consensus expert opinion as the gold standard (progression detected by all four clinicians), the GPA sensitivity and specificity were 75% and 83%, respectively, while the TNT sensitivity and specificity were 100% and 77%, respectively.

CONCLUSION: TNT showed greater concordance with the experts than GPA in the detection of visual-field deterioration. GPA showed a high specificity but lower sensitivity, mainly detecting cases of high focality and pronounced mean defect slopes.
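For readers unfamiliar with the metrics, sensitivity and specificity follow directly from a 2x2 confusion table. A minimal sketch; the counts below are hypothetical, chosen only to be consistent with the reported GPA figures (56 eyes, 4 consensus progressors), not reconstructed from the study:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for a progression-detection program judged
# against an expert-consensus gold standard (illustration only).
sens, spec = sensitivity_specificity(tp=3, fn=1, tn=43, fp=9)
print(f"sensitivity: {sens:.0%}, specificity: {spec:.0%}")  # 75%, 83%
```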

Relevance: 20.00%

Abstract:

Purpose: The authors sought to quantify neighboring and distant interpoint correlations of threshold values within the visual field in patients with glaucoma.

Methods: Visual fields of patients with confirmed or suspected glaucoma were analyzed (n = 255). One eye per patient was included. Patients were examined using program 32 of the Octopus 1-2-3. Linear regression analysis between each location and the remaining points of the visual field was performed, and the correlation coefficient was calculated. The degree of correlation was categorized as high (r > 0.66), moderate (0.66 ≥ r > 0.33), or low (r ≤ 0.33). The standard error of threshold estimation was calculated.

Results: Most locations of the visual field had high or moderate correlations with neighboring points and with distant locations corresponding to the same nerve fiber bundle. Locations of the visual field had low correlations with those of the opposite hemifield, with the exception of locations temporal to the blind spot. The standard error of threshold estimation increased from 0.6 to 0.9 dB as r decreased by 0.1.

Conclusion: Locations of the visual field have the highest interpoint correlations with neighboring points and with distant points in areas corresponding to the distribution of the retinal nerve fiber layer. The quantification of interpoint correlations may be useful in the design and interpretation of visual field tests in patients with glaucoma.
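The interpoint analysis amounts to correlating threshold values between every pair of test locations across patients. A compact sketch with synthetic data; the high/moderate/low cut-offs follow the abstract, and the 76-location grid is an assumption about the Octopus program used:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic threshold values: 255 patients x 76 test locations
# (76 locations assumed for the Octopus grid; data are random).
n_patients, n_locs = 255, 76
thresholds = rng.normal(25, 5, size=(n_patients, n_locs))

# Pairwise Pearson correlations between all pairs of locations
r = np.corrcoef(thresholds, rowvar=False)

# Categorize off-diagonal correlations using the abstract's cut-offs
off = r[~np.eye(n_locs, dtype=bool)]
high = (off > 0.66).mean()
moderate = ((off > 0.33) & (off <= 0.66)).mean()
low = (off <= 0.33).mean()
print(f"high: {high:.1%}, moderate: {moderate:.1%}, low: {low:.1%}")
```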

Relevance: 20.00%

Abstract:

PURPOSE: To evaluate the sensitivity and specificity of the screening mode of the Humphrey-Welch Allyn frequency-doubling technology (FDT), Octopus tendency-oriented perimetry (TOP), and the Humphrey Swedish Interactive Threshold Algorithm (SITA)-fast (HSF) in patients with glaucoma.

DESIGN: A comparative consecutive case series.

METHODS: This was a prospective study conducted in the glaucoma unit of an academic department of ophthalmology. One eye of 70 consecutive glaucoma patients and of 28 age-matched normal subjects was studied. Eyes were examined with the C-20 program of FDT, G1-TOP, and 24-2 HSF in one visit and in random order. The gold standard for glaucoma was the presence of a typical glaucomatous optic disk appearance on stereoscopic examination, as judged by a glaucoma expert. The sensitivity and specificity, positive and negative predictive values, and receiver operating characteristic (ROC) curves of two algorithms for the FDT screening test, two algorithms for TOP, and three algorithms for HSF, all defined before the start of the study, were evaluated. The time required for each test was also analyzed.

RESULTS: Values for the area under the ROC curve ranged from 82.5% to 93.9%. The largest area under the ROC curve (93.9%) was obtained with the FDT criterion defining abnormality as the presence of at least one abnormal location. Mean test time was 1.08 ± 0.28 minutes, 2.31 ± 0.28 minutes, and 4.14 ± 0.57 minutes for the FDT, TOP, and HSF, respectively. The difference in testing time was statistically significant (P < .0001).

CONCLUSIONS: The C-20 FDT, G1-TOP, and 24-2 HSF appear to be useful tools to diagnose glaucoma. The C-20 FDT and G1-TOP take approximately one quarter and one half, respectively, of the time taken by the 24-2 HSF.
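The ROC comparison used here can be reproduced with standard tools. A toy sketch with synthetic labels and scores (the 70/28 split mirrors the study's sample sizes; the scores are invented, and scikit-learn is assumed available):

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(2)

# Synthetic example: 70 glaucoma eyes (1) and 28 normals (0), with a
# screening score that is noisily higher in glaucoma (not study data).
y = np.r_[np.ones(70), np.zeros(28)].astype(int)
score = np.r_[rng.normal(2.0, 1.0, 70), rng.normal(0.0, 1.0, 28)]

auc = roc_auc_score(y, score)
fpr, tpr, thresholds = roc_curve(y, score)
print(f"AUC: {auc:.1%}")   # the paper reports AUCs of 82.5%-93.9%
```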

Relevance: 20.00%

Abstract:

We propose a low-complexity closed-loop spatial multiplexing method with limited feedback over multi-input-multi-output (MIMO) fading channels. The transmit adaptation is simply performed by selecting transmit antennas (or substreams) by comparing their signal-to-noise ratios to a given threshold with a fixed nonadaptive constellation and fixed transmit power per substream. We analyze the performance of the proposed system by deriving closed-form expressions for spectral efficiency, average transmit power, and bit error rate (BER). Depending on practical system design constraints, the threshold is chosen to maximize the spectral efficiency (or minimize the average BER) subject to average transmit power and average BER (or spectral efficiency) constraints, respectively. We present numerical and Monte Carlo simulation results that validate our analysis. Compared to open-loop spatial multiplexing and other approaches that select the best antenna subset in spatial multiplexing, the numerical results illustrate that the proposed technique obtains significant power gains for the same BER and spectral efficiency. We also provide numerical results that show improvement over rate-adaptive orthogonal space-time block coding, which requires highly complex constellation adaptation. We analyze the impact of feedback delay using analytical and Monte Carlo approaches. The proposed approach is arguably the simplest possible adaptive spatial multiplexing system from an implementation point of view. However, our approach and analysis can be extended to other systems using multiple constellations and power levels.
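A Monte Carlo sketch of the selection rule described here: a substream is activated only when its SNR exceeds a threshold, with a fixed constellation and fixed power per active substream. The parameters and the QPSK error approximation are illustrative choices, not the paper's closed-form analysis:

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(3)

n_tx = 4                      # transmit antennas / substreams
snr_avg = 10 ** (10.0 / 10)   # 10 dB average per-substream symbol SNR
gamma_th = 10 ** (7.0 / 10)   # 7 dB activation threshold (design parameter)
bits_per_symbol = 2           # fixed, nonadaptive constellation (QPSK)
n_trials = 200_000

# Rayleigh fading: per-substream symbol SNR ~ exponential(mean = snr_avg)
snr = rng.exponential(snr_avg, size=(n_trials, n_tx))
active = snr > gamma_th       # one bit of feedback per substream: on/off

# Average spectral efficiency in bits per channel use
spectral_eff = bits_per_symbol * active.sum(axis=1).mean()

# Gray-coded QPSK bit error rate on active substreams: Q(sqrt(snr))
qfunc = lambda x: 0.5 * erfc(x / np.sqrt(2))
ber = qfunc(np.sqrt(snr[active])).mean()

print(f"mean active substreams: {active.sum(axis=1).mean():.2f}")
print(f"spectral efficiency: {spectral_eff:.2f} bit/s/Hz")
print(f"average BER on active substreams: {ber:.2e}")
```

Raising the threshold trades spectral efficiency for BER, which is exactly the constrained optimization the abstract describes.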

Relevance: 20.00%

Abstract:

Bayesian probabilistic analysis offers a new approach to characterize semantic representations by inferring the most likely feature structure directly from the patterns of brain activity. In this study, infinite latent feature models [1] are used to recover the semantic features that give rise to the brain activation vectors when people think about properties associated with 60 concrete concepts. The semantic features recovered by ILFM are consistent with the human ratings of the shelter, manipulation, and eating factors that were recovered by a previous factor analysis. Furthermore, different areas of the brain encode different perceptual and conceptual features. This neurally-inspired semantic representation is consistent with some existing conjectures regarding the role of different brain areas in processing different semantic and perceptual properties.
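Full ILFM inference (an Indian buffet process model) is beyond a short example, but the flavor of recovering a small set of latent semantic features from activation vectors can be shown with an off-the-shelf factorization. Here non-negative matrix factorization stands in for the Bayesian model, on fully synthetic data:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(4)

# Synthetic "activation" matrix: 60 concepts x 500 voxels, generated
# from 3 hidden binary features (a toy stand-in for fMRI data).
true_feats = rng.random((60, 3)) < 0.4
loadings = rng.random((3, 500))
activations = true_feats @ loadings + 0.05 * rng.random((60, 500))

# NMF as a simple stand-in for latent feature inference
model = NMF(n_components=3, init="nndsvda", random_state=0, max_iter=500)
W = model.fit_transform(activations)   # concept x feature weights
H = model.components_                  # feature x voxel loadings

# Binarize recovered weights (column-wise) to get a feature matrix
recovered = W > 0.5 * W.max(axis=0)
print("recovered feature matrix shape:", recovered.shape)  # (60, 3)
```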

Relevance: 20.00%

Abstract:

Purpose – Under investigation is Prosecco wine, a sparkling white wine from North-East Italy. Information collection on consumer perceptions is particularly relevant when developing market strategies for wine, especially so when local production and certification of origin play an important role in the wine market of a given district, as in the case at hand. Investigating and characterizing the structure of preference heterogeneity become crucial steps in every successful marketing strategy. The purpose of this paper is to investigate the sources of systematic differences in consumer preferences.

Design/methodology/approach – The paper explores the effect of including answers to attitudinal questions in a latent class regression model of stated willingness to pay (WTP) for this specialty wine. These additional variables were included in the membership equations to investigate whether they could help in the identification of latent classes. The individual-specific WTPs of the sampled respondents were then derived from the best-fitting model and examined for consistency.

Findings – The use of answers to attitudinal questions in the latent class regression model is found to improve model fit, thereby helping in the identification of latent classes. The best-performing model uses both attitudinal scores and socio-economic covariates, identifying five latent classes. A reasonable pattern of differences in WTP for Prosecco between CDO and TGI types was derived from this model.

Originality/value – The approach appears informative and promising: attitudes emerge as important ancillary indicators of taste differences for specialty wines. This might be of interest per se and of practical use in market segmentation. If future research shows that these variables can be of use in other contexts, it is quite possible that more attitudinal questions will be routinely incorporated in structural latent class hedonic models.
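The membership-equation idea can be sketched compactly: class probabilities follow a multinomial logit in the attitudinal scores and socio-economic covariates, and each class carries its own WTP level. Every coefficient below is invented for illustration, not an estimate from the paper:

```python
import numpy as np

rng = np.random.default_rng(5)
n, n_classes = 1000, 5

# Covariates entering the membership equations: two attitudinal
# scores and one socio-economic indicator (all synthetic).
Z = np.column_stack([rng.normal(size=n),       # attitude score 1
                     rng.normal(size=n),       # attitude score 2
                     rng.integers(0, 2, n)])   # e.g. buys local wine

# Invented membership coefficients (classes x covariates) and intercepts
gamma = rng.normal(0, 0.8, size=(n_classes, Z.shape[1]))
alpha = rng.normal(0, 0.5, size=n_classes)

# Multinomial logit membership probabilities
util = alpha + Z @ gamma.T
prob = np.exp(util - util.max(axis=1, keepdims=True))
prob /= prob.sum(axis=1, keepdims=True)

# Class-specific mean WTP (euros per bottle, invented), and each
# respondent's expected WTP as the probability-weighted average
class_wtp = np.array([4.0, 6.5, 8.0, 10.0, 13.5])
expected_wtp = prob @ class_wtp
print(f"mean expected WTP: {expected_wtp.mean():.2f} EUR")
```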

Relevance: 20.00%

Abstract:

We employ the time-dependent R-matrix (TDRM) method to calculate anisotropy parameters for positive and negative sidebands of selected harmonics generated by two-color two-photon above-threshold ionization of argon. We consider odd harmonics of an 800-nm field, ranging from the 13th to the 19th harmonic, overlapped by the fundamental 800-nm IR field. The anisotropy parameters obtained using the TDRM method are compared with those obtained using second-order perturbation theory with a model potential and those from a soft-photon approximation. Where available, a comparison is also made to published experimental results. All three theoretical approaches provide similar values for the anisotropy parameters. The TDRM approach obtains values that are closest to the published experimental values. At high photon energies, the differences between the theoretical methods become less significant.
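For context, anisotropy parameters describe the photoelectron angular distribution through a Legendre expansion, I(theta) proportional to 1 + sum_n beta_n P_n(cos theta), with terms up to P_4 contributing for a two-photon process. A small sketch with made-up beta values (not TDRM results):

```python
import numpy as np
from numpy.polynomial import legendre

# Angular distribution for a two-photon ionization sideband:
# I(theta) ∝ 1 + beta2*P2(cos theta) + beta4*P4(cos theta)
beta2, beta4 = 1.2, 0.3          # illustrative values only

theta = np.linspace(0, np.pi, 181)
x = np.cos(theta)
P2 = legendre.legval(x, [0, 0, 1])          # Legendre polynomial P2
P4 = legendre.legval(x, [0, 0, 0, 0, 1])    # Legendre polynomial P4
intensity = 1 + beta2 * P2 + beta4 * P4

print(f"I(0) / I(90 deg) = {intensity[0] / intensity[90]:.2f}")
```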

Relevance: 20.00%

Abstract:

There has long been substantial interest in understanding consumer food choices, where a key complexity is the potentially large heterogeneity in tastes across individual consumers, as well as the role of underlying attitudes towards food and cooking. The present paper underlines that both tastes and attitudes are unobserved, and makes the case for a latent variable treatment of these components. Using empirical data collected in Northern Ireland as part of a wider study to elicit intra-household trade-offs between home-cooked meal options, we show how these latent sensitivities and attitudes drive both the choice behaviour and the answers to supplementary questions. We find significant heterogeneity across respondents in these underlying factors and show how incorporating them in our models leads to important insights into preferences.
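A stripped-down sketch of the latent-variable idea: an unobserved attitude, measured noisily by survey indicators, also shifts the utility of a home-cooked option in a binary logit. The data are fully synthetic (not the Northern Ireland study), and the proxy regression shown is a simplification of full integrated choice and latent variable estimation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n = 2000

# Latent cooking attitude (unobserved in real data)
attitude = rng.normal(size=n)

# Measurement model: indicators driven by the latent attitude
indicator1 = attitude + rng.normal(0, 0.5, n)
indicator2 = 0.8 * attitude + rng.normal(0, 0.5, n)

# Choice model: utility of the home-cooked option rises with attitude
utility = 0.3 + 1.1 * attitude + rng.logistic(size=n)
chose_home_cooked = (utility > 0).astype(int)

# Naive proxy approach: average the indicators and use them in a
# logit; a full ICLV model would instead treat the attitude as latent.
proxy = (indicator1 + indicator2) / 2
fit = LogisticRegression().fit(proxy[:, None], chose_home_cooked)
print(f"proxy attitude coefficient: {fit.coef_[0][0]:.2f}")
```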

Relevance: 20.00%

Abstract:

The environmental quality of land can be assessed by calculating relevant threshold values, which differentiate between concentrations of elements resulting from geogenic and diffuse anthropogenic sources and concentrations generated by point sources. A simple process allowing the calculation of these typical threshold values (TTVs) was applied across a region of highly complex geology (Northern Ireland) to six elements of interest: arsenic, chromium, copper, lead, nickel and vanadium. Three methods for identifying domains (areas where a readily identifiable factor can be shown to control the concentration of an element) were used: k-means cluster analysis, boxplots and empirical cumulative distribution functions (ECDF). The ECDF method was most efficient at determining areas of both elevated and reduced concentrations and was used to identify domains in this investigation. Two statistical methods for calculating normal background concentrations (NBCs) and upper limits of geochemical baseline variation (ULBLs), currently used in conjunction with legislative regimes in the UK and Finland respectively, were applied within each domain. The NBC methodology was constructed to run within a specific legislative framework, and its use on this soil geochemical data set was influenced by the presence of skewed distributions and outliers. In contrast, the ULBL methodology was found to calculate more appropriate TTVs that were generally more conservative than the NBCs. TTVs indicate what a "typical" concentration of an element would be within a defined geographical area and should be considered alongside the risk that each of the elements pose in these areas to determine potential risk to receptors.
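As a sketch of how an upper limit of baseline variation might be computed within a domain, here using the Tukey upper whisker on log-transformed concentrations, one common convention for skewed geochemical data. The data are synthetic, and the paper's NBC and ULBL procedures involve additional steps:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic soil nickel concentrations (mg/kg) within one geological
# domain: lognormal background plus a few point-source outliers.
background = rng.lognormal(mean=3.0, sigma=0.4, size=500)
outliers = rng.lognormal(mean=5.5, sigma=0.3, size=10)
ni = np.concatenate([background, outliers])

# Tukey upper whisker on log-transformed data, back-transformed:
# one common convention for an upper limit of baseline variation.
logc = np.log(ni)
q1, q3 = np.percentile(logc, [25, 75])
ulbl = np.exp(q3 + 1.5 * (q3 - q1))

print(f"upper limit of baseline variation: {ulbl:.1f} mg/kg")
print(f"samples above the limit: {(ni > ulbl).sum()}")
```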

Relevance: 20.00%

Abstract:

Creep of Steel Fiber Reinforced Concrete (SFRC) under flexural load in the cracked state, and the extent to which different factors determine creep behaviour, are quite understudied topics within the general field of SFRC mechanical properties. A series of prismatic specimens was produced and subjected to sustained flexural loads. The effect of a number of variables (fiber length and slenderness, fiber content, and concrete compressive strength) was studied in a comprehensive fashion. Twelve response variables (creep parameters measured at different times) were retained as descriptive of flexural creep behaviour. Multivariate techniques were used: the experimental results were projected onto their latent structure by means of Principal Components Analysis (PCA), so that all the information was reduced to a set of three latent variables. These were related to the variables considered, and the statistical significance of their effects on creep behaviour was assessed. The result is a unified view of the effects of the different variables on creep behaviour: fiber content and fiber slenderness were found to clearly modify the effect that the load ratio has on flexural creep behaviour.
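A minimal sketch of the dimension-reduction step described here: projecting a table of creep response variables onto its first three principal components. Synthetic data stand in for the experimental results, and the 30-specimen count is an assumption:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(8)

# Synthetic stand-in: 30 specimens x 12 creep response variables
# (e.g. creep coefficients and deflections at several ages).
n_specimens, n_responses = 30, 12
latent = rng.normal(size=(n_specimens, 3))       # 3 underlying factors
mixing = rng.normal(size=(3, n_responses))
responses = latent @ mixing + 0.2 * rng.normal(size=(n_specimens, n_responses))

# Standardize, then reduce the 12 responses to 3 latent variables
X = StandardScaler().fit_transform(responses)
pca = PCA(n_components=3)
scores = pca.fit_transform(X)    # specimen scores on the 3 components
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
```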