941 results for harmonic mean
Abstract:
BACKGROUND: Second Harmonic Generation (SHG) microscopy has recently emerged as an efficient optical imaging technique for probing unstained collagen-rich tissues such as the cornea. Corneal remodeling occurs in many diseases, and its precise characterization requires overcoming the limitations of conventional techniques. In this work, we focus on diabetes, which affects hundreds of millions of people worldwide and most often leads to diabetic retinopathy, for which no early diagnostic tool exists. This study therefore aims to establish the potential of SHG microscopy for in situ detection and characterization of hyperglycemia-induced abnormalities in Descemet's membrane, in the posterior cornea. METHODOLOGY/PRINCIPAL FINDINGS: We studied corneas from age-matched control and Goto-Kakizaki rats, a spontaneous model of type 2 diabetes, and corneas from human donors with and without type 2 diabetes. SHG imaging was compared to confocal microscopy, to histological characterization using conventional staining and transmitted light microscopy, and to transmission electron microscopy. SHG imaging revealed collagen deposits in the Descemet's membrane of unstained corneas in a way unmatched by these gold-standard techniques in ophthalmology. It provided background-free images of the three-dimensional interwoven distribution of the collagen deposits, with improved contrast compared to confocal microscopy. It also enabled structural characterization of intact corneas owing to its high specificity to fibrillar collagen, with a substantially larger field of view than transmission electron microscopy. Moreover, in vivo SHG imaging was demonstrated in Goto-Kakizaki rats. CONCLUSIONS/SIGNIFICANCE: Our study shows unambiguously the high potential of SHG microscopy for three-dimensional characterization of structural abnormalities in unstained corneas. Furthermore, our demonstration of in vivo SHG imaging opens the way to long-term dynamical studies. This method should be easily generalized to other types of structural remodeling of the cornea, and SHG microscopy should prove invaluable for in vivo corneal pathological studies.
Abstract:
A Guide for Staff
Abstract:
This paper conducts an empirical analysis of the relationship between wage inequality, employment structure, and returns to education in urban areas of Mexico during the past two decades (1987-2008). Applying Melly's (2005) quantile-regression-based decomposition, we find that changes in wage inequality have been driven mainly by variations in educational wage premia. Additionally, we find that changes in employment structure, including occupation and firm size, have played a vital role. This evidence suggests that the changes in wage inequality in urban Mexico cannot be interpreted in terms of a skill-biased change; rather, they are the result of an increasing demand for skills during that period.
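The decomposition itself is involved, but its building block is plain quantile regression: the educational wage premium at each quantile is the slope of a quantile regression of log wages on schooling. A minimal sketch on synthetic data (the simulated wages, coefficients, and the pinball-loss fit with an OLS warm start are illustrative assumptions, not the paper's estimator):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 2000
educ = rng.integers(6, 18, size=n)              # years of schooling (synthetic)
# heteroskedastic log wages: the return to schooling rises across the distribution
logw = 0.5 + 0.06 * educ + (0.2 + 0.02 * educ) * rng.standard_normal(n)

def quantile_fit(x, y, q):
    """Fit y = a + b*x by minimizing the pinball (check) loss at quantile q."""
    b_ols, a_ols = np.polyfit(x, y, 1)          # OLS warm start
    def loss(beta):
        r = y - beta[0] - beta[1] * x
        return np.mean(np.where(r >= 0, q * r, (q - 1) * r))
    return minimize(loss, x0=[a_ols, b_ols], method="Nelder-Mead").x

premia = {q: quantile_fit(educ, logw, q)[1] for q in (0.1, 0.5, 0.9)}
print(premia)   # education slope by wage quantile: larger at the top
```

Comparing how these slopes change over time, versus how the schooling distribution changes, is the intuition behind the decomposition into "coefficients" and "characteristics" effects.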
Abstract:
Kriging is an interpolation technique whose optimality criteria are based on normality assumptions either for observed or for transformed data. This is the case of normal, lognormal and multigaussian kriging. When kriging is applied to transformed scores, optimality of the obtained estimators becomes a cumbersome concept: back-transformed optimal interpolations in transformed scores are not optimal in the original sample space, and vice-versa. This lack of compatible criteria of optimality induces a variety of problems in both point and block estimates. For instance, lognormal kriging, widely used to interpolate positive variables, has no straightforward way to build consistent and optimal confidence intervals for estimates. These problems are ultimately linked to the assumed space structure of the data support: for instance, positive values, when modelled with lognormal distributions, are assumed to be embedded in the whole real space, with the usual real space structure and Lebesgue measure.
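A quick numeric illustration of why back-transformed estimates are not optimal in the original space: for lognormal data, exponentiating the mean of the logs recovers the median, not the mean, and the classical correction adds half the log-variance. Synthetic data; this sketches only the back-transformation bias, not kriging itself:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 1.0, 0.8
z = rng.lognormal(mean=mu, sigma=sigma, size=200_000)

logz = np.log(z)
naive = np.exp(logz.mean())                        # back-transforms to the *median*
corrected = np.exp(logz.mean() + logz.var() / 2)   # lognormal mean correction
print(naive, corrected, z.mean())                  # naive underestimates the mean
```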
Abstract:
The present study examines the Five-Factor Model (FFM) of personality and locus of control in French-speaking samples in Burkina Faso (N = 470) and Switzerland (Ns = 1,090, 361), using the Revised NEO Personality Inventory (NEO-PI-R) and Levenson's Internality, Powerful others, and Chance (IPC) scales. Alpha reliabilities were consistently lower in Burkina Faso, but the factor structure of the NEO-PI-R was replicated in both cultures. The intended three-factor structure of the IPC could not be replicated, although a two-factor solution was replicable across the two samples. Although scalar equivalence has not been demonstrated, mean level comparisons showed the hypothesized effects for most of the five factors and locus of control; Burkinabè scored higher in Neuroticism than anticipated. Findings from this African sample generally replicate earlier results from Asian and Western cultures, and are consistent with a biologically-based theory of personality.
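The alpha reliabilities compared across samples are Cronbach's alpha, computable directly from item scores. A minimal sketch on synthetic one-factor data (respondent counts, loadings, and noise levels are made up for illustration):

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array; classical internal-consistency alpha."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the scale total
    return k / (k - 1) * (1.0 - item_var / total_var)

rng = np.random.default_rng(2)
trait = rng.standard_normal(500)                   # latent trait, 500 respondents
items_clean = trait[:, None] + 0.5 * rng.standard_normal((500, 8))  # 8 clean items
items_noisy = trait[:, None] + 2.0 * rng.standard_normal((500, 8))  # 8 noisy items
print(cronbach_alpha(items_clean), cronbach_alpha(items_noisy))
```

More measurement noise per item (as might arise from translation or cultural-adaptation issues) lowers alpha even when the factor structure is intact, consistent with the pattern reported above.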
Abstract:
Most people hold beliefs about the personality characteristics of typical members of their own and others' cultures. These perceptions of national character may be generalizations from personal experience, stereotypes with a "kernel of truth", or inaccurate stereotypes. We obtained national character ratings from 3,989 people in 49 cultures and compared them with the average personality scores of culture members assessed by observer ratings and self-reports. National character ratings were reliable but did not converge with assessed traits. Perceptions of national character thus appear to be unfounded stereotypes that may serve the function of maintaining a national identity.
Abstract:
We propose a segmentation method based on the geometric representation of images as 2-D manifolds embedded in a higher dimensional space. The segmentation is formulated as a minimization problem, where the contours are described by a level set function and the objective functional corresponds to the surface of the image manifold. In this geometric framework, both data-fidelity and regularity terms of the segmentation are represented by a single functional that intrinsically aligns the gradients of the level set function with the gradients of the image and results in a segmentation criterion that exploits the directional information of image gradients to overcome image inhomogeneities and fragmented contours. The proposed formulation combines this robust alignment of gradients with attractive properties of previous methods developed in the same geometric framework: 1) the natural coupling of image channels proposed for anisotropic diffusion and 2) the ability of subjective surfaces to detect weak edges and close fragmented boundaries. The potential of such a geometric approach lies in the general definition of Riemannian manifolds, which naturally generalizes existing segmentation methods (the geodesic active contours, the active contours without edges, and the robust edge integrator) to higher dimensional spaces, non-flat images, and feature spaces. Our experiments show that the proposed technique improves the segmentation of multi-channel images, images subject to inhomogeneities, and images characterized by geometric structures like ridges or valleys.
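The geometric idea can be illustrated numerically: representing an image as the 2-D manifold (x, y, βI(x, y)) embedded in 3-D, the objective corresponds to the surface area of this graph, which a flat image minimizes and edges increase. A toy sketch of that area term alone (β and the discrete gradient scheme are illustrative choices; the full method couples this functional with a level set evolution):

```python
import numpy as np

def manifold_area(img, beta=1.0):
    """Discrete surface area of the graph (x, y, beta*I): sum of sqrt(1 + |grad|^2)."""
    gy, gx = np.gradient(beta * img)
    return np.sqrt(1.0 + gx**2 + gy**2).sum()

flat = np.zeros((64, 64))
step = np.zeros((64, 64))
step[:, 32:] = 1.0                                # a single vertical edge
print(manifold_area(flat), manifold_area(step))   # the edge adds surface area
```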
Abstract:
There is almost no case in exploration geology where the studied data do not include below-detection-limit and/or zero values, and since most geological data follow lognormal distributions, these "zero data" represent a mathematical challenge for interpretation. We need to start by recognizing that there are zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot coexist with nepheline. Another common essential zero is a North azimuth; however, we can always replace that zero by the value 360°. These are known as "essential zeros", but what can we do with "rounded zeros", which result from values below the detection limit of the equipment? Amalgamation, e.g., adding Na2O and K2O as total alkalis, is a solution, but sometimes we need to differentiate between a sodic and a potassic alteration. Pre-classification into groups requires a good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the detection limit of the equipment used will generate spurious distributions, especially in ternary diagrams. The same occurs if we replace the zero values by a small amount using non-parametric or parametric techniques (imputation). The method we propose takes into consideration the well-known relationships between certain elements. For example, in copper porphyry deposits there is always a good direct correlation between the copper and molybdenum values, but while copper will always be above the detection limit, many of the molybdenum values will be "rounded zeros". So we take the lower quartile of the real molybdenum values, establish a regression equation with copper, and then estimate the "rounded" zero values of molybdenum from their corresponding copper values. The method can be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the "rounded zeros", but one that depends on the value of the other variable.
Key words: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
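The replacement strategy can be sketched on synthetic Cu-Mo data (the simulated concentrations, the 30% censoring fraction, and the log-log regression form are all assumptions for illustration): fit a regression between the two elements on the lower quartile of the detected molybdenum values, then predict each censored value from its copper value.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
log_cu = rng.normal(0.0, 1.0, n)
log_mo = 0.8 * log_cu - 2.0 + 0.3 * rng.standard_normal(n)   # correlated elements
cu, mo = np.exp(log_cu), np.exp(log_mo)

dl = np.quantile(mo, 0.3)            # pretend 30% of Mo falls below detection
below = mo < dl

# regression fitted on the lower quartile of the *detected* Mo values,
# the part of the data closest in range to the censored values
det_cu, det_mo = cu[~below], mo[~below]
low = det_mo <= np.quantile(det_mo, 0.25)
b, a = np.polyfit(np.log(det_cu[low]), np.log(det_mo[low]), 1)

mo_imputed = mo.copy()
mo_imputed[below] = np.exp(a + b * np.log(cu[below]))   # one value per sample
```

Unlike substitution by a constant fraction of the detection limit, every censored sample receives its own positive value, driven by its copper concentration.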
Abstract:
Stone groundwood (SGW) is a fibrous material commonly prepared in a high-yield process and mainly used for papermaking applications. In this work, the use of SGW fibers is explored as a reinforcing element for polypropylene (PP) composites. Due to its chemical and surface features, the use of coupling agents is needed for good adhesion and stress transfer across the fiber-matrix interface. The intrinsic strength of the reinforcement is a key parameter for predicting the mechanical properties of the composite and for performing an interface analysis. The main objective of the present work was the determination of the intrinsic tensile strength of stone groundwood fibers. Coupled and non-coupled PP composites from stone groundwood fibers were prepared. The influence of the surface morphology and the quality of the interface on the final properties of the composite was analyzed and compared to that of fiberglass PP composites. The intrinsic tensile properties of stone groundwood fibers, as well as the fiber orientation factor and the interfacial shear strength of the current composites, were determined.
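Back-calculating an intrinsic fiber strength from composite tensile tests is commonly done with a modified rule of mixtures; a sketch with entirely hypothetical numbers (the abstract reports no values, and the paper's exact micromechanics model may differ):

```python
# Modified rule of mixtures: sigma_c = k * sigma_f * Vf + sigma_m_star * (1 - Vf),
# where k lumps the fiber orientation and length-efficiency factors and
# sigma_m_star is the matrix stress at the composite failure strain.
def intrinsic_fiber_strength(sigma_c, sigma_m_star, vf, k):
    """Back-solve the fiber tensile strength sigma_f (same stress units)."""
    return (sigma_c - sigma_m_star * (1.0 - vf)) / (k * vf)

# hypothetical inputs: composite 55 MPa, matrix contribution 20 MPa, 30 vol%, k = 0.35
print(intrinsic_fiber_strength(55.0, 20.0, 0.3, 0.35))
```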
Abstract:
Nonlinear optical nanocrystals have been recently introduced as a promising alternative to fluorescent probes for multiphoton microscopy. We present for the first time a complete survey of the properties of five nanomaterials (KNbO3, LiNbO3, BaTiO3, KTP, and ZnO), describing their preparation and stabilization and providing quantitative estimations of their nonlinear optical response. In the light of their prospective use as biological and clinical markers, we assess their biocompatibility on human healthy and cancerous cell lines. Finally, we demonstrate the great potential for cell imaging of these inherently nonlinear probes in terms of optical contrast, wavelength flexibility, and signal photostability.
Abstract:
A study was conducted on the methods of basis set superposition error (BSSE)-free geometry optimization and frequency calculations in clusters larger than a dimer. In particular, three different counterpoise schemes were critically examined. It was shown that the counterpoise-corrected supermolecule energy can be easily obtained in all the cases by using the many-body partitioning of energy.
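For the simplest case of a dimer AB, the Boys-Bernardi counterpoise scheme evaluates each monomer in the full dimer basis (with ghost functions on the partner). A sketch with made-up SCF energies in hartree, showing how the correction removes the artificial stabilization from basis borrowing:

```python
# Made-up SCF energies (hartree) for a dimer AB and its monomers:
E_AB   = -152.4200   # dimer in the full AB basis
E_A_a  = -76.2000    # monomer A in its own basis
E_A_ab = -76.2012    # monomer A with ghost functions of B (lower: basis borrowing)
E_B_b  = -76.2100    # monomer B in its own basis
E_B_ab = -76.2109    # monomer B with ghost functions of A

E_raw = E_AB - E_A_a - E_B_b      # uncorrected interaction energy
E_cp  = E_AB - E_A_ab - E_B_ab    # counterpoise-corrected (Boys-Bernardi)
bsse  = E_cp - E_raw              # positive: CP removes artificial overbinding
print(E_raw, E_cp, bsse)
```

For trimers and larger clusters, as the abstract notes, the same bookkeeping is organized through the many-body partitioning of the energy rather than a single pairwise correction.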
Abstract:
This paper investigates a simple procedure to estimate robustly the mean of an asymmetric distribution. The procedure removes the observations which are larger or smaller than certain limits and takes the arithmetic mean of the remaining observations, the limits being determined with the help of a parametric model, e.g., the Gamma, the Weibull or the Lognormal distribution. The breakdown point, the influence function, the (asymptotic) variance, and the contamination bias of this estimator are explored and compared numerically with those of competing estimates.
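The procedure is easy to sketch: fit the parametric model, trim observations outside its alpha and 1-alpha quantiles, and take the arithmetic mean of the remainder. A minimal illustration with a Gamma model on synthetic contaminated data (the contamination scheme and the 1% trimming fractions are assumptions for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
clean = rng.gamma(shape=2.0, scale=3.0, size=1000)       # true mean = 6
data = np.concatenate([clean, np.full(20, 200.0)])       # gross outliers

# fit the parametric model, then average inside its (alpha, 1-alpha) quantiles
a, loc, scale = stats.gamma.fit(data, floc=0)
lo_cut, hi_cut = stats.gamma.ppf([0.01, 0.99], a, loc=loc, scale=scale)
kept = data[(data >= lo_cut) & (data <= hi_cut)]
robust_mean = kept.mean()
print(data.mean(), robust_mean)     # contaminated mean vs trimmed estimate
```

Because the limits come from a skewed parametric model rather than symmetric trimming, the estimator can discard far-out contamination without systematically cutting the long but legitimate right tail.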
Abstract:
We establish the validity of subsampling confidence intervals for the mean of a dependent series with heavy-tailed marginal distributions. Using point process theory, we study both linear and nonlinear GARCH-like time series models. We propose a data-dependent method for the optimal block size selection and investigate its performance by means of a simulation study.
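A stylized version of the subsampling construction (fixed block size, a root-n rate, and a synthetic AR-type series with Student-t innovations are all illustrative assumptions; the paper's point is precisely that the block size should be chosen data-dependently and that heavy tails complicate the rate):

```python
import numpy as np

rng = np.random.default_rng(5)
n, b = 5000, 200                           # series length, subsample (block) size
eps = rng.standard_t(df=3, size=n)         # heavy-tailed innovations
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + eps[t]         # simple dependent (AR-type) series

# recompute the centred, rescaled mean on every overlapping block of length b
xbar = x.mean()
blocks = np.lib.stride_tricks.sliding_window_view(x, b)
dist = np.sqrt(b) * (blocks.mean(axis=1) - xbar)
q_lo, q_hi = np.quantile(dist, [0.025, 0.975])

ci = (xbar - q_hi / np.sqrt(n), xbar - q_lo / np.sqrt(n))
print(ci)    # 95% subsampling confidence interval for the mean
```

The block quantiles stand in for the unknown sampling distribution, so no variance formula for the dependent, heavy-tailed series is needed.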