66 results for fractal segmentation
Abstract:
The general objective of the study was to empirically test a reciprocal model of job satisfaction and life satisfaction while controlling for some sociodemographic variables. 827 employees working in 34 car dealerships in Northern Quebec were surveyed (56% response rate). The multiple-item questionnaires were analysed using correlation analysis, chi-square tests and ANOVAs. Results show interesting patterns emerging in the relationships between job and life satisfaction: 49.2% of all individuals have a spillover, 43.5% a compensation, and 7.3% a segmentation type of relationship. Results, nonetheless, are far richer and the model becomes much more refined when sociodemographic indicators are taken into account. Globally, sociodemographic variables demonstrate some effects on each satisfaction individually but also on the interrelation (the nature of the relations) between life and work satisfaction.
Abstract:
We study the earnings structure and the equilibrium assignment of workers when workers exert intra-firm spillovers on each other. We allow for arbitrary spillovers provided output depends on some aggregate index of workers' skill. Despite the possibility of increasing returns to skills, equilibrium typically exists. We show that equilibrium will typically be segregated: the skill space can be partitioned into a set of segments, and any firm hires from only one segment. Next, we apply the model to analyze the effect of information technology on segmentation and the distribution of income. There are two types of human capital, productivity and creativity, i.e. the ability to produce ideas that may be duplicated over a network. Under plausible assumptions, inequality rises and then falls when network size increases, and the poorest workers cannot lose. We also analyze the impact of an improvement in worker quality and of an increased international mobility of ideas.
Abstract:
Many workers believe that personal contacts are crucial for obtaining jobs in high-wage sectors. On the other hand, firms in high-wage sectors report using employee referrals because they help provide screening and monitoring of new employees. This paper develops a matching model that can explain the link between inter-industry wage differentials and use of employee referrals. Referrals lower monitoring costs because high-effort referees can exert peer pressure on co-workers, allowing firms to pay lower efficiency wages. On the other hand, informal search provides fewer job and applicant contacts than formal methods (e.g., newspaper ads). In equilibrium, the matching process generates segmentation in the labor market because of heterogeneity in the size of referral networks. Referrals match good high-paying jobs to well-connected workers, while formal methods match less attractive jobs to less-connected workers. Industry-level data show a positive correlation between industry wage premia and use of employee referrals. Moreover, evidence using the NLSY shows similar positive and significant OLS and fixed-effects estimates of the returns to employee referrals, but insignificant effects once sector of employment is controlled for. This evidence suggests referred workers earn higher wages not because of higher unobserved ability or better matches but rather because they are hired in high-wage sectors.
Abstract:
The trade-off between property rights/price regulation and innovation depends on country characteristics and drug industry specificities. Access to drugs and innovation can be reconciled in seven ways, which include, among others: strengthening public health in the countries with the largest access problems (those among the poor with the weakest institutions); public and private aid to make R&D on neglected diseases attractive; price discrimination with market segmentation; and requiring patent owners to choose either protection in the rich countries or protection in the poor countries (but not both). Regarding price regulation, after a review of theoretical arguments and empirical evidence, seven strategies to reconcile health and industrial considerations are outlined, including: mitigating the medical profession's dependence on the pharmaceutical industry; considering a drug as an input of a production process; splitting drug authorization from public funding decisions; establishing an efficiency minimum for all health production inputs; and stopping the European R&D hemorrhage.
Abstract:
Hand Gesture Recognition (HGR) is currently an important research field because of the variety of situations in which it is necessary to communicate through signs, such as communication between people who use sign language and those who do not. This project presents a real-time hand gesture recognition method using the Microsoft Xbox Kinect sensor, implemented in a Linux (Ubuntu) environment with the Python programming language and the OpenCV computer vision library, processing the data on a conventional laptop. Thanks to the Kinect sensor's ability to capture depth data of a scene, the positions and trajectories of objects can be determined in three dimensions, which makes a complete real-time analysis of an image or an image sequence possible. The proposed recognition procedure is based on segmenting the image so as to work only with the hand, detecting its contours, and then obtaining the convex hull and the convexity defects, which finally serve to determine the number of fingers and interpret the gesture; the final result is a transcription of its meaning in a window that serves as an interface with the interlocutor. Since only one hand is analyzed, the application can recognize the numbers 0 to 5, some popular gestures, and some letters of the fingerspelling alphabet of Catalan Sign Language. The project is thus a gateway to the field of gesture recognition and the basis of a future sign language recognition system capable of transcribing both dynamic signs and the fingerspelling alphabet.
Abstract:
The matching function, a key building block in models of labor market frictions, implies that the job finding rate depends only on labor market tightness. We estimate such a matching function and find that the relation, although remarkably stable over 1967-2007, broke down spectacularly after 2007. We argue that labor market heterogeneities are not fully captured by the standard matching function, but that a generalized matching function that explicitly takes into account worker heterogeneity and market segmentation is fully consistent with the behavior of the job finding rate. The standard matching function can break down when, as in the Great Recession, the average characteristics of the unemployed change too much, or when dispersion in labor market conditions (the extent to which some labor markets fare worse than others) increases too much.
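The contrast between the standard and the generalized matching function can be sketched in a few lines. The Cobb-Douglas functional form and all numbers below (mu, sigma, the two-segment example) are illustrative assumptions, not values from the paper:

```python
def job_finding_rate(u, v, mu=0.6, sigma=0.5):
    """Standard matching function: with Cobb-Douglas matching
    m(u, v) = mu * u**sigma * v**(1 - sigma), the job finding rate
    f = m / u depends only on tightness theta = v / u."""
    theta = v / u
    return mu * theta ** (1 - sigma)

def aggregate_finding_rate(segments):
    """Generalized matching: the economy-wide job finding rate aggregates
    segment-level rates, weighted by each segment's unemployment (u, v)."""
    total_u = sum(u for u, v in segments)
    return sum(u * job_finding_rate(u, v) for u, v in segments) / total_u

# One pooled market with tightness 0.5 versus two segments whose average
# tightness is also 0.5: dispersion across segments lowers the aggregate
# rate because the finding rate is concave in tightness.
pooled = job_finding_rate(2.0, 1.0)
segmented = aggregate_finding_rate([(1.0, 0.9), (1.0, 0.1)])
```

The gap between `pooled` and `segmented` is the mechanism the abstract points to: with enough dispersion across segments, an aggregate matching function fitted on calm years will overpredict the job finding rate.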
Abstract:
In this paper we present a Bayesian image reconstruction algorithm with entropy prior (FMAPE) that uses a space-variant hyperparameter. The spatial variation of the hyperparameter allows different degrees of resolution in areas of different statistical characteristics, thus avoiding the large residuals resulting from algorithms that use a constant hyperparameter. In the first implementation of the algorithm, we begin by segmenting a Maximum Likelihood Estimator (MLE) reconstruction. The segmentation method is based on a wavelet decomposition and a self-organizing neural network. The result is a predetermined number of extended regions plus a small region for each star or bright object. To assign a different value of the hyperparameter to each extended region and star, we use either feasibility tests or cross-validation methods. Once the set of hyperparameters is obtained, we carry out the final Bayesian reconstruction, leading to a reconstruction with decreased bias and excellent visual characteristics. The method has been applied to data from the non-refurbished Hubble Space Telescope. The method can also be applied to ground-based images.
Abstract:
Evidence exists that many natural phenomena are better described as fractals. Although fractals are very useful for describing nature, it is also appropriate to review the concept of a random fractal in finance. Given the extraordinary importance of Brownian motion in physics, chemistry and biology, we consider its generalization, fractional Brownian motion, introduced by Mandelbrot. The main goal of this work is to analyse the existence of long-range dependence in instantaneous forward rates of different financial markets. Specifically, we perform an empirical analysis of the Spanish, Mexican and U.S. interbank interest rates. We work with three time series of daily data corresponding to one-day operations from 28 March 1996 to 21 May 2002. From among all the existing tests on this matter we apply the methodology proposed in Taqqu, Teverovsky and Willinger (1995).
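One of the estimators covered by the Taqqu, Teverovsky and Willinger (1995) methodology, the aggregated-variance method, can be sketched as follows. The implementation and the white-noise sanity check are illustrative, not the paper's code:

```python
import math
import random
import statistics

def hurst_aggregated_variance(x, block_sizes=(1, 2, 4, 8, 16, 32)):
    """Aggregated-variance estimator of the Hurst exponent H: for block
    size m, the variance of the block means scales as m**(2H - 2), so H
    is recovered from the slope of log(variance) against log(m)."""
    log_m, log_v = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        means = [sum(x[i * m:(i + 1) * m]) / m for i in range(n_blocks)]
        log_m.append(math.log(m))
        log_v.append(math.log(statistics.pvariance(means)))
    mean_m = sum(log_m) / len(log_m)
    mean_v = sum(log_v) / len(log_v)
    slope = (sum((a - mean_m) * (b - mean_v) for a, b in zip(log_m, log_v))
             / sum((a - mean_m) ** 2 for a in log_m))
    return 1 + slope / 2  # slope = 2H - 2

# Sanity check: white noise has no long-range dependence, so H should be
# close to 0.5; H significantly above 0.5 would indicate persistence.
random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(4096)]
h_est = hurst_aggregated_variance(noise)
```

Applied to increments of fractional Brownian motion, the same estimator recovers the H that parametrizes Mandelbrot's generalization.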
Abstract:
This paper provides a practical illustration of three tools that allow the actuary to define rating groups and estimate risk premiums in the class-rating process for non-life insurance. The first is segmentation analysis (CHAID and XAID), first used in 1997 by UNESPA on its common automobile portfolio. The second is a stepwise selection process based on the distance-based regression model. The third is a process based on the well-known generalized linear regression model, which represents the most modern technique in the actuarial literature. For the latter, by combining different link functions and error distributions, we can obtain the classical additive and multiplicative models.
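The closing remark, that different link functions yield the classical additive and multiplicative tariffs, can be illustrated with a minimal sketch. The rating factors, coefficients, and loadings below are hypothetical, not fitted values:

```python
import math

# Hypothetical rating factors; in practice the coefficients and loadings
# would come from fitting GLMs to the portfolio's claims experience.
BASE = 300.0
LOG_BETA = {"young_driver": 0.40, "urban": 0.25}      # log-link coefficients
ADD_LOADING = {"young_driver": 120.0, "urban": 75.0}  # identity-link loadings

def premium_multiplicative(profile):
    """Log link: exponentiating the linear predictor turns the summed
    coefficients into a product of relativities (multiplicative tariff)."""
    return BASE * math.exp(sum(LOG_BETA[f] for f in profile))

def premium_additive(profile):
    """Identity link: each factor adds a fixed loading (additive tariff)."""
    return BASE + sum(ADD_LOADING[f] for f in profile)

profile = ["young_driver", "urban"]
mult = premium_multiplicative(profile)  # 300 * exp(0.40) * exp(0.25)
add = premium_additive(profile)         # 300 + 120 + 75 = 495
```

The error distribution (e.g. Poisson for claim counts, gamma for severities) is chosen independently of the link, which is what lets one GLM framework span both tariff structures.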
Abstract:
Spanning avalanches in the 3D Gaussian Random Field Ising Model (3D-GRFIM) with metastable dynamics at T=0 have been studied. Statistical analysis of the field values at which avalanches occur has enabled a finite-size scaling (FSS) study of the avalanche density. Furthermore, a direct measurement of the geometrical properties of the avalanches has confirmed an earlier hypothesis that several types of spanning avalanches with two different fractal dimensions coexist at the critical point. Finally, we compare the phase diagram of the 3D-GRFIM with metastable dynamics to that of the same model in equilibrium at T=0.
Abstract:
Naive scale invariance is not a true property of natural images. Natural monochrome images possess a much richer geometrical structure, which is particularly well described in terms of multiscaling relations. This means that the pixels of a given image can be decomposed into sets, the fractal components of the image, with well-defined scaling exponents [Turiel and Parga, Neural Comput. 12, 763 (2000)]. Here it is shown that hyperspectral representations of natural scenes also exhibit multiscaling properties, observing the same kind of behavior. A precise measure of the informational relevance of the fractal components is also given, and it is shown that there are important differences between the intrinsically redundant red-green-blue system and the decorrelated one defined in Ruderman, Cronin, and Chiao [J. Opt. Soc. Am. A 15, 2036 (1998)].
Abstract:
The design of appropriate multifractal analysis algorithms, able to correctly characterize the scaling properties of multifractal systems from experimental, discretized data, is a major challenge in the study of such scale-invariant systems. In recent years there has been growing interest in applying the microcanonical formalism, as it allows a precise localization of the fractal components as well as a statistical characterization of the system. In this paper, we deal with the specific problems that arise when systems that are strictly monofractal are analyzed using some standard microcanonical multifractal methods. We discuss the adaptations of these methods needed to give an appropriate treatment of monofractal systems.
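For a strictly monofractal set there is a single scaling exponent, which even a plain box-counting estimate recovers. A minimal sketch (the diagonal-line example and the scale grid are illustrative assumptions, not the paper's method):

```python
import math

def box_counting_dimension(points, scales=(2, 4, 8, 16, 32, 64)):
    """Box-counting estimate of the fractal dimension of a point set in
    the unit square: the number of occupied boxes of side 1/k scales as
    k**D, so D is the slope of log(count) against log(k)."""
    log_k, log_n = [], []
    for k in scales:
        occupied = {(int(x * k), int(y * k)) for x, y in points}
        log_k.append(math.log(k))
        log_n.append(math.log(len(occupied)))
    mean_k = sum(log_k) / len(log_k)
    mean_n = sum(log_n) / len(log_n)
    return (sum((a - mean_k) * (b - mean_n) for a, b in zip(log_k, log_n))
            / sum((a - mean_k) ** 2 for a in log_k))

# A straight line is strictly monofractal: a single scaling exponent,
# and the box-counting dimension comes out as 1.
line = [(t / 10000.0, t / 10000.0) for t in range(10000)]
dim = box_counting_dimension(line)
```

Microcanonical methods go further, assigning a local exponent to each point; the degenerate case the abstract discusses is precisely when all those local exponents collapse onto one value, as for the line above.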
Abstract:
In this work, the calcium-induced aggregation of phosphatidylserine liposomes is probed by analyzing both the kinetics of the process and the aggregate morphology. This novel characterization of liposome aggregation uses static and dynamic light-scattering techniques to obtain kinetic exponents and fractal dimensions. For salt concentrations larger than 5 mM, a diffusion-limited aggregation regime is observed and the Brownian kernel properly describes the time evolution of the diffusion coefficient. For slow kinetics, a slightly modified multiple-contact kernel is required. In any case, a time-evolution model based on the numerical resolution of Smoluchowski's equation is proposed in order to establish a theoretical description of the aggregating system. Such a model provides an alternative procedure to determine the dimerization constant, which might supply valuable information about interaction mechanisms between phospholipid vesicles.
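A numerical resolution of Smoluchowski's equation of the kind the abstract describes can be sketched with the simplest, size-independent kernel (the constant-kernel limit of the Brownian kernel). The rate constant, time step, and size truncation below are illustrative choices, not the paper's:

```python
def smoluchowski_step(n, K=1.0, dt=1e-3):
    """One explicit Euler step of Smoluchowski's coagulation equation with
    a constant kernel K:
        dn_k/dt = (1/2) * sum_{i+j=k} K*n_i*n_j  -  K * n_k * sum_j n_j
    where n[k] stores the concentration of (k+1)-mers."""
    total = sum(n)
    new = list(n)
    for k in range(len(n)):
        gain = 0.5 * sum(n[i] * n[k - 1 - i] for i in range(k))
        new[k] += dt * (K * gain - K * n[k] * total)
    return new

# Start from monomers only and integrate to t = 1; for the constant kernel
# the total number of clusters decays as N(t) = N0 / (1 + K*N0*t / 2),
# while the total mass sum_k (k+1)*n_k is conserved up to the truncation
# of the size distribution at 50-mers.
n = [1.0] + [0.0] * 49
for _ in range(1000):
    n = smoluchowski_step(n)
number = sum(n)                                   # close to 2/3 at t = 1
mass = sum((k + 1) * c for k, c in enumerate(n))  # close to 1
```

Fitting the early-time decay of `number` (equivalently, the growth of the mean aggregate size) against such a model is one way a dimerization constant can be extracted from light-scattering data.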