833 results for dissimilarity measures
Abstract:
Given a fixed continuous function φ : S^1 → R and an endomorphism f : S^1 → S^1, we study the f-invariant probability measures that maximize ∫ φ dμ. We prove that the set of endomorphisms having a maximizing invariant measure supported on a periodic orbit is C^0-dense.
Abstract:
We prove that given a compact n-dimensional connected Riemannian manifold X and a continuous function g : X → R, there exists a dense subset of the space of homeomorphisms of X such that for all T in this subset, the integral ∫_X g dμ, considered as a function on the space of all T-invariant Borel probability measures μ, attains its maximum on a measure supported on a periodic orbit.
Abstract:
Objective: To investigate whether advanced visualizations of spirography-based objective measures are useful in differentiating drug-related motor dysfunctions, namely Off and dyskinesia, in Parkinson's disease (PD).

Background: During the course of a 3-year longitudinal clinical study, a total of 65 patients (43 males and 22 females, mean age 65) with advanced PD and 10 healthy elderly (HE) subjects (5 males and 5 females, mean age 61) were assessed. Both patients and HE subjects performed repeated, time-stamped assessments of their objective health indicators using a test battery implemented on a telemetry touch-screen handheld computer in their home environments. Among other tasks, the subjects were asked to trace a pre-drawn Archimedes spiral using the dominant hand and to repeat the test three times per test occasion.

Methods: A web-based framework was developed to enable visual exploration of relevant spirography-based kinematic features by clinicians so that they can in turn evaluate the motor states of the patients, i.e. Off and dyskinesia. The system uses different visualization techniques, such as time-series plots, animation, and interaction, and organizes them into different views to aid clinicians in measuring spatial and time-dependent irregularities that could be associated with the motor states. Along with the animation view, the system displays two time-series plots representing drawing speed (blue line) and displacement from the ideal trajectory (orange line). The views are coordinated and linked, i.e. user interactions in one view are reflected in the other views. For instance, when the user points at a pixel in the spiral view, the circle size of the underlying pixel increases and a vertical line appears in the time-series views to mark the corresponding position. In addition, to enable clinicians to observe erratic movements more clearly and thus improve the detection of irregularities, the system displays a color map that indicates the temporal progression of the spirography task. Figure 2 shows single, randomly selected spirals drawn by: A) a patient who experienced dyskinesias, B) an HE subject, and C) a patient in the Off state.

Results: According to a domain expert (DN), the spirals drawn in the Off and dyskinesia motor states are characterized by different spatial and time features. For instance, the spiral shown in Fig. 2A was drawn by a patient who showed symptoms of dyskinesia: the drawing speed was relatively high (cf. the blue-colored time-series plot and the short timestamp scale on the x axis) and the spatial displacement was high (cf. the orange-colored time-series plot), associated with smooth deviations resulting from uncontrollable movements. The patient also exhibited a low amount of hesitation, which was reflected both in the animation of the spiral and in the time-series plots. In contrast, the patient who was in the Off state exhibited different kinematic features, as shown in Fig. 2C. In the case of spirals drawn by an HE subject, there was great precision during the drawing process as well as unchanging levels of time-dependent features over the test trial, as seen in Fig. 2B.

Conclusions: Visualizing spirography-based objective measures enables identification of trends and patterns of drug-related motor dysfunctions at the individual patient level. Dynamic access to visualized motor tests may be useful during the evaluation of drug-related complications such as under- and over-medication, providing decision support to clinicians during the evaluation of treatment effects and improving the quality of life of patients and their caregivers. In the future, we plan to evaluate the proposed approach by assessing within- and between-clinician variability in ratings in order to determine its actual usefulness, and then use these ratings as target outcomes in supervised machine learning, similarly to what was previously done in the study by Memedi et al. (2013).
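A minimal sketch, assuming timestamped (x, y) samples of the traced spiral, of how the two time-series views (drawing speed and displacement from the ideal trajectory) could be derived; the function name, the centre estimate, and the least-squares fit of the Archimedes spiral are illustrative assumptions, not the study's actual pipeline:

```python
import numpy as np

def spiral_kinematics(t, x, y):
    """Derive drawing speed and radial displacement from an ideal
    Archimedes spiral (r = a + b*theta) for a traced spiral.
    t, x, y: 1-D numpy arrays of timestamps and screen coordinates.
    Illustrative only; the published system may compute these differently."""
    # Drawing speed: Euclidean step length over elapsed time between samples.
    dt = np.diff(t)
    speed = np.hypot(np.diff(x), np.diff(y)) / np.where(dt > 0, dt, np.nan)

    # Unwrap the polar angle so theta grows monotonically along the trace.
    cx, cy = x.mean(), y.mean()              # crude centre estimate
    theta = np.unwrap(np.arctan2(y - cy, x - cx))
    r = np.hypot(x - cx, y - cy)

    # Least-squares fit of the ideal spiral r = a + b*theta.
    A = np.column_stack([np.ones_like(theta), theta])
    (a, b), *_ = np.linalg.lstsq(A, r, rcond=None)

    # Displacement from the ideal trajectory at each sample.
    displacement = np.abs(r - (a + b * theta))
    return speed, displacement
```

Plotting speed and displacement against t would then reproduce the blue and orange series of the linked time-series views described above.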
Abstract:
Woodworking industries still suffer from wood dust problems, and young workers are especially vulnerable to safety risks. To reduce risks, it is important to change attitudes and increase knowledge about safety. Safety training has been shown to establish positive attitudes towards safety among employees. The aim of the current study is to analyze the effect of QR codes linking to Picture Mix EXposure (PIMEX) videos, by analyzing students' attitudes towards this safety training method and towards safety. Safety training videos were used in upper secondary school handicraft programs to demonstrate wood dust risks and methods to decrease exposure to wood dust. A preliminary study was conducted in two schools to investigate improvements to safety training, in preparation for the main study, which investigated the safety training method in three schools. In the preliminary study the PIMEX method was used first: students were filmed while their wood dust exposure was measured and displayed on a computer screen in real time. Before and after the filming, teachers, students, and researchers together analyzed wood dust risks and effective measures to reduce exposure. For the main study, QR codes linked to PIMEX videos were attached to wood processing machines. Subsequent interviews showed that this safety training method enables students, at an early stage of their lives, to learn about risks and safety measures for controlling wood dust exposure. The new combination of methods can create awareness and change attitudes and motivation among students so that they work more consistently to reduce wood dust.
Abstract:
Market timing performance of mutual funds is usually evaluated with linear models with dummy variables, which allow the beta coefficient of the CAPM to vary across two regimes: bullish and bearish market excess returns. Managers, however, use their predictions of the state of nature, rather than the observed states, to decide whether to carry low- or high-beta portfolios. Our approach here is to take this into account and model market timing as a switching regime, in a way similar to Hamilton's Markov-switching GNP model. We then build a measure of market timing success and apply it to simulated and real-world data.
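A minimal sketch of the modelling idea, assuming arrays of fund and market excess returns; the use of statsmodels' MarkovRegression and all variable names are illustrative assumptions, not the paper's estimation code:

```python
import statsmodels.api as sm

def fit_switching_beta(fund_excess, mkt_excess):
    """Instead of a dummy-variable CAPM, let beta switch with a latent
    2-state Markov chain, in the spirit of Hamilton's regime-switching model.
    fund_excess, mkt_excess: 1-D arrays of fund and market excess returns."""
    model = sm.tsa.MarkovRegression(
        fund_excess,
        k_regimes=2,
        exog=mkt_excess,          # switching slope = regime-specific beta
        switching_variance=True,
    )
    res = model.fit()
    # Smoothed regime probabilities; a successful market timer should carry
    # the high-beta regime mainly when the market rises.
    return res.params, res.smoothed_marginal_probabilities
```

The paper's measure of market-timing success is then built on top of the estimated regimes; the sketch only recovers those inputs.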
An ordering of measures of the welfare cost of inflation in economies with interest-bearing deposits
Abstract:
This paper builds on Lucas (2000) and on Cysne (2003) to derive and order six alternative measures of the welfare costs of inflation (five of which already exist in the literature) for any vector of opportunity costs. The ordering of the functions is carried out for economies with or without interest-bearing deposits. We provide examples and closed-form solutions for the log-log money demand, both in the unidimensional and in the multidimensional setting (when interest-bearing monies are present). An estimate of the maximum relative error a researcher can incur when using any particular measure is also provided.
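As an illustration of the kind of closed-form solution available for the log-log specification, Bailey's consumer-surplus measure, a common benchmark in this literature, can be computed explicitly for the money demand m(r) = A r^(-eta) with 0 < eta < 1 (the paper's six measures and its notation may differ):

```latex
% Bailey's consumer-surplus measure of the welfare cost of inflation
% for the log-log money demand m(r) = A r^{-\eta}, 0 < \eta < 1.
B(r) \;=\; \int_0^{r} m(x)\,dx \;-\; r\,m(r)
      \;=\; \frac{A}{1-\eta}\,r^{1-\eta} \;-\; A\,r^{1-\eta}
      \;=\; \frac{\eta}{1-\eta}\,A\,r^{1-\eta}.
```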
Abstract:
This paper presents semiparametric estimators of changes in inequality measures of a dependent variable distribution, taking into account possible changes in the distributions of covariates. When we do not impose parametric assumptions on the conditional distribution of the dependent variable given covariates, this problem becomes equivalent to estimating the distributional impacts of interventions (treatments) when selection into the program is based on observable characteristics. The distributional impacts of a treatment are calculated as differences in inequality measures of the potential outcomes of receiving and not receiving the treatment. These differences are called here Inequality Treatment Effects (ITE). The estimation procedure involves a first nonparametric step in which the probability of receiving treatment given covariates, the propensity score, is estimated. In the second step, weighted sample versions of inequality measures are computed, using the inverse probability weighting method to estimate parameters of the marginal distributions of potential outcomes. Root-N consistency, asymptotic normality, and semiparametric efficiency are shown for the proposed semiparametric estimators. A Monte Carlo exercise is performed to investigate the finite-sample behavior of the estimators derived in the paper. We also apply our method to the evaluation of a job training program.
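A minimal sketch of the two-step procedure, with the Gini coefficient standing in for whichever inequality measure is of interest; the logistic-regression propensity score and all names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def weighted_gini(y, w):
    """Gini coefficient of outcomes y under sample weights w (numpy arrays)."""
    order = np.argsort(y)
    y, w = y[order], w[order]
    cw = np.cumsum(w)
    cyw = np.cumsum(y * w)
    # Weighted Lorenz-curve formula for the Gini coefficient.
    prev = np.concatenate(([0.0], cyw[:-1]))
    return 1.0 - np.sum(w * (cyw + prev)) / (cw[-1] * cyw[-1])

def inequality_treatment_effect(y, d, X):
    """y: outcome, d: treatment indicator (0/1), X: covariate matrix.
    Step 1: estimate the propensity score p(X) = P(D=1 | X).
    Step 2: reweight by 1/p(X) and 1/(1-p(X)) to recover the marginal
    distributions of the two potential outcomes, then difference the
    inequality measures (here, the Gini)."""
    p = LogisticRegression(max_iter=1000).fit(X, d).predict_proba(X)[:, 1]
    gini_treated = weighted_gini(y[d == 1], 1.0 / p[d == 1])
    gini_control = weighted_gini(y[d == 0], 1.0 / (1.0 - p[d == 0]))
    return gini_treated - gini_control
```

This toy code ignores the asymptotic theory (root-N consistency, efficiency) that the paper establishes for estimators of this form.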
Abstract:
We outline possible actions to be adopted by the European Union to ensure a better share of total coffee revenues for producers in developing countries. Ultimately, this translates into producers receiving a fair price for the commodity they supply, i.e., a market price that results from fair market conditions along the whole coffee producing chain. We argue for proposals to be implemented in the consuming countries, as market conditions on the consuming-country side of the coffee producing chain are not fair; market failures and ingenious distortions are responsible for the enormous asymmetry of gains between the two sides. The first of three proposals for consumer-government-supported actions is to help create domestic trading companies for achieving higher export volumes. These trading companies would be associated with roasters that, depending on the final product envisaged, could perform the roasting in the country and export the roasted, and sometimes ground, coffee, breaking the increasing importer-exporter verticalisation. Another measure would be the systematic provision of basic intelligence on the consuming markets. Statistics on the quantities sold according to mode of consumption, by broad "categories of coffee" and point of sale, could be produced for each country. They should be matched to the export/import data and complemented by (aggregate) country statistics on the roasting sector. This would greatly help producing countries design their own market and production strategies. Finally, a fund backed by a common EU tax on roasted coffee, created within the single-market tax harmonisation programme, is suggested. This European Coffee Fund would have two main projects. Together with the ICO, it would launch an advertising campaign on coffee in general, aimed at counterbalancing the increasing "brandification" of coffee. Basic information on the characteristics of the plant and the drink would be conveyed, and the effort could be extended to the future Eastern European members of the Union, as a further assurance that EU processors would not have overly privileged access to these new markets. A quality label for every coffee sold in the Union could complement this initiative, helping to create a level playing field for products from outside the EU. A second project would consist of a careful diversification effort, to take place in selected producing countries.
Abstract:
This paper presents three contributions to the literature on the welfare cost of inflation. First, it introduces a new and sensible way of measuring this cost: that of a compensating variation in consumption or income, instead of the equivalent-variation notion that has been extensively used in empirical and theoretical research during the past fifty years. We find this new measure to be interestingly related to the proxy measure of the shopping-time welfare cost of inflation introduced by Simonsen and Cysne (2001). Second, it discusses for which money-demand functions this and the shopping-time measure can be evaluated in an economically meaningful way. And, last but not least, it completely orders a comprehensive set of measures of the welfare cost of inflation for these money-demand specifications. All of our results are extended to an economy in which many types of monies are present, and are illustrated with the log-log money-demand specification.
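In generic terms, letting U(y, r) denote indirect utility at income y and nominal interest rate r, the two notions adjust income on opposite sides of the comparison (a textbook formalization added for orientation; the paper's exact definitions may differ):

```latex
% Compensating variation w_C(r): extra income needed at interest rate r
% to restore the utility enjoyed at r = 0.
U\bigl((1 + w_C(r))\,y,\; r\bigr) \;=\; U(y,\, 0)

% Equivalent variation w_E(r): income the agent would give up at r = 0
% to be as well off as it actually is at interest rate r.
U\bigl((1 - w_E(r))\,y,\; 0\bigr) \;=\; U(y,\, r)
```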
Abstract:
In this paper we construct sunspot equilibria that arise from chaotic deterministic dynamics. These equilibria are robust and therefore observable. We prove that they may be learned by a simple rule based on the histograms of past state variables. This work gives a theoretical justification for deterministic models that might compete with stochastic models in explaining real data.
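A toy illustration of the learning rule, with the fully chaotic logistic map standing in for whatever deterministic dynamics drive the sunspot variable (none of this is the paper's model): agents tabulate the histogram of past realizations of the state variable, and the empirical frequencies settle down because the map has a well-defined invariant measure.

```python
import numpy as np

def histogram_beliefs(x0=0.2, n=100_000, bins=50):
    """Iterate the chaotic logistic map x_{t+1} = 4 x_t (1 - x_t) and form
    the histogram of past states that agents use as their forecast beliefs.
    For this map the empirical frequencies converge to the known invariant
    density 1 / (pi * sqrt(x (1 - x)))."""
    x = np.empty(n)
    x[0] = x0
    for t in range(n - 1):
        x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
    freq, edges = np.histogram(x, bins=bins, range=(0.0, 1.0), density=True)
    return freq, edges
```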
Abstract:
In trade agreements, governments can design remedies to ensure compliance (property rule) or to compensate victims (liability rule). This paper describes an economic framework to explain the pattern of remedies over non-tariff restrictions—particularly domestic subsidies and nonviolation complaints subject to liability rules. The key determinants of the contract form for any individual measure are the expected joint surplus from an agreement and the expected loss to the constrained government. The loss is higher for domestic subsidies and nonviolations because these are the policies most likely to correct domestic distortions. Governments choose property rules when expected gains from compliance are sufficiently high and expected losses to the constrained country are sufficiently low. Liability rules are preferable when dispute costs are relatively high, because inefficiencies in the compensation process reduce the number of socially inefficient disputes filed.
Abstract:
In this paper I will investigate the conditions under which a convex capacity (or a non-additive probability that exhibits uncertainty aversion) can be represented as a squeeze of an (additive) probability measure associated with an uncertainty aversion function. Then I will present two alternative formulations of the Choquet integral (and extend these formulations to the Choquet expected utility) in a parametric approach that will enable me to perform comparative-statics exercises over the uncertainty aversion function in an easy way.
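For reference, the standard definition of the Choquet integral of a bounded function u with respect to a capacity ν, the object that the paper's parametric reformulations build on (generic notation, not the paper's):

```latex
% Choquet integral of u with respect to the capacity \nu.
\int u \, d\nu
  \;=\; \int_{0}^{\infty} \nu\bigl(\{\, s : u(s) \ge t \,\}\bigr)\, dt
  \;+\; \int_{-\infty}^{0} \Bigl[\nu\bigl(\{\, s : u(s) \ge t \,\}\bigr) - 1\Bigr]\, dt
```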
Abstract:
This article proposes an alternative methodology for estimating the effects of non-tariff measures on trade flows, based on the recent literature on gravity models. A two-stage Heckman selection model is applied to the case of Brazilian exports, where the second-stage gravity equation is theoretically grounded on the seminal Melitz model of heterogeneous firms. This extended gravity equation highlights the role played by zero trade flows as well as firm heterogeneity in explaining bilateral trade among countries, two factors usually omitted in traditional gravity specifications found in the previous literature. Last, it also proposes an economic rationale for the effects of NTM on trade flows, helping to shed some light on their main operating channels under a rather simple Cournot duopolistic-competition framework.
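A minimal sketch of the two-stage estimation: a probit selection equation for positive trade flows followed by a log-linear gravity equation augmented with the inverse Mills ratio. The covariates, the NTM regressor, and the omission of exclusion restrictions are illustrative simplifications, not the paper's specification:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import norm

def heckman_gravity(df: pd.DataFrame):
    """df columns: trade (bilateral exports, >= 0), log_gdp_o, log_gdp_d,
    log_dist, ntm. Illustrative two-step Heckman correction for zero flows."""
    Z = sm.add_constant(df[["log_gdp_o", "log_gdp_d", "log_dist", "ntm"]])

    # Stage 1: probit for the probability of observing a positive flow.
    positive = (df["trade"] > 0).astype(int)
    probit = sm.Probit(positive, Z).fit(disp=0)
    xb = Z.to_numpy() @ probit.params.to_numpy()      # linear index
    mills = pd.Series(norm.pdf(xb) / norm.cdf(xb), index=df.index)

    # Stage 2: log-linear gravity equation on positive flows only, with the
    # inverse Mills ratio correcting for selection into trading.
    # (Exclusion restrictions are omitted here for brevity.)
    pos = df["trade"] > 0
    X = Z.loc[pos].assign(mills=mills[pos])
    ols = sm.OLS(np.log(df.loc[pos, "trade"]), X).fit()
    return probit, ols
```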