Abstract:
We propose to analyze shapes as “compositions” of distances in Aitchison geometry, as an alternative and complementary tool to classical shape analysis, especially when size is non-informative. Shapes are typically described by the location of user-chosen landmarks. However, the shape – considered as invariant under scaling, translation, mirroring and rotation – does not uniquely define the location of landmarks. A simple approach is to use distances between landmarks instead of the locations of the landmarks themselves. Distances are positive numbers defined up to joint scaling, a mathematical structure quite similar to compositions. The shape fixes only the ratios of distances. Perturbations correspond to relative changes of the size of subshapes and of aspect ratios. The power transform increases the expression of the shape by increasing distance ratios. In analogy to subcompositional consistency, results should not depend too much on the choice of distances, because different subsets of the pairwise distances of landmarks uniquely define the shape. Various compositional analysis tools can be applied to sets of distances directly or after minor modifications concerning the singularity of the covariance matrix, and yield results with direct interpretations in terms of shape changes. The remaining problem is that not all sets of distances correspond to a valid shape. Nevertheless, interpolated or predicted shapes can be back-transformed by multidimensional scaling (when all pairwise distances are used) or free geodetic adjustment (when sufficiently many distances are used).
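The perturbation and power transform mentioned here are the standard Aitchison operations, applied to the vector of pairwise landmark distances. A minimal sketch of the idea (plain NumPy; the triangle landmarks and the closure to unit sum are illustrative assumptions, not taken from the paper):

```python
import numpy as np
from itertools import combinations

def pairwise_distances(points):
    """All pairwise Euclidean distances between 2D landmarks."""
    return np.array([np.linalg.norm(points[i] - points[j])
                     for i, j in combinations(range(len(points)), 2)])

def closure(x):
    """Rescale a positive vector to unit sum (removes joint scale)."""
    return x / x.sum()

def perturb(x, y):
    """Aitchison perturbation: component-wise relative change."""
    return closure(x * y)

def power(x, a):
    """Aitchison power transform: exaggerates distance ratios for a > 1."""
    return closure(x ** a)

# A triangle shape described by its three pairwise distances (illustrative).
triangle = np.array([[0.0, 0.0], [1.0, 0.0], [0.3, 0.8]])
d = closure(pairwise_distances(triangle))

# Scaling the landmarks leaves the closed distance composition unchanged.
assert np.allclose(d, closure(pairwise_distances(2.5 * triangle)))

print("distance composition:", d)
print("power-transformed   :", power(d, 2.0))  # sharper shape expression
```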
Abstract:
A condition needed for testing nested hypotheses from a Bayesian viewpoint is that the prior for the alternative model concentrates mass around the small, or null, model. For testing independence in contingency tables, the intrinsic priors satisfy this requirement. Further, the degree of concentration of the priors is controlled by a discrete parameter m, the training sample size, which plays an important role in the resulting answer regardless of the sample size. In this paper we study the robustness of the tests of independence in contingency tables with respect to intrinsic priors with different degrees of concentration around the null, and compare with other “robust” results by Good and Crook. Consistency of the intrinsic Bayesian tests is established. We also discuss conditioning issues and sampling schemes, and argue that conditioning should be on either one margin or the table total, but not on both margins. Examples using real and simulated data are given.
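The intrinsic-prior computations are not reproduced here, but the flavor of a Bayesian independence test can be conveyed with the standard conjugate analysis: uniform Dirichlet priors on the cell, row, and column probabilities under multinomial sampling conditioned on the table total (a Gunel-and-Dickey-style Bayes factor, not the intrinsic-prior test of the paper). A minimal sketch:

```python
import numpy as np
from scipy.special import gammaln

def log_mvbeta(alpha):
    """Log of the multivariate beta function B(alpha)."""
    return gammaln(alpha).sum() - gammaln(alpha.sum())

def log_bf01_independence(table):
    """Log Bayes factor for independence (H0) against a saturated
    alternative (H1), with Dirichlet(1) priors throughout; the
    multinomial coefficient cancels in the ratio."""
    n = np.asarray(table, dtype=float)
    rows, cols, cells = n.sum(axis=1), n.sum(axis=0), n.ravel()
    ones = np.ones_like
    log_m0 = (log_mvbeta(rows + 1) - log_mvbeta(ones(rows)) +
              log_mvbeta(cols + 1) - log_mvbeta(ones(cols)))
    log_m1 = log_mvbeta(cells + 1) - log_mvbeta(ones(cells))
    return log_m0 - log_m1

# Illustrative 2x3 table; positive values favor independence.
print(log_bf01_independence([[12, 7, 9], [5, 10, 8]]))
```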
Abstract:
In this paper we explore the sectoral and aggregate implications of some endogenization rules (i.e. on value-added and final demand) that have been common in the Leontief model and have recently been proposed for the Ghosh model. We detect that these rules may give rise, in both models, to some allegedly pathological behavior, in the sense that sectoral or aggregate output may very often fail to follow the logical and economically expected direct relationship with some underlying endogenous variables—namely, output and value-added in the Ghosh model and output and consumption in the Leontief model. Because of the common mathematical structure, whatever is or seems to be pathological in the Ghosh model also has a symmetric counterpart in the Leontief model. This would not be good news for the inner consistency of these linear models. To avoid such possible inconsistencies, we propose new and simple endogenization rules that have a sound economic interpretation.
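For readers unfamiliar with the two models, the standard open quantity versions can be stated in a few lines: gross output x = (I - A)^-1 f in the Leontief model and x' = v'(I - B)^-1 in the Ghosh model. The sketch below uses toy two-sector data and shows only the classical models, not the paper's new endogenization rules:

```python
import numpy as np

# Toy 2-sector inter-industry flow matrix Z, final demand f (illustrative).
Z = np.array([[20.0, 30.0],
              [40.0, 10.0]])
f = np.array([50.0, 60.0])
x = Z.sum(axis=1) + f           # gross output by sector

A = Z / x                       # technical coefficients A[i,j] = Z[i,j]/x[j]
B = Z / x[:, None]              # allocation coefficients B[i,j] = Z[i,j]/x[i]
v = x - Z.sum(axis=0)           # value-added by sector

# Leontief quantity model: output driven by final demand.
x_leontief = np.linalg.solve(np.eye(2) - A, f)

# Ghosh supply-driven model: output driven by value-added.
x_ghosh = np.linalg.solve((np.eye(2) - B).T, v)

print(x, x_leontief, x_ghosh)   # all three coincide on consistent data
```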
Abstract:
In this paper we axiomatize the strong constrained egalitarian solution (Dutta and Ray, 1991) over the class of weak superadditive games using constrained egalitarianism, order-consistency, and converse order-consistency. JEL classification: C71, C78. Keywords: Cooperative TU-game, strong constrained egalitarian solution, axiomatization.
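For context, Dutta and Ray's original constrained egalitarian allocation can, for convex games, be computed by repeatedly giving the largest coalition of maximal average worth its per-capita value and recursing on the reduced game. A brute-force sketch of that classical algorithm (of the original solution for convex games, not of the strong solution axiomatized here; the three-player game is illustrative):

```python
from itertools import combinations

def dutta_ray_convex(players, v):
    """Constrained egalitarian allocation (Dutta and Ray) for a convex
    TU-game, by brute force. v maps frozensets of players to worths."""
    payoff, eaten, remaining = {}, frozenset(), set(players)
    while remaining:
        best, best_avg = None, float("-inf")
        for k in range(1, len(remaining) + 1):
            for S in combinations(sorted(remaining), k):
                # Worth of S in the game reduced by coalitions served so far.
                avg = (v[frozenset(S) | eaten] - v[eaten]) / k
                if avg > best_avg or (avg == best_avg and len(S) > len(best)):
                    best, best_avg = S, avg
        for i in best:
            payoff[i] = best_avg       # everyone in the coalition gets the average
        eaten |= frozenset(best)
        remaining -= set(best)
    return payoff

# Illustrative 3-player convex game.
worth = {frozenset(s): w for s, w in [
    ((), 0), ((1,), 0), ((2,), 0), ((3,), 0),
    ((1, 2), 2), ((1, 3), 2), ((2, 3), 2), ((1, 2, 3), 4.5)]}
print(dutta_ray_convex([1, 2, 3], worth))  # {1: 1.5, 2: 1.5, 3: 1.5}
```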
Abstract:
The starting point of this research is the widely used rhetoric that the EU is a global actor. In view of this, EU non-proliferation policy towards the Southern Mediterranean is examined. The study is carried out on the basis of a conceptualization of EU “actorness” and through several criteria (the external context, the evolution of the EU's foreign policy apparatus, the EU's self-presentation and the perception of third parties, consistency, and the availability of policy instruments and concrete actions) involving both ideational and material factors, in accordance with “methodological pluralism”. This conceptual framework helped to assess EU non-proliferation policy in this particular region, where the EU has interests and good reasons to act. Each of the criteria reveals the advantages and disadvantages of EU “actorness” in the selected field and case. This paper argues that EU non-proliferation “actorness” in the Southern Mediterranean region has been limited for a variety of reasons.
Abstract:
The aim of this paper is to present the training model of the Bachelor's degree programmes of the Facultad de Educación Social y Trabajo Social Pere Tarrés of the Universidad Ramón Llull, now that the first three years of the degree programmes have been rolled out. We intend to present the transversal competences common to all subjects, as well as the methodologies and activities used to facilitate the students' learning process. We want to show, on the one hand, that our proposal supports the integration of the new educational model set out in the Bologna Plan, in which the student's activity is one of the central axes of the training process, and, on the other, that it allows us to make a qualitative leap in the training we offer to future professionals of social action. We aim to guarantee coherence between the contents of the subjects that make up a single module and the levels of attainment of the competences over the course of the training period. This implies coordination and teamwork among the teaching staff to guarantee the coverage of the contents while avoiding gaps or overlaps between subjects.
Abstract:
This paper deals with fault detection and isolation problems for nonlinear dynamic systems. Both problems are stated as constraint satisfaction problems (CSP) and solved using consistency techniques. The main contribution is an isolation method based on consistency techniques and on refining the uncertainty space of interval parameters. The major advantage of this method is that the isolation speed is fast even taking into account uncertainty in parameters, measurements, and model errors. Interval calculations bring independence from the monotonicity assumption made by several observer-based approaches to fault isolation. An application to a well-known alcoholic fermentation process model is presented.
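The consistency test at the heart of such methods checks whether any parameter value inside an interval box can reproduce the measurements; inconsistent boxes are discarded and undecided ones bisected. A minimal SIVIA-style sketch for a toy exponential-decay model (the model, bounds, and tolerance are illustrative assumptions; real FDI tools use general interval arithmetic rather than the monotonicity shortcut used here):

```python
import numpy as np

t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = 2.0 * np.exp(-0.8 * t)              # noise-free "measurements"
tol = 0.05                              # measurement uncertainty band

def envelope(box):
    """Interval image of y(t) = a*exp(-b*t) over box = (a_lo,a_hi,b_lo,b_hi).
    The model is monotone in a and b for t >= 0, so corner evaluation
    is exact here (a shortcut; general models need interval arithmetic)."""
    a_lo, a_hi, b_lo, b_hi = box
    return a_lo * np.exp(-b_hi * t), a_hi * np.exp(-b_lo * t)

def consistent_boxes(box, depth=12):
    """Bisect the parameter box, keeping only parts consistent with data."""
    lo, hi = envelope(box)
    if np.any(hi < y - tol) or np.any(lo > y + tol):
        return []                       # no parameter in the box fits: discard
    if depth == 0:
        return [box]                    # undecided at finest resolution
    a_lo, a_hi, b_lo, b_hi = box
    if a_hi - a_lo > b_hi - b_lo:       # bisect the widest side
        mid = 0.5 * (a_lo + a_hi)
        halves = [(a_lo, mid, b_lo, b_hi), (mid, a_hi, b_lo, b_hi)]
    else:
        mid = 0.5 * (b_lo + b_hi)
        halves = [(a_lo, a_hi, b_lo, mid), (a_lo, a_hi, mid, b_hi)]
    return [b for h in halves for b in consistent_boxes(h, depth - 1)]

boxes = consistent_boxes((0.5, 4.0, 0.1, 2.0))
print(len(boxes), "boxes remain around (a, b) = (2.0, 0.8)")
```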
Abstract:
Background: The COSMIN checklist (COnsensus-based Standards for the selection of health status Measurement INstruments) was developed in an international Delphi study to evaluate the methodological quality of studies on measurement properties of health-related patient-reported outcomes (HR-PROs). In this paper, we explain our choices for the design requirements and preferred statistical methods for which no evidence is available in the literature or on which the Delphi panel members had substantial discussion. Methods: The issues described in this paper are a reflection of the Delphi process in which 43 panel members participated. Results: The topics discussed are internal consistency (relevance for reflective and formative models, and distinction from unidimensionality), content validity (judging relevance and comprehensiveness), hypotheses testing as an aspect of construct validity (specificity of hypotheses), criterion validity (relevance for PROs), and responsiveness (concept and relation to validity, and (in)appropriate measures). Conclusions: We expect that this paper will contribute to a better understanding of the rationale behind the items, thereby enhancing the acceptance and use of the COSMIN checklist.
Abstract:
Background: Despite the fact that labour market flexibility has resulted in an expansion of precarious employment in industrialized countries, to date there is limited empirical evidence about its health consequences. The Employment Precariousness Scale (EPRES) is a newly developed, theory-based, multidimensional questionnaire specifically devised for epidemiological studies among waged and salaried workers. Objective: To assess the acceptability, reliability and construct validity of EPRES in a sample of waged and salaried workers in Spain. Methods: Cross-sectional study, using a sub-sample of 6,968 temporary and permanent workers from a population-based survey carried out in 2004-2005. The survey questionnaire was interviewer-administered and included the six EPRES subscales, measures of the psychosocial work environment (COPSOQ ISTAS21), and perceived general and mental health (SF-36). Results: A high response rate to all EPRES items indicated good acceptability; Cronbach’s alpha coefficients over 0.70 for all subscales and the global score demonstrated good internal consistency reliability; exploratory factor analysis using principal axis analysis and varimax rotation confirmed the six-subscale structure and the theoretical allocation of all items. Patterns across known groups and correlation coefficients with psychosocial work environment measures and perceived health demonstrated the expected relations, providing evidence of construct validity. Conclusions: Our results provide evidence in support of the psychometric properties of EPRES, which appears to be a promising tool for the measurement of employment precariousness in public health research.
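Cronbach's alpha, the internal-consistency statistic reported above, is simple to compute from a respondents-by-items score matrix. A minimal sketch on synthetic data (the item matrix is simulated for illustration, not EPRES data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Synthetic 6-item subscale: one latent factor plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 1))
scores = latent + 0.8 * rng.normal(size=(500, 6))
print(round(cronbach_alpha(scores), 2))  # well above the 0.70 benchmark
```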
Abstract:
Background: Choosing an adequate measurement instrument depends on the proposed use of the instrument, the concept to be measured, the measurement properties (e.g. internal consistency, reproducibility, content and construct validity, responsiveness, and interpretability), the requirements, the burden for subjects, and the costs of the available instruments. As far as measurement properties are concerned, there are no sufficiently specific standards for the evaluation of measurement properties of instruments to measure health status, and also no explicit criteria for what constitutes good measurement properties. In this paper we describe the protocol for the COSMIN study, the objective of which is to develop a checklist that contains COnsensus-based Standards for the selection of health Measurement INstruments, including explicit criteria for satisfying these standards. We will focus on evaluative health-related patient-reported outcomes (HR-PROs), i.e. patient-reported health measurement instruments used in a longitudinal design as an outcome measure, excluding health care related PROs, such as satisfaction with care or adherence. The COSMIN standards will be made available in the form of an easily applicable checklist. Method: An international Delphi study will be performed to reach consensus on which measurement properties should be assessed and how, and on criteria for good measurement properties. Two sources of input will be used for the Delphi study: (1) a systematic review of properties, standards and criteria of measurement properties found in systematic reviews of measurement instruments, and (2) an additional literature search of methodological articles presenting a comprehensive checklist of standards and criteria. The Delphi study will consist of four (written) Delphi rounds, with approximately 30 expert panel members with different backgrounds in clinical medicine, biostatistics, psychology, and epidemiology. The final checklist will subsequently be field-tested by assessing the inter-rater reproducibility of the checklist. Discussion: Since the study will mainly be anonymous, problems that are commonly encountered in face-to-face group meetings, such as the dominance of certain persons in the communication process, will be avoided. By performing a Delphi study and involving many experts, the likelihood that the checklist will have sufficient credibility to be accepted and implemented will increase.
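Inter-rater reproducibility of a checklist item is commonly summarized with Cohen's kappa, which corrects observed agreement for the agreement expected by chance. A minimal sketch (the two raters' item ratings are invented for illustration, not COSMIN field-test data):

```python
from sklearn.metrics import cohen_kappa_score

# Two raters scoring the same 12 studies on one checklist item
# ("poor"/"fair"/"good"/"excellent"); ratings are illustrative.
rater_1 = ["good", "fair", "good", "poor", "excellent", "good",
           "fair", "good", "poor", "good", "excellent", "fair"]
rater_2 = ["good", "fair", "good", "fair", "excellent", "good",
           "fair", "poor", "poor", "good", "good", "fair"]

print(cohen_kappa_score(rater_1, rater_2))  # 1.0 = perfect, 0.0 = chance
```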
Abstract:
This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
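The key construction in TDFFD, recovering displacement by forward Eulerian integration of a time-varying velocity field, can be illustrated in a few lines. The toy analytic velocity field below stands in for the paper's sum of B-spline kernels (a sketch of the integration step only, not of the full registration):

```python
import numpy as np

def velocity(x, t):
    """Toy non-stationary 2D velocity field standing in for the paper's
    sum of spatiotemporal B-spline kernels."""
    return np.stack([-x[..., 1], x[..., 0]], axis=-1) * np.cos(2 * np.pi * t)

def integrate_displacement(points, n_steps=100, t_end=1.0):
    """Forward Euler integration: phi(t+dt) = phi(t) + v(phi(t), t) * dt.
    Returns the displacement phi(t_end) - phi(0) at the given points."""
    phi = points.astype(float).copy()
    dt = t_end / n_steps
    for k in range(n_steps):
        phi += velocity(phi, k * dt) * dt
    return phi - points

pts = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
print(integrate_displacement(pts))
```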
Abstract:
This paper presents a technique to estimate and model patient-specific pulsatility of cerebral aneurysms over one cardiac cycle, using 3D rotational X-ray angiography (3DRA) acquisitions. Aneurysm pulsation is modeled as a time-varying spline tensor field representing the deformation applied to a reference volume image, thus producing the instantaneous morphology at each time point in the cardiac cycle. The estimated deformation is obtained by matching multiple simulated projections of the deforming volume to their corresponding original projections. A weighting scheme is introduced to account for the relevance of each original projection for the selected time point. The wide coverage of the projections, together with the weighting scheme, ensures motion consistency in all directions. The technique has been tested on digital and physical phantoms that are realistic and clinically relevant in terms of geometry, pulsation and imaging conditions. Results from digital phantom experiments demonstrate that the proposed technique is able to recover subvoxel pulsation with an error lower than 10% of the maximum pulsation in most cases. The experiments with the physical phantom demonstrated the feasibility of pulsation estimation, as well as the identification of different pulsation regions under clinical conditions.
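Such a weighting scheme can be illustrated with a simple kernel on cyclic cardiac-phase distance: projections acquired close in phase to the selected time point receive high weight in the matching cost. The Gaussian kernel and its width below are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def projection_weights(proj_phases, target_phase, sigma=0.08):
    """Weight of each original projection for estimating the volume at
    target_phase, using a Gaussian kernel on cyclic phase distance
    (phases are fractions of the cardiac cycle in [0, 1))."""
    d = np.abs(proj_phases - target_phase)
    d = np.minimum(d, 1.0 - d)               # wrap around the cardiac cycle
    w = np.exp(-0.5 * (d / sigma) ** 2)
    return w / w.sum()                       # normalize to sum to one

# 120 projections spread over several cardiac cycles during the C-arm sweep.
phases = np.linspace(0.0, 4.7, 120) % 1.0
w = projection_weights(phases, target_phase=0.3)
print(w.max(), w.min())                      # near-phase views dominate
```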
Abstract:
In monetary unions, monetary policy is typically made by delegates of the member countries. This procedure raises the possibility of strategic delegation - that countries may choose the types of delegates to influence outcomes in their favor. We show that without commitment in monetary policy, strategic delegation arises if and only if three conditions are met: shocks affecting individual countries are not perfectly correlated, risk-sharing across countries is imperfect, and the Phillips Curve is nonlinear. Moreover, inflation rates are inefficiently high. We argue that ways of solving the commitment problem, including the emphasis on price stability in the agreements constituting the European Union, are especially valuable when strategic delegation is a problem.
Abstract:
We study the effects of nominal debt on the optimal sequential choice of monetary policy. When the stock of debt is nominal, the incentive to generate unanticipated inflation increases the cost of the outstanding debt even if no unanticipated inflation episodes occur in equilibrium. Without full commitment, the optimal sequential policy is to deplete the outstanding stock of debt progressively until these extra costs disappear. Nominal debt is therefore a burden on monetary policy, not only because it must be serviced, but also because it creates a time inconsistency problem that distorts interest rates. The introduction of alternative forms of taxation may lessen this burden, if there is enough commitment to fiscal policy. If there is full commitment to an optimal fiscal policy, then the resulting monetary policy is the Friedman rule of zero nominal interest rates.
Abstract:
Although research has documented the importance of emotion in risk perception, little is known about its prevalence in everyday life. Using the Experience Sampling Method, 94 part-time students were prompted at random via cellular telephones to report on mood state and three emotions and to assess risk on thirty occasions during their working hours. The emotions valence, arousal, and dominance were measured using self-assessment manikins (Bradley & Lang, 1994). Hierarchical linear models (HLM) revealed that mood state and emotions explained significant variance in risk perception. In addition, valence and arousal accounted for variance over and above reason (measured by severity and possibility of risks). Six risks were reassessed in a post-experimental session and found to be lower than their real-time counterparts. The study demonstrates the feasibility and value of collecting representative samples of data with simple technology. Evidence for the statistical consistency of the HLM estimates is provided in an Appendix.
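A hierarchical linear model of this kind, risk ratings nested within respondents with momentary-affect predictors, can be fitted in a few lines with a random-intercept specification. A minimal sketch on simulated data (variable names and effect sizes are invented, not the study's):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate 94 respondents x 30 occasions with a random intercept per person.
rng = np.random.default_rng(1)
n_subj, n_occ = 94, 30
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_occ),
    "valence": rng.normal(size=n_subj * n_occ),
    "arousal": rng.normal(size=n_subj * n_occ),
})
intercepts = rng.normal(scale=0.5, size=n_subj)
df["risk"] = (intercepts[df["subject"]] - 0.3 * df["valence"]
              + 0.2 * df["arousal"] + rng.normal(size=len(df)))

# Random-intercept HLM: momentary affect predicting risk perception.
model = smf.mixedlm("risk ~ valence + arousal", df, groups=df["subject"])
print(model.fit().summary())
```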