96 results for Methodological problems
Abstract:
In recent decades, a growing stock of literature has been devoted to the criticism of GDP as an indicator of societal wealth. A relevant question is: what are the prospects for building, on the existing knowledge and consensus, alternative measures of prosperity? A starting point may be to connect the well-being research agenda with the sustainability one. However, there is no doubt that there is a lot of complexity and fuzziness inherent in multidimensional concepts such as sustainability and well-being. This article analyses the theoretical foundations and the empirical validity of some multidimensional technical tools that can be used for well-being evaluation and assessment. Of course, one should not forget that policy conclusions derived through any mathematical model also depend on the conceptual framework used, i.e. which representation of reality (and thus which societal values and interests) has been considered.
Abstract:
Background: The COSMIN checklist (COnsensus-based Standards for the selection of health status Measurement INstruments) was developed in an international Delphi study to evaluate the methodological quality of studies on measurement properties of health-related patient-reported outcomes (HR-PROs). In this paper, we explain our choices for the design requirements and preferred statistical methods for which no evidence is available in the literature or on which the Delphi panel members had substantial discussion. Methods: The issues described in this paper are a reflection of the Delphi process in which 43 panel members participated. Results: The topics discussed are internal consistency (relevance for reflective and formative models, and distinction from unidimensionality), content validity (judging relevance and comprehensiveness), hypotheses testing as an aspect of construct validity (specificity of hypotheses), criterion validity (relevance for PROs), and responsiveness (concept and relation to validity, and (in)appropriate measures). Conclusions: We expect that this paper will contribute to a better understanding of the rationale behind the items, thereby enhancing the acceptance and use of the COSMIN checklist.
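Internal consistency of a reflective scale, the first topic above, is conventionally quantified with Cronbach's alpha. A minimal sketch for readers unfamiliar with the statistic; the data and function names below are made up for illustration and are not part of the COSMIN checklist itself:

```python
# Cronbach's alpha: a standard internal-consistency statistic for a
# reflective multi-item scale (illustrative sketch only).

def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(scores):
    """scores: one list of item scores per respondent."""
    k = len(scores[0])                          # number of items
    items = list(zip(*scores))                  # transpose to per-item columns
    item_var = sum(variance(list(col)) for col in items)
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - item_var / total_var)

# Four respondents answering a three-item scale:
data = [[3, 4, 3], [5, 5, 4], [2, 2, 2], [4, 5, 5]]
print(round(cronbach_alpha(data), 3))  # → 0.955
```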
Abstract:
Background: Choosing an adequate measurement instrument depends on the proposed use of the instrument, the concept to be measured, the measurement properties (e.g. internal consistency, reproducibility, content and construct validity, responsiveness, and interpretability), the requirements, the burden for subjects, and the costs of the available instruments. As far as measurement properties are concerned, there are no sufficiently specific standards for the evaluation of measurement properties of instruments to measure health status, and also no explicit criteria for what constitutes good measurement properties. In this paper we describe the protocol for the COSMIN study, the objective of which is to develop a checklist that contains COnsensus-based Standards for the selection of health Measurement INstruments, including explicit criteria for satisfying these standards. We will focus on evaluative health-related patient-reported outcomes (HR-PROs), i.e. patient-reported health measurement instruments used in a longitudinal design as an outcome measure, excluding health-care-related PROs, such as satisfaction with care or adherence. The COSMIN standards will be made available in the form of an easily applicable checklist. Method: An international Delphi study will be performed to reach consensus on which measurement properties should be assessed and how, and on criteria for good measurement properties. Two sources of input will be used for the Delphi study: (1) a systematic review of properties, standards and criteria of measurement properties found in systematic reviews of measurement instruments, and (2) an additional literature search of methodological articles presenting a comprehensive checklist of standards and criteria. The Delphi study will consist of four (written) Delphi rounds, with approximately 30 expert panel members with different backgrounds in clinical medicine, biostatistics, psychology, and epidemiology.
The final checklist will subsequently be field-tested by assessing the inter-rater reproducibility of the checklist. Discussion: Since the study will mainly be anonymous, problems that are commonly encountered in face-to-face group meetings, such as the dominance of certain persons in the communication process, will be avoided. By performing a Delphi study and involving many experts, the likelihood that the checklist will have sufficient credibility to be accepted and implemented will increase.
Abstract:
Within the framework of the European Higher Education Area (EHEA), European universities are carrying out countless actions intended to reach 2010 with enough groundwork to face this change with confidence and, above all, to preserve (or increase) the quality of the teaching-learning processes that have prevailed so far. Part of this broad process of transformation is the design of the new degree programmes, which offers the opportunity to rethink curricula and, therefore, the organization of courses, the structure of contents, methodologies, and assessment systems. In our view, this reflection must revolve around three interdependent cores: professional scenarios, professional profiles, and the competences they involve. To design the new degrees coherently with the EHEA era, a detailed analysis must be made of the professional profiles demanded by the labour market, which will ultimately be the destination of the professionals trained in our universities; defining the professional profile prior to designing the new degrees is therefore essential to guarantee their quality. This paper presents an example of a methodology for defining a professional profile in Higher Education, specifically that of the ICT Engineer, by means of Functional Analysis.
Abstract:
We present a polyhedral framework for establishing general structural properties on optimal solutions of stochastic scheduling problems, where multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's) taking as input the linear objective coefficients, which (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.
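The simplest concrete instance of a class-ranking index policy of the kind this framework generalizes is the classical c·mu rule: each job class i with holding cost c_i and service rate mu_i receives index c_i·mu_i, and the server always works on the nonempty class with the largest index. A toy sketch with made-up parameters; the paper's adaptive-greedy algorithm computes more general indices than this:

```python
# The classical c*mu priority-index rule (illustrative sketch only;
# not the paper's adaptive-greedy index algorithm).

def c_mu_indices(costs, rates):
    """Priority index c_i * mu_i for each job class."""
    return [c * m for c, m in zip(costs, rates)]

def next_class(queue_lengths, costs, rates):
    """Serve the nonempty class with the largest index (None if all empty)."""
    idx = c_mu_indices(costs, rates)
    candidates = [i for i, q in enumerate(queue_lengths) if q > 0]
    return max(candidates, key=lambda i: idx[i]) if candidates else None

# Three classes; class 2 has the highest index (3.0 * 2.0 = 6.0) and jobs waiting:
print(next_class([2, 0, 1], costs=[1.0, 4.0, 3.0], rates=[1.0, 0.5, 2.0]))  # → 2
```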
Abstract:
In today's competitive markets, the importance of good scheduling strategies in manufacturing companies leads to the need for efficient methods to solve complex scheduling problems. In this paper, we study two production scheduling problems with sequence-dependent setup times. Setup times are one of the most common complications in scheduling problems, and are usually associated with cleaning operations and changing tools and shapes in machines. The first problem considered is a single-machine scheduling problem with release dates, sequence-dependent setup times, and delivery times; the performance measure is the maximum lateness. The second problem is a job-shop scheduling problem with sequence-dependent setup times where the objective is to minimize the makespan. We present several priority dispatching rules for both problems, followed by a study of their performance. Finally, conclusions and directions for future research are presented.
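One classical dispatching rule for the first problem is an extension of Jackson's rule: at each decision point, among the released jobs, serve the one with the largest delivery time. The abstract does not specify which rules the paper tests, so the rule choice and all job/setup data below are illustrative assumptions:

```python
# Dispatching sketch for single-machine scheduling with release dates r_j,
# sequence-dependent setup times s_ij, and delivery times q_j, minimizing
# max_j (C_j + q_j). Rule and data are illustrative, not from the paper.

def schedule(jobs, setup):
    """jobs: name -> (release r, processing p, delivery q).
    setup[(prev, next)]: sequence-dependent setup time (prev=None at start)."""
    t, prev, seq, lmax = 0, None, [], 0
    pending = dict(jobs)
    while pending:
        released = [j for j, (r, _, _) in pending.items() if r <= t]
        if not released:                        # idle until the next release
            t = min(r for r, _, _ in pending.values())
            continue
        j = max(released, key=lambda j: pending[j][2])  # largest q_j first
        r, p, q = pending.pop(j)
        t += setup.get((prev, j), 0) + p        # setup, then processing
        lmax = max(lmax, t + q)                 # completion plus delivery
        seq.append(j)
        prev = j
    return seq, lmax

jobs = {"A": (0, 3, 5), "B": (0, 2, 8), "C": (4, 1, 2)}
setup = {(None, "A"): 1, (None, "B"): 1, ("B", "A"): 2,
         ("A", "B"): 2, ("A", "C"): 1, ("B", "C"): 1}
print(schedule(jobs, setup))  # → (['B', 'A', 'C'], 13)
```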
Abstract:
In many areas of economics there is a growing interest in how expertise and preferences drive individual and group decision making under uncertainty. Increasingly, we wish to estimate such models to quantify which of these drive decision making. In this paper we propose a new channel through which we can empirically identify expertise and preference parameters by using variation in decisions over heterogeneous priors. Relative to existing estimation approaches, our "Prior-Based Identification" extends the possible environments which can be estimated, and also substantially improves the accuracy and precision of estimates in those environments which can be estimated using existing methods.
Abstract:
The P-median problem is a classical location model par excellence. In this paper we first examine the early origins of the problem, formulated independently by Louis Hakimi and Charles ReVelle, two of the fathers of the burgeoning multidisciplinary field of research known today as Facility Location Theory and Modelling. We then examine some of the traditional heuristic and exact methods developed to solve the problem. In the third section we analyze the impact of the model in the field. We end the paper by proposing new lines of research related to such a classical problem.
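A standard heuristic of the kind such surveys cover is the greedy add heuristic: repeatedly open the facility that most reduces total client-to-nearest-facility distance. A minimal sketch; the distance matrix is made up for illustration and is not from the paper:

```python
# Greedy add heuristic for the P-median problem (illustrative sketch).

def greedy_p_median(dist, p):
    """dist[i][j]: distance from client i to candidate site j. Open p sites.
    Returns (sorted opened sites, total distance to nearest open site)."""
    n_clients, n_sites = len(dist), len(dist[0])
    opened = []
    best = [float("inf")] * n_clients   # distance to nearest open site so far
    for _ in range(p):
        def cost_if_open(j):
            return sum(min(best[i], dist[i][j]) for i in range(n_clients))
        j = min((j for j in range(n_sites) if j not in opened), key=cost_if_open)
        opened.append(j)
        best = [min(best[i], dist[i][j]) for i in range(n_clients)]
    return sorted(opened), sum(best)

# Four clients, four candidate sites co-located with them:
d = [[0, 2, 6, 8],
     [2, 0, 5, 7],
     [6, 5, 0, 3],
     [8, 7, 3, 0]]
print(greedy_p_median(d, 2))  # → ([1, 2], 5)
```

The greedy add heuristic is fast but myopic; exact methods and interchange heuristics (e.g. vertex substitution) can improve on its solution.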
Abstract:
From a scientific point of view, surveys are undoubtedly a valuable tool for understanding social and political reality, and they are widely used in social science research. However, the researcher's task is often hampered by a series of deficiencies in certain technical aspects that make both inference and comparison difficult. The main aim of the present paper is to report and justify the European Social Survey's technical specifications, which are designed to avoid and/or minimize such deficiencies. The article also characterizes the non-respondents in Spain, based on an analysis of the 2002 fieldwork data file.
Abstract:
There is a large and growing literature that studies the effects of weak enforcement institutions on economic performance. This literature has focused almost exclusively on primary markets, in which assets are issued and traded to improve the allocation of investment and consumption. The general conclusion is that weak enforcement institutions impair the workings of these markets, giving rise to various inefficiencies. But weak enforcement institutions also create incentives to develop secondary markets, in which the assets issued in primary markets are retraded. This paper shows that trading in secondary markets counteracts the effects of weak enforcement institutions and, in the absence of further frictions, restores efficiency.
Abstract:
The paper develops a method to solve higher-dimensional stochastic control problems in continuous time. A finite difference type approximation scheme is used on a coarse grid of low discrepancy points, while the value function at intermediate points is obtained by regression. The stability properties of the method are discussed, and applications are given to test problems of up to 10 dimensions. Accurate solutions to these problems can be obtained on a personal computer.
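Two of the ingredients named above can be illustrated in one dimension: a Halton (van der Corput) low-discrepancy sequence supplies the coarse grid, and a least-squares fit evaluates the sampled values at intermediate points. This is a toy sketch under those assumptions, not the paper's actual approximation scheme:

```python
# Toy sketch: low-discrepancy grid + regression (not the paper's scheme).

def halton(n, base=2):
    """First n points of the van der Corput/Halton sequence in [0, 1)."""
    pts = []
    for k in range(1, n + 1):
        x, f = 0.0, 1.0
        while k:
            f /= base
            x += f * (k % base)
            k //= base
        pts.append(x)
    return pts

def fit_line(xs, ys):
    """Closed-form simple linear regression y ≈ a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

grid = halton(64)                        # coarse low-discrepancy grid
values = [2.0 * x + 1.0 for x in grid]   # "value function" sampled on the grid
a, b = fit_line(grid, values)            # evaluate elsewhere as a + b*x
print(round(a, 6), round(b, 6))  # → 1.0 2.0
```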
Abstract:
The set covering problem is an NP-hard combinatorial optimization problem that arises in applications ranging from crew scheduling in airlines to driver scheduling in public mass transport. In this paper we analyze search space characteristics of a widely used set of benchmark instances through an analysis of the fitness-distance correlation. This analysis shows that there exist several classes of set covering instances that have a largely different behavior. For instances with high fitness-distance correlation, we propose new ways of generating core problems and analyze the performance of algorithms exploiting these core problems.
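The standard baseline for set covering, against which core-problem algorithms are typically compared, is the greedy heuristic: repeatedly pick the column that covers the most still-uncovered rows per unit cost. A minimal sketch with made-up data; the paper's own algorithms are more elaborate:

```python
# Classic greedy heuristic for the (weighted) set covering problem
# (illustrative baseline sketch).

def greedy_set_cover(universe, subsets, costs):
    """subsets: list of sets of rows; costs: parallel list of column costs.
    Returns (chosen column indices in pick order, total cost)."""
    uncovered, chosen, total = set(universe), [], 0
    while uncovered:
        # cost-effectiveness = cost / number of newly covered rows
        j = min((j for j in range(len(subsets)) if subsets[j] & uncovered),
                key=lambda j: costs[j] / len(subsets[j] & uncovered))
        uncovered -= subsets[j]
        chosen.append(j)
        total += costs[j]
    return chosen, total

u = {1, 2, 3, 4, 5}
cols = [{1, 2, 3}, {2, 4}, {3, 4, 5}, {4, 5}]
print(greedy_set_cover(u, cols, costs=[3, 1, 4, 2]))  # → ([1, 0, 3], 6)
```

The greedy heuristic carries a well-known logarithmic approximation guarantee, which makes it a natural reference point when studying instance hardness.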
Abstract:
To an odd irreducible 2-dimensional complex linear representation of the absolute Galois group of the field Q of rational numbers, a modular form of weight 1 is associated (modulo Artin's conjecture on the L-series of the representation in the icosahedral case). In addition, linear liftings of 2-dimensional projective Galois representations are related to solutions of certain Galois embedding problems. In this paper we present some recent results on the existence of liftings of projective representations and on the explicit resolution of embedding problems associated to orthogonal Galois representations, and explain how these results can be used to construct modular forms.