26 results for Balanced Score Card (BSC®)
Abstract:
Sudoku problems are among the best-known and most enjoyed pastimes, and their popularity shows no sign of diminishing; over the last few years, however, they have gone from mere entertainment to a research area that is interesting in two respects. On the one hand, Sudoku problems, being a variant of Gerechte Designs and Latin Squares, are actively used for experimental design, as in [8, 44, 39, 9]. On the other hand, Sudoku problems, as simple as they seem, are really hard structured combinatorial search problems, and thanks to their characteristics and behavior they can be used as benchmark problems for refining and testing solving algorithms and approaches. Moreover, thanks to their rich internal structure, their study can contribute more than the study of random problems to the goal of solving real-world problems and applications and of understanding the characteristics that make problems hard to solve. In this work we use two techniques for modeling and solving Sudoku problems, namely Constraint Satisfaction Problem (CSP) and Satisfiability Problem (SAT) approaches. To this end we define the Generalized Sudoku Problem (GSP), where regions can be rectangular, problems can be of any order, and the existence of a solution is not guaranteed. With respect to worst-case complexity, we prove that GSP with block regions of m rows and n columns with m = n is NP-complete. To study the empirical hardness of GSP, we define a series of instance generators that differ in the degree of balance they guarantee among the constraints of the problem, by finely controlling how the holes are distributed among the cells of the GSP. Experimentally, we show that the more balanced the constraints, the higher the complexity of solving the GSP instances, and that GSP is harder than the Quasigroup Completion Problem (QCP), which GSP generalizes. Finally, we study the correlation between backbone variables (variables that take the same value in all solutions of an instance) and the hardness of GSP.
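The modelling of the GSP as a constraint problem lends itself to a small illustration. Below is a minimal sketch, not the CSP or SAT encodings used in the work, of a Generalized Sudoku grid with rectangular blocks of m rows and n columns solved by plain backtracking over the row/column/block "alldifferent" constraints; the function names and the representation of holes as zeros are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the encodings used in the work): a Generalized
# Sudoku of order m*n with rectangular m x n blocks, holes marked as 0, solved by
# naive backtracking over the row/column/block "alldifferent" constraints.
from itertools import product

def solve_gsp(grid, m, n):
    """grid: (m*n) x (m*n) list of lists, 0 for holes; blocks have m rows, n columns."""
    N = m * n

    def candidates(r, c):
        used = set(grid[r]) | {grid[i][c] for i in range(N)}
        br, bc = (r // m) * m, (c // n) * n
        used |= {grid[i][j] for i in range(br, br + m) for j in range(bc, bc + n)}
        return [v for v in range(1, N + 1) if v not in used]

    for r, c in product(range(N), range(N)):
        if grid[r][c] == 0:                       # first hole found
            for v in candidates(r, c):
                grid[r][c] = v
                if solve_gsp(grid, m, n):
                    return True
                grid[r][c] = 0                    # undo and try the next value
            return False                          # no consistent value: backtrack
    return True                                   # no holes left: solution found
```

For instance, solve_gsp(grid, 2, 3) would attempt to complete an order-6 puzzle with 2x3 blocks; a solution is not guaranteed to exist, matching the GSP definition above.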
Abstract:
Tractable cases of the binary CSP are mainly divided into two classes: constraint language restrictions and constraint graph restrictions. To better understand and identify the hardest binary CSPs, in this work we propose methods to increase their hardness by increasing the balance of both the constraint language and the constraint graph. The balance of a constraint is increased by maximizing the number of domain elements with the same number of occurrences. The balance of the graph is defined using the classical definition from graph theory. Along these lines we present two graph models: a first model that increases the balance of the graph by maximizing the number of vertices with the same degree, and a second one that additionally increases the girth of the graph, since a high girth implies a high treewidth, an important parameter for the hardness of binary CSPs. Our results show that our more balanced graph models and constraints produce instances that are harder, by several orders of magnitude, than typical random binary CSP instances. We also detect, at least for sparse constraint graphs, a higher treewidth for our graph models.
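As a rough illustration of the balancing idea, the sketch below builds a binary relation in which every domain value occurs in (almost) the same number of tuples; it is a toy version of the principle, not the generators or graph models evaluated in the work, and all names are illustrative.

```python
# Toy illustration of constraint balancing (not the paper's generators): choose the
# tuples of a binary relation so that every domain value has (almost) the same
# number of occurrences, instead of sampling them uniformly at random.
import random

def balanced_constraint(domain_size, n_tuples, rng=random):
    assert n_tuples <= domain_size ** 2
    values = list(range(domain_size))
    counts = {v: 0 for v in values}      # occurrences of each value in the relation
    tuples = set()
    while len(tuples) < n_tuples:
        least = min(counts.values())
        pool = [v for v in values if counts[v] == least]
        a, b = rng.choice(pool), rng.choice(pool)
        if (a, b) in tuples:             # pair already used: fall back to a uniform pick
            a, b = rng.choice(values), rng.choice(values)
            if (a, b) in tuples:
                continue
        tuples.add((a, b))
        counts[a] += 1
        counts[b] += 1
    return tuples
```

An analogous degree-balancing step on the constraint graph (keeping vertex degrees as equal as possible, and optionally rejecting short cycles to raise the girth) would correspond to the two graph models described above.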
Abstract:
Background. Although peer review is widely considered to be the most credible way of selecting manuscripts and improving the quality of accepted papers in scientific journals, there is little evidence to support its use. Our aim was to estimate the effects on manuscript quality of either adding a statistical peer reviewer or suggesting the use of checklists such as CONSORT or STARD to clinical reviewers, or both. Methodology and Principal Findings. Interventions were defined as 1) the addition of a statistical reviewer to the clinical peer review process, and 2) suggesting reporting guidelines to reviewers, with "no statistical expert" and "no checklist" as controls. The two interventions were crossed in a 2×2 balanced factorial design including original research articles consecutively selected, between May 2004 and March 2005, by the Medicina Clinica (Barc) editorial committee. We randomized manuscripts to minimize differences in terms of baseline quality and type of study (intervention, longitudinal, cross-sectional, others). Sample-size calculations indicated that 100 papers provide an 80% power to test a 55% standardized difference. We specified the main outcome as the increment in quality of papers as measured on the Goodman Scale. Two blinded evaluators rated the quality of manuscripts at initial submission and in the final post-peer-review version. Of the 327 manuscripts submitted to the journal, 131 were accepted for further review and 129 were randomized. Of those, 14 were lost to follow-up; they showed no differences in initial quality compared with the followed-up papers. Hence, 115 were included in the main analysis, of which 16 were rejected for publication after peer review. Of the 115 included papers, 21 (18.3%) were interventions, 46 (40.0%) longitudinal designs, 28 (24.3%) cross-sectional and 20 (17.4%) others. The 16 (13.9%) rejected papers had a significantly lower initial score on the overall Goodman scale than accepted papers (difference 15.0, 95% CI: 4.6 to 24.4). Suggesting a guideline to the reviewers had no effect on the change in overall quality as measured by the Goodman scale (0.9, 95% CI: −0.3 to 2.1). The estimated effect of adding a statistical reviewer was 5.5 (95% CI: 4.3 to 6.7), showing a significant improvement in quality. Conclusions and Significance. This prospective randomized study shows the positive effect of adding a statistical reviewer to the field-expert peers in improving manuscript quality. We did not find a statistically significant positive effect of suggesting that reviewers use reporting guidelines.
Abstract:
We analyze the equilibrium of a multi-sector exogenous growth model where the introduction of minimum consumption requirements drives structural change. We show that equilibrium dynamics simultaneously exhibit structural change and balanced growth of aggregate variables, as is observed in the US, when the initial intensity of the minimum consumption requirements is sufficiently small. This intensity is measured by the ratio between the aggregate value of the minimum consumption requirements and GDP and is therefore inversely related to the level of economic development. Initially rich economies benefit from an initially low intensity of the minimum consumption requirements and, as a consequence, end up exhibiting balanced growth of aggregate variables while structural change takes place. In contrast, initially poor economies suffer from an initially large intensity of the minimum consumption requirements, which makes the growth of the aggregate variables unbalanced over a very long period. These economies may never simultaneously exhibit balanced growth of aggregate variables and structural change.
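The intensity measure at the core of the argument is only described verbally; in assumed notation (not necessarily the paper's own), with p_{i,t} the price of sector i's good and c̄_i its minimum consumption requirement, it could be written as:

```latex
% Assumed notation, illustrative only: the "intensity" of the minimum consumption
% requirements at date t, which the abstract relates inversely to development.
x_t \;=\; \frac{\sum_{i} p_{i,t}\,\bar{c}_i}{\text{GDP}_t},
\qquad \text{with balanced growth and structural change arising when } x_0
\text{ is sufficiently small.}
```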
Abstract:
A Wiener system is a linear time-invariant filter followed by an invertible nonlinear distortion. Assuming that the input signal is an independent and identically distributed (iid) sequence, we propose an algorithm for estimating the input signal by observing only the output of the Wiener system. The algorithm is based on minimizing the mutual information of the output samples by means of a steepest-descent gradient approach.
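To make the setting concrete, here is a hedged sketch of blind inversion of a toy Wiener system. The paper's algorithm minimizes the mutual information of the output samples by steepest descent; the sketch instead uses a simpler higher-order-statistics proxy (maximizing the normalized kurtosis of the deconvolved signal, with numerical gradients) so that it stays short, and every signal, filter and parameter is an illustrative assumption.

```python
# Hedged sketch: recover the iid input of a toy Wiener system (FIR filter followed
# by an invertible nonlinearity). The paper minimizes mutual information by steepest
# descent; here the linear stage is blindly deconvolved by maximizing a simpler
# higher-order-statistics proxy (normalized kurtosis) with numerical gradients.
import numpy as np

rng = np.random.default_rng(0)

# toy Wiener system: unknown FIR filter, then an invertible nonlinearity
s = rng.laplace(size=8_000)                          # iid non-Gaussian input
h = np.array([1.0, 0.6, -0.3])                       # unknown LTI filter
x = np.tanh(0.5 * np.convolve(s, h, mode="same"))    # observed output only

# stage 1: compensate the (assumed known-form) nonlinearity
z = np.arctanh(np.clip(x, -0.999, 0.999)) / 0.5

# stage 2: blind deconvolution of the linear part
def excess_kurtosis(y):
    y = y - y.mean()
    return np.mean(y ** 4) / np.mean(y ** 2) ** 2 - 3.0

def equalize(z, taps=7, iters=200, step=0.05, eps=1e-4):
    w = np.zeros(taps)
    w[0] = 1.0                                       # start from a pass-through equalizer
    for _ in range(iters):
        base = abs(excess_kurtosis(np.convolve(z, w, mode="same")))
        grad = np.zeros(taps)
        for k in range(taps):                        # numerical gradient of |kurtosis|
            wp = w.copy()
            wp[k] += eps
            grad[k] = (abs(excess_kurtosis(np.convolve(z, wp, mode="same"))) - base) / eps
        w += step * grad                             # ascend: an iid output maximizes |kurtosis|
        w /= np.linalg.norm(w)                       # keep the equalizer at unit norm
    return np.convolve(z, w, mode="same")

s_hat = equalize(z)                                  # input estimate, up to scale and delay
```

As with any blind deconvolution, the estimate is only recovered up to a scaling factor and a delay.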
Abstract:
Linear predictive coding of speech is based on the assumption that the generation model is autoregressive. In this paper we propose a structure to cope with the nonlinear effects present in the generation of the speech signal. This structure consists of two stages: the first is a classical linear prediction filter, and the second models the residual signal by means of a linear filter between two nonlinearities. The coefficients of this filter are computed by a gradient search on the score function, in order to deal with the fact that the probability distribution of the residual signal is still not Gaussian; this fact is taken into account when the coefficients are computed by an ML estimate. The algorithm, based on the minimization of a higher-order statistics criterion, uses on-line estimation of the residual statistics and builds on blind deconvolution of Wiener systems [1]. Experimental results with speech signals show improvements that underline the interest of this approach.
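As a companion illustration, the sketch below implements only the first, linear stage described above: classical autocorrelation-method linear prediction of a frame and computation of its residual. The second stage (a linear filter between two nonlinearities adapted by gradient search on the residual's score function) is only indicated in a closing comment; the frame, prediction order and signal are illustrative assumptions.

```python
# Hedged sketch of the first (linear) stage: autocorrelation-method LPC of a frame
# and its prediction residual. The frame, order and signal are toy stand-ins.
import numpy as np

def lpc(frame, order=10):
    """Autocorrelation-method LPC: returns prediction coefficients a[1..order]."""
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])        # normal equations R a = r

def residual(frame, a):
    """Prediction error e[n] = x[n] - sum_k a[k] x[n-k]."""
    pred = np.zeros_like(frame)
    for k, ak in enumerate(a, start=1):
        pred[k:] += ak * frame[:-k]
    return frame - pred

# toy "speech-like" frame: an AR(2) process standing in for a real 20-30 ms frame
rng = np.random.default_rng(1)
x = np.zeros(400)
for n in range(2, 400):
    x[n] = 1.3 * x[n - 1] - 0.6 * x[n - 2] + 0.1 * rng.standard_normal()

e = residual(x, lpc(x))
# A second stage would now fit the non-Gaussian residual e with a linear filter
# placed between two nonlinearities, updating its coefficients by gradient search
# on the residual's score function, as in the blind Wiener deconvolution of [1].
```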
Abstract:
The Va/Ba strain, constructed by Sperlich et al. (1977), is the only balanced lethal strain in D. subobscura. It allows the production of homozygous O chromosomes and has been a useful tool not only for analysing chromosomal viabilities but also for obtaining homokaryotypic lines (Mestres and Serra, 2008). Besides the dominant morphological mutations Va (Varicose) and Ba (Bare), other genetic markers have been characterized in this strain, some of them by our group and not previously described. Here we present a list of these markers.
Abstract:
Many European states apply score systems to evaluate the disability severity of victims of non-fatal motor accidents under the law of third-party liability. The score is a non-negative integer with an upper bound of 100 that increases with severity. It may be automatically converted into financial terms and thus also reflects the compensation cost for disability. In this paper, discrete regression models are applied to analyze the factors that influence the disability severity score of victims. Standard and zero-altered regression models are compared from two perspectives: the interpretation of the data-generating process and the level of statistical fit. The results have implications for traffic safety policy decisions aimed at reducing accident severity. An application using data from Spain is provided.
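A hedged sketch of the kind of comparison described, between a standard and a zero-altered count regression, is given below using statsmodels; the data frame, covariates and coefficients are hypothetical stand-ins, not the Spanish data analysed in the paper.

```python
# Hedged sketch: fit a standard Poisson and a zero-inflated (zero-altered) Poisson
# regression to a hypothetical disability-severity score and compare their fit.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

# hypothetical victim-level data: score in 0..100 and a few illustrative covariates
rng = np.random.default_rng(42)
n = 2_000
df = pd.DataFrame({
    "age": rng.integers(18, 80, n),
    "motorcyclist": rng.integers(0, 2, n),
    "urban_road": rng.integers(0, 2, n),
})
lam = np.exp(0.5 + 0.01 * df["age"] + 0.6 * df["motorcyclist"])
score = rng.poisson(lam)
score[rng.random(n) < 0.4] = 0                 # extra zeros: victims with no disability
df["score"] = np.clip(score, 0, 100)

X = sm.add_constant(df[["age", "motorcyclist", "urban_road"]])

poisson_fit = sm.Poisson(df["score"], X).fit(disp=False)
zip_fit = ZeroInflatedPoisson(df["score"], X, exog_infl=X).fit(disp=False, maxiter=500)

print("Poisson AIC:", poisson_fit.aic)
print("Zero-inflated Poisson AIC:", zip_fit.aic)   # usually lower when zeros are in excess
```

With many victims scoring exactly zero, the zero-inflated specification typically attains the lower AIC, which is the sense in which a zero-altered model can be preferred on statistical fit.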
Abstract:
Bodily injury claims have the greatest impact on the claim costs of motor insurance companies. The disability severity of motor claims is assessed in numerous European countries by means of score systems. In this paper a zero-inflated generalized Poisson regression model is implemented to estimate the disability severity score of victims involved in motor accidents on Spanish roads. We show that the injury severity estimates may be automatically converted into financial terms by insurers at any point of the claim-handling process. As such, the methodology described may be used by motor insurers operating in the Spanish market to monitor the size of bodily injury claims. Using insurance data, various applications are presented in which the score estimate of disability severity is of value to insurers, either for computing the claim compensation or for claim reserve purposes.
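The step from a severity estimate to a monetary figure can be sketched as follows with statsmodels' zero-inflated generalized Poisson model; the covariates, the fitted data and the euros-per-point tariff are all illustrative assumptions rather than the paper's actual compensation rules.

```python
# Hedged sketch: fit a zero-inflated generalized Poisson model to a hypothetical
# disability score and turn the expected score of a new claim into a monetary
# amount through an assumed compensation tariff (900 EUR per severity point).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedGeneralizedPoisson

rng = np.random.default_rng(7)
n = 1_500
df = pd.DataFrame({"age": rng.integers(18, 80, n),
                   "motorcyclist": rng.integers(0, 2, n)})
mu = np.exp(0.3 + 0.012 * df["age"] + 0.7 * df["motorcyclist"])
score = rng.poisson(mu)
score[rng.random(n) < 0.35] = 0               # excess zeros: claims with no disability
df["score"] = score

X = sm.add_constant(df[["age", "motorcyclist"]])
zigp = ZeroInflatedGeneralizedPoisson(df["score"], X, exog_infl=X, p=2)
fit = zigp.fit(disp=False, maxiter=1000)

# expected severity score for a new claim, then the illustrative per-point tariff
new_claim = pd.DataFrame({"const": [1.0], "age": [35], "motorcyclist": [1]})
expected_score = fit.predict(new_claim, exog_infl=new_claim)[0]
print(f"expected score: {expected_score:.2f}, reserve: {900 * expected_score:,.0f} EUR")
```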