7 results for "Resource-based and complementarity theory"
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
In this paper, we address the problem of defining the product mix so as to maximise a system's throughput. This problem is well known to be NP-complete; therefore, most contributions to the topic focus on developing heuristics that can obtain good solutions in short CPU time. In particular, constructive heuristics are available for the problem, such as those by Fredendall and Lea and by Aryanezhad and Komijan. We propose a new constructive heuristic based on the Theory of Constraints and the Knapsack Problem. The computational results indicate that the proposed heuristic yields better results than the existing heuristics.
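The abstract does not describe the heuristic's internal steps, so the following is only a hypothetical sketch of a classic Theory-of-Constraints-style constructive rule for the product-mix problem: rank products by throughput per unit of time on the bottleneck resource, then fill the bottleneck's capacity greedily, as in a greedy knapsack heuristic. All names and data below are illustrative.

```python
# Hypothetical sketch: TOC/knapsack-style constructive heuristic.
# Rank products by throughput per minute on the bottleneck resource,
# then allocate the bottleneck's capacity greedily.

def product_mix(products, bottleneck_capacity):
    """products: list of dicts with keys 'name', 'throughput'
    (margin per unit), 'bottleneck_time' (minutes per unit) and
    'demand' (maximum units). Returns (mix, total throughput)."""
    ranked = sorted(products,
                    key=lambda p: p['throughput'] / p['bottleneck_time'],
                    reverse=True)
    mix, total, remaining = {}, 0.0, bottleneck_capacity
    for p in ranked:
        units = min(p['demand'], int(remaining // p['bottleneck_time']))
        if units > 0:
            mix[p['name']] = units
            total += units * p['throughput']
            remaining -= units * p['bottleneck_time']
    return mix, total

products = [
    {'name': 'P', 'throughput': 45.0, 'bottleneck_time': 15, 'demand': 100},
    {'name': 'Q', 'throughput': 60.0, 'bottleneck_time': 30, 'demand': 50},
]
print(product_mix(products, bottleneck_capacity=2400))
```

A ratio-based greedy rule of this kind is known to fail on some instances, which is precisely the gap that improved constructive heuristics aim to close.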
Abstract:
This work is supported by Brazilian agencies Fapesp, CAPES and CNPq
Abstract:
The Sznajd model is a sociophysics model used to describe opinion propagation and consensus formation in societies. Its main feature is that its rules favor larger groups of agreeing people. In a previous work, we generalized the bounded confidence rule in order to model biases and prejudices in discrete opinion models. In that work, we applied this modification to the Sznajd model and presented some preliminary results. The present work extends that paper. We present results linking many properties of the mean-field fixed points to a few qualitative aspects of the confidence rule (the biases and prejudices modeled), finding an interesting connection with graph theory problems. More precisely, we link the existence of fixed points to the notion of strongly connected graphs, and the stability of fixed points to the problem of finding the maximal independent sets of a graph. We state these results and present comparisons between the mean field and simulations on Barabási–Albert networks, followed by the main mathematical ideas and appendices with the rigorous proofs of our claims and some graph theory concepts, together with examples. We also show that there is no qualitative difference in the mean-field results if we require that a group of size q > 2, instead of a pair, of agreeing agents be formed before they attempt to convince other sites (for the mean field, this coincides with the q-voter model).
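The paper's generalized bounded-confidence rule is not specified in this abstract, so the sketch below shows only the standard two-site Sznajd update on a Barabási–Albert network, using networkx, to illustrate the kind of dynamics under study; the network size, number of sweeps and initial opinion distribution are all illustrative.

```python
# Minimal sketch: standard two-site Sznajd rule on a Barabasi-Albert
# network (not the paper's generalized bounded-confidence variant).
import random
import networkx as nx

def sznajd_sweep(G, opinion):
    """One sweep of N random edge updates: an agreeing pair imposes
    its opinion on all of its other neighbours."""
    edges = list(G.edges())
    for _ in range(G.number_of_nodes()):
        u, v = random.choice(edges)
        if opinion[u] == opinion[v]:
            for w in (set(G[u]) | set(G[v])) - {u, v}:
                opinion[w] = opinion[u]

G = nx.barabasi_albert_graph(n=1000, m=3, seed=1)
opinion = {node: random.choice([-1, +1]) for node in G}
for _ in range(200):
    sznajd_sweep(G, opinion)
magnetisation = sum(opinion.values()) / G.number_of_nodes()
print(f"magnetisation after 200 sweeps: {magnetisation:+.3f}")
```

In the biased variant studied in the paper, an agent would accept the pair's opinion only if the confidence rule permits that transition, which is what ties the fixed-point structure to connectivity properties of the opinion-transition graph.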
Abstract:
International Journal of Paediatric Dentistry 2012; 22: 459–466. Aim. This in vitro study aimed to test the performance of fluorescence-based methods in detecting occlusal caries lesions in primary molars, compared to conventional methods. Design. Two examiners assessed 113 sites on 77 occlusal surfaces of primary molars using three fluorescence devices: DIAGNOdent (LF), DIAGNOdent pen (LFpen) and a fluorescence camera (VistaProof-FC). Visual inspection (ICDAS) and radiographic methods were also evaluated. One examiner repeated the evaluations after one month. As the reference standard, lesion depth was determined after sectioning and evaluation under a stereomicroscope. The area under the ROC curve (Az), sensitivity, specificity and accuracy of the methods were calculated at the enamel (D1) and dentine (D3) caries lesion thresholds. Intra- and interexaminer reproducibility was calculated using the intraclass correlation coefficient (ICC) and kappa statistics. Results. At D1, visual inspection presented higher sensitivities (0.97–0.99) but lower specificities (0.18–0.25). At D3, all the methods demonstrated similar performance (Az values around 0.90). Visual and radiographic methods showed slightly higher specificity (values above 0.96) than the fluorescence-based ones (values around 0.88). In general, all methods presented high reproducibility (ICC above 0.79). Conclusions. Although fluorescence-based and conventional methods present similar performance in detecting occlusal caries lesions in primary teeth, visual inspection alone seems sufficient for use in clinical practice.
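As an illustration only (not the study's data), the sketch below shows how Az, sensitivity, specificity and accuracy could be computed for one method at one caries threshold from per-site scores, using scikit-learn; the ground truth, device scores and cut-off are all made up.

```python
# Illustrative sketch with synthetic data: Az (area under the ROC
# curve), sensitivity, specificity and accuracy at a fixed cut-off.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(0)
truth = rng.integers(0, 2, size=113)        # 1 = carious site (histology)
score = truth * 1.5 + rng.normal(size=113)  # hypothetical device reading

az = roc_auc_score(truth, score)
pred = (score >= 0.75).astype(int)          # hypothetical cut-off
tn, fp, fn, tp = confusion_matrix(truth, pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / len(truth)
print(f"Az={az:.2f}  Se={sensitivity:.2f}  "
      f"Sp={specificity:.2f}  Acc={accuracy:.2f}")
```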
Abstract:
In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-based Design Optimization (RBDO) has emerged as an alternative to properly model the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results depend on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the conflicting goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and its optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations that respect these design constraints and reduce manufacturing costs but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs but increases total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors or by failure probability constraints; it also depends on the actual structural configuration.
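To make the risk-optimization idea concrete, here is a toy sketch (every number is hypothetical, and the paper's formulations are much richer): minimise the total expected cost, i.e. manufacturing cost plus failure probability times the monetary consequence of failure, over a single design variable, with the failure probability obtained from a Gaussian safety margin.

```python
# Toy risk-optimization (RO) sketch, all values hypothetical:
# total expected cost = manufacturing cost + Pf * failure cost,
# with Pf computed from a Gaussian safety margin g = A*Fy - S.
from scipy.optimize import minimize_scalar
from scipy.stats import norm

MU_S, SIG_S = 100.0, 15.0  # random load S: mean and std. dev. (kN)
F_Y = 25.0                 # deterministic strength per unit area (kN/cm^2)
C_M = 10.0                 # manufacturing cost per cm^2 of cross-section
C_F = 1.0e6                # monetary consequence of failure

def total_expected_cost(area):
    beta = (area * F_Y - MU_S) / SIG_S  # reliability index
    pf = norm.cdf(-beta)                # failure probability
    return C_M * area + pf * C_F        # construction cost + risk

res = minimize_scalar(total_expected_cost, bounds=(4.0, 20.0),
                      method='bounded')
beta_opt = (res.x * F_Y - MU_S) / SIG_S
print(f"optimum area = {res.x:.2f} cm^2, beta = {beta_opt:.2f}, "
      f"Pf = {norm.cdf(-beta_opt):.2e}")
```

In this toy model the optimum balances the marginal manufacturing cost against the marginal reduction in expected failure cost, which is exactly the economy-versus-safety trade-off that RO formalizes.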
Abstract:
Abstract Background Educational computer games are examples of computer-assisted learning objects, representing an educational strategy of growing interest. Given the changes in the digital world over the last decades, students of the current generation expect technology to be used in advancing their learning, requiring a shift from traditional passive learning methodologies to an active, multisensory, experimental learning methodology. The objective of this study was to compare a computer game-based learning method with a traditional learning method, regarding learning gains and knowledge retention, as a means of teaching head and neck Anatomy and Physiology to Speech-Language and Hearing Pathology undergraduate students. Methods Students were randomized to one of the learning methods, and the data analyst was blinded to which method each student had received. Students' prior knowledge (i.e. before undergoing the learning method), short-term knowledge retention and long-term knowledge retention (i.e. six months after undergoing the learning method) were assessed with a multiple-choice questionnaire. Students' performance was compared across the three assessment moments, both for the mean total score and for separate mean scores for the Anatomy questions and the Physiology questions. Results Students who received the game-based method performed better in the post-test assessment only in the Anatomy questions section. Students who received the traditional lecture performed better in both the post-test and the long-term post-test when considering the Anatomy and Physiology questions. Conclusions The game-based learning method is comparable to the traditional learning method in general and in short-term gains, while the traditional lecture still seems to be more effective in improving students' short- and long-term knowledge retention.
Abstract:
Liberalism and Marxism are two schools of thought that have left deep imprints on sociological, political and economic theory. They are usually perceived as opposite, rival approaches. In the field of democracy there is a seemingly insurmountable rift around the question of political versus economic democracy. Liberals emphasize the former, Marxists the latter. Liberals say that economic democracy is too abstract and fuzzy a concept; therefore, one should concentrate on the workings of an objective political democracy. Marxists insist that political democracy without economic democracy is insufficient. The article argues that both propositions are valid and not mutually exclusive. It proposes the creation of an operational, quantifiable index of economic democracy that can be used alongside the already existing indexes of political democracy. By using these two indexes jointly, political and economic democracy can be evaluated objectively. Thus, the requirements of both camps are met, and perhaps a more dialogical approach to democracy can be reached in the debate between liberals and Marxists. The joint index is used to evaluate the levels of economic and political democracy in the transition countries of Eastern Europe.
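The abstract does not give the index's construction, so the following is a purely hypothetical sketch of how a joint index might be formed: take a political democracy score and an economic democracy score, both normalised to [0, 1], and combine them so that a deficit in either dimension lowers the joint value (here via a geometric mean; the article's actual aggregation may differ).

```python
# Purely hypothetical sketch of a joint democracy index; the
# article's actual construction is not given in the abstract.
from math import sqrt

def joint_democracy_index(political: float, economic: float) -> float:
    """Both inputs in [0, 1]; the geometric mean penalises imbalance."""
    return sqrt(political * economic)

# Made-up scores for two hypothetical transition countries:
countries = {"Country A": (0.85, 0.40), "Country B": (0.60, 0.65)}
for name, (pol, eco) in countries.items():
    print(f"{name}: joint index = {joint_democracy_index(pol, eco):.2f}")
```

With these made-up numbers, the more balanced Country B scores higher (0.62 vs 0.58), illustrating how such an aggregation rewards balance between political and economic democracy.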