994 results for problem complexity
Abstract:
Purpose - The purpose of this paper is twofold: to analyze the computational complexity of the cogeneration design problem, and to present an expert system to solve it, comparing this approach with the traditional search methods available. Design/methodology/approach - The complexity of the cogeneration problem is analyzed through a transformation from the well-known knapsack problem. Both problems are formulated as decision problems, and it is proven that the cogeneration problem is NP-complete. Thus, several search approaches, such as population heuristics and dynamic programming, could be used to solve the problem. Alternatively, a knowledge-based approach is proposed by presenting an expert system and its knowledge representation scheme. Findings - The expert system is executed on two case studies. In the first, a cogeneration plant must meet power, steam, chilled water and hot water demands; the expert system presented two different solutions based on high-complexity thermodynamic cycles. In the second case study, the plant must meet only power and steam demands; the system presents three different solutions, one of which had never been considered before by our consultant expert. Originality/value - The expert system approach is not a "blind" method: it generates solutions based on actual engineering knowledge rather than on the search strategies of traditional methods. This means the system can explain its choices, making the design rationale for each solution available, which is the main advantage of the expert system approach over traditional search methods. On the other hand, the expert system quite likely does not provide an actual optimal solution; all it can provide is one or more acceptable solutions.
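As a side illustration (not from the paper itself), the knapsack decision problem that the abstract reduces from can be solved by the kind of dynamic programming it mentions among the traditional search methods. All data and names below are made up for the example:

```python
# Toy sketch: the 0/1 knapsack DECISION problem ("is there a subset of items
# fitting in `capacity` with total value >= target?") via dynamic programming.

def knapsack_decision(weights, values, capacity, target):
    """Return True if some item subset fits in `capacity` with value >= target."""
    # best[c] = maximum value achievable with total weight <= c
    best = [0] * (capacity + 1)
    for w, v in zip(weights, values):
        for c in range(capacity, w - 1, -1):  # iterate downward: each item used at most once
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity] >= target

print(knapsack_decision([2, 3, 4], [3, 4, 5], 5, 7))  # → True (items of weight 2 and 3)
```

The DP runs in O(n * capacity) time, which is pseudo-polynomial; it does not contradict the NP-completeness result the abstract establishes.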
Abstract:
In recent years, computational issues in the rough set approach concerning
reducts of decision tables have attracted the attention of many researchers.
In this paper, we present the time complexity of an algorithm
that computes reducts of decision tables by a relational database approach. Let
DS = (U, C ∪ {d}) be a consistent decision table; we say that A ⊆ C is a
relative reduct of DS if A contains a reduct of DS. Let s =
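For intuition (this is not the paper's relational-database algorithm), the defining property behind reducts is that an attribute subset preserves the decision: objects indiscernible on the subset must share the decision value. A toy check, with made-up data:

```python
# Illustrative sketch: test whether an attribute subset `attrs` preserves the
# decision attribute of a consistent decision table. Data and names are invented.

def preserves_decision(rows, attrs, decision):
    """True iff objects agreeing on all attributes in `attrs` share the decision."""
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in attrs)
        if seen.setdefault(key, row[decision]) != row[decision]:
            return False  # two objects indiscernible on `attrs`, decided differently
    return True

table = [  # U = 4 objects, condition attributes C = {"a", "b"}, decision "d"
    {"a": 0, "b": 0, "d": "no"},
    {"a": 0, "b": 1, "d": "yes"},
    {"a": 1, "b": 0, "d": "yes"},
    {"a": 1, "b": 1, "d": "yes"},
]
print(preserves_decision(table, ["a", "b"], "d"))  # → True (the full table is consistent)
print(preserves_decision(table, ["a"], "d"))       # → False ({a} alone does not suffice)
```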
Abstract:
This paper addresses the single machine scheduling problem with a common due date, aiming to minimize earliness and tardiness penalties. Due to its complexity, most of the previous studies in the literature deal with this problem using heuristic and metaheuristic approaches. With the intention of contributing to the study of this problem, a branch-and-bound algorithm is proposed. Lower bounds and pruning rules that exploit properties of the problem are introduced. The proposed approach is examined through a computational comparative study on 280 problems involving different due date scenarios. In addition, the values of optimal solutions for small problems from a known benchmark are provided.
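A minimal sketch of the objective being minimized (my own toy evaluation, not the paper's branch-and-bound): given a job sequence on a single machine and a common due date d, each job pays a per-unit penalty for finishing early or late. The penalty names alpha/beta and all numbers are assumptions for the example:

```python
# Toy sketch: earliness/tardiness cost of one job sequence with a common due date.

def et_cost(processing_times, d, alpha, beta):
    """Total weighted earliness + tardiness for jobs processed in the given order."""
    total, t = 0, 0
    for p, a, b in zip(processing_times, alpha, beta):
        t += p  # completion time of this job
        total += a * max(0, d - t) + b * max(0, t - d)  # early part + tardy part
    return total

# Three jobs, due date 5: completions are 3, 5, 9 -> earliness 2, 0; tardiness 4.
print(et_cost([3, 2, 4], d=5, alpha=[1, 1, 1], beta=[2, 2, 2]))  # → 10
```

A branch-and-bound would search over sequences (and idle time), using bounds on this cost to prune; the evaluator above is only the leaf-level computation.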
Abstract:
Three experiments investigated the effect of complexity on children's understanding of a beam balance. In nonconflict problems, weights or distances varied while the other was held constant. In conflict items, both weight and distance varied, and items were of three kinds: weight dominant, distance dominant, or balance (in which neither was dominant). In Experiment 1, 2-year-old children succeeded on nonconflict-weight and nonconflict-distance problems. This result was replicated in Experiment 2, but performance on conflict items did not exceed chance. In Experiment 3, 3- and 4-year-olds succeeded on all except conflict balance problems, while 5- and 6-year-olds succeeded on all problem types. The results were interpreted in terms of relational complexity theory. Children aged 2 to 4 years succeeded on problems that entailed binary relations, but 5- and 6-year-olds also succeeded on problems that entailed ternary relations. Ternary relations tasks from other domains (transitivity and class inclusion) accounted for 93% of the age-related variance in balance scale scores. (C) 2002 Elsevier Science (USA).
Abstract:
The phenomenon of aging in today's society has acquired the status of a social problem, attracting growing attention and concern and leading to an increasing number of studies dedicated to the elderly. The lack of domestic, family or social support often leads the elderly to nursing homes; institutionalization is in many cases their only opportunity to access health care and quality of life. Aging is also associated with a higher prevalence of chronic diseases that require long-term medication, sometimes for life. Frequently, the onset of multiple pathologies at the same time requires different therapies, and the phenomenon of polypharmacy (five or more drugs daily) can occur. Moreover, the slowing of physiological and cognitive mechanisms associated with these chronic diseases can interfere, on the one hand, with the pharmacokinetics of many medications and, on the other hand, with the ability to comply with the therapeutic regimen. All of these realities contribute to an increase in pharmacotherapeutic complexity, decreasing adherence and treatment effectiveness. The pharmacotherapeutic complexity of an individual is characterized by reconciling different characteristics of their drug therapy, such as the number of medications used, dosage forms, dosing frequency and additional directions. It can be measured by the Medication Regimen Complexity Index (MRCI), originally validated in English.
Abstract:
Nowadays, the phenomenon of population ageing represents a worldwide problem, which assumes particular significance in Portugal. As they get older, individuals present more comorbidities and consequently consume an increasing number of drugs, which contributes to growing drug therapy complexity. Institutionalized elders are particularly affected by this. Drug therapy complexity combines several characteristics of the pharmacotherapy and can affect patient safety and medication adherence. It can be measured with the Medication Regimen Complexity Index (MRCI). This study aims to determine the drug therapy complexity of institutionalized elders in order to assess the need for pharmacotherapeutic follow-up.
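To make the idea of a regimen-complexity score concrete, here is a toy illustration in the spirit of the MRCI (number of drugs, dosage forms, dosing frequency). The weights below are invented for the example and are NOT the validated MRCI weights:

```python
# Toy illustration only: a simplified regimen-complexity score. Form weights
# and the scoring rule are hypothetical, not the published MRCI instrument.

def toy_complexity(regimen):
    """Sum invented weights over a list of (dosage_form, doses_per_day) pairs."""
    form_weight = {"tablet": 1, "drops": 2, "injection": 3}  # hypothetical weights
    score = 0
    for form, doses_per_day in regimen:
        score += form_weight.get(form, 1) + doses_per_day  # form burden + frequency burden
    return score

# Three drugs: a twice-daily tablet, drops three times a day, one daily injection.
print(toy_complexity([("tablet", 2), ("drops", 3), ("injection", 1)]))  # → 12
```

The real MRCI additionally scores special directions (e.g. "take with food") and uses validated weights per dosage form and frequency.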
Abstract:
Compositional schedulability analysis of hierarchical real-time systems is a well-studied problem. Various techniques have been developed to abstract the resource requirements of components in such systems, and schedulability has been addressed using these abstract representations (also called component interfaces). These approaches to compositional analysis incur resource overheads when they abstract components into interfaces. In this talk, we define notions of resource schedulability and optimality for component interfaces, and compare various approaches.
Abstract:
I develop a model of endogenous bounded rationality due to search costs, arising implicitly from the problem's complexity. The decision maker is not required to know the entire structure of the problem when making choices but can think ahead, through costly search, to reveal more of it. However, the costs of search are not assumed exogenously; they are inferred from revealed preferences through her choices. Thus, bounded rationality and its extent emerge endogenously: as problems become simpler or as the benefits of deeper search become larger relative to its costs, the choices more closely resemble those of a rational agent. For a fixed decision problem, the costs of search will vary across agents. For a given decision maker, they will vary across problems. The model therefore explains why the disparity between observed choices and those prescribed under rationality varies across agents and problems. It also suggests, under reasonable assumptions, an identifying prediction: a relation between the benefits of deeper search and the depth of the search. As long as calibration of the search costs is possible, this can be tested on any agent-problem pair. My approach provides a common framework for depicting the underlying limitations that force departures from rationality in different and unrelated decision-making situations. Specifically, I show that it is consistent with violations of timing independence in temporal framing problems, with dynamic inconsistency and diversification bias in sequential versus simultaneous choice problems, and with plausible but contrasting risk attitudes across small- and large-stakes gambles.
Abstract:
We give the first systematic study of strong isomorphism reductions, a notion of reduction more appropriate than polynomial time reduction when, for example, comparing the computational complexity of the isomorphism problem for different classes of structures. We show that the partial ordering of its degrees is quite rich. We analyze its relationship to a further type of reduction between classes of structures based on purely comparing, for every n, the number of nonisomorphic structures of cardinality at most n in both classes. Furthermore, in a more general setting we address the question of the existence of a maximal element in the partial ordering of the degrees.
Abstract:
The problems arising in commercial distribution are complex and involve several players and decision levels. One important decision is related to the design of the routes to distribute the products in an efficient and inexpensive way. This article deals with a complex vehicle routing problem that can be seen as a new extension of the basic vehicle routing problem. The proposed model is a multi-objective combinatorial optimization problem that considers three objectives and multiple periods, which models real distribution problems more closely. The first objective is cost minimization, the second is balancing work levels and the third is a marketing objective. An application of the model to a small example, with 5 clients and 3 days, is presented. The results of the model show the complexity of solving multi-objective combinatorial optimization problems and the conflicts between the several distribution management objectives.
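A hedged sketch (my own toy data, not the article's model) of how two of the three kinds of objectives can be evaluated on a candidate set of routes, namely total cost and workload balance:

```python
# Toy evaluation of a routing plan: total travel cost and workload imbalance.
# The distance matrix, depot "D", and routes are invented for the example.

def route_length(route, dist):
    """Sum of leg distances along a route, e.g. D -> 1 -> 2 -> D."""
    return sum(dist[a][b] for a, b in zip(route, route[1:]))

dist = {  # symmetric toy distance matrix
    "D": {"D": 0, "1": 4, "2": 6, "3": 5},
    "1": {"D": 4, "1": 0, "2": 3, "3": 7},
    "2": {"D": 6, "1": 3, "2": 0, "3": 2},
    "3": {"D": 5, "1": 7, "2": 2, "3": 0},
}
routes = [["D", "1", "2", "D"], ["D", "3", "D"]]  # one route per vehicle

lengths = [route_length(r, dist) for r in routes]
total_cost = sum(lengths)                # objective 1: cost minimization
imbalance = max(lengths) - min(lengths)  # objective 2: workload balance
print(total_cost, imbalance)  # → 23 3
```

A marketing objective (the third objective) would need extra data, such as visit timing preferences per client, which the abstract does not detail.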
Abstract:
Previous covering models for emergency services consider all calls to be of the same importance and impose the same waiting time constraints independently of the service's priority. This type of constraint is clearly inappropriate in many contexts. For example, in urban medical emergency services, calls that involve danger to human life deserve higher priority than calls for more routine incidents. A realistic model in such a context should allow prioritizing the calls for service. In this paper, a covering model which considers different priority levels is formulated and solved. The model inherits its formulation from previous research on Maximum Coverage Models and incorporates results from Queuing Theory, in particular Priority Queuing. The additional complexity incorporated in the model justifies the use of a heuristic procedure.
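To illustrate the kind of priority-queuing result such a model can incorporate (my example, not the paper's formulation): in a non-preemptive M/M/1 queue with priority classes and a common service rate, Cobham's formula gives the mean wait per class. Arrival rates below are toy values:

```python
# Illustrative sketch: mean waiting times per priority class in a non-preemptive
# M/M/1 queue (Cobham's formula). Class 0 is the highest priority.

def priority_waits(lams, mu):
    """Mean wait per class for arrival rates `lams` and shared service rate `mu`."""
    # Mean residual work: W0 = (sum of lambdas) * E[S^2] / 2, with E[S^2] = 2/mu^2.
    w0 = sum(lams) / mu**2
    waits, sigma_prev = [], 0.0
    for lam in lams:
        sigma = sigma_prev + lam / mu  # cumulative utilization up to this class
        waits.append(w0 / ((1 - sigma_prev) * (1 - sigma)))
        sigma_prev = sigma
    return waits

high, low = priority_waits([0.3, 0.4], mu=1.0)
print(round(high, 3), round(low, 3))  # → 1.0 3.333 : urgent calls wait far less
```

This asymmetry is exactly what motivates per-priority waiting time constraints instead of a single constraint for all calls.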
Abstract:
The Drivers Scheduling Problem (DSP) consists of selecting a set of duties for vehicle drivers (for example, bus, train, plane or boat drivers or pilots) for the transportation of passengers or goods. This is a complex problem because it involves several constraints related to labour and company rules and can also present different evaluation criteria and objectives. Developing an adequate model that represents the real problem as closely as possible is an important research area. The main objective of this research work is to present new mathematical models for the DSP that represent the full complexity of the drivers scheduling problem, and also to demonstrate that the solutions of these models can be easily implemented in real situations. This has been recognized by several authors as an important problem in Public Transportation. The most well-known and general formulation for the DSP is a Set Partitioning/Set Covering model (SPP/SCP). However, to a large extent these models simplify some of the specific business aspects and issues of real problems. This makes it difficult to use them in automatic planning systems, because the schedules obtained must be modified manually before they can be implemented in real situations. Based on extensive passenger transportation experience with bus companies in Portugal, we propose new alternative models for the DSP. These models are also based on Set Partitioning/Covering models; however, they take into account the bus operator's issues and the perspectives, opinions and environment of the user. We follow the steps of the Operations Research methodology: identify the problem; understand the system; formulate a mathematical model; verify the model; select the best alternative; present the results of the analysis; and implement and evaluate. All of these processes are carried out with the close participation and involvement of the final users from different transportation companies.
The planners' opinions and main criticisms are used to improve the proposed model in a continuous enrichment process. The final objective is a model that can be incorporated into an information system and used as an automatic tool to produce driver schedules. Therefore, the criterion for evaluating the models is their capacity to generate real, useful schedules that can be implemented without many manual adjustments or modifications. We considered the following measures of model quality: simplicity, solution quality and applicability. We tested the alternative models with real data obtained from several different transportation companies and analyzed the optimal schedules obtained with respect to the applicability of the solutions to the real situation. To do this, the schedules were analyzed by the planners to determine their quality and applicability. The main result of this work is the proposal of new mathematical models for the DSP that better represent the realities of passenger transportation operators and lead to better schedules that can be implemented directly in real situations.
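A minimal sketch of the set covering idea behind the SPP/SCP formulation (illustrative only, not the thesis's models): choose duties so that every trip/task is covered, here via the classic greedy heuristic. Duties, tasks and costs are made-up toy data:

```python
# Toy greedy set covering: pick duties until every task (trip segment) is covered,
# always taking the duty with the best cost per newly covered task.

def greedy_cover(tasks, duties):
    """duties: {name: (cost, set_of_tasks)}. Returns the chosen duty names."""
    uncovered, chosen = set(tasks), []
    while uncovered:
        name, (cost, cover) = min(
            ((n, d) for n, d in duties.items() if d[1] & uncovered),
            key=lambda item: item[1][0] / len(item[1][1] & uncovered),
        )
        chosen.append(name)
        uncovered -= cover
    return chosen

duties = {
    "early":  (8, {"t1", "t2"}),
    "middle": (9, {"t2", "t3", "t4"}),
    "late":   (7, {"t4", "t5"}),
}
print(greedy_cover({"t1", "t2", "t3", "t4", "t5"}, duties))
```

Real DSP formulations add the labour and company rules the abstract emphasizes (rest breaks, maximum spread, etc.) as constraints on which duties are feasible in the first place.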
Abstract:
We study the complexity of rationalizing choice behavior. We do so by analyzing two polar cases and a number of intermediate ones. In our most structured case, where choice behavior is defined on universal choice domains and satisfies the "weak axiom of revealed preference," finding the complete preorder rationalizing choice behavior is a simple matter. In the polar case, where no restriction whatsoever is imposed either on choice behavior or on the choice domain, finding the complete preorders that rationalize behavior turns out to be intractable. We show that the task of finding the rationalizing complete preorders is equivalent to a graph problem. This allows existing algorithms from the graph theory literature to be used for the rationalization of choice.
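A hedged sketch of the easy case (my own construction, not the paper's): on a universal domain, "x is revealed preferred to y" can be read off the binary menus, and ordering alternatives by their out-degree in that revealed-preference graph recovers a ranking. Choice behavior below is a toy dict from two-element menus to the chosen element:

```python
from itertools import combinations

# Toy revealed-preference recovery: each binary menu contributes one directed
# edge (chosen -> rejected); ranking by out-degree ("wins") orders the alternatives.

def rationalize(alternatives, choice):
    """Order alternatives best-first by counting revealed-preference wins."""
    wins = {x: 0 for x in alternatives}
    for x, y in combinations(alternatives, 2):
        wins[choice[frozenset({x, y})]] += 1  # the chosen element beats the other
    return sorted(alternatives, key=lambda x: -wins[x])

alts = ["a", "b", "c"]
choice = {frozenset({"a", "b"}): "a",
          frozenset({"a", "c"}): "a",
          frozenset({"b", "c"}): "b"}
print(rationalize(alts, choice))  # → ['a', 'b', 'c']
```

This simple recovery relies on choices being consistent (WARP); in the unrestricted polar case the abstract describes, no such shortcut exists and the problem becomes intractable.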