964 results for multiple objective programming
Abstract:
OBJECTIVE: To test discriminant analysis as a method of turning the information of a routine customer satisfaction survey (CSS) into a more accurate decision-making tool. METHODS: A 7-question, 10-option multiple-choice, self-applied questionnaire was used to study a sample of patients seen in two outpatient care units in Valparaíso, Chile, one of primary care (n=100) and the other of secondary care (n=249). Two cut-off points were considered in the dependent variable (final satisfaction score): satisfied versus unsatisfied, and very satisfied versus all others. Results were compared with empirical measures (proportion of satisfied individuals, proportion of unsatisfied individuals and size of the median). RESULTS: The response rate was very high, over 97.0% in both units. A new variable, medical attention, was revealed as explaining satisfaction at the primary care unit. The proportion of the total variability explained by the model was very high (over 99.4%) in both units when comparing satisfied with unsatisfied customers. In the analysis of very satisfied versus all other customers, a significant relationship was identified only in the case of the primary care unit, where the model explained a small proportion of the variability (41.9%). CONCLUSIONS: Discriminant analysis identified relationships not revealed by the previous analysis. It provided information about the proportion of the variability explained by the model. It flagged as non-significant relationships suggested by the empirical analysis (e.g. the relation very satisfied versus others in the secondary care unit). It measured the contribution of each independent variable to the explanation of the variation of the dependent one.
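As a minimal sketch of the technique described above, assuming hypothetical questionnaire data rather than the survey actually used in the study, a linear discriminant analysis can be fitted with scikit-learn to obtain both a classification accuracy and per-question coefficients:

# Hypothetical sketch: linear discriminant analysis on survey answers.
# The data below is randomly generated and only stands in for the
# multiple-choice answers and the satisfied/unsatisfied cut-off label.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.integers(1, 11, size=(100, 7))          # 100 respondents, 7 questions scored 1-10
y = (X.mean(axis=1) > 5.5).astype(int)          # toy "satisfied" label from a cut-off point

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)

print("classification accuracy:", lda.score(X, y))
print("discriminant coefficients:", lda.coef_)  # per-question contribution to the separation

The coefficient vector plays the role of the per-variable contributions mentioned in the conclusions, while the score gives the proportion of correctly classified respondents.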
Abstract:
In this paper we illustrate different perspectives used to create multiple-choice questions and show how these can be improved in the construction of math tests. As is known, web technologies have a great influence on students' behaviour. Based on an on-line project begun in 2007, which has been helping students in their individual work, we would like to share our experience and thoughts with colleagues who have a common concern when faced with the task of constructing multiple-choice tests. We feel that multiple-choice tests play an important and very useful supporting role in the self-evaluation or self-examination of our students. Nonetheless, good multiple-choice test items are generally more complex and time-consuming to create than other types of test items. Writing them requires a certain amount of skill; however, this skill may be increased through study, practice and experience. This paper discusses a number of issues related to the use of multiple-choice questions and lists the advantages and disadvantages of this question format, contrasting it with open questions. Some examples are given in this context.
Abstract:
Multiple-choice items are used in many different kinds of tests in several areas of knowledge. They can be considered an interesting tool for self-assessment or as an alternative or complementary instrument to traditional methods of assessing knowledge. The objectivity and accuracy of multiple-choice tests are important reasons to consider them, and they are especially useful when the number of students to evaluate is very large. Moodle (Modular Object-Oriented Dynamic Learning Environment) is an open-source course management system centered on learners' needs and designed to support collaborative approaches to teaching and learning. Moodle offers users a rich interface, context-specific help buttons, and a wide variety of tools, such as discussion forums, wikis, chat, surveys, quizzes, glossaries, journals, grade books and more, that allow them to learn and collaborate in a truly interactive space. Bringing together the interactivity of the Moodle platform and the objectivity of this kind of test, one can easily build manifold random tests. The purpose of this paper is to relate our journey in the construction of these tests and to share our experience in the use of the Moodle platform to create, take advantage of and improve multiple-choice tests in the area of Mathematics.
Abstract:
OBJECTIVE: To assess the variation in Anopheles darlingi's biting activity compared to that of An. marajoara in the same locality and to biting activity data from other regions. METHODS: Using human bait, eight observations of the biting activity of An. darlingi and An. marajoara were carried out during 1999 and 2000 in the municipality of São Raimundo do Pirativa, state of Amapá, Brazil. Each observation consisted of three consecutive 13-hour collections close to the full moon. Collectors were rotated among the observation points and nocturnal periods. RESULTS: An. darlingi revealed considerable plasticity of biting activity, in contrast to An. marajoara, which showed well-defined crepuscular biting peaks. No significant correlation between density and biting activity was found, but a significant correlation existed between time and proportional crepuscular activity, indicating underlying ecological processes not yet understood. Two of the four available data sets with multiple observations at one locality also showed considerable plasticity of this species' biting patterns. CONCLUSION: Intra-population variation of biting activity can be as significant as inter-population variation. Some implications for malaria vector control and specific studies are also discussed.
Abstract:
OBJECTIVE: To assess the impact of town planning, infrastructure, sanitation and rainfall on the bacteriological quality of domestic water supplies. METHODS: Water samples obtained from deep and shallow wells, boreholes and public taps were cultured to determine the most probable number of Escherichia coli and total coliforms using the multiple tube technique. The presence of enteric pathogens was detected using selective and differential media. Samples were collected during periods of both heavy and low rainfall and from municipalities that differ with respect to infrastructure planning, town planning and sanitation. RESULTS: Contamination of treated and pipe-distributed water was related to the distance of the collection point from a utility station. Faults in pipelines increased the rate of contamination (p<0.05), and this occurred mostly in densely populated areas with dilapidated infrastructure. Wastewater from drains was the main source of contamination of pipe-borne water. Shallow wells were more contaminated than deep wells and boreholes, and contamination was higher during periods of heavy rainfall (p<0.05). E. coli and enteric pathogens were isolated from contaminated supplies. CONCLUSIONS: Poor town planning, dilapidated infrastructure and indiscriminate siting of wells and boreholes contributed to the low bacteriological quality of domestic water supplies. Rainfall accentuated the impact.
Abstract:
Water is a scarce resource that is indispensable to life and can be an important vehicle for pathogenic microorganisms of faecal origin. Faecal matter is also a source of antimicrobial-resistant microorganisms and contributes to the dissemination of these organisms and their resistance genes in the environment and among commensal microbial communities and human and animal pathogens. The microbiological quality of water is monitored using bioindicators such as Escherichia coli, enterococci and total microorganisms. The main objective of this study was to determine the prevalence of ESBLs and AmpCs and to assess the prevalence of vancomycin-resistant enterococci (VRE) strains in waters of the Douro river and the coastline of the city of Porto. Water samples were collected at four sites located in the Douro river estuary and along the Porto coastline between April and July. Detection and quantification of the bioindicators were performed by the membrane filtration method. The susceptibility of the E. coli and enterococci strains to several classes of antimicrobials was tested by the disk diffusion method. The highest microbial counts were obtained between April and June and in freshwater samples. Sixty-two E. coli strains and 49 enterococci strains were isolated, showing antimicrobial resistance prevalences of 90.3% (56/62) and 83.7% (41/49), respectively. The E. coli strains showed high frequencies of resistance to ampicillin (74.2%) and tetracycline (61.3%). Phenotypes associated with multidrug resistance to at least three classes of antimicrobials were also found in 56.5% (35/62) of these isolates. The enterococci strains showed high levels of resistance to rifampicin (34.7%) and azithromycin (40.8%), and a vancomycin resistance phenotype was detected in 26.5% of the strains. A prevalence of 36.7% (18/49) of enterococci strains associated with antimicrobial multidrug resistance was observed. The results obtained suggest that the Douro river and coastline, as well as aquatic environments in general, constitute reservoirs of bacteria and antimicrobial resistance genes and play a major role in their dissemination.
Abstract:
The advances made in channel-capacity codes, such as turbo codes and low-density parity-check (LDPC) codes, have played a major role in the emerging distributed source coding paradigm. LDPC codes can be easily adapted to new source coding strategies due to their natural representation as bipartite graphs and the use of quasi-optimal decoding algorithms, such as belief propagation. This paper tackles a relevant scenario in distributed video coding: lossy source coding when multiple side information (SI) hypotheses are available at the decoder, each one correlated with the source according to a different correlation noise channel. Thus, it is proposed to exploit the multiple SI hypotheses through an efficient joint decoding technique with multiple LDPC syndrome decoders that exchange information to obtain coding efficiency improvements. At the decoder side, the multiple SI hypotheses are created with motion-compensated frame interpolation and fused together in a novel iterative LDPC-based Slepian-Wolf decoding algorithm. With the creation of multiple SI hypotheses and the proposed decoding algorithm, bitrate savings of up to 8.0% are obtained for similar decoded quality.
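A full iterative Slepian-Wolf decoder is too long to reproduce here; the toy sketch below, with a made-up parity-check matrix and made-up log-likelihood ratios, only illustrates the two quantities such a decoder manipulates: the syndrome check against the LDPC code and the fusion of soft information coming from several SI hypotheses.

# Toy sketch: syndrome check and naive fusion of log-likelihood ratios (LLRs)
# from several side information (SI) hypotheses. This is NOT the iterative
# belief-propagation decoder of the paper, only an illustration of the
# quantities it works with.
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],               # toy parity-check matrix
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1]])

def syndrome(H, x):
    """Binary syndrome H.x mod 2; all zeros means x satisfies the code."""
    return H.dot(x) % 2

# Two SI hypotheses expressed as per-bit LLRs (positive favours bit = 0).
llr_si1 = np.array([ 2.0, -1.5,  0.5,  3.0, -0.2,  1.0])
llr_si2 = np.array([ 1.0, -2.5,  1.5,  0.5, -1.0,  0.8])

# Naive fusion: sum the LLRs, as if the hypotheses were independent observations.
llr_fused = llr_si1 + llr_si2
x_hat = (llr_fused < 0).astype(int)             # hard decision from the fused soft information

print("hard decision:", x_hat)
print("syndrome:", syndrome(H, x_hat))          # compared against the received syndrome in practice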
Abstract:
OBJECTIVE: To evaluate the microbiological quality of treated and untreated water samples from urban and rural communities, to examine the relationship between coliform occurrence and average water temperature, and to compare rainfall levels. METHODS: A total of 3,073 samples of untreated and treated (chlorinated) water from taps (1,594), reservoirs used to store treated water (1,033), spring water (96) and private wells (350), collected for routine testing between 1996 and 1999, were analyzed by the multiple dilution tube method to determine the most probable number of total and fecal coliforms. These samples were obtained in the region of Maringá, state of Paraná, Brazil. RESULTS: The highest numbers of water samples contaminated by total coliforms (TC, 83%) and fecal coliforms (FC, 48%) were found in untreated water. TC and FC counts in samples taken from reservoirs used to store treated water were higher than those from taps midway along distribution lines. Among the treated water samples examined, coliform bacteria were found in 171 of the 1,033 reservoir samples. CONCLUSIONS: Insufficient treatment or regrowth is suggested by the observation that more than 17% of the treated potable water samples contained coliforms. TC- and FC-positive samples appear to follow a similar, seasonally influenced pattern in treated water. Two different periods must be considered for the occurrence of both TC- and FC-positive samples: (i) a warm-weather period (September-March) with a high percentage of contaminated samples; and (ii) a cold-weather period (April-August) in which the percentage is lower. Both TC- and FC-positive samples declined as water temperature decreased.
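For readers unfamiliar with the multiple dilution tube method, one common way of turning tube counts into a most probable number is Thomas' approximation; the sketch below uses an illustrative 15-tube series and a hypothetical positive pattern, not data from the study.

# Thomas' approximation for the most probable number (MPN) per 100 mL.
# Illustrative tube volumes and counts only.
import math

def mpn_thomas(positive_tubes, ml_in_negative_tubes, ml_in_all_tubes):
    """MPN/100 mL = 100 * positives / sqrt(mL in negative tubes * mL in all tubes)."""
    return 100.0 * positive_tubes / math.sqrt(ml_in_negative_tubes * ml_in_all_tubes)

# Hypothetical 15-tube series (5 x 10 mL, 5 x 1 mL, 5 x 0.1 mL) with a 5-3-1 positive pattern.
positives   = 5 + 3 + 1
ml_negative = 0 * 10 + 2 * 1 + 4 * 0.1          # sample volume in the negative tubes
ml_total    = 5 * 10 + 5 * 1 + 5 * 0.1          # sample volume in all tubes

print(f"MPN/100 mL (Thomas' approximation): {mpn_thomas(positives, ml_negative, ml_total):.0f}")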
Abstract:
Long-term contractual decisions are the basis of an efficient risk management. However those types of decisions have to be supported with a robust price forecast methodology. This paper reports a different approach for long-term price forecast which tries to give answers to that need. Making use of regression models, the proposed methodology has as main objective to find the maximum and a minimum Market Clearing Price (MCP) for a specific programming period, and with a desired confidence level α. Due to the problem complexity, the meta-heuristic Particle Swarm Optimization (PSO) was used to find the best regression parameters and the results compared with the obtained by using a Genetic Algorithm (GA). To validate these models, results from realistic data are presented and discussed in detail.
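As a sketch of the optimisation step described above, with synthetic data and a plain linear model standing in for the paper's regression models, a particle swarm can search for the parameters that minimise the squared forecasting error:

# Minimal PSO sketch: fit the parameters (a, b) of a linear price model
# price = a * t + b by minimising the sum of squared errors.
# Synthetic data only; the paper uses richer regression models and real MCP series.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(48, dtype=float)                   # hypothetical programming periods
price = 40.0 + 0.5 * t + rng.normal(0, 2, t.size)

def sse(params):
    a, b = params
    return float(np.sum((price - (a * t + b)) ** 2))

n_particles, n_iter = 30, 200
pos = rng.uniform(-10, 60, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([sse(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5                        # inertia and acceleration coefficients
for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([sse(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("fitted (a, b):", gbest, "SSE:", sse(gbest))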
Abstract:
One of the most difficult problems faced by researchers experimenting with complex systems in real-world applications is the Facility Layout Design Problem. It deals with the design and location of production lines, machinery and equipment, inventory storage and shipping facilities. This work intends to address the problem through the use of Constraint Logic Programming (CLP) technology. The use of Genetic Algorithms (GA) as an optimisation technique in a CLP environment is also addressed. The approach aims at implementing genetic algorithm operators following the CLP paradigm.
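The abstract gives no implementation details, so the sketch below only illustrates typical genetic algorithm operators (tournament selection, swap mutation and elitist survival) on a toy quadratic-assignment-style layout cost with random data; the CLP integration itself is not shown:

# Toy GA over facility-to-location permutations, minimising a QAP-style cost
# sum_ij flow[i][j] * distance[perm[i]][perm[j]]. Random data; illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n = 8
flow = rng.integers(0, 10, (n, n))
dist = rng.integers(1, 10, (n, n))

def cost(perm):
    return int(np.sum(flow * dist[np.ix_(perm, perm)]))

def mutate(perm):
    i, j = rng.choice(n, 2, replace=False)       # swap two facilities
    child = perm.copy()
    child[i], child[j] = child[j], child[i]
    return child

pop = [rng.permutation(n) for _ in range(40)]
for _ in range(300):
    new_pop = []
    for _ in range(len(pop)):
        # tournament selection: keep the better of two random individuals, then mutate it
        a, b = rng.choice(len(pop), 2, replace=False)
        parent = pop[a] if cost(pop[a]) <= cost(pop[b]) else pop[b]
        new_pop.append(mutate(parent))
    pop = sorted(pop + new_pop, key=cost)[:40]   # elitist survivor selection

print("best layout:", pop[0], "cost:", cost(pop[0]))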
Abstract:
The natural gas industry has been confronted with major challenges: strong growth in demand, investment in new gas supply units (GSUs), and efficient technical system management. The right number of GSUs, their best location in the network and the optimal allocation to loads is a decision problem that can be formulated as a combinatorial programming problem with the objective of minimizing system expenses. Our emphasis is on the formulation, interpretation and development of a solution algorithm that analyzes the trade-off between infrastructure investment expenditure and system operating costs. The location model was applied to a 12-node natural gas network, and its effectiveness was tested in five different operating scenarios.
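The abstract does not spell out the mathematical formulation, but a location-allocation model of the kind described is often written along the following lines (the notation is illustrative, not taken from the paper: $F_j$ is the investment cost of opening a GSU at candidate node $j$, $c_{ij}$ the operating cost of supplying load $i$ from node $j$, $d_i$ the demand of load $i$, $Q_j$ the GSU capacity, and $y_j$, $x_{ij}$ binary decision variables):

\begin{align*}
\min\ & \sum_{j} F_j\, y_j + \sum_{i}\sum_{j} c_{ij}\, x_{ij} && \text{(investment plus operating costs)}\\
\text{s.t.}\ & \sum_{j} x_{ij} = 1 && \forall i \quad \text{(each load allocated to exactly one GSU)}\\
& \sum_{i} d_i\, x_{ij} \le Q_j\, y_j && \forall j \quad \text{(capacity available only if the GSU is built)}\\
& x_{ij} \in \{0,1\},\quad y_j \in \{0,1\}
\end{align*}

The number of GSUs is then determined implicitly by the trade-off between the fixed investment terms and the operating cost terms.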
Abstract:
In recent years the use of several new resources in power systems, such as distributed generation, demand response and, more recently, electric vehicles, has increased significantly. Power systems aim at lowering operational costs, which requires adequate energy resource management. In this context, load consumption management plays an important role, making it necessary to use optimization strategies to adjust consumption to the supply profile. These optimization strategies can be integrated in demand response programs. Controlling the energy consumption of an intelligent house has the objective of optimizing load consumption. This paper presents a genetic algorithm approach to manage the consumption of a residential house, making use of a SCADA system developed by the authors. Consumption management is done by reducing or curtailing loads to keep the power consumption at, or below, a specified energy consumption limit. This limit is determined according to the consumer's strategy, taking into account the renewable-based micro generation, energy price, supplier solicitations, and the consumer's preferences. The proposed approach is compared with a mixed-integer non-linear approach.
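The chromosome encoding is not given in the abstract, so the sketch below only illustrates one plausible fitness function for such a genetic algorithm: a binary on/off vector of curtailable loads, penalised when net consumption (load minus micro generation) exceeds the limit and rewarded for keeping preferred loads on; all figures are made up.

# Hypothetical fitness for a load-curtailment GA. x[i] = 1 keeps load i on.
import numpy as np

load_kw     = np.array([2.0, 1.5, 0.8, 3.0, 0.5])   # curtailable loads
preference  = np.array([0.9, 0.4, 0.6, 0.2, 0.8])   # consumer preference weights
microgen_kw = 1.2                                    # renewable-based micro generation
limit_kw    = 4.0                                    # consumption limit from the consumer's strategy

def fitness(x):
    net = float(load_kw @ x) - microgen_kw
    penalty = max(0.0, net - limit_kw) * 100.0       # heavy penalty above the limit
    comfort = float(preference @ x)                  # reward for keeping preferred loads on
    return comfort - penalty

x = np.array([1, 0, 1, 0, 1])                        # one candidate curtailment plan
print("net consumption (kW):", load_kw @ x - microgen_kw, "fitness:", fitness(x))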
Abstract:
This paper presents a methodology for distribution network reconfiguration in the presence of outages, in order to choose the reconfiguration with the lowest power losses. The methodology is based on statistical failure and repair data of the distribution power system components and uses fuzzy-probabilistic modelling for the system component outage parameters. Fuzzy membership functions of the component outage parameters are obtained from statistical records. A hybrid method of fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models allows capturing both the randomness and the fuzziness of the component outage parameters. Once the system states are obtained by Monte Carlo simulation, a logic programming algorithm is applied to get all possible reconfigurations for every system state. In order to evaluate the line flows and bus voltages and to identify any overloading and/or voltage violation, a distribution power flow is applied to select the feasible reconfiguration with the lowest power losses. To illustrate the application of the proposed methodology to a practical case, the paper includes a case study that considers a real distribution network.
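The Monte Carlo state-sampling step can be sketched as follows, assuming a simple two-state component model with unavailability U = MTTR / (MTTF + MTTR) and made-up failure and repair data; the fuzzy part of the hybrid method (the membership functions on the outage parameters) is omitted.

# Monte Carlo sampling of distribution component states (up/down) from
# failure and repair data, assuming a simple two-state model with
# unavailability U = MTTR / (MTTF + MTTR). Illustrative values only.
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical components: mean time to failure and mean time to repair (hours).
mttf = np.array([8760.0, 4380.0, 17520.0, 8760.0])
mttr = np.array([8.0, 12.0, 6.0, 24.0])
unavailability = mttr / (mttf + mttr)

n_trials = 10000
# states[t, c] is True when component c is out of service in trial t
states = rng.random((n_trials, unavailability.size)) < unavailability

# Each sampled system state would then feed the reconfiguration and
# power-flow steps described in the abstract.
print("estimated unavailability:", states.mean(axis=0))
print("trials with at least one outage:", int(states.any(axis=1).sum()))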
Abstract:
This paper presents a methodology to choose the distribution network reconfiguration with the lowest power losses. The proposed methodology is based on statistical failure and repair data of the distribution power system components and uses fuzzy-probabilistic modeling for the system component outage parameters. The proposed hybrid method, using fuzzy sets and Monte Carlo simulation based on the fuzzy-probabilistic models, allows capturing both the randomness and the fuzziness of the component outage parameters. Once the system states are obtained by Monte Carlo simulation, a logic programming algorithm is applied to get all possible reconfigurations for each system state. To evaluate the line flows and bus voltages and to identify any overloading and/or voltage violation, an AC load flow is applied to select the feasible reconfiguration with the lowest power losses. To illustrate the application of the proposed methodology, the paper includes a case study that considers a 115-bus distribution network.
Abstract:
In the energy management of the isolated operation of a small power system, the economic scheduling of the generation units is a crucial problem. Applying the right timing can maximize the performance of the supply. The optimal operation of a wind turbine, a solar unit, a fuel cell and a storage battery is determined by a mixed-integer linear programming model implemented in the General Algebraic Modeling System (GAMS). A Virtual Power Producer (VPP) can optimally operate the generation units, ensuring the good functioning of the equipment, including maintenance, operation costs and generation measurement and control. A central system control allows a VPP to manage the optimal generation and its load control. The application of the methodology to a real case study at Budapest Tech demonstrates the effectiveness of this method for solving the optimal isolated dispatch of the DC micro-grid renewable energy park. The problem converged in 0.09 s and 30 iterations.
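The GAMS model is not reproduced here; instead, a much simplified dispatch of the same flavour (wind, PV, fuel cell and a storage battery over a few hours, with made-up data) can be sketched as a mixed-integer linear program using the PuLP modelling library, assuming PuLP and its bundled CBC solver are installed:

# Simplified MILP dispatch sketch (not the paper's GAMS model): meet a small
# DC micro-grid load with wind, PV, a fuel cell and a battery at minimum cost.
# All numbers are illustrative.
import pulp

T = range(4)                                       # four one-hour periods
load  = [3.0, 4.0, 5.0, 3.5]                       # kW
wind  = [1.0, 2.0, 1.5, 0.5]                       # available wind, kW
pv    = [0.0, 1.0, 2.0, 1.0]                       # available PV, kW
fc_cost, fc_max = 0.30, 4.0                        # fuel cell cost (per kWh) and limit
batt_cap, batt_rate, soc0 = 5.0, 2.0, 2.5          # battery energy, power and initial SOC

m = pulp.LpProblem("isolated_dispatch", pulp.LpMinimize)
fc  = pulp.LpVariable.dicts("fc", T, 0, fc_max)               # fuel cell output
ch  = pulp.LpVariable.dicts("charge", T, 0, batt_rate)        # battery charging
dis = pulp.LpVariable.dicts("discharge", T, 0, batt_rate)     # battery discharging
soc = pulp.LpVariable.dicts("soc", T, 0, batt_cap)            # state of charge
on  = pulp.LpVariable.dicts("fc_on", T, cat="Binary")         # fuel cell commitment

m += pulp.lpSum(fc_cost * fc[t] for t in T)                    # minimise fuel cell cost
for t in T:
    m += wind[t] + pv[t] + fc[t] + dis[t] - ch[t] == load[t]   # power balance
    m += fc[t] <= fc_max * on[t]                               # output only when committed
    prev = soc0 if t == 0 else soc[t - 1]
    m += soc[t] == prev + ch[t] - dis[t]                       # simple SOC balance

m.solve(pulp.PULP_CBC_CMD(msg=False))
print("status:", pulp.LpStatus[m.status])
print("fuel cell schedule:", [round(fc[t].value(), 2) for t in T])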