919 results for Many-to-many-assignment problem
Abstract:
Cutting fluids are lubricants used in machining because they offer many benefits across different processes. They perform several functions, such as lubrication, cooling, and improved surface finish, and they also reduce tool wear and protect the tool against corrosion. Owing to new environmental legislation and the demand for green products, new cutting fluids must be developed; these should be biodegradable, non-toxic, and safe for the environment and for operator health. Vegetable oils are therefore a good option for replacing mineral oils. In this context, this work aimed to develop an emulsion cutting fluid from epoxidized vegetable oil that promotes better lubrication and cooling in machining processes while remaining environmentally friendly. The methodology was divided into five steps. The first was the synthesis of the biolubricant by an epoxidation reaction. Next, the biolubricant was characterized in terms of density, acidity, iodine index, oxirane index, viscosity, thermal stability, and chemical composition. The third step was to develop an oil-in-water (O/W) emulsion with different oil concentrations (10, 20, and 25%) and surfactant concentrations (1, 2.5, and 5%); emulsion stability was also studied. The tribological performance of the emulsions was evaluated in an HFRR (High Frequency Reciprocating Rig), which uses a ball-on-disc contact. The results showed that the vegetable-based lubricant can be synthesized by the epoxidation reaction; the spectra indicated 100% conversion of the unsaturations into epoxy rings. Regarding the tribological assessment, the percentage of oil in the emulsion directly influenced film formation and the coefficient of friction: at higher oil concentrations the film formation process was slow and unstable. Higher surfactant concentrations did not improve the tribological performance of the emulsions. The best friction-reduction performance was observed for the emulsion with 10% oil and 5% surfactant, whose average wear scar was 202 μm.
Abstract:
Multi-objective problems may have many optimal solutions, which together form the Pareto optimal set. A class of heuristic algorithms for those problems, in this work called optimizers, produces approximations of this optimal set. The approximation set kept by the optimizer may be limited or unlimited. The benefit of using an unlimited archive is the guarantee that all nondominated solutions generated during the process will be saved. However, because of the large number of solutions that can be generated, keeping such an archive and frequently comparing new solutions to the stored ones may demand a high computational cost. The alternative is to use a limited archive. The problem that emerges in this situation is the need to discard nondominated solutions when the archive is full. Some techniques have been proposed to handle this problem, but investigations show that none of them can reliably prevent the deterioration of the archives. This work investigates a technique to be used together with ideas previously proposed in the literature to deal with limited archives. The technique consists in keeping discarded solutions in a secondary archive and periodically recycling these solutions, bringing them back into the optimization. Three recycling methods are presented. To verify whether these ideas are capable of improving the archive content during the optimization, they were implemented together with other techniques from the literature. A computational experiment with the NSGA-II, SPEA2, PAES, MOEA/D and NSGA-III algorithms, applied to several classes of problems, is presented. The potential and the difficulties of the proposed techniques are evaluated through statistical tests.
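For illustration only, the recycling idea described in this abstract can be sketched in Python as a bounded nondominated archive backed by a secondary "recycle bin". The class name, the random discard rule, and the recycling routine below are assumptions made for the sketch; the optimizers studied in the thesis use their own pruning rules (e.g. crowding- or indicator-based) and recycling schedules.

    import random

    def dominates(a, b):
        """True if objective vector a Pareto-dominates b (minimization)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    class RecyclingArchive:
        """Bounded nondominated archive with a secondary recycling store.

        Solutions are represented here by their objective-value tuples.
        Illustrative sketch: discards are random among nondominated members;
        real optimizers typically use crowding distance or a quality indicator.
        """
        def __init__(self, capacity):
            self.capacity = capacity
            self.members = []    # nondominated solutions available to the optimizer
            self.recycled = []   # solutions discarded while still nondominated

        def add(self, sol):
            if any(dominates(m, sol) for m in self.members):
                return False
            self.members = [m for m in self.members if not dominates(sol, m)]
            self.members.append(sol)
            if len(self.members) > self.capacity:
                victim = random.choice(self.members)   # placeholder discard rule
                self.members.remove(victim)
                self.recycled.append(victim)           # keep it for later recycling
            return True

        def recycle(self, k):
            """Return up to k previously discarded solutions to the search."""
            random.shuffle(self.recycled)
            back, self.recycled = self.recycled[:k], self.recycled[k:]
            return back

An optimizer would call add() for every candidate it generates and periodically feed recycle(k) back into its population; the three recycling methods mentioned above presumably differ in how and when that reinjection happens.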
Abstract:
In longitudinal data analysis, our primary interest is in the regression parameters for the marginal expectations of the longitudinal responses; the longitudinal correlation parameters are of secondary interest. The joint likelihood function for longitudinal data is challenging, particularly for correlated discrete outcome data. Marginal modeling approaches such as generalized estimating equations (GEEs) have received much attention in the context of longitudinal regression. These methods are based on the estimates of the first two moments of the data and the working correlation structure. The confidence regions and hypothesis tests are based on asymptotic normality. The methods are sensitive to misspecification of the variance function and the working correlation structure. Because of such misspecifications, the estimates can be inefficient and inconsistent, and inference may give incorrect results. To overcome this problem, we propose an empirical likelihood (EL) procedure based on a set of estimating equations for the parameter of interest and discuss its characteristics and asymptotic properties. We also provide an algorithm based on EL principles for the estimation of the regression parameters and the construction of a confidence region for the parameter of interest. We extend our approach to variable selection for high-dimensional longitudinal data with many covariates. In this situation it is necessary to identify a submodel that adequately represents the data. Including redundant variables may impact the model's accuracy and efficiency for inference. We propose a penalized empirical likelihood (PEL) variable selection based on GEEs; the variable selection and the estimation of the coefficients are carried out simultaneously. We discuss its characteristics and asymptotic properties, and present an algorithm for optimizing PEL. Simulation studies show that when the model assumptions are correct, our method performs as well as existing methods, and when the model is misspecified, it has clear advantages. We have applied the method to two case examples.
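As a minimal, self-contained illustration of the empirical likelihood idea (shown here for a single scalar mean rather than the GEE-type estimating equations for longitudinal data that the thesis actually works with; the function name is invented for the sketch), the profile EL ratio at a candidate value is obtained by solving for a Lagrange multiplier:

    import numpy as np
    from scipy.optimize import brentq

    def neg2_log_el_ratio(x, theta):
        """-2 log empirical likelihood ratio for the mean of x at the value theta.

        Scalar sketch of the EL construction; longitudinal/GEE versions replace
        x_i - theta with a vector of estimating functions.
        """
        g = np.asarray(x, dtype=float) - theta
        if g.max() <= 0 or g.min() >= 0:
            return np.inf  # theta outside the data's convex hull: EL ratio undefined
        # The multiplier lam solves sum g_i / (1 + lam*g_i) = 0 with 1 + lam*g_i > 0.
        eps = 1e-10
        lo = -1.0 / g.max() + eps
        hi = -1.0 / g.min() - eps
        lam = brentq(lambda l: np.sum(g / (1.0 + l * g)), lo, hi)
        return 2.0 * np.sum(np.log1p(lam * g))

    # A 95% EL confidence region for the mean is
    # {theta : neg2_log_el_ratio(x, theta) <= chi-squared(1) 0.95 quantile},
    # the kind of likelihood-ratio-based region the abstract refers to.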
Abstract:
This thesis explores the relationship between democracy and political legitimacy from an epistemic perspective. Democracy, in its most general sense, gives everyone the possibility of asserting the interests they take to be their own and those of their community, particularly at the ballot box. The decision procedure of voting thus, in a sense, enshrines the liberty and equality enjoyed by every citizen and confers a certain legitimacy on the decision-making process. That said, if voting is not framed by epistemic considerations, nothing guarantees that the resulting political outcome will be desirable for individuals or for the collectivity: it is entirely possible to imagine discriminatory, economically harmful, or simply ineffective policies being adopted and taking effect to everyone's detriment. In response to this problem, different democratic theories have emerged and succeeded one another in an attempt to tie the democratic process more closely to the achievement of political outcomes that benefit the collectivity. Among them, deliberative democracy proposed replacing the mere confrontation of interests found in aggregative democracy with a collective search for the common good, organized around deliberative procedures meant to legitimize the democratic exercise on firmer ground. Following it, epistemic democracy drew on deliberative bodies while placing more emphasis on the quality of the outcomes obtained than on the procedures themselves. In the end, the same dilemma haunts each of these theories: is it preferable to build decision-making institutions by focusing primarily on the procedural criteria themselves, at the risk of seeing bad decisions slip through the process without being able to do anything about it, or should we start from a more substantive conception of what constitutes a good decision, at the risk this time of sacrificing the freedom of choice that is supposed to characterize a democratic regime? The thesis defended here is that the concept of political equality can help untie this dilemma by taking the form of both a procedural criterion and a pre-established political objective. Political equality thereby becomes a strong normative source of political legitimacy. Drawing on David Estlund's epistemic proceduralism, we hope to have shown by the end of this thesis that achieving substantive political equality through egalitarian procedures is not a hermetic tautology, but rather a reflexive mechanism that improves at times the robustness of decision-making procedures and at times the achievement of tangible equality in relations among citizens.
Abstract:
Using the wisdom of crowds---combining many individual forecasts to obtain an aggregate estimate---can be an effective technique for improving forecast accuracy. When individual forecasts are drawn from independent and identical information sources, a simple average provides the optimal crowd forecast. However, correlated forecast errors greatly limit the ability of the wisdom of crowds to recover the truth. In practice, this dependence often emerges because information is shared: forecasters may draw largely on the same data when formulating their responses.
To address this problem, I propose an elicitation procedure in which each respondent is asked to provide both their own best forecast and a guess of the average forecast that will be given by all other respondents. I study optimal responses in a stylized information setting and develop an aggregation method, called pivoting, which separates individual forecasts into shared and private information and then recombines these results in the optimal manner. I develop a tailored pivoting procedure for each of three information models, and introduce a simple and robust variant that outperforms the simple average across a variety of settings.
In three experiments, I investigate the method and the accuracy of the crowd forecasts. In the first study, I vary the shared and private information in a controlled environment, while the latter two studies examine forecasts in real-world contexts. Overall, the data suggest that a simple minimal pivoting procedure provides an effective aggregation technique that can significantly outperform the crowd average.
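As a rough sketch of how such an elicitation can be aggregated (the dissertation's tailored pivoting procedures differ by information model; the simple form below, which pushes the crowd mean away from the mean meta-prediction on the grounds that the meta-prediction proxies for the shared signal, is an assumed "minimal pivoting" variant):

    def minimal_pivot(forecasts, meta_predictions):
        """Crowd estimate that pivots away from shared information.

        forecasts        : each respondent's own best forecast
        meta_predictions : each respondent's guess of the average forecast
                           the other respondents will give
        Assumed minimal-pivoting form, for illustration only.
        """
        x_bar = sum(forecasts) / len(forecasts)
        z_bar = sum(meta_predictions) / len(meta_predictions)
        return x_bar + (x_bar - z_bar)

    # If everyone forecasts high partly because of one widely shared report,
    # the meta-predictions are also high, and the pivoted estimate moves the
    # crowd mean back toward the private signals.
    print(minimal_pivot([10, 12, 14], [11, 11, 12]))  # 12.0 + (12.0 - 11.33...) ≈ 12.67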
Abstract:
Development of reliable methods for optimised energy storage and generation is one of the most imminent challenges in modern power systems. This paper presents an adaptive approach to the load leveling problem using novel dynamic models based on Volterra integral equations of the first kind with piecewise continuous kernels. These integral equations efficiently solve this inverse problem while taking into account both the time-dependent efficiencies and the generation/storage availability of each energy storage technology. In this analysis a direct numerical method is employed to find the least-cost dispatch of the available storages. The proposed collocation-type numerical method has second-order accuracy and enjoys self-regularization properties associated with the confidence levels of system demand. This adaptive approach is suitable for energy storage optimisation in real time. The efficiency of the proposed methodology is demonstrated on the Single Electricity Market of the Republic of Ireland and Northern Ireland.
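For orientation, a toy collocation-type (midpoint) discretization of a first-kind Volterra equation ∫_0^t K(t, s) x(s) ds = f(t) is sketched below. It assumes a smooth kernel with K(t, t) ≠ 0, whereas the paper's load leveling model uses piecewise continuous kernels encoding storage efficiencies and availability, so this only illustrates the numerical idea (sequential solution with second-order midpoint quadrature):

    import numpy as np

    def solve_volterra_first_kind(K, f, T, n):
        """Midpoint-rule solution of  integral_0^t K(t, s) x(s) ds = f(t),  0 < t <= T.

        Toy sketch assuming a smooth kernel with K(t, t) != 0.
        """
        h = T / n
        t = h * np.arange(1, n + 1)           # collocation points t_i = i*h
        s = h * (np.arange(1, n + 1) - 0.5)   # quadrature nodes s_j = (j - 1/2)*h
        x = np.zeros(n)
        for i in range(n):
            acc = sum(K(t[i], s[j]) * x[j] for j in range(i))
            x[i] = (f(t[i]) / h - acc) / K(t[i], s[i])
        return s, x

    # Example: K(t, s) = 1 and f(t) = t**2 / 2 recovers x(s) = s exactly.
    s, x = solve_volterra_first_kind(lambda t, s: 1.0, lambda t: t**2 / 2, T=1.0, n=10)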
Abstract:
Stovepipes, also called silos, appear in many different organizations and sectors and contribute to problems when employees or managers tend to look more to their own, or their individual department's, objectives rather than to the organization's. The purpose of this study was to identify communicative factors that promote stovepipes and then to determine the most critical factor to disarm. A case study was carried out at a selected company with a stovepipe structure in order to achieve the purpose of the study. The case study included interviews and observations to identify different problem areas, which were then compared with three communicative factors identified in previous studies. The factor with the most connections to the problem areas was considered the most critical. The results of the study indicate that "a lack of understanding of each other's work" is the most critical factor in stovepipe structures and that it can be prevented by following five recommendations: bring up positive collaboration continually, raise problems with each other instead of with others, identify the different communication paths within and between departments, implement a long-term model for preventing stovepipes, and set up workshops between the involved departments. The conclusion of the study is that stovepipes create several undesirable effects in the organization, but that the efforts to counter these problems do not have to be complicated. Following five small steps toward better collaboration and communication can be enough to put an organization on its way to a better structure.
Abstract:
SOARES, Lennedy C.; MEDEIROS, Adelardo A. D. de; PROTASIO, Alan D. D.; BOLONHINI, Edson H. Sistema supervisório para o método de elevação plunger lift. In: CONGRESSO BRASILEIRO DE PESQUISA E DESENVOLVIMENTO EM PETRÓLEO E GÁS, 5., Fortaleza, CE, 2009. Anais... Fortaleza: CBPDPetro, 2009.
Abstract:
The Highway Safety Manual (HSM) is the compilation of national safety research that provides quantitative methods for analyzing highway safety. The HSM presents crash modification functions related to freeway work zone characteristics such as work zone duration and length. These crash modification functions were based on freeway work zones with high traffic volumes in California. When the HSM-referenced model was calibrated for Missouri, the calibration factor was 3.78, which is not ideal since it is significantly larger than 1. Therefore, new models were developed in this study using Missouri data to capture geographical, driver behavior, and other factors in the Midwest. New models were also developed for expressway and rural two-lane work zones, which have barely been studied in the literature. A large sample of 20,837 freeway, 8,993 expressway, and 64,476 rural two-lane work zones in Missouri was analyzed to derive 15 work zone crash prediction models. The most appropriate samples of 1,546 freeway, 1,189 expressway, and 6,095 rural two-lane work zones longer than 0.1 mile and with a duration greater than 10 days were used to build eight, four, and three models, respectively. A challenging question for practitioners is always how to use crash prediction models to make the best estimate of the work zone crash count. To address this problem, a user-friendly software tool was developed in spreadsheet format to predict work zone crashes based on work zone characteristics. This software selects the best model, estimates work zone crashes by severity, and converts them to monetary values using standard crash cost estimates. This study also included a survey of departments of transportation (DOTs), Federal Highway Administration (FHWA) representatives, and contractors to assess the current state of practice regarding work zone safety. The survey results indicate that many agencies address work zone safety informally using engineering judgment. Respondents indicated that they would like a tool that could help them balance work zone safety across projects by looking at crashes and user costs.
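For context, the calibration factor mentioned above is, in the HSM framework, the ratio of total observed crashes to total crashes predicted by the base model over a set of local sites; a value of 3.78 therefore means the referenced model substantially underpredicts Missouri work zone crashes. A minimal sketch with hypothetical site data:

    def hsm_calibration_factor(observed, predicted):
        """HSM-style calibration factor: total observed crashes divided by total
        crashes predicted by the base model at the same sites.  Values far from 1
        signal that the base model transfers poorly and local models are needed."""
        return sum(observed) / sum(predicted)

    # Hypothetical work zone sites: observed crash counts vs. base-model predictions.
    observed = [12, 5, 9, 3]
    predicted = [3.1, 1.6, 2.4, 0.6]
    print(round(hsm_calibration_factor(observed, predicted), 2))  # ≈ 3.77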
Abstract:
Completed under a cotutelle (joint supervision) agreement with the École normale supérieure de Cachan – Université Paris-Saclay
Abstract:
The quality of a heuristic solution to an NP-hard combinatorial problem is hard to assess. A few studies have advocated and tested statistical bounds as a method of assessment. These studies indicate that statistical bounds are superior to the more widely known and used deterministic bounds. However, the previous studies have been limited to a few metaheuristics and combinatorial problems, and hence the general performance of statistical bounds in combinatorial optimization remains an open question. This work complements the existing literature on statistical bounds by testing them on the metaheuristic Greedy Randomized Adaptive Search Procedures (GRASP) and four combinatorial problems. Our findings confirm previous results that statistical bounds are reliable for the p-median problem, and we note that they also seem reliable for the set covering problem. For the quadratic assignment problem, the statistical bounds had previously been found reliable when obtained from the genetic algorithm, whereas in this work they were found to be less reliable. Finally, we provide statistical bounds for four 2-path network design problem instances for which the optimum is currently unknown.
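As background on the technique being evaluated (a generic sketch, not the paper's exact estimator), statistical bounds of this kind are typically built by fitting an extreme-value (Weibull) distribution to the best objective values from repeated metaheuristic runs and using the fitted location parameter as an estimate of the unknown optimum of a minimization problem. In the sketch below, run_grasp is a hypothetical placeholder for a single GRASP run returning an objective value:

    import numpy as np
    from scipy import stats

    def statistical_lower_bound(run_heuristic, n_runs=50, seed=0):
        """Fit a Weibull distribution to the best values of independent heuristic
        runs and return the fitted location parameter as an estimate of (a bound
        on) the optimum.  Generic sketch; the paper's estimator and its confidence
        statement may be constructed differently."""
        rng = np.random.default_rng(seed)
        best_values = np.array([run_heuristic(rng) for _ in range(n_runs)])
        shape, loc, scale = stats.weibull_min.fit(best_values)
        return loc

    # Hypothetical usage:
    # bound = statistical_lower_bound(lambda rng: run_grasp(instance, rng))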
Abstract:
The gender of lexical borrowings referring to entities without natural gender in French is sometimes considered arbitrary, while it is sometimes seen as motivated by the borrowing's physical form and/or its meanings. Since opinions differ on this subject, we set out to analyze numerous criteria that may contribute to the gender assignment of a borrowing. We built four corpora, each composed of texts from a borrowing linguistic community (Quebec and Europe) and a level of formality (formal or informal). We observed that the gender of borrowings varies considerably in many cases. We find that borrowings from gendered languages (Italian, Arabic) generally keep their original gender. Semantic criteria and physical-form criteria are equally able to justify the gender of a borrowing. The most effective semantic criterion places each borrowing within a conceptual paradigm that groups several lexical units under a single conceptualization and, generally, a gender shared across the paradigm.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-07
Abstract:
Groundwater is a common-pool resource that is subject to depletion in many places around the world as a result of increased use of irrigation and water-demanding cash crops. Where state capacity to control groundwater use is limited, collective action is important to increase recharge and restrict highly water-consumptive crops. We present results of field experiments in hard rock areas of Andhra Pradesh, India, to examine factors affecting groundwater use. Two nongovernmental organizations (NGOs) ran the games in communities where they were working to improve watershed and water management. Results indicate that, when the links between crop choice and groundwater depletion are made explicit, farmers can act cooperatively to address this problem. Longer NGO involvement in the villages was associated with more cooperative outcomes in the games. Individuals with more education and higher perceived community social capital played more cooperatively, but neither gender nor method of payment had a significant effect on individual behavior. When participants could repeat the game with communication, similar crop choice patterns were observed. The games provided an entry point for discussion of communities' understanding of the interconnectedness of groundwater use and crop choice.