894 results for Many-to-many assignment problem


Relevance: 100.00%

Abstract:

Compartmental systems are frequently used to model a variety of processes in areas such as biomedicine, ecology, and pharmacokinetics. In most practical applications, for instance those concerning the administration of drugs to patients undergoing surgery, uncertainty in the system parameters or in the system state is very common. Over recent years the analysis of compartmental systems has been developed extensively in the literature; the sensitivity of the stability of these systems in the presence of uncertainty, however, has received far less attention. In this thesis we consider a state-feedback control law with positivity constraints and analyse its robustness when applied to linear time-invariant compartmental systems with parameter uncertainty. Furthermore, for linear time-invariant systems with unknown initial state, we combine this control law with a state observer, and the robustness of the resulting control law is also analysed. The control of neuromuscular blockade by continuous infusion of a muscle relaxant can be modelled as a three-compartment system and has been studied by several research groups. In this thesis our results are applied to this control problem, and strategies for improving the results obtained are provided.
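As a toy illustration of the kind of system studied here, the sketch below simulates a linear three-compartment model under a state-feedback law whose input is clipped at zero, mimicking the positivity constraint on the infusion rate. All numerical values (the matrix A, vectors b and K, and the reference r) are hypothetical, chosen only so that A is a compartmental matrix; this is not the thesis's actual model.

```python
# Illustrative three-compartment model (all parameters hypothetical):
#   dx/dt = A x + b u,  with u the drug infusion rate.
# A is compartmental: Metzler, with non-positive column sums.
A = [[-0.5,  0.2,  0.1],
     [ 0.3, -0.4,  0.0],
     [ 0.2,  0.0, -0.3]]
b = [1.0, 0.0, 0.0]
K = [0.4, 0.1, 0.1]      # state-feedback gain (hypothetical)

def simulate(x0, r=1.0, t_end=20.0, dt=0.01):
    """Forward-Euler simulation of the closed loop with the
    positivity-constrained law u = max(0, r - K.x): the infusion
    rate can never go negative, matching the physical constraint."""
    x = list(x0)
    for _ in range(int(t_end / dt)):
        u = max(0.0, r - sum(k * xi for k, xi in zip(K, x)))
        dx = [sum(A[i][j] * x[j] for j in range(3)) + b[i] * u
              for i in range(3)]
        x = [xi + dt * dxi for xi, dxi in zip(x, dx)]
    return x
```

Because A is Metzler and the input is kept non-negative, the state stays (numerically) non-negative, which is the defining property of a compartmental system.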

Relevance: 100.00%

Abstract:

Data compression is the computing technique that aims to reduce the size of information in order to minimize the storage space required and to speed up data transmission over bandwidth-limited networks. Several compression techniques, such as LZ77 and its variants, suffer from a problem that we call redundancy caused by the multiplicity of encodings. Multiplicity of encodings (ME) means that the source data can be encoded in different ways. In its simplest case, ME occurs when a compression technique has the possibility, during the encoding process, of coding a symbol in different ways. The bit-recycling compression technique was introduced by D. Dubé and V. Beaudoin to minimize the redundancy caused by ME. Variants of bit recycling have been applied to LZ77, and the experimental results obtained lead to better compression (a reduction of about 9% in the size of files compressed by Gzip, by exploiting ME). Dubé and Beaudoin pointed out that their technique may not minimize the redundancy caused by ME perfectly, because it is built on Huffman coding, which cannot handle codewords of fractional length; that is, it can only generate codewords of integral length. Moreover, Huffman-based bit recycling (HuBR) imposes additional constraints to avoid certain situations that reduce its performance. Unlike Huffman codes, arithmetic coding (AC) can handle codewords of fractional length. In addition, over the last decades arithmetic codes have attracted many researchers, since they are more powerful and more flexible than Huffman codes.
Consequently, this work aims to adapt bit recycling to arithmetic codes in order to improve coding efficiency and flexibility. We addressed this problem through our four (published) contributions, which are presented in this thesis and can be summarized as follows. First, we propose a new technique for adapting Huffman-based bit recycling (HuBR) to arithmetic coding, named arithmetic-coding-based bit recycling (ACBR); we describe the framework and the principles of the adaptation of HuBR to ACBR. We also present the theoretical analysis needed to estimate the redundancy that HuBR and ACBR can remove in applications that suffer from ME. This analysis shows that ACBR achieves perfect recycling in all cases, whereas HuBR achieves such performance only in very specific cases. Second, the problem with the ACBR technique above is that it requires arbitrary-precision computations, which demand unlimited (or infinite) resources. To make it usable, we propose a new finite-precision version; the technique thereby becomes efficient and applicable on computers with ordinary fixed-size registers, and can easily be interfaced with applications that suffer from ME. Third, we propose the use of HuBR and ACBR as a means of reducing redundancy in order to obtain a variable-to-fixed binary code. We proved theoretically and experimentally that both techniques yield a significant improvement (less redundancy). In this respect, ACBR outperforms HuBR and covers a wider class of binary sources that can benefit from a plurally parsable dictionary. In addition, we show that ACBR is more flexible than HuBR in practice.
Fourth, we use HuBR to reduce the redundancy of the balanced codes generated by Knuth's algorithm. To compare the performance of HuBR and ACBR, the corresponding theoretical results for HuBR and ACBR are presented. The results show that the two techniques achieve almost the same redundancy reduction on the balanced codes generated by Knuth's algorithm.
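The multiplicity of encodings that bit recycling exploits can be made concrete with a toy LZ77 parser: at each position the encoder may emit either a literal or any available back-reference, so the same string often admits several distinct encodings. The function below is a simplified counting model of that choice, not Dubé and Beaudoin's encoder.

```python
from functools import lru_cache

def lz77_encodings(data, window=16, min_len=2):
    """Count the distinct LZ77 parsings of `data` in a toy model:
    at each position the encoder emits either a literal or any match
    of length >= min_len starting in the sliding window. A count
    greater than one is exactly the multiplicity of encodings (ME)."""
    @lru_cache(maxsize=None)
    def count(i):
        if i == len(data):
            return 1
        total = count(i + 1)                 # encode data[i] as a literal
        for j in range(max(0, i - window), i):
            length = 0
            while (i + length < len(data)
                   and data[j + length] == data[i + length]):
                length += 1
            for l in range(min_len, length + 1):
                total += count(i + l)        # encode an (offset, l) match
        return total

    return count(0)
```

For example, "abab" has two parsings (four literals, or "ab" followed by a length-2 match), and each extra parsing is one symbol's worth of choice that bit recycling can turn back into information.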

Relevance: 100.00%

Abstract:

Report of supervised teaching practice, Master's in Teaching of Mathematics, Universidade de Lisboa, 2011

Relevance: 100.00%

Abstract:

Report of supervised teaching practice, Teaching of Visual Arts, Universidade de Lisboa, 2013

Relevance: 100.00%

Abstract:

Doctoral thesis, Education (Didactics of Science), Universidade de Lisboa, Instituto de Educação, 2015

Relevance: 100.00%

Abstract:

The paper introduces an approach to the problem of generating a sequence of jobs that minimizes the total weighted tardiness for a set of jobs to be processed on a single machine. An Ant Colony System based algorithm is validated with benchmark problems available in the OR-Library. The results obtained were compared with the best available results and were found to be near the optimal; the computational results support conclusions about the efficiency and effectiveness of the algorithm.
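The objective optimized by such an algorithm is easy to state in code. The sketch below evaluates the total weighted tardiness of a job sequence and, for a toy three-job instance (numbers invented for illustration), finds the best sequence by brute force, the kind of sanity check one might run before comparing against OR-Library benchmark optima.

```python
from itertools import permutations

def total_weighted_tardiness(seq, proc, due, weight):
    """Process the jobs of `seq` in order on one machine and return
    sum_j w_j * max(0, C_j - d_j), the objective the ants minimize."""
    t = total = 0
    for j in seq:
        t += proc[j]                          # completion time C_j
        total += weight[j] * max(0, t - due[j])
    return total

# Toy instance: processing times, due dates, weights (illustrative only).
proc, due, weight = [3, 2, 1], [2, 4, 6], [2, 1, 1]
best = min(permutations(range(3)),
           key=lambda s: total_weighted_tardiness(s, proc, due, weight))
```

Brute force is only viable for tiny instances (n! sequences); the Ant Colony System builds sequences probabilistically from pheromone and heuristic information instead.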

Relevance: 100.00%

Abstract:

Consider the problem of assigning real-time tasks on a heterogeneous multiprocessor platform comprising two different types of processors; such a platform is referred to as a two-type platform. We present two linearithmic time-complexity algorithms, SA and SA-P, each providing the following guarantee. For a given two-type platform and a given task set, if there exists a feasible task-to-processor-type assignment such that tasks can be scheduled to meet deadlines by allowing them to migrate only between processors of the same type, then (i) using SA, it is guaranteed to find such a feasible task-to-processor-type assignment, where the same restriction on task migration applies, given a platform in which processors are 1+α/2 times faster, and (ii) SA-P succeeds in finding a feasible task-to-processor assignment where tasks are not allowed to migrate between processors, given a platform in which processors are 1+α times faster, where 0<α≤1. The parameter α is a property of the task set; it is the maximum utilization of any task, which is less than or equal to 1.
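Under the stated guarantee, the only quantity needed from the task set to state the two speedup bounds is α, the maximum task utilization. A minimal sketch, assuming per-task utilizations have already been computed (the function name and interface are hypothetical, introduced only for illustration):

```python
def speedup_bounds(utilizations):
    """Given the utilization of each task (each in (0, 1]), return the
    processor-speed factors under which the abstract's guarantees hold:
    1 + alpha/2 for SA (intra-type migration allowed) and
    1 + alpha for SA-P (no migration), where alpha = max utilization."""
    alpha = max(utilizations)
    assert 0 < alpha <= 1, "each task utilization must lie in (0, 1]"
    return 1 + alpha / 2, 1 + alpha
```

A task set of low-utilization tasks (small α) therefore needs almost no extra speed, while a task set containing a utilization-1 task needs up to 1.5x (SA) or 2x (SA-P) faster processors.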

Relevance: 100.00%

Abstract:

Master's final project submitted to obtain the degree of Master in Communication Networks and Multimedia Engineering

Relevance: 100.00%

Abstract:

Load forecasting has gradually become a major field of research in the electricity industry. It is extremely important for the electric sector under a deregulated environment, as it provides useful support to power system management. Accurate load forecasting models are required for the operation and planning of a utility company, and they have received increasing attention from researchers in this field. Many mathematical methods have been developed for load forecasting. This work aims to develop and implement a method for short-term load forecasting (STLF) based on Holt-Winters exponential smoothing and an artificial neural network (ANN). One of the main contributions of this paper is the application of the Holt-Winters exponential smoothing approach to the forecasting problem; in addition, as an evaluation against past forecasting work, data mining techniques are also applied to short-term load forecasting. Both the ANN and the Holt-Winters exponential smoothing approaches are compared and evaluated.
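Holt-Winters smoothing for seasonal load data can be sketched in a few lines. The routine below is a generic additive triple-exponential-smoothing implementation with naive initialization, not the paper's exact model; the smoothing parameters are illustrative defaults.

```python
def holt_winters_additive(series, season_len, alpha=0.3, beta=0.05,
                          gamma=0.1, horizon=1):
    """Additive Holt-Winters: maintain level, trend, and seasonal
    components, initialized naively from the first two seasons
    (assumption: `series` spans at least two full seasons).
    Returns `horizon` forecasts beyond the end of `series`."""
    level = sum(series[:season_len]) / season_len
    trend = (sum(series[season_len:2 * season_len])
             - sum(series[:season_len])) / season_len ** 2
    seasonal = [x - level for x in series[:season_len]]

    for t, x in enumerate(series[season_len:], start=season_len):
        s = seasonal[t % season_len]
        last_level = level
        level = alpha * (x - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonal[t % season_len] = gamma * (x - level) + (1 - gamma) * s

    return [level + (h + 1) * trend
            + seasonal[(len(series) + h) % season_len]
            for h in range(horizon)]
```

For hourly load with a daily pattern one would use season_len=24; in practice the smoothing constants are tuned on held-out data, which is where the comparison against the ANN comes in.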

Relevance: 100.00%

Abstract:

Violence has always been a part of the human experience, and therefore, a popular topic for research. It is a controversial issue, mostly because the possible sources of violent behaviour are so varied, encompassing both biological and environmental factors. However, very little disagreement is found regarding the severity of this societal problem. Most researchers agree that the number and intensity of aggressive acts among adults and children is growing. Not surprisingly, many educational policies, programs, and curricula have been developed to address this concern. The research favours programs which address the root causes of violence and seek to prevent rather than provide consequences for the undesirable behaviour. But what makes a violence prevention program effective? How should educators choose among the many curricula on the market? After reviewing the literature surrounding violence prevention programs and their effectiveness, the Second Step Violence Prevention Curriculum surfaced as unique in many ways. It was designed to address the root causes of violence in an active, student-centred way. Empathy training, anger management, interpersonal cognitive problem solving, and behavioural social skills form the basis of this program. Published in 1992, the program has been the topic of limited research, almost entirely carried out using quantitative methodologies. The purpose of this study was to understand what happens when the Second Step Violence Prevention Curriculum is implemented with a group of students and teachers. I was not seeking a statistical correlation between the frequency of violence and program delivery, as in most prior research. Rather, I wished to gain a deeper understanding of the impact of the program through the eyes of the participants. The Second Step Program was taught to a small, primary level, general learning disabilities class by a teacher and student teacher.
Data were gathered using interviews with the teachers, personal observations, staff reports, and my own journal. Common themes across the four types of data collection emerged during the study, and these themes were isolated and explored for meaning. Findings indicate that the program does not offer a "quick fix" to this serious problem. However, several important discoveries were made. The teachers felt that the program was effective despite a lack of concrete evidence to support this claim. They used the Second Step strategies outside their actual instructional time and felt it made them better educators and disciplinarians. The students did not display a marked change in their behaviour during or after the program implementation, but they were better able to speak about their actions, the source of their aggression, and the alternatives which were available. Although they were not yet transferring their knowledge into positive action, a heightened awareness was evident. Finally, staff reports and my own journal led me to a deeper understanding of how perception frames reality. The perception that the program was working led everyone to feel more empowered when a violent incident occurred, and efforts were made to address the cause rather than merely to offer consequences. A general feeling that we were addressing the problem in a productive way was prevalent among the staff and students involved. The findings from this investigation have many implications for research and practice. Further study into the realm of violence prevention is greatly needed, using a balance of quantitative and qualitative methodologies. Such a serious problem can only be effectively addressed with a greater understanding of its complexities. This study also demonstrates the overall positive impact of the Second Step Violence Prevention Curriculum and, therefore, supports its continued use in our schools.

Relevance: 100.00%

Abstract:

Introduction. The question of the meaning, methods and philosophical manifestations of history is currently rife with contention. The problem that I will address in an exposition of the thought of Wilhelm Dilthey and Martin Heidegger centers around the intersubjectivity of an historical world. Specifically, there are two interconnected issues. First, since all knowledge occurs to a person from within his or her historical age, how can any person in any age make truth claims? In order to answer this concern we must understand the essence and role of history. Yet how can we come to an individual understanding of what history is when the meanings that we use are themselves historically enveloped? But can we, who are well aware of the knowledge that archaeology has dredged up from old texts or even from 'living' monuments of past ages, really neglect to notice these artifacts that exist within and enrich our world? Charges of wilful blindness would arise if any attempt were made to suggest that certain things of our world did not come down to us from the past. Thus it appears more important to determine what this 'past' is, and therefore how history operates, than to simply derail the possibility for historical understanding. Wilhelm Dilthey, the great German historicist of the 19th century, did not question the existence of historical artifacts as from the past, but in treating knowledge as one such artifact placed the onus on knowledge to show itself as true, or meaningful, in light of the fact that other historical periods relied on different facts and generated different truths or meanings. The problem for him was not just determining what the role of history is, but moreover to discover how knowledge could make any claim as true knowledge. As he stated, there is a problem of "historical anarchy". Martin Heidegger picked up these two strands of Dilthey's thought and wanted to answer the problem of truth and meaning in order to solve the problem of historicism.
This problem underscored, perhaps for the first time, that societal presuppositions about the past and present of their era are not immutable. Penetrating to the core of the raison d'etre of the age was an historical reflection about the past, which was now conceived as separated both temporally and attitudinally from the present. But further than this, Heidegger's focus on asking the question of the meaning of Being meant that history must be ontologically explicated, not merely ontically treated. Heidegger hopes to remove barriers to a genuine ontology by including history in an assessment of previous philosophical systems. He does this in order that the question of Being be more fully explicated, which necessarily for him includes the question of the Being of history. One approach to the question of what history is, given the information that we get from historical knowledge, is whether such knowledge can be formalized into a science. Additionally, we can approach the question of what the essence and role of history is by revealing its underlying characteristics, that is, by focussing on historicality. Thus we will begin with an expository look at Dilthey's conception of history and historicality. We will then explore these issues first in Heidegger's Being and Time, then, in the third chapter, in his middle and later works. Finally, we shall examine how Heidegger's conception may reflect a development in the conception of historicality over Dilthey's historicism, and what such a conception means for a contemporary historical understanding. The problem of existing in a common world which is perceived only individually has been philosophically addressed in many forms. Escaping a pure subjectivist interpretation of 'reality' has occupied Western thinkers not only in order to discover metaphysical truths, but also to provide a foundation for politics and ethics.
Many thinkers accept a solipsistic view as inevitable and reject attempts at justifying truth in an intersubjective world. The problem of historicality raises similar problems. We exist in a common historical age, presumably, yet are only aware of the historicity of the age through our own individual thoughts. Thus the question arises: do we actually exist within a common history, or do we merely individually interpret this as communal? What is the reality of history, individual or communal? Dilthey answers this question by asserting a 'reality' to the historical age, thus overcoming solipsism by encasing individual human experience within the historical horizon of the age. This, however, does nothing to address the epistemological concern over the discoverability of truth. Heidegger, on the other hand, rejects a metaphysical construal of history and seeks to ground history first within the ontology of Dasein, and second, within the so-called "sending" of Being. Thus there can be no solipsism for Heidegger because Dasein's Being is necessarily "co-historical", Being-with-Others, and furthermore, this historical-Being-in-the-world-with-Others is the horizon of Being over which truth can appear. Heidegger's solution to the problem of solipsism appears to satisfy that the world is not just a subjective idealist creation and also that one need not appeal to any universal measures of truth or presumed eternal verities. Thus in elucidating Heidegger's notion of history I will also confront the issues of Dasein's Being-alongside-things as well as the Being of Dasein as Being-in-the-world, so that Dasein's historicality is explicated vis-a-vis the "sending of Being" (die Schicken des Seins).

Relevance: 100.00%

Abstract:

This study explored motivations of mid-life women over 30 years old who had returned to school. It sought to find whether these women returned to solve a problem arising from life events, whether viewing a problem was related to internal or external motivation, whether this perception was related to having greater coping skills, and whether having greater coping was related to seeking support from internal or external sources. This study examined which emotions were most related to viewing a life event as a problem. Finally, it explored the results of previous research of mid-life women in their role as a student. Women (N=83) from three types of institutions volunteered for this study: a university (N=34), a college (N=28), and an adult education centre (N=21). Participants took home a questionnaire package, a 13-page questionnaire and consent form, that were completed and mailed back to the researcher in pre-paid envelopes. Results showed that women over 30 seek education as a solution to a life event problem. External motivation was related to a life event being a problem (p<.005). There was a significant difference in coping scores between institutions. Moods that were related to viewing a life event as problematic were: anger and depressive moods (p<.001), fatigue and vigor (p<.01), and tension/anxiety (p<.05). Mid-life women students' satisfaction in this role was related to being externally motivated. These women sought support from both internal and external sources, rarely had social interactions with peers, and viewed this role as important, yet temporary, in that it will help them change their lives. Implications of the results suggest further exploration of the roles of anger and depression in motivating women over 30 to learn, and finding ways of directing women to use their emotional intelligence to seek out learning.

Relevance: 100.00%

Abstract:

The following properties of the core of a one-to-one matching problem are well-known: (i) the core is non-empty; (ii) the core is a lattice; and (iii) the set of unmatched agents is identical for any two matchings belonging to the core. The literature on two-sided matching focuses almost exclusively on the core and studies its properties extensively. Our main result is the following characterization of (von Neumann-Morgenstern) stable sets: a set of matchings is a stable set of a one-to-one matching problem only if it is a maximal set satisfying the following properties: (a) the core is a subset of the set; (b) the set is a lattice; (c) the set of unmatched agents is identical for any two matchings belonging to the set. Furthermore, a set is a stable set if it is the unique maximal set satisfying properties (a), (b) and (c). We also show that our main result does not extend from one-to-one matching problems to many-to-one matching problems.
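A matching in the (non-empty) core can be computed with Gale-Shapley deferred acceptance; the sketch below is that standard textbook construction for the one-to-one marriage problem, not the paper's characterization machinery for stable sets.

```python
def deferred_acceptance(men_prefs, women_prefs):
    """Gale-Shapley deferred acceptance with men proposing.
    Returns a stable matching, which always lies in the core of the
    one-to-one problem. Preferences are dicts mapping each agent to
    an ordered list of acceptable partners (complete lists assumed)."""
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    free = list(men_prefs)                # men not yet engaged
    next_choice = {m: 0 for m in men_prefs}
    engaged = {}                          # woman -> man
    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]  # m's best not-yet-tried woman
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:
            free.append(engaged[w])       # w trades up, jilted man is free
            engaged[w] = m
        else:
            free.append(m)                # w rejects m
    return {m: w for w, m in engaged.items()}
```

Running both the men-proposing and women-proposing versions yields the two extreme points of the core's lattice structure mentioned in property (ii).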

Relevance: 100.00%

Abstract:

In this master's thesis, we present a new type of single-vehicle pickup-and-delivery routing problem with a loading constraint. This variant is motivated by similar problems reported in the literature. The vehicle in question contains several stacks in which items of different heights are piled during transport. The total height of the items contained in each stack cannot exceed a given maximum height. No rehandling is allowed when delivering an item, which means that the item must be on top of a stack at the moment it is delivered. Moreover, any item i picked up before an item j and placed in the same stack must be delivered after j. A large neighbourhood heuristic, based on recent work in the field, is proposed as the solution method. Numerical results are reported for several classical instances as well as for new instances.
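The loading constraints described above (a height capacity per stack, plus last-in-first-out delivery, which subsumes the precedence rule between items i and j in the same stack) can be captured by a small feasibility check. The sketch below is a toy model with hypothetical data structures, not the thesis's heuristic; any large-neighbourhood search would call something like it when evaluating a candidate route.

```python
def route_feasible(route, heights, assignment, max_height, n_stacks):
    """Check the loading constraints along a single-vehicle route.
    route: list of (op, item) with op in {'pick', 'drop'};
    assignment: item -> stack index. Each stack is LIFO: an item can
    only be dropped if it is on top (no rehandling), and the stack
    height never exceeds max_height."""
    stacks = [[] for _ in range(n_stacks)]
    for op, item in route:
        s = stacks[assignment[item]]
        if op == 'pick':
            if sum(heights[i] for i in s) + heights[item] > max_height:
                return False          # height capacity violated
            s.append(item)
        else:                         # 'drop'
            if not s or s[-1] != item:
                return False          # item not on top: rehandling needed
            s.pop()
    return True
```

The top-of-stack check enforces exactly the stated precedence rule: if i was picked up before j in the same stack, j sits above i and must be delivered first.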

Relevance: 100.00%

Abstract:

Background. The ABO and Rh(D) phenotypes of blood donors and of transfused patients are routinely analysed to ensure full compatibility. These analyses are performed by agglutination following an antibody-antigen reaction. However, because of the prohibitive cost and analysis time, blood donations are not routinely tested for the minor blood antigens. This gap can result in allo-immunization of recipient patients against one or more minor antigens and thus lead to severe complications in future transfusions. Study design and methods. To address this problem, we produced a genetic panel based on Beckman Coulter's GenomeLab SNPstream technology, with the aim of analysing 22 minor blood antigens simultaneously. The DNA source is the patients' white blood cells, previously isolated on FTA paper. Results. The results show that the genotype discordance rate, measured by correlating the genotyping results from the two DNA strands, as well as the genotyping failure rate, are very low (0.1%). Likewise, the correlation between the phenotypes predicted by genotyping and the actual phenotypes obtained by serology of red blood cells and platelets varies between 97% and 100%. Experimental or database-processing errors, as well as rare polymorphisms affecting antigen conformation, could explain the differences in results. However, given that the phenotypes obtained by genotyping will always be co-verified before any blood transfusion using the standard technologies approved by government authorities, the correlation rates obtained are well above the success criteria expected for the project. Conclusion. Genetic profiling of the minor blood antigens will make it possible to create a centralized database of donor phenotypes, allowing blood banks to quickly find compatible profiles between donors and recipients.