585 results for Coûts fixes
Abstract:
Code duplication is common in current programming practice: programmers search for snippets of code, incorporate them into their projects, and then modify them to their needs. Today, no automated scheme is in place to inform either party of remote changes to the code. As code snippets continue to evolve both on the side of the user and on the side of the author, both may wish to benefit from remote bug fixes or refinements; authors may be interested in the actual usage of their code snippets, and researchers could gather information on clone usage. We propose maintaining a link between software clones across repositories and outline how such links can be created and maintained.
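As a rough sketch of what such a clone link could record, the following hypothetical Python structure stores the origin repository, commit, path and a content hash of the copied snippet, and flags the link as stale once the upstream snippet no longer matches. The field names and the staleness check are illustrative assumptions, not the scheme proposed by the authors.

# Hypothetical sketch of a clone-link record; fields and the staleness
# check are illustrative assumptions, not the paper's actual design.
import hashlib
from dataclasses import dataclass

@dataclass
class CloneLink:
    origin_repo: str      # repository the snippet was copied from
    origin_commit: str    # commit hash at copy time
    origin_path: str      # file the snippet came from
    snippet_sha256: str   # hash of the snippet text at copy time

def snippet_hash(text: str) -> str:
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def is_stale(link: CloneLink, current_upstream_text: str) -> bool:
    """True if the upstream snippet has diverged since the copy was made."""
    return snippet_hash(current_upstream_text) != link.snippet_sha256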
Abstract:
The traditional binary split between races or cultures is today being challenged by authors of the post-colony, notably Léonora Miano, author of the corpus analysed here through the lens of hybridity. It emerges that the characters summoned in the novel, the manner of writing, the experience of time and space, and finally the plot each represent an instance of hybridity. The main character takes us wandering through the meanders of the night: the often bleak and disavowed realities of Africa, admittedly with some exaggeration, but above all in order to affirm, with Madhu Krishnan, "hybridisation as a judicious and balanced way of imagining the African continent and its nations." Very often the victim of mutual rejection on both sides of the borders between which they find themselves, the hybrid hero disrupts the established order on both sides, breaking every barrier and social canon, most often in the quest for a true identity that does not exist unless it is imposed. The author celebrates impurity and grants it a power that moves mountains to make a place for itself, in defiance of tradition and of principles deemed obsolete in this new globalised world. /// Léonora Miano (1973- ); Cameroon; Postcolonialism
Abstract:
This project looked at the nature, contents, methods, means and legal and political effects of the influence that constitutional courts exercise upon the legislative and executive powers in the newly established democracies of Central and Eastern Europe. The basic hypothesis was that these courts work to provide a limitation of political power within the framework of the principal constitutional values and that they force the legislature and executive to exercise their powers and duties in strict accordance with the constitution. Following a study of the documentary sources, including primarily the relevant constitutional and statutory provisions and decisions of constitutional courts, Mr. Cvetkovski prepared a questionnaire on various aspects of the topics researched and sent it to the respective constitutional courts. A series of direct interviews with court officials in six of the ten countries then served to clarify a large number of questions relating to differences in procedures etc. that arose from the questionnaires. As a final stage, the findings were compared with those described in recent publications on constitutional control in general and in Central and Eastern Europe in particular. The study began by considering the constitutional and political environment of the constitutional courts' activities in controlling legislative and executive powers, which in all countries studied are based on the principles of the rule of law and the separation of powers. All courts are separate bodies with special status in terms of constitutional law and are independent of other political and judicial institutions. The range of matters within their jurisdiction is set by the constitution of the country in question but in all cases can be exercised only within the framework of procedural rules. This gives considerable significance to the question of who sets these rules, and different countries have dealt with it in different ways. In some there is a special constitutional law with the same legal force as the constitution itself (Croatia); the majority of countries allow for regulation by an ordinary law; Macedonia gives the court the autonomy to create and change its own rules of procedure; while in Hungary the parliament fixes the rules of procedure at the suggestion of the constitutional court. The question of the appointment of constitutional judges was also considered, as were the mechanisms for ensuring their impartiality and immunity. In the area of the courts' scope for providing normative control, considerable differences were found between the countries. In some cases the courts' jurisdiction is limited to the normative acts of the respective parliaments, and there is generally no provision for challenging unconstitutional omissions by the legislature and the executive. There are, however, some situations in which they may indirectly evaluate the constitutionality of legislative omissions, as when the constitution contains provision for a time limit on enacting legislation, when the parliament has made an omission in drafting a law which violates the constitutional provisions, or when a law grants favours to certain groups while excluding others, thereby violating the equal protection clause of the constitution. The control of constitutionality of normative acts can be either preventive or repressive, depending on whether it is implemented before or after the promulgation of the law or other enactment being challenged.
In most countries in the region the constitutional courts provide only repressive control, although in Hungary and Poland the courts are competent to perform both preventive and repressive norm control, while in Romania the court's jurisdiction is limited to preventive norm control. Most countries are wary of vesting constitutional courts with preventive norm control because of the danger of their becoming too involved in the day-to-day political debate, but Mr. Cvetkovski points out certain advantages of such control. First, if combined with a short time limit, it can provide early clarification of a constitutional issue; secondly, it avoids the problems that arise if a law that has been in force for some years is declared to be unconstitutional; and thirdly, it may help preserve the prestige of the legislation. Its disadvantages include the difficulty of ascertaining the actual and potential consequences of a norm without the empirical experience of the administration and enforcement of the law, the desirability of a certain distance from the day-to-day arguments surrounding the political process of legislation, the possible effects of changing social and economic conditions, and the danger of placing obstacles in the way of rapid reactions to acute situations. Repressive norm control can be either abstract or concrete. The former is initiated by the supreme state organs in order to protect the abstract constitutional order, and the latter is initiated by ordinary courts, administrative authorities or individuals. Constitutional courts cannot directly oblige the legislature and executive to pass a new law; this remains a matter of legislative and executive political responsibility. In the case of Poland, the parliament even has the power to overturn a constitutional court decision by a special majority of votes, which means that the last word lies with the legislature. As the current constitutions of Central and Eastern European countries are newly adopted and differ significantly from the previous ones, the courts' interpretative functions should ensure a degree of unification in the application of the constitution. Some countries (Bulgaria, Hungary, Poland, Slovakia and Russia) provide for the constitutional courts' interpretative decisions to be binding. While their decisions inevitably have an influence on the actions of public bodies, they do not set criteria for political behaviour, which depends rather on the overall political culture and traditions of the society. All constitutions except that of Belarus provide for the courts to have jurisdiction over conflicts arising from the distribution of responsibilities between different organs and levels in the country, as well as for impeachment procedures against the head of state, and for determining the constitutionality of political parties (except in Belarus, Hungary, Russia and Slovakia). All the constitutions studied guarantee individual rights and freedoms, and most courts have jurisdiction over complaints that these constitutionally guaranteed rights have been violated. All courts also have some jurisdiction over international agreements and treaties, either directly (Belarus, Bulgaria and Hungary), before the treaty is ratified, or indirectly (Croatia, Czech Republic, Macedonia, Romania, Russia and Yugoslavia). In each country the question of who may initiate proceedings of norm control is of central importance and is usually regulated by the constitution itself.
There are three main possibilities: statutory organs, ordinary courts and private individuals, and the limitations on each of these are discussed in the report. Most courts are limited in their right to institute ex officio a full-scale review of a point of law, and such rights as they do have are rarely used. In most countries the courts' decisions do not have any binding force but must be approved by parliament or impose on parliament the obligation to bring the relevant law into conformity within a certain period. As a result, the courts' position is generally weaker than in other countries in Europe, with parliament remaining the supreme body. In the case of preventive norm control, a finding of unconstitutionality may act to suspend the law and/or to refer it back to the legislature, where in countries such as Romania it may even be overturned by a two-thirds majority. In repressive norm control a finding of unconstitutionality generally serves to take the relevant law out of legal force from the day of publication of the decision or from another date fixed by the court. If the law is annulled retrospectively this may or may not bring decisions of criminal courts under review, depending on the provisions laid down in the relevant constitution. In cases relating to conflicts of competencies the courts' decisions tend to be declaratory and so have a binding effect inter partes. In the case of a review of an individual act, decisions generally become effective primarily inter partes, but if the individual act has been based on an unconstitutional generally binding normative act of the legislature or executive, the finding has a quasi-legal effect, as it automatically initiates special proceedings in which the law or other regulation is to be annulled or abrogated with effect erga omnes. This wards off further application of the law and thus further violations of individual constitutional rights, and also obviates the need for further constitutional complaints against the same law. Thus the success of one individual's complaint extends to everyone else whose rights have equally been or might have been violated by the respective law. As the body whose act is repealed is obliged to adopt another act and in doing so is bound by the legal position of the constitutional court on the violation of constitutionally guaranteed freedoms and rights of the complainant, in this situation the decision of the constitutional court has the force of a precedent.
Abstract:
Theoretical studies of the problems of the securities markets in the Russian Federation incline to one or other of two traditional approaches. The first consists of comparing the definition of "valuable paper" set forth in the current legislation of the Russian Federation with the theoretical model of "Wertpapiere" elaborated by German scholars more than 90 years ago. The problem with this approach is, in Mr. Pentsov's opinion, that any new features of the definition of "security" that do not coincide with the theoretical model of "Wertpapiere" (such as valuable papers existing in non-material, electronic form) are claimed to be incorrect and removed from the current legislation of the Russian Federation. The second approach works on the basis of the differentiation between the Common Law concept of "security" and the Civil Law concept of "valuable paper". Mr. Pentsov's research, presented in an article written in English, uses both methodological tools and involves, firstly, a historical study of the origin and development of certain legal phenomena (securities) as they evolved in different countries, and secondly, a comparative, synchronic study of equivalent legal phenomena as they exist in different countries today. Employing the first method, Mr. Pentsov divided the historical development of the conception of "valuable paper" in Russia into five major stages. He found that, despite the existence of a relatively wide circulation of valuable papers, especially in the second half of the 19th century, Russian legislation before 1917 (the first stage) did not have a unified definition of valuable paper. The term was used in both theoretical studies and legislation, but it covered a broad range of financial instruments such as stocks, bonds, government bonds, promissory notes, bills of exchange, etc. During the second stage, too, the legislation of the USSR did not have a unified definition of "valuable paper". After the end of the "new economic policy" (1922-1930) the stock exchanges and the securities markets in the USSR were, with very few exceptions, abolished. Thus during the third stage (up to 1985), the use of valuable papers in practice was reduced to foreign economic relations (bills of exchange, stocks in enterprises outside the USSR) and to state bonds. Not surprisingly, there was still no unified definition of "valuable paper". After the beginning of Gorbachev's perestroika, a securities market began to re-appear in the USSR. However, the successful development of securities markets in the USSR was retarded by the absence of an appropriate regulatory framework. The first effort to improve the situation was the adoption of the Regulations on Valuable Papers, approved by resolution No. 590 of the Council of Ministers of the USSR, dated June 19, 1990. Section 1 of the Regulations contained the first statutory definition of "valuable paper" in the history of Russia. At the very beginning of the period of transition to a market economy, a number of acts contained different definitions of "valuable paper". This diversity clearly undermined the stability of the Russian securities market and did not achieve the goal of protecting the investor. The lack of unified criteria for the consideration of such non-standard financial instruments as "valuable papers" significantly contributed to the appearance of numerous fraudulent "pyramid" schemes that were outside the regulatory scheme of Russian legislation.
The situation was substantially improved by the adoption of the new Civil Code of the Russian Federation. According to Section 1 of Article 142 of the Civil Code, a valuable paper is a document that confirms, in compliance with an established form and mandatory requisites, certain material rights whose realisation or transfer is possible only upon its presentation. Finally, the recent Federal Law No. 39-FZ "On the Valuable Papers Market", dated April 22, 1996, has also introduced the term "emission valuable papers". According to Article 2 of this Law, an "emission valuable paper" is any valuable paper, including a non-documentary one, that simultaneously has the following features: it fixes the composition of material and non-material rights that are subject to confirmation, cession and unconditional realisation in compliance with the form and procedure established by this federal law; it is placed by issues; and it has equal amounts and times of realisation of rights within the same issue regardless of when the valuable paper was purchased. Thus the introduction of the conception of "emission valuable paper" became the starting point in the Russian Federation's legislation for the differentiation between the legal regimes of "commercial papers" and "investment papers", similar to the Common Law approach. Moving to the synchronic, comparative method of research, Mr. Pentsov notes that there are currently three major conceptions of "security" and, correspondingly, three approaches to its legal definition: the Common Law concept, the Continental Law concept, and the concept employed by Japanese law. Mr. Pentsov proceeds to analyse the differences and similarities of all three, concluding that though the concept of "security" in the Common Law system substantially differs from that of "valuable paper" in the Continental Law system, the two concepts are nevertheless developing in similar directions. He predicts that in the foreseeable future the existing differences between these two concepts will become less and less significant. On the basis of his research, Mr. Pentsov arrived at the conclusion that the concept of "security" (and its equivalents) is not a static one. On the contrary, it is in a process of permanent evolution that reflects the introduction of new financial instruments onto the capital markets. He believes that the scope of the statutory definition of "security" plays an extremely important role in the protection of investors. When passing the Securities Act of 1933, the United States Congress determined that the best way to achieve the goal of protecting investors was to define the term "security" in sufficiently broad and general terms so as to include within the definition the many types of instruments that in the commercial world fall within the ordinary concept of "security" and to cover the countless and various devices used by those who seek to use the money of others on the promise of profits. On the other hand, the very limited scope of the current definition of "emission valuable paper" in the Federal Law of the Russian Federation "On the Valuable Papers Market" does not allow the anti-fraud provisions of this law to be implemented in an efficient way. Consequently, there is no basis for the protection of investors. Mr. Pentsov proposes amendments which he believes would enable the Russian markets to become more efficient and attractive for both foreign and domestic investors.
Abstract:
Inexpensive, commercially available off-the-shelf (COTS) Global Positioning System (GPS) receivers have a typical accuracy of ±3 meters when augmented by the Wide Area Augmentation System (WAAS). Some applications require position measurements between two moving targets. The focus of this work is to explore the viability of using clusters of COTS GPS receivers for relative position measurements in order to improve their accuracy. An experimental study was performed using two clusters, each with five GPS receivers, with a fixed distance of 4.5 m between the clusters. Although the relative position was fixed, the entire system of ten GPS receivers was on a mobile platform. Data were recorded while moving the system over a rectangular track with a perimeter distance of 7564 m. The data were post-processed and yielded approximately 1-meter accuracy for the relative position vector between the two clusters.
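A minimal sketch of the underlying post-processing idea, assuming each cluster's simultaneous fixes are averaged and the relative vector is obtained by differencing the cluster means in a local east-north frame; the equirectangular projection and the synthetic fixes below are illustrative assumptions, not the study's actual pipeline.

# Sketch: relative position between two GPS receiver clusters by differencing
# cluster-mean positions in a local east-north (metres) frame.
import math

R_EARTH = 6371000.0  # mean Earth radius in metres

def to_local_en(lat_deg, lon_deg, lat0_deg, lon0_deg):
    """Project (lat, lon) to east/north metres around a reference point."""
    lat0 = math.radians(lat0_deg)
    east = math.radians(lon_deg - lon0_deg) * math.cos(lat0) * R_EARTH
    north = math.radians(lat_deg - lat0_deg) * R_EARTH
    return east, north

def cluster_mean_en(fixes, lat0, lon0):
    pts = [to_local_en(lat, lon, lat0, lon0) for lat, lon in fixes]
    n = len(pts)
    return sum(e for e, _ in pts) / n, sum(nn for _, nn in pts) / n

# Five simultaneous fixes per cluster (illustrative values, degrees).
cluster_a = [(40.00001, -3.00002), (40.00003, -3.00001), (39.99999, -3.00003),
             (40.00002, -2.99999), (40.00000, -3.00000)]
cluster_b = [(lat, lon + 5.3e-5) for lat, lon in cluster_a]  # roughly 4.5 m to the east

lat0, lon0 = cluster_a[0]
ea, na = cluster_mean_en(cluster_a, lat0, lon0)
eb, nb = cluster_mean_en(cluster_b, lat0, lon0)
de, dn = eb - ea, nb - na
print(f"relative vector: {de:.2f} m east, {dn:.2f} m north, range {math.hypot(de, dn):.2f} m")

Averaging within each cluster suppresses independent receiver noise, and differencing the two cluster means cancels error components common to both clusters, which is why the relative vector can be more accurate than either absolute position.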
Abstract:
The Hamilton-Waterloo problem and its spouse-avoiding variant for uniform cycle sizes ask whether Kv, where v is odd (or Kv - F, if v is even), can be decomposed into 2-factors in which each factor is made either entirely of m-cycles or entirely of n-cycles. This thesis examines the case in which r of the factors are made up of cycles of length 3 and s of the factors are made up of cycles of length 9, for any r and s. We also discuss a constructive solution to the general (m,n) case which fixes r and s.
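For reference, the counting constraints behind this setting are the standard necessary conditions, stated here in LaTeX as a sketch rather than as the thesis's results: each 2-factor of Kv uses v edges, so a 2-factorization of Kv (v odd) has exactly (v-1)/2 factors, and a uniform 2-factor of m-cycles forces m to divide v.

\[
  r + s = \frac{v-1}{2} \;(v \text{ odd}), \qquad
  r + s = \frac{v-2}{2} \;(v \text{ even, for } K_v - F), \qquad
  m \mid v \text{ if } r > 0, \quad n \mid v \text{ if } s > 0.
\]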
Abstract:
Various applications for the purposes of event detection, localization, and monitoring can benefit from the use of wireless sensor networks (WSNs). Wireless sensor networks are generally easy to deploy, have a flexible topology, and can support a diversity of tasks thanks to the large variety of sensors that can be attached to the wireless sensor nodes. To guarantee the efficient operation of such heterogeneous wireless sensor networks during their lifetime, appropriate management is necessary. Typically, there are three management tasks, namely monitoring, (re)configuration, and code updating. On the one hand, status information, such as battery state and node connectivity, of both the wireless sensor network and the sensor nodes has to be monitored. On the other hand, sensor nodes have to be (re)configured, e.g., by setting the sensing interval. Most importantly, new applications have to be deployed and bug fixes applied during the network lifetime. All management tasks have to be performed in a reliable, time- and energy-efficient manner. The ability to disseminate data from one sender to multiple receivers in a reliable, time- and energy-efficient manner is critical for the execution of the management tasks, especially for code updating. Using multicast communication in wireless sensor networks is an efficient way to handle such a traffic pattern. Due to the nature of code updates, a multicast protocol has to support bulky traffic and end-to-end reliability. Further, the limited resources of wireless sensor nodes demand an energy-efficient operation of the multicast protocol. Current data dissemination schemes do not fulfil all of the above requirements. In order to close this gap, we designed the Sensor Node Overlay Multicast (SNOMC) protocol to support reliable, time-efficient and energy-efficient dissemination of data from one sender node to multiple receivers. In contrast to other multicast transport protocols, which do not support reliability mechanisms, SNOMC supports end-to-end reliability using a NACK-based reliability mechanism. The mechanism is simple and easy to implement and can significantly reduce the number of transmissions. It is complemented by a data acknowledgement after successful reception of all data fragments by the receiver nodes. In SNOMC, three different caching strategies are integrated for efficient handling of the necessary retransmissions, namely caching on each intermediate node, caching on branching nodes, or caching only on the sender node. Moreover, an option was included to proactively request missing fragments. SNOMC was evaluated both in the OMNeT++ simulator and in our in-house real-world testbed, and compared to a number of common data dissemination protocols, such as Flooding, MPR, TinyCubus, PSFQ, and both UDP and TCP. The results showed that SNOMC outperforms the selected protocols in terms of transmission time, number of transmitted packets, and energy consumption. Moreover, we showed that SNOMC performs well with different underlying MAC protocols, which support different levels of reliability and energy efficiency. Thus, SNOMC can offer a robust, high-performing solution for the efficient distribution of code updates and management information in a wireless sensor network. To address the three management tasks, in this thesis we developed the Management Architecture for Wireless Sensor Networks (MARWIS). MARWIS is specifically designed for the management of heterogeneous wireless sensor networks.
A distinguishing feature of its design is the use of wireless mesh nodes as a backbone, which enables diverse communication platforms and the offloading of functionality from the sensor nodes to the mesh nodes. This hierarchical architecture allows for efficient operation of the management tasks, due to the organisation of the sensor nodes into small sub-networks, each managed by a mesh node. Furthermore, we developed an intuitive graphical user interface, which allows non-expert users to easily perform management tasks in the network. In contrast to other management frameworks, such as Mate, MANNA, and TinyCubus, or code dissemination protocols, such as Impala, Trickle, and Deluge, MARWIS offers an integrated solution for monitoring, configuration and code updating of sensor nodes. Integration of SNOMC into MARWIS further increases the performance efficiency of the management tasks. To our knowledge, our approach is the first to offer a combination of a management architecture with an efficient overlay multicast transport protocol. This combination of SNOMC and MARWIS supports reliable, time- and energy-efficient operation of a heterogeneous wireless sensor network.
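To make the NACK idea concrete, here is a toy sketch of a receiver that tracks which fragments of a bulk code update have arrived and reports the missing ones in a single NACK, while the sender answers from its cache. The fragment format, class names and in-memory "link" are invented for illustration and are not the SNOMC implementation.

# Toy illustration of NACK-based end-to-end reliability for bulk dissemination:
# the receiver names the missing fragments, the sender resends them from cache.
class Sender:
    def __init__(self, fragments):
        self.cache = dict(enumerate(fragments))  # seq -> payload, kept for retransmissions

    def transmit(self, drops):
        return {seq: data for seq, data in self.cache.items() if not drops(seq)}

    def retransmit(self, missing_seqs):
        return {seq: self.cache[seq] for seq in missing_seqs}

class Receiver:
    def __init__(self, total):
        self.total = total
        self.got = {}

    def deliver(self, frames):
        self.got.update(frames)

    def nack(self):
        """Sequence numbers still missing; an empty list means all fragments arrived."""
        return [seq for seq in range(self.total) if seq not in self.got]

fragments = [f"code-update-fragment-{i}".encode() for i in range(8)]
sender, receiver = Sender(fragments), Receiver(len(fragments))

# First pass over a lossy link that drops fragments 2 and 5.
receiver.deliver(sender.transmit(lambda seq: seq in (2, 5)))
missing = receiver.nack()
print("NACK for fragments:", missing)

# Sender answers the NACK from its cache; reception is now complete.
receiver.deliver(sender.retransmit(missing))
print("complete:", receiver.nack() == [])

In SNOMC the retransmission would be answered from whichever cache holds the fragment (an intermediate node, a branching node, or the sender); the sketch collapses all of that into a single sender-side cache.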
Abstract:
This article addresses the transformation of mountain regions into places of residence, replacing older economic sectors (agriculture, manufacturing, tourism) in the European mountains. It takes the perspective of regional development and its impact on fixed regional resources, the "territorial capital". This new trend affects the European mountains in two ways and contributes to the formation of metropolitan regions that combine metropolitan centres and leisure environments based on the attractions of the landscape to form new integrated entities. In the process, the landscape becomes a new and scarce consumer good that plays a role in the accumulation of invested capital. The article argues that the concepts of landscape aesthetics and amenities are not sufficient to explain this new dynamic, because they overlook the spatio-economic processes as well as the role that the commodification of the landscape plays for the new residents. These new residents have a profile that is more "multilocal" than migrant. Multilocality and the selective use of landscape products slow down the integration process, which is crucial for maintaining and developing territorial capital. It can be argued that the non-permanent presence of the new residents risks weakening rather than strengthening existing local structures. It therefore seems necessary to make specific efforts towards each group of new residents so that mere part-time residents become (at least part-time) regional actors. Moreover, the concept of regional development centred on innovative actors must be called into question insofar as the "consumption" aspect dominates the relationship to the landscape.
Abstract:
Detecting bugs as early as possible plays an important role in ensuring software quality before shipping. We argue that mining previous bug fixes can produce good knowledge about why bugs happen and how they are fixed. In this paper, we mine the change history of 717 open source projects to extract bug-fix patterns. We also manually inspect many of the bugs we found to gain insights into the contexts and reasons behind those bugs. For instance, we found that missing null checks and missing initializations are very recurrent, and we believe that they can be automatically detected and fixed.
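As a hypothetical illustration of one recurrent pattern mentioned here (a missing null/None check), the before-and-after sketch below shows the kind of fix such mining would surface; the function and field names are invented, not taken from the mined projects.

# Illustrative "missing null check" fix pattern (hypothetical code, not from
# the studied projects): the buggy version assumes the lookup always succeeds.
def get_owner_email_buggy(repo, user_id):
    user = repo.find_user(user_id)        # may return None
    return user.email                     # AttributeError when user is None

def get_owner_email_fixed(repo, user_id):
    user = repo.find_user(user_id)
    if user is None:                      # the added guard is the fix pattern
        return None
    return user.email

class FakeRepo:
    def find_user(self, user_id):
        return None                       # simulate a missing user

print(get_owner_email_fixed(FakeRepo(), 42))  # prints None instead of raising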
Abstract:
Previous studies on issue tracking systems for open source software (OSS) focused mainly on requests for bug fixes. However, requests to add a new feature or an improvement to an OSS project are often also made in an issue tracking system. These inquiries are particularly important because they determine the further development of the software. This study examines whether requests from the IBM developer community and those from other sources differ in their likelihood of successful implementation. Our study consists of a case study of the issue tracking system Bugzilla in the Eclipse integrated development environment (IDE). Our hypothesis, that feature requests from outsiders have a lower chance of being implemented than feature requests from IBM developers, was confirmed.
Abstract:
Outbreaks of the crown-of-thorns starfish (COTS), Acanthaster planci, contribute to major declines of coral reef ecosystems throughout the Indo-Pacific. As the oceans warm and decrease in pH due to increased anthropogenic CO2 production, coral reefs are also susceptible to bleaching, disease and reduced calcification. The impacts of ocean acidification and warming may be exacerbated by COTS predation, but it is not known how this major predator will fare in a changing ocean. Because larval success is a key driver of population outbreaks, we investigated the sensitivities of larval A. planci to increased temperature (2-4 °C above ambient) and acidification (0.3-0.5 pH units below ambient) in flow-through cross-factorial experiments (3 temperature × 3 pH/pCO2 levels). There was no effect of increased temperature or acidification on fertilization or very early development. Larvae reared at the optimal temperature (28 °C) were the largest across all pH treatments. Development to the advanced larval stage was negatively affected by the high temperature treatment (30 °C) and by both experimental pH levels (pH 7.6, 7.8). Thus, the planktonic life stages of A. planci may be negatively impacted by near-future global change. Increased temperature and reduced pH had an additive negative effect on larval size. The 30 °C treatment exceeded larval tolerance regardless of pH. As 30 °C sea surface temperatures may become the norm in low-latitude tropical regions, poleward migration of A. planci may be expected as the starfish follow optimal isotherms. In the absence of acclimation or adaptation, declines in low-latitude populations may occur. Poleward migration will be facilitated by strong western boundary currents, with possible negative flow-on effects on high-latitude coral reefs. The contrasting responses of the larvae of A. planci and those of its coral prey to ocean acidification and warming are considered in the context of potential future change in tropical reef ecosystems.
Abstract:
The ITER CODAC design identifies slow and fast plant system controllers (PSCs). The fast PSCs are based on embedded technologies, permit sampling rates greater than 1 kHz, meet stringent real-time requirements, and will be devoted to data acquisition tasks and control purposes. CIEMAT and UPM have implemented a prototype of a fast PSC based on commercial off-the-shelf (COTS) technologies, with PXI hardware and software based on EPICS.
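As a small, generic illustration of the EPICS software side mentioned here, the sketch below subscribes to a process variable using the pyepics client library; the PV name is hypothetical and the code is ordinary EPICS channel-access usage, not the CIEMAT/UPM prototype.

# Minimal pyepics sketch: monitor a (hypothetical) process variable published
# by a fast plant system controller. Only generic EPICS client calls are used.
import time
from epics import PV

def on_update(pvname=None, value=None, timestamp=None, **kw):
    # Called by pyepics on every value change of the monitored PV.
    print(f"{pvname} = {value} @ {timestamp}")

signal = PV("FPSC:ADC1:VALUE", callback=on_update)  # hypothetical PV name
time.sleep(5.0)  # keep the client alive long enough to receive monitor updates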
Abstract:
La segmentación de imágenes es un campo importante de la visión computacional y una de las áreas de investigación más activas, con aplicaciones en comprensión de imágenes, detección de objetos, reconocimiento facial, vigilancia de vídeo o procesamiento de imagen médica. La segmentación de imágenes es un problema difícil en general, pero especialmente en entornos científicos y biomédicos, donde las técnicas de adquisición de imagen proporcionan imágenes ruidosas. Además, en muchos de estos casos se necesita una precisión casi perfecta. En esta tesis, revisamos y comparamos primero algunas de las técnicas ampliamente usadas para la segmentación de imágenes médicas. Estas técnicas usan clasificadores a nivel de píxel e introducen una regularización sobre pares de píxeles que normalmente es insuficiente. Estudiamos las dificultades que presentan para capturar la información de alto nivel sobre los objetos a segmentar. Esta deficiencia da lugar a detecciones erróneas, bordes irregulares, configuraciones con topología errónea y formas inválidas. Para solucionar estos problemas, proponemos un nuevo método de regularización de alto nivel que aprende información topológica y de forma a partir de los datos de entrenamiento de una forma no paramétrica usando potenciales de orden superior. Los potenciales de orden superior se están popularizando en visión por computador, pero la representación exacta de un potencial de orden superior definido sobre muchas variables es computacionalmente inviable. Usamos una representación compacta de los potenciales basada en un conjunto finito de patrones aprendidos de los datos de entrenamiento que, a su vez, depende de las observaciones. Gracias a esta representación, los potenciales de orden superior pueden ser convertidos a potenciales de orden 2 con algunas variables auxiliares añadidas. Experimentos con imágenes reales y sintéticas confirman que nuestro modelo soluciona los errores de aproximaciones más débiles. Incluso con una regularización de alto nivel, una precisión exacta es inalcanzable, y se requiere la edición manual de los resultados de la segmentación automática. La edición manual es tediosa y pesada, y cualquier herramienta de ayuda es muy apreciada. Estas herramientas necesitan ser precisas, pero también lo suficientemente rápidas para ser usadas de forma interactiva. Los contornos activos son una buena solución: son buenos para detecciones precisas de fronteras y, en lugar de buscar una solución global, proporcionan un ajuste fino a resultados que ya existían previamente. Sin embargo, requieren una representación implícita que les permita trabajar con cambios topológicos del contorno, y esto da lugar a ecuaciones en derivadas parciales (EDP) que son costosas de resolver computacionalmente y pueden presentar problemas de estabilidad numérica. Presentamos una aproximación morfológica a la evolución de contornos basada en un nuevo operador morfológico de curvatura que es válido para superficies de cualquier dimensión. Aproximamos la solución numérica de la EDP de la evolución de contorno mediante la aplicación sucesiva de un conjunto de operadores morfológicos aplicados sobre una función de conjuntos de nivel. Estos operadores son muy rápidos, no sufren de problemas de estabilidad numérica y no degradan la función de conjuntos de nivel, de modo que no hay necesidad de reinicializarla. Además, su implementación es mucho más sencilla que la de las EDP, ya que no requieren usar sofisticados algoritmos numéricos.
Desde un punto de vista teórico, profundizamos en las conexiones entre operadores morfológicos y diferenciales, e introducimos nuevos resultados en esta área. Validamos nuestra aproximación proporcionando una implementación morfológica de los contornos geodésicos activos, los contornos activos sin bordes, y los turbopíxeles. En los experimentos realizados, las implementaciones morfológicas convergen a soluciones equivalentes a aquéllas logradas mediante soluciones numéricas tradicionales, pero con ganancias significativas en simplicidad, velocidad y estabilidad.

ABSTRACT

Image segmentation is an important field in computer vision and one of its most active research areas, with applications in image understanding, object detection, face recognition, video surveillance or medical image processing. Image segmentation is a challenging problem in general, but especially in the biological and medical image fields, where the imaging techniques usually produce cluttered and noisy images and near-perfect accuracy is required in many cases. In this thesis we first review and compare some standard techniques widely used for medical image segmentation. These techniques use pixel-wise classifiers and introduce weak pairwise regularization, which is insufficient in many cases. We study their difficulties in capturing high-level structural information about the objects to segment. This deficiency leads to many erroneous detections, ragged boundaries, incorrect topological configurations and wrong shapes. To deal with these problems, we propose a new regularization method that learns shape and topological information from training data in a nonparametric way using high-order potentials. High-order potentials are becoming increasingly popular in computer vision. However, the exact representation of a general higher-order potential defined over many variables is computationally infeasible. We use a compact representation of the potentials based on a finite set of patterns learned from training data that, in turn, depends on the observations. Thanks to this representation, high-order potentials can be converted into pairwise potentials with some added auxiliary variables and minimized with tree-reweighted message passing (TRW) and belief propagation (BP) techniques. Both synthetic and real experiments confirm that our model fixes the errors of weaker approaches. Even with high-level regularization, perfect accuracy is still unattainable, and human editing of the segmentation results is necessary. Manual editing is tedious and cumbersome, and tools that assist the user are greatly appreciated. These tools need to be precise, but also fast enough to be used interactively. Active contours are a good solution: they are good for precise boundary detection and, instead of finding a global solution, they provide a fine tuning of previously existing results. However, they require an implicit representation to deal with topological changes of the contour, and this leads to PDEs that are computationally costly to solve and may present numerical stability issues. We present a morphological approach to contour evolution based on a new curvature morphological operator valid for surfaces of any dimension. We approximate the numerical solution of the contour evolution PDE by the successive application of a set of morphological operators defined on a binary level set. These operators are very fast, do not suffer from numerical stability issues, and do not degrade the level set function, so there is no need to reinitialize it.
Moreover, their implementation is much easier than their PDE counterpart, since they do not require the use of sophisticated numerical algorithms. From a theoretical point of view, we delve into the connections between differential and morphological operators, and introduce novel results in this area. We validate the approach by providing a morphological implementation of the geodesic active contours, the active contours without edges, and the turbopixels. In the experiments conducted, the morphological implementations converge to solutions equivalent to those achieved by traditional numerical solutions, but with significant gains in simplicity, speed, and stability.
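A contour-evolution scheme of this kind is implemented in scikit-image under the name "morphological snakes"; below is a minimal usage sketch, assuming a recent scikit-image (whose segmentation module provides morphological_chan_vese and checkerboard_level_set), with a synthetic noisy disk and parameter values chosen only for illustration.

# Minimal sketch of morphological active contours (morphological Chan-Vese)
# on a synthetic noisy disk; image and parameters are illustrative only.
import numpy as np
from skimage.segmentation import morphological_chan_vese, checkerboard_level_set

# Synthetic test image: a bright disk on a dark, noisy background.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:128, 0:128]
image = ((xx - 64) ** 2 + (yy - 64) ** 2 < 30 ** 2).astype(float)
image += 0.3 * rng.standard_normal(image.shape)

# Binary level set evolved purely with fast morphological operators
# (no PDE solver, no reinitialisation of the level-set function).
init = checkerboard_level_set(image.shape, 6)
segmentation = morphological_chan_vese(image, 50, init_level_set=init, smoothing=2)

print("segmented pixels:", int(segmentation.sum()))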
Abstract:
Este Trabajo de Fin de Grado (TFG) tiene el objetivo de aportar un sistema de enseñanza innovador, un sistema de enseñanza mediante el cual se consiga involucrar a los alumnos en tareas y prácticas en las que se adquieran conocimientos a la vez que se siente un ambiente de juego, es decir, que se consiga aprender de forma divertida. Está destinado al sistema educativo de la Escuela Técnica Superior de Ingenieros Informáticos de la Universidad Politécnica de Madrid, en concreto a las asignaturas relacionadas con los Procesadores de Lenguajes. La aplicación desarrollada en este trabajo está destinada tanto a los profesores de las asignaturas de Procesadores de Lenguajes como a los alumnos que tengan alguna relación con esas asignaturas, consiguiendo mayor interacción y diversión a la hora de realizar las tareas y prácticas de las asignaturas. Para los dos tipos de usuarios descritos anteriormente, la aplicación está configurada para que puedan identificarse mediante sus credenciales, comprobándose si los datos introducidos son correctos, y así poder acceder al sistema. Dependiendo de qué tipo de usuario se identifique, tendrá unas opciones u otras dentro del sistema. Los profesores podrán dar de alta, ver, modificar o dar de baja las configuraciones para los analizadores de los lenguajes correspondientes a las diferentes asignaturas que están configurados previamente en el sistema. Además, los profesores pueden dar de alta, ver, modificar o dar de baja los fragmentos de código que formarán los ficheros correspondientes a las plantillas de pruebas del analizador léxico que se les ofrece a los alumnos para realizar comprobaciones de las prácticas. Mediante la aplicación podrán establecer diferentes características y propiedades de los fragmentos que incorporen al sistema. Por otra parte, los alumnos podrán realizar la configuración del lenguaje, definido por los profesores, para la parte del analizador léxico de las prácticas. Esta configuración será guardada para el grupo al que corresponde el alumno, pudiendo realizar modificaciones cualquier miembro del grupo. De esta manera, se podrán establecer posteriormente las relaciones necesarias entre los elementos del lenguaje según la configuración de los profesores y los elementos referentes a las prácticas de los alumnos. Además, los alumnos podrán realizar comprobaciones de la parte léxica de sus prácticas mediante los ficheros que genera el sistema en función de sus opciones de práctica y los fragmentos añadidos por los profesores. De esta manera, se informará a los alumnos del éxito de las pruebas o bien de los fallos ocasionados, bien por el formato del archivo subido como resultado de la prueba o bien por el contenido incorrecto de este mismo. Todas las funciones que ofrece esta aplicación son completamente on-line y tendrán una interfaz llamativa y divertida, además de caracterizarse por su facilidad de uso y su comodidad. En el trabajo realizado para este proyecto se cumplen tanto las Pautas de Accesibilidad para Contenidos Web (WCAG 2.0) como las propiedades de un código HTML 5 y CSS 3 correcto, para así conseguir que los usuarios utilicen una aplicación fácil, cómoda y atractiva.

---ABSTRACT---

This Final Year Project (TFG) aims to contribute an innovative teaching system to the educational system of the School of Computer Engineering at the Polytechnic University of Madrid, especially in subjects related to Language Processors. This project is an interactive learning system whose goal is to learn in an amusing environment.
To achieve this goal, the system involves students through game-like environments in tasks and practical exercises. The application developed in this project is designed both for professors of the Language Processors subjects and for students who have some relation to these subjects, achieving more interaction and a fun environment during the subjects' tasks. The application is configured so that users can identify themselves with their credentials, checking whether the identification data are correct before granting access to the system. Depending on what type of user is identified, they will have different options within the system. Professors will be able to register, view, modify or delete the settings for the language scanners of all the subjects preconfigured in the system. Additionally, professors can register, view, modify or remove the code fragments that make up the scanner test templates offered to students for testing their practical exercises. Professors may also define different characteristics and properties of the fragments they incorporate into the system. Moreover, students can configure the language, as defined by the professors, for the scanner module of their practical exercises. This configuration will be saved for the student's group and can be changed by any group member. The system then allows the necessary relationships to be established between the language elements defined by the professors and the elements belonging to the students' practical exercises. Students can check the lexical part of their practical exercises through files that are generated by the system according to their exercise options and the fragments added by the professors. Students are thus informed of the success of the tests or of the failures caused, either by the format of the uploaded result file or by its incorrect content. All functions provided by this application are completely on-line and have a striking and fun interface, also characterized by ease of use and comfort. The work complies with the Web Content Accessibility Guidelines (WCAG 2.0) and uses correct HTML 5 and CSS 3, so that users get an easy, convenient and attractive application.