153 results for 13TH INTERNATIONAL-CONGRESS
Abstract:
Community development must be accompanied by a social involvement process which creates functional groups of citizens capable of taking responsibility for their own development. It is important that this process promotes the structuring of all population groups and provides appropriate institutional and technical support. The present paper addresses these issues based on over 25 years of experience by the Association Instituto de Desarrollo Comunitario de Cuenca in revitalizing rural areas of the Spanish province of Cuenca. The paper analyses the social involvement process encouraged by this association, the relationships between public institutions and local associations, the role of these associations, and the difficulties encountered in rural areas. The long-term perspective of this experience provides some keys which can be used to successfully support the process of social involvement (such as information on its characteristics and methodological tools), establish local associations and create sustainable partnerships that foster the growth of leadership within the community development process.
Abstract:
A proper allocation of resources targeted at solving hunger is essential to optimize the efficacy of actions and maximize results. This requires an adequate measurement and formulation of the problem since, paraphrasing Einstein, the formulation of a problem is essential to reaching a solution. Different measurement methods have been designed to count, score, classify and compare hunger at the local level and to allow comparisons between different places. However, these alternative methods reach significantly different results, and the discrepancies make decisions on the targeting of resource allocations difficult. To assist decision makers, a new method is proposed that takes into account both the dimension of hunger and the coping capacities of countries, enabling geographical and sectoral priorities to be established for the allocation of resources.
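As a rough illustration of the kind of two-dimensional prioritization the abstract describes (a minimal sketch of the general idea, not the paper's actual method; all names, weights and figures below are hypothetical):

```python
# Hypothetical sketch: ranking countries for resource allocation by combining
# a hunger-magnitude score with a coping-capacity score. Indicator names,
# weights and data are illustrative only, not the method proposed in the paper.

def priority_index(hunger_score, coping_capacity, weight=0.5):
    """Higher hunger and lower coping capacity yield a higher priority.
    Both inputs are assumed to be normalized to the range [0, 1]."""
    return weight * hunger_score + (1 - weight) * (1 - coping_capacity)

countries = {
    # name: (normalized hunger score, normalized coping capacity)
    "Country A": (0.80, 0.20),
    "Country B": (0.80, 0.70),  # same hunger, but stronger coping capacity
    "Country C": (0.40, 0.10),
}

ranked = sorted(countries, key=lambda c: priority_index(*countries[c]), reverse=True)
for name in ranked:
    print(f"{name}: priority {priority_index(*countries[name]):.2f}")
```

Under such a scheme, two places with the same hunger score are ranked differently according to their capacity to cope, which is what allows both geographical and sectoral targeting.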
Abstract:
The objective of this paper is to address the methodological process of a teaching strategy for training in project management complexity in postgraduate programs. The proposal combines different methods (intuitive, comparative, deductive, case study, problem-solving, Project-Based Learning) and different activities inside and outside the classroom. This integration of methods motivated the use of the concept of "learning strategy". The strategy has two phases: first, the integration of the competences (technical, behavioral and contextual) in real projects; and second, a learning activity oriented at a higher level of knowledge, evaluating the complexity of project management in real situations. Both the competences in the learning strategy and the Project Complexity Evaluation are based on the ICB of IPMA. The learning strategy is applied in an international postgraduate program (an Erasmus Mundus Master of Science) with the participation of five universities of the European Union. This master's program is the fruit of a cooperative experience involving an Educational Innovation Group of the UPM (GIE-Project), two UPM research groups and collaboration with agents external to the university. Some reflections on the experience and the main success factors of the learning strategy are presented in the paper.
Abstract:
A review is presented of the main techniques that have been proposed for the temporal processing of optical pulses, the counterparts of well-known spatial arrangements. These are translated to the temporal domain via the space-time duality and implemented with electro-optical phase and amplitude modulators and dispersive devices. We introduce new variations of the conventional approaches and focus on their application to optical communication systems.
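For context, the space-time duality invoked here rests on the formal analogy between paraxial diffraction in space and second-order dispersion in time; the standard correspondence (given as general background, not quoted from the paper) reads:

\[
\frac{\partial A}{\partial z} = \frac{i}{2k}\,\frac{\partial^2 A}{\partial x^2}
\quad\longleftrightarrow\quad
\frac{\partial A}{\partial z} = -\,\frac{i\beta_2}{2}\,\frac{\partial^2 A}{\partial t^2},
\]

so a dispersive device plays the role of free-space propagation, while a quadratic temporal phase \(\varphi(t) = t^2/(2 D_f)\) imparted by an electro-optic phase modulator acts as a "time lens", the temporal counterpart of a thin lens of focal length \(f\).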
Abstract:
According to the World Health Organization, 15 million people suffer a stroke worldwide each year; of these, 5 million die and 5 million are permanently disabled, making stroke a major cause of mortality worldwide. The majority of strokes are caused by a blood clot that occludes an artery in the brain, and although thrombolytic agents such as Alteplase are used to dissolve clots that arise in the arteries of the brain, there are limitations on their use. Over the past decade, however, other methods of treatment have been developed, including thrombectomy devices such as the 'GP' Thrombus Aspiration Device ('GP' TAD). Such devices may be used as an alternative to thrombolytics, or in conjunction with them, to extract blood clots from arteries such as the middle cerebral artery and the posterior inferior cerebellar artery (PICA) of the posterior aspect of the brain. In this paper, we mathematically model the removal of blood clots using the 'GP' TAD from selected arteries of the brain where blood clots may arise, taking into account factors such as resistance, compliance and inertance effects. Such mathematical modelling may have potential uses in predicting the pressures necessary to extract blood clots of given lengths and masses from arteries in the Circle of Willis and the posterior circulation of the brain.
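As background on the resistance, compliance and inertance effects mentioned above, lumped-parameter haemodynamic models of this kind typically follow the electrical-circuit analogy; a generic form of the governing equations (an assumption about the modelling style, not the paper's actual system) is:

\[
\Delta P(t) = R\,Q(t) + L\,\frac{dQ}{dt}, \qquad
C\,\frac{dP}{dt} = Q_{\mathrm{in}}(t) - Q_{\mathrm{out}}(t),
\]

where \(Q\) is the volumetric flow, \(\Delta P\) the pressure drop across an arterial segment, and \(R\), \(L\) and \(C\) its hydraulic resistance, inertance and compliance, the analogues of electrical resistance, inductance and capacitance.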
Abstract:
A walking machine is an alternative to a wheeled rover, well suited for work in unstructured environments and especially on abrupt terrain. Walking machines have some drawbacks, such as speed and power consumption, but they can achieve complex movements and disturb the environment they work in very little. The locomotion system is determined by the terrain conditions; in our case, this legged design was chosen for a working area like Rio Tinto in the south of Spain, a river area with abrupt terrain. A walking robot with so many degrees of freedom can be a challenge when it comes to the analysis and simulation of the legs. This paper shows how to deal with the kinematic analysis of a hexapod robot, based on a design developed by the Center of Astrobiology (INTA-CSIC), following the classical formulation of the equations.
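The abstract does not reproduce the equations, but the classical formulation it refers to is typically built by chaining homogeneous transforms along each leg. A minimal sketch for a hypothetical 3-degree-of-freedom insect-type leg (joint conventions and link lengths are assumed for illustration, not taken from the INTA-CSIC design):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def leg_foot_position(q, coxa=0.05, femur=0.10, tibia=0.15):
    """Forward kinematics of a hypothetical coxa-femur-tibia leg.
    q = (q1, q2, q3) joint angles in radians; link lengths in metres."""
    T = (dh_transform(q[0], 0.0, coxa, np.pi / 2)
         @ dh_transform(q[1], 0.0, femur, 0.0)
         @ dh_transform(q[2], 0.0, tibia, 0.0))
    return T[:3, 3]  # foot position in the leg's base frame

print(leg_foot_position((0.0, np.radians(-30), np.radians(60))))
```

For a hexapod, six such chains share the body frame, which is what makes systematic formulation (and symbolic or numeric simulation) preferable to deriving each leg by hand.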
Abstract:
The coagulation of milk is the fundamental process in cheese-making, based on gel formation as a consequence of physicochemical changes taking place in the casein micelles, and monitoring the whole process of milk curd formation is a constant preoccupation for dairy researchers and cheese companies (Lagaude et al., 2004). In addition to advances in composition-based applications of near-infrared spectroscopy (NIRS), innovative uses of this technology are pursuing dynamic applications that show promise, especially with regard to tracking a sample in situ during food processing (Bock and Connelly, 2008). In this vein, the literature describes cheese-making applications of NIRS for determining the curd cutting time, concluding that NIRS would be a suitable method for monitoring milk coagulation, as shown, for example, in the works published by Fagan et al. (Fagan et al., 2008; Fagan et al., 2007) based on the commercial CoAguLite probe (with an LED at 880 nm and a photodetector for light reflectance detection).
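As a feel for how such probe signals are commonly reduced to a cutting-time estimate (an illustration of the general derivative-based approach, not the specific algorithm of the cited works), one recipe smooths the reflectance trace and locates the time of its fastest increase during gelation:

```python
# Hypothetical sketch: estimating a curd cutting time from a light-backscatter
# (reflectance) time series. All data and constants below are made up.
import numpy as np

t = np.linspace(0, 40, 400)                        # minutes after rennet addition
reflectance = 1.0 + 0.3 / (1 + np.exp(-(t - 15)))  # synthetic sigmoid-like signal
reflectance += np.random.default_rng(0).normal(0, 0.005, t.size)

window = 15
kernel = np.ones(window) / window
smooth = np.convolve(reflectance, kernel, mode="same")  # moving-average smoothing

dRdt = np.gradient(smooth, t)   # first derivative of the smoothed signal
t_max = t[np.argmax(dRdt)]      # time of fastest reflectance increase

# A simple rule of the type reported in the literature: cutting time taken as
# proportional to the time of the derivative maximum (the factor is illustrative).
beta = 2.0
print(f"t_max = {t_max:.1f} min, predicted cutting time = {beta * t_max:.1f} min")
```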
Abstract:
Over roughly the last ten years, the investigation of construction techniques and materials in Ancient Rome has advanced and yielded positive results. This work has been directed at obtaining data based on chemical composition, as well as on the action and reaction of materials to weathering and post-depositional displacements. Much of this data should be interpreted in terms of the deterioration and damage of concrete material produced in a particular landscape with particular meteorological characteristics. Concrete mixtures such as lime and gypsum mortars should be analysed in laboratory test programs, and not only through descriptions based on the reference works of Strabo, Pliny the Elder or Vitruvius. Roman manufacture was determined by weather conditions, landscape, natural resources and, of course, the economic situation of the owner. In any case, we must research every facet of construction. On the one hand, thanks to chemical techniques such as X-ray diffraction and optical microscopy, we can determine the granular disposition of the mixture. On the other hand, by applying physical and mechanical techniques such as compressive strength, capillary absorption on contact or water-behaviour tests, we can determine the reactions of binder and aggregates to weathering. However, we must be capable of interpreting these results. In recent years, many analyses carried out at archaeological sites in Spain have contributed different points of view and provided new data with which to shape a method for continuing the investigation of Roman mortars. If we carry out chemical and physical analyses of Roman mortars at the same time, and are able to interpret the construction and the resources used, we can come to understand the construction process, the dating, and also the approach to future restoration.
Abstract:
Lupinus mariae-josephi is a recently described species (Pascual, 2004) able to grow in soils with high pH and active lime content in the Valencia province (Spain). L. mariae-josephi endosymbionts are extremely slow-growing bacteria with genetic and symbiotic characteristics that differentiate them from the Bradyrhizobium strains nodulating Lupinus spp. native to the Iberian Peninsula and adapted to acid soils. Cross-inoculation experiments revealed that all the endosymbiotic isolates from L. mariae-josephi tested are legume-host selective and are unable to nodulate species such as L. angustifolius and L. luteus. In contrast, the Bradyrhizobium strains from other Lupinus spp. tested were able to nodulate L. mariae-josephi, although the nodules fixed nitrogen inefficiently. Phylogenetic analysis was performed with housekeeping genes (rrn, glnII, recA, atpD) and the nodulation gene nodC. The housekeeping-gene phylogeny revealed that L. mariae-josephi rhizobia form a strongly supported monophyletic group within the Bradyrhizobium genus; this cluster also includes B. jicamae and certain strains of B. elkanii. In contrast, isolates from other Lupinus spp. native to the Iberian Peninsula grouped mainly within B. canariense and two B. japonicum lineages. Phylogenetic analysis of the L. mariae-josephi isolates based on the nodC symbiotic gene defined a solid clade close to isolates from Algerian Retama spp. and to fast-growing rhizobia.
Abstract:
Nowadays, one of the main objectives affecting the development of any new product is respect for the environment. Until the late 1980s, the development and manufacture of most products aimed to achieve maximum quality within time and cost constraints, with environmental issues relegated to secondary importance. In the 1990s, by contrast, pressure from markets and from financial and legislative factors led to environmental considerations being taken into account. In this context, current aeronautical industry strategies are based on the pursuit of economic, environmental and energy efficiency across all the processes involved in aircraft manufacturing.
Abstract:
Although several profiling techniques for identifying performance bottlenecks in logic programs have been developed, they are generally not automatic, and in most cases they do not provide enough information for identifying the root causes of such bottlenecks. This complicates using their results to guide performance improvement. We present a profiling method and tool that provides such explanations. Our profiler associates cost centers with certain program elements and can measure different types of resource-related properties that affect performance, preserving the precedence of cost centers in the call graph. It includes an automatic method for detecting procedures that are performance bottlenecks. The profiling tool has been integrated in a previously developed run-time checking framework to allow verification of certain properties when they cannot be verified statically. The approach allows checking global computational properties which require complex instrumentation to track information about previous execution states, such as, e.g., that the execution time accumulated by a given procedure is not greater than a given bound. We have built a prototype implementation, integrated it in the Ciao/CiaoPP system, and successfully applied it to performance improvement, automatic optimization (e.g., resource-aware specialization of programs), run-time checking, and debugging of global computational properties (e.g., resource usage) in Prolog programs.
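To give a feel for the cost-center idea described here, a language-agnostic sketch (not the Ciao/CiaoPP implementation, which works on Prolog programs): a cost center attributes the resources consumed by a call, including its callees, to a named program point, and a run-time check compares the accumulated cost against a bound.

```python
# Hypothetical sketch of cost-center-based timing with a run-time bound check.
import time
from collections import defaultdict
from functools import wraps

accumulated = defaultdict(float)  # seconds attributed to each cost center

def cost_center(name, bound=None):
    """Attribute the wall-clock time of a call (including its callees) to
    `name`; if `bound` is given, raise when the accumulated time exceeds it."""
    def decorator(fn):
        depth = 0
        @wraps(fn)
        def wrapper(*args, **kwargs):
            nonlocal depth
            depth += 1
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                depth -= 1
                if depth == 0:  # count only outermost calls, so recursive
                    # invocations are not double-counted
                    accumulated[name] += time.perf_counter() - start
                    if bound is not None and accumulated[name] > bound:
                        raise RuntimeError(f"cost center {name!r} exceeded {bound}s")
        return wrapper
    return decorator

@cost_center("naive_fib", bound=1.0)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(20)
print(dict(accumulated))
```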
Abstract:
This paper introduces and studies the notion of CLP projection for Constraint Handling Rules (CHR). The CLP projection consists of a naive translation of CHR programs into Constraint Logic Programs (CLP). We show that the CLP projection provides a safe operational and declarative approximation for CHR programs. We demonstrate moreover that a confluent CHR program has a least model, which is precisely equal to the least model of its CLP projection (hence closing a ten-year-old conjecture by Abdennadher et al.). Finally, we illustrate how the notion of CLP projection can be used in practice to apply CLP analyzers to CHR. In particular, we show results from applying AProVE to prove termination, and CiaoPP to infer both complexity upper bounds and types, for CHR programs.
Abstract:
We propose a modular, assertion-based system for verification and debugging of large logic programs, together with several interesting models for checking assertions statically in modular programs, each with different characteristics and representing different trade-offs. Our proposal is a modular and multivariant extension of our previously proposed abstract assertion-checking model, and we also report on its implementation in the CiaoPP system. In our approach, the specification of the program, given by a set of assertions, may be partial, instead of the complete specification required by traditional verification systems. Also, the system can deal with properties which cannot always be determined at compile time. As a result, the proposed system needs to work with safe approximations: all assertions proved correct are guaranteed to be valid, and all errors flagged are actual errors. The use of modular, context-sensitive static analyzers also allows us to introduce a new distinction between assertions checked in a particular context or checked in general.
Abstract:
The relationship between abstract interpretation and partial evaluation has received considerable attention, and (partial) integrations have been proposed starting from both the partial evaluation and the abstract interpretation perspectives. In this work we present what we argue is the first generic algorithm for efficient and precise integration of abstract interpretation and partial evaluation from an abstract interpretation perspective. Taking as a starting point state-of-the-art algorithms for context-sensitive, polyvariant abstract interpretation and (abstract) partial evaluation of logic programs, we present an algorithm which combines the best of both worlds. Key ingredients include the accurate success propagation inherent to abstract interpretation and the powerful program transformations achievable by partial deduction. In our algorithm, the calls which appear in the analysis graph are not analyzed w.r.t. the original definition of the procedure but w.r.t. specialized definitions of these procedures. Such specialized definitions are obtained by applying both unfolding and abstract executability. Also, our framework is parametric w.r.t. different control strategies and abstract domains. Different combinations of these parameters correspond to existing algorithms for program analysis and specialization. Our approach efficiently computes strictly more precise results than those achievable by each of the individual techniques. The algorithm is one of the key components of CiaoPP, the analysis and specialization system of the Ciao compiler.
Abstract:
The need of the Bourbon monarchy to build a naval base in the Bay of Cartagena (Spain) during the eighteenth century implied performing various actions on the environment to allow the construction of the new dock. One of the priority actions was the transformation of the watershed of the streams that flowed into the Mandarache sea. For this reason, a dike was designed and constructed in the northern part of the city. The design of this great work, which was conceived as a fortification of the city, was subject to considerable uncertainties. Its proximity to the city involved the demolition of several buildings in the San Roque neighborhood. The greater or lesser number of affected buildings, and the value of the just compensation for their expropriation, became decisive factors in determining whether or not the work was viable for the Royal Estate.