945 results for Boi-inspired robotics
Study of the social behaviour of robotic swarms with application to the cleaning of unstructured spaces
Abstract:
Swarm intelligence is a branch of artificial intelligence that has been gaining considerable momentum in recent years, especially in the field of robotics. In this project we study the social behaviour that emerges from the interactions among a given number of autonomous robots in the context of cleaning large surfaces. Once a scenario and a robot that fit the project requirements have been chosen, we run a series of simulations with different search policies, which allow us to evaluate the robots' behaviour for given initial conditions of robot distribution and areas to be cleaned. From the results obtained we can determine which configuration yields the best results.
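The comparison of search policies described above can be illustrated with a deliberately tiny sketch, not the project's actual simulator: robots on a grid clean the cell they occupy and move according to either a pure random walk or a persistent walk that keeps its heading. The policy names, grid size, and parameters are illustrative assumptions.

```python
import random

def simulate(policy, n_robots=5, size=20, target=0.5, max_steps=20000, seed=42):
    """Return the number of steps until `target` fraction of cells has been cleaned."""
    rng = random.Random(seed)
    moves = [(0, 1), (0, -1), (1, 0), (-1, 0)]
    dirty = {(x, y) for x in range(size) for y in range(size)}
    total = len(dirty)
    robots = [(rng.randrange(size), rng.randrange(size)) for _ in range(n_robots)]
    headings = [rng.choice(moves) for _ in range(n_robots)]
    for step in range(1, max_steps + 1):
        for i, (x, y) in enumerate(robots):
            dirty.discard((x, y))  # clean the current cell
            if policy == "random":
                dx, dy = rng.choice(moves)
            else:  # "persistent": keep heading; turn at walls or with small probability
                dx, dy = headings[i]
                if rng.random() < 0.05 or not (0 <= x + dx < size and 0 <= y + dy < size):
                    dx, dy = rng.choice(moves)
                headings[i] = (dx, dy)
            robots[i] = (min(max(x + dx, 0), size - 1), min(max(y + dy, 0), size - 1))
        if total - len(dirty) >= target * total:
            return step
    return max_steps  # target not reached within the step budget

steps_random = simulate("random")
steps_persistent = simulate("persistent")
```

Running both policies from the same initial conditions and comparing the step counts is exactly the kind of evaluation the abstract describes, here reduced to a single scalar metric per policy.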
Abstract:
This paper addresses the issue of policy evaluation in a context in which policymakers are uncertain about the effects of oil prices on economic performance. I consider models of the economy inspired by Solow (1980), Blanchard and Gali (2007), Kim and Loungani (1992) and Hamilton (1983, 2005), which incorporate different assumptions on the channels through which oil prices have an impact on economic activity. I first study the characteristics of the model space and I analyze the likelihood of the different specifications. I show that the existence of plausible alternative representations of the economy forces the policymaker to face the problem of model uncertainty. Then, I use the Bayesian approach proposed by Brock, Durlauf and West (2003, 2007) and the minimax approach developed by Hansen and Sargent (2008) to integrate this form of uncertainty into policy evaluation. I find that, in the environment under analysis, the standard Taylor rule is outperformed under a number of criteria by alternative simple rules in which policymakers introduce persistence in the policy instrument and respond to changes in the real price of oil.
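For reference, a standard Taylor rule and the kind of augmented simple rule described in the abstract can be written as follows; the coefficient symbols and the exact form of the oil-price term are illustrative assumptions, not the paper's specification:

```latex
% Standard Taylor rule
i_t = r^{*} + \pi_t + \phi_{\pi}\,(\pi_t - \pi^{*}) + \phi_{y}\, y_t

% Augmented simple rule with interest-rate persistence and a response
% to changes in the real price of oil (illustrative form)
i_t = \rho\, i_{t-1}
    + (1-\rho)\left[ r^{*} + \pi_t + \phi_{\pi}\,(\pi_t - \pi^{*}) + \phi_{y}\, y_t \right]
    + \phi_{o}\, \Delta p^{\mathrm{oil}}_{t}
```

Here $\rho$ captures the persistence in the policy instrument and $\phi_{o}$ the response to oil-price changes, the two features the abstract says outperform the standard rule.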
Abstract:
Aim: The relative effectiveness of different methods of prevention of HIV transmission is a subject of debate that is renewed with the integration of each new method. The relative weight of values and evidence in decision-making is not always clearly defined. Debate is often confused, as the proponents of different approaches address the issue at different levels of implementation. This paper defines and delineates the successive levels of analysis of effectiveness, and proposes a conceptual framework to clarify debate. Method / Issue: Initially inspired by work on contraceptive effectiveness, a first version of the conceptual framework was published in 1993 with definition of the Condom Effectiveness Matrix (Spencer, 1993). The framework has since integrated and further developed thinking around distinctions made between efficacy and effectiveness and has been applied to HIV prevention in general. Three levels are defined: theoretical effectiveness (ThE), use-effectiveness (UseE) and population use-effectiveness (PopUseE). For example, abstinence and faithfulness, as proposed in the ABC strategy, have relatively high theoretical effectiveness but relatively low effectiveness at subsequent levels of implementation. The reverse is true of circumcision. Each level is associated with specific forms of scientific enquiry and associated research questions: basic and clinical sciences with ThE; clinical and social sciences with UseE; epidemiology and social, economic and political sciences with PopUseE. Similarly, the focus of investigation moves from biological organisms, to the individual at the physiological and then psychological, social and ecological level, and finally takes as perspective populations and societies as a whole. The framework may be applied to analyse issues concerning any approach.
Hence, regarding consideration of HIV treatment as a means of prevention, examples of issues at each level would be: ThE: achieving adequate viral suppression and non-transmission to partners; UseE: facility and degree of adherence to treatment and medical follow-up; PopUseE: perceived validity of strategy, feasibility of achieving adequate population coverage. Discussion: Use of the framework clarifies the questions that need to be addressed at all levels in order to improve effectiveness. Furthermore, the interconnectedness and complementary nature of research from the different scientific disciplines and the relative contribution of each become apparent. The proposed framework could bring greater rationality to the prevention effectiveness debate and facilitate communication between stakeholders.
Abstract:
Summary (in English) Computer simulations provide a practical way to address scientific questions that would otherwise be intractable. In evolutionary biology, and in population genetics in particular, the investigation of evolutionary processes frequently involves the implementation of complex models, making simulations a particularly valuable tool in the area. In this thesis work, I explored three questions involving the geographical range expansion of populations, taking advantage of spatially explicit simulations coupled with approximate Bayesian computation. First, the neutral evolutionary history of the human spread around the world was investigated, leading to a surprisingly simple model: a straightforward diffusion process of migrations from east Africa across a world map with homogeneous landmasses replicated to a very large extent the complex patterns observed in real human populations, suggesting a more continuous (as opposed to structured) view of the distribution of modern human genetic diversity, which may serve better as a base model for further studies. Second, the postglacial evolution of the European barn owl, with the formation of a remarkable coat-color cline, was inspected with two rounds of simulations: (i) determining the demographic background history and (ii) testing the probability that a phenotypic cline like the one observed in the natural populations could appear without natural selection. We verified that the modern barn owl population originated from a single Iberian refugium and that the color cline formed not through neutral evolution but with the necessary participation of selection. The third and last part of this thesis is a simulation-only study inspired by the barn owl case above. In this chapter, we showed that selection is indeed effective during range expansions and that it leaves a distinctive signature, which can then be used to detect and measure natural selection in range-expanding populations.
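The simulation-plus-ABC approach used throughout the thesis can be illustrated with a deliberately tiny example: ABC rejection draws parameters from a prior, simulates data under each draw, and keeps the draws whose summary statistic falls close to the observed one. The toy normal model, prior bounds, and tolerance below are assumptions for illustration, far simpler than the spatially explicit simulators of the thesis.

```python
import random
import statistics

def abc_rejection(observed_mean, n_draws=5000, tol=0.1, seed=1):
    """Minimal ABC rejection sampler for the mean of a normal model."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        mu = rng.uniform(-5, 5)                        # draw from the prior
        sim = [rng.gauss(mu, 1.0) for _ in range(50)]  # simulate a dataset
        if abs(statistics.mean(sim) - observed_mean) < tol:  # compare summaries
            accepted.append(mu)
    return accepted  # approximate posterior sample for mu

posterior = abc_rejection(observed_mean=1.0)
```

The accepted draws concentrate around the value that generated the observed summary, which is the sense in which ABC "inverts" a simulator without an explicit likelihood.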
Abstract:
BACKGROUND: Both non-traumatic and traumatic spinal cord injuries have in common that a relatively minor structural lesion can cause profound sensorimotor and autonomic dysfunction. Besides treating the cause of the spinal cord injury, the main goal is to restore lost function as far as possible. AIM: This article provides an overview of current innovative diagnostic (imaging) and therapeutic approaches (neurorehabilitation and neuroregeneration) aiming at recovery of function after non-traumatic and traumatic spinal cord injuries. MATERIAL AND METHODS: An analysis of the current scientific literature on imaging, rehabilitation and regeneration strategies in spinal cord disease was carried out. RESULTS: Novel magnetic resonance imaging (MRI) based techniques (e.g. diffusion-weighted MRI and functional MRI) allow visualization of structural reorganization and specific neural activity in the spinal cord. Robotics-driven rehabilitative measures provide targeted training of sensorimotor function, which can even be continued in the homecare setting. From a preclinical point of view, defined stem cell transplantation approaches allow for the first time robust structural repair of the injured spinal cord. CONCLUSION: Besides well-established neurological and functional scores, MRI techniques offer the unique opportunity to provide robust and reliable "biomarkers" for restorative therapeutic interventions. Function-oriented robotics-based rehabilitative interventions, alone or in combination with stem cell based therapies, represent promising approaches to achieve substantial functional recovery that goes beyond current rehabilitative treatment efforts.
Abstract:
Pulse oximetry has been proposed as a noninvasive continuous method for transcutaneous monitoring of arterial oxygen saturation of hemoglobin (tcSO2) in the newborn infant. The reliability of this technique in detecting hyperoxemia is controversial, because small changes in saturation greater than 90% are associated with relatively large changes in arterial oxygen tension (PaO2). The purpose of this study was to assess the reliability of pulse oximetry using an alarm limit of 95% tcSO2 in detecting hyperoxemia (defined as PaO2 greater than 90 mm Hg) and to examine the effect of varying the alarm limit on reliability. Two types of pulse oximeter were studied alternately in 50 newborn infants who were mechanically ventilated with indwelling arterial lines. Three arterial blood samples were drawn from every infant during routine increase of inspired oxygen before intratracheal suction, and PaO2 was compared with tcSO2. The Nellcor N-100 pulse oximeter identified all 26 hyperoxemic instances correctly (sensitivity 100%) and alarmed falsely in 25 of 49 nonhyperoxemic instances (specificity 49%). The Ohmeda Biox 3700 pulse oximeter detected 13 of 35 hyperoxemic instances (sensitivity 37%) and alarmed falsely in 7 of 40 nonhyperoxemic instances (specificity 83%). The optimal alarm limit, defined as a sensitivity of 95% or more associated with maximal specificity, was determined for Nellcor N-100 at 96% tcSO2 (specificity 38%) and for Ohmeda Biox 3700 at 89% tcSO2 (specificity 52%). It was concluded that pulse oximeters can be highly sensitive in detecting hyperoxemia provided that type-specific alarm limits are set and a low specificity is accepted.
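The sensitivities and specificities reported above follow directly from the counts given in the abstract; a short check makes the arithmetic explicit (note the 83% quoted for the Ohmeda device is 33/40 = 82.5% rounded):

```python
def sensitivity_specificity(detected, hyperoxemic, false_alarms, nonhyperoxemic):
    """Sensitivity: share of hyperoxemic instances detected.
    Specificity: share of nonhyperoxemic instances without a (false) alarm."""
    sensitivity = detected / hyperoxemic
    specificity = (nonhyperoxemic - false_alarms) / nonhyperoxemic
    return sensitivity, specificity

# Counts reported in the abstract
nellcor = sensitivity_specificity(26, 26, 25, 49)  # Nellcor N-100
ohmeda = sensitivity_specificity(13, 35, 7, 40)    # Ohmeda Biox 3700
print(nellcor)  # (1.0, 0.4897...)  -> 100% / 49%
print(ohmeda)   # (0.3714..., 0.825) -> 37% / 83% (rounded)
```

This also makes the trade-off in the conclusion concrete: lowering the alarm limit raises sensitivity only at the cost of more false alarms, i.e. lower specificity.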
Abstract:
This project deals with the generation of profitability and the distribution of its benefits. Inspired by Davis (1947, 1955), we define profitability as the ratio of revenue to cost. Profitability is not as popular a measure of business financial performance as profit, the difference between revenue and cost. Regardless of its popularity, however, profitability is surely a useful financial performance measure. Our primary objective in this project is to identify the factors that generate change in profitability. One set of factors, which we refer to as sources, consists of changes in quantities and prices of outputs and inputs. Individual quantity changes aggregate to the overall impact of quantity change on profitability change, which we call productivity change. Individual price changes aggregate to the overall impact of price change on profitability change, which we call price recovery change. In this framework profitability change consists exclusively of productivity change and price recovery change. A second set of factors, which we refer to as drivers, consists of phenomena such as technical change, change in the efficiency of resource allocation, and the impact of economies of scale. The ability of management to harness these factors drives productivity change, which is one component of profitability change. Thus the term sources refers to quantities and prices of individual outputs and inputs, whose changes influence productivity change or price recovery change, either of which influences profitability change. The term drivers refers to phenomena related to technology and management that influence productivity change (but not price recovery change), and hence profitability change.
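The decomposition described above can be made concrete with a small numeric sketch. Under one conventional index-number choice, Laspeyres quantity indexes paired with Paasche price indexes (an assumption made here for illustration, not necessarily the project's choice), the value ratio factors exactly into a quantity index times a price index, so profitability change factors exactly into productivity change times price recovery change:

```python
def laspeyres_quantity(p0, q0, q1):
    # Quantity index: both periods' quantities valued at base-period prices.
    return sum(p * q for p, q in zip(p0, q1)) / sum(p * q for p, q in zip(p0, q0))

def paasche_price(p0, p1, q1):
    # Price index: comparison-period quantities valued at both periods' prices.
    return sum(p * q for p, q in zip(p1, q1)) / sum(p * q for p, q in zip(p0, q1))

# Hypothetical two-output, two-input firm (period 0 -> period 1)
py0, qy0, py1, qy1 = [10, 5], [100, 200], [11, 6], [110, 190]  # outputs
px0, qx0, px1, qx1 = [4, 2], [150, 300], [5, 2], [140, 310]    # inputs

revenue0 = sum(p * q for p, q in zip(py0, qy0))
revenue1 = sum(p * q for p, q in zip(py1, qy1))
cost0 = sum(p * q for p, q in zip(px0, qx0))
cost1 = sum(p * q for p, q in zip(px1, qx1))

# Profitability change = productivity change x price recovery change
profitability_change = (revenue1 / cost1) / (revenue0 / cost0)
productivity_change = laspeyres_quantity(py0, qy0, qy1) / laspeyres_quantity(px0, qx0, qx1)
price_recovery_change = paasche_price(py0, py1, qy1) / paasche_price(px0, px1, qx1)
```

The identity holds because each value ratio (revenue or cost) equals its Laspeyres quantity index times its Paasche price index; dividing the output-side product by the input-side product regroups the terms into the two components named in the abstract.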
Abstract:
Two virtual resources developed by the ÒLIBA group of the Universitat Oberta de Catalunya, Portal de la Vall de Boí (http://oliba.uoc.edu/boi/portal) and Memòries de la nostra infantesa: els nens de la guerra (http://oliba.uoc.edu/nens), have allowed us to carry out an innovative project for the dissemination and interpretation of heritage by means of ICT. These two virtual resources deal with our heritage, natural and cultural in the first case and historical in the second, and their websites are of excellent quality in both form and content, so they have provided ideal material for disseminating our heritage in schools.
Abstract:
The aim of this practicum is to work with a web-based tool for remote video editing and cataloguing. The tool was developed by the company Vision Robotics. The report describes the experience of working with this tool and analyses possible improvements and the tool's potential, closing with some general reflections on the tool.
Abstract:
Methods like Event History Analysis can show the existence of diffusion and part of its nature, but do not study the process itself. Nowadays, thanks to the increasing performance of computers, processes can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion mainly inspired by the model developed by Braun and Gilardi (2006). I first develop a theoretical framework of policy diffusion that presents the main internal drivers of policy diffusion - such as the preference for the policy, the effectiveness of the policy, the institutional constraints, and the ideology - and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed by these interdependencies, is thus a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies the development of an algorithm and its programming. Once the algorithm has been programmed, we let the different agents interact. Consequently, a phenomenon of diffusion, derived from learning, emerges, meaning that the choice made by an agent is conditional on those made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence - global divergence and local convergence - that triggers the emergence of political clusters, i.e. the creation of regions with the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning not only that time is needed for a policy to deploy its effects, but also that it takes time for a country to find the best-suited policy.
To conclude, diffusion is an emergent phenomenon arising from complex interactions, and the outcomes of my model are in line with both the theoretical expectations and the empirical evidence.
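The learning mechanism described above can be sketched with a minimal agent-based toy, far simpler than the thesis model and entirely hypothetical in its parameters: agents on a grid adopt a policy with probability proportional to the share of adopting neighbors, which produces a nondecreasing cumulative adoption curve and spatially clustered adopters.

```python
import random

def diffuse(size=20, p_learn=0.3, steps=60, seed=3):
    """Simulate neighbor-driven policy adoption; return the cumulative adoption counts."""
    rng = random.Random(seed)
    adopted = [[False] * size for _ in range(size)]
    adopted[size // 2][size // 2] = True  # a single early adopter
    counts = [1]
    for _ in range(steps):
        nxt = [row[:] for row in adopted]
        for i in range(size):
            for j in range(size):
                if adopted[i][j]:
                    continue  # adoption is absorbing in this toy model
                nbrs = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
                share = sum(adopted[a][b] for a, b in nbrs
                            if 0 <= a < size and 0 <= b < size) / 4
                if rng.random() < p_learn * share:  # learning from neighbors
                    nxt[i][j] = True
        adopted = nxt
        counts.append(sum(map(sum, adopted)))
    return counts

adoption_curve = diffuse()
```

Because each agent's choice is conditional on its neighbors', adoption spreads as a contiguous cluster from the innovator, the spatial analogue of the "political clusters" the abstract describes.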
Abstract:
In this paper, a phenomenologically motivated magneto-mechanically coupled finite strain elastic framework for simulating the curing process of polymers in the presence of a magnetic load is proposed. This approach is in line with previous work by Hossain and co-workers on a finite strain modelling framework for purely mechanical polymer curing (Hossain et al., 2009b). The proposed thermodynamically consistent approach is independent of any particular free energy function that may be used for modelling the fully-cured magneto-sensitive polymer, i.e. any phenomenological or micromechanically-inspired free energy can be inserted into the main modelling framework. For the fabrication of magneto-sensitive polymers, micron-size ferromagnetic particles are mixed with the liquid matrix material in the uncured stage. The particles align in a preferred direction upon application of a magnetic field during the curing process. The polymer curing process is a complex (visco)elastic process that transforms a fluid into a solid over time. This transformation is modelled by an appropriate constitutive relation that takes into account the temporal evolution of the material parameters appearing in a particular energy function. For demonstration, a frequently used energy function is chosen in this work, namely the classical Mooney-Rivlin free energy enhanced by coupling terms. Several representative numerical examples demonstrate the capability of our approach to correctly capture common features of polymers undergoing curing processes under a coupled magneto-mechanical load.
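For reference, the classical Mooney-Rivlin free energy mentioned above can be written in terms of the invariants of the right Cauchy-Green tensor C, here augmented with a generic magneto-mechanical coupling of the kind common in magneto-elasticity. The specific coupling invariants and the time dependence of the parameters during curing are illustrative assumptions, not the paper's exact model:

```latex
\Psi(\mathbf{C},\mathbb{H})
  = c_1(t)\,(I_1 - 3) + c_2(t)\,(I_2 - 3)
  + \alpha\, I_4 + \beta\, I_5,
\qquad
\begin{aligned}
I_1 &= \operatorname{tr}\mathbf{C}, &
I_2 &= \tfrac{1}{2}\left[(\operatorname{tr}\mathbf{C})^2 - \operatorname{tr}(\mathbf{C}^2)\right],\\
I_4 &= \mathbb{H}\cdot\mathbb{H}, &
I_5 &= (\mathbf{C}\,\mathbb{H})\cdot\mathbb{H}
\end{aligned}
```

The temporal evolution of $c_1(t)$ and $c_2(t)$ stands in for the abstract's "temporal evolution of the material parameters" during the fluid-to-solid transformation; $\mathbb{H}$ denotes the magnetic field vector.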
Abstract:
Excessive proliferation of vascular wall cells underlies the development of elevated vascular resistance in hypoxic pulmonary hypertension (PH), but the responsible mechanisms remain unclear. Growth-promoting effects of catecholamines may contribute. Hypoxemia causes sympathoexcitation, and prolonged stimulation of alpha1-adrenoceptors (alpha1-ARs) induces hypertrophy and hyperplasia of arterial smooth muscle cells and adventitial fibroblasts. Catecholamine trophic actions in arteries are enhanced when other conditions favoring growth or remodeling are present, e.g., injury or altered shear stress, in isolated pulmonary arteries from rats with hypoxic PH. The present study examined the hypothesis that catecholamines contribute to pulmonary vascular remodeling in vivo in hypoxic PH. Mice genetically deficient in norepinephrine and epinephrine production [dopamine beta-hydroxylase(-/-) (DBH(-/-))] or in alpha1-ARs were examined for alterations in PH, cardiac hypertrophy, and vascular remodeling after 21 days of exposure to a normobaric inspired oxygen fraction (FiO2) of 0.1. A decrease in the lumen area and an increase in the wall thickness of arteries were strongly inhibited in knockout mice (order of extent of inhibition: DBH(-/-) = alpha1D-AR(-/-) > alpha1B-AR(-/-)). Distal muscularization of small arterioles was also reduced (DBH(-/-) > alpha1D-AR(-/-) > alpha1B-AR(-/-) mice). Despite these reductions, increases in right ventricular pressure and hypertrophy were not attenuated in DBH(-/-) and alpha1B-AR(-/-) mice. However, hematocrit increased more in these mice, possibly as a consequence of impaired cardiovascular activation during the reduction of FiO2. In contrast, in alpha1D-AR(-/-) mice, where hematocrit increased the same as in wild-type mice, right ventricular pressure was reduced. These data suggest that catecholamine stimulation of alpha1B- and alpha1D-ARs contributes significantly to vascular remodeling in hypoxic PH.
Abstract:
This project involved the implementation of a collaborative working environment inspired by BSCW but with a new architecture, based on servlets (
Abstract:
A survey of medical ambulatory practice was carried out in February-March 1981 in the two Swiss cantons of Vaud and Fribourg (total population: 700,000), in which 205 physicians participated. The methodology was inspired by the U.S. National Ambulatory Medical Care Survey, whose data collection instrument was adapted to our conditions; in addition, data were gathered on all referrals prescribed by 154 physicians during two weeks. (The instruments used are presented.) The potential and limits of this type of survey are discussed, as well as the representativeness of the participating physicians and of the recorded visits, which constitute a systematic sample of over 43,000 visits.
Abstract:
Arthropod-borne diseases caused by a variety of microorganisms such as dengue virus and malaria parasites afflict billions of people worldwide, imposing major economic and social burdens. Despite many efforts, vaccines against diseases transmitted by mosquitoes, with the exception of yellow fever, are not available. Control of such infectious pathogens relies mainly on vector management and treatment of affected individuals with drugs. However, the numbers of insecticide-resistant insects and drug-resistant parasites are increasing. Therefore, inspired by the wealth of new data produced in recent years by genomics and post-genomics research, several scientific groups have been working on different strategies to control infectious arthropod-borne diseases. This review focuses on recent advances and perspectives towards the construction of transgenic mosquitoes refractory to the transmission of malaria parasites and dengue virus.