981 results for Dangerous consumptions


Relevance:

10.00%

Publisher:

Abstract:

The Nuss procedure requires the creation of a substernal tunnel for bar positioning. This manoeuvre can be dangerous, and cardiac perforation has occurred in a few cases. Our purpose was to describe two technical modifications that help prevent these fatal complications. A series of 25 patients with pectus excavatum were treated with a modification of the Nuss procedure that included entering the left hemithorax first and using the retractor to lift the sternum, with the consequent downward displacement of the heart. These modified techniques have certain advantages: (i) the narrow anterior mediastinum between the sternum and the pericardial sac is expanded by pulling up the sternum; (ii) the thoracoscopic visualization of the tip of the introducer during tunnel creation is improved; (iii) the rubbing of the introducer against the pericardium is minimized; (iv) the exit path of the introducer can be guided by the surgeon's finger; and (v) haemostasis and integrity of the pericardial sac can be more easily confirmed. We observed that with these manoeuvres the risk of pericardial sac and cardiac injury can be markedly reduced.

Relevance:

10.00%

Publisher:

Abstract:

Objective: The aim of this study was to assess re-hospitalization rates of individuals with psychosis and bipolar disorder and to study determinants of readmission. Methods: Prospective observational study conducted in São Paulo, Brazil. One hundred sixty-nine individuals with bipolar or psychotic disorder in need of hospitalization in the public mental health system were followed for 12 months after discharge. Their families were contacted by telephone, and interviews were conducted at 1, 2, 6 and 12 months post-discharge to evaluate readmission rates and related factors. Results: The one-year re-hospitalization rate was 42.6%. Physical restraint during the hospital stay was a risk factor (OR = 5.4-10.5) for readmission in most models. Not attending consultations after discharge was related to the 12-month point readmission (OR = 8.5, 95% CI 2.3-31.2) and to the survival model (OR = 3.2, 95% CI 1.5-7.2). The number of previous admissions was a risk factor in the survival model (OR = 6.6-11.9). The family's agreement with permanent hospitalization of individuals with mental illness was the predictor associated with readmission in all models (OR = 3.5-10.9) and resulted in shorter survival time to readmission; those readmitted were stereotyped as dangerous and unhealthy. Conclusions: Families' stigma towards mental illness might contribute to the increased readmission rates of their relatives with psychiatric disorders. More studies should be conducted to elucidate the mechanisms by which stigma increases re-hospitalization rates.
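Odds ratios of this kind are typically obtained by fitting logistic regression models to the follow-up data; a minimal sketch of how such ORs and their 95% CIs are computed, on synthetic stand-in data with hypothetical variable names (not the study's dataset):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for the follow-up data (variable names hypothetical).
rng = np.random.default_rng(0)
n = 200
restraint = rng.integers(0, 2, n)          # physically restrained during stay
missed    = rng.integers(0, 2, n)          # missed post-discharge consultations
prior     = rng.poisson(1.5, n)            # number of previous admissions
logit_p   = -1.5 + 1.0 * restraint + 1.2 * missed + 0.3 * prior
readmit   = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(pd.DataFrame({"restraint": restraint,
                                  "missed_followup": missed,
                                  "prior_admissions": prior}))
fit = sm.Logit(readmit, X).fit(disp=0)

# Exponentiated coefficients are odds ratios; conf_int() gives 95% CI bounds.
table = pd.concat([np.exp(fit.params).rename("OR"),
                   np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})],
                  axis=1)
print(table)
```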

Relevance:

10.00%

Publisher:

Abstract:

Endovascular techniques have been shown to be useful in the management of vascular injuries because they transform a complex and potentially dangerous procedure into a safe one. We present the case of a 39-year-old man with congestive heart failure and an abdominal bruit 11 years after an abdominal gunshot wound. Imaging studies revealed an arteriovenous fistula involving the left iliac artery bifurcation, and an iliac branch device was used to treat it. Symptoms resolved, and follow-up imaging showed patency of the graft and closure of the arteriovenous communication. To our knowledge, this is the first report of a nonaneurysmal disease treated with this device. (J Vasc Surg 2012;55:1474-6.)

Relevance:

10.00%

Publisher:

Abstract:

Bee venom (BV) allergy is potentially dangerous for allergic individuals because a single bee sting may induce an anaphylactic reaction, potentially leading to death. Currently, venom immunotherapy (VIT) is the only treatment with a long-lasting effect for this kind of allergy, and its efficacy has been recognized worldwide. The therapy consists of subcutaneous injections of gradually increasing doses of the allergen, which leads to poor patient compliance because the treatment is long, with a total of 30-80 injections administered over several years. In this article we characterize different MS-PLGA formulations containing BV proteins for VIT. PLGA microspheres containing BV represent a strategy to replace the multiple injections, because they can control solute release. Physical and biochemical methods were used to analyse and characterize their preparation. Microspheres with encapsulation efficiencies of 49-75% were obtained, with a triphasic BV release profile. Among them, the MS-PLGA 34 kDa-COOH proved best for VIT because it presented a low initial burst (20%) and a slow BV release during the lag phase. Furthermore, few conformational changes were observed in the released BV. Above all, the BV remained immunologically recognizable, meaning that it could continuously stimulate the immune system. These microspheres containing BV could replace the sequential injections of traditional VIT, with the remarkable advantage of a reduced number of injections. (C) 2011 Elsevier B.V. All rights reserved.
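For reference, the encapsulation efficiency quoted above is conventionally the ratio of protein actually entrapped in the microspheres to protein initially added; a trivial sketch with illustrative masses (not data from the study):

```python
def encapsulation_efficiency(mass_encapsulated_mg: float,
                             mass_loaded_mg: float) -> float:
    """EE% = protein encapsulated / protein initially added x 100."""
    return 100.0 * mass_encapsulated_mg / mass_loaded_mg

# Illustrative: 6.2 mg of BV protein recovered from 10 mg loaded -> 62% EE,
# inside the 49-75% range reported above.
print(f"{encapsulation_efficiency(6.2, 10.0):.0f}% EE")
```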


Relevance:

10.00%

Publisher:

Abstract:

The orbits of the stars in the disk of the Galaxy, and their passages through the Galactic spiral arms, are a rarely mentioned factor of biosphere stability which might be important for long-term planetary climate evolution, with a possible bearing on mass extinctions. The Sun lies very near the co-rotation radius, where stars revolve around the Galaxy in the same period as the density wave perturbations of the spiral arms. Conventional wisdom generally holds that this status makes for few passages through the spiral arms. Controversy still surrounds whether time spent inside or around spiral arms is dangerous to biospheres and conducive to mass extinctions. Possible threats include giant molecular clouds disturbing the Oort comet cloud and provoking heavy bombardment; a higher exposure to cosmic rays near star-forming regions, triggering increased cloudiness in Earth's atmosphere and ice ages; and the destruction of Earth's ozone layer by supernova explosions. We present detailed calculations of the history of spiral arm passages for all 212 solar-type stars nearer than 20 parsecs, including the total time spent inside arms in the last 500 Myr, during which the spiral arm position can be traced with good accuracy. We found that there is a large diversity of stellar orbits in the solar neighborhood, and the fraction of time spent inside spiral arms can vary from a few percent to nearly half the time. The Sun, despite its proximity to the galactic co-rotation radius, has exceptionally low eccentricity and a low vertical velocity component, and therefore spends 30% of its lifetime crossing the spiral arms, more than most nearby stars. We discuss the possible implications of this fact for the long-term habitability of the Earth, and possible correlations of the Sun's passages through the spiral arms with the five great mass extinctions of the Earth's biosphere, from the Late Ordovician to the Cretaceous-Tertiary.
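The kind of calculation described can be sketched as follows: integrate a stellar orbit in a simple galactic potential and count the time steps whose azimuth falls within a spiral arm rotating at the pattern speed. All numbers below (flat rotation curve, four logarithmic arms, pitch angle, arm width, pattern speed) are illustrative assumptions, not the paper's adopted model:

```python
import numpy as np

V0 = 220.0                      # km/s, flat rotation curve (assumed)
R0 = 8.5                        # kpc, initial galactocentric radius (solar-like)
OMEGA_P = 25.0                  # km/s/kpc, spiral pattern speed (assumed)
M_ARMS = 4                      # number of arms (assumed)
PITCH = np.radians(12.0)        # pitch angle of logarithmic arms (assumed)
HALF_WIDTH = np.radians(15.0)   # half angular width of an arm (assumed)
KFAC = 1.0 / 977.8              # 1 km/s ~ 1/977.8 kpc/Myr

def inside_arm(x, y, t_myr):
    """True if (x, y) lies within the angular width of any arm at time t."""
    r = np.hypot(x, y)
    theta = np.arctan2(y, x)
    # Azimuth of the arm pattern at this radius and time (rotating frame).
    phase = np.log(r / R0) / np.tan(PITCH) + OMEGA_P * KFAC * t_myr
    spacing = 2 * np.pi / M_ARMS
    d = (theta - phase) % spacing
    return min(d, spacing - d) < HALF_WIDTH

def orbit_arm_fraction(vr0=10.0, vt_offset=-5.0, t_end=500.0, dt=0.1):
    """Symplectic-Euler integration of a 2D orbit in a logarithmic
    potential; returns the fraction of time steps spent inside arms."""
    x, y = R0, 0.0
    vx, vy = vr0 * KFAC, (V0 + vt_offset) * KFAC
    inside, n = 0, int(t_end / dt)
    for i in range(n):
        r2 = x * x + y * y
        ax = -(V0 * KFAC) ** 2 * x / r2   # -grad of V0^2 ln(r)
        ay = -(V0 * KFAC) ** 2 * y / r2
        vx += ax * dt; vy += ay * dt      # kick
        x += vx * dt;  y += vy * dt       # drift
        inside += inside_arm(x, y, i * dt)
    return inside / n

print(f"time fraction inside arms: {orbit_arm_fraction():.2f}")
```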

Relevance:

10.00%

Publisher:

Abstract:

The presence of emerging contaminants has previously been described in the reclaimed water and groundwater of Gran Canaria (Spain). Despite the environmental risk associated with irrigation with reclaimed water (R), this practice is necessary for the sustainability of the hydrological cycle in semiarid zones, especially as regards agricultural activity. The aims of this study were: i) to analyse the evolution over two years of contaminants of emerging concern, priority substances (2008/105/EC) and heavy metals in reclaimed water (R) and in a volcanic aquifer in the NE of Gran Canaria, where a golf course has been irrigated with R since 1976, and ii) to relate this presence to the physicochemical water properties and the hydrogeological medium. Reclaimed water and groundwater (GW) were monitored quarterly from July 2009 to September 2011. Sorption and degradation processes in the soil account for more compounds being detected in R. Diazinon and chlorfenvinphos were always detected in R, and terbuthylazine, terbutryn and diuron at 90% frequency. Considering all the samples, the most frequent compounds were chlorpyrifos ethyl, fluorene, phenanthrene and pyrene. Although their concentrations were frequently below 50 ng L-1, some contaminants were occasionally detected at higher concentrations. Chlorpyrifos ethyl and diuron are priority substances detected frequently and at high concentrations, so they must be included in monitoring studies. Geology and location seem to be related to the presence of emerging compounds through occasional contamination events (not related to R irrigation), and therefore not to the existence of a dangerous diffuse contamination level. Thus, it is preferable to select wells with less stable chemical water quality in order to monitor the risk of the presence of emerging compounds. Considering the relationships between contaminant presence, chemical water quality, seasonal variation, hydrogeological characteristics and well location, we conclude that chlorpyrifos ethyl and diuron were the most dangerous priority substances in terms of GW quality, so they must be included in all monitoring studies, at least in the Canary Islands.

Relevance:

10.00%

Publisher:

Abstract:

Comparison of two software packages for risk analysis in the road transport of dangerous goods (TRAT GIS 4.1 and QRAM 3.6), through application to a simple case study and to the real case of Casalecchio di Reno, a municipality in the province of Bologna.

Relevance:

10.00%

Publisher:

Abstract:

The relation between intercepted light and orchard productivity has been considered linear, although this dependence seems to be subordinate to the planting system rather than to light intensity. At the whole-plant level, an increase in irradiance does not always improve productivity. One reason can be the plant's intrinsic inefficiency in using energy: generally, in full light, only 5-10% of the total incoming energy is allocated to net photosynthesis. Preserving or improving this efficiency therefore becomes pivotal for scientists and fruit growers. Even though a conspicuous amount of energy is reflected or transmitted, plants cannot avoid absorbing photons in excess. Over-excitation of chlorophyll promotes the production of reactive species, increasing the risk of photoinhibition. The dangerous consequences of photoinhibition forced plants to evolve a complex, multilevel machinery able to dissipate the energy excess by quenching heat (non-photochemical quenching), moving electrons (the water-water cycle, cyclic transport around PSI, the glutathione-ascorbate cycle and photorespiration) and scavenging the reactive species generated. The price plants must pay for this equipment is the use of CO2 and reducing power, with a consequent decrease in photosynthetic efficiency, both because some photons are not used for carboxylation and because an effective loss of CO2 and reducing power occurs. Net photosynthesis increases with light up to the saturation point; additional PPFD does not improve carboxylation but raises the efficiency of the alternative energy-dissipation pathways, along with ROS production and photoinhibition risks. The wide photo-protective apparatus is nevertheless unable to cope with all the excessive incoming energy, so photodamage occurs. Any event that increases the photon pressure and/or decreases the efficiency of the described photo-protective mechanisms (e.g. thermal stress, water and nutritional deficiency) can accentuate photoinhibition. In nature only a small fraction of photosystems is found damaged, thanks to an effective, efficient and energy-consuming recovery system. Since damaged PSII is quickly repaired at an energetic expense, it would be interesting to investigate how much PSII recovery costs plant productivity. This PhD dissertation aims to improve the knowledge of the several strategies used to manage the incoming energy and of the implications of excess light for photodamage in peach. The thesis is organized in three scientific units. In the first section, a new rapid, non-intrusive, whole-tissue and universal technique for the determination of functional PSII was implemented and validated on different kinds of plants: C3 and C4 species, woody and herbaceous plants, wild type and a chlorophyll b-less mutant, and monocot and dicot plants. In the second unit, using a singular experimental orchard named the "Asymmetric orchard", the relation between light environment and photosynthetic performance, water use and photoinhibition was investigated in peach at the whole-plant level; furthermore, the effect of variation in photon pressure on energy management was considered at the single-leaf level. In the third section, the quenching analysis method suggested by Kornyeyev and Hendrickson (2007) was validated on peach and then applied in the field, where the influence of moderate light and water reduction on peach photosynthetic performance, water requirements, energy management and photoinhibition was studied.
Using solar energy as the fuel for life is intrinsically hazardous for plants, given the constantly high risk of photodamage. This dissertation attempts to illuminate the complex relation between plants, peach in particular, and light, analysing the principal strategies plants have developed to manage the incoming light so as to derive the maximum possible benefit while minimizing the risks. First, the new method proposed for the determination of functional PSII, based on P700 redox kinetics, appears to be a valid, non-intrusive, universal and field-applicable technique, not least because it measures the whole leaf tissue in depth rather than only the first leaf layers, as fluorescence does. The fluorescence parameter Fv/Fm gives a good estimate of functional PSII, but only when data obtained from the adaxial and abaxial leaf surfaces are averaged. In addition to this method, the energy quenching analysis proposed by Kornyeyev and Hendrickson (2007), combined with the photosynthesis model proposed by von Caemmerer (2000), is a powerful tool to analyse and study, even in the field, the relation between the plant and environmental factors such as water, temperature and, above all, light. The "Asymmetric" training system is a good way to study the relations between light energy, photosynthetic performance and water use in the field. At the whole-plant level, net carboxylation increases with PPFD up to a saturation point. Excess light, rather than improving photosynthesis, may accentuate water and thermal stress, leading to stomatal limitation. Furthermore, too much light promotes not an improvement in net carboxylation but PSII damage; in fact, in the most light-exposed plants about 50-60% of the total PSII is inactivated. At the single-leaf level, net carboxylation increases up to the saturation point (1000-1200 μmol m-2 s-1) and excess light is dissipated by non-photochemical quenching and by non-net-carboxylative electron transports. The latter follow a pattern quite similar to the Pn/PPFD curve, reaching saturation at almost the same photon flux density. At middle-to-low irradiance, NPQ seems to be lumen-pH limited, because the incoming photon pressure is not enough to generate the lumen pH required for full activation of violaxanthin de-epoxidase (VDE). Peach leaves try to cope with the light excess by increasing the non-net-carboxylative transports. As PPFD rises, the xanthophyll cycle is increasingly activated and the rate of non-net-carboxylative transports is reduced. Some of these alternative transports, such as the water-water cycle, the cyclic transport around PSI and the glutathione-ascorbate cycle, are able to generate additional H+ in the lumen to support VDE activation when light would otherwise be limiting. Moreover, the alternative transports seem to act as an important dissipative pathway when high temperature and sub-optimal conductance accentuate the photoinhibition risks. In peach, a moderate reduction of water and light does not decrease net carboxylation; rather, by diminishing the incoming light and the evapo-transpirative demand, stomatal conductance decreases and water-use efficiency improves. Therefore, by lowering light intensity to levels that are still non-limiting, water could be saved without compromising net photosynthesis. The quenching analysis is able to partition the absorbed energy among the several utilization, photoprotection and photo-oxidation pathways. When recovery is permitted, only a few PSII remain unrepaired, although more net PSII damage is recorded in plants placed in full light.
In this experiment too, under over-saturating light the main dissipation pathway is non-photochemical quenching; at middle-to-low irradiance it seems to be pH-limited, and other transports, such as photorespiration and the alternative transports, are used to support photoprotection and to contribute to creating the optimal trans-thylakoid ΔpH for violaxanthin de-epoxidase. These alternative pathways become the main quenching mechanisms in very low-light environments. Another aspect pointed out by this study is the role of NPQ as a dissipative pathway when conductance becomes severely limiting. The observation that in nature only a small amount of damaged PSII is seen indicates the presence of an effective and efficient recovery mechanism that masks the real photodamage occurring during the day. At the single-leaf level, when repair is not allowed, leaves in full light are twofold more photoinhibited than shaded ones. Therefore, light in excess of the photosynthetic optimum does not promote net carboxylation but increases water loss and PSII damage. The greater the photoinhibition, the more photosystems must be repaired and, consequently, the more energy and dry matter must be allocated to this essential activity. Since above the saturation point net photosynthesis is constant while photoinhibition increases, it would be interesting to investigate how much photodamage costs in terms of tree productivity. Another aspect of pivotal importance to be further explored is the combined influence of light and other environmental parameters, such as water status, temperature and nutrition, on the management of light, water and photosynthate in peach.
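For orientation, the fluorescence-based quenching analysis referred to above partitions absorbed light among photochemistry, regulated thermal dissipation and constitutive losses. A minimal sketch of the widely used partitioning from the same lineage (Hendrickson-type equations; the Kornyeyev-Hendrickson formulation adds further terms for photoinhibited centres), with made-up steady-state values:

```python
def partition_absorbed_light(F_s, Fm_prime, F0, F_m):
    """Partition absorbed light among photochemistry (Phi_PSII),
    regulated thermal dissipation (Phi_NPQ) and constitutive
    losses (Phi_f,D); the three fractions sum to 1."""
    phi_psii = 1.0 - F_s / Fm_prime        # fraction used for photochemistry
    phi_npq  = F_s / Fm_prime - F_s / F_m  # regulated (xanthophyll-mediated) dissipation
    phi_fd   = F_s / F_m                   # fluorescence + constitutive heat loss
    fv_fm    = (F_m - F0) / F_m            # maximum PSII efficiency (dark-adapted)
    return {"Phi_PSII": phi_psii, "Phi_NPQ": phi_npq,
            "Phi_f,D": phi_fd, "Fv/Fm": fv_fm}

# Illustrative steady-state values for a light-adapted leaf (made up):
print(partition_absorbed_light(F_s=600.0, Fm_prime=1100.0, F0=400.0, F_m=2000.0))
```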

Relevance:

10.00%

Publisher:

Abstract:

Society's increasing aversion to technological risks requires the development of inherently safer and environmentally friendlier processes, besides assuring the economic competitiveness of industrial activities. The different forms of impact (e.g. environmental, economic and societal) are frequently characterized by conflicting reduction strategies and must be taken into account holistically in order to identify the optimal solutions in process design. Though the literature reports an extensive discussion of strategies and specific principles, quantitative assessment tools are required to identify the marginal improvements of alternative design options, to allow trade-offs among contradictory aspects and to prevent "risk shift". In the present work, a set of integrated quantitative tools for design assessment (i.e. a design support system) was developed. The tools were specifically dedicated to the implementation of sustainability and inherent safety in process and plant design activities, with respect to chemical and industrial processes in which substances dangerous for humans and the environment are used or stored. The tools were mainly devoted to application in the "conceptual" and "basic design" stages, when the project is still open to changes (owing to the large number of degrees of freedom), which may include strategies to improve sustainability and inherent safety. The set of developed tools covers different phases of the design activities throughout the lifecycle of a project (inventories, process flow diagrams, preliminary plant layout plans). The development of such tools gives a substantial contribution to filling the present gap in the availability of sound supports for implementing safety and sustainability in the early phases of process design. The proposed decision support system was based on the development of a set of leading key performance indicators (KPIs), which ensure the assessment of the economic, societal and environmental impacts of a process (i.e. its sustainability profile). The KPIs were based on impact models (including complex ones), but are easy and swift in practical application; their full evaluation is possible even starting from the limited data available during early process design. Innovative reference criteria were developed to compare and aggregate the KPIs on the basis of the actual site-specific impact burden and the sustainability policy. Particular attention was devoted to the development of reliable criteria and tools for the assessment of inherent safety in different stages of the project lifecycle. The assessment follows an innovative approach to the analysis of inherent safety, based both on the calculation of the expected consequences of potential accidents and on the evaluation of the hazards related to equipment. The methodology overcomes several problems present in previously proposed methods for quantitative inherent safety assessment (use of arbitrary indexes, subjective judgement, built-in assumptions, etc.). A specific procedure was defined for the assessment of the hazards related to the formation of undesired substances in chemical systems undergoing "out of control" conditions. In the assessment of layout plans, ad hoc tools were developed to account for the hazard of domino escalation and for safety economics.
The effectiveness and value of the tools were demonstrated by application to a large number of case studies concerning different kinds of design activities (choice of materials; design of the process, the plant and the layout) and different types of processes/plants (chemical industry, storage facilities, waste disposal). An experimental survey (analysis of the thermal stability of isomers of nitrobenzaldehyde) provided the input data needed to demonstrate the method for the inherent safety assessment of materials.
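A minimal sketch of the kind of KPI normalization and aggregation described (the thesis' actual reference criteria and weighting scheme are richer than a weighted sum; names, values and weights below are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    value: float      # raw impact estimate for the design option
    reference: float  # site-specific reference burden used for normalization
    weight: float     # relative importance set by the sustainability policy

def sustainability_index(kpis):
    """Aggregate normalized KPIs into a single comparative index
    (lower = better): a weighted mean of value/reference ratios."""
    total_w = sum(k.weight for k in kpis)
    return sum(k.weight * k.value / k.reference for k in kpis) / total_w

# Hypothetical comparison of two design options for the same process:
option_a = [KPI("economic", 1.2e6, 1.0e6, 0.3),
            KPI("environmental", 40.0, 80.0, 0.4),
            KPI("inherent_safety", 2.5, 5.0, 0.3)]
option_b = [KPI("economic", 0.9e6, 1.0e6, 0.3),
            KPI("environmental", 95.0, 80.0, 0.4),
            KPI("inherent_safety", 4.8, 5.0, 0.3)]

print(f"option A: {sustainability_index(option_a):.2f}")   # 0.71
print(f"option B: {sustainability_index(option_b):.2f}")   # 1.03
```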

Relevance:

10.00%

Publisher:

Abstract:

Large-scale wireless ad hoc networks of computers, sensors, PDAs, etc. (i.e. nodes) are revolutionizing connectivity and leading to a paradigm shift from centralized systems to highly distributed and dynamic environments. An example of ad hoc networks are sensor networks, usually composed of small units able to sense and transmit to a sink elementary data which are successively processed by an external machine. Recent improvements in the memory and computational power of sensors, together with the reduction of energy consumption, are rapidly changing the potential of such systems, moving the attention towards data-centric sensor networks. A plethora of routing and data management algorithms have been proposed for network path discovery, ranging from broadcasting/flooding-based approaches to those using global positioning systems (GPS). We studied WGrid, a novel decentralized infrastructure that organizes wireless devices in an ad hoc manner, where each node has one or more virtual coordinates through which both message routing and data management occur without reliance on either flooding/broadcasting operations or GPS. The resulting ad hoc network does not suffer from the dead-end problem, which happens in geography-based routing when a node is unable to locate a neighbour closer to the destination than itself. WGrid allows multidimensional data management, since the nodes' virtual coordinates can act as a distributed database without needing either special implementation or reorganization; any kind of data (both single- and multidimensional) can be distributed, stored and managed. We show how a location service can be easily implemented, so that any search is reduced to a simple query, like any other data type. WGrid was then extended by adopting a replication methodology; we called the resulting algorithm WRGrid. Just like WGrid, WRGrid acts as a distributed database without needing either special implementation or reorganization, and any kind of data can be distributed, stored and managed. We evaluated the benefits of replication on data management, finding from experimental results that it can halve the average number of hops in the network. The direct consequences of this are a significant improvement in energy consumption and a workload balancing among sensors (number of messages routed by each node). Finally, thanks to the replications, whose number can be arbitrarily chosen, the resulting sensor network can withstand sensor disconnections/connections, due to sensor failures, without data loss. Another extension of WGrid is W*Grid, which strongly improves network recovery performance after link and/or device failures that may happen due to crashes, battery exhaustion of devices or temporary obstacles. W*Grid guarantees, by construction, at least two disjoint paths between each couple of nodes. This implies that recovery in W*Grid occurs without broadcast transmissions while guaranteeing robustness and drastically reducing energy consumption. An extensive number of simulations shows the efficiency, robustness and traffic load of the resulting networks under several scenarios of device density and number of coordinates. The performance analyses were compared with existing algorithms in order to validate the results.
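A toy illustration of GPS-free routing on tree-derived binary virtual coordinates, in the spirit of WGrid but not the published algorithm: each coordinate names a position in a virtual tree, and a message descends towards the destination's branch or otherwise climbs to the parent:

```python
def next_hop(cur: str, dst: str) -> str:
    """One step of tree routing on binary virtual coordinates:
    descend while on the destination's branch, otherwise go up."""
    if dst.startswith(cur) and len(dst) > len(cur):
        return dst[: len(cur) + 1]   # child on the destination branch
    return cur[:-1]                  # parent (strip last bit)

def route(src: str, dst: str):
    """Hop-by-hop path between two virtual coordinates; no flooding,
    no GPS -- coordinates alone determine the next hop."""
    path, cur = [src], src
    while cur != dst:
        cur = next_hop(cur, dst)
        path.append(cur)
    return path

# Coordinates name positions in a virtual tree rooted at "0"; e.g. "011"
# is the second child of "01", itself the second child of the root "0".
print(route("00", "011"))   # ['00', '0', '01', '011']
```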

Relevance:

10.00%

Publisher:

Abstract:

Recent progress in microelectronics and wireless communications has enabled the development of low-cost, low-power, multifunctional sensors, allowing the birth of a new type of network named wireless sensor networks (WSNs). The main features of such networks are: the nodes can be positioned randomly over a given field with high density; each node operates both as a sensor (for the collection of environmental data) and as a transceiver (for the transmission of information towards the data retrieval point); and the nodes have limited energy resources. The use of wireless communications and the small size of the nodes make this type of network suitable for a large number of applications: for example, sensor nodes can be used to monitor a high-risk region, such as the area near a volcano, or, in a hospital, to monitor the physical condition of patients. For each of these possible application scenarios, it is necessary to guarantee a trade-off between energy consumption and communication reliability. This thesis investigates the use of WSNs in two possible scenarios and for each of them suggests a solution to the related problems under this trade-off. The first scenario considers a network with a high number of nodes, deployed over a given geographical area without detailed planning, that have to transmit data towards a coordinator node, named the sink, which we assume to be located onboard an unmanned aerial vehicle (UAV). This is a practical example of reachback communication, characterized by a high density of nodes that have to transmit data reliably and efficiently towards a far receiver. It is assumed that each node transmits a common shared message directly to the receiver onboard the UAV whenever it receives a broadcast message (triggered, for example, by the vehicle). We assume that the communication channels between the local nodes and the receiver are subject to fading and noise, so the receiver onboard the UAV must be able to fuse the weak and noisy signals in a coherent way to receive the data reliably. A cooperative diversity concept is proposed as an effective solution to the reachback problem. In particular, a spread-spectrum (SS) transmission scheme is considered in conjunction with a fusion center that can exploit cooperative diversity without requiring stringent synchronization between nodes. The idea consists of the simultaneous transmission of the common message by the nodes and Rake reception at the fusion center. The proposed solution is mainly motivated by two goals: the necessity of having simple nodes (to this aim we move the computational complexity to the receiver onboard the UAV) and the importance of guaranteeing high levels of energy efficiency, thus increasing the network lifetime. The proposed scheme is analysed in order to better understand the effectiveness of the approach. The performance metrics considered are both the theoretical limit on the maximum amount of data that can be collected by the receiver and the error probability with a given modulation scheme; since we deal with a WSN, both are evaluated taking the energy efficiency of the network into consideration. The second scenario considers the use of a chain network for the detection of fires by using nodes that have the double function of sensors and routers. The first function consists of monitoring a temperature parameter that allows a local binary decision on target (fire) absent/present to be taken.
The second function is that each node receives the decision made by the previous node of the chain, compares it with the decision derived from its own observation of the phenomenon, and transmits the final result to the next node. The chain ends at the sink node, which transmits the received decision to the user. In this network the goals are to limit the throughput on each sensor-to-sensor link and to minimize the probability of error at the last stage of the chain. This is a typical scenario of distributed detection. To obtain good performance it is necessary to define fusion rules by which each node summarizes the local observations and the decisions of the previous nodes into a final decision to be transmitted to the next node. WSNs were also studied from a practical point of view, describing both the main characteristics of the IEEE 802.15.4 standard and two commercial WSN platforms. Using one of these commercial platforms, an agricultural application was realized and tested in a six-month field experiment.
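A toy sketch of the serial decision fusion described for the second scenario, assuming identical sensors with known detection and false-alarm rates; treating the upstream decision as if it were a fresh observation is a deliberate simplification (real serial fusion tracks how reliability evolves along the chain):

```python
import numpy as np

rng = np.random.default_rng(1)

P_D, P_FA = 0.9, 0.05     # assumed per-sensor detection / false-alarm rates
THRESHOLD = 30.0          # degrees C; local decision: fire if above (illustrative)

def local_decision(temperature: float) -> int:
    """Local binary decision: target (fire) present/absent."""
    return int(temperature > THRESHOLD)

def fuse(incoming: int, local: int) -> int:
    """Combine the upstream decision with the local one via their
    log-likelihood ratios (sensors assumed identical and
    conditionally independent)."""
    llr = lambda d: np.log(P_D / P_FA) if d else np.log((1 - P_D) / (1 - P_FA))
    return int(llr(incoming) + llr(local) > 0)

# Chain of 5 nodes observing noisy temperatures during a real fire (mean 35 C):
temps = rng.normal(35.0, 4.0, size=5)
decision = local_decision(temps[0])
for t in temps[1:]:
    decision = fuse(decision, local_decision(t))
print("final decision at sink:", decision)
```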

Relevance:

10.00%

Publisher:

Abstract:

Singularities of robot manipulators have been intensely studied in the last decades by researchers in many fields. Serial singularities produce some local loss of dexterity of the manipulator, so it may be desirable to search for singularity-free trajectories in the joint space. Parallel singularities, on the other hand, are very dangerous for parallel manipulators, for they may provoke a local loss of platform control and jeopardize the structural integrity of links or actuators; it is therefore utterly important to avoid parallel singularities while operating a parallel machine. Furthermore, there may be configurations of a parallel manipulator that are allowed by the constraints but are nevertheless unreachable by any feasible path. The present work proposes a numerical procedure based upon Morse theory, an important branch of differential topology. Such a procedure counts and identifies both the singularity-free regions that the singularity locus cuts out of the configuration space and the disjoint regions composing the configuration space of a parallel manipulator. Moreover, given any two configurations of a manipulator, a feasible or singularity-free path connecting them can always be found, or it can be proved that none exists. Examples of application to 3R and 6R serial manipulators, to 3UPS and 3UPU parallel wrists, to 3UPU parallel translational manipulators, and to 3RRR planar manipulators are reported in the work.
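For intuition, the region count that the Morse-theoretic procedure obtains analytically can be approximated by brute force: grid-sample the configuration space, discard samples near the singularity locus, and count connected components with union-find. The locus below is a made-up stand-in for a vanishing Jacobian determinant, not a real manipulator's:

```python
import numpy as np

def count_free_regions(singular, n=120, lo=-np.pi, hi=np.pi):
    """Grid-sample a 2D configuration space, drop points where
    `singular` holds, and count 4-connected free components."""
    xs = np.linspace(lo, hi, n)
    free = ~singular(*np.meshgrid(xs, xs, indexing="ij"))
    parent = {p: p for p in zip(*np.nonzero(free))}

    def find(p):
        while parent[p] != p:                 # path-halving find
            parent[p] = parent[parent[p]]
            p = parent[p]
        return p

    for (i, j) in list(parent):               # union with right/down neighbours
        for q in ((i + 1, j), (i, j + 1)):
            if q in parent:
                parent[find(q)] = find((i, j))
    return len({find(p) for p in parent})

# Toy "singularity locus": configurations where sin(q1) + sin(q2) is
# near zero, standing in for det(J) = 0 of a real manipulator.
sing = lambda q1, q2: np.abs(np.sin(q1) + np.sin(q2)) < 0.05
print("singularity-free regions:", count_free_regions(sing))
```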

Relevance:

10.00%

Publisher:

Abstract:

In the collective imagination a robot is a human-like machine, like the androids of science fiction. However, the type of robot encountered most frequently is machinery that does work that is too dangerous, boring or onerous for humans. Most of the robots in the world are of this type; they can be found in the automotive, medical, manufacturing and space industries. A robot, therefore, is a system that contains sensors, control systems, manipulators, power supplies and software, all working together to perform a task. The development and use of such systems is an active area of research, and one of the main problems is the development of interaction skills with the surrounding environment, which include the ability to grasp objects. To perform this task the robot needs to sense the environment and acquire information about the object: the physical attributes that may influence a grasp. Humans solve this grasping problem easily thanks to their past experience, which is why many researchers approach it from a machine-learning perspective, finding grasps for an object using information about already known objects. But humans select the best grasp from a vast repertoire considering not only the physical attributes of the object but also the effect they want to obtain. This is why, in our case, the study in the area of robot manipulation focuses on grasping and on integrating symbolic tasks with data gained through sensors. The learning model is based on a Bayesian network that encodes the statistical dependencies between the data collected by the sensors and the symbolic task. This data representation has several advantages: it takes into account the uncertainty of the real world, allowing sensor noise to be dealt with; it encodes a notion of causality; and it provides a unified network for learning. Since the network as currently implemented is based on human expert knowledge, it is very interesting to implement an automated method to learn its structure, as in the future more tasks and object features may be introduced, and a complex network design based only on human expert knowledge may become unreliable. Since structure-learning algorithms present some weaknesses, the goal of this thesis is to analyse the real data used in the network modelled by the human expert, implement a feasible structure-learning approach, and compare the results with the network designed by the expert in order to possibly enhance it.
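A minimal sketch of score-based structure learning of the kind the thesis evaluates: BIC-scored greedy edge addition over discrete data, with a fixed variable order standing in for a full cycle check (a deliberate simplification; data and variable names are synthetic, not the thesis' sensor data):

```python
import itertools
import numpy as np
import pandas as pd

def bic_score(df: pd.DataFrame, parents: dict) -> float:
    """BIC of a discrete Bayesian network: per-family log-likelihood
    minus a complexity penalty of 0.5*ln(n) free parameters."""
    n, score = len(df), 0.0
    for child, pa in parents.items():
        r = df[child].nunique()
        groups = df.groupby(list(pa)) if pa else [(None, df)]
        q = 0
        for _, g in groups:
            q += 1
            counts = g[child].value_counts().to_numpy()
            score += (counts * np.log(counts / counts.sum())).sum()
        score -= 0.5 * np.log(n) * q * (r - 1)
    return score

def hill_climb(df: pd.DataFrame) -> dict:
    """Greedy search: repeatedly add the edge that most improves BIC.
    Edges only go from earlier to later columns, so the graph stays
    acyclic without an explicit cycle check."""
    order = list(df.columns)
    parents = {v: [] for v in order}
    while True:
        base, best, gain = bic_score(df, parents), None, 0.0
        for i, j in itertools.combinations(range(len(order)), 2):
            u, v = order[i], order[j]        # candidate edge u -> v
            if u in parents[v]:
                continue
            parents[v].append(u)
            delta = bic_score(df, parents) - base
            parents[v].remove(u)
            if delta > gain:
                best, gain = (u, v), delta
        if best is None:
            return parents
        parents[best[1]].append(best[0])

# Synthetic data: grasp success depends on object size and task.
rng = np.random.default_rng(2)
n = 500
size = rng.integers(0, 2, n)
task = rng.integers(0, 2, n)
success = (size & task) | (rng.random(n) < 0.1)
df = pd.DataFrame({"size": size, "task": task, "success": success.astype(int)})
print(hill_climb(df))   # expected to recover size -> success, task -> success
```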

Relevance:

10.00%

Publisher:

Abstract:

Magnetic resonance imaging (MRI) is today precluded to patients bearing active implantable medical devices (AIMDs). The great advantages related to this diagnostic modality, together with the increasing number of people benefiting from implantable devices, in particular pacemakers (PMs) and cardioverter-defibrillators (ICDs), are prompting the scientific community to study the possibility of extending MRI also to implanted patients. The MRI-induced specific absorption rate (SAR) and the consequent heating of biological tissues are among the major concerns that make patients bearing metallic structures contraindicated for MRI scans. To date, both in-vivo and in-vitro studies have demonstrated the potentially dangerous temperature increase caused by the radiofrequency (RF) field generated during MRI procedures in the tissues surrounding thin metallic implants. On the other hand, the technical evolution of MRI scanners and of AIMDs, together with published data on the lack of adverse events, has reopened interest in this field and suggests that, under given conditions, MRI can be safely performed also on implanted patients. With a better understanding of the hazards of performing MRI scans on implanted patients, as well as the development of MRI-safe devices, we may soon enter an era in which this imaging modality is more widely used to assist in the appropriate diagnosis of patients with devices. In this study both experimental measurements and numerical analyses were performed. The aim of the study is to systematically investigate the effects of the MRI RF field on implantable devices and to identify the elements that play a major role in the induced heating. Furthermore, we aimed at developing a realistic numerical model able to simulate the interactions between an RF coil for MRI and biological tissues implanted with a PM, and to predict the induced SAR as a function of the particular path of the PM lead. The methods developed and validated during the PhD program led to the design of an experimental framework for the accurate measurement of the PM lead heating induced by MRI systems. In addition, numerical models based on finite-difference time-domain (FDTD) simulations were validated to obtain a general tool for investigating the large number of parameters and factors involved in this complex phenomenon. The results obtained demonstrate that MRI-induced heating of metallic implants is a real risk and represents a contraindication to extending MRI scans to patients bearing a PM, an ICD, or other thin metallic objects. On the other hand, both the experimental data and the numerical results show that, under particular conditions, MRI procedures may be considered reasonably safe also for an implanted patient. The complexity and the large number of variables involved make it difficult to define a unique set of such conditions: when the benefits of an MRI investigation cannot be obtained using other imaging techniques, the possibility of performing the scan should not be immediately excluded, but careful evaluation is always needed.
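The heating hazard discussed is quantified through the local SAR, which an FDTD solver evaluates cell by cell from the computed electric field; a minimal sketch of the point-SAR formula (tissue and field values illustrative, not results from the study):

```python
import numpy as np

def point_sar(e_peak: float, sigma: float, rho: float) -> float:
    """Local SAR (W/kg) from the peak electric-field magnitude E (V/m),
    tissue conductivity sigma (S/m) and mass density rho (kg/m^3):
    SAR = sigma * |E|^2 / (2 * rho) for a time-harmonic field."""
    return sigma * np.abs(e_peak) ** 2 / (2.0 * rho)

# Illustrative values: muscle-like tissue at 64 MHz (1.5 T MRI), with a
# locally enhanced field near a lead tip (field value assumed).
print(point_sar(e_peak=300.0, sigma=0.7, rho=1050.0))   # 30.0 W/kg
```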