993 results for Simultaneous Methods
Abstract:
Navigation and interpretation of the surrounding environment by autonomous vehicles in unstructured environments remains a major challenge today. Sebastian Thrun describes in [Thr02] that the mapping problem in robotic systems is that of acquiring a spatial model of the robot's surroundings. In this context, integrating sensor systems into robotic platforms so that they can build maps of the world around them is of utmost importance. The information extracted from these data can be interpreted and applied to localization, navigation and object manipulation tasks. Until very recently, most robotic systems performing mapping or Simultaneous Localization And Mapping (SLAM) tasks relied on devices such as laser rangefinders and stereo cameras. Besides being expensive, these devices provide only two-dimensional information, collected as 2D cross-sections in the case of rangefinders. The paradigm of this type of technology changed considerably with the market launch of RGB-D cameras, such as the one developed by PrimeSense, and the subsequent release of the Kinect by Microsoft for the Xbox 360 at the end of 2010. The quality of the depth sensor, given its low cost and its ability to acquire data in real time, is remarkable and made the sensor instantly popular among researchers and enthusiasts. This technological advance gave rise to several tools for development and human interaction with this type of sensor, such as the Point Cloud Library (PCL) [RC11]. This library aims to provide support for all the common building blocks that a 3D application needs, with special emphasis on the processing of n-dimensional point clouds acquired from RGB-D cameras, as well as laser scanners, Time-of-Flight cameras or stereo cameras. In this context, this dissertation evaluates and compares some of the modules and methods of the PCL library for solving problems inherent to the construction and interpretation of maps in unstructured indoor environments, using data from the Kinect. Based on this evaluation, a system architecture is proposed that systematizes the registration of point clouds, corresponding to partial views of the world, into a consistent global model. The results of the evaluation of the PCL library attest to its suitability for solving the proposed problems. Further evidence of this suitability comes from the practical results obtained with the implementation of the proposed system architecture, which shows interesting performance results as well as good prospects for integrating this type of concept and technology into robotic platforms developed within projects of the Laboratório de Sistemas Autónomos (LSA).
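The core operation in the registration pipeline described above is aligning two partial views (point clouds) with a rigid transformation. The dissertation relies on PCL's C++ registration modules; the NumPy/SciPy sketch below of a basic Iterative Closest Point (ICP) loop is only an illustration of that step, with inputs assumed to be (N, 3) arrays and all parameter values chosen arbitrarily.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    # Least-squares rigid transform (Kabsch/SVD) that maps src onto dst.
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp(source, target, iterations=30):
    # Basic ICP: repeatedly match closest points and re-estimate the pose.
    tree = cKDTree(target)
    src = source.copy()
    R_acc, t_acc = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        _, idx = tree.query(src)           # closest-point correspondences
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t                # apply the incremental transform
        R_acc, t_acc = R @ R_acc, R @ t_acc + t
    return R_acc, t_acc                    # pose registering source onto target
```

In practice, pipelines such as the one proposed add feature-based initial alignment and outlier rejection around this inner loop before fusing the aligned views into the global model.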
Abstract:
An optimised version of the Quick, Easy, Cheap, Effective, Rugged and Safe (QuEChERS) method for the simultaneous determination of 14 organochlorine pesticides in carrots was developed using gas chromatography coupled with an electron-capture detector (GC-ECD) and confirmation by gas chromatography tandem mass spectrometry (GC-MS/MS). A citrate-buffered version of QuEChERS was applied for the extraction of the organochlorine pesticides, and for the extract clean-up, primary secondary amine, octadecyl-bonded silica (C18), magnesium sulphate (MgSO4) and graphitized carbon black were used as sorbents. The GC-ECD determination of the target compounds was achieved in less than 20 min. The limits of detection were below the EU maximum residue limits (MRLs) for carrots, 10–50 μg kg−1, although the limit of quantification exceeded 10 μg kg−1 for hexachlorobenzene (HCB). The introduction of a sonication step was shown to improve the recoveries. The overall average recoveries in carrots, at the four tested levels (60, 80, 100 and 140 μg kg−1), ranged from 66 to 111%, with relative standard deviations in the range of 2–15% (n = 3) for all analytes, with the exception of HCB. The method was applied to the analysis of 21 carrot samples from different Portuguese regions; β-HCH was the pesticide most frequently found, at concentrations ranging from below the limit of quantification to 14.6 μg kg−1. Only one sample had a pesticide residue (β-HCH) above the MRL, at 14.6 μg kg−1. This methodology combines the advantages of both QuEChERS and GC-ECD, producing a very rapid, sensitive and reliable procedure which can be applied in routine analytical laboratories.
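For readers less familiar with recovery studies, the figures reported above (per-level percent recoveries and relative standard deviations over n = 3 replicates) reduce to simple arithmetic. The sketch below uses hypothetical replicate values, not data from the study, purely to show the computation.

```python
import numpy as np

def recovery_stats(measured, spike_level):
    """Mean percent recovery and RSD (%) for replicate spiked samples."""
    rec = 100.0 * np.asarray(measured, dtype=float) / spike_level
    return rec.mean(), 100.0 * rec.std(ddof=1) / rec.mean()

# Hypothetical replicate results (µg/kg) for a 100 µg/kg spike, n = 3
mean_recovery, rsd = recovery_stats([92.1, 95.4, 90.8], 100.0)
print(f"recovery = {mean_recovery:.1f} %, RSD = {rsd:.1f} %")
```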
Abstract:
Celiac disease (CD) is a gluten-induced autoimmune enteropathy characterized by the presence of antibodies against gliadin (AGA) and anti-tissue transglutaminase (anti-tTG) antibodies. A disposable electrochemical dual immunosensor for the simultaneous detection of IgA- and IgG-type AGA and anti-tTG antibodies in real patient samples is presented. The proposed immunosensor is based on a dual screen-printed carbon electrode, with two working electrodes, nanostructured with a carbon–metal hybrid system that serves as the transducer surface. The immunosensing strategy consists of the immobilization of gliadin and tTG (i.e. the CD-specific antigens) on the nanostructured electrode surface. The electrochemical detection of the human antibodies present in the assayed serum samples was carried out through the antigen–antibody interaction and recorded using alkaline phosphatase-labelled anti-human antibodies, with a mixture of 3-indoxyl phosphate and silver ions as the substrate. The analytical signal was based on the anodic redissolution of the enzymatically generated silver by cyclic voltammetry. The results obtained were corroborated with commercial ELISA kits, indicating that the developed sensor can be a good alternative to traditional methods, allowing a decentralization of the analyses towards a point-of-care strategy.
Abstract:
This paper focuses on evaluating the usability of an Intelligent Wheelchair (IW) in both real and simulated environments. The wheelchair is controlled at a high level by a flexible multimodal interface, using voice commands, facial expressions, head movements and a joystick as its main inputs. A quasi-experimental design was applied, including a deterministic sample with a questionnaire that enabled the application of the System Usability Scale. The subjects were divided into two independent samples: 46 individuals performed the experiment with an Intelligent Wheelchair in a simulated environment (28 using the different commands in a sequential way and 18 free to choose the command), and 12 individuals performed the experiment with a real IW. The main conclusion of this study is that the usability of the Intelligent Wheelchair in a real environment is higher than in the simulated environment. However, there was no statistical evidence of differences between the real and simulated wheelchairs in terms of safety and control. Moreover, most users considered the multimodal way of driving the wheelchair very practical and satisfactory. Thus, it may be concluded that the multimodal interface enables very easy and safe control of the IW in both simulated and real environments.
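The usability comparison rests on the System Usability Scale score computed from the questionnaire. As a reference for how that score is obtained (this is the standard SUS scoring rule, not code from the study), a minimal sketch:

```python
def sus_score(responses):
    """Standard SUS scoring: ten 1-5 Likert answers mapped to a 0-100 score."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten answers on a 1-5 scale")
    odd = sum(r - 1 for r in responses[0::2])    # items 1, 3, 5, 7, 9
    even = sum(5 - r for r in responses[1::2])   # items 2, 4, 6, 8, 10
    return 2.5 * (odd + even)

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # hypothetical respondent -> 85.0
```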
Abstract:
We perform a comparison between the fractional iteration and decomposition methods applied to the wave equation on a Cantor set. The operators are taken in the local sense. The results illustrate the significant features of the two methods, both of which are very effective and straightforward for solving differential equations with local fractional derivatives.
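For context, the wave equation on a Cantor set is commonly written in the local fractional calculus literature in the form below (a representative form, not necessarily the exact statement used in the paper), where the derivatives of order \(2\alpha\) are local fractional derivatives:

\[
\frac{\partial^{2\alpha} u(x,t)}{\partial t^{2\alpha}} \;=\; a^{2\alpha}\,\frac{\partial^{2\alpha} u(x,t)}{\partial x^{2\alpha}},
\qquad 0 < \alpha \le 1 .
\]

Both the fractional iteration and the decomposition methods construct successive approximations of the non-differentiable solution; the comparison in the paper concerns how those approximations are generated.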
Abstract:
The investigation, which employed the action research method (qualitative analysis), was divided into four phases. In phases 1–3 the participants were six double bass students at Nossa Senhora do Cabo Music School. Pilot exercises in creativity were followed by broader and more ambitious projects. In phase 4 the techniques were tested and amplified during a summer course for twelve double bass students at Santa Cecilia College.
Abstract:
OBJECTIVE To propose a cut-off for the World Health Organization Quality of Life-Bref (WHOQOL-bref) as a predictor of quality of life in older adults. METHODS Cross-sectional study with 391 older adults registered in the Northwest Health District in Belo Horizonte, MG, Southeastern Brazil, between October 8, 2010 and May 23, 2011. The older adults' quality of life was measured using the WHOQOL-bref. The analysis was based on two extreme, simultaneously defined groups according to perceived quality of life and satisfaction with health (good/satisfactory quality of life – good or very good self-reported quality of life and being satisfied or very satisfied with health – G5; and poor/very poor quality of life – poor or very poor self-reported quality of life and feeling dissatisfied or very dissatisfied with health – G6). A receiver operating characteristic (ROC) curve was created to assess the diagnostic ability of different cut-off points of the WHOQOL-bref. RESULTS ROC curve analysis indicated a critical value of 60 as the optimal cut-off point for assessing perceived quality of life and satisfaction with health. The area under the curve was 0.758, with a sensitivity of 76.8% and specificity of 63.8% for a cut-off of ≥ 60 for overall quality of life (G5), and a sensitivity of 95.0% and specificity of 54.4% for a cut-off of < 60 for overall quality of life (G6). CONCLUSIONS Diagnostic interpretation of the ROC curve revealed that a cut-off of < 60 for overall quality of life has excellent sensitivity and negative predictive value for screening older adults with probably worse quality of life and dissatisfaction with health.
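The cut-off selection described above corresponds to choosing the threshold on the ROC curve that best separates the two extreme groups, for example by maximizing Youden's J = sensitivity + specificity − 1. A minimal sketch using scikit-learn, with hypothetical labels and scores rather than the study data:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def best_cutoff(y, score):
    """y: 1 = good/satisfactory QoL group, 0 = poor/very poor group; score: WHOQOL-bref overall score."""
    fpr, tpr, thresholds = roc_curve(y, score)
    j = tpr - fpr                               # Youden's J statistic
    k = int(np.argmax(j))
    return thresholds[k], tpr[k], 1.0 - fpr[k], roc_auc_score(y, score)

# Hypothetical data for illustration only
y = np.array([1, 1, 1, 0, 0, 1, 0, 1, 0, 0])
score = np.array([72, 65, 61, 55, 48, 59, 62, 80, 40, 52])
cutoff, sensitivity, specificity, auc = best_cutoff(y, score)
```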
Abstract:
Optimization problems arise in science, engineering, economy, etc., and we need to find the best solution for each situation. The methods used to solve these problems depend on several factors, including the amount and type of accessible information, the algorithms available for solving them, and, obviously, the intrinsic characteristics of the problem. There are many kinds of optimization problems and, consequently, many kinds of methods to solve them. When the involved functions are nonlinear and their derivatives are not known or are very difficult to calculate, suitable methods become scarcer. Such functions are frequently called black box functions. To solve such problems without constraints (unconstrained optimization), we can use direct search methods, which require neither derivatives nor approximations of them. But when the problem has constraints (nonlinear programming problems) and, additionally, the constraint functions are black box functions, it is much more difficult to find the most appropriate method. Penalty methods can then be used. They transform the original problem into a sequence of other problems, derived from the initial one, all without constraints. This sequence of unconstrained problems can then be solved using the methods available for unconstrained optimization. In this chapter, we present a classification of some of the existing penalty methods and describe some of their assumptions and limitations. These methods allow solving optimization problems with continuous, discrete, and mixed constraints, without requiring continuity, differentiability, or convexity. Thus, penalty methods can be used as the first step in the resolution of constrained problems, by means of methods typically used for unconstrained problems. We also discuss a new class of penalty methods for nonlinear optimization, which adjust the penalty parameter dynamically.
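As a concrete instance of the transformation described (the classical quadratic penalty, just one member of the class surveyed in the chapter), a constrained problem \(\min f(x)\) subject to \(g_i(x) \le 0\) and \(h_j(x) = 0\) is replaced by the sequence of unconstrained problems

\[
\min_{x}\; \Phi_k(x) \;=\; f(x) \;+\; \mu_k \left[\, \sum_i \max\bigl(0,\, g_i(x)\bigr)^2 \;+\; \sum_j h_j(x)^2 \,\right],
\qquad \mu_k \to \infty ,
\]

whose solutions approach a solution of the original problem as the penalty parameter grows. The dynamic penalty methods mentioned at the end of the abstract differ mainly in how \(\mu_k\) is updated between subproblems.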
Abstract:
The characteristics of carbon fibre reinforced laminates have widened their use, from aerospace to domestic appliances. A common requirement is the need for drilling for assembly purposes. It is known that a drilling process that reduces the drill thrust force can decrease the risk of delamination. In this work, delamination assessment methods based on radiographic data are compared and correlated with mechanical test results (bearing test).
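One widely used radiography-based metric of the kind compared in such studies is the delamination factor, given here only as background (the paper compares several assessment methods, not necessarily this one alone):

\[
F_d \;=\; \frac{D_{\max}}{D_0},
\]

where \(D_{\max}\) is the maximum diameter of the delaminated zone measured on the radiograph and \(D_0\) is the nominal hole (drill) diameter.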
Abstract:
Constrained and unconstrained nonlinear optimization problems often appear in many engineering areas. In some of these cases it is not possible to use derivative-based optimization methods, because the objective function is not known, is too complex, or is non-smooth. In these cases, direct search methods may be the most suitable optimization methods. An Application Programming Interface (API) including some of these methods was implemented using Java technology. This API can be accessed either by applications running on the same computer where it is installed or remotely through a LAN or the Internet, using web services. From the engineering point of view, the information needed from the API is the solution to the provided problem. From the point of view of optimization researchers, however, the solution alone is not enough: additional information about the iterative process is also useful, such as the number of iterations, the value of the solution at each iteration, the stopping criteria, etc. This paper presents the features added to the API that give users access to this iterative process data.
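The kind of iterative-process information the API exposes can be illustrated with a toy derivative-free routine. The sketch below is not the API itself (its interface is not reproduced here) but a minimal coordinate/compass search that records, per iteration, the data listed above: the iterate, its objective value, the step size, and the stopping criterion that fired.

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=200):
    """Derivative-free compass search that also returns the iteration history."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    history = []
    for it in range(max_iter):
        improved = False
        for d in np.vstack([np.eye(len(x)), -np.eye(len(x))]):   # poll +/- each axis
            trial = x + step * d
            f_trial = f(trial)
            if f_trial < fx:
                x, fx, improved = trial, f_trial, True
                break
        if not improved:
            step *= 0.5                                          # shrink the mesh on failure
        history.append({"iter": it, "x": x.copy(), "f": fx, "step": step})
        if step < tol:
            return x, fx, history, "step size below tolerance"
    return x, fx, history, "maximum iterations reached"

# Toy usage: minimize a smooth function without derivatives
x, fx, hist, stop = compass_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2, [0.0, 0.0])
```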
Abstract:
In nonlinear optimization, penalty and barrier methods are normally used to solve constrained problems. There are several penalty/barrier methods, and they are used in many areas, from engineering to economics, through biology, chemistry and physics, among others. In these areas, optimization problems often arise in which the involved functions (objective and constraints) are non-smooth and/or their derivatives are not known. In this work, some penalty/barrier functions are tested and compared, using derivative-free methods, namely direct search methods, to solve the internal subproblems. This work is part of a larger project involving the development of an Application Programming Interface that implements several optimization methods, to be used in applications that need to solve constrained and/or unconstrained nonlinear optimization problems. Besides its use in applied mathematics research, it is also intended for use in engineering software packages.
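A minimal sketch of the combination tested in this work: a quadratic penalty function whose unconstrained subproblems are solved with a derivative-free direct search routine. Here SciPy's Nelder-Mead stands in for the methods implemented in the API, and the problem and parameter values are illustrative only.

```python
import numpy as np
from scipy.optimize import minimize

def penalty_method(f, ineq, x0, mu=1.0, growth=10.0, outer=8):
    """Quadratic penalty: solve a sequence of unconstrained problems with Nelder-Mead."""
    x = np.asarray(x0, dtype=float)
    for _ in range(outer):
        def phi(z, mu=mu):
            violation = np.array([max(0.0, g(z)) for g in ineq])
            return f(z) + mu * np.sum(violation ** 2)
        x = minimize(phi, x, method="Nelder-Mead").x   # derivative-free inner solver
        mu *= growth                                   # tighten the penalty
    return x

# Illustrative problem: minimize (x-2)^2 + (y-1)^2 subject to x + y <= 2
sol = penalty_method(
    f=lambda v: (v[0] - 2) ** 2 + (v[1] - 1) ** 2,
    ineq=[lambda v: v[0] + v[1] - 2],        # constraints written in g(v) <= 0 form
    x0=[0.0, 0.0],
)
print(sol)   # should approach (1.5, 0.5)
```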
Abstract:
Epidemiological studies have shown the effect of diet on the incidence of chronic diseases; however, proper planning, design, and statistical modeling are necessary to obtain precise and accurate food consumption data. Evaluation methods used for short-term assessment of food consumption of a population, such as 24-hour food intake recalls or food diaries, can be affected by random errors or biases inherent to the method. Statistical modeling is used to handle random errors, whereas proper design and sampling are essential for controlling biases. The present study aimed to analyze potential biases and random errors and determine how they affect the results. We also aimed to identify ways to prevent them and/or to use statistical approaches in epidemiological studies involving dietary assessments.
Abstract:
OBJECTIVE To analyze the effect of air pollution and temperature on mortality due to cardiovascular and respiratory diseases. METHODS We evaluated the isolated and synergistic effects of temperature and particulate matter with aerodynamic diameter < 10 µm (PM10) on the mortality of individuals > 40 years old due to cardiovascular disease and that of individuals > 60 years old due to respiratory diseases in São Paulo, SP, Southeastern Brazil, between 1998 and 2008. Three methodologies were used to evaluate the isolated association: time-series analysis using a Poisson regression model, bidirectional case-crossover analysis matched by period, and case-crossover analysis matched by the confounding factor, i.e., average temperature or pollutant concentration. The graphical representation of the response surface, generated by the interaction term between these factors added to the Poisson regression model, was interpreted to evaluate the synergistic effect of the risk factors. RESULTS No differences were observed between the results of the case-crossover and time-series analyses. The percentage change in the relative risk of cardiovascular and respiratory mortality was 0.85% (0.45;1.25) and 1.60% (0.74;2.46), respectively, due to an increase of 10 μg/m3 in the PM10 concentration. The pattern of correlation of the temperature with cardiovascular mortality was U-shaped and that with respiratory mortality was J-shaped, indicating an increased relative risk at high temperatures. The values for the interaction term indicated a higher relative risk for cardiovascular and respiratory mortalities at low temperatures and high temperatures, respectively, when the pollution levels reached approximately 60 μg/m3. CONCLUSIONS The positive association estimated in the Poisson regression model for pollutant concentration is not confounded by temperature, and the effect of temperature is not confounded by the pollutant levels in the time-series analysis. Simultaneous exposure to different levels of environmental factors can create synergistic effects that are as harmful as those caused by extreme concentrations.
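The time-series model with the interaction term can be sketched as follows. The data here are synthetic and the real analysis also controls for seasonality, day of week and other covariates (omitted for brevity), so this only illustrates the structure of the Poisson regression and of the percentage-change calculation, not the study's results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic daily series: PM10 (µg/m3), mean temperature (°C), daily death counts.
rng = np.random.default_rng(0)
n = 1500
df = pd.DataFrame({"pm10": rng.gamma(4.0, 10.0, n), "temp": rng.normal(20.0, 4.0, n)})
lam = np.exp(2.0 + 0.0008 * df.pm10 + 0.01 * df.temp + 0.0001 * df.pm10 * df.temp)
df["deaths"] = rng.poisson(lam)

# Poisson regression with a PM10 x temperature interaction term.
fit = smf.glm("deaths ~ pm10 * temp", data=df, family=sm.families.Poisson()).fit()

# Percentage change in relative risk per 10 µg/m3 of PM10, evaluated at the mean temperature.
beta = fit.params["pm10"] + fit.params["pm10:temp"] * df["temp"].mean()
print(100.0 * (np.exp(10.0 * beta) - 1.0))
```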
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Environmental Engineering.
Abstract:
OBJECTIVE: To analyze whether demographic and socioeconomic variables, as well as percutaneous coronary intervention, are associated with the use of medicines for secondary prevention of acute coronary syndrome. METHODS: In this cohort study, we included 138 patients with acute coronary syndrome, aged 30 years or more, of both sexes. The data were collected at hospital discharge and after six and twelve months. The outcome of the study was the simultaneous use of the medicines recommended for secondary prevention of acute coronary syndrome: platelet antiaggregant, beta-blockers, statins and angiotensin-converting-enzyme inhibitor or angiotensin receptor blocker. The independent variables were: sex, age, education in years of schooling, monthly income in tertiles and percutaneous coronary intervention. We described the prevalence of use of each group of medicines with their 95% confidence intervals, as well as the simultaneous use of the four medicines, in all analyzed periods. In the crude analysis, we tested the association of the outcome with the independent variables for each period using the chi-square test. The adjusted analysis was carried out using Poisson regression. RESULTS: More than a third of patients (36.2%; 95%CI 28.2;44.3) had the four medicines prescribed simultaneously at the moment of discharge. We did not observe any differences in the prevalence of use between the two follow-up periods. The most prescribed class of medicines at discharge was platelet antiaggregants (91.3%). In the crude analysis, the demographic and socioeconomic variables were not associated with the outcome in any of the three periods. CONCLUSIONS: The prevalence of simultaneous use of the medicines at discharge and at the follow-ups points to the under-utilization of this therapy in clinical practice. Intervention strategies are needed to improve the quality of care given to patients beyond hospital discharge, a critical point of transition in care.