963 results for Standardization in robotics
Abstract:
Diabetes mellitus occurs in two forms: insulin-dependent (IDDM, formerly called juvenile type) and non-insulin-dependent (NIDDM, formerly called adult type). Prevalence figures for NIDDM from around the world show that all societies and all races are affected; although uncommon in some populations (0.4%), it is common (10%) or very common (40%) in others (Tables 1 and 2). In Mexican-Americans in particular, prevalence rates (7-10%) are intermediate between those in Caucasians (1-2%) and Amerindians (35%). Information about the distribution of the disease, and the identification of high-risk groups for developing glucose intolerance or its vascular manifestations through the study of genetic markers, will help to clarify and solve some of the related problems from both the public health and the genetic point of view. This research was designed to examine two general areas in relation to NIDDM. The first aims to determine the prevalence of polymorphic genetic markers in two groups distinguished by the presence or absence of diabetes, and to observe whether there is any genetic marker-disease association (univariate analysis using two-by-two tables and logistic regression to study the individual and joint effects of the different variables). The second deals with the effect of genetic differences on the variation in fasting plasma glucose and percent glycosylated hemoglobin (HbA1) (analysis of covariance for each marker, using age and sex as covariates). The results from the first analysis were not statistically significant at the corrected p value of 0.003, given the number of tests that were performed. From the analysis of covariance of all the markers studied, only Duffy and phosphoglucomutase were statistically significant, but they were poor predictors, given that the amount of variation in glycosylated hemoglobin they explain is very small. Determining the polygenic component of a chronic disease is not an easy task. This study confirms that a larger, random or representative sample is needed to detect differences in the prevalence of a marker for association studies and in the genetic contribution to the variation in glucose and glycosylated hemoglobin. The importance that ethnic homogeneity of the groups studied and standardization of the methodology will have on the results has been stressed.
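The corrected threshold of 0.003 mentioned above is consistent with a Bonferroni-type multiple-testing adjustment; as a hedged illustration (the abstract does not state the exact number of tests, so m here is inferred from 0.05/0.003):

$$\alpha_{\text{corrected}} = \frac{\alpha}{m} \approx \frac{0.05}{17} \approx 0.003$$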
Abstract:
New actuation technology based on functional or "smart" materials has opened new horizons for robotic actuation systems. Materials such as piezoelectric fiber composites, electro-active polymers and shape memory alloys (SMAs) are being investigated as promising alternatives to standard servomotor technology [52]. This paper focuses on the use of SMAs for building muscle-like actuators. SMAs are extremely cheap, easily available commercially and have the advantage of working at low voltages. The use of SMAs provides a very interesting alternative to the mechanisms used by conventional actuators, making it possible to drastically reduce the size, weight and complexity of robotic systems. In fact, their large force-to-weight ratio, long life cycles, negligible volume, sensing capability and noise-free operation make it possible to use this technology to build a new class of actuation devices. Nonetheless, high power consumption and low bandwidth limit this technology to certain kinds of applications. This presents a challenge that must be addressed from both the materials and the control perspectives in order to overcome these drawbacks. Here, the latter is tackled. It has been demonstrated that suitable control strategies and proper mechanical arrangements can dramatically improve SMA performance, mostly in terms of actuation speed and limit cycles.
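As a minimal sketch of the kind of control strategy alluded to above (the paper's actual controller is not described in the abstract), the class below closes a position loop around an SMA wire with a PID law and clamps the heating command; all gains, limits and names are illustrative placeholders, not values from the paper.

```python
# Illustrative sketch: closed-loop position control of an SMA wire actuator.
# A PID controller modulates the heating power (PWM duty cycle); the command
# is clamped to protect the wire from overheating. All gains and limits are
# hypothetical placeholders, not values from the paper.

class SMAPositionController:
    def __init__(self, kp=8.0, ki=0.5, kd=0.2, duty_max=0.6):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.duty_max = duty_max      # cap heating power to limit overheating
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_mm, measured_mm, dt):
        error = target_mm - measured_mm
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        duty = self.kp * error + self.ki * self.integral + self.kd * derivative
        # SMA wires contract when heated but cannot be actively cooled, so
        # negative commands are clipped to zero (passive cooling only).
        return min(max(duty, 0.0), self.duty_max)
```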
Abstract:
In recent decades, there has been increasing interest in systems comprised of several autonomous mobile robots, and as a result there has been substantial development in the field of Artificial Intelligence, especially in Robotics. Several studies in the literature focus on the creation of intelligent machines and devices capable of imitating the functions and movements of living beings. Multi-Robot Systems (MRS) can often deal with tasks that are difficult, if not impossible, for a single robot to accomplish. In the context of MRS, one of the main challenges is the need to control, coordinate and synchronize the operation of multiple robots to perform a specific task. This requires the development of new strategies and methods that allow us to obtain the desired system behavior in a formal and concise way. This PhD thesis studies the coordination of multi-robot systems and, in particular, addresses the problem of the distribution of heterogeneous multi-tasks. The main interest in these systems is to understand how, from simple rules inspired by the division of labor in social insects, a group of robots can perform tasks in an organized and coordinated way. We are mainly interested in truly distributed or decentralized solutions in which the robots themselves, autonomously and individually, select a particular task so that all tasks are optimally distributed. In general, to distribute the multi-tasks among a team of robots, the robots have to synchronize their actions and exchange information. Under this approach we can speak of multi-task selection instead of multi-task assignment, meaning that the agents or robots select the tasks instead of being assigned a task by a central controller. The key elements in these algorithms are the estimation of the stimuli and the adaptive update of the thresholds: each robot performs this estimate locally, depending on the load or the number of pending tasks to be performed. In addition, the results of each approach are evaluated by introducing noise into the number of pending loads, in order to simulate the robots' error in estimating the real number of pending tasks. The main contribution of this thesis is the approach based on self-organization and the division of labor in social insects. An experimental scenario for the coordination problem among multiple robots, the robustness of the approaches and the generation of dynamic tasks are presented and discussed. The particular issues studied are the following (a sketch of the threshold mechanism appears after this abstract):
Threshold models: experiments conducted to test the response threshold model, with the objective of analyzing the system performance index for the problem of the distribution of heterogeneous multi-tasks in multi-robot systems; additive noise was introduced into the number of pending loads, and dynamic tasks were generated over time.
Learning automata methods: experiments to test the learning automata-based probabilistic algorithms. The approach was tested by evaluating the system performance index with additive noise and with dynamic task generation, for the same problem of the distribution of heterogeneous multi-tasks in multi-robot systems.
Ant colony optimization: experiments to test the ant colony optimization-based deterministic algorithms for achieving the distribution of heterogeneous multi-tasks in multi-robot systems. In these experiments, the system performance index is evaluated by introducing additive noise and dynamic task generation over time.
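As a hedged sketch of the response threshold mechanism described above, following the classical division-of-labor model from social insects (engagement probability rising with stimulus and falling with a per-robot threshold, thresholds adapting with experience); the variable names, exponent and noise model are illustrative assumptions, not the thesis's exact formulation:

```python
import random

def engagement_probability(stimulus, threshold, n=2):
    # Classical response threshold rule: the probability of engaging in a task
    # grows with its stimulus and falls with the robot's threshold for it.
    denom = stimulus**n + threshold**n
    return stimulus**n / denom if denom > 0 else 0.0

def select_task(robot_thresholds, pending_loads, noise_sigma=0.0):
    # Each robot estimates stimuli locally from the (possibly noisy) number of
    # pending loads, then probabilistically selects a task on its own.
    for task, load in pending_loads.items():
        stimulus = max(0.0, load + random.gauss(0.0, noise_sigma))
        if random.random() < engagement_probability(stimulus, robot_thresholds[task]):
            return task
    return None  # stay idle this cycle

def adapt_thresholds(robot_thresholds, performed_task, xi=0.1, phi=0.05):
    # Adaptive update: performing a task lowers its threshold (specialization),
    # while the thresholds for unperformed tasks rise.
    for task in robot_thresholds:
        if task == performed_task:
            robot_thresholds[task] = max(0.0, robot_thresholds[task] - xi)
        else:
            robot_thresholds[task] += phi
```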
Abstract:
This paper addresses initial efforts to develop a navigation system for ground vehicles supported by visual feedback from a mini aerial vehicle. A vision-based algorithm computes the ground vehicle's pose in the world frame, as well as possible obstacles within the ground vehicle's pathway. Relying on that information, a navigation and obstacle avoidance system re-plans the ground vehicle's trajectory, ensuring an optimal detour. Finally, some experiments are presented employing an unmanned ground vehicle (UGV) and a low-cost mini unmanned aerial vehicle (UAV).
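The abstract does not detail the pose computation, but one common way to realize it can be sketched as follows: detect a fiducial marker on the UGV in the aerial image and map its corners to the ground plane through a known image-to-world homography. The function below uses OpenCV; the marker convention and the homography input are assumptions of this sketch, not the paper's interface.

```python
import numpy as np
import cv2

def ugv_pose_from_aerial(marker_corners_px, H_img_to_world):
    """Estimate the UGV's planar pose (x, y, heading) in the world frame.

    marker_corners_px: 4x2 array of the UGV marker's corners in the aerial
    image (e.g. from a fiducial detector). H_img_to_world: 3x3 homography
    mapping image pixels to ground-plane world coordinates, calibrated
    offline from known ground reference points (an assumption here).
    """
    pts = marker_corners_px.reshape(-1, 1, 2).astype(np.float32)
    world = cv2.perspectiveTransform(pts, H_img_to_world).reshape(-1, 2)
    center = world.mean(axis=0)
    # Heading from the marker's front edge (corners 0 and 1 by convention).
    front = world[1] - world[0]
    heading = np.arctan2(front[1], front[0])
    return center[0], center[1], heading
```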
Abstract:
We present ARGoS, a novel open source multi-robot simulator. The main design focus of ARGoS is the real-time simulation of large heterogeneous swarms of robots. Existing robot simulators obtain scalability by imposing limitations on their extensibility and on the accuracy of the robot models. By contrast, in ARGoS we pursue a deeply modular approach that allows the user both to easily add custom features and to allocate computational resources where needed by the experiment. A unique feature of ARGoS is the possibility to use multiple physics engines of different types and to assign them to different parts of the environment. Robots can migrate from one engine to another transparently. This feature enables entirely novel classes of optimizations to improve scalability and paves the way for a new approach to parallelism in robotics simulation. Results show that ARGoS can simulate about 10,000 simple wheeled robots 40% faster than real-time.
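ARGoS itself is written in C++ and configured via XML, but the space-partitioning idea behind its multi-engine feature can be illustrated with a toy sketch: each physics engine owns a region of the arena, and each robot is handed, transparently, to whichever engine owns its current position. Everything below is an illustrative assumption, not ARGoS code or its API.

```python
class PhysicsEngine:
    # Toy stand-in: each engine owns a slab of the arena along x and
    # integrates only the robots currently inside that region.
    def __init__(self, name, x_min, x_max):
        self.name, self.x_min, self.x_max = name, x_min, x_max
        self.robots = set()

    def owns(self, x):
        return self.x_min <= x < self.x_max

def assign_robots(engines, robot_positions):
    # Robots migrate transparently: each tick, every robot is assigned to the
    # engine whose region contains it, so cheap and accurate engines can be
    # mixed within a single experiment.
    for robot, (x, _y) in robot_positions.items():
        for engine in engines:
            if engine.owns(x):
                engine.robots.add(robot)
            else:
                engine.robots.discard(robot)

engines = [PhysicsEngine("fast-2d", 0.0, 5.0), PhysicsEngine("accurate-3d", 5.0, 10.0)]
assign_robots(engines, {"bot1": (1.2, 3.0), "bot2": (7.8, 0.5)})
```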
Abstract:
The semantic localization problem in robotics consists of determining the place where a robot is located by means of semantic categories. The problem is usually addressed as a supervised classification process, where the input data correspond to robot perceptions and the classes to semantic categories, such as kitchen or corridor. In this paper we propose a framework, implemented in the PCL library, which provides a set of valuable tools to easily develop and evaluate semantic localization systems. The implementation includes the generation of 3D global descriptors following a Bag-of-Words approach, which allows fixed-dimensionality descriptors to be generated from any combination of keypoint detector and feature extractor. The framework has been designed, structured and implemented to be easily extended with different keypoint detectors and feature extractors, as well as classification models. The proposed framework has also been used to evaluate the performance of a set of already implemented descriptors when used as input for a specific semantic localization system. The results obtained are discussed, paying special attention to the internal parameters of the BoW descriptor generation process. Moreover, we also review the combination of some keypoint detectors with different 3D descriptor generation techniques.
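The framework builds on PCL (C++), but the Bag-of-Words step it implements can be sketched generically: local features are quantized against a learned codebook, yielding a fixed-dimensionality histogram regardless of how many keypoints the detector returns. The sketch below uses scikit-learn and is illustrative of the technique, not the framework's API.

```python
import numpy as np
from sklearn.cluster import KMeans

def train_codebook(all_local_features, vocab_size=128):
    # Cluster local feature vectors from training scans into "visual words".
    return KMeans(n_clusters=vocab_size, n_init=10).fit(np.vstack(all_local_features))

def bow_descriptor(local_features, codebook):
    # Quantize each local feature to its nearest word and histogram the counts:
    # a fixed-size global descriptor, whatever detector/extractor produced the
    # local features.
    words = codebook.predict(local_features)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)  # L1-normalize
```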
Abstract:
In the long term, productivity, and especially productivity growth, is a necessary condition for the survival of a farm. This paper focuses on the technology choice of a dairy farm, i.e. the choice between a conventional and an automatic milking system. Its aim is to reveal the extent to which economic rationality explains investing in new technology. The adoption of robotics is further linked to farm productivity to show how capital-intensive technology has affected the overall productivity of milk production. The empirical analysis applies a probit model and an extended Cobb-Douglas-type production function to a Finnish farm-level dataset for the years 2000–10. The results show that very few economic factors on a dairy farm or in its economic environment can be identified as affecting the switch to automatic milking; existing machinery capital and investment allowances are among the significant factors. The results also indicate that the probability of investing in robotics responds elastically to a change in investment aids: an increase of 1% in aid would generate an increase of 2% in the probability of investing. Despite the presence of non-economic incentives, the switch to robotic milking is shown to promote productivity development on dairy farms. No productivity growth is observed on farms that keep conventional milking systems, whereas farms with robotic milking have a growth rate of 8.1% per year; the mean rate for farms that switch to robotic milking is 7.0% per year. These rates represent great progress in productivity growth, given that the sector average has been around 2% per year during the past two decades. In conclusion, investments in new technology, as well as investment aids to boost such investments, are needed in low-productivity areas where investments in new technology still have great potential to increase productivity, and thus profitability and competitiveness, in the long run.
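As a hedged sketch of the two models named above (the paper's exact specifications and covariate sets are not given in the abstract), the probit adoption equation and an extended Cobb-Douglas production function take the generic forms:

$$\Pr(\text{adopt}_{it}=1) = \Phi(\mathbf{x}_{it}'\boldsymbol{\beta}), \qquad \ln y_{it} = \alpha + \sum_j \beta_j \ln x_{jit} + \boldsymbol{\gamma}'\mathbf{z}_{it} + \varepsilon_{it}$$

where \(\Phi\) is the standard normal CDF, \(y\) is milk output, the \(x_j\) are conventional inputs, and \(\mathbf{z}\) collects shifters such as the milking-technology indicator; the extension terms in \(\mathbf{z}\) are an assumption of this sketch.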
Abstract:
The last few decades of standardization have been characterized by significant changes: the number of standards has increased dramatically, and the processes of standardization have also changed considerably. At the same time, the amount of research concerned with the economic impact of standardization has multiplied, owing primarily to the boom in the literature on network externalities. Unlike the mainstream, this paper places standardization within the theory of transaction cost economics (TCE). Although there are earlier papers concerned with the relationship between standards and transaction costs, those studies focus on the impact of standards on transaction costs. In contrast, this paper lays emphasis on identifying the impact of transaction costs on standardization. The study aims to provide a theoretical basis within which the topic can be analyzed comprehensively. The main research question is: what determines which coordination mechanism is best suited to carrying out standardization?
Abstract:
During the years 1890–1920, the public school education system established itself as the medium for transmitting American values, knowledge and culture. This study described and explained why some individuals were destined to fail, and others to succeed, in America's public schools. The exploratory questions guiding this study were: What elements constituted society's perspective on whom it should educate during the years 1890–1920? What variables influenced society's perspective on whom it should educate during those years? After examining these issues, educators will have a better understanding and awareness of why certain educational practices are currently implemented and will be able to critically evaluate which ones should be continued. The methodology chosen was historical, and the approach for analyzing the data was coding. The information was coded in order to identify themes, concepts and ideas among the documents and as portrayed in the literature. The first step was to seek out patterns and then to write out words and phrases to represent these topics; these phrases were then attributed to networks. The data indicated that public schools during this era were designed to conform and assimilate the new immigrants and factory workers in an efficient and standardized manner. Efficiency and standardization in production became the American way for government, commerce, personal lives and the school. Many different approaches to education emerged during this time period, specifically ones that emphasized individuality, but only those that paralleled the ideology of efficiency, standardization and conformity were adopted. Students who were unable to conform to society's criteria for success were penalized in the classroom, on IQ examinations and on national standardized exams. This study was illuminative in that it explained the root cause of why some individuals were meant to succeed while others were penalized in the classroom. Future studies connecting standardized assessments and learning styles are suggested.
Abstract:
UNESCO's approval of the Convention on the Protection and Promotion of the Diversity of Cultural Expressions (UNESCO, 2005) has been an important element in catalyzing attempts to measure the diversity of the cultural industries (UIS, 2011). Within this framework, this article analyzes the relations between the music and radio industries in Spain from a critical perspective, through the analysis of available data on the supply and consumption of recorded music (sales lists, radio-formula lists, the characteristics of the phonographic and radio markets) at key moments marked by the emergence of new formats and devices (CDs, MP3, the Internet). The main goal of this work is to study the evolution of the Spanish record market in terms of diversity from the end of the 1970s to the present, through the study of radio music hit lists, the business structure of the phonographic and radio sectors, and top-selling phonograms.
Abstract:
This paper reviews current research at the authors' institutions to illustrate how mobile robotics and related technologies can be used to enhance the economic exploitation, control, protection and social impact of cultural heritage. Robots make it possible to experience online, from remote locations, tours of museums, archaeological areas and monuments. These solutions avoid travelling costs, increase the number of simultaneous visitors beyond current limits, and prevent possible damage arising from over-exploitation of fragile environments. The same tools can be used for the exploration and monitoring of cultural artifacts located in difficult-to-reach or dangerous areas; examples are provided by the use of underwater robots in the exploration of deeply submerged archaeological areas. Besides, technologies commonly employed in robotics can be used to help explore, monitor and preserve cultural artifacts, as illustrated by the development of procedures for data acquisition and mapping and by object recognition and monitoring algorithms.
Abstract:
Despite the frequent use of stepping motors in robotics, automation and a variety of precision instruments, they can hardly be found in rotational viscometers. This paper proposes the use of a stepping motor to drive a conventional constant-shear-rate laboratory rotational viscometer, avoiding the use of a velocity sensor and gearbox and thus simplifying the instrument design. To investigate this driving technique, a commercial rotational viscometer has been adapted to be driven by a bipolar stepping motor, which is controlled via a personal computer. Special circuitry has been added to microstep the stepping motor at selectable step sizes and to condition the torque signal. Tests have been carried out using the prototype to produce flow curves for two standard Newtonian fluids (920 and 12 560 mPa·s, both at 25 °C). The flow curves have been obtained by employing several distinct microstep sizes within the shear rate range of 50-500 s⁻¹. The results indicate the feasibility of the proposed driving technique.
Abstract:
As a consequence of the wide distribution and use of medicinal plants, industries are producing products based on plant species in various pharmaceutical forms, which are commercialized in pharmacies and natural product stores. However, for the vast majority of these products there is no guarantee of effectiveness, safety or quality, which may pose risks to the health of consumers. It is therefore important to establish standardized quality control protocols for phytotherapeutic products. Tinctures of barbatimao from diverse manufacturers are available on the Brazilian market. In order to evaluate the differences in quality among tinctures of barbatimao from four manufacturers, a comparative study of their physico-chemical and phytochemical characteristics was carried out. For the physico-chemical analysis, the pH, density, dry residue and tannin content were evaluated; the phytochemical analysis was performed using thin-layer chromatography. The differences observed in the physico-chemical and phytochemical characteristics evidenced the lack of standardization in the production of these tinctures.
Abstract:
Water absorption by chicken carcasses during the pre-chilling stage of the slaughter line is an important quality characteristic related to final product yield. One way to maintain a product's quality standard is to ensure that the process stages are stable and replicable. By employing Statistical Process Control (SPC), it is possible to achieve stability and process improvements through the reduction of variability. In this context, the objective of this work was to apply control charts, correlation analysis, descriptive statistics, hypothesis tests and multiple linear regression to the slaughter line of a poultry slaughterhouse in order to monitor the variability of water absorption by chicken carcasses after the pre-chilling stage. As a result, it was found that the water absorption content of the chicken carcasses showed high variability, with 10% (8/80) of the carcasses exhibiting water absorption above the 8% limit established by Brazilian legislation. Of the 16 input variables analyzed, those with the greatest impact on water absorption were the carcass retention time in the pre-chiller and the carcass waiting time after the dripping stage. However, the regression model obtained showed low correlation (R² = 0.16), which was attributed to the high variability of the response variable. The descriptive statistics showed that the input variables also exhibited high variability, with coefficients of variation between 7.95% and 63.5%. Analysis of the individual measurement and moving range control charts showed that 15 of the 16 input variables were out of statistical control, as was the response variable. Based on the previously prepared flowchart and description of the slaughter line stages, the lack of standardization in the execution of the stages and the absence of quality control procedures for the slaughter line operations were identified as relevant factors that could be associated with the presence of special causes in the process. It was concluded that, in order to reduce the high variability of the variables and eliminate the special causes present, operational adjustments are necessary to obtain a more stable and uniform process, thereby guaranteeing the quality standard of the chicken carcasses with regard to water absorption content.
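A hedged sketch of the individual/moving-range (I-MR) control charts used in the study: the 2.66 and 3.267 factors are the standard Shewhart constants for subgroups of size two (3/d2 with d2 = 1.128, and D4 = 3.267), while the sample data below are placeholders, not the study's measurements.

```python
import numpy as np

def imr_limits(samples):
    # Individuals chart: center = X-bar, limits = X-bar +/- 2.66 * MR-bar
    # (2.66 = 3 / 1.128 for moving ranges of two consecutive points).
    # Moving-range chart: center = MR-bar, UCL = 3.267 * MR-bar, LCL = 0.
    x = np.asarray(samples, dtype=float)
    mr = np.abs(np.diff(x))
    x_bar, mr_bar = x.mean(), mr.mean()
    return {
        "X": (x_bar - 2.66 * mr_bar, x_bar, x_bar + 2.66 * mr_bar),
        "MR": (0.0, mr_bar, 3.267 * mr_bar),
    }

# Example with placeholder water-absorption percentages per carcass:
data = [6.8, 7.4, 8.2, 7.1, 6.5, 9.0, 7.7]
limits = imr_limits(data)
out_of_control = [v for v in data
                  if not limits["X"][0] <= v <= limits["X"][2]]
```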