882 results for Robotics design framework
Abstract:
Among the current trends in children's clothing is apparel that can contribute to children's development and learning through interaction with their clothes. For this child-clothing interaction to occur, design knowledge is needed to guide the garment's construction. This study therefore investigates design principles that can be adopted in the production of clothing that supports children's development, showing how educational notions can be incorporated into garments and examining the relationships and interactions among design, fashion and education. The study approaches this type of garment from the perspective of design, so as to provide insights that can contribute to the creation of the pieces. To that end, we developed a theoretical framework covering the child's universe, communication, clothing and design. As for methodological procedures, a case study was carried out in ten private schools in Bauru, SP, in order to obtain data from experiences with children about their preferences, attitudes and ways of reasoning about the child's world.
Abstract:
The aim of this study was to evaluate the deformation suffered by cantilevered implant-supported fixed prosthesis frameworks cast in silver-palladium alloy and coated with two occlusal veneering materials: acrylic resin or porcelain. Material and Methods: Two strain gauges were bonded to the inferior surface of the silver-palladium framework, and two others were bonded to the occlusal surface of the prosthesis framework covered with ceramic and acrylic resin on each of its two halves. The framework was fixed to a metallic master model and a 35.2 N compression force was applied to the cantilever at 10, 15 and 20 mm from the most distal implant. Measurements of deformation by compression and tension were obtained. A two-way ANOVA was used for individual analysis of the experiment variables, and the Tukey test was used for the interrelation between all the variables (material and distance of force application). Results: Both variables influenced the studied factors (deformation by compression and tension). Conclusion: The ceramic coating provided greater rigidity to the assembly and therefore less distortion compared with the uncoated and resin-coated frameworks. The cantilever arm length also influenced the prosthesis rigidity: the farther from the last implant the load was applied, the higher the deformation.
Abstract:
Bio-physical and bio-chemical systems are often based on nanoscale phenomena in different host environments, and systems that involve many particles can often not be solved explicitly. Instead, a physicist, biologist or chemist has to rely on approximate or numerical methods. For a certain class of systems, called integrable, there exist particular mathematical structures and symmetries that allow an exact and explicit description. Most integrable systems we come across are low-dimensional, for instance a one-dimensional chain of coupled atoms in a DNA molecular system oriented along a particular direction, or existing as a vector in the environment. This theoretical research paper brings a pioneering 'reaction-diffusion' perspective to the DNA-plasma material system through an integrable lattice model approach utilizing quantized functional algebras, with the aim of disseminating these new developments and initiating novel computational and design paradigms.
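As a point of reference for the lattice picture above, the following is a minimal sketch of a classical one-dimensional reaction-diffusion chain (a Fisher-KPP-type model with explicit Euler stepping; this is an illustrative assumption, not the paper's quantized-algebra formulation):

```python
# Toy 1D reaction-diffusion chain: each lattice site i carries a
# concentration u[i] that diffuses to neighbours and reacts locally.
# Model and parameters are illustrative, not the paper's formulation.

def step(u, D=0.1, r=1.0, dt=0.1):
    """One explicit-Euler update of du/dt = D*(u[i+1]-2u[i]+u[i-1]) + r*u*(1-u)."""
    n = len(u)
    return [
        u[i] + dt * (D * (u[(i + 1) % n] - 2 * u[i] + u[(i - 1) % n])
                     + r * u[i] * (1.0 - u[i]))
        for i in range(n)
    ]

def simulate(n=50, steps=200):
    u = [0.0] * n
    u[n // 2] = 1.0          # seed a localized excitation mid-chain
    for _ in range(steps):
        u = step(u)
    return u
```

Starting from a single excited site, the excitation grows logistically and spreads along the chain as a travelling front, the qualitative behaviour a reaction-diffusion lattice model is meant to capture.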
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Key management is a core mechanism to ensure the security of applications and network services in wireless sensor networks. It includes two aspects: key distribution and key revocation. Many key management protocols have been designed specifically for wireless sensor networks; however, most focus on the establishment of the required keys or the removal of compromised keys, and their design does not consider the support of higher-level security applications. When such applications are integrated into sensor networks later, new mechanisms must be designed. In this paper, we propose a security framework, uKeying, for wireless sensor networks. This framework can be easily extended to support many security applications. It includes three components: a security mechanism to provide secrecy for communications in sensor networks, an efficient session key distribution scheme, and a centralized key revocation scheme. The proposed framework does not depend on a specific key distribution scheme and can be used to support many security applications, such as secure group communications. Our analysis shows that the framework is secure, efficient, and extensible. The simulation results also reveal, for the first time, that a centralized key revocation scheme can attain high efficiency.
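To make the two aspects concrete, here is a hypothetical sketch of centralized session-key distribution with a revocation list (class and method names are invented for illustration and are not the uKeying protocol itself):

```python
# Toy base station: derives per-epoch session keys from each node's
# pre-shared master key and keeps a centralized revocation list.
# This is an illustrative assumption, not the uKeying design.
import hmac, hashlib, secrets

class BaseStation:
    def __init__(self):
        self.node_keys = {}      # node_id -> pre-shared master key
        self.revoked = set()     # centralized revocation list

    def register(self, node_id):
        self.node_keys[node_id] = secrets.token_bytes(16)
        return self.node_keys[node_id]

    def revoke(self, node_id):
        """Centralized revocation: one set update bars all future keys."""
        self.revoked.add(node_id)

    def session_key(self, node_id, epoch):
        """Derive a fresh session key per epoch; refuse revoked nodes."""
        if node_id in self.revoked:
            raise PermissionError(f"node {node_id} is revoked")
        master = self.node_keys[node_id]
        return hmac.new(master, f"session:{epoch}".encode(),
                        hashlib.sha256).digest()
```

Because revocation is checked centrally at key-derivation time, removing a compromised node requires no per-node key redistribution, which is one intuition for why a centralized scheme can be efficient.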
Abstract:
Multicommodity flow (MF) problems have a wide variety of applications in areas such as VLSI circuit design and network design, and are therefore very well studied. The fractional MF problems are polynomial-time solvable, while the integer versions are NP-complete. However, exact algorithms for the fractional MF problems have high computational complexity, so approximation algorithms have been explored in the literature to reduce it; combined with the randomized rounding technique, these also yield polynomial-time approximation algorithms for the integer versions. In the design of high-speed networks, such as optical wavelength division multiplexing (WDM) networks, providing survivability carries great significance. Survivability is the ability of the network to recover from failures. It further increases the complexity of network design and presents network designers with more formidable challenges. In this work we formulate the survivable versions of the MF problems. We build approximation algorithms for the survivable multicommodity flow (SMF) problems based on the framework of the approximation algorithms for the MF problems presented in [1] and [2]. We discuss applications of the SMF problems to solve survivable routing in capacitated networks.
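For orientation, the following is a generic sketch of a shortest-path/length-update approximation scheme for maximum multicommodity flow, in the spirit of such frameworks (this generic version is an assumption for illustration, not necessarily the exact algorithms of [1] and [2]):

```python
# Width-style approximation for max multicommodity flow: repeatedly
# route each commodity along a shortest path under edge lengths that
# grow multiplicatively with congestion, then scale for feasibility.
import heapq, math

def shortest_path(adj, length, s, t):
    """Dijkstra over a (u, v) -> length dict; returns edge list or None."""
    dist, prev = {s: 0.0}, {}
    pq = [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, math.inf):
            continue
        for v in adj.get(u, ()):
            nd = d + length[(u, v)]
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if t not in dist:
        return None
    path, v = [], t
    while v != s:
        path.append((prev[v], v))
        v = prev[v]
    return path[::-1]

def max_mcf(edges, commodities, eps=0.1):
    """edges: {(u, v): capacity}; commodities: [(source, sink), ...]."""
    adj = {}
    for (u, v) in edges:
        adj.setdefault(u, []).append(v)
    m = len(edges)
    delta = (1 + eps) * ((1 + eps) * m) ** (-1.0 / eps)
    length = {e: delta / c for e, c in edges.items()}
    flow = 0.0
    while sum(edges[e] * length[e] for e in edges) < 1.0:
        for (s, t) in commodities:
            path = shortest_path(adj, length, s, t)
            if path is None:
                continue
            c = min(edges[e] for e in path)   # bottleneck capacity
            flow += c
            for e in path:                     # congestion penalty
                length[e] *= 1 + eps * c / edges[e]
    # scaling down makes the accumulated flow capacity-feasible
    return flow / math.log((1 + eps) / delta, 1 + eps)
```

On a small graph with two disjoint unit-capacity s-t paths, the scaled flow stays feasible (at most the max flow of 2) while approaching it as eps shrinks.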
Abstract:
Robots are needed to perform important field tasks such as hazardous material clean-up, nuclear site inspection, and space exploration. Unfortunately, their use is not widespread due to long development times and high costs. To make them practical, a modular design approach is proposed: prefabricated modules are rapidly assembled to give a low-cost system for a specific task. This paper describes the modular design problem for field robots and the application of a hierarchical selection process to solve it. Theoretical analysis and an example case study are presented. The theoretical analysis of the modular design problem revealed the large size of the search space and showed the advantages of approaching the design on various levels. The hierarchical selection process applies physical rules to reduce the search space to a computationally feasible size, and a genetic algorithm performs the final search in the greatly reduced space. This process is based on the observation that simple physically based rules can eliminate large sections of the design space and greatly simplify the search. The design process is applied to a duct inspection task. Five candidate robots were developed, two of which were evaluated using detailed physical simulation. It is shown that the more obvious solution is not able to complete the task, while the non-obvious asymmetric design developed by the process is successful.
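The rule-prune-then-search idea can be sketched as follows (the module catalog, physical rule, and fitness scores are invented for illustration; the paper's hierarchical selection is far richer):

```python
# Toy modular robot design: physical rules prune infeasible module
# combinations, then a small genetic search picks among the survivors.
# All module names, rules, and scores here are illustrative assumptions.
import random

MODULES = {
    "locomotion": ["wheels", "tracks", "legs"],
    "arm":        ["none", "2dof", "6dof"],
    "power":      ["battery", "tether"],
}

def passes_rules(design):
    # Example physical rule: in this toy model, legged locomotion
    # draws too much power to pair with a battery.
    if design["locomotion"] == "legs" and design["power"] == "battery":
        return False
    return True

def fitness(design):
    # Invented scoring that loosely favours a duct-inspection robot.
    score = {"tracks": 3, "wheels": 2, "legs": 1}[design["locomotion"]]
    score += {"none": 2, "2dof": 3, "6dof": 1}[design["arm"]]
    score += {"battery": 2, "tether": 1}[design["power"]]
    return score

def genetic_search(generations=30, pop_size=20, seed=0):
    rng = random.Random(seed)
    rand_design = lambda: {k: rng.choice(v) for k, v in MODULES.items()}
    # Rule-based pruning shrinks the pool before the genetic search.
    pop = [d for d in (rand_design() for _ in range(200))
           if passes_rules(d)][:pop_size]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            child = {k: rng.choice([a[k], b[k]]) for k in MODULES}  # crossover
            if rng.random() < 0.2:                                  # mutation
                k = rng.choice(list(MODULES))
                child[k] = rng.choice(MODULES[k])
            if passes_rules(child):
                children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

Even in this tiny example the rule eliminates a slice of the design space outright, which is the observation the hierarchical process exploits at scale.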
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Major depressive disorder (MDD) trials - investigating either non-pharmacological or pharmacological interventions - have shown mixed results. Many reasons explain this heterogeneity, but one that stands out is trial design, due to specific challenges in the field. We therefore aimed to review the methodology of non-invasive brain stimulation (NIBS) trials and provide a framework to improve clinical trial design. We performed a systematic review of randomized, controlled MDD trials whose intervention was repetitive transcranial magnetic stimulation (rTMS) or transcranial direct current stimulation (tDCS) in MEDLINE and other databases from April 2002 to April 2008. We created an unstructured checklist based on the CONSORT guidelines to extract items such as power analysis, sham method, blinding assessment, allocation concealment, operational criteria used for MDD, definition of refractory depression, and primary study hypotheses. Thirty-one studies were included. We found that the main methodological issues can be divided into three groups: (1) issues related to phase II/small trials; (2) issues related to MDD trials; and (3) issues specific to NIBS studies. Taken together, they can threaten study validity and lead to inconclusive results. Feasible solutions include: estimating the sample size a priori; measuring the degree of refractoriness of the subjects; specifying the primary hypothesis and statistical tests; controlling predictor variables through stratified randomization or strict eligibility criteria; adjusting the study design to the target population; using adaptive designs; and exploring NIBS efficacy employing biological markers. In conclusion, our study summarizes the main methodological issues of NIBS trials and proposes a number of alternatives to manage them. Copyright (C) 2011 John Wiley & Sons, Ltd.
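The first listed solution, estimating the sample size a priori, can be sketched with the standard normal-approximation formula for a two-arm comparison of means (a textbook illustration, not a calculation taken from the review):

```python
# A-priori sample size per arm for a two-sample comparison of means,
# using the normal approximation: n = 2 * ((z_{1-a/2} + z_{power}) / d)^2,
# where d is the standardized effect size (Cohen's d).
from statistics import NormalDist
import math

def n_per_group(effect_size, alpha=0.05, power=0.8):
    """Participants per arm to detect Cohen's d = effect_size."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)      # two-sided significance level
    z_beta = z(power)               # desired statistical power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)
```

For a medium effect (d = 0.5) at 80% power this gives about 63 participants per arm, which is why small phase-II trials of a few dozen patients are often underpowered.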
Abstract:
Background: The use of the knowledge produced by the sciences to promote human health is the main goal of translational medicine. To make it feasible we need computational methods to handle the large amount of information that arises from bench to bedside and to deal with its heterogeneity. A computational challenge that must be faced is to promote the integration of clinical, socio-demographic and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular, ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform to store biological data; however, it lacks support for representing clinical and socio-demographic information. Results: We have implemented an extension of Chado - the Clinical Module - to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: a data level, to store the data; a semantic level, to integrate and standardize the data by the use of ontologies; an application level, to manage clinical databases, ontologies and the data integration process; and a web interface level, to allow interaction between the user and the system. The Clinical Module was built based on the Entity-Attribute-Value (EAV) model. We also proposed a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system, the Clinical Module was implemented, and the framework was loaded using data from a factual clinical research database. Clinical and demographic data as well as biomaterial data were obtained from patients with tumors of the head and neck.
We implemented the IPTrans tool, a complete environment for data migration that comprises: the construction of a model to describe the legacy clinical data, based on an ontology; the Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool to Chado, as well as other applications. Conclusions: Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information and are not integrated with the existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information coming from different "omics" technologies with patients' clinical and socio-demographic data. This framework should present some features: flexibility, compression and robustness. The experiments accomplished with a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.
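The Entity-Attribute-Value pattern on which the Clinical Module is based can be illustrated with a minimal relational sketch (table and column names here are invented for illustration and are not Chado's actual schema):

```python
# Minimal EAV sketch: clinical facts are rows (entity, attribute, value)
# instead of columns, so new attributes need no schema change.
# Table names are illustrative, not Chado's schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE attribute (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE patient_value (
        patient_id   INTEGER,
        attribute_id INTEGER REFERENCES attribute(id),
        value        TEXT
    );
""")

def record(patient_id, attr, value):
    """Store one clinical fact; registers the attribute on first use."""
    conn.execute("INSERT OR IGNORE INTO attribute(name) VALUES (?)", (attr,))
    (attr_id,) = conn.execute(
        "SELECT id FROM attribute WHERE name = ?", (attr,)).fetchone()
    conn.execute("INSERT INTO patient_value VALUES (?, ?, ?)",
                 (patient_id, attr_id, value))

# New clinical attributes appear as data, not as ALTER TABLE statements.
record(1, "tumor_site", "larynx")
record(1, "smoking_status", "former")
record(2, "tumor_site", "oral cavity")
```

This flexibility is what makes EAV attractive for heterogeneous clinical research data, at the cost of queries that must pivot rows back into the attribute columns a conventional schema would have.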
Abstract:
In electronic commerce, systems development is based on two fundamental types of models: business models and process models. A business model is concerned with value exchanges among business partners, while a process model focuses on operational and procedural aspects of business communication. Thus, a business model defines the what in an e-commerce system, while a process model defines the how. Business process design can be facilitated and improved by a method for systematically moving from a business model to a process model. Such a method would provide support for traceability, evaluation of design alternatives, and a seamless transition from analysis to realization. This work proposes a unified framework that can be used as a basis to analyze, interpret and understand the different concepts associated with the different stages of e-commerce system development. In this thesis, we illustrate how UN/CEFACT's recommended metamodels for business and process design can be analyzed, extended and then integrated into final solutions based on the proposed unified framework. As an application of the framework, we also demonstrate how process-modeling tasks can be facilitated in e-commerce system design. The proposed methodology, called BP3, stands for Business Process Patterns Perspective. The BP3 methodology uses a question-answer interface to capture the designers' business requirements. It is based on pre-defined process patterns, and the final solution is generated by applying the captured business requirements by means of a set of production rules to complete the inter-process communication among these patterns.
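The question-answer-to-pattern mapping via production rules can be sketched in a few lines (the questions, pattern names, and rules below are invented for illustration and are not BP3's actual catalog):

```python
# Toy production-rule selection: designer answers drive which
# pre-defined process patterns enter the final solution.
# Questions, rules, and pattern names are illustrative assumptions.
def select_patterns(answers):
    """Map designer answers to process patterns via simple rules."""
    patterns = []
    if answers.get("payment_before_delivery"):
        patterns.append("prepayment-pattern")
    else:
        patterns.append("invoice-after-delivery-pattern")
    if answers.get("physical_goods"):
        patterns.append("shipment-tracking-pattern")
    if answers.get("recurring"):
        patterns.append("subscription-billing-pattern")
    return patterns
```

Each rule fires on a captured requirement and contributes a pattern; a real methodology would additionally wire the inter-process communication among the selected patterns.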
Abstract:
The world of communication has changed quickly in the last decade, resulting in a rapid increase in the pace of people's lives. This is due to the explosion of mobile communication and the internet, which has now reached all levels of society. With such pressure for access to communication there is increased demand for bandwidth. Photonic technology is the right solution for high-speed networks that have to supply wide bandwidth to new communication service providers. In particular, this Ph.D. dissertation deals with DWDM optical packet-switched networks, a subject that raises a huge number of problems from the physical layer up to the transport layer; here it is tackled from the network-level perspective. The long-term solution represented by optical packet switching has been fully explored in these years together with the Network Research Group at the Department of Electronics, Computer Science and Systems of the University of Bologna. National as well as international projects supported this research, such as the Network of Excellence (NoE) e-Photon/ONe, funded by the European Commission in the Sixth Framework Programme, and the INTREPIDO project (End-to-end Traffic Engineering and Protection for IP over DWDM Optical Networks), funded by the Italian Ministry of Education, University and Scientific Research. Optical packet switching for DWDM networks is studied at the single-node level as well as at the network level. In particular, the techniques discussed are intended to be implemented in a long-haul transport network that connects local and metropolitan networks around the world. The main issues faced are contention resolution in an asynchronous, variable-packet-length environment, adaptive routing, wavelength conversion and node architecture. Characteristics that a network must assure, such as quality of service and resilience, are also explored at both node and network level. Results are mainly evaluated via simulation and through analysis.
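Contention resolution by wavelength conversion at a single output port can be sketched with a toy scheduler (the model below is an illustrative assumption, far simpler than the node architectures studied in the dissertation):

```python
# Toy single-output optical node: an arriving packet keeps its preferred
# wavelength if that channel is free, is otherwise converted to any free
# wavelength, and is dropped only when every channel is busy.
# The model is an illustrative assumption, not the dissertation's node.
def schedule(packets, n_wavelengths):
    """packets: [(arrival_time, preferred_wl, duration), ...] sorted by time.
    Returns (assignments as (time, wavelength) pairs, dropped count)."""
    busy_until = [0.0] * n_wavelengths
    assigned, dropped = [], 0
    for t, pref, dur in packets:
        if busy_until[pref] <= t:
            w = pref                      # no contention on preferred channel
        else:
            free = [w for w in range(n_wavelengths) if busy_until[w] <= t]
            if free:
                w = free[0]               # resolve contention by conversion
            else:
                dropped += 1              # no free channel: packet lost
                continue
        busy_until[w] = t + dur
        assigned.append((t, w))
    return assigned, dropped
```

Even this toy model shows the trade-off the dissertation quantifies: more wavelengths and converters reduce the drop probability under asynchronous, variable-length traffic.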
Abstract:
Nowadays companies demand graduates able to work in multidisciplinary and collaborative projects. Hence, new educational methods are needed in order to support a more advanced society and progress towards a higher quality of life and sustainability. The University of the Basque Country belongs to the European Higher Education Area, which was created as a result of the Bologna process to ensure the connection and quality of European national educational systems. In this framework, this paper proposes an innovative teaching methodology developed for the "Robotics" course that belongs to the syllabus of the B.Sc. degree in Industrial Electronics and Automation Engineering. We present an innovative methodology for Robotics learning based on collaborative projects, aimed at responding to the demands of a multidisciplinary and multilingual society.
Abstract:
The aim of this Ph.D. project has been the design and characterization of new and more efficient luminescent tools, in particular sensors and labels, for analytical chemistry, medical diagnostics and imaging. Indeed, the increasing temporal and spatial resolutions demanded by those branches, coupled to the sensitivity required to reach single-molecule resolution, can be provided by the wide range of techniques based on luminescence spectroscopy. As far as the development of new chemical sensors is concerned, as chemists we were interested in the preparation of new, efficient sensing materials. In this context, we kept developing new molecular chemosensors for different classes of analytes by exploiting the supramolecular approach. In particular we studied a family of luminescent tetrapodal hosts based on aminopyridinium units with pyrenyl groups for the detection of anions. These systems exhibited noticeable changes in their photophysical properties depending on the nature of the anion; in particular, addition of chloride resulted in a conformational change, giving an initial increase in excimeric emission. A good selectivity for dicarboxylic acids was also found. In the search for higher sensitivities, we also turned our attention to systems able to perform amplification effects. In this context we described the metal-ion binding properties of three photoactive poly-(arylene ethynylene) co-polymers with different complexing units and we highlighted, for one of them, a ten-fold amplification of the response upon addition of Zn2+, Cu2+ and Hg2+ ions. In addition, we were able to demonstrate the formation of complexes with Yb3+ and Er3+ and an efficient sensitization of their typical metal-centered NIR emission upon excitation of the polymer structure, a feature of particular interest for possible applications in optical imaging and in optical amplification for telecommunication purposes.
An amplification effect was also observed during this research in silica nanoparticles derivatized with a suitable zinc probe. In this case we were able to prove, for the first time, that nanoparticles can work as "off-on" chemosensors with signal amplification. Fluorescent silica nanoparticles can thus be seen as innovative multicomponent systems in which the organization of photophysically active units gives rise to fruitful collective effects. These precious effects can be exploited for biological imaging, medical diagnostics and therapeutics, as evidenced also by some results reported in this thesis. In particular, the observed amplification effect has been obtained thanks to a suitable organization of molecular probe units on the surface of the nanoparticles. In the effort to reach a deeper insight into the mechanisms that lead to the final amplification effects, we also attempted to find a correlation between the synthetic route and the final organization of the active molecules in the silica network, and thus with the mutual interactions that result in the emerging collective behavior responsible for the desired signal amplification. In this context, we first investigated the process of formation of silica nanoparticles doped with a pyrene derivative and showed that the dyes are not uniformly dispersed inside the silica matrix; thus, core-shell structures can be formed spontaneously in a one-step synthesis. Moreover, as far as the design of new labels is concerned, we reported a new synthetic approach to obtain a class of robust, biocompatible silica core-shell nanoparticles with long-term stability. Taking advantage of this new approach we also showed the synthesis and photophysical properties of core-shell NIR-absorbing and -emitting materials that proved to be very valuable for in-vivo imaging.
In general, the dye-doped silica nanoparticles prepared in the framework of this project combine unique properties, such as very high brightness, due to the possibility of including many fluorophores per nanoparticle; high stability, because of the shielding effect of the silica matrix; and, to date, no toxicity, with a simple and low-cost preparation. All these features make these nanostructures suitable to reach the low detection limits that are nowadays required for effective clinical and environmental applications, fulfilling in this way the initial expectations of this research project.
Abstract:
Recently, an ever increasing degree of automation has been observed in most industrial processes. This increase is motivated by the higher requirement for systems with great performance in terms of quality of products and services, productivity, efficiency and low costs in design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of the mechanical and electronic technology typical of mechatronics is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, or buy products packaged in boxes, such as food or cigarettes. Another indication of this complexity is that the consortium of machine producers has estimated that there are around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; in particular, a great concentration of this kind of industry is located in the Bologna area, which is for this reason called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems, organized in a modular and distributed manner.
Even if the success of a modern AMS from a functional and behavioural point of view is still to be attributed to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties associated with it. Apart from the activity inherent to the automation of the machine cycles, the supervisory system is called to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; prompting the operator in charge of the machine to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing diagnostic information in real time, as a support to machine maintenance operations. The kinds of facilities that designers can directly find on the market, in terms of software component libraries, in fact provide adequate support for the implementation of both top-level and bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focussing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have traditionally been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very "unstructured": no clear distinction is made between functions and implementations, or between functional architectures and technological architectures and platforms. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to enlighten the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies.
Industrial automation has lately been receiving this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. During the last years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and these devices are by their own nature more vulnerable. The problem of diagnosis and fault isolation in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and reconfiguring the control system so as to guarantee satisfactory performance.
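The discrete-event flavour of this diagnosis problem can be sketched with a toy state-estimate diagnoser (the plant model, events, and states below are invented for illustration; the thesis builds on Discrete Event Systems theory in a far richer setting):

```python
# Toy discrete-event diagnoser: track which states the plant could be
# in given the observable events, closing the estimate under the
# unobservable fault event. Model and event names are illustrative.
TRANSITIONS = {
    # (state, event) -> next state; "fault" is the unobservable event
    ("idle", "start"):     "running",
    ("running", "fault"):  "degraded",
    ("running", "stop"):   "idle",
    ("degraded", "alarm"): "alarm_idle",
}
OBSERVABLE = {"start", "stop", "alarm"}
FAULTY = {"degraded", "alarm_idle"}     # states reachable only after the fault

def unobservable_reach(states):
    """Close a state estimate under unobservable transitions."""
    result, changed = set(states), True
    while changed:
        changed = False
        for (s, e), t in TRANSITIONS.items():
            if e not in OBSERVABLE and s in result and t not in result:
                result.add(t)
                changed = True
    return result

def diagnose(observed_events):
    """Return the set of plant states consistent with the observation."""
    estimate = unobservable_reach({"idle"})
    for e in observed_events:
        estimate = {TRANSITIONS[(s, e)]
                    for s in estimate if (s, e) in TRANSITIONS}
        estimate = unobservable_reach(estimate)
    return estimate

def fault_certain(observed_events):
    """True when every consistent state implies the fault occurred."""
    est = diagnose(observed_events)
    return bool(est) and est <= FAULTY
```

After observing only "start" the fault is possible but not certain; once the observable "alarm" (enabled only after the fault) is seen, the diagnoser isolates the fault, which is the essence of diagnosability in the DES setting.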
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey on the state of the software engineering paradigms applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5 a new approach is presented, based on Discrete Event Systems, for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader in understanding some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approach presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.