863 results for Design of interactive systems
Abstract:
The use of two different materials as electrodes allows the construction of asymmetric and hybrid capacitor cells with enhanced energy and power density. This approach is especially well suited to overcoming the limitations of pseudocapacitive materials, which provide a huge capacitance boost but only within a limited potential window. In this work, we introduce the concepts and protocols required for the successful design of such systems, illustrated by the construction of an asymmetric hybrid cell that combines a zeolite-templated carbon with an ultraporous activated carbon.
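One design step commonly involved when pairing two different electrodes, offered here only as an illustrative assumption since the abstract does not spell out its protocols, is balancing the charge stored on each electrode (Q+ = Q−) so that both stay within their stable potential windows. A minimal sketch in Python with placeholder values:

```python
# Hypothetical illustration: electrode mass balancing for an asymmetric cell.
# Assumes the usual charge-balance condition Q+ = Q-, i.e.
#   m_pos * C_pos * dV_pos = m_neg * C_neg * dV_neg,
# where C is the specific capacitance (F/g) and dV the potential window (V)
# of each electrode. All numbers below are placeholders, not from the paper.

def mass_ratio(c_pos, dv_pos, c_neg, dv_neg):
    """Return m_pos / m_neg that equalises the charge on both electrodes."""
    return (c_neg * dv_neg) / (c_pos * dv_pos)

if __name__ == "__main__":
    # e.g. an activated-carbon positive electrode paired with a
    # zeolite-templated-carbon negative electrode (illustrative values)
    ratio = mass_ratio(c_pos=100.0, dv_pos=1.0, c_neg=150.0, dv_neg=1.0)
    print(f"m+/m- = {ratio:.2f}")
```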
Abstract:
Overconsumption of natural resources and the associated environmental hazards are among today's most pressing global issues. In the western world, individual consumption in homes and workplaces is a key contributor to this problem. Reflecting the importance of individual action in this domain, this thesis focuses on studying and influencing the choices related to sustainability and energy consumption that people make in their daily lives. There are three main components to this work. Firstly, the thesis asserts that people frequently choose ineffective consumption-reduction goals and attempts to understand the rationale for these poor choices by fitting them to goal-setting theory, an established theoretical model of behavior change. Secondly, it presents two approaches that attempt to steer goal choice towards more effective targets: one deals with mechanisms for goal priming, and the other explores the idea that carefully designed toys can influence children's long-term consumption behavior patterns. The final section of this thesis deals with the design of feedback to support the performance of environmentally sound activities. Key contributions surrounding goals include the finding that people choose easy sustainability goals despite immediate feedback on their ineffectiveness, together with the discussion and study of goal-priming mechanisms that can influence this choice process. Contributions within the design of value-instilling toys include a theoretically grounded framework for the design of such toys and a completed and tested prototype toy. Finally, contributions in designing effective and engaging energy-consumption feedback include the finding that negative feedback is better presented verbally than visually, a finding exemplified within a working feedback system. The discussions, concepts, prototypes, and empirical findings presented in this work will be useful both for environmental psychologists and for HCI researchers studying eco-feedback.
Abstract:
This Thesis addresses the difficulties involved in developing context-aware human-machine interaction systems. The problem spans two research fields: interactive systems and contextual information sources. Traditionally, the two fields have been integrated through domain-specific vertical solutions that shield interactive systems from the low-level procedures needed to access contextual information, but that restrict their interoperability with other applications and heterogeneous data sources. It is therefore essential to promote interoperable solutions that provide access to real-world information through homogeneous procedures. This issue fits squarely within the scenarios of "Ubiquitous Computing" and the "Internet of Things", which point toward a future in which the objects around us will be able to acquire meaningful information about the environment and communicate it to other objects and to people. Since interactive systems obtain information about their environment through interaction with the user, they can play a special role in this scenario, both as consumers of real-world data and as producers of enriched information. This Thesis tackles the integration of both fields within this technological scenario. First, an analysis was carried out of the most important initiatives for the definition and design of interactive systems, and of the main infrastructures for supplying contextual information. Based on this study, the W3C SCXML language is proposed both for the design of interactive systems and for the processing of data provided by context sources. The work shows how SCXML's capabilities for combining information from different modalities can also be used to process and integrate contextual information from heterogeneous sources, and therefore to design context-aware interaction systems. Likewise, the Sensor Web initiative, and its semantic extension the Semantic Sensor Web, is presented as a suitable basis for providing homogeneous access to, and delivery of, information for context-aware interactive systems. The challenges of integrating the two types of initiatives, SCXML and the (Semantic) Sensor Web, were then analyzed, resulting in a set of functionalities that must be implemented to carry out this integration. Using technologies that bring great flexibility to the implementation process and build on current recommendations and standards, a series of experimental developments integrating the identified functionalities was implemented. Finally, to validate the proposal, a set of experiments was conducted in a test environment simulating a driving scenario, in which an interactive system communicates with a semantic extension of a Telco platform based on Sensor Web standards to obtain contextual information and to publish the observations that the user made to the system. The results demonstrate the feasibility of using SCXML to design context-aware interactive systems that need to access advanced sensor platforms to consume and publish information while interacting with the user. They also show how the use of semantic technologies in the processes of querying and publishing sensor information can facilitate the reuse of the information published in Sensor Web infrastructures by any type of application, and thereby contribute to the future Internet of Things scenario.
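As a rough illustration of the idea of driving a context-aware dialogue from external observations, the sketch below uses plain Python rather than actual SCXML, and all state and event names are invented; it is not the thesis's own state-chart documents or platform API.

```python
# Minimal sketch of an event-driven, context-aware dialogue: external context
# observations (e.g. delivered by a Sensor Web endpoint) arrive as events and
# drive state transitions, in the spirit of an SCXML state chart. This is
# plain Python, not SCXML, and every state/event name here is hypothetical.

TRANSITIONS = {
    ("idle", "trip_started"): "driving",
    ("driving", "low_visibility_observed"): "warn_driver",
    ("warn_driver", "driver_acknowledged"): "driving",
    ("driving", "trip_ended"): "idle",
}

def step(state, event):
    """Return the next state, staying put if the event is not handled."""
    return TRANSITIONS.get((state, event), state)

if __name__ == "__main__":
    state = "idle"
    for event in ["trip_started", "low_visibility_observed",
                  "driver_acknowledged", "trip_ended"]:
        state = step(state, event)
        print(event, "->", state)
```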
Abstract:
Language is an essential aspect of human communication and interaction, not only between humans but also between humans and interactive systems. Indeed, the use of language is a keystone of the design of interactive systems. Inappropriate use of language may limit users' access to information and lead them to make mistakes or lose control of interactive systems. Despite the existence of public policies, the diversity of languages poses serious problems for full-fledged, seamless support of all existing languages. Yet standardization processes have successfully defined mechanisms for ensuring cross-platform compatibility between languages, at least at the level of format and the set of characters that can be used.
Abstract:
Very large scale scheduling and planning tasks cannot be effectively addressed by fully automated schedule optimisation systems, since many key factors which govern 'fitness' in such cases are unformalisable. This raises the question of an interactive (or collaborative) approach, where fitness is assigned by the expert user. Though well-researched in the domains of interactively evolved art and music, this method is as yet rarely used in logistics. This paper concerns a difficulty shared by all interactive evolutionary systems (IESs), but especially those used for logistics or design problems. The difficulty is that objective evaluation of IESs is severely hampered by the need for expert humans in the loop. This makes it effectively impossible to, for example, determine with statistical confidence any ranking among a decent number of configurations for the parameters and strategy choices. We make headway into this difficulty with an Automated Tester (AT) for such systems. The AT replaces the human in experiments, and has parameters controlling its decision-making accuracy (modelling human error) and a built-in notion of a target solution which may typically be at odds with the solution which is optimal in terms of formalisable fitness. Using the AT, plausible evaluations of alternative designs for the IES can be done, allowing for (and examining the effects of) different levels of user error. We describe such an AT for evaluating an IES for very large scale planning.
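A minimal sketch of what such an Automated Tester might look like, assuming a simple similarity-to-target preference and a single error-rate parameter; both choices are illustrative, not the authors' implementation.

```python
import random

# Sketch of an Automated Tester (AT) standing in for the human evaluator in an
# interactive evolutionary system. The AT prefers candidates closer to its
# private target solution, but with probability `error_rate` it picks at
# random, modelling human evaluation error. The similarity measure and all
# parameter names are assumptions for illustration only.

def similarity(candidate, target):
    """Toy similarity: negative Hamming distance between equal-length tuples."""
    return -sum(a != b for a, b in zip(candidate, target))

def at_choose(candidates, target, error_rate, rng=random):
    """Return the AT's preferred candidate, with occasional erroneous picks."""
    if rng.random() < error_rate:
        return rng.choice(candidates)  # erroneous, random pick
    return max(candidates, key=lambda c: similarity(c, target))

if __name__ == "__main__":
    target = (1, 0, 1, 1, 0)
    pool = [(0, 0, 1, 1, 0), (1, 1, 1, 0, 0), (1, 0, 1, 1, 1)]
    print(at_choose(pool, target, error_rate=0.1))
```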
Abstract:
This work explores the design of piezoelectric transducers based on functional material gradation, here named functionally graded piezoelectric transducers (FGPTs). Depending on the application, an FGPT must meet several goals, essentially related to its resonance frequencies, vibration modes, and excitation strength at specific resonance frequencies. Several approaches can be used to achieve these goals; this work focuses on finding the optimal material gradation of FGPTs by means of topology optimization. Three objective functions are proposed: (i) to obtain the optimal material gradation that maximizes specified resonance frequencies; (ii) to design piezoelectric resonators, where the optimal material gradation is found to achieve desirable eigenvalues and eigenmodes; and (iii) to find the optimal material distribution that maximizes a specified excitation strength. To track the desired vibration mode, a mode-tracking method based on the modal assurance criterion is applied. The continuous variation of piezoelectric, dielectric, and elastic properties is achieved by using the graded finite element concept. The optimization algorithm is built on sequential linear programming and the concept of continuum approximation of material distribution. To illustrate the method, 2D FGPTs are designed for each objective function, and their performance is compared with that of non-graded transducers.
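For reference, the modal assurance criterion compares two real mode shapes as MAC(a, b) = |aᵀb|² / ((aᵀa)(bᵀb)), and mode tracking keeps, at each optimization step, the eigenvector with the highest MAC against the reference shape. A minimal sketch with placeholder data:

```python
import numpy as np

# Sketch of mode tracking via the modal assurance criterion (MAC):
#   MAC(a, b) = |a^T b|^2 / ((a^T a)(b^T b)).
# The eigenvector with the highest MAC against the reference mode shape is
# taken as the "same" mode in the next optimization step. Data below are
# placeholders, not results from the paper.

def mac(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    return (a @ b) ** 2 / ((a @ a) * (b @ b))

def track_mode(reference, eigenvectors):
    """Return the column index of the eigenvector most similar to `reference`."""
    return int(np.argmax([mac(reference, v) for v in eigenvectors.T]))

if __name__ == "__main__":
    ref = np.array([1.0, 0.5, -0.2])
    vecs = np.column_stack([[0.1, 1.0, 0.3], [0.9, 0.6, -0.1]])
    print(track_mode(ref, vecs))  # -> 1
```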
Abstract:
Objectives: To evaluate the effect of framework design on the fatigue life and failure modes of metal-ceramic (MC, Ni-Cr alloy core, VMK 95 porcelain veneer), glass-infiltrated alumina (ICA, In-Ceram Alumina/VM7), and veneered yttria-stabilized tetragonal zirconia polycrystal (Y-TZP, IPS e.max ZirCAD/IPS e.max) crowns. Methods: Sixty composite resin tooth replicas of a prepared maxillary first molar were produced to receive crown systems with either a standard (MCs, ICAs, and Y-TZPs, n = 10 each) or a modified framework design (MCm, ICAm, and Y-TZPm, n = 10 each). Fatigue loading was delivered with a spherical steel indenter (3.18 mm radius) at the center of the occlusal surface using r-ratio fatigue (30-300 N) until completion of 10^6 cycles or failure. Fatigue was interrupted every 125,000 cycles for damage evaluation. Weibull distribution fits and contour plots were used to examine differences between groups. Failure mode was evaluated by polarized light and SEM microscopy. Results: Weibull analysis showed the highest fatigue life for MC crowns regardless of framework design. No significant difference (overlap of confidence bounds) was observed between ICA and Y-TZP with or without framework design modification. Y-TZPm crowns presented fatigue lives in the range of MC crowns. No porcelain veneer fracture was observed in the MC groups, whereas ICAs presented bulk fracture and ICAm failed mainly through the veneer. Y-TZP crowns failed through chipping within the veneer, without core fractures. Conclusions: Framework design modification did not improve the fatigue life of the crown systems investigated. Y-TZPm crowns showed fatigue life comparable to that of the MC groups. Failure mode varied according to crown system.
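A minimal sketch of the kind of two-parameter Weibull fit used for cycles-to-failure data, with invented placeholder values and scipy's weibull_min; this illustrates the general technique, not the study's actual analysis.

```python
from scipy.stats import weibull_min

# Fit a two-parameter Weibull distribution (location fixed at zero) to
# cycles-to-failure and report the shape (Weibull modulus) and scale
# (characteristic life). The numbers below are invented placeholders.

cycles_to_failure = [210_000, 340_000, 480_000, 610_000, 750_000, 880_000]

shape, loc, scale = weibull_min.fit(cycles_to_failure, floc=0)
print(f"Weibull modulus m = {shape:.2f}, characteristic life = {scale:,.0f} cycles")
```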
Abstract:
Our day-to-day life depends on numerous embedded devices, and in the near future many more objects will have computation and communication capabilities, enabling an Internet of Things. As the interaction among these devices around us increases, developing novel applications is set to become challenging with current software infrastructures. In this paper, we argue that a new paradigm for operating systems needs to be conceptualized to provide a conducive base for application development on cyber-physical systems. We demonstrate its need and importance using a few use-case scenarios, and provide the design principles behind, and an architecture of, a co-operating system (CoS) that can serve as an example of this new paradigm.
Abstract:
This paper addresses the challenging task of computing multiple roots of a system of nonlinear equations. A repulsion algorithm that invokes the Nelder-Mead (N-M) local search method and uses a penalty-type merit function based on the error function, known as 'erf', is presented. In the N-M algorithm context, different strategies are proposed to enhance the quality of the solutions and improve the overall efficiency. The main goal of this paper is to use a two-level factorial design of experiments to analyze the statistical significance of the observed differences in selected performance criteria produced when testing different strategies in the N-M based repulsion algorithm.
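A hedged sketch of the general repulsion idea follows: the merit function combines the residual norm with an error-function-shaped penalty that grows near roots already found, so new Nelder-Mead searches are pushed away from them. The erfc-shaped penalty, the constants rho and delta, and the example system are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from math import erfc
from scipy.optimize import minimize

# Repulsion strategy for finding multiple roots of F(x) = 0 with Nelder-Mead:
# merit(x) = ||F(x)|| + sum over found roots r of rho * erfc(delta * ||x - r||),
# which is large near previously located roots and negligible far from them.

def F(x):
    # Example system with several roots (placeholder, not from the paper):
    #   x0^2 + x1^2 - 4 = 0,   x0 * x1 - 1 = 0
    return np.array([x[0] ** 2 + x[1] ** 2 - 4.0, x[0] * x[1] - 1.0])

def merit(x, found_roots, rho=10.0, delta=5.0):
    value = np.linalg.norm(F(x))
    for r in found_roots:
        value += rho * erfc(delta * np.linalg.norm(x - r))  # repel from r
    return value

def find_roots(n_starts=20, seed=0):
    rng = np.random.default_rng(seed)
    roots = []
    for _ in range(n_starts):
        x0 = rng.uniform(-3.0, 3.0, size=2)
        res = minimize(merit, x0, args=(roots,), method="Nelder-Mead",
                       options={"xatol": 1e-8, "fatol": 1e-8})
        is_root = np.linalg.norm(F(res.x)) < 1e-4
        is_new = all(np.linalg.norm(res.x - r) > 1e-3 for r in roots)
        if is_root and is_new:
            roots.append(res.x)
    return roots

if __name__ == "__main__":
    for r in find_roots():
        print(np.round(r, 4))
```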
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
Doctoral Thesis in Environmental and Molecular Biology
Abstract:
The exponential growth of data traffic is one of the major challenges currently facing communication systems, which must support ever higher data-processing speeds. In particular, power consumption has become one of the most critical design parameters, creating the need to investigate new architectures and algorithms for digital information processing. Moreover, analyzing and evaluating new processing techniques is difficult given the high speeds at which they must operate, so software-based simulation is frequently an inefficient approach. In this context, programmable electronics offers a low-cost opportunity not only to evaluate new high-speed design techniques but also to validate their implementation in technological developments. The main objective of this project is the study and development of new architectures and algorithms on programmable electronics for high-speed data processing. The method will be programming FPGA (Field-Programmable Gate Array) devices, which offer a good cost-benefit ratio and great flexibility for integration with other communication devices. CAD (Computer-Aided Design) tools oriented to digital electronic systems will be used for the design, simulation, and programming stages. The project will benefit undergraduate and graduate students in computing- and telecommunications-related programs, contributing to final-year projects and doctoral theses. The results will be published in national and international conferences and/or journals and disseminated through outreach talks and meetings. The project falls within an area of great importance for the Province of Córdoba, namely computing and telecommunications, and promises to generate high-value-added knowledge that can be transferred to technology companies in the Province of Córdoba through consulting or product development.
Abstract:
Residual lignocellulosic materials from agro-industrial activities can be exploited as a source of lignin, hemicellulose, and cellulose. Chemical treatment of lignocellulosic material must contend with the fact that the material is quite recalcitrant to such attack, mainly because of the presence of the polymer lignin. Delignification can also be achieved using white-rot fungi, which produce extracellular ligninolytic enzymes, chiefly laccase, that oxidize lignin to CO2. Laccase also oxidizes a wide range of substrates (phenols, polyphenols, anilines, aryl-diamines, methoxy-substituted phenols, and others), which makes it attractive for biotechnological applications. The enzyme has potential applications in processes such as the delignification of lignocellulosic materials and the biobleaching of paper pulp, the treatment of industrial wastewater, fiber modification and dye decolorization in the textile and dye industries, the improvement of animal feed, the detoxification of pollutants, and the bioremediation of contaminated soils. It has also been used in organic chemistry for the oxidation of functional groups, the formation of carbon-nitrogen bonds, and the synthesis of complex natural products. HYPOTHESIS: White-rot fungi, under optimal culture conditions, produce different types of oxidase enzymes, of which laccases are the most suitable to explore as catalysts in the following processes: delignification of forest-industry residues so that such waste can be used in animal feed, and decontamination/remediation of soils and/or industrial effluents. Studies will be carried out to design bioreactors that address the two questions raised in the hypothesis. For the delignification of lignocellulosic material, two strategies are proposed: (1) treating the material with the fungal mycelium, adjusting the supply of nutrients to sustain growth and favor the release of the enzyme; and (2) using partially purified laccase coupled to a mediator system to oxidize the polyphenolic compounds. For the decontamination/remediation of soils and/or industrial effluents, work will also proceed on two fronts: (3) a positive correlation has been reported between the activity of certain soil enzymes and soil fertility; an enzymatic system tentatively identified as a laccase of microbial origin is known to be responsible for the transformation of organic compounds in soil, protecting it from the accumulation of hazardous organic compounds by catalyzing reactions involving degradation, polymerization, and incorporation into humic acid complexes. Soils spiked with different pollutants (e.g., polychlorophenols or chloroanilines) will be used. (4) Work will be carried out with polluting industrial effluents (olive-mill wastewater and/or the liquid effluent from the olive debittering process).
Abstract:
For years, specifications have focused on the water-to-cementitious materials ratio (w/cm) and strength of concrete, despite the fact that aggregate makes up the majority of the volume of a concrete mixture. An aggregate distribution of roughly 60% coarse aggregate and 40% fine aggregate, regardless of gradation and the availability of aggregates, has been used as the norm for concrete pavement mixtures. Efforts to reduce costs and improve the sustainability of concrete mixtures have pushed owners to pay closer attention to mixtures with a well-graded aggregate particle distribution. In general, workability depends on many variables that are independent of gradation, such as paste volume and viscosity and aggregate shape and texture. A better understanding of how the properties of aggregates affect the workability of concrete is needed. The effects of aggregate characteristics on concrete properties, such as response to vibration, strength, and resistivity, were investigated using mixtures in which the paste content and the w/cm were held constant. The results showed that the aggregate proportions, the nominal maximum aggregate size, and the combinations of different aggregates all had an impact on strength, slump, and box test performance.
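As a small illustration of how a combined gradation is evaluated when blending coarse and fine aggregates, for example at the conventional 60/40 split mentioned above, the sketch below computes the blended percent-passing curve; sieve sizes and percent-passing values are placeholders, not data from this study.

```python
# Blend percent-passing curves: combined passing on each sieve is the
# mass-fraction-weighted sum of the individual aggregate gradations.
# All gradation values below are invented placeholders.

SIEVES_MM = [25.0, 19.0, 12.5, 9.5, 4.75, 2.36, 1.18, 0.60, 0.30, 0.15]

coarse_passing = [100, 95, 60, 35, 5, 2, 1, 1, 0, 0]           # % passing, coarse aggregate
fine_passing   = [100, 100, 100, 100, 95, 80, 60, 40, 18, 5]   # % passing, fine aggregate

def combined_gradation(fractions, gradations):
    """Blend percent-passing curves using mass fractions that sum to 1."""
    return [
        sum(f * g[i] for f, g in zip(fractions, gradations))
        for i in range(len(gradations[0]))
    ]

if __name__ == "__main__":
    blend = combined_gradation([0.60, 0.40], [coarse_passing, fine_passing])
    for sieve, pct in zip(SIEVES_MM, blend):
        print(f"{sieve:>5.2f} mm sieve: {pct:5.1f} % passing")
```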