889 results for Technology-based self-service


Relevance: 100.00%

Abstract:

Technical systems are becoming more complex: they incorporate more advanced functionality, are more tightly integrated with other systems, and are deployed in less controlled environments. All of this creates more demanding and uncertain conditions for control systems, which are also required to be more autonomous and dependable. Autonomous adaptivity is a standing challenge for current control technologies. The ASys research project proposes to address it by moving the responsibility for adaptivity from the engineers at design time to the system itself at run-time. This thesis advances the formulation and technical realisation of the ASys principles of model-based cognition and self-awareness and of run-time self-management for robust autonomy. The work focuses on the biologically inspired capability of self-awareness and explores how to embed it in the very architecture of control systems. Besides self-awareness, other themes relevant to the envisioned solution are explored: functional modeling, software modeling, pattern technology, component technology, and fault tolerance. The state of the art is analysed in the fields that bear on self-awareness and adaptivity in technical systems: cognitive architectures, fault-tolerant control, dynamic software architectures and architectural reflection, and autonomic computing. The existing ASys Theoretical Framework for cognitive autonomous systems has been adapted to ground this analysis and to conceptually support the subsequent development of the solution.

The thesis proposes a general design solution for building self-aware autonomous systems. Its central idea is the integration of a metacontroller into the control architecture of the autonomous system, capable of perceiving the functional state of the control system and, if necessary, reconfiguring it at run-time. This metacontrol solution has been formalised into four design patterns: i) the Metacontrol Pattern, which defines the integration of a metacontrol subsystem that controls the domain control system through the interface provided by its component platform; ii) the Epistemic Control Loop pattern, which defines a model-based cognitive control loop applicable to the design of such a metacontroller; iii) the Deep Model Reflection pattern, which proposes how to produce the executable model used by the metacontroller through a model-to-model transformation from the system's engineering model; and iv) the Functional Metacontrol pattern, which structures the metacontroller in two loops, one controlling the configuration of the components of the control system and another, on top of it, controlling the functions that configuration realises, so that functional and structural concerns are decoupled.

The OM Architecture and the TOMASys metamodel are the core pieces of the architectural framework developed to realise this patterned solution. The TOMASys metamodel represents the structure of any autonomous system and its relation to the system's functional requirements. The OM Architecture is a reference blueprint for building a metacontroller that integrates the proposed patterns; such a metacontroller can be deployed on top of any component-based control architecture. At the core of its operation lies a TOMASys model of the control system, which the metacontroller uses to monitor the system and to compute the reconfiguration actions needed to adapt it to the circumstances at hand. An engineering process and accompanying assets guide the application of the framework: the OM Engineering Process (OMEP) defines how to develop the metacontrol subsystem from the functional model of the controller of the autonomous system, and the OMJava library provides a domain- and application-independent implementation of the OM metacontroller that can be used in the implementation phase of OMEP. Finally, the complete solution has been validated in the development of an autonomous mobile robot that incorporates an OM metacontroller. The self-awareness and adaptivity properties provided by the metacontroller have been validated in different operating scenarios, in which the robot was able to overcome failures in its control system through reconfigurations orchestrated by the metacontroller.
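To make the metacontrol idea concrete, the following is a minimal, hedged sketch in Python (the actual implementation described above is the OMJava library, which is not reproduced here). It illustrates the spirit of the Metacontrol and Epistemic Control Loop patterns: a metacontroller perceives component status through the platform interface, evaluates it against a toy, TOMASys-like model mapping functions to the component configurations that can realise them, and reconfigures when a function is no longer served. All names and the model structure are illustrative assumptions, not the real TOMASys metamodel.

```python
# Illustrative sketch of a metacontrol loop in the spirit of the
# Metacontrol / Epistemic Control Loop patterns. The model below is a
# toy stand-in for a TOMASys model, not the real metamodel.

# Toy "TOMASys-like" model: each function maps to the alternative
# component configurations that can realise it.
FUNCTION_MODEL = {
    "localization": [["laser_slam"], ["odometry", "kalman_filter"]],
    "navigation":   [["planner", "pid_controller"]],
}

class ComponentPlatform:
    """Toy stand-in for the component platform interface of the control system."""
    def __init__(self):
        self.status = {c: "OK"
                       for alternatives in FUNCTION_MODEL.values()
                       for configuration in alternatives
                       for c in configuration}
        self.active = {f: alts[0] for f, alts in FUNCTION_MODEL.items()}

    def read_status(self, component):
        return self.status[component]

    def activate(self, function, configuration):
        self.active[function] = configuration
        print(f"reconfigured {function} -> {configuration}")

def metacontrol_step(platform):
    """One perceive-evaluate-reconfigure cycle of the metacontroller."""
    for function, active_conf in platform.active.items():
        # Perceive: estimate the functional state from component status.
        if all(platform.read_status(c) == "OK" for c in active_conf):
            continue  # function still realised, nothing to do
        # Evaluate and act: look up a healthy alternative configuration.
        for alt in FUNCTION_MODEL[function]:
            if alt != active_conf and all(platform.read_status(c) == "OK" for c in alt):
                platform.activate(function, alt)
                break
        else:
            print(f"function {function} cannot be recovered with current components")

platform = ComponentPlatform()
platform.status["laser_slam"] = "FAILED"  # inject a component failure
metacontrol_step(platform)                # falls back to odometry + kalman_filter
```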

Relevance: 100.00%

Abstract:

Automated Teller Machines (ATMs) are sensitive self-service systems that require substantial investment in security and testing. ATM certifications are testing processes for machines that integrate software components from different vendors, performed before the machines are deployed for public use. This project originated from the need to optimize the certification process in an ATM manufacturing company. The process identifies compatibility problems between software components through testing. It comprises a large number of manual user tasks, which makes it expensive and error-prone; moreover, it cannot be fully automated, as human intervention is required to manipulate ATM peripherals. The project presented important challenges for the development team. First, the process is critical, as all ATM operations rely on the software under test. Second, the context of use of ATM applications is vastly different from that of ordinary software. Third, an ATM's useful lifetime exceeds 15 years, and both new and old models need to be supported. Fourth, the know-how for efficient testing resides with each specialist and is not explicitly documented. Fifth, the sheer number of tests and their importance demand user efficiency and accuracy. All these factors led us to conclude that, beyond the technical challenges, the usability of the intended software solution was critical to the project's success. This business context motivates this Master's Thesis project. Our proposal focused on the development process applied: by combining user-centered design (UCD) with agile development, we ensured both the high priority of usability and the early mitigation of the software development risks caused by the technology constraints. We performed 23 development iterations and delivered a working solution on time and in line with users' expectations. The project was evaluated through usability tests in which 4 real users participated in the real context of use. The results were positive across several metrics: error rate, efficiency, effectiveness, and user satisfaction. We discuss the problems found, the benefits, and the lessons learned in the process. Finally, we measured the expected project benefits by comparing the effort required by the current process and the new process (once the new software tool is adopted); the savings correspond to 40% less effort (man-hours) per certification. Future work includes further evaluation of product usability in a real scenario (with customers) and measurement of benefits in terms of quality improvement.

Relevance: 100.00%

Abstract:

This paper examines the influence of customer-facing technology in full-service restaurants. As a new addition to the service experience, tabletop devices offer the customer more control over the dining experience and increase customer participation in the service process, which has the potential to upset the traditional exchange between service providers and customers in restaurants. To examine how customers react to tabletop devices, this study analyses 1,343 point-of-sale transactions from 20 units of a full-service casual dining restaurant chain, matching customers' in-restaurant transactions to their reactions to the tabletop devices used during their meals. Results show that over 70% of the customers who used tabletop devices reported positive affect toward the device, with approximately 79% reporting that the device improved their experience, citing convenience, ease of use, and credit card security among the benefits of the technology. Approximately 80% of the customers who used the device reported that they would return to the restaurant because of that positive experience. The results also indicate that likeability of the device and tip percentage were positively and significantly related to customer reports of the device improving the experience and of the desire to return. In addition, when customers reported increased return intentions, likeability of the device was higher regardless of reports of the device improving the restaurant experience, showing that the introduction of tabletop devices had a positive effect for most, but not all, customers.

Relevance: 100.00%

Abstract:

The use of tabletop technology continues to grow in the restaurant industry, and this study identifies the strengths and weaknesses of the technology, how it influences customers, and how it can improve the bottom line for managers and business owners. Results from two studies involving a full-service casual dining chain show that dining time was significantly reduced among patrons who used the tabletop hardware to order or pay for their meals, as was the time servers needed to meet customers' needs. Those who used the devices to order a meal also tended to spend more than those who did not. Patrons across the industry have embraced guest-facing technology, such as online reservation systems, mobile apps, payment apps, and tablet-based systems, and may in fact look for such technology when deciding where to dine. Guests' reactions have been overwhelmingly positive, with 70 to 80 percent of consumers citing the benefits of guest-facing technology and applications. The introduction of tabletop technology in the full-service segment has been slower than in quick-service restaurants (QSRs), and guests cite online reservation systems, online ordering, and tableside payment as preferred technologies. Restaurant operators have also cited benefits of guest-facing technology; for example, electronic ordering has led to increased sales, as such systems can induce the purchase of more expensive menu items and side dishes while allowing managers to store order and payment information for future transactions. Researchers have also noted the cost of the technology and potential problems with integration into other systems as two main factors blocking adoption.

Relevance: 100.00%

Abstract:

Aim. The purpose of this study was to develop and evaluate a computer-based dietary and physical activity self-management program for people recently diagnosed with type 2 diabetes. Methods. The computer-based program was developed in conjunction with the target group and evaluated in a 12-week randomised controlled trial (RCT). Participants were randomised to the intervention (computer program) or control group (usual care). Primary outcomes were diabetes knowledge and goal setting (ADKnowl questionnaire, Diabetes Obstacles Questionnaire (DOQ)), measured at baseline and week 12. User feedback on the program was obtained via a questionnaire and focus groups. Results. Seventy participants completed the 12-week RCT (32 intervention, 38 control; mean age 59 (SD) years). After completion there was a significant between-group difference in the "knowledge and beliefs" scale of the DOQ. Two-thirds of the intervention group rated the program as good or very good, 92% would recommend it to others, and 96% agreed that the information within the program was clear and easy to understand. Conclusions. The computer program produced a small but statistically significant improvement in diet-related knowledge, and user satisfaction was high. With some further development, this computer-based educational tool may be a useful adjunct to diabetes self-management. The trial is registered with clinicaltrials.gov, NCT number NCT00877851.

Relevance: 100.00%

Abstract:

On November 19, 2012, Iowa Gov. Terry Branstad, Iowa Secretary of Agriculture Bill Northey, Director Chuck Gipp of the Iowa Department of Natural Resources, and Dr. John Lawrence of Iowa State University announced the release of the Iowa Nutrient Reduction Strategy for public comment. A two-month public comment period and several informational meetings allowed the public to provide feedback on the draft strategy, and updates and improvements were made based on the comments received. The final version of the strategy was released May 29, 2013. The Iowa Nutrient Reduction Strategy is a science- and technology-based approach to assessing and reducing the nutrients delivered to Iowa waterways and the Gulf of Mexico. The strategy outlines voluntary efforts to reduce nutrients in surface water from both point sources, such as wastewater treatment plants and industrial facilities, and nonpoint sources, including farm fields and urban areas, in a scientific, reasonable, and cost-effective manner. Its development reflects more than two years of work led by the Iowa Department of Agriculture and Land Stewardship, the Iowa Department of Natural Resources, and Iowa State University. The scientific assessment used to evaluate and model the effects of practices was developed through the efforts of 23 individuals representing five agencies or organizations, including scientists from ISU, IDALS, DNR, the USDA Agricultural Research Service, and the USDA Natural Resources Conservation Service. The strategy was developed in response to the 2008 Gulf Hypoxia Action Plan, which calls for the 12 states along the Mississippi River to develop strategies to reduce nutrient loading to the Gulf of Mexico. The Iowa strategy follows the framework recommended by EPA in 2011, and Iowa is only the second state to complete a statewide nutrient reduction strategy. This strategy is a beginning: operational plans are being developed and work is underway. It is a dynamic document that will evolve over time and a key step towards improving Iowa's water quality. The impetus for this report comes from the Water Resources Coordination Council (WRCC), which states in its 2014-15 Annual Report that "Efforts are underway to improve understanding of the multiple nutrient monitoring efforts that may be available and can be compared to the nutrient WQ monitoring framework to identify opportunities and potential data gaps to better coordinate and prioritize future nutrient monitoring efforts." This report is the culmination of those efforts.



Relevance: 100.00%

Abstract:

In this paper we discuss our current efforts to develop and implement an exploratory, discovery-mode assessment item into the total learning and assessment profile for a target group of about 100 second-level engineering mathematics students. The assessment item under development is composed of two parts: a set of "pre-lab" homework problems (which focus on relevant prior mathematical knowledge, concepts, and skills), and complementary computing laboratory exercises undertaken within a fixed (1-hour) time frame. In particular, the computing exercises exploit the algebraic manipulation and visualisation capabilities of the symbolic algebra package MAPLE, with the aim of promoting understanding of certain mathematical concepts and skills through visual and intuitive reasoning rather than a formal or rigorous approach. The assessment task is aimed at providing students with a significant learning experience, in addition to providing feedback on their individual knowledge and skills. To this end, a noteworthy feature of the scheme is that marks awarded for the laboratory work are based primarily on the extent to which reflective, critical thinking is demonstrated, rather than on the number of CBE-style tasks completed within the allowed time. With regard to student learning outcomes, a novel and potentially critical feature of our scheme is that the assessment task is intimately linked to the overall course content: it introduces important concepts and skills (via individual student exploration) that are revisited somewhat later in the pedagogically more restrictive formal lecture component of the course (typically a large-group plenary format). The time delay involved, or "incubation period", is also a deliberate design feature: it is intended to allow students the opportunity to undergo potentially important internal re-adjustments in their understanding before being exposed to lectures on related course content, which are invariably delivered in a more condensed, formal, and mathematically rigorous manner. In our presentation, we discuss in more detail our motivation and rationale for trialling such a scheme with the targeted student group, and enumerate some of the advantages and disadvantages of our approach as we perceived them at the initial stages. In a companion paper, the theoretical framework for our approach is elaborated more fully, and measures of student learning outcomes (as obtained from, e.g., student-provided feedback) are discussed.
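To give a flavour of the laboratory exercises described above, here is a minimal sketch of an algebraic-manipulation-plus-visualisation task of the kind the paper envisages, written with Python's sympy and matplotlib as stand-ins for MAPLE (the paper's actual worksheets are not reproduced in the abstract, so the specific task shown is an assumption): the student builds truncated Taylor polynomials symbolically and plots them against the original function to reason visually about where each approximation breaks down.

```python
# Illustrative stand-in (Python/sympy instead of MAPLE) for the kind of
# symbolic-manipulation-plus-visualisation exercise the assessment uses.
import numpy as np
import matplotlib.pyplot as plt
import sympy as sp

x = sp.symbols("x")
f = sp.sin(x)

# Symbolic manipulation: truncated Taylor polynomials of increasing order.
taylor = {n: sp.series(f, x, 0, n).removeO() for n in (2, 4, 6)}

# Visualisation: compare the approximations against sin(x) by eye,
# prompting intuitive reasoning about where and why they diverge.
xs = np.linspace(-np.pi, np.pi, 200)
plt.plot(xs, np.sin(xs), label="sin(x)", linewidth=2)
for n, poly in taylor.items():
    fn = sp.lambdify(x, poly, "numpy")
    plt.plot(xs, fn(xs), "--", label=f"Taylor, order < {n}")
plt.legend()
plt.title("How far can each truncation be trusted?")
plt.show()
```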

Relevance: 100.00%

Abstract:

In the field of the semantic grid, QoS-based Web service composition is an important problem. In a semantics- and service-rich environment like the semantic grid, context constraints on Web services are common, so composition must consider not only the QoS properties of Web services but also the inter-service dependencies and conflicts that these context constraints induce. In this paper, we present a repair genetic algorithm, namely a minimal-conflict hill-climbing repair genetic algorithm, to address the Web service composition optimization problem in the presence of domain constraints and inter-service dependencies and conflicts. Experimental results demonstrate the scalability and effectiveness of the genetic algorithm.
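The abstract names the algorithm but gives no listing, so the following is a hedged Python sketch of the general idea under toy assumptions (random QoS values, hard-coded conflict pairs, a simple additive fitness): a genetic algorithm evolves candidate compositions that assign one concrete service to each abstract task, and infeasible offspring are repaired by minimal-conflict hill climbing, which repeatedly reassigns the task involved in the most conflicts as long as doing so strictly reduces the number of violations.

```python
# Toy sketch of a repair GA for QoS-aware service composition. Each of
# N_TASKS abstract tasks is assigned one of N_CANDS candidate services;
# pairs listed in CONFLICTS may not appear together in one composition.
import random

N_TASKS, N_CANDS = 6, 4
QOS = [[random.random() for _ in range(N_CANDS)] for _ in range(N_TASKS)]
CONFLICTS = {((0, 1), (2, 0)), ((3, 2), (4, 2))}  # ((task, cand), (task, cand))

def violations(sol):
    """Conflict pairs that the composition `sol` currently activates."""
    return [(a, b) for a, b in CONFLICTS
            if sol[a[0]] == a[1] and sol[b[0]] == b[1]]

def fitness(sol):
    # Aggregate QoS minus a heavy penalty per remaining violation.
    return sum(QOS[t][c] for t, c in enumerate(sol)) - 10 * len(violations(sol))

def repair(sol):
    """Minimal-conflict hill climbing: fix the task in the most conflicts."""
    while True:
        viols = violations(sol)
        if not viols:
            return sol
        counts = {}
        for pair in viols:
            for task, _ in pair:
                counts[task] = counts.get(task, 0) + 1
        worst = max(counts, key=counts.get)
        best = min(range(N_CANDS),
                   key=lambda c: len(violations(sol[:worst] + [c] + sol[worst + 1:])))
        if len(violations(sol[:worst] + [best] + sol[worst + 1:])) >= len(viols):
            return sol  # local minimum: no single reassignment improves
        sol[worst] = best

def evolve(pop_size=30, gens=50):
    pop = [repair([random.randrange(N_CANDS) for _ in range(N_TASKS)])
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]           # elitist selection
        children = []
        while len(parents) + len(children) < pop_size:
            p, q = random.sample(parents, 2)
            cut = random.randrange(1, N_TASKS)  # one-point crossover
            child = p[:cut] + q[cut:]
            if random.random() < 0.2:           # point mutation
                child[random.randrange(N_TASKS)] = random.randrange(N_CANDS)
            children.append(repair(child))      # repair infeasible offspring
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())
```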

Relevance: 100.00%

Abstract:

There is significant interest in human-computer interaction methods that assist in the design of applications for use by children. Many of these approaches draw upon standard HCI methods, such as personas, scenarios, and probes. However, these techniques often require communication and thinking skills that are designer-centred, which prevents children with Autism Spectrum Disorders (ASD) or other learning and communication disabilities from being able to participate. This study investigates methods that might be used with such children to inspire the design of technology-based intervention approaches to support their speech and language development. Like Iversen and Brodersen, we argue that children with ASD should not be treated as being in some way "cognitively incomplete"; rather, they are experts in their everyday lives, and we cannot design future IT without involving them. But how do we involve them? Instead of beginning with HCI methods, we draw upon easy-to-use technologies and on methods used in the therapy professions for child engagement, particularly the approaches of Hanen (2011) and Greenspan (1998). These approaches emphasize following the child's lead and ensuring that the child always has a legitimate turn at a detailed level of interaction. In a pilot project, we studied a child's interactions with their parents around activities over which the child had control: photos they had taken at school on an iPad. The iPad was simple enough for this child with ASD to use, and they enjoyed taking and reviewing photos. We use this small case study as an example of a child-led approach for a child with ASD, and we examine interactions from the study to assess the possibilities and limitations of the child-led approach for supporting the design of technology-based interventions for speech and language development.

Relevance: 100.00%

Abstract:

This paper presents an experimental study that examines the accuracy of various information retrieval techniques for Web service discovery. The main goal of this research is to evaluate algorithms for semantic Web service discovery. The evaluation is comprehensively benchmarked using more than 1,700 real-world WSDL documents from the INEX 2010 Web Service Discovery Track dataset. For automatic search, we successfully use Latent Semantic Analysis (LSA) and BM25 to perform Web service discovery. Moreover, we provide a linking analysis that automatically links possible atomic Web services to meet users' complex requirements, and a fusion engine that recommends a final result to users. Our experiments show that linking analysis can improve the overall performance of Web service discovery. We also find that keyword-based search returns results quickly but is limited in its ability to capture users' goals.
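As a concrete illustration of combining keyword search with latent-semantic matching as described above, here is a hedged sketch that fuses BM25 and LSA scores, using the rank_bm25 package and scikit-learn as stand-ins for the paper's actual engine; the real INEX 2010 pipeline, preprocessing, and fusion weights are not given in the abstract, so all of those details are assumptions.

```python
# Hedged sketch: fusing BM25 and LSA scores for service discovery.
import numpy as np
from rank_bm25 import BM25Okapi
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "currency exchange rate conversion service",
    "weather forecast by city and postal code",
    "hotel room booking and payment service",
]
query = "convert currency exchange"

# BM25: keyword-based scores (fast, but literal).
bm25 = BM25Okapi([d.split() for d in docs])
bm25_scores = np.array(bm25.get_scores(query.split()))

# LSA: latent-topic similarity via truncated SVD over tf-idf vectors.
tfidf = TfidfVectorizer()
X = tfidf.fit_transform(docs + [query])
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)
lsa_scores = cosine_similarity(lsa[-1:], lsa[:-1]).ravel()

# Simple fusion: min-max normalise each score list, then average.
def norm(s):
    return (s - s.min()) / (s.max() - s.min() + 1e-9)

fused = 0.5 * norm(bm25_scores) + 0.5 * norm(lsa_scores)
for i in np.argsort(-fused):
    print(f"{fused[i]:.3f}  {docs[i]}")
```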