850 results for Perception of service
Abstract:
Models are an effective tool for systems and software design. They allow software architects to abstract away irrelevant details. Those qualities are also useful for the technical management of networks, systems and software, such as those that compose service-oriented architectures. Models can provide a set of well-defined abstractions over the distributed, heterogeneous service infrastructure that enable its automated management. We propose to use the managed system as a source of dynamically generated runtime models, and to decompose management processes into compositions of model transformations. We have created an autonomic service deployment and configuration architecture that obtains, analyzes, and transforms system models to apply the required actions while remaining oblivious to the low-level details. An instrumentation layer automatically builds these models and translates the planned management actions into operations on the system. We illustrate these concepts with a distributed service update operation.
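To make the idea concrete, here is a minimal sketch of managing a system through composed model transformations; all class and function names are hypothetical and stand in for the paper's actual metamodels and instrumentation layer:

```python
# Minimal sketch of management-as-model-transformation (names hypothetical;
# the paper's actual metamodels and instrumentation layer are not shown).
from dataclasses import dataclass, replace
from typing import Callable

@dataclass(frozen=True)
class ServiceModel:
    """Runtime model of one deployed service, built by the instrumentation layer."""
    name: str
    version: str
    depends_on: tuple = ()

@dataclass(frozen=True)
class SystemModel:
    """Snapshot model of the managed system."""
    services: tuple

# A management step is just a model transformation: SystemModel -> SystemModel.
Transformation = Callable[[SystemModel], SystemModel]

def update_service(name: str, new_version: str) -> Transformation:
    """Transformation that plans an update of one service to a new version."""
    def transform(model: SystemModel) -> SystemModel:
        services = tuple(
            replace(s, version=new_version) if s.name == name else s
            for s in model.services
        )
        return SystemModel(services=services)
    return transform

def compose(*steps: Transformation) -> Transformation:
    """Compose management steps into a single management process."""
    def process(model: SystemModel) -> SystemModel:
        for step in steps:
            model = step(model)
        return model
    return process

# The instrumentation layer would build `current` from the live system and
# later interpret the difference between `current` and `planned` as actions.
current = SystemModel(services=(ServiceModel("catalog", "1.0"),
                                ServiceModel("orders", "2.1", depends_on=("catalog",))))
planned = compose(update_service("catalog", "1.1"))(current)
```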
Unimanual and Bimanual Weight Perception of Virtual Objects with a New Multi-Finger Haptic Interface
Abstract:
Accurate weight perception is particularly important in tasks where the user has to apply vertical forces, for example to ensure the safe landing of a fragile object or precise penetration of a surface with a probe. Moreover, depending on physical properties of objects such as weight and size, we may switch between unimanual and bimanual manipulation during a task. Research has shown that bimanual manipulation of real objects results in a misperception of their weight: they tend to feel lighter than similarly heavy objects handled with one hand only [8]. Effective simulation of bimanual manipulation with desktop haptic interfaces should be able to replicate this effect of bimanual manipulation on weight perception. Here, we present the MasterFinger-2, a new multi-finger haptic interface allowing bimanual manipulation of virtual objects with a precision grip, and we conduct weight discrimination experiments to evaluate its capacity to simulate unimanual and bimanual weight perception. We found that the bimanual 'lighter' bias is also observed with the MasterFinger-2, but that sensitivity to changes in virtual weight deteriorated.
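As a hedged illustration of how such weight discrimination data might be summarized, the following sketch estimates a discrimination threshold (JND) from invented response proportions; it is not the paper's analysis:

```python
# Hypothetical sketch: estimating a weight-discrimination threshold (JND) by
# linear interpolation of the proportion of "heavier" responses.
import numpy as np

comparison_weights = np.array([180, 190, 200, 210, 220])   # grams (invented)
p_heavier = np.array([0.10, 0.30, 0.50, 0.80, 0.95])       # vs. a 200 g standard

# Point of subjective equality (50%) and the 75% point via interpolation.
pse = np.interp(0.50, p_heavier, comparison_weights)
w75 = np.interp(0.75, p_heavier, comparison_weights)
jnd = w75 - pse   # lower sensitivity shows up as a larger JND
print(f"PSE = {pse:.1f} g, JND = {jnd:.1f} g")
```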
Abstract:
This paper empirically evaluates container terminal service attributes. The proposed methodology focuses on statistical control. Based on the concept of service segmentation, the authors employed control charts to classify container terminal services. The purpose of control charts is to allow simple detection of events that are indicative of actual process change. This decision can be difficult where the process characteristic varies continuously; the control chart provides statistically objective criteria of change. When a change is detected and considered good, its cause should be identified and possibly become the new way of working; when the change is bad, its cause should be identified and eliminated. Both theoretical and practical implications of the research findings are discussed in this paper.
Abstract:
This paper empirically evaluates container terminal service attributes. The proposed methodology focuses on statistical control. Based on the concept of service segmentation, we employed control charts to classify container terminal services. The purpose of control charts is to allow simple detection of events that are indicative of actual process change. This decision can be difficult where the process characteristic varies continuously; the control chart provides statistically objective criteria of change. When a change is detected and considered good, its cause should be identified and possibly become the new way of working; when the change is bad, its cause should be identified and eliminated. This paper is organized as follows: Section 1 is the introduction, Section 2 provides a brief note on other studies that inspired this research, Section 3 focuses on the methodology used and develops the results obtained, and finally conclusions are shown in Section 4. Theoretical and practical implications of the research findings are discussed.
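For illustration, a minimal sketch of an individuals control chart with 3-sigma limits; the chart type, constants and data below are assumptions rather than details taken from the paper:

```python
# Minimal sketch of an individuals (Shewhart) control chart, assuming the
# service attribute is sampled as one measurement per period (invented data).
import numpy as np

service_time = np.array([12.1, 11.8, 12.4, 12.0, 13.9, 12.2, 11.9, 15.3, 12.1])

center = service_time.mean()
# Moving-range estimate of sigma, as is standard for individuals charts.
moving_range = np.abs(np.diff(service_time)).mean()
sigma = moving_range / 1.128          # d2 constant for subgroups of size 2

ucl, lcl = center + 3 * sigma, center - 3 * sigma
out_of_control = np.where((service_time > ucl) | (service_time < lcl))[0]
print(f"CL={center:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}, signals at {out_of_control}")
```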
Abstract:
Commitment and involvement from the different members of an organization are two key elements for achieving environmental excellence. Firstly, businesses are aware of the close relationship between their activities and the environment, for they are not only polluting agents but also agents with the capacity to reduce adverse environmental impacts. Secondly, the fact that employees can play a relevant role in the socially responsible measures taken by organizations has become an irrefutably important issue. This piece of research is intended to help gain knowledge about the attitudes toward the environment of the two main actors in productive activity, that is, employers and employees, and to identify the factors that determine environmental behaviour. To this end, we gathered the ideas and assessments contained in the discourse of a group of small and medium-sized businesses, large-company owners and officers, employees, and work-related risk prevention representatives. The qualitative work consisted of in-depth interviews and the creation of discussion groups.
Abstract:
Telecommunications networks have always been expanding and, thanks to this, new services have appeared. The old mechanisms for carrying packets have become obsolete due to the requirements of the new services, which have begun working in real time. Real-time traffic requires strict service guarantees. When this traffic is sent through the network, enough resources must be provided in order to avoid delays and information losses. When browsing the Internet and requesting web pages, data must be sent from a server to the user. If any packet is dropped during the transmission, it is sent again. For the end user, it does not matter if the web page takes one or two seconds more to load. But if the user is holding a conversation with a VoIP program, such as Skype, one or two seconds of delay in the conversation may be catastrophic, and neither party can understand the other. In order to support these new services, the networks have to evolve; for this purpose MPLS and QoS were developed. MPLS is a packet-carrying mechanism used in high-performance telecommunication networks which directs and carries data along pre-established paths. Packets are forwarded on the basis of labels, making this process faster than routing packets by their IP addresses. MPLS also supports Traffic Engineering (TE), which refers to the process of selecting the best paths for data traffic in order to balance the traffic load between the different links. In a network with multiple paths, routing algorithms calculate the shortest one, and most of the time all traffic is directed through it, causing overload and packet drops, while the other paths the network offers carry no traffic at all. But this is not enough to give real-time traffic the guarantees it needs. In fact, these mechanisms improve the network, but they do not change how the traffic is treated. That is why Quality of Service (QoS) was developed. Quality of Service is the ability to give different priority to different applications, users, or data flows, or to guarantee a certain level of performance to a data flow. Traffic is distributed into different classes and each of them is treated differently, according to its Service Level Agreement (SLA). Traffic with the highest priority will have preference over lower classes, but this does not mean it will monopolize all the resources. To achieve this goal, a set of policies is defined to control and alter how the traffic flows. The possibilities are endless and depend on how the network is to be structured. By using these mechanisms it is possible to provide the necessary guarantees to real-time traffic, distributing it between categories inside the network and offering the best service for both real-time and non-real-time data.
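A toy sketch of the class-based priority treatment described above; the class names and priority mapping are hypothetical, and a production scheduler would add rate limits so that high-priority traffic cannot monopolize resources:

```python
# Toy sketch of class-based strict-priority scheduling, the basic QoS idea the
# text describes (class names and priorities are invented, not from the paper).
import heapq
from itertools import count

PRIORITY = {"voice": 0, "video": 1, "best_effort": 2}   # lower = served first

queue, seq = [], count()   # seq breaks ties FIFO within a class

def enqueue(packet_class: str, payload: str) -> None:
    heapq.heappush(queue, (PRIORITY[packet_class], next(seq), payload))

def dequeue() -> str:
    """Always transmit the highest-priority packet waiting."""
    _, _, payload = heapq.heappop(queue)
    return payload

enqueue("best_effort", "web page chunk")
enqueue("voice", "VoIP frame")
enqueue("video", "video frame")
print(dequeue())   # -> "VoIP frame": real-time traffic is served first
```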
Abstract:
Service-Oriented Computing (SOC) is a widely accepted paradigm for the development of flexible, distributed and adaptable software systems, in which service compositions perform more complex, higher-level, often cross-organizational tasks using atomic services or other service compositions.
In such systems, Quality of Service (QoS) properties, such as performance, cost, availability or security, are critical for the usability of services and their compositions in concrete applications. The analysis of these properties can become more precise and richer in information if it employs program analysis techniques, such as complexity and sharing analyses, which are able to take into account simultaneously the control and data structures, dependencies, and operations in a composition. Computation cost analysis for service compositions can support predictive monitoring and proactive adaptation by automatically inferring computation cost in the form of upper- and lower-bound functions of the value or size of input messages. These cost functions can be used for adaptation by selecting the service candidates that minimize the total cost of the composition, based on the actual data passed to them. The cost functions can also be combined with empirically collected infrastructural parameters to produce QoS bound functions of the input data that can be used to predict potential or imminent Service Level Agreement (SLA) violations at the moment of invocation. In mission-critical applications, effective and accurate continuous QoS prediction, based on continuations, can be achieved by constraint modeling of composition QoS from its structure, the data known at runtime, and (when available) the results of complexity analysis. This approach can be applied to service orchestrations with centralized flow control, as well as to choreographies with multiple participants and complex stateful interactions. Sharing analysis can support adaptation actions, such as parallelization, fragmentation, and component selection, which are based on the functional dependencies and information content of the composition's messages, internal data, and activities, in the presence of complex control constructs such as loops, branches, and sub-workflows. Both the functional dependencies and the information content (described using user-defined attributes) can be expressed using a first-order logic (Horn clause) representation, and the analysis results can be interpreted as lattice-based conceptual models.
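A minimal sketch of the data-aware candidate selection described above, assuming invented per-candidate cost upper-bound functions of input size:

```python
# Hypothetical sketch of data-aware candidate selection: given per-candidate
# cost upper-bound functions of input message size, pick the candidate that
# minimizes cost for the actual data being passed (functions are invented).
candidates = {
    "serviceA": lambda n: 5.0 + 0.20 * n,    # cheap setup, cost linear in size
    "serviceB": lambda n: 50.0 + 0.01 * n,   # expensive setup, flat growth
}

def select(input_size: int) -> str:
    """Choose the candidate with the smallest predicted cost for this input."""
    return min(candidates, key=lambda s: candidates[s](input_size))

print(select(100))    # small message  -> serviceA
print(select(10000))  # large message  -> serviceB
```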
Abstract:
In living bodies, the correct perceptual representation of size constancy requires that an object's size appear the same when it changes its location with respect to the observer. At the same time, it is necessary that objects at different locations appear to be the same size if they are. To do this, the perceptual system must recover, from the stimuli impinging on the individual, from the light falling on the retina, a representation of the relative sizes of objects in the environment. Moreover, image perception is related to another type of phenomenon: the well-known perceptual illusions. To analyze these facts, we propose a system based on a particular array of receptive points composed of optical fibers and dummy fibers. The structure is based on the first layers of the mammalian primary visual cortex. In that part of the brain, the neurons located in certain columns respond to particular directions, and this preferred orientation changes in a systematic way as one moves across the cortical surface. In our case, the signals from the above-mentioned array are analyzed and information concerning the orientation and size of a particular line is obtained. With this system, the Müller-Lyer illusion has been studied, and some rules to interpret why equal-length objects give rise to different interpretations are presented.
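As an illustrative aside (not the paper's fiber-array model), the orientation and extent of a line can be recovered from the coordinates of activated receptive points via their principal axis:

```python
# Illustrative sketch: recovering the orientation and length of a line from
# the coordinates of active receptive points, via the principal axis of the
# activated positions (the input coordinates below are invented).
import numpy as np

# Hypothetical activated receptor coordinates lying roughly on a line.
pts = np.array([[0, 0], [1, 1], [2, 2], [3, 3], [4, 4]], dtype=float)

centered = pts - pts.mean(axis=0)
# Principal axis = eigenvector of the covariance with the largest eigenvalue.
_, eigvecs = np.linalg.eigh(centered.T @ centered)
axis = eigvecs[:, -1]

orientation_deg = np.degrees(np.arctan2(axis[1], axis[0])) % 180.0
length = np.ptp(centered @ axis)     # extent of the projection onto the axis
print(f"orientation ~ {orientation_deg:.0f} deg, length ~ {length:.2f}")
```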
Abstract:
Since the beginning of the Internet, Internet Service Providers (ISPs) have seen the need to give users' traffic different treatments defined by agreements between the ISP and its customers. This procedure, known as Quality of Service Management, has not changed much in recent years (DiffServ and Deep Packet Inspection have been the most frequently chosen mechanisms). However, the incremental growth of Internet users and services, jointly with the application of recent Machine Learning techniques, opens up the possibility of going one step forward in the smart management of network traffic. In this paper, we first survey current tools and techniques for QoS Management. Then we introduce clustering and classifying Machine Learning techniques for traffic characterization and the concept of Quality of Experience. Finally, with all these components, we present a brand-new framework that will manage Quality of Service in a smart way in a telecom Big Data scenario, for both mobile and fixed communications.
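A hedged sketch of the clustering step for traffic characterization, using scikit-learn's KMeans on invented per-flow features; the feature set and number of clusters are assumptions, not details of the paper's framework:

```python
# Hypothetical sketch: clustering flows into traffic classes with KMeans.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Invented per-flow features: mean packet size (bytes), mean
# inter-arrival time (ms), flow duration (s).
flows = np.array([
    [160,  20, 300],    # VoIP-like: small packets, steady, long-lived
    [1400,  2, 120],    # bulk transfer: large packets, dense
    [600,  50,  15],    # web browsing: mixed sizes, short flows
    [150,  22, 280],
    [1350,  3,  90],
])

features = StandardScaler().fit_transform(flows)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
print(labels)   # flows grouped into traffic classes for differentiated QoS
```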
Abstract:
Background: It is known that the competence to make decisions is a fundamental aspect of sport competition. Objective: This study analyzed the decision profile of a sample of Spanish football players of different levels of expertise. Methods: 690 Spanish football players of national and international level completed the decision-making questionnaire, which covers three dimensions: perceived decision competence, decision anxiety, and commitment to decision learning. MANCOVA and ANOVA analyses were carried out to examine the differences in each dimension based on the level of expertise. Results: Results showed that the perception of decision-making competence increased and anxiety decreased with the level of expertise. Conclusions: This study confirmed the usefulness of this questionnaire in the training process for coaches and sport psychologists.
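For illustration only, a one-way ANOVA comparing one questionnaire dimension across expertise levels can be run with scipy; the group labels and scores below are invented, not the study's data:

```python
# Hypothetical illustration of the ANOVA step: comparing perceived decision
# competence across expertise levels (all scores invented).
from scipy import stats

regional      = [3.1, 3.4, 2.9, 3.3, 3.0]
national      = [3.6, 3.8, 3.5, 3.9, 3.7]
international = [4.2, 4.0, 4.4, 4.1, 4.3]

f_stat, p_value = stats.f_oneway(regional, national, international)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # higher expertise, higher scores
```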
Abstract:
The aims of this study were to analyse perceptions regarding the sporting events held in the Madrid Sports Palace and to analyse whether those perceptions vary by age or gender. One hundred and ninety-five residents answered the Ntloko and Swart (2008) questionnaire. The dimensions most highly rated were economic benefits (3.9±0.8), the event as a regional showcase (3.6±0.7), and the event as entertainment (3.4±0.6). However, the respondents did not agree with the negative environmental impact (2.0±0.8). Men rated the use of public money (z=2.4; p<.05) and the regional showcase (z=2.0; p<.05) more positively than women. Finally, women rated the increase in prices (z=2.0; p<.05) more highly than men. The age groups differed significantly only regarding the promotion of community pride: seniors and middle-aged adults rated it more positively than young adults (χ²(2)=9.9; p<.01). The fact that regular sporting events take place in an urban sports facility means that perceptions are diverse, though mainly positive, and that they differ from perceptions of mega events that take place once in a lifetime at temporary sports facilities.
Abstract:
The financial crisis of 1997-1998 in Southeast Asia and the European Union’s financial crisis of 2008 followed by the sovereign debt crisis represented major policy events in the regions and beyond. The crises triggered policy adjustments with implications on economic and other policies. This paper aims at evaluating the perception of university students in the European Union (EU) and Southeast Asia on the management of these crises. It strives to confirm several ex ante assumptions about the relationship between students’ background, their policy orientation and their knowledge of the European Union and ASEAN policies. It also provides an analysis of the students’ evaluation of the geopolitical importance of the global regions and the EU and ASEAN policies. The paper is based on opinion surveys conducted during the first part of 2012 at four universities, two in the EU and two in ASEAN countries. In the eyes of EU and ASEAN students, the EU crisis is not being managed appropriately. The citizens of the EU surveyed were even significantly more critical of the EU’s anti-crisis measures than any other surveyed group. Their ASEAN counterparts were generally more positive in their evaluations.
Abstract:
In order to test the hypothesis that caesarean birth has negative consequences for new mothers' satisfaction and perceptions, women delivering by caesarean birth (WCB) were compared with women delivering by vaginal birth (WVB). Subjects: 180 new mothers; 93 WCB and 87 WVB. Instruments: a socio-demographic questionnaire developed for this research, the Childbirth Perceptions Questionnaire, and the Mother and Baby Scales. Results: WCB had significantly lower scores in perceptions of the baby as alert/responsive and nearly significantly lower scores for the baby as alert during feeds. WVB showed a significantly higher level of satisfaction with delivery and conduct during labour, and also had significantly lower scores for perceptions of the baby as irritable during feeds and for lack of confidence in feeding. After controlling for the kind of anaesthesia received in labour, three conclusions must be taken into account: 1) between WCB with regional anaesthesia and WCB with general anaesthesia there is only one significant difference, with the former having higher scores for perception of the baby as alert during feeds; 2) between WVB with regional anaesthesia and WVB with no anaesthesia there are only two significant differences, with the former having higher scores for lack of confidence in feeding and lower scores for global confidence; 3) between WCB with regional anaesthesia and WVB with regional anaesthesia four significant differences emerge, with the former having a lower level of satisfaction with delivery and conduct in labour, lower scores for perception of the baby as alert/responsive, higher scores for perception of the baby as irritable in feeds, and higher scores for lack of confidence in feeding. The data seem compatible with the hypothesis that caesarean birth has some negative consequences for mothers' satisfaction and perceptions; for this reason, psychological surveys should constitute a routine procedure in maternity hospitals, especially when new mothers belong to families affected by psychological or social risks.