Abstract:
Uncertainty can be defined as the difference between the information that is represented in an executing system and the information that is both measurable and available about the system at a certain point in its lifetime. A software system can be exposed to multiple sources of uncertainty produced by, for example, ambiguous requirements and unpredictable execution environments. A runtime model is a dynamic knowledge base that abstracts useful information about the system, its operational context and the extent to which the system meets its stakeholders' needs. A software system can successfully operate in multiple dynamic contexts by using runtime models that augment information available at design time with information monitored at runtime. This chapter explores the role of runtime models as a means to cope with uncertainty. To this end, we introduce suitable terminology for models, runtime models and uncertainty, and present a state-of-the-art summary of model-based techniques for addressing uncertainty at both development time and runtime. Using a case study of robot systems, we discuss how current techniques and the MAPE-K loop can be used together to tackle uncertainty. Furthermore, we propose possible extensions of the MAPE-K loop architecture with runtime models to further handle uncertainty at runtime. The chapter concludes by identifying key challenges and enabling technologies for using runtime models to address uncertainty, and by identifying closely related research communities that can foster ideas for resolving these challenges. © 2014 Springer International Publishing.
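As a concrete illustration of the architecture this abstract refers to, the following is a minimal, hypothetical Python sketch of a MAPE-K loop in which the Knowledge component is a runtime model comparing a design-time goal against monitored values. All class names, parameters and numbers are invented for illustration and are not taken from the chapter.

```python
# Minimal sketch of a MAPE-K loop whose Knowledge component is a runtime
# model updated from monitoring data; all names and values are illustrative.

class RuntimeModel:
    """Knowledge: abstracts the system, its context and its goals."""
    def __init__(self, goal_response_ms):
        self.goal_response_ms = goal_response_ms
        self.observed_response_ms = None   # filled in by monitoring

class ManagedRobot:
    """Stand-in for the managed system (e.g. a robot's navigation stack)."""
    def __init__(self):
        self.speed = 1.0

    def sense_response_time(self):
        return 120.0 * self.speed          # fake sensor: latency grows with speed

    def set_speed(self, speed):
        self.speed = speed

def monitor(robot, model):
    model.observed_response_ms = robot.sense_response_time()

def analyze(model):
    # Uncertainty surfaces here: the runtime model may contradict
    # design-time assumptions, so goals are checked against observations.
    return model.observed_response_ms > model.goal_response_ms

def plan(model):
    # Simple adaptation plan: slow the robot down to meet the deadline.
    return {"speed": 0.5}

def execute(robot, adaptation):
    robot.set_speed(adaptation["speed"])

model, robot = RuntimeModel(goal_response_ms=100.0), ManagedRobot()
for _ in range(3):                         # a few control-loop iterations
    monitor(robot, model)
    if analyze(model):
        execute(robot, plan(model))
print(model.observed_response_ms)          # 60.0, now within the goal
```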
Abstract:
Ant colony optimisation algorithms model the way ants use pheromones for marking paths to important locations in their environment. Pheromone traces are picked up, followed, and reinforced by other ants but also evaporate over time. Optimal paths attract more pheromone and less useful paths fade away. The main innovation of the proposed Multiple Pheromone Ant Clustering Algorithm (MPACA) is to mark objects using many pheromones, one for each value of each attribute describing the objects in multidimensional space. Every object has one or more ants assigned to each attribute value and the ants then try to find other objects with matching values, depositing pheromone traces that link them. Encounters between ants are used to determine when ants should combine their features to look for conjunctions and whether they should belong to the same colony. This paper explains the algorithm and explores its potential effectiveness for cluster analysis. © 2014 Springer International Publishing Switzerland.
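To make the pheromone mechanics concrete, here is a heavily simplified Python sketch of the core idea: pairs of objects that share attribute values accumulate pheromone on the link between them, trails evaporate each iteration, and surviving links define clusters. The full MPACA's per-ant behaviour, colonies and feature conjunctions are omitted, and all data and constants are illustrative assumptions.

```python
# Simplified sketch of MPACA's pheromone idea: matching attribute values
# deposit pheromone on object-object links, links evaporate over time, and
# strong surviving links define clusters. Not the full published algorithm.
from itertools import combinations
from collections import defaultdict

objects = {                      # toy data: object -> attribute values
    "o1": {"colour": "red",  "size": "small"},
    "o2": {"colour": "red",  "size": "small"},
    "o3": {"colour": "blue", "size": "large"},
    "o4": {"colour": "blue", "size": "large"},
}
pheromone = {pair: 0.0 for pair in combinations(sorted(objects), 2)}

DEPOSIT, EVAPORATION, ITERATIONS, THRESHOLD = 1.0, 0.8, 20, 2.0
for _ in range(ITERATIONS):
    for a, b in pheromone:
        # one "ant" per attribute value: deposit once per matching value
        matches = sum(objects[a][k] == objects[b][k] for k in objects[a])
        pheromone[(a, b)] = pheromone[(a, b)] * EVAPORATION + DEPOSIT * matches

# objects joined by strong trails fall into the same cluster (union-find)
parent = {o: o for o in objects}
def find(o):
    while parent[o] != o:
        o = parent[o]
    return o

for (a, b), level in pheromone.items():
    if level > THRESHOLD:
        parent[find(a)] = find(b)

clusters = defaultdict(list)
for o in objects:
    clusters[find(o)].append(o)
print(list(clusters.values()))   # [['o1', 'o2'], ['o3', 'o4']]
```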
Abstract:
This chapter summarizes types of lipid oxidation – both enzymatic and non-enzymatic – and discusses reactivity, biological effects and metabolism of lipid oxidation products. Mechanistic explanations are provided for the diverse biological effects of lipid oxidation products that range from deleterious to regulatory and even to protective. Finally, analytical techniques used for detection of lipid oxidation and lipid oxidation products are discussed.
Abstract:
Previous research suggests that the attitudes and behaviours of front-line employees (FLEs) significantly influence customers' evaluations of service quality and customer satisfaction. It therefore becomes important to identify the variables that influence FLEs' job attitudes and Prosocial Service Behaviours (PSBs). A conceptual framework developed from the extant literature is presented, which proposes that management interventions (internal communication, training and development, and empowerment) have a direct effect on PSBs. In addition, these relationships are mediated by role stress and job attitudes. Implications for service management and future research directions are discussed.
Abstract:
The primary purpose of boundary spanning has been information exchange between the organization and its task environment. With complex, global organizational structures and an increased emphasis on outsourcing, organizations today are susceptible to degenerating into 'silos', which in turn hampers synergy and efficiency. Boundary spanning research therefore becomes critical for answering emerging questions in this area. Organization theorists have considered boundary spanning an important construct that explains the boundaries of an organization, inter-organizational exchanges, dependence and, in general, the concept of an organization. Research in this area falls into two broad streams: organization-focused work, dealing with issues pertaining to organizational systems, networks, learning and collaboration; and individual-focused work, exploring the attitudes and behaviour of actors who traverse organizational boundaries, such as salespeople, service workers and public servants. This chapter introduces nine interesting research studies presented in the following chapters of the book and attempts to put them in perspective in light of the extant literature on boundary spanning theory.
Abstract:
In recent years, learning word vector representations has attracted much interest in Natural Language Processing. Word representations or embeddings learned using unsupervised methods help address the problem with traditional bag-of-words approaches, which fail to capture contextual semantics. In this paper we go beyond vector representations at the word level and propose a novel framework that learns higher-level feature representations of n-grams, phrases and sentences using a deep neural network built from stacked Convolutional Restricted Boltzmann Machines (CRBMs). These representations are shown to map syntactically and semantically related n-grams to nearby locations in the hidden feature space. We additionally incorporate these higher-level features into supervised classifier training for two sentiment analysis tasks: subjectivity classification and sentiment classification. Our results demonstrate the success of the proposed framework, with a 4% improvement in accuracy for subjectivity classification and improved results for sentiment classification over models trained without the higher-level features.
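The building block of such a stacked architecture is a Restricted Boltzmann Machine. The sketch below trains a plain (non-convolutional) RBM with one-step contrastive divergence in numpy, as a rough approximation of how each layer learns features; sizes, learning rate and data are illustrative assumptions, and the convolutional weight sharing of the paper's CRBMs is omitted.

```python
# Minimal numpy sketch of one Restricted Boltzmann Machine trained with
# one-step contrastive divergence (CD-1), the kind of building block a
# stacked architecture layers; convolutional weight sharing is omitted.
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden, lr = 20, 8, 0.1            # illustrative sizes
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b, c = np.zeros(n_visible), np.zeros(n_hidden)  # visible/hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0):
    """One CD-1 update from a batch of binary visible vectors v0."""
    global W, b, c
    ph0 = sigmoid(v0 @ W + c)                    # P(h=1 | v0)
    h0 = (rng.random(ph0.shape) < ph0) * 1.0     # sample hidden states
    pv1 = sigmoid(h0 @ W.T + b)                  # reconstruction
    ph1 = sigmoid(pv1 @ W + c)
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / n     # positive minus negative phase
    b += lr * (v0 - pv1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return np.mean((v0 - pv1) ** 2)              # reconstruction error

data = (rng.random((100, n_visible)) < 0.3) * 1.0  # toy binary "n-gram" inputs
for epoch in range(50):
    err = cd1_step(data)
print(f"final reconstruction error: {err:.4f}")
# The hidden activations sigmoid(data @ W + c) are the learned features that
# would feed the next layer of the stack or a supervised classifier.
```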
Abstract:
The ever-growing demand for information transmission capacity has been met with technological advances in telecommunication systems, such as the implementation of coherent optical systems, advanced multilevel multidimensional modulation formats, fast signal processing, and research into new physical media for signal transmission (e.g. a variety of new types of optical fibers). Since increasing the signal-to-noise ratio makes fiber communication channels essentially nonlinear (due, for example, to the Kerr effect), the problem of estimating the Shannon capacity of nonlinear communication channels is not only conceptually interesting but also practically important. Here we discuss various nonlinear communication channels and review the potential of different optical signal coding, transmission and processing techniques to improve the fiber-optic Shannon capacity and to increase the system reach.
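For orientation, the linear benchmark and one common heuristic for why nonlinearity matters can be written as follows; the cubic nonlinear-interference term is borrowed from the widely used Gaussian-noise model of fiber transmission, not from this paper, and the coefficients are system-dependent.

```latex
% Linear AWGN benchmark vs. a Gaussian-noise-model heuristic for the
% effective SNR of a nonlinear fiber channel (coefficients system-dependent).
\[
  C_{\mathrm{AWGN}} = B \log_2\!\left(1 + \mathrm{SNR}\right),
  \qquad
  \mathrm{SNR}_{\mathrm{NL}} \approx \frac{P}{P_{\mathrm{ASE}} + \eta P^{3}},
\]
% where B is the bandwidth, P the launch power, P_ASE the amplifier noise
% power and eta the nonlinear-interference coefficient. Because SNR_NL peaks
% at a finite P, the resulting capacity estimate also peaks rather than
% growing indefinitely with power, which motivates the techniques reviewed.
```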
Abstract:
The combination of the third-order optical nonlinearity with chromatic dispersion in optical fibers offers an extremely rich variety of possibilities for tailoring the temporal and spectral content of a light signal, depending on the regime of dispersion that is used. Here, we review recent progress on the use of third-order nonlinear processes in optical fibers for pulse shaping in the temporal and spectral domains. Various examples of practical significance are discussed, ranging from the generation of specialized temporal waveforms to the generation of ultrashort pulses and stable continuum generation.
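The propagation model that usually underlies such fiber-based shaping is the nonlinear Schrödinger equation; the form below, with loss and higher-order terms omitted, is a standard textbook version stated here for orientation rather than taken from the chapter.

```latex
% Standard propagation model for fiber-based pulse shaping: the nonlinear
% Schroedinger equation with group-velocity dispersion (beta_2) and Kerr
% nonlinearity (gamma); loss and higher-order terms omitted.
\[
  \frac{\partial A}{\partial z}
    = -\,i\,\frac{\beta_2}{2}\,\frac{\partial^2 A}{\partial T^2}
      + i\,\gamma\,\lvert A\rvert^{2} A ,
\]
% where A(z,T) is the slowly varying pulse envelope. The sign of beta_2
% selects the dispersion regime (normal vs. anomalous) that determines the
% accessible shaping behaviour, e.g. soliton dynamics for beta_2 < 0.
```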
Abstract:
All-optical signal processing is a powerful tool for the processing of communication signals, and optical network applications have been routinely considered since the inception of optical communication. There are many successful optical devices deployed in today's communication networks, including optical amplification, dispersion compensation, optical cross-connects and reconfigurable add-drop multiplexers. However, despite record-breaking performance, all-optical signal processing devices have struggled to find a viable market niche. This has mainly been due to competition from electro-optic alternatives, whether on detailed performance analysis or, more usually, because of the limited market opportunity for a mid-link device. For example, a wavelength converter would compete with a reconfigured transponder, which has an additional market as an actual transponder and can therefore be developed significantly more economically. Nevertheless, the potential performance of all-optical devices is enticing. Motivated by their prospects of eventual deployment, in this chapter we analyse the performance and energy consumption of digital coherent transponders, linear coherent repeaters and modulator-based pulse shaping/frequency conversion, setting a benchmark for the proposed all-optical implementations.
Abstract:
Complex Event Processing (CEP) has emerged over the last ten years. CEP systems excel at processing large amounts of data and responding in a timely fashion. While CEP applications are growing fast, performance management in this area has not gained much attention. Meeting the promised level of service is critical for both system designers and users. In this paper, we present a benchmark for complex event processing systems: CEPBen. The CEPBen benchmark is designed to evaluate CEP functional behaviours, i.e., filtering, transformation and event pattern detection, and provides a novel methodology for evaluating the performance of CEP systems. A performance study running CEPBen on the Esper CEP engine is described and discussed. The results obtained from the performance tests demonstrate the influence of CEP functional behaviours on system performance. © 2014 Springer International Publishing Switzerland.
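To fix ideas, the following toy Python fragment illustrates, without using Esper or its EPL, the three functional behaviours the benchmark evaluates: filtering, transformation and event pattern detection over a small event stream. The event type and data are invented for illustration.

```python
# Toy illustration (not Esper's API) of the three CEP behaviours the
# benchmark exercises: filtering, transformation, and pattern detection.
from dataclasses import dataclass

@dataclass
class Tick:
    symbol: str
    price: float

stream = [Tick("ACME", 10.0), Tick("ACME", 12.5), Tick("XYZ", 7.0),
          Tick("ACME", 13.0), Tick("XYZ", 6.0)]

# 1. Filtering: keep only events that satisfy a predicate.
acme = [e for e in stream if e.symbol == "ACME"]

# 2. Transformation: derive a new event shape from each input event.
rounded = [{"sym": e.symbol, "cents": int(e.price * 100)} for e in acme]

# 3. Pattern detection: fire when two consecutive events on the same
#    symbol show a rising price (a minimal "followed-by" pattern).
last_price = {}
alerts = []
for e in stream:
    prev = last_price.get(e.symbol)
    if prev is not None and e.price > prev:
        alerts.append((e.symbol, prev, e.price))
    last_price[e.symbol] = e.price

print(rounded)   # transformed ACME events
print(alerts)    # [('ACME', 10.0, 12.5), ('ACME', 12.5, 13.0)]
```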
Abstract:
METPEX is a 3-year FP7 project that aims to develop a pan-European tool to measure the quality of the passenger experience of multimodal transport. Initial work has led to the development of a comprehensive set of variables relating to different passenger groups, forms of transport and journey stages. This paper addresses the main challenges in transforming the variables into usable, accessible computer-based tools allowing for the real-time collection of information across multiple journey stages in different EU countries. Non-computer-based measurement instruments will be used to gather information from those who may not have, or be familiar with, mobile technology. Smartphone-based measurement instruments will also be used, hosted in two applications. The mobile applications need to be easy to use, configurable and adaptable according to the context of use. They should also be inherently interesting and rewarding for the participant, whilst allowing for the collection of high-quality, valid and reliable data from all journey types and stages (from planning, through entry into and egress from different transport modes, to travel on public and personal vehicles and support for active forms of transport such as cycling and walking). During all phases of data collection and processing, the privacy of the participant is respected and ensured. © 2014 Springer International Publishing.
Abstract:
Technological advances have driven attempts at monitoring vital parameters in adverse environments; these improvements will make it possible to monitor cardiac activity in automotive environments as well. In this scenario, heart rate changes associated with alcohol consumption become of great importance for assessing the driver's state over time. This paper presents the results of a first set of experiments aimed at discovering the heart rate variability modifications induced by moderate consumption of an alcoholic drink (i.e. a single draft beer), as typically occurs at weekends among some people. In the study, twenty subjects were enrolled, and two electrocardiographic recordings were carried out for each of them: the first before alcohol ingestion and the second 25-30 minutes afterwards. Each participant remained fasting until the second ECG acquisition was completed. The ECG signals were analyzed by typical time-domain, frequency-domain and nonlinear analyses. Results showed a small increase in the LF/HF ratio, which reflects a dominance of the sympathetic system over the parasympathetic system, and an increase in signal complexity, as shown by the nonlinear analysis. However, the study highlighted the need to monitor HRV from alcohol ingestion until its complete metabolization to allow a more precise description of its variation. © Springer International Publishing Switzerland 2014.
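The frequency-domain step reported here can be sketched as follows: resample the RR-interval series uniformly, estimate its power spectral density with Welch's method, and integrate the conventional LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) bands. The code below uses synthetic RR data and common band definitions, not the study's recordings or exact pipeline.

```python
# Minimal sketch of frequency-domain HRV analysis: LF/HF ratio from an
# RR-interval series. Synthetic data; band edges follow common HRV practice.
import numpy as np
from scipy.signal import welch

rr_s = 0.8 + 0.05 * np.random.default_rng(1).standard_normal(300)  # RR in s
t = np.cumsum(rr_s)                        # beat times

fs = 4.0                                   # resample the RR series at 4 Hz
t_uniform = np.arange(t[0], t[-1], 1.0 / fs)
rr_uniform = np.interp(t_uniform, t, rr_s)

f, psd = welch(rr_uniform - rr_uniform.mean(), fs=fs, nperseg=256)

def band_power(f, psd, lo, hi):
    mask = (f >= lo) & (f < hi)
    return psd[mask].sum() * (f[1] - f[0])  # rectangle-rule integration

lf = band_power(f, psd, 0.04, 0.15)        # sympathetic + parasympathetic
hf = band_power(f, psd, 0.15, 0.40)        # mainly parasympathetic
print(f"LF/HF = {lf / hf:.2f}")            # ratio compared pre/post alcohol
```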
Abstract:
Risk management in healthcare comprises a group of complex actions implemented to improve the quality of healthcare services and guarantee patient safety. Risks cannot be eliminated, but they can be controlled with various risk assessment methods derived from industrial applications; among these, Failure Mode, Effects and Criticality Analysis (FMECA) is a widely used methodology. The main purpose of this work is the analysis of the failure modes of the Home Care (HC) service provided by the local healthcare unit of Naples (ASL NA1), focusing attention on human and non-human factors according to the organizational framework selected by the WHO. © Springer International Publishing Switzerland 2014.
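As an illustration of how FMECA-style methods rank failure modes, the sketch below computes the classic risk priority number (RPN = severity x occurrence x detectability) for a few hypothetical home-care failure modes; both the scoring scheme and the examples are assumptions, not findings of the ASL NA1 study.

```python
# Illustrative risk-priority ranking in the spirit of FMECA; the RPN product
# and the example failure modes are assumptions, not taken from the study.
failure_modes = [
    # (failure mode,                severity, occurrence, detectability) 1-10
    ("wrong drug dose at home",            9,          3,             6),
    ("missed scheduled nurse visit",       5,          4,             2),
    ("incomplete patient record",          6,          5,             7),
]

ranked = sorted(
    ((name, s * o * d) for name, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for name, rpn in ranked:
    print(f"RPN {rpn:3d}  {name}")
# Highest-RPN failure modes are addressed first when designing controls.
```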
Abstract:
In global policy documents, the language of Technology-Enhanced Learning (TEL) now firmly structures a perception of educational technology which 'subsumes' terms like Networked Learning and e-Learning. Embedded in these three words, though, is a deterministic, economic assumption that technology has now enhanced learning, and will continue to do so. In a market-driven, capitalist society this is a 'trouble-free', economically focused discourse which suggests there is no need for further debate about what the use of technology achieves in learning. Yet this raises a problem too: if technology achieves goals for human beings, then in education we are now simply counting on the 'use of technology' to enhance learning. This closes the door on a necessary and ongoing critical pedagogical conversation that reminds us it is people that design learning, not technology. Furthermore, such discourse provides a vehicle for those with either strong hierarchical or neoliberal agendas to make simplified political claims in the name of technology. This chapter is a reflection on our use of language in the educational technology community through a corpus-based Critical Discourse Analysis (CDA). In analytical examples that are 'loaded' with economic expectation, we can notice how the policy discourse of TEL narrows the conversational space for learning, so that people may struggle to recognise their own subjective being in this language. Through the lens of Lieras's externality, desubjectivisation and closure (Lieras, 1996), we might examine the possible effects of this discourse and seek a more emancipatory approach. A return to discussing Networked Learning is suggested as a first step towards a more multi-directional conversation than TEL, one that acknowledges the interrelatedness of technology, language and learning in people's practice. Secondly, a reconsideration of how we write policy for educational technology is recommended, with a critical focus on how people learn, rather than on what technology is assumed to enhance.