37 results for Context-Aware and Adaptable Architectures
Abstract:
Novel computing systems are increasingly being composed of large numbers of heterogeneous components, each with potentially different goals or local perspectives, and connected in networks which change over time. Management of such systems quickly becomes infeasible for humans. As such, future computing systems should be able to achieve advanced levels of autonomous behaviour. In this context, the system's ability to be self-aware and be able to self-express becomes important. This paper surveys definitions and current understanding of self-awareness and self-expression in biology and cognitive science. Subsequently, previous efforts to apply these concepts to computing systems are described. This has enabled the development of novel working definitions for self-awareness and self-expression within the context of computing systems.
Abstract:
METPEX is a 3-year FP7 project which aims to develop a Pan-European tool to measure the quality of the passenger's experience of multimodal transport. Initial work has led to the development of a comprehensive set of variables relating to different passenger groups, forms of transport and journey stages. This paper addresses the main challenges in transforming the variables into usable, accessible computer-based tools allowing for the real-time collection of information across multiple journey stages in different EU countries. Non-computer-based measurement instruments will be used to gather information from those who may not have, or be familiar with, mobile technology. Smartphone-based measurement instruments will also be used, hosted in two applications. The mobile applications need to be easy to use, configurable and adaptable according to the context of use. They should also be inherently interesting and rewarding for the participant, whilst allowing for the collection of high-quality, valid and reliable data from all journey types and stages (from planning, through entry into and egress from different transport modes, travel on public and personal vehicles, to support of active forms of transport such as cycling and walking). During all phases of data collection and processing, the privacy of the participant is respected and ensured. © 2014 Springer International Publishing.
Abstract:
Despite their generally increasing use, the adoption of mobile shopping applications often differs across purchase contexts. In order to advance our understanding of smartphone-based mobile shopping acceptance, this study integrates and extends existing approaches from technology acceptance literature by examining two previously underexplored aspects. Firstly, the study examines the impact of different mobile and personal benefits (instant connectivity, contextual value and hedonic motivation), customer characteristics (habit) and risk facets (financial, performance, and security risk) as antecedents of mobile shopping acceptance. Secondly, it is assumed that several acceptance drivers differ in relevance subject to the perception of three mobile shopping characteristics (location sensitivity, time criticality, and extent of control), while other drivers are assumed to matter independent of the context. Based on a dataset of 410 smartphone shoppers, empirical results demonstrate that several acceptance predictors are associated with ease of use and usefulness, which in turn affect intentional and behavioral outcomes. Furthermore, the extent to which risks and benefits impact ease of use and usefulness is influenced by the three contextual characteristics. From a managerial perspective, results show which factors to consider in the development of mobile shopping applications and in which different application contexts they matter.
Abstract:
This study reports a qualitative phenomenological investigation of anger and anger-related aggression in the context of the lives of individual women. Semistructured interviews with five women are analyzed using interpretative phenomenological analysis. This inductive approach aims to capture the richness and complexity of the lived experience of emotional life. In particular, it draws attention to the context-dependent and relational dimension of angry feelings and aggressive behavior. Three analytic themes are presented here: the subjective experience of anger, which includes the perceptual confusion and bodily change felt by the women when angry, crying, and the presence of multiple emotions; the forms and contexts of aggression, paying particular attention to the range of aggressive strategies used; and anger as moral judgment, in particular perceptions of injustice and unfairness. The authors conclude by examining the analytic observations in light of phenomenological thinking.
Abstract:
Since the transfer of a message between two cultures very frequently takes place through the medium of a written text qua communicative event, it would seem useful to attempt to ascertain whether there is any kind of pattern in the use of strategies for the effective interlingual transfer of this message. Awareness of potentially successful strategies, within the constraints of context, text type, intended TL function and TL reader profile will enhance quality and cost-effectiveness (time, effort, financial costs) in the production of the target text. Through contrastive analysis of pairs of advertising texts, SL and TL, French and English, this study will attempt to identify the nature of some recurring choices made by different translators in the attempt to recreate ST information in the TL in such a manner as to reproduce as closely as possible the informative, persuasive and affective functions of the text as advertising material. Whilst recurrence may be seen to be significant in terms of illustrating tendencies with regard to the solution of problems of translation, this would not necessarily be taken as confirmation of the existence of pre-determined or prescriptive rules. These tendencies could, however, be taken as a guide to potential solutions to certain kinds of context-bound and text-type specific problem. Analysis of translated text-pairs taken from the field of advertising should produce examples of constraints posed by the need to select the content, tone and form of the Target Text, in order to ensure maximum efficacy of persuasive effect and to ensure the desired outcome, as determined by the Source Text function. When evaluating the success of a translated advertising text, constraints could be defined in terms of the culture-specific references or assumptions on which a Source Text may build in order to achieve its intended communicative function within the target community.
Abstract:
The objective of this thesis is to investigate, through an empirical study, the different functions of the highways maintenance departments and to suggest methods by means of which road maintenance work could be carried out in a more efficient way by utilising its resources of men, material and plant to the utmost advantage. This is particularly important under the present circumstances of national financial difficulties which have resulted in continuous cuts in public expenditure. In order to achieve this objective, the researcher carried out a survey among several Highways Authorities by means of questionnaire and interview. The information so collected was analysed in order to understand the actual, practical situation within highways maintenance departments, to highlight any existing problems, and to try to answer the question of how they could become more efficient. According to the results obtained from the questionnaire and the interviews, and the analysis of these results, the researcher concludes that it is in the management system that least has been done, and where problems exist and are most complex. The management of highways maintenance departments argues that the reasons for their problems include both financial and organisational difficulties, apart from the political aspect and nature of the activities undertaken. The researcher believes that this ought to necessitate improving management's analytical tools and techniques in order to achieve the most effective way of performing each activity. To this end the researcher recommends several related procedures to be adopted by the management of the highways maintenance departments. These recommendations, arising from the study, involve the technical, practical and human aspects. These are essential factors of which management should be aware - and certainly should not neglect - in order to achieve its objective of improved productivity in the highways maintenance departments.
Abstract:
The current optical communications network consists of point-to-point optical transmission paths interconnected by relatively low-speed electronic switching and routing devices. As the demand for capacity increases, higher-speed electronic devices will become necessary. It is, however, hard to realise electronic chip-sets above 10 Gbit/s, and therefore, to increase the achievable performance of the network, electro-optic and all-optical switching and routing architectures are being investigated. This thesis aims to provide a detailed experimental analysis of high-speed optical processing within an optical time division multiplexed (OTDM) network node. This includes the functions of demultiplexing, 'drop and insert' multiplexing, data regeneration, and clock recovery. It examines the possibilities of combining these tasks using a single device. Two optical switching technologies are explored. The first is an all-optical device known as the 'semiconductor optical amplifier-based nonlinear optical loop mirror' (SOA-NOLM). Switching is achieved by using an intense 'control' pulse to induce a phase shift in a low-intensity signal propagating through an interferometer. Simultaneous demultiplexing, data regeneration and clock recovery are demonstrated for the first time using a single SOA-NOLM. The second device is an electroabsorption (EA) modulator, which until this thesis had been used in a uni-directional configuration to achieve picosecond pulse generation, data encoding, demultiplexing, and 'drop and insert' multiplexing. This thesis presents results on the use of an EA modulator in a novel bi-directional configuration. Two independent channels are demultiplexed from a high-speed OTDM data stream using a single device. Simultaneous demultiplexing with stable, ultra-low-jitter clock recovery is demonstrated, and then used in a self-contained 40 Gbit/s 'drop and insert' node.
Finally, a 10 GHz source is analysed that exploits the EA modulator bi-directionality to increase the pulse extinction ratio to a level where it could be used in an 80 Gbit/s OTDM network.
Abstract:
Liposomes, due to their biphasic nature and diversity in design, composition and construction, offer a dynamic and adaptable technology for enhancing drug solubility. Starting with equimolar egg-phosphatidylcholine (PC)/cholesterol liposomes, the influence of the liposomal composition and surface charge on the incorporation and retention of a model poorly water-soluble drug, ibuprofen, was investigated. Both the incorporation and the release of ibuprofen were influenced by the lipid composition of the multi-lamellar vesicles (MLV), with inclusion of the long-alkyl-chain lipid dilignoceroyl phosphatidylcholine (C24PC) resulting in enhanced ibuprofen incorporation efficiency and retention. The cholesterol content of the liposome bilayer was also shown to influence ibuprofen incorporation, with maximum ibuprofen incorporation efficiency achieved when 4 μmol of cholesterol was present in the MLV formulation. Addition of the anionic lipid dicetylphosphate (DCP) reduced ibuprofen drug loading, presumably due to electrostatic repulsive forces between the carboxyl group of ibuprofen and the anionic head-group of DCP. In contrast, the addition of 2 μmol of the cationic lipid stearylamine (SA) to the liposome formulation (PC:Chol - 16 μmol:4 μmol) increased ibuprofen incorporation efficiency by approximately 8%. However, further increases of the SA content to 4 μmol and above reduced incorporation by almost 50% compared to liposome formulations excluding the cationic lipid. Environmental scanning electron microscopy (ESEM) was used to dynamically follow the changes in liposome morphology during dehydration, to provide an alternative assay of liposome stability. ESEM analysis clearly demonstrated that ibuprofen incorporation improved the stability of PC:Chol liposomes, as evidenced by an increased resistance to coalescence during dehydration. These findings suggest a positive interaction between amphiphilic ibuprofen molecules and the bilayer structure of the liposome.
© 2004 Elsevier B.V. All rights reserved.
Abstract:
Continuing advances in digital image capture and storage are resulting in a proliferation of imagery and associated problems of information overload in image domains. In this work we present a framework that supports image management using an interactive approach that captures and reuses task-based contextual information. Our framework models the relationship between images and domain tasks they support by monitoring the interactive manipulation and annotation of task-relevant imagery. During image analysis, interactions are captured and a task context is dynamically constructed so that human expertise, proficiency and knowledge can be leveraged to support other users in carrying out similar domain tasks using case-based reasoning techniques. In this article we present our framework for capturing task context and describe how we have implemented the framework as two image retrieval applications in the geo-spatial and medical domains. We present an evaluation that tests the efficiency of our algorithms for retrieving image context information and the effectiveness of the framework for carrying out goal-directed image tasks. © 2010 Springer Science+Business Media, LLC.
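The task-context capture and case-based retrieval described above can be illustrated with a minimal sketch. The case names, annotation sets and Jaccard similarity measure are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch: prior image tasks are stored as cases (bags of annotations
# captured during interactive image analysis) and retrieved by similarity to
# support a new, similar domain task. All data below is hypothetical.

def jaccard(a, b):
    """Set-overlap similarity between two annotation sets (0.0 to 1.0)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Captured task contexts: annotations experts made while working on past images.
case_base = {
    "case1": {"river", "bridge", "flood-risk"},   # geo-spatial task
    "case2": {"tumour", "contrast", "axial"},     # medical task
}

def retrieve(query_annotations):
    """Return past cases ranked by similarity to the current task context."""
    return sorted(case_base,
                  key=lambda c: jaccard(case_base[c], query_annotations),
                  reverse=True)

best = retrieve({"river", "flood-risk"})[0]  # most relevant prior task
```

The point of the sketch is the reuse loop: interactions build the case base as a side effect of normal work, and later users benefit from ranked retrieval without any explicit knowledge-engineering step.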
Abstract:
This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architectures: pull architectures and push architectures. There is exiguous data on the performance of CDC architectures in a real-time environment. Performance data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition for capture latency, and how to measure it, does not exist in the field. We create this definition and extend the TPC-C benchmark to make the capture latency measurement.
The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC on the other hand is capable of real-time CDC, and places a minimal overhead on the transaction rate, although this performance is at the expense of CPU resources.
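The capture-latency definition above (the time from a change being applied to an OLTP database until the CDC mechanism captures it) can be sketched as follows. The in-memory bookkeeping and timestamping scheme are illustrative assumptions, not the thesis's instrumented TPC-C extension.

```python
# Sketch of capture-latency measurement: record a timestamp when a change is
# committed on the OLTP side and another when the CDC mechanism captures it.
# In a push architecture the capture call would fire as part of the write path;
# a pull architecture would discover the change later, on its polling cycle.
import time

committed = {}  # change_id -> commit timestamp (OLTP side)
captured = {}   # change_id -> capture timestamp (CDC side)

def commit_change(change_id):
    committed[change_id] = time.monotonic()

def cdc_capture(change_id):
    captured[change_id] = time.monotonic()

def capture_latency(change_id):
    """Latency = capture time minus commit time for one change."""
    return captured[change_id] - committed[change_id]

commit_change("tx1")
cdc_capture("tx1")  # push-style: captured immediately after commit
latency = capture_latency("tx1")  # near zero for push; grows with poll interval
```

A monotonic clock is used deliberately: wall-clock time can jump, whereas interval measurements like latency need a clock that only moves forward.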
Abstract:
The main purpose of this dissertation is to assess the relation between municipal benchmarking and organisational learning with a specific emphasis on benchlearning and performance within municipalities and between groups of municipalities in the building and housing sector in the Netherlands. The first and main conclusion is that this relation exists, but that the relative success of different approaches to dimensions of change and organisational learning are a key explanatory factor for differences in the success of benchlearning. Seven other important conclusions could be derived from the empirical research. First, a combination of interpretative approaches at the group level with a mixture of hierarchical and network strategies, positively influences benchlearning. Second, interaction among professionals at the inter-organisational level strengthens benchlearning. Third, stimulating supporting factors can be seen as a more important strategy to strengthen benchlearning than pulling down barriers. Fourth, in order to facilitate benchlearning, intrinsic motivation and communication skills matter, and are supported by a high level of cooperation (i.e., team work), a flat organisational structure and interactions between individuals. Fifth, benchlearning is facilitated by a strategy that is based on a balanced use of episodic (emergent) and systemic (deliberate) forms of power. Sixth, high levels of benchlearning will be facilitated by an analyser or prospector strategic stance. Prospectors and analysers reach a different learning outcome than defenders and reactors. Whereas analysers and prospectors are willing to change policies when it is perceived as necessary, the strategic stances of defenders and reactors result in narrow process improvements (i.e., single-loop learning). Seventh, performance improvement is influenced by functional perceptions towards performance, and these perceptions ultimately influence the elements adopted. 
This research shows that efforts aimed at benchlearning and ultimately improved service delivery, should be directed to a multi-level and multi-dimensional approach addressing the context, content and process of dimensions of change and organisational learning.
Abstract:
Modern compute systems continue to evolve towards increasingly complex, heterogeneous and distributed architectures. At the same time, functionality and performance are no longer the only aspects when developing applications for such systems, and additional concerns such as flexibility, power efficiency, resource usage, reliability and cost are becoming increasingly important. This does not only raise the question of how to efficiently develop applications for such systems, but also how to cope with dynamic changes in the application behaviour or the system environment. The EPiCS Project aims to address these aspects through exploring self-awareness and self-expression. Self-awareness allows systems and applications to gather and maintain information about their current state and environment, and reason about their behaviour. Self-expression enables systems to adapt their behaviour autonomously to changing conditions. Innovations in EPiCS are based on systematic integration of research in concepts and foundations, customisable hardware/software platforms and operating systems, and self-aware networking and middleware infrastructure. The developed technologies are validated in three application domains: computational finance, distributed smart cameras and interactive mobile media systems. © 2012 IEEE.
Abstract:
In this paper we present increased adaptivity and robustness in distributed object tracking by multi-camera networks using a socio-economic mechanism for learning the vision graph. To build up the vision graph autonomously within a distributed smart-camera network, we use an ant-colony-inspired mechanism, which exchanges responsibility for tracking objects using Vickrey auctions. Employing the learnt vision graph allows the system to optimise its communication continuously. Since distributed smart-camera networks are prone to uncertainties in individual cameras, such as failures or changes in extrinsic parameters, the vision graph should be sufficiently robust and adaptable during runtime to enable seamless tracking and optimised communication. To better reflect real smart-camera platforms and networks, we consider that communication and handover are not instantaneous, and that cameras may be added, removed or their properties changed during runtime. Using our dynamic socio-economic approach, the network is able to continue tracking objects well despite all these uncertainties, and in some cases even with improved performance. This demonstrates the adaptivity and robustness of our approach.
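The Vickrey (second-price, sealed-bid) auction underlying the handover mechanism can be sketched as follows. The camera names, bid values and scalar utility model are illustrative assumptions, not the paper's actual protocol.

```python
# Sketch of Vickrey-auction handover: each camera bids its expected utility
# for tracking an object; the highest bidder wins responsibility but pays the
# second-highest bid, so truthful bidding is each camera's best strategy.

def vickrey_auction(bids):
    """Return (winner, price) where price is the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

# Hypothetical bids: each camera's estimated tracking utility for one object.
bids = {"cam_A": 0.9, "cam_B": 0.7, "cam_C": 0.4}
winner, price = vickrey_auction(bids)
# cam_A takes over the object but pays cam_B's bid, so exaggerating a bid
# gains a camera nothing - the property that makes the mechanism robust.
```

Repeated auctions of this kind are what let the network learn which camera pairs usefully exchange objects, i.e. the vision graph, without any central coordinator.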
Abstract:
Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1,2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli — they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane, and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins.
A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field. The yeast species Pichia pastoris, is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. 
pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification analysis has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth to facilitate crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell-lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. 
Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations of the molecule under investigation and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction of the complexity of NMR spectra and allows dynamic processes even in very large proteins and even ribosomes to be investigated. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins. Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snap-shots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. 
They represent long polypeptide chains in which individual smaller proteins with different biological function are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, to coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment.
We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.
Abstract:
One of the major problems for Critical Discourse Analysts is how to move on from their insightful critical analyses to successfully 'acting on the world in order to transform it'. This paper discusses, with detailed exemplification, some of the areas where linguists have moved beyond description to acting on and changing the world. Examples from three murder trials show how essential it is, in order to protect the rights of witnesses and defendants, to have audio records of significant interviews with police officers. The article moves on to discuss the potentially serious consequences of the many communicative problems inherent in legal/lay interaction and illustrates a few of the linguist-led improvements to important texts. Finally, the article turns to the problems of using linguistic data to try to determine the geographical origin of asylum seekers. The intention of the article is to act as a call to arms to linguists; it concludes with the observation that 'innumerable mountains remain for those with a critical linguistic perspective who would like to try to move one'. © 2011 John Benjamins Publishing Company.