940 results for User experience based approaches


Relevance: 100.00%

Publisher:

Abstract:

Marine sponges have been an abundant source of new metabolites in recent years. The symbiotic association between bacteria and sponge has enabled scientists to access the bacterial diversity present within the bacterial/sponge ecosystem. This study focussed on accessing the bacterial diversity in two Irish coastal marine sponges, namely Amphilectus fucorum and Eurypon major. A novel species of the genus Aquimarina was isolated from the sponge Amphilectus fucorum. The study also identified an α-proteobacterium, Pseudovibrio sp., as a potential producer of antibiotics; a targeted approach to specifically cultivate Pseudovibrio sp. may therefore prove useful for the development of new metabolites from this genus. Bacterial isolates from the marine sponge Haliclona simulans were screened for anti-fungal activity, and one isolate, Streptomyces sp. SM8, displayed activity against all five fungal strains tested. The strain was also tested for anti-bacterial activity and showed activity against both B. subtilis and P. aeruginosa. Hence a combinatorial approach involving both biochemical and genomic methods was employed in an attempt to identify the bioactive compounds produced by this strain. Culture broths from Streptomyces sp. SM8 were extracted and purified by techniques including reverse-phase HPLC, MPLC and flash chromatography. Anti-bacterial activity was observed in a fraction that contained a hydroxylated saturated fatty acid together with another compound with an m/z of 227, but further structural elucidation of these compounds proved unsuccessful. The anti-fungal fractions from SM8 were shown to contain antimycin-like compounds, some of which had retention times different from that of an antimycin standard. A high-throughput assay was developed to screen for novel calcineurin inhibitors using yeast as a model system, and three bacterial extracts gave putative positives in this screen. One of these extracts, from SM8, was subsequently analysed using NMR, and the calcineurin inhibition activity was confirmed to belong to a butenolide-type compound. A H. simulans metagenomic library was also screened using the novel calcineurin inhibitor high-throughput assay, and eight clones displaying putative calcineurin inhibitory activity were detected. The clone displaying the best inhibitory activity was subsequently sequenced and, following the use of other genetic approaches, it became clear that the inhibition was caused by a hypothetical protein with similarity to a hypothetical Na+/Ca2+ exchanger protein. The Streptomyces sp. SM8 genome was sequenced from a fragment library using Roche 454 pyrosequencing technology to identify potential secondary metabolism clusters. The draft genome was annotated by IMG/ER using the Prodigal pipeline. The Whole Genome Shotgun project has been deposited at DDBJ/EMBL/GenBank under the accession AMPN00000000. The genome contains genes which appear to encode several polyketide synthases (PKS), non-ribosomal peptide synthetases (NRPS), terpene and siderophore biosynthesis, and ribosomal peptides. Transcriptional analyses led to the identification of three hybrid clusters, of which one is predicted to be involved in the synthesis of antimycin, while the functions of the others are as yet unknown.
Two NRPS clusters were also identified, of which one may be involved in gramicidin biosynthesis while the function of the other is unknown. A Streptomyces sp. SM8 NRPS antC gene knockout was constructed, and extracts from this strain were shown to possess only mild anti-fungal activity when compared to the SM8 wild-type. Subsequent LC-MS analysis of antC mutant extracts confirmed the absence of antimycin in the extract, indicating that the observed anti-fungal activity may involve metabolite(s) other than antimycin. Anti-bacterial activity of the antC gene knockout strain against P. aeruginosa was reduced when compared to the SM8 wild-type, indicating that antimycin may contribute to the observed anti-bacterial activity in addition to the metabolite(s) already identified during the chemical analyses. This is the first report of antimycins exhibiting anti-bacterial activity against P. aeruginosa. One of the hybrid clusters potentially involved in secondary metabolism in SM8, which displayed high and consistent levels of gene expression in RNA studies, was analysed in an attempt to identify the metabolite produced by the pathway. Bioinformatic analysis of the cluster's gene sequence revealed a number of unusual features, including a formylation domain within the NRPS cluster, which may add a formyl group to the growing chain, and a lack of AT domains on two of the PKS modules. Further unusual features of this cluster are the lack of a KR domain in module 3 and an aminotransferase domain in module 4 for which no clear role has been hypothesised.

Relevance: 100.00%

Publisher:

Abstract:

A wireless sensor network can become partitioned due to node failure, requiring the deployment of additional relay nodes in order to restore network connectivity. This introduces an optimisation problem involving a tradeoff between the number of additional nodes that are required and the costs of moving through the sensor field for the purpose of node placement. This tradeoff is application-dependent, influenced for example by the relative urgency of network restoration. In addition, minimising the number of relay nodes might lead to long routing paths to the sink, which may cause problems of data latency. Data latency is extremely important in wireless sensor network applications such as battlefield surveillance, intrusion detection, disaster rescue and highway traffic coordination, where real-time constraints must not be violated. Therefore, we also consider the problem of deploying multiple sinks in order to improve network performance. Previous research has considered only parts of this problem in isolation, and has not properly considered the problems of moving through a constrained environment, of discovering changes to that environment during the repair, or of network quality after the restoration. In this thesis, we first consider a base problem in which we assume the exploration tasks have already been completed, so that our aim is to optimise our use of resources in the static, fully observed problem. In the real world, we would not know the radio and physical environments after damage, and this creates a dynamic problem in which damage must be discovered. We therefore extend to the dynamic problem, in which network repair involves both exploration and restoration. We then add a hop-count constraint for network quality, requiring that the desired locations can communicate with a sink within a hop-count limit after the network is restored. For each variant of the network repair problem, we propose different solutions (heuristics and/or complete algorithms) which prioritise different objectives. We evaluate our solutions in simulation, assessing the quality of solutions (node cost, movement cost, computation time, and total restoration time) while varying the problem types and the capability of the agent that makes the repair. We show that the relative importance of the objectives influences the choice of algorithm, and that different movement speeds of the repairing agent have a significant impact on performance and must be taken into account when selecting an algorithm. In particular, the node-based approaches are best for node cost, and the path-based approaches are best for mobility cost. For total restoration time, the node-based approaches are best with a fast-moving agent, while the path-based approaches are best with a slow-moving agent; for an agent of intermediate speed, the total restoration times of the two families of approaches are almost balanced.
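
To make the node-based idea concrete, here is a minimal sketch assuming a unit-disk radio model and straight-line relay chains; the function names, the greedy strategy and all parameters are illustrative assumptions, not the thesis's algorithms.

```python
# Toy "node-based" repair heuristic (illustrative only): reconnect each
# stranded component to the sink via a straight chain of relays between the
# closest pair of nodes, then enforce the hop-count constraint with BFS.
import math
from collections import deque

RADIO_RANGE = 10.0   # assumed unit-disk communication radius

def neighbours(i, nodes):
    return [j for j in range(len(nodes))
            if j != i and math.dist(nodes[i], nodes[j]) <= RADIO_RANGE]

def bfs_hops(nodes, sink):
    """Hop count from the sink to every node it can reach."""
    hops, queue = {sink: 0}, deque([sink])
    while queue:
        u = queue.popleft()
        for v in neighbours(u, nodes):
            if v not in hops:
                hops[v] = hops[u] + 1
                queue.append(v)
    return hops

def relay_chain(p, q):
    """Relay positions spaced within radio range along the segment p-q."""
    steps = math.ceil(math.dist(p, q) / RADIO_RANGE)
    return [(p[0] + (q[0] - p[0]) * k / steps,
             p[1] + (q[1] - p[1]) * k / steps) for k in range(1, steps)]

def repair(nodes, sink, hop_limit):
    """Add relays until every node reaches the sink within hop_limit."""
    nodes = list(nodes)
    while True:
        hops = bfs_hops(nodes, sink)
        stranded = [i for i in range(len(nodes)) if i not in hops]
        if stranded:
            # cheapest reconnection: closest (connected, stranded) pair
            a, b = min(((u, v) for u in hops for v in stranded),
                       key=lambda pair: math.dist(nodes[pair[0]], nodes[pair[1]]))
            nodes += relay_chain(nodes[a], nodes[b])
        else:
            worst = max(hops, key=hops.get)
            if hops[worst] <= hop_limit:
                return nodes            # connected and within the hop bound
            chain = relay_chain(nodes[sink], nodes[worst])
            if len(chain) + 1 >= hops[worst]:
                return nodes            # a direct chain cannot shorten the path
            nodes += chain              # shortcut the worst path to the sink
```

A path-based variant would instead minimise the agent's tour through the placement sites; the sketch above optimises node count only, which is exactly the tradeoff the thesis evaluates.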

Relevance: 100.00%

Publisher:

Abstract:

For at least two millennia and probably much longer, the traditional vehicle for communicating geographical information to end-users has been the map. With the advent of computers, the means of both producing and consuming maps have been radically transformed, while the inherent nature of the information product has also expanded and diversified rapidly. This has given rise in recent years to the new concept of geovisualisation (GVIS), which draws on the skills of the traditional cartographer but extends them into three spatial dimensions and may also add temporality, photorealistic representations and/or interactivity. Demand for GVIS technologies and their applications has increased significantly in recent years, driven by the need to study complex geographical events, in particular their associated consequences, and to communicate the results of these studies to a diversity of audiences and stakeholder groups. GVIS involves data integration, multi-dimensional spatial display, advanced modelling techniques, dynamic design and development environments, and field-specific application needs. To meet these needs, GVIS tools should be both powerful and inherently usable, in order to facilitate their role in helping interpret and communicate geographic problems. However, no framework currently exists for ensuring this usability. The research presented here seeks to fill this gap by addressing the challenges of incorporating user requirements in GVIS tool design. It starts from the premise that usability in GVIS should be incorporated and implemented throughout the whole design and development process. To facilitate this, Subject Technology Matching (STM) is proposed as a new approach to assessing and interpreting user requirements. Based on STM, a new design framework called Usability Enhanced Coordination Design (UECD) is then presented, with the purpose of raising the overall usability of the design outputs. UECD places GVIS experts in a new key role in the design process, forming a more coordinated and integrated workflow and more focused and interactive usability testing. To prove the concept, these theoretical elements of the framework were implemented in two test projects: one the creation of a coastal inundation simulation for Whitegate, Cork, Ireland; the other a flood mapping tool for Zhushan Town, Jiangsu, China. The two case studies successfully demonstrated the potential merits of the UECD approach when GVIS techniques are applied to geographic problem solving and decision making. The thesis delivers a comprehensive understanding of the development and challenges of GVIS technology, its usability concerns and the associated user-centred design (UCD) approaches; it explores the possibility of applying a UCD framework to GVIS design; it constructs a new theoretical design framework, UECD, which aims to make the whole design process usability-driven; and it develops the key concept of STM into a template set to improve the performance of GVIS design. These key conceptual and procedural foundations can be built on by future research aimed at further refining and developing UECD as a useful design methodology for GVIS scholars and practitioners.

Relevance: 100.00%

Publisher:

Abstract:

This paper describes implementations of two mobile cloud applications, file synchronisation and intensive data processing, using the Context Aware Mobile Cloud Services (CAMCS) middleware and its Cloud Personal Assistant (CPA). Both applications are part of the same mobile cloud project, which is actively developed and currently at its second version. We describe recent changes to the middleware, along with experimental results for the two application models, and discuss challenges faced during the development of the middleware and their implications. The paper includes a performance analysis of the CPA support for the two applications with respect to existing solutions.

Relevance: 100.00%

Publisher:

Abstract:

This paper presents our efforts to bridge the gap between mobile context awareness and mobile cloud services, using the Cloud Personal Assistant (CPA). The CPA is part of the Context Aware Mobile Cloud Services (CAMCS) middleware, which we continue to develop. Specifically, we discuss the development and evaluation of the Context Processor component of this middleware. This component collects context data from the mobile devices of users, which is then provided to the CPA of each user for use with mobile cloud services. We discuss the architecture and implementation of the Context Processor, followed by its evaluation. We introduce context profiles for the CPA, which influence its operation according to different context types. As part of the evaluation, we present two experimental context-aware mobile cloud services that illustrate how the CPA works with user context, and the related context profiles, to complete tasks for the user.
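
As a concrete illustration of what a context profile might look like, here is a minimal sketch; the class, field names and rules are hypothetical (the abstract does not expose the CAMCS/CPA API), but it shows the idea of a profile gating task execution on selected context types.

```python
# Hypothetical "context profile": names the context types the assistant
# should act on and a rule per type for when a task may run.
from dataclasses import dataclass, field

@dataclass
class ContextProfile:
    name: str
    context_types: set                           # e.g. {"battery", "connectivity"}
    rules: dict = field(default_factory=dict)    # context type -> predicate

    def permits(self, context: dict) -> bool:
        """Allow a task only if every rule holds for the device context."""
        return all(rule(context.get(ctype)) for ctype, rule in self.rules.items())

# A profile that defers cloud work until the device is on Wi-Fi with
# sufficient battery -- one plausible use of the context types above.
conservative = ContextProfile(
    name="conservative",
    context_types={"battery", "connectivity"},
    rules={"battery": lambda level: level is not None and level > 0.3,
           "connectivity": lambda net: net == "wifi"})

print(conservative.permits({"battery": 0.8, "connectivity": "wifi"}))  # True
```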

Relevance: 100.00%

Publisher:

Abstract:

Modern information systems (ISs) are becoming increasingly complex, while organizational changes are occurring more often and more rapidly. Emergent behavior and organic adaptivity are therefore key advantages for ISs. In this paper, a design science research (DSR) question for design-oriented information systems research (DISR) is proposed: can the application of biomimetic principles to IS design result in the creation of value by innovation? Accordingly, the properties of biological information systems are analyzed, and these insights are crystallized into a theoretical framework addressing the three major aspects of biomimetic ISs: user experience, information processing, and management cybernetics. On this basis, the research question is elaborated, together with a starting point for a research methodology in biomimetic information systems.

Relevance: 100.00%

Publisher:

Abstract:

This article describes advances in statistical computation for large-scale data analysis in structured Bayesian mixture models via graphics processing unit (GPU) programming. The developments are partly motivated by computational challenges arising in fitting models of increasing heterogeneity to increasingly large datasets. An example context concerns common biological studies using high-throughput technologies that generate many very large datasets and require increasingly high-dimensional mixture models with large numbers of mixture components. We outline important strategies and processes for GPU computation in Bayesian simulation and optimization approaches, give examples of the benefits of GPU implementations in terms of processing speed and scale-up in the ability to analyze large datasets, and provide a detailed, tutorial-style exposition that will benefit readers interested in developing GPU-based approaches in other statistical models. Novel, GPU-oriented approaches to modifying existing algorithms and software design can lead to vast speed-ups and, critically, enable statistical analyses that presently would not be performed due to compute time limitations in traditional computational environments. Supplemental materials are provided with all source code, example data, and details that will enable readers to implement and explore the GPU approach in this mixture modeling context. © 2010 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
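
To make the parallelism concrete, the sketch below shows the kind of observation-by-component computation that dominates such fitting: posterior responsibilities in a Gaussian mixture, vectorised over the full n × k grid. This is a schematic illustration with made-up parameters, not the paper's code; the same array expressions run on a GPU when numpy is swapped for a GPU array library with a compatible API, such as CuPy.

```python
# Responsibilities P(component j | x_i) for a 1-D Gaussian mixture.
# Every (i, j) cell is independent -- the structure that maps onto GPU threads.
import numpy as np

def responsibilities(x, weights, means, variances):
    # log N(x_i | mu_j, s2_j) for all i, j at once: shape (n, k)
    log_dens = (-0.5 * np.log(2 * np.pi * variances)
                - 0.5 * (x[:, None] - means) ** 2 / variances)
    log_post = np.log(weights) + log_dens
    # normalise each row with log-sum-exp for numerical stability
    log_post -= log_post.max(axis=1, keepdims=True)
    post = np.exp(log_post)
    return post / post.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
r = responsibilities(x,
                     weights=np.array([0.5, 0.5]),
                     means=np.array([-1.0, 1.0]),
                     variances=np.array([1.0, 1.0]))
print(r.shape)  # (100000, 2)
```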

Relevance: 100.00%

Publisher:

Abstract:

Although many feature selection methods for classification have been developed, there is a need to identify genes in high-dimensional data with censored survival outcomes. Traditional methods for gene selection in classification problems have several drawbacks. First, the majority of gene selection approaches for classification are single-gene based. Second, many of the gene selection procedures are not embedded within the algorithm itself. The technique of random forests has been found to perform well in high-dimensional data settings with survival outcomes, and it has an embedded feature for identifying variables of importance. It is therefore an ideal candidate for gene selection in high-dimensional data with survival outcomes. In this paper, we develop a novel method based on random forests to identify a set of prognostic genes. We compare our method with several machine learning methods and various node-split criteria using several real data sets. Our method performed well in both simulations and real data analysis. Additionally, we have shown the advantages of our approach over single-gene-based approaches. Our method incorporates multivariate correlations in microarray data for survival outcomes, allowing us to better utilize the information available from microarray data with survival outcomes.
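
The flavour of such embedded, multivariate gene ranking can be sketched with a generic permutation-importance loop around any fitted risk model, scored by the concordance index. This is a simplified stand-in under assumed interfaces (a `model.predict` returning risk scores), not the authors' random-forest implementation.

```python
# Rank genes by how much permuting each one degrades the concordance
# (c-index) of a fitted survival risk model.
import numpy as np

def c_index(time, event, risk):
    """Among comparable pairs, how often does the higher-risk subject
    fail first? event == 1 means the failure was observed (uncensored)."""
    conc = comp = 0
    for i in range(len(time)):
        if not event[i]:
            continue                   # i must be an observed failure
        for j in range(len(time)):
            if time[i] < time[j]:      # pair (i, j) is comparable
                comp += 1
                conc += (risk[i] > risk[j]) + 0.5 * (risk[i] == risk[j])
    return conc / comp if comp else 0.5

def permutation_importance(model, X, time, event, rng):
    base = c_index(time, event, model.predict(X))
    drops = []
    for g in range(X.shape[1]):        # one gene (column) at a time
        Xp = X.copy()
        Xp[:, g] = rng.permutation(Xp[:, g])   # break the gene/outcome link
        drops.append(base - c_index(time, event, model.predict(Xp)))
    return np.array(drops)             # larger drop => more prognostic gene
```

Because the model is fitted on all genes jointly before any column is permuted, the ranking reflects multivariate correlations rather than single-gene effects, which is the distinction the abstract draws.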

Relevance: 100.00%

Publisher:

Abstract:

© 2015 IEEE. In virtual reality applications, the aim is to provide real-time graphics running at high refresh rates. However, there are many situations in which this is not possible due to simulation or rendering issues. When running at low frame rates, several aspects of the user experience are affected. For example, each frame is displayed for an extended period of time, causing a high-persistence image artifact; as a result, movement may lose continuity and the image jumps from one frame to another. In this paper, we discuss our initial exploration of the effects of high-persistence frames caused by low refresh rates, comparing them to high frame rates and to a technique we developed to mitigate the effects of low frame rates. In this technique, the low-frame-rate simulation images are displayed with low persistence, by blanking out the display during the extra time such an image would otherwise be displayed. In order to isolate the visual effects, we constructed a simulator for low- and high-persistence displays that does not affect input latency. A controlled user study comparing the three conditions for the tasks of 3D selection and navigation was conducted. Results indicate that the low-persistence display technique may not negatively impact user experience or performance compared to the high-persistence case. Directions for future work on the use of low-persistence displays for low-frame-rate situations are discussed.
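
The blanking technique can be summarised in a few lines: with, say, a 90 Hz display and a 15 Hz simulation (illustrative rates, not the study's settings), each new frame is presented for a single refresh and the remaining refreshes are blanked rather than holding the stale image. `present()` and `blank()` stand in for whatever the rendering API provides.

```python
# Low-persistence presentation of a low-frame-rate simulation (sketch).
DISPLAY_HZ = 90
SIM_HZ = 15
REFRESHES_PER_FRAME = DISPLAY_HZ // SIM_HZ   # 6 refreshes per simulated frame

def run_low_persistence(frames, present, blank):
    for frame in frames:
        present(frame)                 # visible for exactly one refresh
        for _ in range(REFRESHES_PER_FRAME - 1):
            blank()                    # dark refreshes replace the held image
        # the high-persistence baseline would call present(frame) here instead,
        # repeating the stale image for all six refreshes
```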

Relevance: 100.00%

Publisher:

Abstract:

Nowadays, multi-touch devices (MTD) can be found in all kinds of contexts. In the learning context, the availability of MTD leads many teachers to use them in the classroom, to support their use by students, or to assume that they will enhance learning processes. Despite the rising interest in MTD, few studies exist on their impact on performance or on the suitability of the technology for the learning context. Even though using a touch-sensitive screen rather than a mouse and keyboard seems the easiest and fastest way to carry out common learning tasks (for instance, web surfing), the use of MTD may lead to a less favourable outcome. The difficulty of producing accurate finger gestures, and the split attention this requires (a multi-tasking effect), make interacting with a touch-sensitive screen harder than traditional laptop use. More precisely, it is hypothesised that efficacy and efficiency decrease, as do the available cognitive resources, making users' task engagement more difficult. Furthermore, the presented study takes into account the moderating effect of previous experience with MTD: two key factors from technology adoption theories, familiarity and self-efficacy with the technology, were included.

Sixty university students, invited to a usability lab, were asked to perform information search tasks on an online encyclopaedia. The tasks were designed to exercise the most commonly used mouse actions (e.g. right click, left click, scrolling, zooming, keyword entry…). Two conditions were created: (1) MTD use and (2) laptop use (with keyboard and mouse). The cognitive load, self-efficacy, familiarity and task engagement scales were adapted to the MTD context. Furthermore, eye-tracking measurement offers additional information about user behaviour and cognitive load.

Our study aims to clarify some important aspects of MTD usage and the added value compared to a laptop in a student learning context. More precisely, the outcomes will clarify the suitability of MTD for the processes at stake and the role of previous knowledge in the adoption process, as well as offering some interesting insights into the user experience with such devices.

Relevance: 100.00%

Publisher:

Abstract:

Many Web applications walk the thin line between the need for dynamic data and the need to meet user performance expectations. In environments where funds are not available to constantly upgrade hardware in line with user demand, alternative approaches need to be considered. This paper introduces a ‘data farming’ model whereby dynamic data, which is ‘grown’ in operational applications, is ‘harvested’ and ‘packaged’ for various consumer markets. Like any well-managed agricultural operation, crops are harvested according to historical and perceived demand, as inferred by a self-optimising process. This approach aims to make enhanced use of available resources through better utilisation of system downtime, thereby improving application performance and increasing the availability of key business data.
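
A minimal sketch of such a farming loop might look as follows, with demand inferred from access history by exponential smoothing and harvesting confined to a downtime budget. All names and the smoothing rule are assumptions for illustration, not the paper's model.

```python
# Hypothetical 'data farming' scheduler: regenerate ('harvest') cached result
# packages during downtime, ordered by demand estimated from access history.
import time

ALPHA = 0.3                          # weight given to recent accesses

class Farm:
    def __init__(self, crops):       # crops: {name: regenerate_function}
        self.crops = crops
        self.demand = {name: 0.0 for name in crops}
        self.store = {}              # harvested packages served to consumers

    def record_access(self, name):
        # each consumer access nudges the demand estimate towards 1
        self.demand[name] = ALPHA + (1 - ALPHA) * self.demand[name]

    def harvest(self, budget_seconds):
        """Run during low-load windows: refresh the most-demanded crops."""
        deadline = time.monotonic() + budget_seconds
        for name in sorted(self.crops, key=lambda n: -self.demand[n]):
            if time.monotonic() >= deadline:
                break                # window over; resume next downtime
            self.store[name] = self.crops[name]()   # regenerate the package
        # decay all estimates so stale demand fades between windows
        self.demand = {n: (1 - ALPHA) * d for n, d in self.demand.items()}
```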

Relevance: 100.00%

Publisher:

Abstract:

Sustainable development depends on maintaining ecosystem services which are concentrated in coastal marine and estuarine ecosystems. Analyses of the science needed to manage human uses of ecosystem services have concentrated on terrestrial ecosystems. Our focus is on the provision of multidisciplinary data needed to inform adaptive, ecosystem-based approaches (EBAs) for maintaining coastal ecosystem services based on comparative ecosystem analyses. Key indicators of pressures on coastal ecosystems, ecosystem states and the impacts of changes in states on services are identified for monitoring and analysis at a global coastal network of sentinel sites nested in the ocean-climate observing system. Biodiversity is targeted as the “master” indicator because of its importance to a broad spectrum of services. Ultimately, successful implementation of EBAs will depend on establishing integrated, holistic approaches to ocean governance that oversee the development of integrated, operational ocean observing systems based on the data and information requirements specified by a broad spectrum of stakeholders for sustainable development. Sustained engagement of such a spectrum of stakeholders on a global scale is not feasible. The global coastal network will need to be customized locally and regionally based on priorities established by stakeholders in their respective regions. The E.U. Marine Strategy Framework Directive and the U.S. Recommendations of the Interagency Ocean Policy Task Force are important examples of emerging regional scale approaches. The effectiveness of these policies will depend on the co-evolution of ocean policy and the observing system under the auspices of integrated ocean governance.

Relevance: 100.00%

Publisher:

Abstract:

Satellite remote sensing of ocean colour is the only method currently available for synoptically measuring wide-area properties of ocean ecosystems, such as phytoplankton chlorophyll biomass. Recently, a variety of bio-optical and ecological methods have been established that use satellite data to identify and differentiate between either phytoplankton functional types (PFTs) or phytoplankton size classes (PSCs). In this study, several of these techniques were evaluated against in situ observations to determine their ability to detect dominant phytoplankton size classes (micro-, nano- and picoplankton). The techniques were applied to a 10-year ocean-colour data series from the SeaWiFS satellite sensor and compared with in situ data (6504 samples) from a variety of locations in the global ocean. Results show that spectral-response, ecological and abundance-based approaches can all perform with similar accuracy. Detection of microplankton and picoplankton was generally better than detection of nanoplankton. Abundance-based approaches were shown to provide better spatial retrieval of PSCs. Individual model performance varied according to PSC, input satellite data sources and in situ validation data types. Uncertainty in the comparison procedure and data sources was considered. Improved availability of in situ observations would aid ongoing research in this field.
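
For readers unfamiliar with the abundance-based family, a sketch of a three-component model in the style of Brewin et al. is given below: chlorophyll in the combined pico+nano pool saturates as total chlorophyll C rises, pico saturates faster still, and microplankton take the remainder, so larger cells dominate richer waters. The parameter values are illustrative only, not the published fits.

```python
# Abundance-based partition of total chlorophyll C into size-class fractions
# (three-component, saturating-exponential form; illustrative parameters).
import math

CM_PN, S_PN = 0.78, 0.94   # asymptotic max and initial slope, pico+nano
CM_P,  S_P  = 0.13, 0.80   # asymptotic max and initial slope, pico

def size_fractions(C):
    """Fraction of total chlorophyll C (mg m^-3) in each size class."""
    c_pn = CM_PN * (1 - math.exp(-S_PN * C / CM_PN))  # pico + nano
    c_p  = CM_P  * (1 - math.exp(-S_P  * C / CM_P))   # picoplankton
    return {"pico": c_p / C,
            "nano": (c_pn - c_p) / C,
            "micro": (C - c_pn) / C}                  # residual: microplankton

for C in (0.05, 0.3, 3.0):
    print(C, size_fractions(C))  # picoplankton dominate oligotrophic water
```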

Relevance: 100.00%

Publisher:

Abstract:

This review examines interregional linkages and gives an overview perspective on marine ecosystem functioning in the north-eastern Atlantic. It is based on three of the 'systems' considered by the European Network of Excellence for Ocean Ecosystems Analysis (EUR-OCEANS, established in 2004 under the European Framework VI funding programme to promote integration of marine ecological research within Europe): the Arctic and Nordic Seas, the North Atlantic shelf seas and the North Atlantic. The three systems share common open boundaries, and the transport of water, heat, nutrients and particulates across these boundaries modifies local processes. Consistent with the EUR-OCEANS concept of 'end-to-end' analyses of marine food webs, the review takes an integrated approach, linking ocean physics to the lower trophic levels and working up the food web to top predators such as marine mammals. We begin with an overview of the regions, focusing on the major physical patterns and their implications for the microbial community, phytoplankton, zooplankton, fish and top predators. Human-induced links between the regional systems are then considered, and finally possible changes in the regional linkages over the next century are discussed. Because of the scale of the potential impacts of climate change, this issue is considered in a separate section. The review demonstrates that the functioning of the ecosystems in each of the regions cannot be considered in isolation, and that the role of the atmosphere and ocean currents in linking the North Atlantic Ocean, the North Atlantic shelf seas and the Arctic and Nordic Seas must be taken into account. Studying the North Atlantic and associated shelf seas as an integrated 'basin-scale' system will be a key challenge for the early twenty-first century. This requires a multinational approach that should lead to improved ecosystem-based approaches to the conservation of natural resources, the maintenance of biodiversity, and a better understanding of the key role of the north-eastern Atlantic in the global carbon cycle.

Relevance: 100.00%

Publisher:

Abstract:

Science-based approaches to support the conservation of marine biodiversity have been developed in recent years. They include measures of ‘rarity’, ‘diversity’ and ‘importance’, biological indicators of water ‘quality’, and measures of ‘sensitivity’. Identifying the sensitivity of species and biotopes, the main topic of this contribution, relies on accessing and interpreting available scientific data in a structured way and then using information technology to disseminate suitably presented information to decision makers. The Marine Life Information Network (MarLIN) has carried out that research for a range of environmentally critical species and biotopes over the past four years and has published the reviews on the MarLIN Web site (www.marlin.ac.uk). Now, by linking the sensitivity database to databases of survey information, sensitivity mapping approaches using GIS are being developed. The methods used to assess sensitivity are described, and the approach is advocated for wider application in Europe.