932 results for computation- and data-intensive applications


Relevance:

100.00%

Publisher:

Abstract:

Localization is an essential feature for many mobile wireless applications. Data collected by applications such as environmental monitoring, package tracking or position tracking is meaningless without the location at which it was gathered. Other applications use location information as a building block, for example geographic routing protocols, data dissemination protocols and location-based services such as sensing coverage. Most localization techniques trade off features such as the need for special hardware, the level of accuracy and computational cost. In this paper, we present an algorithm that extracts location constraints from connectivity information. Our solution, which requires no special hardware and only a small number of landmark nodes, uses two types of location constraints. Spatial constraints derive estimated locations by observing which nodes are within communication range of each other. Temporal constraints refine the areas computed by the spatial constraints, using properties of time and space extracted from a contact trace. The intuition behind the temporal constraints is to limit the possible locations of a node using its previous and future locations. To quantify the improvement gained by refining the nodes' estimated areas with temporal information, we performed simulations using synthetic and real contact traces. The results show this improvement and also the difficulties of using real traces.
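
A rough illustration of the two constraint types (a minimal sketch, not the paper's algorithm: representing areas as bounding boxes and the RANGE and V_MAX values are assumptions made for brevity):

```python
from dataclasses import dataclass

RANGE = 10.0   # assumed communication range
V_MAX = 1.5    # assumed maximum node speed (distance units per second)

@dataclass
class Box:
    xmin: float
    ymin: float
    xmax: float
    ymax: float

    def intersect(self, other):
        return Box(max(self.xmin, other.xmin), max(self.ymin, other.ymin),
                   min(self.xmax, other.xmax), min(self.ymax, other.ymax))

    def grow(self, r):
        # Every point reachable from this area within distance r.
        return Box(self.xmin - r, self.ymin - r, self.xmax + r, self.ymax + r)

def spatial_area(heard_landmarks):
    """Spatial constraint: the node lies in the intersection of the
    communication ranges (boxed here) of the landmarks it can hear."""
    area = Box(float("-inf"), float("-inf"), float("inf"), float("inf"))
    for (x, y) in heard_landmarks:
        area = area.intersect(Box(x - RANGE, y - RANGE, x + RANGE, y + RANGE))
    return area

def temporal_refine(prev_area, curr_area, next_area, dt):
    """Temporal constraint: the node cannot be farther than V_MAX*dt from
    its previous area, and must be able to reach its next area in time."""
    refined = curr_area
    if prev_area is not None:
        refined = refined.intersect(prev_area.grow(V_MAX * dt))
    if next_area is not None:
        refined = refined.intersect(next_area.grow(V_MAX * dt))
    return refined
```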

Relevance:

100.00%

Publisher:

Abstract:

Spectral methods of graph partitioning have been shown to provide a powerful approach to the image segmentation problem. In this paper, we adopt a different approach, based on estimating the isoperimetric constant of an image graph. Our algorithm produces the high-quality segmentations and data clusterings of spectral methods, but with improved speed and stability.
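
A minimal sketch of the isoperimetric formulation as commonly stated (assuming a precomputed sparse affinity matrix W for the image graph; the median cut at the end is an illustrative simplification of the full threshold sweep):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def isoperimetric_partition(W, ground=0):
    """Partition a graph with symmetric affinity matrix W by solving the
    reduced linear system L0 x = d0 (ground node removed). Solving a sparse
    linear system, rather than the eigenproblem of spectral methods, is the
    source of the speed and stability advantage."""
    d = np.asarray(W.sum(axis=1)).ravel()        # node degrees
    L = sp.csr_matrix(sp.diags(d) - W)           # graph Laplacian
    idx = np.flatnonzero(np.arange(W.shape[0]) != ground)
    x = np.zeros(W.shape[0])                     # grounded node stays at 0
    x[idx] = spsolve(L[idx][:, idx], d[idx])
    # A full implementation sweeps thresholds over x to minimise the
    # isoperimetric ratio |boundary| / volume; the median is a stand-in.
    return x > np.median(x)
```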

Relevance:

100.00%

Publisher:

Abstract:

Integrated nanowire electrodes that permit direct, sensitive and rapid electrochemical-based detection of chemical and biological species are a powerful emerging class of sensor devices. As critical dimensions of the electrodes enter the nanoscale, radial analyte diffusion profiles to the electrode dominate, with a corresponding enhancement in mass transport, steady-state sigmoidal voltammograms, low depletion of target molecules and faster analysis. To optimise these sensors it is necessary to fully understand the factors that influence performance limits, including: electrode geometry, electrode dimensions, electrode separation distances (within nanowire arrays) and diffusional mass transport. Therefore, in this thesis, theoretical simulations of analyte diffusion occurring at a variety of electrode designs were undertaken using Comsol Multiphysics®. Sensor devices were fabricated and corresponding experiments were performed to test the simulation results. Two approaches for the fabrication and integration of metal nanowire electrodes are presented: Template Electrodeposition and Electron-Beam Lithography. These approaches allow for the fabrication of nanowires which may subsequently be integrated at silicon chip substrates to form fully functional electrochemical devices. Simulated and experimental results were found to be in excellent agreement, validating the simulation model. The electrochemical characteristics exhibited by nanowire electrodes fabricated by electron-beam lithography were compared directly against the electrochemical performance of a commercial ultra-microdisc electrode. Steady-state cyclic voltammograms in ferrocenemonocarboxylic acid at single ultra-microdisc electrodes were observed at low to medium scan rates (≤ 500 mV s⁻¹). At nanowires, steady-state responses were observed at ultra-high scan rates (up to 50,000 mV s⁻¹), thus allowing for much faster analysis (20 ms). Approaches for elucidating the faradaic signal without the requirement for background subtraction were also developed. Furthermore, the diffusional processes occurring at arrays with increasing inter-electrode distance and increasing numbers of nanowires were explored. Diffusion profiles existing at nanowire arrays were simulated with Comsol Multiphysics®. A range of scan rates was modelled, and experiments were undertaken at 5,000 mV s⁻¹, since this allows the rapid data capture required for, e.g., biomedical, environmental and pharmaceutical diagnostic applications.
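
For orientation, a back-of-envelope check (not from the thesis) of the steady-state limiting current expected at an ultra-microdisc, i_ss = 4nFDCa; the diffusion coefficient, concentration and disc radius below are assumed values:

```python
# Steady-state limiting current at an ultra-microdisc electrode: i_ss = 4nFDCa.
F = 96485.0   # Faraday constant, C/mol
n = 1         # electrons transferred (assumed one-electron couple)
D = 5.7e-10   # diffusion coefficient, m^2/s (assumed value for FcCOOH)
C = 1.0       # bulk concentration, mol/m^3 (i.e. 1 mM, assumed)
a = 12.5e-6   # disc radius, m (assumed 25 um diameter disc)

i_ss = 4 * n * F * D * C * a
print(f"steady-state current: {i_ss * 1e9:.1f} nA")   # ~2.7 nA
```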

Relevance:

100.00%

Publisher:

Abstract:

This paper describes a methodology for detecting anomalies from sequentially observed and potentially noisy data. The proposed approach consists of two main elements: 1) filtering, or assigning a belief or likelihood to each successive measurement based upon our ability to predict it from previous noisy observations and 2) hedging, or flagging potential anomalies by comparing the current belief against a time-varying and data-adaptive threshold. The threshold is adjusted based on the available feedback from an end user. Our algorithms, which combine universal prediction with recent work on online convex programming, do not require computing posterior distributions given all current observations and involve simple primal-dual parameter updates. At the heart of the proposed approach lie exponential-family models which can be used in a wide variety of contexts and applications, and which yield methods that achieve sublinear per-round regret against both static and slowly varying product distributions with marginals drawn from the same exponential family. Moreover, the regret against static distributions coincides with the minimax value of the corresponding online strongly convex game. We also prove bounds on the number of mistakes made during the hedging step relative to the best offline choice of the threshold with access to all estimated beliefs and feedback signals. We validate the theory on synthetic data drawn from a time-varying distribution over binary vectors of high dimensionality, as well as on the Enron email dataset. © 1963-2012 IEEE.
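
A hedged sketch of the filter-then-hedge loop described above (the exponential-smoothing update and threshold rule below are simplifications; the paper's algorithm uses universal prediction and primal-dual updates):

```python
import numpy as np

def detect(stream, feedback, eta_p=0.05, eta_tau=0.1):
    """stream: (T, d) array of binary observations; feedback(t, flag) returns
    +1 (anomaly), -1 (normal) or None when the end user gives no feedback."""
    p = np.full(stream.shape[1], 0.5)   # Bernoulli marginals (exp. family)
    tau, eps, flags = 0.0, 1e-6, []
    for t, x in enumerate(stream):
        # Filtering: belief = log-likelihood of x under the product model.
        belief = np.sum(x * np.log(p + eps) + (1 - x) * np.log(1 - p + eps))
        flag = belief < tau             # hedging: compare against threshold
        flags.append(flag)
        y = feedback(t, flag)
        if y is not None and (y == 1) != flag:
            tau += eta_tau * y          # raise tau on a miss, lower on a false alarm
        p = (1 - eta_p) * p + eta_p * x # track slowly varying marginals
    return flags
```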

Relevance:

100.00%

Publisher:

Abstract:

We describe a general technique for determining upper bounds on maximal values (or lower bounds on minimal costs) in stochastic dynamic programs. In this approach, we relax the nonanticipativity constraints that require decisions to depend only on the information available at the time a decision is made and impose a "penalty" that punishes violations of nonanticipativity. In applications, the hope is that this relaxed version of the problem will be simpler to solve than the original dynamic program. The upper bounds provided by this dual approach complement lower bounds on values that may be found by simulating with heuristic policies. We describe the theory underlying this dual approach and establish weak duality, strong duality, and complementary slackness results that are analogous to the duality results of linear programming. We also study properties of good penalties. Finally, we demonstrate the use of this dual approach in an adaptive inventory control problem with an unknown and changing demand distribution and in valuing options with stochastic volatilities and interest rates. These are complex problems of significant practical interest that are quite difficult to solve to optimality. In these examples, our dual approach requires relatively little additional computation and leads to tight bounds on the optimal values. © 2010 INFORMS.
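
The shape of the dual bound can be sketched generically (a skeleton under assumed callable interfaces, not the paper's implementation): relax nonanticipativity by handing the inner solver the entire scenario up front, subtract the penalty, and average over samples. Weak duality guarantees that any dual-feasible penalty yields a valid upper bound; a good penalty makes it tight.

```python
import numpy as np

def dual_upper_bound(sample_scenario, inner_solve, penalty, n=1000, seed=0):
    """sample_scenario(rng) -> one full realization of the uncertainty;
    inner_solve(scenario, penalty) -> max over anticipative action sequences
    of reward(scenario, actions) - penalty(scenario, actions)."""
    rng = np.random.default_rng(seed)
    vals = np.array([inner_solve(sample_scenario(rng), penalty)
                     for _ in range(n)])
    return vals.mean(), vals.std(ddof=1) / np.sqrt(n)   # bound and std. error
```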

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Sharing of epidemiological and clinical data sets among researchers is poor at best, to the detriment of science and the community at large. The purpose of this paper is therefore to (1) describe a novel Web application designed to share information on study data sets, focusing on epidemiological and clinical research, in a collaborative environment and (2) create a policy model placing this collaborative environment into the current scientific social context. METHODOLOGY: The Database of Databases application was developed based on feedback from epidemiologists and clinical researchers requiring a Web-based platform that would allow for sharing of information about epidemiological and clinical study data sets in a collaborative environment. This platform should ensure that researchers can modify the information. Model-based predictions of the number of publications and the funding resulting from combinations of different policy implementation strategies (for metadata and data sharing) were generated using System Dynamics modeling. PRINCIPAL FINDINGS: The application allows researchers to easily upload information about clinical study data sets, which is searchable and modifiable by other users in a wiki environment. All modifications are filtered by the database principal investigator in order to maintain quality control. The application has been extensively tested and currently contains 130 clinical study data sets from the United States, Australia, China and Singapore. Model results indicated that any policy implementation would be better than the current strategy, that metadata sharing is better than data sharing, and that combined policies achieve the best results in terms of publications. CONCLUSIONS: Based on our empirical observations and the resulting model, the social network environment surrounding the application can help epidemiologists and clinical researchers contribute and search for metadata in a collaborative environment, thus potentially facilitating collaboration efforts among research communities distributed around the globe.
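
To illustrate the kind of policy comparison the System Dynamics model performs, a toy stock-and-flow sketch (every coefficient below is invented for illustration; the paper's calibrated model differs):

```python
def publications(metadata_sharing=0.0, data_sharing=0.0, years=10.0, dt=0.25):
    """Euler-integrate two stocks: shared data sets and resulting publications."""
    datasets, pubs = 10.0, 0.0
    for _ in range(int(years / dt)):
        inflow = 5.0 * (1 + 2.0 * metadata_sharing + 1.0 * data_sharing)
        datasets += inflow * dt          # data sets shared per year
        pubs += 0.3 * datasets * dt      # publications per shared data set
    return pubs

# Compare policy combinations (none, metadata only, data only, both):
for m, d in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print(f"metadata={m} data={d}: {publications(m, d):.0f} publications")
```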

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams in which no member has complete knowledge of the counterparts' fields. As a result, knowledge exchange may often be characterized by miscommunication, leading to misinterpretation and ultimately resulting in errors in research and even in clinical practice. Although communication has a central role in interdisciplinary collaboration, and miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. METHODS/PRINCIPAL FINDINGS: We conducted a qualitative analysis of encounters between clinical researchers and data analysis specialists (an epidemiologist, a clinical epidemiologist, and a data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for the extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major themes emerged. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of "what if" situations that helped clarify how the method or information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining its main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal. CONCLUSION/SIGNIFICANCE: The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors.

Relevance:

100.00%

Publisher:

Abstract:

The advent of digital microfluidic lab-on-a-chip (LoC) technology offers a platform for developing diagnostic applications with the advantages of portability, reduction of the volumes of the sample and reagents, faster analysis times, increased automation, low power consumption, compatibility with mass manufacturing, and high throughput. Moreover, digital microfluidics is being applied in other areas such as airborne chemical detection, DNA sequencing by synthesis, and tissue engineering. In most diagnostic and chemical-detection applications, a key challenge is the preparation of the analyte for presentation to the on-chip detection system. Thus, in diagnostics, raw physiological samples must be introduced onto the chip and then further processed by lysing blood cells and extracting DNA. For massively parallel DNA sequencing, sample preparation can be performed off chip, but the synthesis steps must be performed in a sequential on-chip format by automated control of buffers and nucleotides to extend the read lengths of DNA fragments. In airborne particulate-sampling applications, the sample collection from an air stream must be integrated into the LoC analytical component, which requires a collection droplet to scan an exposed impacted surface after its introduction into a closed analytical section. Finally, in tissue-engineering applications, the challenge for LoC technology is to build high-resolution (less than 10 microns) 3D tissue constructs with embedded cells and growth factors by manipulating and maintaining live cells in the chip platform. This article discusses these applications and their implementation in digital-microfluidic LoC platforms. © 2007 IEEE.

Relevance:

100.00%

Publisher:

Abstract:

The importance of patterns in constructing complex systems has long been recognised in other disciplines. In software engineering, for example, well-crafted object-oriented architectures contain several design patterns. Focusing on mechanisms of constructing software during system development can yield an architecture that is simpler, clearer and more understandable than if design patterns were ignored or not properly applied. In this paper, we propose a model that uses object-oriented design patterns to develop a core bitemporal conceptual model. We define three core design patterns that form a core bitemporal conceptual model of a typical bitemporal object. Our framework is known as the Bitemporal Object, State and Event Modelling Approach (BOSEMA) and the resulting core model is known as a Bitemporal Object, State and Event (BOSE) model. Using this approach, we demonstrate that we can enrich data modelling by using well-known design patterns, which can help designers build complex models of bitemporal databases.
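
To make the bitemporal notion concrete, a minimal sketch of a bitemporal state carrying both valid time and transaction time (field names are hypothetical, not BOSEMA's published notation):

```python
from dataclasses import dataclass
from datetime import date

MAX = date(9999, 12, 31)   # conventional "until further notice" sentinel

@dataclass
class BitemporalState:
    value: str
    valid_from: date   # when the fact holds in the modelled world
    valid_to: date
    tx_from: date      # when the fact was recorded in the database
    tx_to: date        # when it was logically superseded

history = [
    # An address as first recorded...
    BitemporalState("12 Oak St", date(2020, 1, 1), MAX, date(2020, 1, 5), date(2021, 3, 2)),
    # ...later corrected retroactively: new transaction time, same valid time.
    BitemporalState("21 Oak St", date(2020, 1, 1), MAX, date(2021, 3, 2), MAX),
]

def as_of(history, valid, tx):
    """What did the database believe, at transaction time tx, was true at
    valid time valid? The characteristic bitemporal query."""
    return [s for s in history
            if s.valid_from <= valid < s.valid_to and s.tx_from <= tx < s.tx_to]
```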

Relevance:

100.00%

Publisher:

Abstract:

Drugs based on 5-phenyl-2,4-diaminopyrimidine and 6-phenyl-1,2,4-triazine derivatives are well known for their effects on the central nervous system. The study presented here provides detailed crystal structures of two pyrimidine derivatives which have neuroprotective properties in models of both grey and white matter ischemia. Recently published studies suggest that the compounds lamotrigine (a triazine derivative) and the two pyrimidines BW1003C87 (I) and sipatrigine (II) mediate their primary in vivo mode of action by inhibiting voltage-gated Na+ channels. The X-ray crystal structures will contribute valuable data for applications involving binding and modelling studies of the biological actions of these drugs.

Relevance:

100.00%

Publisher:

Abstract:

This is the first report from ALT’s new Annual Survey, launched in December 2014. The survey was primarily for ALT members (individuals, or staff at an organisation which is an organisational member), though it could also be completed by others, perhaps those interested in taking out membership. The report and data highlight emerging work areas that are important to the survey respondents. Analysis of the survey responses indicates a number of areas ALT should continue to support and develop. Priorities for the membership are ‘Intelligent use of learning technology’ and ‘Research and practice’; aligned to this is the value respondents place on communication via the ALT Newsletter/News, social media and Research in Learning Technology. The survey also reveals that ‘Data and Analytics’ and ‘Open Education’ are areas that the majority of respondents find increasingly important, so our community may benefit from development opportunities ALT can provide. The survey is also a reminder that ALT has an essential role in enabling members to develop research and practice in areas which might be considered minority interests. For example, whilst the majority of respondents did not indicate areas such as ‘Digital and Open Badges’ and ‘Game Based Learning’ as important, there are still members who consider these areas very significant and increasingly valuable, and ALT will continue to better support these groups within our community. Whilst ALT has conducted previous surveys of the ALT membership, this is the first iteration in this form. ALT has committed to surveying the sector on an annual basis, refining the core question set while trying to preserve an opportunity for longitudinal analysis.

Relevance:

100.00%

Publisher:

Abstract:

This paper reviews the utility and availability of biological and ecological traits for marine species so as to prioritise the development of a world database on marine species traits. In addition, the ‘status’ of species for conservation, that is, whether they are introduced or invasive, of fishery or aquaculture interest, harmful, or used as an ecological indicator, was reviewed, because these attributes are of particular interest to society. Whereas traits are an enduring characteristic of a species and/or population, a species’ status may vary geographically and over time. The criteria for selecting traits were that they could be applied to most taxa, were easily available, and that their inclusion would result in new research and/or management applications. Numerical traits were favoured over categorical ones. Habitat was excluded, as it can be derived from a selection of these traits. Ten traits were prioritised for inclusion in the most comprehensive open-access database on marine species (the World Register of Marine Species), namely taxonomic classification, environment, geography, depth, substratum, mobility, skeleton, diet, body size and reproduction. These traits and statuses are being added to the database, and new use cases may further subdivide and expand upon them.

Relevance:

100.00%

Publisher:

Abstract:

The Grey Level Co-occurrence Matrix (GLCM), one of the best-known tools for texture analysis, estimates image properties related to second-order statistics. These image properties, commonly known as Haralick texture features, can be used for image classification, image segmentation and remote sensing applications. However, their computation is highly intensive, especially for very large images such as medical ones. Therefore, methods to accelerate their computation are highly desired. This paper proposes the use of programmable hardware to accelerate the calculation of the GLCM and Haralick texture features. Further, as an example of the speedup offered by programmable logic, a multispectral computer vision system for automatic diagnosis of prostatic cancer has been implemented. The performance is then compared against a microprocessor-based solution.
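
As a software reference point for what the programmable-hardware design accelerates, a naive single-offset GLCM and the Haralick contrast feature (an illustrative sketch; the FPGA datapath in the paper is organised differently):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=256):
    """Count co-occurrences of grey levels (i, j) at pixel offset (dx, dy),
    then normalise to joint probabilities."""
    m = np.zeros((levels, levels), dtype=np.uint64)
    h, w = img.shape
    for y in range(max(0, -dy), h - max(0, dy)):
        for x in range(max(0, -dx), w - max(0, dx)):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()

def contrast(p):
    """Haralick contrast: sum over (i, j) of (i - j)^2 * p(i, j)."""
    i, j = np.indices(p.shape)
    return np.sum((i - j) ** 2 * p)
```

The nested Python loops make the quadratic cost plain; this per-pixel-pair counting is exactly the work that parallel hardware can pipeline.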

Relevance:

100.00%

Publisher:

Abstract:

MOOCs are changing the educational landscape and gaining a lot of attention in the scientific literature. However, the pedagogical design of these proposals has been called into question. It is precisely MOOCs’ social aspect, i.e. the interaction between course participants and the support for learning processes, that has become one of the main topics of interest. This article presents the results of a research project carried out at the University of the Basque Country, which focused on cooperative learning and the intensive use of social networks in a MOOC. Significant data were compiled through Likert-type surveys, revealing that the use of both external and internal social networks in a massive open online course is a factor that students evaluate positively. We argue that the use of social networks as a learning strategy in a MOOC has an influence on academic performance and on the students' success rate. Furthermore, the participants’ age also has a bearing on the social networks they use: we found that the younger members tend to work with external networks such as Twitter or personal blogs, whereas the older students are more inclined to use forums on the Chamilo or Ning platforms.

Relevance:

100.00%

Publisher:

Abstract:

The future convergence of voice, video and data applications on the Internet requires that next-generation technology provide bandwidth and delay guarantees. Current technology trends are moving towards scalable aggregate-based systems in which applications are grouped together and guarantees are provided at the aggregate level only. This solution alone is not enough for interactive video applications with sub-second delay bounds. This paper introduces a novel packet marking scheme that controls the end-to-end delay of an individual flow as it traverses a network enabled to supply aggregate-granularity Quality of Service (QoS). IPv6 Hop-by-Hop extension header fields are used to track the packet delay encountered at each network node, and autonomous decisions are made on the best queuing strategy to employ. The results of network simulations are presented, and it is shown that when the proposed mechanism is employed the requested delay bound is met with a 20% reduction in resource reservation and no packet loss in the network.
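
A conceptual sketch of the per-hop decision such a marking scheme implies (the field layout, names and queue-selection rule are assumptions for illustration, not the paper's exact mechanism):

```python
from dataclasses import dataclass

@dataclass
class DelayOption:                # stands in for the Hop-by-Hop option fields
    budget_us: int                # end-to-end delay bound requested by the flow
    consumed_us: int = 0          # delay accumulated so far, updated per hop

def choose_queue(pkt: DelayOption, hops_left: int, queue_delay_us: dict) -> str:
    """Pick the most relaxed queue whose delay still lets the packet meet
    its bound, spreading the remaining slack over the remaining hops."""
    allowance = (pkt.budget_us - pkt.consumed_us) / max(hops_left, 1)
    # Queues ordered from most relaxed (largest delay) to most expensive.
    for name, d in sorted(queue_delay_us.items(), key=lambda kv: -kv[1]):
        if d <= allowance:
            pkt.consumed_us += d
            return name
    pkt.consumed_us += min(queue_delay_us.values())   # fall back to fastest
    return min(queue_delay_us, key=queue_delay_us.get)

# Example: 1500 us of slack left, 3 hops to go -> the mid-priority queue fits.
pkt = DelayOption(budget_us=5000, consumed_us=3500)
print(choose_queue(pkt, 3, {"best_effort": 900, "assured": 450, "expedited": 120}))
```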