913 results for 100602 Input Output and Data Devices


Relevance: 100.00%

Abstract:

A nonparametric probability estimation procedure using the fuzzy ARTMAP neural network is described here. Because the procedure does not make a priori assumptions about underlying probability distributions, it yields accurate estimates on a wide variety of prediction tasks. Fuzzy ARTMAP is used to perform probability estimation in two different modes. In a 'slow-learning' mode, input-output associations change slowly, with the strength of each association computing a conditional probability estimate. In 'max-nodes' mode, a fixed number of categories are coded during an initial fast-learning interval, and weights are then tuned by slow learning. Simulations illustrate system performance on tasks in which varying numbers of clusters in the set of input vectors map to a given class.
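As a toy illustration of the 'slow-learning' mode described above, the sketch below drifts an association weight toward the empirical conditional probability of a class given a category node. The update rule, learning rate and data are hypothetical simplifications for illustration, not the fuzzy ARTMAP architecture itself:

```python
import random

def slow_learning_estimates(samples, num_categories, num_classes, lr=0.01):
    """Estimate P(class | category) by slow incremental weight updates.

    Each association weight drifts toward the empirical conditional
    probability, sketching the idea of the 'slow-learning' mode.
    """
    w = [[1.0 / num_classes] * num_classes for _ in range(num_categories)]
    for category, cls in samples:
        for k in range(num_classes):
            target = 1.0 if k == cls else 0.0
            w[category][k] += lr * (target - w[category][k])
    return w

random.seed(0)
# Category 0 emits class 1 with probability 0.8, class 0 otherwise.
data = [(0, 1 if random.random() < 0.8 else 0) for _ in range(5000)]
est = slow_learning_estimates(data, num_categories=1, num_classes=2)
```

Because each update moves the weight a small step toward a 0/1 indicator, the weight settles near the true conditional probability, and the per-category weights always sum to one.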

Relevance: 100.00%

Abstract:

Body Sensor Network (BSN) technology is seeing a rapid emergence in application areas such as health, fitness and sports monitoring. Current BSN wireless sensors typically operate on a single frequency band (e.g. utilizing the IEEE 802.15.4 standard that operates at 2.45 GHz), employing a single radio transceiver for wireless communications. This allows a simple wireless architecture to be realized with low cost and power consumption. However, network congestion or failure can create potential issues in terms of reliability of data transfer, quality-of-service (QoS) and data throughput for the sensor. These issues can be especially critical in healthcare monitoring applications, where data availability and integrity are crucial. The addition of more than one radio has the potential to address some of the above issues. For example, multi-radio implementations can allow access to more than one network, providing increased coverage and data processing as well as improved interoperability between networks. A small number of multi-radio wireless sensor solutions exist at present, but they require more than one radio transceiver device to achieve multi-band operation. This paper presents the design of a novel prototype multi-radio hardware platform that uses a single radio transceiver. The proposed design allows multi-band operation in the 433/868 MHz ISM bands and this, together with its low complexity and small form factor, makes it suitable for a wide range of BSN applications.

Relevance: 100.00%

Abstract:

Motivated by accurate average-case analysis, MOdular Quantitative Analysis (MOQA) has been developed at the Centre for Efficiency Oriented Languages (CEOL). In essence, MOQA allows the programmer to determine the average running time of a broad class of programs directly from the code in a (semi-)automated way. The MOQA approach has the property of randomness preservation, which means that applying any operation to a random structure results in an output isomorphic to one or more random structures; this property is key to systematic timing. Based on original MOQA research, we discuss the design and implementation of a new domain-specific scripting language based on randomness-preserving operations and random structures. It is designed to facilitate compositional timing by systematically tracking the distributions of inputs and outputs. The notion of a labelled partial order (LPO) is the basic data type in the language. The programmer uses built-in MOQA operations together with restricted control-flow statements to design MOQA programs. This MOQA language is formally specified both syntactically and semantically in this thesis. A practical language interpreter implementation is provided and discussed. By analysing new algorithms and data-restructuring operations, we demonstrate the wide applicability of the MOQA approach. We also extend MOQA theory to a number of other domains besides average-case analysis, showing strong connections between MOQA and parallel computing, reversible computing and data entropy analysis.
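MOQA derives average-case time compositionally from the code itself; as a much simpler illustration of what an exact average-case figure means, the sketch below brute-forces one by averaging the comparison count of insertion sort over all permutations of four elements (an exhaustive stand-in for illustration, not the MOQA method):

```python
from itertools import permutations

def insertion_sort_comparisons(seq):
    """Count the key comparisons made by a standard insertion sort."""
    a = list(seq)
    comps = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            comps += 1          # one evaluation of a[j] > key
            if a[j] > key:
                a[j + 1] = a[j]  # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comps

# Exact average over the uniform distribution on inputs: enumerate all 4! orders.
counts = [insertion_sort_comparisons(p) for p in permutations(range(4))]
average = sum(counts) / len(counts)
```

The exhaustive average here is 59/12 ≈ 4.92 comparisons, between the best case (3, already sorted) and the worst case (6, reversed); this enumeration is exponential in the input size, which is precisely the cost that compositional approaches avoid.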

Relevance: 100.00%

Abstract:

The aim of this project is to integrate neuronal cell culture with commercial or in-house built micro-electrode arrays and MEMS devices. The resulting device is intended to support neuronal cell culture on its surface, expose specific portions of a neuronal population to different environments using microfluidic gradients, and stimulate/record neuronal electrical activity using micro-electrode arrays. Additionally, through integration of chemical surface patterning, such a device can be used to build neuronal cell networks of specific size, conformation and composition. The design of this device takes inspiration from the nervous system, because its development and regeneration are heavily influenced by surface chemistry and fluidic gradients. Hence, this device is intended to be a step forward in neuroscience research because it utilizes concepts similar to those found in nature. A large part of this research revolved around solving technical issues associated with integrating biology, surface chemistry, electrophysiology and microfluidics. Commercially available micro-electrode arrays (MEAs) are mechanically and chemically brittle, making them unsuitable for certain surface modification and microfluidic integration techniques described in the literature. In order to successfully integrate all the aspects into one device, some techniques were heavily modified to ensure that their effects on the MEA were minimal. In terms of experimental work, this thesis consists of three parts. The first part dealt with characterization and optimization of surface patterning and microfluidic perfusion. Through extensive image analysis, the optimal conditions required for micro-contact printing and microfluidic perfusion were determined. The second part applied a number of these optimized techniques to culturing patterned neural cells on a range of substrates, including Pyrex, cyclo-olefin and SiN-coated Pyrex. The second part also described culturing neurons on MEAs and recording electrophysiological activity. The third part of the thesis described the integration of MEAs with patterned neuronal culture and microfluidic devices. Although integration of all methodologies proved difficult, a large amount of data relating to biocompatibility, neuronal patterning, electrophysiology and integration was collected. Original solutions were successfully applied to a number of issues relating to the consistency of micro-contact printing and microfluidic integration, leading to successful integration of the techniques and device components.

Relevance: 100.00%

Abstract:

The mobile cloud computing model promises to address the resource limitations of mobile devices, but effectively implementing this model is difficult. Previous work on mobile cloud computing has required the user to have a continuous, high-quality connection to the cloud infrastructure. This is undesirable and possibly infeasible: the energy required on the mobile device to maintain a connection and transfer sizeable amounts of data is large, and the bandwidth tends to be variable and low on cellular networks. The cloud deployment itself also needs to allocate scalable resources to the user efficiently. In this paper, we formulate best practices for efficiently managing the resources required by the mobile cloud model, namely energy, bandwidth and cloud computing resources. These practices can be realised with our mobile cloud middleware project, featuring the Cloud Personal Assistant (CPA). We compare this with other approaches in the area to highlight the importance of minimising the usage of these resources and thereby ensuring successful adoption of the model by end users. Based on results from experiments performed with mobile devices, we develop a no-overhead decision model for offloading tasks and data to the CPA of a user, which provides efficient management of mobile cloud resources.
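A minimal sketch of the kind of energy-based offloading decision such a model makes: offload only when the energy spent transmitting the task's data is less than the energy of computing it locally. The function name and its parameters (transmit power, uplink rate) are hypothetical placeholders, not the paper's CPA decision model:

```python
def should_offload(data_bytes, local_compute_j, uplink_bps, tx_power_w):
    """Hypothetical energy-based offloading rule (illustrative sketch).

    Compares the radio energy needed to upload the task's data against
    the energy the device would spend computing the task locally.
    """
    tx_time_s = data_bytes * 8 / uplink_bps        # time on the radio
    transfer_energy_j = tx_time_s * tx_power_w     # energy to upload
    return transfer_energy_j < local_compute_j

# Heavy computation over little data: offloading pays off.
heavy_small = should_offload(10_000, local_compute_j=50.0,
                             uplink_bps=1_000_000, tx_power_w=1.5)
# Light computation over a lot of data: compute locally.
light_large = should_offload(50_000_000, local_compute_j=0.5,
                             uplink_bps=1_000_000, tx_power_w=1.5)
```

A real decision model would also weigh bandwidth variability, cloud queueing time and result-download energy; this sketch isolates only the core trade-off.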

Relevance: 100.00%

Abstract:

BACKGROUND: Sharing of epidemiological and clinical data sets among researchers is poor at best, to the detriment of science and the community at large. The purpose of this paper is therefore to (1) describe a novel Web application designed to share information on study data sets, focusing on epidemiological and clinical research in a collaborative environment, and (2) create a policy model placing this collaborative environment in the current scientific social context. METHODOLOGY: The Database of Databases application was developed based on feedback from epidemiologists and clinical researchers requiring a Web-based platform that would allow information about epidemiological and clinical study data sets to be shared in a collaborative environment. This platform should ensure that researchers can modify the information. Model-based predictions of the number of publications and the funding resulting from combinations of different policy implementation strategies (for metadata and data sharing) were generated using System Dynamics modeling. PRINCIPAL FINDINGS: The application allows researchers to easily upload information about clinical study data sets, which is searchable and modifiable by other users in a wiki environment. All modifications are filtered by the database principal investigator in order to maintain quality control. The application has been extensively tested and currently contains 130 clinical study data sets from the United States, Australia, China and Singapore. Model results indicated that any policy implementation would be better than the current strategy, that metadata sharing is better than data sharing, and that combined policies achieve the best results in terms of publications.
CONCLUSIONS: Based on our empirical observations and the resulting model, the social network environment surrounding the application can assist epidemiologists and clinical researchers in contributing and searching for metadata in a collaborative environment, thus potentially facilitating collaboration among research communities distributed around the globe.

Relevance: 100.00%

Abstract:

BACKGROUND: The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, where none of them has complete knowledge of their counterparts' fields. As a result, knowledge exchange may often be characterized by miscommunication, leading to misinterpretation and ultimately to errors in research and even in clinical practice. Although communication has a central role in interdisciplinary collaboration and miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. METHODS/PRINCIPAL FINDINGS: We conducted qualitative analysis of encounters between clinical researchers and data analysis specialists (an epidemiologist, a clinical epidemiologist, and a data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major themes emerged. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of "what if" situations that helped clarify how a method or piece of information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining its main purpose. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, helping specialists understand the current context based on an understanding of their final goal.
CONCLUSION/SIGNIFICANCE: The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors.

Relevance: 100.00%

Abstract:

Numerical simulations have been used to study broad-band supercontinuum generation in optical fibers with dispersion and nonlinearity characteristics typical of photonic crystal or tapered fiber structures. The simulations include optical shock and Raman nonlinearity terms, with quantum noise taken into account phenomenologically by including in the input field a noise seed of one photon per mode with random phase. For input pulses of 150-fs duration injected in the anomalous dispersion regime, the effect of modulational instability is shown to lead to severe temporal jitter in the output, and to associated fluctuations in the spectral amplitude and phase across the generated supercontinuum. The spectral phase fluctuations are quantified by performing multiple simulations and calculating both the standard deviation of the phase and, more rigorously, the degree of first-order coherence as a function of wavelength across the spectrum. By performing simulations over a range of input pulse durations and wavelengths, we identify the conditions under which coherent supercontinua with a well-defined spectral phase are generated.
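The degree of first-order coherence mentioned above can be estimated from an ensemble of simulated fields by averaging the normalised cross-correlation over all independent shot pairs. The sketch below applies this estimator to synthetic phase-stable and random-phase ensembles (illustrative data, not actual supercontinuum simulations):

```python
import numpy as np

def degree_of_coherence(spectra):
    """Modulus of the degree of first-order coherence |g12| per spectral bin,
    averaged over all independent pairs of shots E[k, wavelength] (sketch)."""
    E = np.asarray(spectra)
    n = E.shape[0]
    num = np.zeros(E.shape[1], dtype=complex)
    pairs = 0
    for i in range(n):
        for j in range(i + 1, n):
            num += np.conj(E[i]) * E[j]   # cross-correlation of shot pair
            pairs += 1
    mean_power = np.mean(np.abs(E) ** 2, axis=0)
    return np.abs(num / pairs) / mean_power

rng = np.random.default_rng(1)
stable = np.tile(np.exp(1j * np.linspace(0, 3, 64)), (20, 1))  # identical shots
noisy = np.exp(1j * rng.uniform(0, 2 * np.pi, size=(20, 64)))  # random phases
g_stable = degree_of_coherence(stable)
g_noisy = degree_of_coherence(noisy)
```

A shot-to-shot stable spectral phase gives |g12| = 1 across the spectrum, while fully randomised phases average toward zero, which is exactly the distinction the coherence measure is used to draw.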

Relevance: 100.00%

Abstract:

This paper describes modeling technology and its use in providing data governing the assembly and subsequent reliability of electronic chip components on printed circuit boards (PCBs). Products such as mobile phones, camcorders and intelligent displays are changing at a tremendous rate, with newer technologies being applied to satisfy the demand for smaller products with increased functionality. At ever-decreasing dimensions and with an increasing number of input/output connections, the design of these components, in terms of the dimensions and materials used, plays a key role in determining the reliability of the final assembly. Multiphysics modeling techniques are being adopted to predict a range of interacting physics-based phenomena associated with the manufacturing process, for example heat transfer, solidification, Marangoni fluid flow, void movement, and thermal stress. The modeling techniques used are based on finite volume methods that are conservative and take advantage of being able to represent the physical domain using an unstructured mesh. These techniques are also used to provide data on thermally induced fatigue, which is then mapped into product lifetime predictions.
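As a minimal illustration of why finite volume methods are conservative, the sketch below advances a one-dimensional heat-conduction problem: each face flux is computed once and applied with opposite signs to its two neighbouring cells, so total energy is preserved by construction. This is a toy model, far simpler than the coupled multiphysics solvers described above:

```python
def fv_step(T, alpha, dx, dt):
    """Advance cell-average temperatures one explicit finite-volume step.

    Fluxes live on cell faces; boundary faces are left at zero (insulated),
    so the scheme conserves the total energy exactly.
    """
    n = len(T)
    flux = [0.0] * (n + 1)
    for f in range(1, n):                     # interior faces only
        flux[f] = -alpha * (T[f] - T[f - 1]) / dx   # Fourier's law
    return [T[i] + dt / dx * (flux[i] - flux[i + 1]) for i in range(n)]

T = [100.0] + [0.0] * 9                       # hot cell at one end
for _ in range(200):
    T = fv_step(T, alpha=1.0, dx=1.0, dt=0.4)  # dt <= dx^2/(2*alpha) for stability
```

Because every interior flux enters one cell and leaves its neighbour, the sum of cell energies never changes; the profile simply relaxes toward a uniform temperature.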

Relevance: 100.00%

Abstract:

A novel circuit design technique is presented which improves gain accuracy and linearity in differential amplifiers. The technique employs negative impedance compensation, and results demonstrate a significant performance improvement: higher precision, lower sensitivity, and wide dynamic range. A theoretical underpinning is given, together with the results of a demonstrator differential input/output amplifier with a gain of 12 dB. The simulation results show that, with the novel method, both the gain accuracy and the linearity can be improved greatly; in particular, the linearity improvement in IMD can exceed 23 dB at the required gain.

Relevance: 100.00%

Abstract:

This is the first report from ALT's new Annual Survey, launched in December 2014. The survey was primarily for ALT members (individuals, or people at an organisation which is an organisational member), but it could also be filled in by others, perhaps those interested in taking out membership. The report and data highlight emerging work areas that are important to the survey respondents. Analysis of the survey responses indicates a number of areas ALT should continue to support and develop. Priorities for the membership are 'Intelligent use of learning technology' and 'Research and practice'; aligned to this is the value respondents place on communication via the ALT Newsletter/News, social media and Research in Learning Technology. The survey also reveals that 'Data and Analytics' and 'Open Education' are areas the majority of respondents find increasingly important, so our community may benefit from development opportunities ALT can provide. The survey is also a reminder that ALT has an essential role in enabling members to develop research and practice in areas which might be considered minority interests. For example, whilst the majority of respondents did not indicate areas such as 'Digital and Open Badges' and 'Game Based Learning' as important, there are still members who consider these areas very significant and increasingly valuable, and as such ALT will continue to better support these groups within our community. Whilst ALT has conducted previous surveys of its membership, this is the first iteration in this form. ALT has committed to surveying the sector on an annual basis, refining the core question set while trying to preserve the opportunity for longitudinal analysis.

Relevance: 100.00%

Abstract:

eScience is an umbrella concept which covers internet technologies, such as web service orchestration, that involve the manipulation and processing of high volumes of data using simple and efficient methodologies. This concept is normally associated with bioinformatics, but nothing prevents the use of an identical approach for geoinformatics and OGC (Open Geospatial Consortium) web services like the WPS (Web Processing Service). In this paper we present an extended WPS implementation based on the PyWPS framework, using an automatically generated WSDL (Web Service Description Language) XML document that replicates the WPS input/output document structure used during an Execute request to a server. Services are accessed using a modified SOAP (Simple Object Access Protocol) interface provided by PyWPS that uses service and input/output identifiers as element names. The WSDL XML document is dynamically generated by applying an XSLT (Extensible Stylesheet Language Transformation) to the getCapabilities XML document generated by PyWPS. The availability of the SOAP interface and WSDL description allows WPS instances to be accessed by workflow development software like Taverna, enabling users to build complex workflows in which web services are represented by interconnected graphics. Taverna transforms the visual representation of the workflow into a SCUFL (Simple Conceptual Unified Flow Language) XML document that can be run internally or sent to a Taverna orchestration server. SCUFL uses a dataflow-centric orchestration model, as opposed to the more commonly used orchestration language BPEL (Business Process Execution Language), which is process-centric.
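Deriving a WSDL from getCapabilities amounts to pulling each process identifier out of the capabilities document and turning it into an element (operation) name. A standard-library sketch of that extraction step, on a toy capabilities fragment that is hypothetical and far simpler than real PyWPS output:

```python
import xml.etree.ElementTree as ET

# Toy getCapabilities fragment (hypothetical): each wps:Process identifier
# becomes an operation/element name in the generated service description.
capabilities = """\
<Capabilities xmlns:wps="http://www.opengis.net/wps/1.0.0"
              xmlns:ows="http://www.opengis.net/ows/1.1">
  <wps:ProcessOfferings>
    <wps:Process><ows:Identifier>buffer</ows:Identifier></wps:Process>
    <wps:Process><ows:Identifier>reproject</ows:Identifier></wps:Process>
  </wps:ProcessOfferings>
</Capabilities>
"""

ns = {"wps": "http://www.opengis.net/wps/1.0.0",
      "ows": "http://www.opengis.net/ows/1.1"}
root = ET.fromstring(capabilities)
# Collect the process identifiers that would name the SOAP operations.
operations = [el.text for el in root.findall(".//wps:Process/ows:Identifier", ns)]
```

The real implementation performs this mapping with an XSLT over the full document, also expanding each process's input/output descriptions into WSDL message parts; the identifier extraction above is only the first step.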

Relevance: 100.00%

Abstract:

This paper examines the relative efficiency of UK credit unions. Radial and non-radial measures of input cost efficiency, plus associated scale efficiency measures, are computed for a selection of input-output specifications. Both measures highlight that UK credit unions have considerable scope for efficiency gains. It was mooted that the documented high levels of inefficiency may reflect the fact that credit unions, based on clearly defined and non-overlapping common bonds, are not in competition with each other for market share. Credit unions were also found to suffer from a considerable degree of scale inefficiency, with the majority of scale-inefficient credit unions subject to decreasing returns to scale. The latter aspect highlights that the UK Government's goal of larger credit unions must be accompanied by greater regulatory freedom if inefficiency is to be avoided. One of the advantages of computing non-radial measures is that an insight into potential over- or under-expenditure on specific inputs can be obtained by comparing the non-radial measure of efficiency with the associated radial measure. Two interesting findings emerged: first, that UK credit unions over-spend on dividend payments, and second, that they under-spend on labour costs.

Relevance: 100.00%

Abstract:

Reproducible modulations in low-pressure, inductively coupled discharges operating in chlorine and argon-chlorine mixtures have been observed and studied. Changes in the light output, floating potential, negative ion fraction, and charged particle densities were observed. Here we report two types of unstable operational mode in an inductively coupled discharge. On the one hand, when the discharge was matched to minimize reflected power, instabilities were observed in argon-chlorine plasmas over a limited range of input power and gas pressure. The instability window decreased with increasing chlorine content, and the instability was observed only for chlorine concentrations between 30% and 60%. On the other hand, when operating at pressures below 5 mTorr with the discharge circuit detuned to increase the reflected power, modulations were observed in a pure chlorine discharge. These modulations varied in nature from a series of sharp bursts to very periodic behavior and can be controlled, by variation of the matching conditions, to produce an apparent pulsed plasma environment. (C) 2005 American Institute of Physics.

Relevance: 100.00%

Abstract:

This paper deals with Takagi-Sugeno (TS) fuzzy model identification of nonlinear systems using fuzzy clustering. In particular, an extended fuzzy Gustafson-Kessel (EGK) clustering algorithm, using robust competitive agglomeration (RCA), is developed for automatically constructing a TS fuzzy model from system input-output data. The EGK algorithm can automatically determine the 'optimal' number of clusters from the training data set. It is shown that the EGK approach is relatively insensitive to initialization and is less susceptible to local minima, a benefit derived from its agglomerative property. This issue is often overlooked in the current literature on nonlinear identification using conventional fuzzy clustering. Furthermore, the robust statistical concepts underlying the EGK algorithm help to alleviate the difficulty of cluster identification when constructing a TS fuzzy model from noisy training data. A new hybrid identification strategy is then formulated, which combines the EGK algorithm with a locally weighted least-squares method for the estimation of local sub-model parameters. The efficacy of this new approach is demonstrated through function approximation examples and also by application to the identification of an automatic voltage regulation (AVR) loop for a simulated 3 kVA laboratory micro-machine system.
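The locally weighted least-squares step used to fit TS sub-model (consequent) parameters can be sketched as follows; the Gaussian cluster weights here are hypothetical stand-ins for the EGK memberships, and the fit is a plain normal-equations solve rather than the paper's full hybrid strategy:

```python
import numpy as np

def weighted_lsq(X, y, w):
    """Solve min_theta sum_i w_i * (y_i - [x_i, 1] . theta)^2
    via the weighted normal equations (affine TS consequent)."""
    W = np.diag(w)
    Xa = np.column_stack([X, np.ones(len(X))])   # append bias column
    return np.linalg.solve(Xa.T @ W @ Xa, Xa.T @ W @ y)

x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0                                 # locally linear target
w = np.exp(-((x - 0.5) ** 2) / 0.02)              # one cluster's membership weights
theta = weighted_lsq(x[:, None], y, w)            # -> slope ~2, intercept ~1
```

Because each rule's consequent is fitted only where its membership weights are large, each local model captures the behaviour of the system around that cluster's operating region; here the data is exactly linear, so the recovered slope and intercept match the target.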