51 results for Development Applications

in Aston University Research Archive


Relevance:

60.00%

Publisher:

Abstract:

As a new medium for questionnaire delivery, the internet has the potential to revolutionise the survey process. Online (web-based) questionnaires provide several advantages over traditional survey methods in terms of cost, speed, appearance, flexibility, functionality, and usability [1, 2]. For instance, delivery is faster, responses are received more quickly, and data collection can be automated or accelerated [1-3]. Online questionnaires can also provide many capabilities not found in traditional paper-based questionnaires: they can include pop-up instructions and error messages; they can incorporate links; and it is possible to encode difficult skip patterns, making such patterns virtually invisible to respondents. Like many new technologies, however, online questionnaires face criticism despite their advantages. Typically, such criticisms focus on the vulnerability of online questionnaires to the four standard survey error types: namely, coverage, non-response, sampling, and measurement errors.

Although, like all survey errors, coverage error (“the result of not allowing all members of the survey population to have an equal or nonzero chance of being sampled for participation in a survey” [2, pg. 9]) also affects traditional survey methods, it is currently exacerbated in online questionnaires as a result of the digital divide. That said, many developed countries have reported substantial increases in computer and internet access and/or are targeting this as part of their immediate infrastructural development [4, 5]. These trends indicate that familiarity with information technologies is increasing, and suggest that coverage error will rapidly diminish to an acceptable level (for the developed world at least) in the near future, and in so doing positively reinforce the advantages of online questionnaire delivery.

The second error type, non-response error, occurs when individuals fail to respond to the invitation to participate in a survey or abandon a questionnaire before it is completed. Given today’s societal trend towards self-administration [2], the former is inevitable, irrespective of delivery mechanism. Conversely, non-response as a consequence of questionnaire abandonment can be relatively easily addressed. Unlike traditional questionnaires, the delivery mechanism for online questionnaires makes estimation of questionnaire length and the time required for completion difficult, thus increasing the likelihood of abandonment. By incorporating a range of features into the design of an online questionnaire, it is possible to facilitate such estimation, and indeed to provide respondents with context-sensitive assistance during the response process, and thereby reduce abandonment while eliciting feelings of accomplishment [6].

For online questionnaires, sampling error (“the result of attempting to survey only some, and not all, of the units in the survey population” [2, pg. 9]) can arise when all but a small portion of the anticipated respondent set is alienated (and so fails to respond) as a result of, for example, disregard for varying connection speeds, bandwidth limitations, browser configurations, monitors, hardware, and user requirements during the questionnaire design process. Similarly, measurement errors (“the result of poor question wording or questions being presented in such a way that inaccurate or uninterpretable answers are obtained” [2, pg. 11]) will lead to respondents becoming confused and frustrated.
Sampling, measurement, and non-response errors are likely to occur when an online questionnaire is poorly designed. Individuals will answer questions incorrectly, abandon questionnaires, and may ultimately refuse to participate in future surveys; thus, the benefit of online questionnaire delivery will not be fully realised. To prevent errors of this kind, and their consequences, it is extremely important that practical, comprehensive guidelines exist for the design of online questionnaires. Many design guidelines exist for paper-based questionnaires (e.g. [7-14]); the same is not true for the design of online questionnaires [2, 15, 16]. The research presented in this paper is a first attempt to address this discrepancy. Section 2 describes the derivation of a comprehensive set of guidelines for the design of online questionnaires and briefly (given space restrictions) outlines the essence of the guidelines themselves. Although online questionnaires reduce traditional delivery costs (e.g. paper, mail-out, and data entry), set-up costs can be high given the need either to adopt, and acquire training in, questionnaire development software or to secure the services of a web developer. Neither approach, however, guarantees a good questionnaire (often because the person designing the questionnaire lacks relevant knowledge of questionnaire design). Drawing on existing software evaluation techniques [17, 18], we assessed the extent to which current questionnaire development applications support our guidelines; Section 3 describes the framework used for the evaluation, and Section 4 discusses our findings. Finally, Section 5 concludes with a discussion of further work.

Relevance:

40.00%

Publisher:

Abstract:

The objective of the research described in this report was to observe the first ever in-situ sonochemical reaction in the NMR spectrometer in the megahertz region of ultrasound. Several reactions were investigated as potential systems for a sonochemical reaction followed by NMR spectroscopy. The primary problem to resolve when applying ultrasound to a chemical reaction is that of heating. Ultrasound causes the liquid to move and produces 'hot spots', resulting in an increase in sample temperature. The problem was confronted by producing a device that would counteract this effect and so remove the need to account for heating. However, the design of the device limited the length of time during which it would function, and longer reaction times were required to enable observations to be carried out in the NMR spectrometer. The first and most obvious reactions attempted were those of the well-known ultrasonic dosimeter. Such a reaction would, theoretically, enable the author to simultaneously observe a reaction and determine the exact power entering the system for direct comparison of results. Unfortunately, in order to monitor the reactions in the NMR spectrometer the reactant concentrations had to be significantly increased, which resulted in a notable increase in reaction time, making the experiment too lengthy to follow in the time allocated. The Diels-Alder reaction is probably one of the most highly investigated reaction systems in the field of chemistry, and it was to this that the author turned her attention. Previous authors have carried out ultrasonic investigations, with considerable success, of the reaction of anthracene with maleic anhydride, and it was this reaction that was attempted next. The first ever sonochemically enhanced reaction using a frequency of ultrasound in the megahertz (MHz) region was successfully carried out as bench experiments. Due to the complexity of the component reactants, the product would precipitate from the solution, and because the reaction could only be monitored by its formation, it was not possible to observe this reaction in the NMR spectrometer. The solvolysis of 2-chloro-2-methylpropane was examined in various solvent systems, the most suitable of which was determined to be aqueous 2-methylpropan-2-ol. The experiment was successfully enhanced by the application of ultrasound and monitored in-situ in the NMR spectrometer. An increase in product formation for the ultrasonic reaction over that of a traditional thermal reaction was observed, with a 1.4- to 2.9-fold improvement noted depending on the reaction conditions investigated. An investigation into the effect of sonication upon a large biological molecule, in this case aqueous lysozyme, was also carried out. An easily observed effect upon the sample was noted, but no explanation for the observed effects could be established.

Relevance:

40.00%

Publisher:

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
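As a concrete illustration of the data-driven, component based idea described above, the sketch below assembles a small application purely from a data description. The registry, component names and JSON format are hypothetical illustrations, not the Fluid project's actual API.

```python
# Minimal sketch: application structure and behaviour live in data, not code.
# Component names and the JSON format are hypothetical, not Fluid's API.
import json

REGISTRY = {}

def component(name):
    """Register a component class under a name so data can refer to it."""
    def wrap(cls):
        REGISTRY[name] = cls
        return cls
    return wrap

@component("const_source")
class ConstSource:
    def __init__(self, rows):
        self.rows = rows
    def run(self, _=None):
        return self.rows

@component("printer")
class Printer:
    def __init__(self, prefix=""):
        self.prefix = prefix
    def run(self, rows):
        for row in rows:
            print(self.prefix, row)

def assemble(description):
    """Instantiate and chain components from a purely data description."""
    stages = [REGISTRY[s["type"]](**s.get("args", {})) for s in description]
    def app():
        data = None
        for stage in stages:
            data = stage.run(data)
    return app

pipeline = json.loads('[{"type": "const_source", "args": {"rows": [[1, 2], [3, 4]]}},'
                      ' {"type": "printer", "args": {"prefix": "row:"}}]')
assemble(pipeline)()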

Relevance:

40.00%

Publisher:

Abstract:

The Teallach project has adapted model-based user-interface development techniques to the systematic creation of user-interfaces for object-oriented database applications. Model-based approaches aim to provide designers with a more principled approach to user-interface development using a variety of underlying models, and tools which manipulate these models. Here we present the results of the Teallach project, describing the tools developed and the flexible design method supported. Distinctive features of the Teallach system include provision of database-specific constructs, comprehensive facilities for relating the different models, and support for a flexible design method in which models can be constructed and related by designers in different orders and in different ways, to suit their particular design rationales. The system then creates the desired user-interface as an independent, fully functional Java application, with automatically generated help facilities.

Relevance:

40.00%

Publisher:

Abstract:

Bio-impedance analysis (BIA) provides a rapid, non-invasive technique for body composition estimation. BIA offers a convenient alternative to standard techniques such as MRI, CT scan or DEXA scan for selected types of body composition analysis. The accuracy of BIA is limited because it is an indirect method of composition analysis: it relies on linear relationships between measured impedance and morphological parameters such as height and weight to derive estimates. To overcome these underlying limitations of BIA, a multi-frequency segmental bio-impedance device was constructed through a series of iterative enhancements and improvements of existing BIA instrumentation. Key features of the design included an easy-to-construct current source and a compact PCB design. The final device was trialled with 22 human volunteers, and the measured impedance was compared against body composition estimates obtained by DEXA scan. This enabled the development of newer techniques for making BIA predictions. To add a ‘visual aspect’ to BIA, volunteers were scanned in 3D using an inexpensive scattered-light gadget (the Xbox Kinect controller) and 3D volumes of their limbs were compared with BIA measurements to further improve BIA predictions. A three-stage digital filtering scheme was also implemented to enable extraction of heart-rate data from recorded bio-electrical signals. Additionally, modifications were introduced to measure changes in bio-impedance with motion; these could be adapted to further improve the accuracy of limb composition analysis. The findings in this thesis aim to give new direction to the prediction of body composition using BIA. The design development and refinement applied to BIA in this research programme suggest new opportunities to enhance the accuracy and clinical utility of BIA for the prediction of body composition. In particular, the use of bio-impedance to predict limb volumes would provide an additional metric for body composition measurement and help distinguish between fat and muscle content.
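The "linear relationships" underlying BIA can be made concrete. The sketch below shows the classic form of such a model, in which fat-free mass is regressed on the impedance index (height squared divided by impedance) plus weight; the coefficients are placeholders for illustration, not values fitted in this thesis.

```python
# Illustrative sketch of the kind of linear model BIA relies on: fat-free
# mass regressed on the "impedance index" height^2/Z plus body weight.
# The coefficients a, b, c are placeholders, not values from this thesis.
def fat_free_mass_kg(height_cm, impedance_ohm, weight_kg,
                     a=0.5, b=0.2, c=5.0):
    impedance_index = height_cm ** 2 / impedance_ohm  # cm^2 / ohm
    return a * impedance_index + b * weight_kg + c

# Example: a 175 cm, 70 kg volunteer measuring 500 ohm at 50 kHz.
print(fat_free_mass_kg(175, 500, 70))  # -> 49.625 (kg, illustrative only)
```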

Relevance:

40.00%

Publisher:

Abstract:

The successful design of polymers for contact lens applications depends on the ability to provide a balance of properties appropriate to the ocular environment. The principal relevant aspects of the anterior eye are the tear film, eyelid and cornea, which govern the requirements for surface properties, modulus and oxygen permeability, respectively. Permeability requirements and the developing view of the needs of the cornea, in terms of oxygen consumption, are discussed, together with the particular roles of fluorine and silicon in the design of silicone hydrogels, which have proved to be the most successful family of materials for this demanding application. The contact lens field is complicated by the fact that contact lenses are used in a range of wear modalities, the extremes of which can conveniently be classified as lenses that are disposed of at the end of a single period of daily wear and those used for 30 days of successive day-and-night periods, frequently referred to as extended or continuous wear. In the decade following their launch, silicone hydrogels have shown a progressive trend in properties, taking both modulus and water content closer to those of conventional hydrogels. This is particularly evident in the family of daily disposable contact lenses that have appeared since 2008.

Relevance:

30.00%

Publisher:

Abstract:

This paper reports the initial results of a joint research project carried out by Aston University and Lloyd's Register to develop a practical method of assessing neural network applications. A set of assessment guidelines for neural network applications was developed and tested on two applications. These case studies showed that it is practical to assess neural networks in a statistical pattern recognition framework. However, there is a need for more standardisation in neural network technology and a wider take-up of good development practice amongst the neural network community.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this research is to propose a procurement system that works across disciplines and shares retrieved information with the relevant parties, so as to achieve better co-ordination between the supply and demand sides. This paper demonstrates how to analyse data with an agent-based procurement system (APS) to re-engineer and improve the existing procurement process. The intelligent agents take responsibility for searching for potential suppliers, negotiating with the short-listed suppliers, and evaluating the performance of suppliers against the selection criteria with a mathematical model. Manufacturing firms and trading companies spend more than half of their sales dollar on the purchase of raw materials and components. Efficient data collection with high accuracy is one of the key success factors in generating quality procurement, that is, purchasing the right material, at the right quality, from the right suppliers. In general, enterprises spend a significant amount of resources on data collection and storage, but too little on facilitating data analysis and sharing. To validate the feasibility of the approach, a case study on a manufacturing small and medium-sized enterprise (SME) has been conducted. APS supports data and information analysis techniques to facilitate decision making, such that the agents can enhance negotiation and supplier evaluation efficiency, saving time and cost.
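The supplier evaluation "with a mathematical model" mentioned above is commonly realised as a weighted sum over normalised selection criteria. The sketch below is a minimal illustration under that assumption; the criteria, weights and scores are hypothetical, not APS's actual formulation.

```python
# A minimal sketch of one common supplier-evaluation model: a weighted sum
# over normalised selection criteria.  Criteria, weights and ratings are
# hypothetical; the paper's APS agents may use a different formulation.
CRITERIA_WEIGHTS = {"price": 0.4, "quality": 0.35, "delivery_time": 0.25}

def supplier_score(ratings):
    """ratings: criterion -> value normalised to [0, 1], higher is better."""
    return sum(CRITERIA_WEIGHTS[c] * ratings[c] for c in CRITERIA_WEIGHTS)

suppliers = {
    "A": {"price": 0.8, "quality": 0.6, "delivery_time": 0.9},
    "B": {"price": 0.6, "quality": 0.9, "delivery_time": 0.7},
}
best = max(suppliers, key=lambda s: supplier_score(suppliers[s]))
print(best, round(supplier_score(suppliers[best]), 3))  # -> A 0.755
```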

Relevance:

30.00%

Publisher:

Abstract:

In order to survive in the increasingly customer-oriented marketplace, continuous quality improvement marks the success of the fastest-growing quality organizations. In recent years, attention has been focused on intelligent systems, which have shown great promise in supporting quality control. However, only a small number of the currently used systems are reported to be operating effectively, because they are designed to maintain a quality level within the specified process rather than to focus on cooperation within the production workflow. This paper proposes an intelligent system with a newly designed algorithm and a universal process data exchange standard to overcome the challenges of demanding customers who seek high-quality and low-cost products. The intelligent quality management system is equipped with a “distributed process mining” feature to provide all levels of employees with the ability to understand the relationships between processes, especially when any aspect of the process is about to degrade or fail. An example using generalized fuzzy association rules is applied in the manufacturing sector to demonstrate how the proposed iterative process mining algorithm finds the relationships between distributed process parameters and the presence of quality problems.
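To make the generalized fuzzy association rule idea concrete, the toy sketch below fuzzifies crisp process readings into linguistic terms and computes the support and confidence of one rule; the membership functions and the rule itself are illustrative, not the paper's algorithm.

```python
# Toy sketch of fuzzy association rules: crisp process readings are fuzzified
# into linguistic terms, and rule support is the normalised sum of min-combined
# membership degrees.  Membership functions and the rule are illustrative.
def high_temp(t):      # membership of "temperature is high"
    return min(max((t - 60) / 20, 0.0), 1.0)

def high_defects(d):   # membership of "defect rate is high"
    return min(max((d - 2) / 3, 0.0), 1.0)

records = [(75, 4.0), (55, 1.0), (82, 5.0), (68, 2.5)]  # (temp C, defect %)

# Support of the rule "high temperature -> high defect rate":
support = sum(min(high_temp(t), high_defects(d)) for t, d in records) / len(records)
# Confidence: joint support divided by antecedent support.
antecedent = sum(high_temp(t) for t, _ in records) / len(records)
print(f"support={support:.2f}, confidence={support / antecedent:.2f}")
```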

Relevance:

30.00%

Publisher:

Abstract:

In the last decade we have seen an exponential growth of functional imaging studies investigating multiple aspects of language processing. These studies have sparked an interest in applying some of the paradigms to various clinically relevant questions, such as the identification of the cortical regions mediating language function in surgical candidates for refractory epilepsy. Here we present data from a group of adult control participants in order to investigate the potential of using frequency-specific spectral power changes in MEG activation patterns to establish lateralisation of language function using expressive language tasks. In addition, we report on a paediatric patient whose language function was assessed before and after a left hemisphere amygdalo-hippocampectomy. Our verb generation task produced left hemisphere decreases in beta-band power accompanied by right hemisphere increases in low beta-band power in the majority of the control group, a previously unreported phenomenon. This pattern of spectral power was also found in the patient's post-surgery data, though not in her pre-surgery data. Comparison of pre- and post-operative results also provided some evidence of reorganisation in language-related cortex, both inter- and intra-hemispherically, following surgery. The differences were not limited to changes in the localisation of language-specific cortex but also included changes in the spectral and temporal profile of frontal brain regions during verb generation. While further investigation is required to establish concordance with invasive measures, our data suggest that the methods described may serve as a reliable lateralisation marker for clinical assessment. Furthermore, our findings highlight the potential utility of MEG for the investigation of cortical language functioning in both healthy development and pathology.
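A common way to quantify the lateralisation described above is a laterality index computed from hemispheric band power. The sketch below illustrates that standard convention; it is not necessarily the exact pipeline used in this study.

```python
# Hedged sketch of a standard laterality index (LI) computed from
# frequency-specific power changes; a convention of the field, not
# necessarily this study's exact pipeline.
import numpy as np

def laterality_index(left_power_change, right_power_change):
    """LI in [-1, 1]: positive values indicate left-dominant changes."""
    L = np.sum(np.abs(left_power_change))
    R = np.sum(np.abs(right_power_change))
    return (L - R) / (L + R)

# Example: beta-band power changes per sensor, as reported above: left
# hemisphere decreases with small right hemisphere increases.
left = np.array([-0.30, -0.25, -0.10])
right = np.array([0.05, 0.08, 0.02])
print(f"LI = {laterality_index(left, right):+.2f}")  # positive -> left dominant
```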

Relevance:

30.00%

Publisher:

Abstract:

The inclusion of high-level scripting functionality in state-of-the-art rendering APIs indicates a movement toward data-driven methodologies for structuring next generation rendering pipelines. A similar theme can be seen in the use of composition languages to deploy component software through the selection and configuration of collaborating component implementations. In this paper we introduce the Fluid framework, which places particular emphasis on the use of high-level data manipulations in order to develop component based software that is flexible, extensible, and expressive. We introduce a data-driven, object-oriented programming methodology for component based software development, and demonstrate how a rendering system with a similar focus on abstract manipulations can be incorporated in order to develop a visualization application for geospatial data. In particular we describe a novel SAS script integration layer that provides access to vertex and fragment programs, producing a very controllable, responsive rendering system. The proposed system is very similar to developments speculatively planned for DirectX 10, but uses open standards and has cross-platform applicability. © The Eurographics Association 2007.
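To illustrate the data-driven flavour of such a pipeline, the sketch below selects and configures render passes from configuration data rather than hard-coded calls; the names and structure are hypothetical and do not reproduce Fluid's actual SAS script layer.

```python
# Illustrative sketch of driving a render pipeline from data: passes and
# their shader programs are chosen by configuration rather than hard-coded.
# Names and structure are hypothetical, not Fluid's actual script layer.
PIPELINE_CONFIG = [
    {"pass": "terrain", "vertex": "terrain.vert", "fragment": "terrain.frag",
     "uniforms": {"sun_dir": (0.3, -1.0, 0.2)}},
    {"pass": "overlay", "vertex": "quad.vert", "fragment": "heatmap.frag",
     "uniforms": {"opacity": 0.6}},
]

def run_frame(config, draw):
    """Execute each configured pass; 'draw' abstracts the real GPU binding."""
    for p in config:
        draw(p["vertex"], p["fragment"], p["uniforms"])

run_frame(PIPELINE_CONFIG,
          lambda v, f, u: print(f"bind {v} + {f}, set {u}"))
```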

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes the study of various grating-based optical fibre sensors for applications in refractive index sensing. The sensitivity of these sensors has been studied and in some cases enhanced using novel techniques. The major areas of development are as follows. The sensitivity of long period gratings (LPGs) to surrounding medium refractive index (SRI) was investigated for various periods. The most sensitive period of LPG was found to be around 160 µm; this was due to the core mode coupling to a single cladding mode but phase matching at two wavelength locations, creating two attenuation peaks close to the waveguide dispersion turning point. Large angle tilted fibre gratings (TFGs) have similar behaviour to LPGs, in that they couple to the co-propagating cladding modes. The tilted structure of the index modulation within the core of the fibre gives rise to a polarisation dependency, distinguishing the large angle TFG from an LPG. Since large angle TFGs couple to the cladding modes they are SRI sensitive, and the sensitivity to SRI can be further increased through cladding etching using HF acid. The thinning of the cladding layer caused a reordering of the cladding modes and, as the investigation discovered, shifted coupling to more SRI-sensitive cladding modes. In an SRI range of 1.36 to 1.40, a sensitivity of 506.9 nm/URI was achieved for the etched large angle TFG, which is greater than that of the dual resonance LPG. UV-inscribed LPGs were coated with sol-gel materials with high RIs. The high RI of the coating caused an increase in cladding mode effective index, which in turn caused an increase in the LPG sensitivity to SRI. LPGs of various periods were coated with sol-gel TiO2, and the optimal coating thickness was found to vary with period. By coating the already highly SRI-sensitive 160 µm period LPG (which is a dual resonance) with sol-gel TiO2, the SRI sensitivity was further increased to a peak value of 1458 nm/URI, almost a three-fold increase compared to the uncoated LPG. LPGs were also inscribed using a femtosecond laser, which produced a highly focused index change that was not uniform throughout the core of the optical fibre. This inscription technique gave rise to a large polarisation sensitivity and the ability to couple to multiple azimuthal cladding mode sets, not seen with uniform UV-inscribed gratings. Through coupling of the core mode to multiple sets of cladding modes, attenuation peaks with opposite wavelength shifts for increasing SRI were observed. By combining these opposite wavelength shifts, an SRI sensitivity was achieved greater than that of any single observed attenuation peak. The maximum SRI sensitivity achieved was 1680 nm/URI for a femtosecond-inscribed LPG of period 400 µm. Three different types of surface plasmon resonance (SPR) sensor with a multilayer metal top coating were investigated in D-shaped optical fibre. The sensors can be separated into two types: one utilised a pre-UV-inscribed tilted Bragg grating, and the other employed a post-UV exposure to generate a surface relief grating structure. This surface perturbation aided the out-coupling of light from the core but also changed the sensing mechanism from SPR to localised surface plasmon resonance (LSPR). This greatly increased the SRI sensitivity compared to the SPR sensors, with the gold-coated top layer surface relief sensor producing the largest SRI sensitivity, of 2111.5 nm/URI.
The platinum- and silver-coated top layer surface relief sensors also gave high SRI sensitivities, as well as the ability to produce resonances in air (not previously seen with the SPR sensors). These properties were employed in two applications. The silver and platinum surface relief devices were used as gas sensors and were shown to be capable of detecting the minute RI changes of different gases. The calculated maximum sensitivities were 1882.1 dB/URI and 1493.5 nm/URI for silver and platinum, respectively. Using a DFB laser and a power meter, a cheap alternative approach was investigated which showed the ability of the sensors to distinguish between different gases and the flow rates of those gases. The gold surface relief sensor was coated with a biological compound called an aptamer, and it was able to detect various concentrations of a biological compound called thrombin, ranging from 1 mM down to 10 fM. A solution of 2 M NaCl was found to give the best results for stripping thrombin from the aptamer, demonstrating the reusability of the sensor. The association and dissociation constants were calculated to be 1.0638×10⁶ M⁻¹s⁻¹ and 0.2482 s⁻¹, respectively, showing the high affinity of the aptamer for thrombin. This supports existing work stating that aptamers could be an alternative to enzymes for chemical detection, and also helps to explain the low detection limit of the gold surface relief sensor.
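The affinity claim can be made concrete: assuming the standard 1:1 binding model, the equilibrium dissociation constant follows directly from the rate constants quoted above (the derived value below is computed here, not stated in the thesis).

```latex
% 1:1 binding model: A + T <-> AT, with association rate k_a and
% dissociation rate k_d.  Using the values quoted above:
K_D \;=\; \frac{k_d}{k_a}
    \;=\; \frac{0.2482\ \mathrm{s^{-1}}}{1.0638 \times 10^{6}\ \mathrm{M^{-1}\,s^{-1}}}
    \;\approx\; 2.33 \times 10^{-7}\ \mathrm{M} \quad (\approx 233\ \mathrm{nM})
```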

Relevance:

30.00%

Publisher:

Abstract:

Enhanced data services delivered through mobile phones are expected soon to be fully transactional, interactive and embedded in other mobile consumption practices. While private services will continue to take the lead in the mobile data revolution, other actors such as governments and NGOs are becoming more prominent m-players. This paper adopts a qualitative case study approach, interpreting micro-level municipality officers’ concepts of mobility, ICT histories and choice practices for m-government services in Turkey. The findings highlight that in-situ ICT choice strategies are non-homogeneous, sometimes conflicting with each other, and that current strategies have not yet justified the necessity for municipality officers to engage with and fully commit to m-government efforts. Furthermore, beyond m-government initiatives’ success or failure, the mechanisms related to public administration mobile technical capacity building and knowledge transfer are identified as being directly related to the likelihood of m-government engagement.

Relevance:

30.00%

Publisher:

Abstract:

This research is concerned with the development of distributed real-time systems, in which software is used for the control of concurrent physical processes. These distributed control systems are required to periodically coordinate the operation of several autonomous physical processes, with the property of an atomic action. The implementation of this coordination must be fault-tolerant if the integrity of the system is to be maintained in the presence of processor or communication failures. Commit protocols have been widely used to provide this type of atomicity and ensure consistency in distributed computer systems. The objective of this research is the development of a class of robust commit protocols applicable to the coordination of distributed real-time control systems. Extended forms of the standard two-phase commit protocol, providing fault-tolerant and real-time behaviour, were developed.

Petri nets are used for the design of the distributed controllers, and to embed the commit protocol models within these controller designs. This composition of controller and protocol model allows the analysis of the complete system in a unified manner. A common problem for Petri net based techniques is that of state space explosion; a modular approach to both design and analysis would help cope with this problem. Although extensions to Petri nets that allow module construction exist, the modularisation is generally restricted to the specification, and analysis must be performed on the (flat) detailed net. The Petri net designs for the type of distributed systems considered in this research are both large and complex. The top-down, bottom-up and hybrid synthesis techniques that are used to model large systems in Petri nets are considered, and a hybrid approach to Petri net design for a restricted class of communicating processes is developed. Designs produced using this hybrid approach are modular and allow re-use of verified modules.

In order to use this form of modular analysis, it is necessary to project an equivalent but reduced behaviour onto the modules used. These projections conceal events local to modules that are not essential for the purpose of analysis. To generate the external behaviour, each firing sequence of the subnet is replaced by an atomic transition internal to the module, and the firing of these transitions transforms the input and output markings of the module. Thus local events are concealed through the projection of the external behaviour of modules. This hybrid design approach preserves properties of interest, such as boundedness and liveness, while the systematic concealment of local events allows the management of state space. The approach is particularly suited to distributed systems, as the underlying communication model is used as the basis for the interconnection of modules in the design procedure. This hybrid approach is applied to the Petri net based design and analysis of distributed controllers for two industrial applications that incorporate the robust, real-time commit protocols developed. Temporal Petri nets, which combine Petri nets and temporal logic, are used to capture and verify causal and temporal aspects of the designs in a unified manner.
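For orientation, the sketch below shows the standard two-phase commit that the thesis extends: a unanimous "yes" vote is required for a global commit, which is what gives the coordinated processes the atomic-action property. The real-time deadlines and the fault-tolerance extensions developed in the thesis are not reproduced here.

```python
# Minimal sketch of the standard two-phase commit (2PC) protocol that the
# thesis builds on; the fault-tolerant, real-time extensions and the Petri
# net models themselves are not reproduced here.
class Participant:
    """One autonomous physical process under coordinated control."""
    def __init__(self, name, can_commit=True):
        self.name, self.can_commit = name, can_commit

    def prepare(self):            # phase 1: vote yes/no
        return self.can_commit

    def commit(self):             # phase 2: coordinator decided "commit"
        print(f"{self.name}: committed")

    def abort(self):              # phase 2: coordinator decided "abort"
        print(f"{self.name}: aborted")

def two_phase_commit(participants):
    # Phase 1: collect votes; a single "no" (or, in a real system, a reply
    # missing its deadline) forces a global abort.
    decision = all(p.prepare() for p in participants)
    # Phase 2: broadcast the uniform decision, giving atomicity: either
    # every process commits or every process aborts.
    for p in participants:
        p.commit() if decision else p.abort()
    return decision

two_phase_commit([Participant("valve"), Participant("conveyor")])
```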

Relevance:

30.00%

Publisher:

Abstract:

The underlying work of this thesis focused on the exploitation and investigation of photosensitivity mechanisms in optical fibres and planar waveguides for the fabrication of advanced integrated optical devices for telecoms and sensing applications. One major focus is the improvement of grating fabrication specifications by introducing new writing techniques and the use of advanced characterisation methods for grating testing. For the first time, the polarisation control method for advanced grating fabrication has been successfully converted to apodised planar waveguide fabrication, and the development of a holographic method for the inscription of chirped gratings at arbitrary wavelength is presented. The latter resulted in the fabrication of gratings for pulse-width suppression and wavelength selection in diode lasers. In co-operation with research partners, a number of samples were tested using optical frequency domain and optical low coherence reflectometry for a better insight into the limitations of grating writing techniques. Using a variety of fabrication methods, custom apodised and chirped fibre Bragg gratings were written for use as filter elements in multiplexer-demultiplexer devices, as well as for short pulse generation and wavelength selection in telecommunication transmission systems. Long period grating based devices in standard, speciality and tapered fibres are presented, showing great potential for multi-parameter sensing. One particular focus is the development of vectorial curvature and refractive index sensors with potential for medical, chemical and biological sensing. In addition, the design of an optically tunable Mach-Zehnder based multiwavelength filter is introduced. The discovery of a Type IA grating type through overexposure of hydrogen-loaded standard and boron-germanium co-doped fibres strengthened the assumption that UV photosensitivity is a highly non-linear process. Gratings of this type show a significantly lower thermal sensitivity than standard gratings, which makes them useful for sensing applications. An Oxford Lasers copper-vapour laser operating at 255 nm in pulsed mode was used for their inscription, in contrast to previous work using CW argon-ion lasers, contributing to differences in the processes of the photorefractive index change.
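For reference, the resonance conditions that underlie the Bragg and long period gratings discussed throughout this work are standard results, reproduced here as background rather than taken from the thesis:

```latex
% Fibre Bragg grating: reflection at the Bragg wavelength
\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda
% Long period grating: coupling of the core mode to the m-th co-propagating
% cladding mode at
\lambda_{\mathrm{res}}^{(m)} = \left( n_{\mathrm{eff}}^{\mathrm{core}}
  - n_{\mathrm{eff}}^{\mathrm{clad},\,m} \right) \Lambda
% where \Lambda is the grating period; refractive index sensing exploits the
% dependence of n_{\mathrm{eff}}^{\mathrm{clad},\,m} on the surrounding medium.
```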