875 results for requirement-based testing
Abstract:
Based on Goffman’s definition that frames are general ‘schemata of interpretation’ that people use to ‘locate, perceive, identify, and label’, other scholars have used the concept in a more specific way to analyse media coverage. Frames are used in the sense of organising devices that allow journalists to select and emphasise topics, to decide ‘what matters’ (Gitlin 1980). Gamson and Modigliani (1989) consider frames as being embedded within ‘media packages’ that can be seen as ‘giving meaning’ to an issue. According to Entman (1993), framing comprises a combination of different activities such as problem definition, causal interpretation, moral evaluation, and/or treatment recommendation for the item described. Previous research has analysed climate change with the purpose of testing Downs’s model of the issue attention cycle (Trumbo 1996), to uncover media biases in the US press (Boykoff and Boykoff 2004), to highlight differences between nations (Brossard et al. 2004; Grundmann 2007) or to analyse cultural reconstructions of scientific knowledge (Carvalho and Burgess 2005). In this paper we shall present data from a corpus linguistics-based approach. We will be drawing on results of a pilot study conducted in Spring 2008 based on the Nexis news media archive. Based on comparative data from the US, the UK, France and Germany, we aim to show how the climate change issue has been framed differently in these countries and how this framing indicates differences in national climate change policies.
Abstract:
This thesis describes research into business user involvement in the information systems application building process. The main interest of this research is in establishing and testing techniques to quantify the relationships between identified success factors and the outcome effectiveness of 'business user development' (BUD). The availability of a mechanism to measure the levels of the success factors, and quantifiably relate them to outcome effectiveness, is important in that it provides an organisation with the capability to predict and monitor effects on BUD outcome effectiveness. This is particularly important in an era where BUD levels have risen dramatically, the benefits of user-centred information systems development are recognised as significant, and awareness of the risks of uncontrolled BUD activity is becoming more widespread. This research targets the measurement and prediction of BUD success factors and implementation effectiveness for particular business users. A questionnaire instrument and analysis technique have been developed and tested which constitute a tool for predicting and monitoring BUD outcome effectiveness, based on the BUDES (Business User Development Effectiveness and Scope) research model, which is introduced and described in this thesis. The questionnaire instrument is designed for completion by 'business users', the target community being more explicitly defined as 'people who primarily have a business role within an organisation'. The instrument, named BUD ESP (Business User Development Effectiveness and Scope Predictor), can readily be used with survey participants, and has been shown to give meaningful and representative results.
Abstract:
The underlying work to this thesis focused on the exploitation and investigation of photosensitivity mechanisms in optical fibres and planar waveguides for the fabrication of advanced integrated optical devices for telecoms and sensing applications. One major aim is the improvement of grating fabrication specifications by introducing new writing techniques and the use of advanced characterisation methods for grating testing. For the first time the polarisation control method for advanced grating fabrication has successfully been converted to apodised planar waveguide fabrication, and the development of a holographic method for the inscription of chirped gratings at arbitrary wavelength is presented. The latter resulted in the fabrication of gratings for pulse-width suppression and wavelength selection in diode lasers. In co-operation with research partners, a number of samples were tested using optical frequency domain and optical low coherence reflectometry for a better insight into the limitations of grating writing techniques. Using a variety of different fabrication methods, custom apodised and chirped fibre Bragg gratings were written for use as filter elements for multiplexer-demultiplexer devices, as well as for short pulse generation and wavelength selection in telecommunication transmission systems. Long period grating based devices in standard, speciality and tapered fibres are presented, showing great potential for multi-parameter sensing. One particular focus is the development of vectorial curvature and refractive index sensors with potential for medical, chemical and biological sensing. In addition, the design of an optically tunable Mach-Zehnder based multiwavelength filter is introduced. The discovery of a Type IA grating type through overexposure of hydrogen loaded standard and Boron-Germanium co-doped fibres strengthened the assumption that UV-photosensitivity is a highly non-linear process.
Gratings of this type show a significantly lower thermal sensitivity than standard gratings, which makes them useful for sensing applications. An Oxford Lasers copper-vapour laser operating at 255 nm in pulsed mode was used for their inscription, in contrast to previous work using CW argon-ion lasers; this difference in laser source contributes to the differences in the processes of the photorefractive index change.
Abstract:
The work presented in this thesis describes an investigation into the production and properties of thin amorphous C films, with and without Cr doping, as a low wear / friction coating applicable to MEMS and other micro- and nano-engineering applications. Firstly, an assessment was made of the available testing techniques. Secondly, the optimised test methods were applied to a series of sputtered films of thickness 10 - 2000 nm in order to: (i) investigate the effect of thickness on the properties of the coatings / coating process, (ii) investigate fundamental tribology at the nano-scale and (iii) provide a starting point for nanotribological coating optimisation at ultra low thickness. The use of XPS was investigated for the determination of sp3/sp2 carbon bonding. Under C 1s peak analysis, significant errors were identified, attributed to the absence of sufficient instrument resolution to guide the component peak structure (even with a high resolution instrument). A simple peak width analysis and correlation work with the C KLL D value confirmed the errors. The use of XPS for sp3/sp2 determination was therefore limited to initial tentative estimations. Nanoindentation was shown to provide consistent hardness and reduced modulus results with depth (to < 7 nm) when replicate data was suitably statistically processed. No significant pile-up or cracking of the films was identified under nanoindentation. Nanowear experimentation by multiple nanoscratching provided some useful information; however, the test conditions were very different to those expected for MEMS and micro- / nano-engineering systems. A novel 'sample oscillated nanoindentation' system was developed for testing nanowear under more relevant conditions. The films were produced in an industrial production coating line.
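As background to the C KLL D-value correlation mentioned above: a common way to turn a measured D-parameter into an sp3 estimate is linear interpolation between reference materials. The sketch below illustrates that idea only; the endpoint D-values are assumptions (published figures vary with instrument and calibration) and are not taken from this thesis.

```python
# Estimate the sp3 fraction of an amorphous carbon film from the
# C KLL Auger D-parameter by linear interpolation between reference
# materials. The endpoint values below are illustrative assumptions;
# reported figures vary between instruments and calibrations.
D_GRAPHITE = 21.2  # eV, taken here as the 100% sp2 reference
D_DIAMOND = 13.2   # eV, taken here as the 100% sp3 reference

def sp3_fraction(d_value_ev):
    """Linearly interpolate sp3 content from a measured D-parameter."""
    frac = (D_GRAPHITE - d_value_ev) / (D_GRAPHITE - D_DIAMOND)
    return min(1.0, max(0.0, frac))  # clamp to the physical range [0, 1]
```

A D-value halfway between the two references maps to an sp3 fraction of 0.5 under this (purely linear) assumption.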
In order to maximise the available information and to take account of uncontrolled process variation a statistical design of experiment procedure was used to investigate the effect of four key process control parameters. Cr doping was the most significant control parameter at all thicknesses tested and produced a softening effect and thus increased nanowear. Substrate bias voltage was also a significant parameter and produced hardening and a wear reducing effect at all thicknesses tested. The use of a Cr adhesion layer produced beneficial results at 150 nm thickness, but was ineffective at 50 nm. Argon flow to the coating chamber produced a complex effect. All effects reduced significantly with reducing film thickness. Classic fretting wear was produced at low amplitude under nanowear testing. Reciprocating sliding was produced at higher amplitude which generated three body abrasive wear and this was generally consistent with the Archard model. Specific wear rates were very low (typically 10⁻¹⁶ to 10⁻¹⁸ m³N⁻¹m⁻¹). Wear rates reduced exponentially with reduced film thickness and below (approx.) 20 nm, thickness was identified as the most important control of wear.
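For reference, the Archard model against which the abrasive wear was checked is conventionally written as follows (standard textbook form, not an equation from this thesis); the specific wear rate k carries exactly the m³N⁻¹m⁻¹ units quoted above:

```latex
% Archard wear equation: wear volume V for wear coefficient K,
% normal load W, sliding distance s and hardness H; k is the
% specific (dimensional) wear rate.
V = \frac{K\,W\,s}{H},
\qquad
k = \frac{V}{W\,s} \quad [\mathrm{m^{3}\,N^{-1}\,m^{-1}}]
```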
Abstract:
When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often unavailable or not even possible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input / parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system - both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty.
The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision making process.
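The core step of eliciting a continuous random variable - converting a few expert judgements into a probability distribution - can be sketched very simply. The fragment below is not the SHELF or UncertWeb implementation; it is a minimal illustration assuming a normal distribution fitted from an expert's median and 95th percentile.

```python
from statistics import NormalDist

def fit_normal_from_judgements(median, q95):
    """Fit a normal distribution to two elicited judgements:
    the expert's median and 95th percentile.
    The normality assumption is illustrative; elicitation frameworks
    typically offer several candidate families."""
    if q95 <= median:
        raise ValueError("95th percentile must exceed the median")
    z95 = NormalDist().inv_cdf(0.95)       # standard-normal 95% point, ~1.645
    sigma = (q95 - median) / z95
    return NormalDist(mu=median, sigma=sigma)

# Hypothetical judgement: the expert's median for a model input is 0.4,
# and they are 95% sure the value lies below 0.7.
dist = fit_normal_from_judgements(0.4, 0.7)
```

By construction the fitted distribution reproduces both judgements exactly; a real elicitation would also include feedback, over-fitting checks and pooling across experts.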
Abstract:
Quantum dots (Qdots) are fluorescent nanoparticles that have great potential as detection agents in biological applications. Their optical properties, including photostability and narrow, symmetrical emission bands with large Stokes shifts, and the potential for multiplexing of many different colours, give them significant advantages over traditionally used fluorescent dyes. Here, we report the straightforward generation of stable, covalent quantum dot-protein A/G bioconjugates that will be able to bind to almost any IgG antibody, and therefore can be used in many applications. An additional advantage is that the requirement for a secondary antibody is removed, simplifying experimental design. To demonstrate their use, we show their application in multiplexed western blotting. The sensitivity of Qdot conjugates is found to be superior to fluorescent dyes, and comparable to, or potentially better than, enhanced chemiluminescence. We show a true biological validation using a four-colour multiplexed western blot against a complex cell lysate background, and have significantly improved previously reported non-specific binding of the Qdots to cellular proteins.
Abstract:
A novel biosensing system based on a micromachined rectangular silicon membrane is proposed and investigated in this paper. A distributive sensing scheme is designed to monitor the dynamics of the sensing structure. An artificial neural network is used to process the measured data and to identify cell presence and density. Without specifying any particular bio-application, the investigation concentrates mainly on the performance testing of this kind of biosensor as a general biosensing platform. The biosensing experiments on the microfabricated membranes involve seeding different cell densities onto the sensing surface of the membrane, and measuring the corresponding dynamics of each tested silicon membrane in the form of a series of frequency response functions (FRFs). All of these experiments are carried out in cell culture medium to simulate a practical working environment. The EA.hy 926 endothelial cell line is chosen in this paper for the bio-experiments. This cell line represents a particular class of biological particles that have irregular shapes, non-uniform density and uncertain growth behaviour, which are difficult to monitor using traditional biosensors. The final predicted results reveal that a neural-network-based algorithm performing feature identification of cells from distributive sensory measurements has great potential in biosensing applications.
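The idea of mapping FRF features to cell density with a learned model can be sketched in miniature. The toy example below is not the paper's network or data: it assumes (hypothetically) that added cell mass shifts the membrane's first resonance, generates synthetic shift values, and trains a single logistic neuron to separate "low" from "high" density.

```python
import math
import random

# Hypothetical premise: higher cell density -> larger downward shift (Hz)
# of the first resonance extracted from a measured FRF.
random.seed(0)

def make_sample(high_density):
    """Synthetic resonance-shift feature for one seeded membrane."""
    base = 120.0 if high_density else 40.0
    return base + random.gauss(0.0, 10.0)

data = [(make_sample(h), 1.0 if h else 0.0) for h in [True, False] * 50]

# Train one logistic neuron by batch gradient descent (a stand-in for
# the much richer network used on real multi-point FRF data).
w, b = 0.0, 0.0
for _ in range(2000):
    gw = gb = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * (x / 100.0) + b)))
        gw += (p - y) * (x / 100.0)
        gb += (p - y)
    w -= 0.1 * gw / len(data)
    b -= 0.1 * gb / len(data)

def predict(shift_hz):
    """Classify a membrane from its resonance shift."""
    p = 1.0 / (1.0 + math.exp(-(w * (shift_hz / 100.0) + b)))
    return "high density" if p > 0.5 else "low density"
```

The real system feeds many FRF points per membrane into a multi-layer network; the single scaled feature here only illustrates the classification principle.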
Abstract:
This study presents a detailed contrastive description of the textual functioning of connectives in English and Arabic. Particular emphasis is placed on the organisational force of connectives and their role in sustaining cohesion. The description is intended as a contribution to a better understanding of the variations in the dominant tendencies for text organisation in each language. The findings are expected to be utilised for pedagogical purposes, particularly in improving EFL teaching of writing at the undergraduate level. The study is based on an empirical investigation of the phenomenon of connectivity and, for optimal efficiency, employs computer-aided procedures, particularly those adopted in corpus linguistics, for investigatory purposes. One important methodological requirement is the establishment of two comparable and statistically adequate corpora, as well as the design of software and the use of existing packages to achieve the basic analysis. Each corpus comprises ca 250,000 words of newspaper material sampled in accordance with a specific set of criteria and assembled in machine-readable form prior to the computer-assisted analysis. A suite of programmes has been written in SPITBOL to accomplish a variety of analytical tasks, and in particular to perform a battery of measurements intended to quantify the textual functioning of connectives in each corpus. Concordances and some word lists are produced by using OCP. Results of this research confirm the existence of fundamental differences in text organisation in Arabic in comparison to English. This manifests itself in the way textual operations of grouping and sequencing are performed and in the intensity of the textual role of connectives in imposing linearity and continuity and in maintaining overall stability.
Furthermore, computation of connective functionality and range of operationality has identified fundamental differences in the way favourable choices for text organisation are made and implemented.
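The basic kind of measurement described above - quantifying how often connectives occur, normalised for corpus size - can be sketched as follows. The connective inventory and the per-1,000-words normalisation are illustrative assumptions, not the study's actual inventory or metrics (which were implemented in SPITBOL).

```python
import re
from collections import Counter

# Illustrative mini-inventory of English connectives; the study's own
# inventory is far richer and covers Arabic as well.
CONNECTIVES = {"however", "therefore", "moreover", "furthermore",
               "thus", "also", "but", "and"}

def connective_profile(text, per=1000):
    """Return (frequency of each connective per `per` running words,
    total word count) for a text sample."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w in CONNECTIVES)
    scale = per / max(len(words), 1)
    return {c: n * scale for c, n in counts.items()}, len(words)
```

Profiles computed this way for two comparable corpora can then be compared directly, since the counts are normalised to the same running-text length.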
Abstract:
New Technology Based Firms (NTBFs) are considered to be important for the economic development of a country with regard to both employment growth and innovative activity. The latter is believed to contribute significantly to the increase in productivity and therefore the competitiveness of the UK economy. This study contributes to the above literature by investigating two of the factors believed to limit the growth of such firms in the UK. The first concerns the existence of a ‘knowledge gap’ while the second concerns the existence of a ‘financial gap’. These themes are developed along three main research lines. Firstly, based upon the human capital theory initially proposed by Becker (1964), new evidence is provided on the human capital characteristics (experience and education) of the current UK NTBF entrepreneurs. Secondly, the causal relationship between general and specific human capital (as well as their interactions) and company performance and growth is investigated via its traditional direct effect as well as via its indirect effect upon access to external finance. Finally, more light is shed on the financial structure and the type of financial constraints that high-tech firms face at start-up. In particular, whether a financial gap exists is explored by distinguishing between the demand for and the supply of external finance as well as by type of external source of financing. The empirical testing of the various research hypotheses was carried out through an original survey of new technology based firms, defined as independent companies established in the past 25 years in R&D intensive sectors. The resulting dataset contains information for 412 companies on a number of general company characteristics and the characteristics of their entrepreneurs in 2004. Policy and practical implications for future and current entrepreneurs and also for providers of external finance are provided.
Abstract:
The fatigue behaviour of the cold chamber pressure-die-cast alloys Mazak3, ZA8, ZA27, M3K, ZA8K, ZA27K, K1, K2 and K3 was investigated at a temperature of 20°C. The alloys M3K, ZA8K and ZA27K were also examined at temperatures of 50 and 100°C. The ratio between fatigue strength and tensile strength was established at 20°C at 10⁷ cycles. The fatigue life prediction of the alloys M3K, ZA8K and ZA27K was formulated at 20, 50 and 100°C. The prediction formulae were found to be reasonably accurate. All of the experimental alloys were heterogeneous and contained large but varying amounts of pores. These pores were a major contributor to, and dominated, the alloys' fatigue failure. Their effect on tensile failure, however, was negligible. ZA27K possessed the highest tensile strength but the lowest fatigue strength. The relationship between the fracture topography and the microstructure was also determined by the use of a mixed signal of a secondary electron and a back-scattered electron on the SEM. The tensile strength of the experimental alloys was directly proportional to the aluminium content within the alloys. The effect of copper content was also investigated within the alloys K1, K2, ZA8K and K3, which contained 0%, 0.5%, 1.0% and 2.0% copper respectively. It was determined that the fatigue and tensile strengths improved with higher copper contents. Upon ageing the alloys Mazak3, ZA8 and ZA27 at ambient temperature for 5 years, copper was also found to influence and maintain the metastable Zn-Al (αm) phase. The copper-free Mazak3 lost this metastable phase upon ageing. The 1.0% copper ZA8 alloy had lost almost 50% of its metastable phase. Finally, the 2.0% copper ZA27 had lost merely 10% of its metastable phase. The cph zinc contained a limited number of slip systems; therefore twinning deformation was unavoidable in both fatigue and tensile testing.
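The thesis's own prediction formulae are not reproduced in the abstract; stress-life fatigue prediction of this kind is conventionally expressed in a Basquin-type form, shown here only as standard background:

```latex
% Basquin relation: stress amplitude \sigma_a versus reversals to
% failure 2N_f, with fatigue strength coefficient \sigma'_f and
% fatigue strength exponent b (both fitted per alloy and temperature).
\sigma_a = \sigma'_f \,(2N_f)^{b}
```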
Abstract:
We perform numerical simulations on a model describing a Brillouin-based temperature and strain sensor, testing its response when it is probed with relatively short pulses. Recently published experimental results [e.g., Opt. Lett. 24, 510 (1999)] showed a broadening of the Brillouin loss curve when the probe pulse duration is reduced, followed by a sudden and rather surprising reduction of the linewidth when the pulse duration becomes shorter than the acoustic relaxation time. Our study reveals the processes responsible for this behavior. We give a clear physical insight into the problem, allowing us to define the best experimental conditions required to take advantage of this effect.
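As standard background to Brillouin-based sensing (not a result of this paper): the measured quantity is the Brillouin frequency shift, whose approximately linear dependence on temperature and strain is what the sensor exploits.

```latex
% Brillouin frequency shift in an optical fibre: refractive index n,
% acoustic velocity V_a, pump wavelength \lambda. Temperature and
% strain change n and V_a, shifting \nu_B approximately linearly.
\nu_B = \frac{2\,n\,V_a}{\lambda}
```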
The compressive creep and load relaxation properties of a series of high aluminium zinc-based alloys
Abstract:
A new family of commercial zinc alloys designated ZA8, ZA12 and ZA27, the high damping capacity alloys Cosmal and Supercosmal, and the aluminium alloy LM25 were investigated for compressive creep and load relaxation behaviour under a series of temperatures and stresses. A compressive creep machine was designed to test the sand cast hollow cylindrical test specimens of these alloys. For each compressive creep experiment the variation of creep strain was presented in the form of graphs of percentage creep strain (%) versus time in seconds (s). In all cases, the curves showed the same general form of the creep curve, i.e. a primary creep stage, followed by a linear steady-state region (secondary creep). In general, it was observed that alloy ZA8 had the least primary creep among the commercial zinc-based alloys and ZA27 the greatest. The extent of primary creep increased with aluminium content up to that of ZA27 and then declined towards Supercosmal. The overall creep strength of ZA27 was generally less than that of ZA8 and ZA12, but it showed better creep strength than ZA8 and ZA12 at high temperature and high stress. Of the high damping capacity alloys, Supercosmal had less primary creep and a longer secondary creep region, and also had the lowest minimum creep rate among all the tested alloys. LM25 exhibited almost no creep at the maximum temperature and stress used in this research work. Total creep elongation was shown to be well correlated using an empirical equation. Stress exponents and activation energies were calculated and found to be consistent with the creep mechanism of dislocation climb. The primary α and β phases in the as-cast structures decomposed to lamellar phases on cooling, with some particulates at dendrite edges and grain boundaries. Further breakdown into particulate bodies occurred during creep testing, and zinc bands developed at the highest test temperature of 160°C.
The results of load relaxation testing showed that load loss initially proceeded rapidly and then diminished gradually with time. Load loss increased with temperature and almost all the curves approximated to a logarithmic decay of preload with time. ZA alloys exhibited almost the same load loss at lower temperatures, but at 120°C ZA27 improved its relative performance with the passage of time. The high damping capacity alloys and LM25 had much better resistance to load loss than the ZA alloys, and LM25 was found to be the best against load loss among these alloys. A preliminary equation was derived to correlate the retained load with time and temperature.
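The stress-exponent and activation-energy analysis mentioned above conventionally rests on the power-law form of steady-state creep, and the relaxation curves described follow a logarithmic decay of the general shape shown below. Both equations are standard forms, not the thesis's fitted expressions, and the constants (A, k, τ) are alloy-specific fitting parameters:

```latex
% Steady-state (power-law) creep: stress exponent n and activation
% energy Q; values consistent with dislocation climb were found.
\dot{\varepsilon}_s = A\,\sigma^{n}\exp\!\left(-\frac{Q}{RT}\right)

% Logarithmic relaxation of preload P_0 with time (general shape of
% the observed decay curves).
P(t) \approx P_0 - k\,\ln\!\left(1 + \frac{t}{\tau}\right)
```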
Abstract:
This thesis explores how the world-wide-web can be used to support English language teachers doing further studies at a distance. The future of education worldwide is moving towards a requirement that we, as teacher educators, use the latest web technology not as a gambit, but as a viable tool to improve learning. By examining the literature on knowledge, teacher education and web training, a model of teacher knowledge development, along with statements of advice for web developers based upon the model are developed. Next, the applicability and viability of both the model and statements of advice are examined by developing a teacher support site (http://www.philseflsupport.com) according to these principles. The data collected from one focus group of users from sixteen different countries, all studying on the same distance Masters programme, is then analysed in depth. The outcomes from the research are threefold: A functioning website that is averaging around 15,000 hits a month provides a professional contribution. An expanded model of teacher knowledge development that is based upon five theoretical principles that reflect the ever-expanding cyclical nature of teacher learning provides an academic contribution. A series of six statements of advice for developers of teacher support sites. These statements are grounded in the theoretical principles behind the model of teacher knowledge development and incorporate nine keys to effective web facilitation. Taken together, they provide a forward-looking contribution to the praxis of web supported teacher education, and thus to the potential dissemination of the research presented here. The research has succeeded in reducing the proliferation of terminology in teacher knowledge into a succinct model of teacher knowledge development. The model may now be used to further our understanding of how teachers learn and develop as other research builds upon the individual study here.
NB: Appendix 4 is only available for consultation at Aston University Library with prior arrangement.
Abstract:
The aim of this Interdisciplinary Higher Degrees project was the development of a high-speed method of photometrically testing vehicle headlamps, based on the use of image processing techniques, for Lucas Electrical Limited. Photometric testing involves measuring the illuminance produced by a lamp at certain points in its beam distribution. Headlamp performance is best represented by an iso-lux diagram, showing illuminance contours, produced from a two-dimensional array of data. Conventionally, the tens of thousands of measurements required are made using a single stationary photodetector and a two-dimensional mechanical scanning system which enables a lamp's horizontal and vertical orientation relative to the photodetector to be changed. Even using motorised scanning and computerised data-logging, the data acquisition time for a typical iso-lux test is about twenty minutes. A detailed study was made of the concept of using a video camera and a digital image processing system to scan and measure a lamp's beam without the need for the time-consuming mechanical movement. Although the concept was shown to be theoretically feasible, and a prototype system designed, it could not be implemented because of the technical limitations of commercially-available equipment. An alternative high-speed approach was developed, however, and a second prototype system designed. The proposed arrangement again uses an image processing system, but in conjunction with a one-dimensional array of photodetectors and a one-dimensional mechanical scanning system in place of a video camera. This system can be implemented using commercially-available equipment and, although not entirely eliminating the need for mechanical movement, greatly reduces the amount required, resulting in a predicted data acquisition time of about twenty seconds for a typical iso-lux test. As a consequence of the work undertaken, the company initiated an £80,000 programme to implement the system proposed by the author.
Abstract:
The primary objective of this research was to understand what kinds of knowledge and skills people use in 'extracting' relevant information from text and to assess the extent to which expert systems techniques could be applied to automate the process of abstracting. The approach adopted in this thesis is based on research in cognitive science, information science, psycholinguistics and textlinguistics. The study addressed the significance of domain knowledge and heuristic rules by developing an information extraction system, called INFORMEX. This system, which was implemented partly in SPITBOL and partly in PROLOG, used a set of heuristic rules to analyse five scientific papers of expository type, to interpret the content in relation to the key abstract elements and to extract a set of sentences recognised as relevant for abstracting purposes. The analysis of these extracts revealed that an adequate abstract could be generated. Furthermore, INFORMEX showed that a rule based system was a suitable computational model to represent experts' knowledge and strategies. This computational technique provided the basis for a new approach to the modelling of cognition. It showed how experts tackle the task of abstracting by integrating formal knowledge as well as experiential learning. This thesis demonstrated that empirical and theoretical knowledge can be effectively combined in expert systems technology to provide a valuable starting approach to automatic abstracting.
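The broad class of technique INFORMEX belongs to - scoring sentences with heuristic cue rules and extracting the highest-scoring ones - can be sketched compactly. The cue phrases and weights below are illustrative assumptions; INFORMEX's actual rules (in SPITBOL and PROLOG) were far more elaborate and domain-informed.

```python
import re

# Illustrative cue phrases with weights, standing in for the system's
# heuristic rules; real rules also used position, domain knowledge, etc.
CUE_WEIGHTS = {"in conclusion": 3, "we propose": 3, "results show": 3,
               "significant": 2, "method": 1, "therefore": 1}

def extract_sentences(text, top_n=2):
    """Score each sentence by the total weight of cue phrases it
    contains and return the top_n highest-scoring sentences."""
    sentences = [s.strip()
                 for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    def score(sentence):
        low = sentence.lower()
        return sum(w for cue, w in CUE_WEIGHTS.items() if cue in low)
    ranked = sorted(sentences, key=score, reverse=True)
    return ranked[:top_n]
```

Concatenating the extracted sentences (in document order, in a fuller version) gives the kind of extract from which an abstract can then be generated.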