940 results for local-to-zero analysis


Relevance: 100.00%

Abstract:

Principal component analysis (PCA) is a ubiquitous technique for data analysis and processing, but one which is not based upon a probability model. In this paper we demonstrate how the principal axes of a set of observed data vectors may be determined through maximum-likelihood estimation of parameters in a latent variable model closely related to factor analysis. We consider the properties of the associated likelihood function, giving an EM algorithm for estimating the principal subspace iteratively, and discuss the advantages conveyed by the definition of a probability density function for PCA.
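
To make the latent-variable formulation concrete, here is a minimal numpy sketch of an EM iteration for the PPCA model, using the matrix-form updates based on the sample covariance S (W → SW(σ²I + M⁻¹WᵀSW)⁻¹ with M = WᵀW + σ²I); the initialisation, function interface and synthetic check are illustrative, not taken from the paper.

```python
import numpy as np

def ppca_em(X, q, n_iter=200, seed=0):
    """Minimal EM estimation of the PPCA model x = W z + mu + eps,
    z ~ N(0, I_q), eps ~ N(0, sigma2 * I_d).
    Returns the factor matrix W (whose columns span the principal
    subspace), the noise variance sigma2 and the mean mu.
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    mu = X.mean(axis=0)
    S = np.cov(X, rowvar=False, bias=True)            # d x d sample covariance
    W = rng.standard_normal((d, q))                   # arbitrary initialisation
    sigma2 = 1.0
    for _ in range(n_iter):
        M = W.T @ W + sigma2 * np.eye(q)              # q x q
        Minv = np.linalg.inv(M)
        SW = S @ W
        W_new = SW @ np.linalg.inv(sigma2 * np.eye(q) + Minv @ W.T @ SW)
        sigma2 = np.trace(S - SW @ Minv @ W_new.T) / d
        W = W_new
    return W, sigma2, mu

# Sanity check on synthetic data: the estimated subspace should match
# the span of the leading eigenvectors of the sample covariance.
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 2)) @ rng.standard_normal((2, 5)) \
    + 0.1 * rng.standard_normal((500, 5))
W, sigma2, mu = ppca_em(X, q=2)
```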

Relevance: 100.00%

Abstract:

The methods used by the UK police to investigate complaints of rape have unsurprisingly come under much scrutiny in recent times, with a 2007 joint report on behalf of HM Crown Prosecution Service Inspectorate and HM Inspectorate of Constabulary concluding that there were many areas where improvements should be made. The research reported here forms part of a larger project which draws on various discourse analytical tools to identify the processes at work during police interviews with women reporting rape. Drawing on a corpus of video-recorded police interviews with women reporting rape, this study applies a two-pronged analysis to reveal the ideologies at work in these interviews. Firstly, an analysis of the discourse markers ‘well’ and ‘so’ demonstrates the control exerted on the interaction by interviewing officers, as they attach importance to certain facts while omitting much of the information provided by the victim. Secondly, the interpretative repertoires relied upon by officers to ‘make sense’ of victims’ accounts are subject to scrutiny. As well as providing micro-level analyses which demonstrate processes of interactional control at the local level, the findings of these analyses can be shown to relate to a wider context – specifically prevailing ideologies about sexual violence in society as a whole.

Relevance: 100.00%

Abstract:

This article argues against the merger folklore that maintains that a merger negatively affects well-being and work attitudes primarily through the threat of job insecurity. We hold that the workplace is not only a resource for fulfilling a person's financial needs, but that it is an important component of the self-concept in terms of identification with the organization, as explained by social identity theory. We unravel the key concepts of the social identity approach relevant to the analysis of mergers and review evidence from previous studies. Then, we present a study conducted during a merger to substantiate our ideas about the effects of post-merger organizational identification above and beyond the effects of perceived job insecurity. We recommend that managers should account for these psychological effects through the provision of continuity and specific types of communication. © 2006 British Academy of Management.

Relevance: 100.00%

Abstract:

The pattern of illumination on an undulating surface can be used to infer its 3-D form (shape from shading). But the recovery of shape would be invalid if the shading actually arose from reflectance variation. When a corrugated surface is painted with an albedo texture, the variation in local mean luminance (LM) due to shading is accompanied by a similar modulation in texture amplitude (AM). This is not so for reflectance variation, nor for roughly textured surfaces. We used a haptic matching technique to show that modulations of texture amplitude play a role in the interpretation of shape from shading. Observers were shown plaid stimuli comprising LM and AM combined in-phase (LM+AM) on one oblique component and in anti-phase (LM-AM) on the other. Stimuli were presented via a modified ReachIN workstation allowing the co-registration of visual and haptic stimuli. In the first experiment, observers were asked to adjust the phase of a haptic surface, which had the same orientation as the LM+AM combination, until its peak in depth aligned with the visually perceived peak. The resulting alignments were consistent with the use of a lighting-from-above prior. In the second experiment, observers were asked to adjust the amplitude of the haptic surface to match that of the visually perceived surface. Observers chose relatively large amplitude settings when the haptic surface was oriented and phase-aligned with the LM+AM cue. When the haptic surface was aligned with the LM-AM cue, amplitude settings were close to zero. Thus the LM/AM phase relation is a significant visual depth cue, and is used to discriminate between shading and reflectance variations. [Supported by the Engineering and Physical Sciences Research Council, EPSRC].
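
As an illustration of the stimulus construction described above, the sketch below generates a plaid in which luminance modulation (LM) and texture-amplitude modulation (AM) are combined in-phase on one oblique component and in anti-phase on the other; the carrier texture, image size, spatial frequency and modulation depths are illustrative assumptions, not the parameters used in the experiments.

```python
import numpy as np

# Hypothetical reconstruction of the LM/AM plaid stimuli: a random albedo
# texture whose local mean luminance (LM) and texture amplitude (AM) are
# modulated by two oblique sinusoids, in-phase (LM+AM) on one oblique
# component and in anti-phase (LM-AM) on the other.
size, cycles = 256, 4
y, x = np.mgrid[0:size, 0:size] / size
carrier = np.random.default_rng(1).uniform(-1, 1, (size, size))  # albedo texture

oblique_a = np.sin(2 * np.pi * cycles * (x + y))
oblique_b = np.sin(2 * np.pi * cycles * (x - y))

m_lm, m_am = 0.2, 0.2                                # modulation depths
lm = 1 + m_lm * (oblique_a + oblique_b)              # luminance modulation
am = 1 + m_am * (oblique_a - oblique_b)              # amplitude modulation:
                                                     #   in-phase on a, anti-phase on b
plaid = np.clip(0.5 * lm + 0.15 * am * carrier, 0, 1)  # mean luminance plus modulated texture
```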

Relevance: 100.00%

Abstract:

The authors study experimentally ~10 ps return-to-zero pulse propagation near the net dispersion zero of an optical fibre transmission line. Stable near-jitter-free propagation was observed over 70 Mm. Pulse stabilisation and ASE suppression were achieved through the saturable absorber mechanism of nonlinear polarisation rotation.

Relevance: 100.00%

Abstract:

This thesis presents a theoretical investigation of three topics concerned with nonlinear optical pulse propagation in optical fibres. The techniques used are mathematical analysis and numerical modelling.

Firstly, dispersion-managed (DM) solitons in fibre lines employing a weak dispersion map are analysed by means of a perturbation approach. In the case of small dispersion map strengths the average pulse dynamics is described by a perturbed nonlinear Schrödinger (NLS) equation. Applying a perturbation theory based on the Inverse Scattering Transform method, an analytic expression for the envelope of the DM soliton is derived. This expression correctly predicts the power enhancement arising from the dispersion management.

Secondly, autosoliton transmission in DM fibre systems with periodic in-line deployment of nonlinear optical loop mirrors (NOLMs) is investigated. The use of in-line NOLMs is addressed as a general technique for all-optical passive 2R regeneration of return-to-zero data in high-speed transmission systems with strong dispersion management. By system optimisation, the feasibility of ultra-long single-channel and wavelength-division multiplexed data transmission at bit rates ≥ 40 Gbit/s in standard fibre-based systems is demonstrated. The tolerance limits of the results are defined.

Thirdly, solutions of the NLS equation with gain and normal dispersion, which describes optical pulse propagation in an amplifying medium, are examined. A self-similar parabolic solution in the energy-containing core of the pulse is matched through Painlevé functions to the linear low-amplitude tails. The analysis provides a full description of the features of high-power pulses generated in an amplifying medium.
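
As an illustration of the kind of numerical modelling referred to above, the following is a minimal split-step Fourier integrator for the scalar NLS equation; the fibre parameters and the fundamental-soliton test are illustrative choices, not values taken from the thesis.

```python
import numpy as np

def split_step_nls(a0, t, z_total, nz, beta2, gamma):
    """Propagate a complex envelope a0(t) over a fibre of length z_total
    by the symmetric split-step Fourier method for the scalar NLS
        dA/dz = -i*(beta2/2) d^2A/dT^2 + i*gamma*|A|^2 A   (SI units).
    """
    dz = z_total / nz
    dt = t[1] - t[0]
    w = 2 * np.pi * np.fft.fftfreq(t.size, d=dt)          # angular frequency grid
    half_linear = np.exp(0.5j * beta2 * w**2 * (dz / 2))  # dispersion over half a step
    a = a0.astype(complex)
    for _ in range(nz):
        a = np.fft.ifft(half_linear * np.fft.fft(a))      # D/2
        a = a * np.exp(1j * gamma * np.abs(a)**2 * dz)    # full nonlinear step
        a = np.fft.ifft(half_linear * np.fft.fft(a))      # D/2
    return a

# Illustrative check: a fundamental soliton (N = 1) should propagate unchanged.
T0, beta2, gamma = 10e-12, -21.7e-27, 1.3e-3              # s, s^2/m, 1/(W*m)
P0 = abs(beta2) / (gamma * T0**2)                         # N = 1 peak power
t = np.linspace(-40, 40, 2**12) * T0
a0 = np.sqrt(P0) / np.cosh(t / T0)
a_out = split_step_nls(a0, t, z_total=50e3, nz=2000, beta2=beta2, gamma=gamma)
```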

Relevance: 100.00%

Abstract:

Surface compositional changes in GaAs due to RF plasmas of different gases have been investigated by XPS, and etch rates were measured using AFM. Angle-resolved XPS (ARXPS) was also employed for depth analysis of the composition of the surface layers. An important part of this study was the determination of oxide thickness from XPS data. The study of surface-plasma interaction was undertaken by correlating the results of surface analysis with plasma diagnostics. Different experiments were designed to accurately measure the binding energies associated with the Ga 3d, Ga 2p3/2 and Ga LMM peaks using XPS analysis and to propose assignments in terms of the oxides of GaAs. Along with GaAs wafers, reference compounds such as metallic Ga and Ga2O3 powder were used. A separate study was undertaken to identify the oxides formed on the GaAs surface during and after plasma processing. Surface compositional changes occurring after plasma treatment but prior to surface analysis are considered, with particular reference to the oxides formed in air on the activated surface. Samples exposed to ambient air for different periods of time, and also to pure oxygen, were analysed. Models of surface processes are proposed to explain the stoichiometry changes observed with the inert and reactive plasmas used. To help understand the mechanisms responsible for surface effects during plasma treatment, computer simulation using the SRIM code was also undertaken. Based on the simulation and experimental results, models of surface phenomena are proposed. The experimental and simulated results are discussed in relation to current theories and the published results of other authors. The experimental errors introduced by impurities and by data acquisition and processing are also evaluated.

Relevance: 100.00%

Abstract:

This chapter explores the different ways in which discourse-analytic approaches reveal the ‘meaningfulness’ of text and talk. It reviews four diverse approaches to discourse analysis of particular value for current research in linguistics: Conversation Analysis (CA), Discourse Analysis (DA), Critical Discourse Analysis (CDA) and Feminist Post-structuralist Discourse Analysis (FPDA). Each approach is examined in terms of its background, motivation, key features, and possible strengths and limitations in relation to the field of linguistics. A key way to schematize discourse-analytic methodology is in terms of its relationship between microanalytical approaches, which examine the finer detail of linguistic interactions in transcripts, and macroanalytical approaches, which consider how broader social processes work through language (Heller, 2001). This chapter assesses whether there is a strength in a discourse-analytic approach that aligns itself exclusively with either a micro- or macrostrategy, or whether, as Heller suggests, the field needs to find a way of ‘undoing’ the micro–macro dichotomy in order to produce richer, more complex insights within linguistic research.

Relevance: 100.00%

Abstract:

Advances in the generation and interpretation of proteomics data have spurred a transition from focusing on protein identification to functional analysis. Here we review recent proteomics results that have elucidated new aspects of the roles and regulation of signal transduction pathways in cancer using the epidermal growth factor receptor (EGFR), ERK and breakpoint cluster region (BCR)-ABL1 networks as examples. The emerging theme is to understand cancer signalling as networks of multiprotein machines which process information in a highly dynamic environment that is shaped by changing protein interactions and post-translational modifications (PTMs). Cancerous genetic mutations derange these protein networks in complex ways that are tractable by proteomics.

Relevance: 100.00%

Abstract:

This article examines the negotiation of face in post-observation feedback conferences on an initial teacher training programme. The conferences were held in groups with one trainer and up to four trainees and followed a set of generic norms. These norms include the right to offer advice and to criticise, speech acts which are often considered to be face-threatening in more everyday contexts. However, as the data analysis shows, participants also interact in ways that challenge the generic norms, some of which might be considered more conventionally face-attacking. The article argues that face should be analysed at the level of interaction (Haugh and Bargiela-Chiappini, 2010) and that situated and contextual detail is relevant to its analysis. It suggests that linguistic ethnography, which 'marries' (Wetherell, 2007) linguistics and ethnography, provides a useful theoretical framework for doing so. To this end the study draws on real-life talk-in-interaction (from transcribed recordings), the participants' perspectives (from focus groups and interviews) and situated detail (from fieldnotes) to produce a contextualised and nuanced analysis. © 2011 Elsevier B.V.

Relevance: 100.00%

Abstract:

This study is concerned with the analysis of tear proteins, paying particular attention to the state of the tears (e.g. non-stimulated, reflex, closed-eye) created during sampling, and with the assessment of their interactions with hydrogel contact lenses. The work has involved the use of a variety of biochemical and immunological analytical techniques for the measurement of proteins (a) in tears, (b) on the contact lens, and (c) in the eluate of extracted lenses. Although a diverse range of tear components may contribute to contact lens spoilation, proteins were of particular interest in this study because of their theoretical potential for producing immunological reactions. Although normal host proteins in their natural state are generally not treated as dangerous or non-self, those which undergo denaturation or suffer a conformational change may provoke an excessive and unnecessary immune response. A novel on-lens cell-based assay has been developed and exploited in order to study the role of the ubiquitous cell adhesion glycoprotein, vitronectin, in tears and contact lens wear under various parameters. Vitronectin, whose levels are known to increase in the closed-eye environment and are shown here to increase during contact lens wear, is an important immunoregulatory protein and may be a prominent marker of inflammatory activity. Immunodiffusion assays were developed and optimised for use in tear analysis, and in a series of subsequent studies were used, for example, in the measurement of albumin, lactoferrin, IgA and IgG. The immunodiffusion assays were then applied to the investigation of the closed-eye environment, an environment which has been described as sustaining a state of sub-clinical inflammation. The role and presence of a less well understood and investigated protein, kininogen, were also examined, in particular in relation to contact lens wear. Difficulties arise when attempting to extract proteins from the contact lens in order to examine the individual nature of the proteins involved. These problems were partly alleviated with the use of the on-lens cell assay and a UV spectrophotometry assay, which can analyse the lens surface and bulk respectively, the latter yielding only total protein values. Various lens extraction methods were investigated to remove protein from the lens, and the most efficient was employed in the analysis of lens extracts. Counter-immunoelectrophoresis, an immunodiffusion assay, was then applied to the analysis of albumin, lactoferrin, IgA and IgG in the resultant eluates.

Relevance: 100.00%

Abstract:

The goal of this study is to determine if various measures of contraction rate are regionally patterned in written Standard American English. In order to answer this question, this study employs a corpus-based approach to data collection and a statistical approach to data analysis. Based on a spatial autocorrelation analysis of the values of eleven measures of contraction across a 25 million word corpus of letters to the editor representing the language of 200 cities from across the contiguous United States, two primary regional patterns were identified: easterners tend to produce relatively few standard contractions (not contraction, verb contraction) compared to westerners, and northeasterners tend to produce relatively few non-standard contractions (to contraction, non-standard not contraction) compared to southeasterners. These findings demonstrate that regional linguistic variation exists in written Standard American English and that regional linguistic variation is more common than is generally assumed.
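
As an illustration of the statistical approach described above, the sketch below computes a global spatial autocorrelation statistic (Moran's I) for one contraction measure over a set of city locations; the inverse-distance weighting scheme and the bandwidth are illustrative assumptions, not the study's actual procedure.

```python
import numpy as np

def morans_i(values, coords, bandwidth_km=500.0):
    """Global Moran's I for one contraction measure observed at n cities.

    values : (n,) contraction rate per city
    coords : (n, 2) latitude and longitude in degrees
    The inverse-distance weights with a distance cutoff are an
    illustrative choice, not the weighting used in the study.
    """
    x = np.asarray(values, dtype=float)
    c = np.radians(np.asarray(coords, dtype=float))
    n = x.size
    # great-circle (haversine) distances between all pairs of cities, in km
    dlat = c[:, 0, None] - c[None, :, 0]
    dlon = c[:, 1, None] - c[None, :, 1]
    h = np.sin(dlat / 2) ** 2 + np.cos(c[:, 0, None]) * np.cos(c[None, :, 0]) * np.sin(dlon / 2) ** 2
    d = 2 * 6371.0 * np.arcsin(np.sqrt(h))
    w = np.zeros_like(d)
    near = (d > 0) & (d <= bandwidth_km)
    w[near] = 1.0 / d[near]                    # inverse-distance spatial weights
    z = x - x.mean()
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Values near +1 indicate that nearby cities have similar contraction rates
# (regional patterning); values near 0 indicate no spatial structure.
```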

Relevance: 100.00%

Abstract:

The concept of a task is fundamental to the discipline of ergonomics. Approaches to the analysis of tasks began in the early 1900s. These approaches have evolved and developed to the present day, when there is a vast array of methods available. Some of these methods are specific to particular contexts or applications; others are more general. However, whilst many of these analyses allow tasks to be examined in detail, they do not act as tools to aid the design process or the designer. The present thesis examines the use of task analysis in a process control context, and in particular the use of task analysis to specify operator information and display requirements in such systems. The first part of the thesis examines the theoretical aspect of task analysis and presents a review of the methods, issues and concepts relating to task analysis. A review of over 80 methods of task analysis was carried out to form a basis for the development of a task analysis method to specify operator information requirements in industrial process control contexts. Of the methods reviewed, Hierarchical Task Analysis was selected to provide such a basis and was developed to meet the criteria outlined for such a method. The second section outlines the practical application and evolution of the developed task analysis method. Four case studies were used to examine the method in an empirical context. The case studies represent a range of plant contexts and types: complex and simple, batch and continuous, and high-risk and low-risk processes. The theoretical and empirical issues are drawn together and a method is developed that provides a task analysis technique to specify operator information requirements and the first stages of a tool to aid the design of VDU displays for process control.
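
As an illustration of how a Hierarchical Task Analysis can serve as a tool for specifying operator information requirements, the sketch below represents an HTA as a small tree structure from which the information needs of a goal and its subtasks can be collected; the task names and fields are hypothetical, not taken from the case studies.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    """One node in a Hierarchical Task Analysis: a goal, the plan that
    orders its subtasks, and the information the operator needs for it.
    Field names and the example tasks are hypothetical illustrations."""
    goal: str
    plan: str = ""
    info_requirements: List[str] = field(default_factory=list)
    subtasks: List["Task"] = field(default_factory=list)

    def all_info_requirements(self):
        """Collect information requirements over the whole sub-hierarchy."""
        reqs = list(self.info_requirements)
        for sub in self.subtasks:
            reqs.extend(sub.all_info_requirements())
        return reqs

# Hypothetical fragment of an HTA for a continuous process plant.
start_up = Task(
    goal="0: Start up reactor",
    plan="Do 1, then 2; repeat 2 until temperature is stable.",
    subtasks=[
        Task(goal="1: Establish feed flow",
             info_requirements=["feed flow rate", "valve position"]),
        Task(goal="2: Bring reactor to temperature",
             info_requirements=["reactor temperature", "heater power"]),
    ],
)
print(start_up.all_info_requirements())
```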

Relevance: 100.00%

Abstract:

This thesis has two aims. First, it sets out to develop an alternative methodology for the investigation of risk homeostasis theory (RHT). It is argued that the current methodologies of the pseudo-experimental design and post hoc analysis of road-traffic accident data both have their limitations, and that the newer 'game' type simulation exercises are also, but for different reasons, incapable of testing RHT predictions. The alternative methodology described here is based on the simulation of physical risk with intrinsic reward rather than a 'points pay-off'. The second aim of the thesis is to examine a number of predictions made by RHT through the use of this alternative methodology. Since the pseudo-experimental design and post hoc analysis of road-traffic data are both ill-suited to the investigation of that part of RHT which deals with the role of utility in determining risk-taking behaviour in response to a change in environmental risk, and since the concept of utility is critical to RHT, the methodology reported here is applied to the specific investigation of utility. Attention too is given to the question of which behavioural pathways carry the homeostasis effect, and whether those pathways are 'local' to the nature of the change in environmental risk. It is suggested that investigating RHT through this new methodology holds a number of advantages and should be developed further in an attempt to answer the RHT question. It is suggested too that the methodology allows RHT to be seen in a psychological context, rather than the statistical context that has so far characterised its investigation. The experimental findings reported here are in support of hypotheses derived from RHT and would therefore seem to argue for the importance of the individual and collective target level of risk, as opposed to the level of environmental risk, as the major determinant of accident loss.

Relevance: 100.00%

Abstract:

The state of the art in productivity measurement and analysis shows a gap between simple methods that have little relevance in practice and sophisticated mathematical theory that is unwieldy for strategic and tactical planning purposes, particularly at company level. This thesis extends the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added-value method in particular. Based on this concept and method, three kinds of computerised model have been developed: two deterministic, called sensitivity analysis and deterministic appraisal, and one stochastic, called risk simulation. They cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution. The models are designed to be flexible and can be adjusted according to the available computer capacity, the expected accuracy and the presentation of the output. The stochastic model is based on the assumptions of statistical independence between individual variables and of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required. The results of applying these measurement and planning models to British motor vehicle manufacturing companies are presented and discussed.
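
As an illustration of the risk-simulation approach described above, the sketch below forecasts two component variables of an added-value productivity ratio with degree-four polynomial trends and then samples them as independent normal variates to obtain a distribution for the planned productivity index; the series, variable names and figures are illustrative, not the thesis data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative historical series (not the thesis data): added value and
# total labour cost, in arbitrary money units, for ten past periods.
years = np.arange(10)
added_value = np.array([120, 126, 131, 140, 148, 151, 160, 172, 178, 190], float)
labour_cost = np.array([ 80,  83,  85,  90,  93,  96, 101, 104, 109, 113], float)

def forecast(series, horizon=1, degree=4):
    """Fit a degree-four polynomial trend and extrapolate one period ahead,
    returning the point forecast and the residual standard deviation."""
    coeffs = np.polyfit(years, series, degree)
    resid = series - np.polyval(coeffs, years)
    return np.polyval(coeffs, years[-1] + horizon), resid.std(ddof=degree + 1)

av_hat, av_sd = forecast(added_value)
lc_hat, lc_sd = forecast(labour_cost)

# Risk simulation: sample the component variables as independent normal
# variates (the independence and normality assumptions stated above) and
# form the distribution of the added-value productivity index.
n = 100_000
av = rng.normal(av_hat, av_sd, n)
lc = rng.normal(lc_hat, lc_sd, n)
productivity = av / lc

print(f"median {np.median(productivity):.3f}, "
      f"90% interval {np.percentile(productivity, 5):.3f}-{np.percentile(productivity, 95):.3f}")
```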