967 results for semi-classical analysis
Abstract:
This research examines a behaviour-based safety (BBS) intervention within a paper mill in the South East of England. Following this intervention, two other mills are examined for comparison: one with an established BBS programme and the other improving its safety management system through management ownership. BBS programmes have become popular within the UK, but most of the research on their efficacy is carried out by the BBS providers themselves. This thesis aims to evaluate a BBS intervention from a standpoint which is not commercially biased in favour of BBS schemes. The aim of a BBS scheme is to change either personnel behaviours or attitudes, which in turn will positively affect the organisation's safety culture. The research framework involved a qualitative methodology in order to examine the effects of the intervention on the paper mill's safety culture. The techniques used were questionnaires and semi-structured interviews, in addition to observation and discussions made possible by the author's position as participant observer. The results demonstrated a failure to improve any aspect of the mill's safety culture, which worsened following the BBS intervention. Issues such as trust, morale, communication and support of management showed significant signs of negative workforce response. The paper mill where the safety management system approach was utilised demonstrated a significantly improved safety culture and achieved site ownership from middle managers and supervisors. Research has demonstrated that a solid foundation is required prior to successfully implementing a BBS programme. For a programme to work there must be middle management support in addition to senior management commitment. If a trade union actively distances itself from BBS, the programme is also unlikely to be effective.
This thesis proposes that BBS observation programmes are not suitable for the papermaking industry, particularly when staffing levels are low due to challenging economic conditions. Observers are not available during high-hazard situations, which suggests that BBS implementation is not the correct intervention for the paper industry.
Abstract:
The purpose of this paper is to present the outcomes of a so-called "employability management needs analysis" that is meant to provide more insight into current employability management activities and their possible benefits for Information and Communication Technology (ICT) professionals working in Small- and Medium-sized Enterprises (SMEs) throughout Europe. A considerable series of interviews (N=107) was conducted with managers in SMEs in seven European countries: Germany, Greece, Italy, the Netherlands, Norway, Poland, and the UK. A semi-structured interview protocol was used to cover three issues: employability (13 items), ageing (8 items), and future developments and requirements (13 items). Analysis of all final interview transcriptions was carried out at national level using an elaborate common coding scheme. Although an interest in employability emerged, actual policy and action lagged behind; this appeared connected to the recession in the ICT sector at the time of the investigation and to the developmental stage of the sector in each participating country. Ageing was not seen as a major issue in the ICT sector because managers considered ICT to be a relatively young sector. There appeared to be a serious lack of investment in the development of expertise of ICT professionals. Generalization of the results to large organizations in the ICT sector should be made with caution. The interview protocol developed is of value for further research and complements survey research undertaken within the employability field of study. It can be concluded that proactive HRM (Human Resource Management) policies and strategies are essential, even in times of economic downturn. Employability management activities are especially important in the light of current career issues. The study advances knowledge regarding HRM practices adopted by SMEs in the ICT sector, especially as there is a gap in knowledge about career development issues in that particular sector.
Abstract:
A re-examination of fundamental concepts and a formal structuring of the waveform analysis problem is presented in Part I: for example, the nature of frequency is examined, and a novel alternative to the classical methods of detection is proposed and implemented which has the advantages of speed and independence from amplitude. Waveform analysis provides the link between Parts I and II. Part II is devoted to Human Factors and the Adaptive Task Technique. The historical, technical and intellectual development of the technique is traced in a review which examines the evidence of its advantages relative to non-adaptive fixed-task methods of training, skill assessment and man-machine optimisation. A second review examines research evidence on the effect of vibration on manual control ability. Findings are presented in terms of percentage increment or decrement in performance relative to performance without vibration in the range 0-0.6 Rms 'g'. Primary task performance was found to vary by as much as 90% between tasks at the same Rms 'g'; differences in task difficulty accounted for this variation. Within tasks, vibration-added difficulty accounted for the effects of vibration intensity. Secondary tasks were found to be largely insensitive to vibration, except those which involved fine manual adjustment of minor controls. Three experiments are then reported in which an adaptive technique was used to measure the percentage task difficulty added by vertical random and sinusoidal vibration to a 'Critical Compensatory Tracking' task. At vibration intensities between 0 and 0.09 Rms 'g' it was found that random vibration added (24.5 × Rms 'g')/7.4 × 100% to the difficulty of the control task. An equivalence relationship between random and sinusoidal vibration effects was established based upon added task difficulty.
Waveform analyses applied to the experimental data served to validate phase plane analysis and uncovered the development of a control strategy and possibly a vibration-isolation strategy. The submission ends with an appraisal of the subjects mentioned in the thesis title.
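The added-difficulty relationship reported above can be expressed directly. A minimal sketch, using only the formula quoted in the abstract, (24.5 × Rms 'g')/7.4 × 100%; the function name and the range check are illustrative:

```python
def added_difficulty_percent(rms_g):
    """Percentage of task difficulty added by vertical random vibration,
    per the relationship reported in the abstract:
    added difficulty = (24.5 * Rms'g') / 7.4 * 100%.
    The relationship was established only for 0-0.09 Rms'g'."""
    if not 0.0 <= rms_g <= 0.09:
        raise ValueError("relationship holds only for 0-0.09 Rms'g'")
    return 24.5 * rms_g / 7.4 * 100.0

# At the highest intensity studied, about 29.8% difficulty is added.
print(round(added_difficulty_percent(0.09), 1))  # → 29.8
```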
Abstract:
This thesis comprises two main objectives. The first involved stereochemical studies of chiral 4,6-diamino-1-aryl-1,2-dihydro-s-triazines and an investigation of how the different conformations of these stereoisomers may affect their binding affinity to the enzyme dihydrofolate reductase (DHFR). The ortho-substituted 1-aryl-1,2-dihydro-s-triazines were synthesised by the three-component method. An ortho-substitution at the C6' position was observed when meta-azidocycloguanil was decomposed in acid. The ortho-substituent restricts free rotation, which gives rise to atropisomerism. Ortho-substituted 4,6-diamino-1-aryl-2-ethyl-1,2-dihydro-2-methyl-s-triazine contains two elements of chirality and therefore exists as four stereoisomers: (S,aR), (R,aS), (R,aR) and (S,aS). The energy barriers to rotation of these compounds were calculated by a semi-empirical molecular orbital program called MOPAC and were found to be in excess of 23 kcal/mol. The diastereoisomers were resolved and enriched by C18 reversed-phase h.p.l.c. Nuclear Overhauser effect experiments revealed that (S,aR) and (R,aS) were the more stable pair of stereoisomers and therefore existed as the major component. The minor diastereoisomers showed greater binding affinity for the rat liver DHFR in an in vitro assay. The second objective entailed investigating the possibility of retaining DHFR inhibitory activity by replacing the classical diamino heterocyclic moiety with an amidinyl group. 4-Benzylamino-3-nitro-N,N-dimethyl-phenylamidine was synthesised in two steps. One of the two phenylamidines showed weak inhibition of the rat liver DHFR. This weak activity may be due to the failure of the inhibitor molecule to form strong hydrogen bonds with residue Glu-30 at the active site of the enzyme.
Abstract:
The extent to which the surface parameters of Progressive Addition Lenses (PALs) affect successful patient tolerance was investigated. Several optico-physical evaluation techniques were employed, including a newly constructed surface reflection device which was shown to be of value for assessing semi-finished PAL blanks. Detailed physical analysis was undertaken using a computer-controlled focimeter, and from these data iso-cylindrical and mean spherical plots were produced for each PAL studied. Base curve power was shown to have little impact upon the distribution of PAL astigmatism. A power increase in reading addition primarily caused a lengthening and narrowing of the lens progression channel. Empirical measurements also indicated a marginal steepening of the progression power gradient with an increase in reading addition power. A sample of the PAL-wearing population was studied using patient records and questionnaire analysis (90% of questionnaires were returned). This subjective analysis revealed the reading portion to be the most troublesome lens zone and showed that patients with high astigmatism (> 2.00D) adapt more readily to PALs than those with spherical or low cylindrical (< 2.00D) corrections. The psychophysical features of PALs were then investigated. Both grating visual acuity (VA) and contrast sensitivity (CS) were shown to be reduced with an increase in eccentricity from the central umbilical line. Two sample populations (N = 20) of successful and unsuccessful PAL wearers were assessed for differences in their visual performance and their adaptation to optically induced distortion. The possibility of dispensing errors being the cause of poor patient tolerance amongst the unsuccessful wearer group was investigated and discounted. The contrast sensitivity of the successful group was significantly greater than that of the unsuccessful group. No differences in adaptation to or detection of curvature distortion were evinced between these presbyopic groups.
Abstract:
The finite element process is now used almost routinely as a tool of engineering analysis. From early days, a significant effort has been devoted to developing simple, cost-effective elements which adequately fulfil accuracy requirements. In this thesis we describe the development and application of one of the simplest elements available for the statics and dynamics of axisymmetric shells. A semi-analytic truncated cone stiffness element has been formulated and implemented in a computer code: it has two nodes with five degrees of freedom at each node, circumferential variations in displacement field are described in terms of trigonometric series, transverse shear is accommodated by means of a penalty function, and rotary inertia is allowed for. The element has been tested in a variety of applications in the statics and dynamics of axisymmetric shells subjected to a variety of boundary conditions. Good results have been obtained for thin and thick shell cases.
Abstract:
The reliability of printed circuit board assemblies under dynamic environments, such as those found on board airplanes, ships and land vehicles, is receiving more attention. This research analyses the dynamic characteristics of the printed circuit board (PCB) supported by edge retainers and plug-in connectors. By modelling the wedge retainer and connector as simply supported boundary conditions with appropriate rotational spring stiffnesses along their respective edges, with the aid of finite element codes, natural frequencies in close agreement with experimental natural frequencies are obtained for the board. For a PCB supported by two opposite wedge retainers and a plug-in connector, with its remaining edge free of any restraint, it is found that these real supports behave somewhere between the simply supported and clamped boundary conditions, providing a percentage fixity of 39.5% more than the classical simply supported case. By using an eigensensitivity method, the rotational stiffnesses representing the boundary supports of the PCB can be updated effectively, and the updated model is capable of representing the dynamics of the PCB accurately. The result shows that the percentage error in the fundamental frequency of the PCB finite element model is substantially reduced from 22.3% to 1.3%. The procedure demonstrates the effectiveness of using only the vibration test frequencies as reference data when the mode shapes of the original untuned model are almost identical to the referenced modes/experimental data. When only modal frequencies are used in model improvement, the analysis is very much simplified. Furthermore, the time taken to obtain the experimental data is substantially reduced, as the experimental mode shapes are not required. In addition, this thesis advocates a relatively simple method of determining the support locations for maximising the fundamental frequency of vibrating structures.
The technique is simple and does not require any optimisation or sequential search algorithm in the analysis. The key to the procedure is to position the necessary supports so as to eliminate the lower modes of the original configuration. This is accomplished by introducing point supports along the nodal lines of the highest possible mode of the original configuration, so that all the lower modes are eliminated by the introduction of the new or extra supports to the structure. It also proposes inspecting the average driving point residues along the nodal lines of vibrating plates to find the optimal locations of the supports. Numerical examples are provided to demonstrate its validity. When applied to the PCB supported on three sides by two wedge retainers and a connector, it is found that the single point constraint yielding the maximum fundamental frequency is located at the mid-point of the nodal line, namely node 39. This point support has the effect of increasing the structure's fundamental frequency from 68.4 Hz to 146.9 Hz, or 115% higher.
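The idea of updating a stiffness parameter from measured frequencies alone can be illustrated on the simplest possible model. This is a toy single-degree-of-freedom sketch, not the eigensensitivity method of the thesis; the mass, stiffness and measured frequency values are hypothetical:

```python
import math

def natural_freq_hz(k, m):
    # Undamped natural frequency of a single-DOF spring-mass system.
    return math.sqrt(k / m) / (2.0 * math.pi)

def update_stiffness(k0, m, f_measured, steps=20):
    """Toy frequency-based model updating: scale the stiffness k until
    the model frequency matches the measured one. Since f ~ sqrt(k),
    k scales with (f_measured / f_model)^2."""
    k = k0
    for _ in range(steps):
        f_model = natural_freq_hz(k, m)
        k *= (f_measured / f_model) ** 2
    return k

m = 0.2        # kg, hypothetical modal mass
k0 = 3.7e4     # N/m, initial support-stiffness guess (f0 ≈ 68.4 Hz)
f_meas = 83.5  # Hz, hypothetical measured fundamental frequency

k_updated = update_stiffness(k0, m, f_meas)
print(round(natural_freq_hz(k_updated, m), 1))  # → 83.5
```

The real procedure updates several rotational spring stiffnesses against several modal frequencies at once, but the principle — driving the model frequencies onto the measured ones without needing mode shapes — is the same.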
Abstract:
Shopping behavior is often exclusively studied through consumer purchases, since they are an easily measurable output. Still, the observation of in-store physical behavior (paths, moves and actions) is crucial, as is the quantification of its impact on purchases. Using an innovative PDA tool to precisely record and time-stamp consumers' moves and gestures, we extend the classical Market Basket Analysis (MBA) by integrating this new kind of information. We draw associations not only from purchases but also from in-store consumer moves and actions. We compare results of our new method with classical MBA results and show a significant improvement.
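The extension described above amounts to treating recorded moves and actions as extra "items" in each basket before mining associations. A minimal sketch of the support/confidence machinery of classical MBA applied to such mixed baskets; the item and action names are illustrative, not taken from the paper:

```python
# Illustrative baskets mixing purchases with recorded in-store actions
# (item and action names are hypothetical).
baskets = [
    {"buy:pasta", "buy:sauce", "touch:wine", "stop:aisle3"},
    {"buy:pasta", "buy:sauce", "stop:aisle3"},
    {"buy:wine", "touch:wine"},
    {"buy:pasta", "touch:wine", "stop:aisle3"},
]

def support(itemset):
    # Fraction of baskets containing every element of the itemset.
    return sum(itemset <= b for b in baskets) / len(baskets)

def confidence(antecedent, consequent):
    # Confidence of the rule antecedent -> consequent.
    return support(antecedent | consequent) / support(antecedent)

# A rule linking a physical move to a purchase:
print(confidence({"stop:aisle3"}, {"buy:pasta"}))  # → 1.0
```

Classical MBA would only ever see the `buy:` items; including the move and action events lets rules such as "stopping in aisle 3 → buying pasta" be measured directly.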
Abstract:
Shopping behavior is often exclusively studied through consumer purchases, since they are an easily measurable output. Still, the observation of in-store physical behavior (path, moves and actions) is crucial, as is the quantification of its impact on purchases. Using an innovative PDA tool to precisely record and time-stamp consumers' moves and actions, we extend the classical Market Basket Analysis (MBA) by integrating this new information: associations between product categories are measured not only from purchases but also from consumer physical behavior. We compare results of our new method with classical MBA results and show a significant improvement.
Abstract:
The use of quantitative methods has become increasingly important in the study of neuropathology, and especially in neurodegenerative disease. Disorders such as Alzheimer's disease (AD) and the frontotemporal dementias (FTD) are characterized by the formation of discrete, microscopic, pathological lesions which play an important role in pathological diagnosis. This chapter reviews the advantages and limitations of the different methods of quantifying pathological lesions in histological sections, including estimates of density, frequency, coverage, and the use of semi-quantitative scores. The sampling strategies by which these quantitative measures can be obtained from histological sections, including plot or quadrat sampling, transect sampling, and point-quarter sampling, are described. In addition, data analysis methods commonly used to analyse quantitative data in neuropathology, including analysis of variance (ANOVA), polynomial curve fitting, multiple regression, classification trees, and principal components analysis (PCA), are discussed. These methods are illustrated with reference to quantitative studies of a variety of neurodegenerative disorders.
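The simplest of the measures mentioned, a density estimate from plot (quadrat) sampling, can be sketched in a few lines. The counts and field size below are illustrative, not data from the chapter:

```python
# Plot (quadrat) sampling sketch: lesion counts from a series of
# equal-sized sample fields are converted to a mean lesion density.
# Counts and field area are hypothetical.
quadrat_counts = [4, 7, 2, 5, 6, 3]   # lesions counted per sample field
field_area_mm2 = 0.25                 # area of one sample field in mm^2

densities = [c / field_area_mm2 for c in quadrat_counts]
mean_density = sum(densities) / len(densities)
print(mean_density)  # mean lesions per mm^2 → 18.0
```

Frequency (fraction of fields containing at least one lesion) and coverage (fraction of field area occupied) follow the same sampling logic with different per-field measurements.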
Abstract:
Objective: Recently, much research has proposed nature-inspired algorithms to perform complex machine learning tasks. Ant colony optimization (ACO) is one such algorithm, based on swarm intelligence and derived from a model inspired by the collective foraging behavior of ants. Taking advantage of ACO traits such as self-organization and robustness, this paper investigates ant-based algorithms for gene expression data clustering and associative classification. Methods and material: An ant-based clustering algorithm (Ant-C) and an ant-based association rule mining algorithm (Ant-ARM) are proposed for gene expression data analysis. The proposed algorithms make use of the natural behavior of ants, such as cooperation and adaptation, to allow for a flexible, robust search for a good candidate solution. Results: Ant-C has been tested on three datasets selected from the Stanford Genomic Resource Database and achieved relatively high accuracy compared to other classical clustering methods. Ant-ARM has been tested on the acute lymphoblastic leukemia (ALL)/acute myeloid leukemia (AML) dataset and generated about 30 classification rules with high accuracy. Conclusions: Ant-C can generate an optimal number of clusters without incorporating any other algorithms such as K-means or agglomerative hierarchical clustering. For associative classification, while well-known algorithms such as Apriori, FP-growth and Magnum Opus are unable to mine any association rules from the ALL/AML dataset within a reasonable period of time, Ant-ARM is able to extract associative classification rules.
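The self-organizing search that ACO-style algorithms exploit rests on a simple feedback loop: probabilistic choice weighted by pheromone, evaporation, and reinforcement proportional to solution quality. The toy below shows that core loop on a two-path choice; it is a sketch of the general ACO mechanism, not the Ant-C or Ant-ARM algorithms themselves, and all values are illustrative:

```python
import random

random.seed(42)  # fixed seed for reproducibility

# Two candidate paths; shorter paths receive larger pheromone deposits.
lengths = {"short": 1.0, "long": 3.0}
pheromone = {"short": 1.0, "long": 1.0}
rho = 0.1  # evaporation rate

for _ in range(200):
    total = pheromone["short"] + pheromone["long"]
    # Probabilistic choice weighted by current pheromone levels.
    path = "short" if random.random() < pheromone["short"] / total else "long"
    for p in pheromone:                      # evaporation
        pheromone[p] *= (1.0 - rho)
    pheromone[path] += 1.0 / lengths[path]   # quality-weighted deposit

# Positive feedback concentrates pheromone on the better path.
print(pheromone["short"] > pheromone["long"])
```

In Ant-C and Ant-ARM this same reinforcement dynamic operates over cluster assignments and candidate rules rather than literal paths.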
Abstract:
This paper investigates the environmental sustainability and competitiveness perceptions of small farmers in a region in northern Brazil. The main data collection instruments were a survey questionnaire and an analysis of the region's strategic plan. In total, ninety-nine goat and sheep breeding farmers were surveyed. Data analysis methods included descriptive statistics, cluster analysis, and chi-squared tests. The main results relate to the impact of education, land size, and location on the farmers' perceptions of competitiveness and environmental issues. Farmers with longer periods of education have higher perception scores about business competitiveness and environmental sustainability than those with less formal education. Farmers working larger land areas also have higher scores than those with smaller farms. Lastly, location can yield factors that impact on farmers' perceptions. In our study, farmers located in Angicos and Lajes had higher perception scores than those in Pedro Avelino and Afonso Bezerra, despite the geographical proximity of these municipalities. On the other hand, three other profile variables did not impact on farmers' perceptions, namely family income, dairy production volume, and associative condition. The authors believe the results and insights can be extended to livestock farming in other developing countries and contribute generally to fostering effective sustainable development policies, mainly in the agribusiness sector.
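A chi-squared test of independence of the kind used above can be computed in pure Python for a 2×2 contingency table. The counts below (education level versus high/low perception score) are illustrative, not the paper's data:

```python
# Chi-squared test of independence on a hypothetical 2x2 table:
# rows = education level, columns = high / low perception score.
table = [[30, 10],   # longer education: high / low
         [15, 44]]   # less formal education: high / low

row_tot = [sum(r) for r in table]
col_tot = [sum(c) for c in zip(*table)]
n = sum(row_tot)

chi2 = 0.0
for i, row in enumerate(table):
    for j, obs in enumerate(row):
        expected = row_tot[i] * col_tot[j] / n
        chi2 += (obs - expected) ** 2 / expected

# Critical value for 1 degree of freedom at the 5% level is 3.841.
print(chi2 > 3.841)  # → True: perception depends on education here
```

A statistic above the critical value rejects independence, i.e. perception scores are associated with the profile variable.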
Abstract:
Over the last few years, Data Envelopment Analysis (DEA) has been gaining increasing popularity as a tool for measuring the efficiency and productivity of Decision Making Units (DMUs). Conventional DEA models assume non-negative inputs and outputs. However, in many real applications, some inputs and/or outputs can take negative values. Recently, Emrouznejad et al. [6] introduced a Semi-Oriented Radial Measure (SORM) for modelling DEA with negative data. This paper points out some issues in target setting with SORM models and introduces a modified SORM approach. An empirical study in the banking sector demonstrates the applicability of the proposed model.
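For orientation, conventional DEA efficiency in the simplest possible setting can be shown without a linear-programming solver: with a single input and a single output (all non-negative), the input-oriented CCR efficiency of each DMU reduces to its output/input ratio divided by the best observed ratio. The DMU data below are illustrative, and the SORM treatment of negative data is deliberately not attempted here:

```python
# Single-input, single-output DEA sketch (all data non-negative).
# In this special case CCR efficiency = (y/x) / max over DMUs of (y/x).
dmus = {"A": (2.0, 2.0),   # (input, output)
        "B": (4.0, 2.0),
        "C": (3.0, 2.4)}

best_ratio = max(y / x for x, y in dmus.values())
efficiency = {name: (y / x) / best_ratio for name, (x, y) in dmus.items()}
print(efficiency)  # → {'A': 1.0, 'B': 0.5, 'C': 0.8}
```

With multiple inputs and outputs, or with negative data as SORM handles, each DMU's score instead requires solving a linear programme.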
Abstract:
A simplified (without phase modulator) scheme of a black-box optical regenerator is proposed, in which an appropriate nonlinear propagation is used to enhance regeneration. Applying semi-theoretical models, the authors optimise the scheme and demonstrate the feasibility of error-free long-distance transmission at 40 Gbit/s.
Abstract:
Methods for the calculation of complexity have been investigated as a possible alternative for the analysis of the dynamics of molecular systems. "Computational mechanics" is the approach chosen to describe emergent behavior in molecular systems that evolve in time. A novel algorithm has been developed for the symbolization of a continuous physical trajectory of a dynamic system. A method for calculating statistical complexity has been implemented and tested on representative systems. It is shown that the computational mechanics approach is suitable for analyzing the dynamic complexity of molecular systems and offers new insight into the process.
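The two ingredients named above, symbolizing a continuous trajectory and computing a complexity measure over the symbols, can be sketched in simplified form. This is a crude stand-in (threshold symbolization plus block Shannon entropy), not the thesis's algorithm or the full statistical-complexity measure of computational mechanics; the trajectory is a synthetic sine signal:

```python
import math
from collections import Counter

# 1) Symbolization: map a continuous trajectory to a binary string
#    by thresholding at zero (the thesis uses a more refined scheme).
trajectory = [math.sin(0.3 * t) for t in range(600)]
symbols = "".join("1" if x >= 0.0 else "0" for x in trajectory)

# 2) Complexity proxy: Shannon entropy (bits) of length-L symbol blocks.
def block_entropy(s, L):
    blocks = Counter(s[i:i + L] for i in range(len(s) - L + 1))
    total = sum(blocks.values())
    return -sum((c / total) * math.log2(c / total)
                for c in blocks.values())

# A near-balanced binary string has ~1 bit of single-symbol entropy,
# while a constant trajectory would give 0.
print(round(block_entropy(symbols, 1), 2))
```

Comparing block entropies at increasing block lengths L is one standard way to expose temporal structure that single-symbol statistics miss.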