922 results for Number development
Abstract:
Tissue transglutaminase (TG2) is a Ca2+-dependent enzyme and probably the most ubiquitously expressed member of the mammalian transglutaminase family. TG2 plays a number of important roles in a variety of biological processes. Via its transamidating function, it is responsible for the cross-linking of proteins by forming isopeptide bonds between glutamine and lysine residues. Intracellularly, Ca2+ activation of the enzyme is normally tightly regulated by the binding of GTP. However, upregulated levels of TG2 are associated with many disease states, such as celiac sprue, certain types of cancer, fibrosis, cystic fibrosis, multiple sclerosis, and Alzheimer's, Huntington's and Parkinson's disease. Selective inhibitors for TG2, both cell-penetrating and non-cell-penetrating, would therefore serve as novel therapeutic tools for the treatment of these disease states. Moreover, they would provide useful tools to fully elucidate the cellular mechanisms TG2 is involved in and help comprehend how the enzyme is regulated at the cellular level. The current paper is intended to give an update on the recently discovered classes of TG2 inhibitors along with their structure-activity relationships. The biological properties of these derivatives, in terms of both activity and selectivity, will also be reported in order to assess their potential for future therapeutic development. © 2011 Springer-Verlag.
Abstract:
In a series of studies, I investigated the developmental changes in children's inductive reasoning strategy, methodological manipulations affecting the trajectory, and driving mechanisms behind the development of category induction. I systematically controlled the nature of the stimuli used, and employed a triad paradigm in which perceptual cues were directly pitted against category membership, to explore under which circumstances children used perceptual or category induction. My induction tasks were designed for children aged 3-9 years old using biologically plausible novel items. In Study 1, I tested 264 children. Using a wide age range allowed me to systematically investigate the developmental trajectory of induction. I also created two degrees of perceptual distractor – high and low – and explored whether the degree of perceptual similarity between target and test items altered children's strategy preference. A further 52 children were tested in Study 2, to examine whether children showing a perceptual bias were in fact basing their choice on maturation categories. A gradual transition was observed from perceptual to category induction. However, this transition could not be due to the inability to inhibit high perceptual distractors, as children of all ages were equally distracted. Children were also not basing their strategy choices on maturation categories. In Study 3, I investigated the effects of category structure (featural vs. relational category rules) and domain (natural vs. artefact) on inductive preference. I tested 403 children. Each child was assigned to either the featural or relational condition, and completed both a natural kind and an artefact task. A further 98 children were tested in Study 4, on the effect of using stimulus labels during the tasks. I observed the same gradual transition from perceptual to category induction preference in Studies 3 and 4. This pattern was stable across domains, but children developed a category bias one year later for relational categories, arguably due to the greater demands on executive function (EF) posed by these stimuli. Children who received labels during the task made significantly more category choices than those who did not receive labels, possibly due to priming effects. Having investigated influences affecting the developmental trajectory, I continued by exploring the driving mechanism behind the development of category induction. In Study 5, I tested 60 children on a battery of EF tasks as well as my induction task. None of the EF tasks were able to predict inductive variance; therefore, EF development is unlikely to be the driving factor behind the transition. Finally, in Study 6, I divided 252 children into either a comparison group or an intervention group. The intervention group took part in an interactive educational session at Twycross Zoo about animal adaptations. Both groups took part in four induction tasks, two before and two a week after the zoo visits. There was a significant increase in the number of category choices made in the intervention condition after the zoo visit, a result not observed in the comparison condition. This highlights the role of knowledge in supporting the transition from perceptual to category induction. I suggest that EF development may support induction development, but the driving mechanism behind the transition is an accumulation of knowledge, and an appreciation for the importance of category membership.
Abstract:
This study explores the scenario of human resource development (HRD) in the Sultanate of Oman. The investigation was conducted with the help of a questionnaire survey in state-owned enterprises (SOEs). The research findings highlight an increased emphasis on HRD initiatives at a national level in Omani firms. There is a significant degree of awareness among the top managers regarding the benefits of a strategic approach to HRD. Despite all this, the implementation of HRD programmes has not been particularly successful. This is because the state has not been able to develop the skills and competencies of the Omani workforce to the levels required under the sixth national five-year plan. The article makes a number of recommendations in this regard. It also highlights key research areas for further examination.
Abstract:
In order to survive in the increasingly customer-oriented marketplace, continuous quality improvement marks the success of the fastest-growing quality organizations. In recent years, attention has been focused on intelligent systems, which have shown great promise in supporting quality control. However, only a small number of the currently used systems are reported to be operating effectively, because they are designed to maintain a quality level within the specified process rather than to focus on cooperation within the production workflow. This paper proposes an intelligent system with a newly designed algorithm and the universal process data exchange standard to overcome the challenges of demanding customers who seek high-quality and low-cost products. The intelligent quality management system is equipped with the "distributed process mining" feature to provide all levels of employees with the ability to understand the relationships between processes, especially when any aspect of the process is going to degrade or fail. An example of generalized fuzzy association rules is applied in the manufacturing sector to demonstrate how the proposed iterative process mining algorithm finds the relationships between distributed process parameters and the presence of quality problems.
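The abstract does not spell out how the fuzzy association rules are evaluated. The sketch below is a minimal illustration, with entirely hypothetical process parameters, membership functions and data, of how the fuzzy support and confidence of a rule linking distributed process parameters to a quality problem could be computed.

```python
# Hypothetical sketch of evaluating one generalized fuzzy association rule
# linking process parameters to a quality problem. Parameter names, membership
# functions and records are illustrative assumptions, not taken from the paper.

def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership function."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Synthetic process records: (oven_temp_C, line_speed_m_per_min, defect_observed)
records = [
    (215, 12.0, 1), (190, 10.5, 0), (225, 13.0, 1),
    (200, 11.0, 0), (230, 12.5, 1), (195, 10.0, 0),
]

def temp_high(t):  return trapezoid(t, 205, 215, 240, 260)
def speed_fast(s): return trapezoid(s, 11.0, 12.0, 14.0, 15.0)

# Rule under test: "temperature is high AND line speed is fast => defect".
antecedent = [min(temp_high(t), speed_fast(s)) for t, s, _ in records]
rule       = [min(a, defect) for a, (_, _, defect) in zip(antecedent, records)]

support_antecedent = sum(antecedent) / len(records)
support_rule       = sum(rule) / len(records)
confidence         = support_rule / support_antecedent if support_antecedent else 0.0

print(f"fuzzy support = {support_rule:.2f}, confidence = {confidence:.2f}")
```

In a distributed setting, each workflow stage would contribute its own fuzzy itemsets and the iterative algorithm would prune candidate rules by support and confidence thresholds; the thresholds and aggregation scheme here are assumptions.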
Abstract:
Purpose - This paper provides a deeper examination of the fundamentals of commonly used techniques - such as coefficient alpha and factor analysis - in order to more strongly link the techniques used by marketing and social researchers to their underlying psychometric and statistical rationale. Design/methodology/approach - A wide-ranging review and synthesis of psychometric and other measurement literature, both within and outside the marketing field, is used to illuminate and reconsider a number of misconceptions which seem to have evolved in marketing research. Findings - The research finds that marketing scholars have generally concentrated on reporting what are essentially arbitrary figures such as coefficient alpha, without fully understanding what these figures imply. It is argued that, if the link between theory and technique is not clearly understood, the use of psychometric measure development tools actually runs the risk of detracting from the validity of the measures rather than enhancing it. Research limitations/implications - The focus on one stage of a particular form of measure development could be seen as rather specialised. The paper also runs the risk of increasing the amount of dogma surrounding measurement, which runs contrary to the spirit of this paper. Practical implications - This paper shows that researchers may need to spend more time interpreting measurement results. Rather than simply referring to precedence, one needs to understand the link between measurement theory and actual technique. Originality/value - This paper presents psychometric measurement and item analysis theory in an easily understandable format, and offers an important set of conceptual tools for researchers in many fields. © Emerald Group Publishing Limited.
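Because the argument turns on coefficient alpha being reported without an understanding of what it implies, a brief reminder of how the figure is computed may be useful. The snippet below uses a made-up item-response matrix; a high alpha on such data by itself says nothing about the validity of the measure.

```python
# Illustrative computation of coefficient (Cronbach's) alpha.
# Rows are respondents, columns are scale items; the scores are invented.
from statistics import pvariance

scores = [
    [4, 5, 4, 3],
    [2, 3, 2, 2],
    [5, 5, 4, 5],
    [3, 4, 3, 3],
    [1, 2, 2, 1],
]

k = len(scores[0])                                   # number of items
item_vars = [pvariance(col) for col in zip(*scores)] # variance of each item
total_var = pvariance([sum(row) for row in scores])  # variance of total scores

alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"coefficient alpha = {alpha:.3f}")
```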
Abstract:
A two-tier study is presented in this thesis. The first involves the commissioning of an extant but, at the time, unproven bubbling fluidised bed fast pyrolysis unit. The unit was designed for an intended nominal throughput of 300 g/h of biomass. The unit came complete with solids separation, pyrolysis vapour quenching and oil collection systems. Modifications were carried out on various sections of the system, including the reactor heating, quenching and liquid collection systems. The modifications allowed fast pyrolysis experiments to be carried out at the appropriate temperatures. Bio-oil was generated using conventional biomass feedstocks including willow, beechwood, pine and Miscanthus. Results from this phase of the research showed, however, that although the rig was capable of processing biomass to bio-oil, it was characterised by low mass balance closures and recurrent operational problems. The problems included blockages, poor reactor hydrodynamics and reduced organic liquid yields. The less than optimal performance of individual sections, particularly the feed and reactor systems of the rig, culminated in a poor overall performance of the system. The second phase of this research involved the redesign of two key components of the unit. An alternative feeding system was commissioned for the unit; it included an off-the-shelf gravimetric system for accurate metering and efficient delivery of biomass. Similarly, a new bubbling fluidised bed reactor with an intended nominal throughput of 500 g/h of biomass was designed and constructed. The design leveraged experience from the initial commissioning phase together with proven kinetic and hydrodynamic studies. These units were commissioned as part of the optimisation phase of the study. Also as part of this study, two varieties each of the previously unreported feedstocks Jatropha curcas and Moringa oleifera oil seed press cakes were characterised to determine their suitability as feedstocks for liquid fuel production via fast pyrolysis. Consequently, the feedstocks were used for the production of pyrolysis liquids. The quality of the pyrolysis liquids from the feedstocks was then investigated via a number of analytical techniques. The oils from the press cakes showed high levels of stability and reduced pH values. The improvements to the design of the fast pyrolysis unit led to higher mass balance closures and increased organic liquid yields. The maximum liquid yield obtained from the press cakes was from African Jatropha press cake, at 66 wt% on a dry basis.
Abstract:
The object of this work was to further develop the idea introduced by Muaddi et al. (1981), which enables some of the disadvantages of earlier destructive adhesion test methods to be overcome. The test is non-destructive in nature, but it does need to be calibrated against a destructive method. Adhesion is determined by measuring the effect of plating on internal friction. This is achieved by determining the damping of vibrations of a resonating specimen before and after plating. The level of adhesion was considered by the above authors to influence the degree of damping. In the major portion of the research work the electrodeposited metal was Watts nickel, which is ductile in nature and is therefore suitable for peel adhesion testing. The base metals chosen were aluminium alloys S1C and HE9, as it is relatively easy to produce varying levels of adhesion between the substrate and electrodeposited coating by choosing the appropriate process sequence. S1C alloy is commercially pure aluminium and was used to produce good adhesion. HE9 is a more difficult-to-plate aluminium alloy and was chosen to produce poorer adhesion. The "Modal Testing" method used for studying vibrations was investigated as a possible means of evaluating adhesion but was not successful, so research was concentrated on the "Q" meter. The method based on the use of a "Q" meter involves the principle of exciting vibrations in a sample, interrupting the driving signal and counting the number of oscillations of the freely decaying vibrations between two known, preselected amplitudes of oscillation. It was not possible to reconstruct a working instrument using Muaddi's thesis (1982), as it either contained a serious error or the information was incomplete. Hence a modified "Q" meter had to be designed and constructed, but it was then difficult to resonate non-magnetic materials such as aluminium, so a comparison before and after plating could not be made. A new "Q" meter was then developed based on an Impulse Technique. A regulated miniature hammer was used to excite the test piece at the fundamental mode instead of an electronic hammer, and test pieces were supported at the two predetermined nodal points using nylon threads. The instrument developed was not very successful at detecting changes due to good and poor pretreatments given before plating; however, it was more sensitive to changes at the surface, such as room temperature oxidation. Statistical analysis of test results from untreated aluminium alloys showed that the instrument is not always consistent; the variation was even bigger when readings were taken on different days. Although aluminium is said to form protective oxides at room temperature, there was evidence that the aluminium surface changes continuously due to film formation, growth and breakdown. Nickel plated and zinc alloy immersion coated samples also showed variation in Q with time. In order to prove that the variations in Q were mainly due to surface oxidation, aluminium samples were lacquered and anodised. Such treatments enveloped the active surfaces that react with the environment, and the Q variation with time was almost eliminated, especially after hard anodising. The instrument detected major differences between different untreated aluminium substrates. Also, Q values decreased progressively as coating thicknesses were increased. The instrument was also able to detect changes in Q due to heat treatment of aluminium alloys.
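For reference, the decay-counting principle described above corresponds to the standard light-damping relation between Q and the number of counted free-decay cycles; the thesis's own calibration may differ, but under the usual assumption of light damping:

```latex
% Assumed standard relation, not quoted from the thesis: the amplitude of a freely
% decaying resonance falls by a factor e^{-\pi/Q} per cycle, so counting n cycles
% between the preselected amplitudes A_1 and A_2 gives
\[
\frac{A_2}{A_1} = e^{-\pi n / Q}
\qquad\Longrightarrow\qquad
Q = \frac{\pi n}{\ln\left( A_1 / A_2 \right)} .
\]
```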
Abstract:
This article explores the notion that the workplace is a learning environment, and that the line manager is a key player in determining its effectiveness. The work discusses how performance management systems may be used to clarify the expectations made of line managers with regard to employee development. It also suggests that line managers' people-management expertise may be a factor inhibiting workplace development for subordinates, and makes a number of suggestions about how to prepare the line manager for effective employee development. Key issues are illustrated by reference to a case study example. The case demonstrates that a high-profile management development programme within a major international organisation failed to meet all its objectives because of the unwillingness of the line management team to participate in the development of subordinates back in the workplace.
Abstract:
A great number of strategy tools are taught in strategic management modules. These tools are available to managers for use in facilitating strategic decision-making and enhancing the strategy development process in their organisations. A number of studies have been published examining which are the most popular tools; however, there is little empirical evidence on how their utilisation influences the strategy process. This paper is based on a large-scale international survey on the strategy development process, and seeks to examine the impact of a particular strategy tool, the Balanced Scorecard, upon the strategy process. The Balanced Scorecard is one of the most popular strategy tools, and its use has evolved since its introduction in the 1990s. Recently, it has been suggested that, as a strategy tool, the Balanced Scorecard can influence all elements of the strategy process. The results of this study indicate that although there are significant differences in some elements of the strategy process between the organisations that have implemented the Balanced Scorecard and those that have not, the impact is not comprehensive.
Abstract:
A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage arising from the absence of charge exchange complications is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies concerning the interaction between low energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, which is referred to as a Time-of-Flight Fast Atom Scattering Spectrometer, has been developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument, allowed samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and provided the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed and mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts. This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation, the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.
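For context, quantitative interpretation of spectra of this kind conventionally rests on time-of-flight energy conversion and the binary-collision kinematic factor. The relations below are the textbook forms for an assumed flight path L and laboratory scattering angle θ; they are not quoted from the thesis.

```latex
% Assumed textbook relations: a scattered particle of mass M_1 arriving after a
% flight time t over a path length L has energy
\[
E_1 = \tfrac{1}{2} M_1 \left( \frac{L}{t} \right)^{2},
\]
% and for elastic binary scattering from a surface atom of mass M_2 > M_1 at
% laboratory angle \theta the kinematic factor is
\[
\frac{E_1}{E_0} =
\left[ \frac{\cos\theta + \sqrt{(M_2/M_1)^2 - \sin^2\theta}}{1 + M_2/M_1} \right]^{2} .
\]
```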
Abstract:
The present work describes the development of a proton induced X-ray emission (PIXE) analysis system, especially designed and built for routine quantitative multi-elemental analysis of a large number of samples. The historical and general developments of the analytical technique and the physical processes involved are discussed. The philosophy, design, constructional details and evaluation of a versatile vacuum chamber, an automatic multi-sample changer, an on-demand beam pulsing system and an ion beam current monitoring facility are described. The system calibration using thin standard foils of Si, P, S, Cl, K, Ca, Ti, V, Fe, Cu, Ga, Ge, Rb, Y and Mo was undertaken at proton beam energies of 1 to 3 MeV in steps of 0.5 MeV and compared with theoretical calculations. An independent calibration check using bovine liver Standard Reference Material was performed. The minimum detectable limits have been experimentally determined at detector positions of 90° and 135° with respect to the incident beam for the above range of proton energies as a function of atomic number Z. The system has detection limits of typically 10⁻⁷ to 10⁻⁹ g for elements 14
Abstract:
Hard real-time systems are a class of computer control systems that must react to demands of their environment by providing 'correct' and timely responses. Since these systems are increasingly being used in systems with safety implications, it is crucial that they are designed and developed to operate in a correct manner. This thesis is concerned with developing formal techniques that allow the specification, verification and design of hard real-time systems. Formal techniques for hard real-time systems must be capable of capturing the system's functional and performance requirements, and previous work has proposed a number of techniques which range from the mathematically intensive to those with some mathematical content. This thesis develops formal techniques that contain both an informal and a formal component, because it is considered that the informality provides ease of understanding and the formality allows precise specification and verification. Specifically, the combination of Petri nets and temporal logic is considered for the specification and verification of hard real-time systems. Approaches that combine Petri nets and temporal logic by allowing a consistent translation between each formalism are examined. Previously, such techniques have been applied to the formal analysis of concurrent systems. This thesis adapts these techniques for use in the modelling, design and formal analysis of hard real-time systems. The techniques are applied to the problem of specifying a controller for a high-speed manufacturing system. It is shown that they can be used to prove liveness and safety properties, including qualitative aspects of system performance. The problem of verifying quantitative real-time properties is addressed by developing a further technique which combines the formalisms of timed Petri nets and real-time temporal logic. A unifying feature of these techniques is the common temporal description of the Petri net. A common problem with Petri net based techniques is the complexity associated with generating the reachability graph. This thesis addresses this problem by using concurrency sets to generate a partial reachability graph pertaining to a particular state. These sets also allow each state to be checked for the presence of inconsistencies and hazards. The problem of designing a controller for the high-speed manufacturing system is also considered. The approach adopted involves the use of a model-based controller: this type of controller uses the Petri net models developed, thus preserving the properties already proven of the controller. It also contains a model of the physical system which is synchronised to the real application to provide timely responses. The various ways of forming the synchronisation between these processes are considered and the resulting nets are analysed using concurrency sets.
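The reachability-graph construction referred to above can be sketched generically. The following minimal place/transition-net example is illustrative only: the net, markings and names are invented, and the thesis's concurrency-set pruning and timing extensions are not reproduced.

```python
# Minimal sketch of reachability-graph generation for a place/transition net.
# The tiny net (places p1..p3, transitions t1, t2) is purely illustrative.
from collections import deque

# transition -> (preset, postset), each given as {place: token count}
transitions = {
    "t1": ({"p1": 1}, {"p2": 1}),
    "t2": ({"p2": 1}, {"p3": 1}),
}
initial_marking = {"p1": 1, "p2": 0, "p3": 0}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    new = dict(marking)
    for p, n in pre.items():
        new[p] -= n
    for p, n in post.items():
        new[p] = new.get(p, 0) + n
    return new

def reachability_graph(m0):
    """Breadth-first enumeration of reachable markings and firing edges."""
    key0 = tuple(sorted(m0.items()))
    seen, edges, queue = {key0}, [], deque([m0])
    while queue:
        m = queue.popleft()
        for t, (pre, post) in transitions.items():
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                key = tuple(sorted(m2.items()))
                edges.append((tuple(sorted(m.items())), t, key))
                if key not in seen:
                    seen.add(key)
                    queue.append(m2)
    return seen, edges

states, arcs = reachability_graph(initial_marking)
print(len(states), "reachable markings;", len(arcs), "firing edges")
```

Liveness and safety checks then amount to queries over the generated states and edges; the concurrency-set approach limits the enumeration to markings pertaining to a particular state.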
Abstract:
A major application of computers has been to control physical processes, in which the computer is embedded within some large physical process and is required to control concurrent physical processes. The main difficulty with these systems is their event-driven characteristics, which complicate their modelling and analysis. Although a number of researchers in the process system community have approached the problems of modelling and analysis of such systems, there is still a lack of standardised software development formalisms for system (controller) development, particularly at the early stages of the system design cycle. This research forms part of a larger research programme which is concerned with the development of real-time process-control systems in which software is used to control concurrent physical processes. The general objective of the research in this thesis is to investigate the use of formal techniques in the analysis of such systems at the early stages of their development, with a particular bias towards application to high-speed machinery. Specifically, the research aims to generate a standardised software development formalism for real-time process-control systems, particularly for software controller synthesis. In this research, a graphical modelling formalism called Sequential Function Chart (SFC), a variant of Grafcet, is examined. SFC, which is defined in the international standard IEC 1131 as a graphical description language, has been used widely in industry and has achieved an acceptable level of maturity and acceptance. A comparative study between SFC and Petri nets is presented in this thesis. To overcome identified inaccuracies in the SFC, a formal definition of the firing rules for SFC is given. To provide a framework in which SFC models can be analysed formally, an extended time-related Petri net model for SFC is proposed and the transformation method is defined. The SFC notation lacks a systematic way of synthesising system models from real-world systems. Thus a standardised approach to the development of real-time process-control systems is required such that the system (software) functional requirements can be identified, captured and analysed. A rule-based approach and a method called the system behaviour driven method (SBDM) are proposed as a development formalism for real-time process-control systems.
Abstract:
Objectives: To develop an objective measure to enable hospital Trusts to compare their use of antibiotics. Design: Self-completion postal questionnaire with telephone follow-up. Sample: 4 hospital Trusts in the English Midlands. Results: The survey showed that it was possible to collect data concerning the number of Defined Daily Doses (DDDs) of quinolone antibiotic dispensed per Finished Consultant Episode (FCE) in each Trust. In the 4 Trusts studied the mean DDD/FCE was 0.197 (range 0.117 to 0.258). This indicates that, based on a typical course length of 5 days, 3.9% of patient episodes resulted in the prescription of a quinolone antibiotic. Antibiotic prescribing control measures in each Trust were found to be comparable. Conclusion: The measure will enable Trusts to objectively compare their usage of quinolone antibiotics and to use this information to carry out clinical audit should differences be recorded. This is likely to be applicable to other groups of antibiotics.
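The quoted 3.9% follows directly from the mean DDD/FCE and the stated 5-day typical course, assuming one DDD per treatment day:

```latex
\[
\frac{0.197\ \text{DDD per FCE}}{5\ \text{DDD per course}} \approx 0.039
\quad\Longrightarrow\quad
\text{about } 3.9\% \text{ of episodes involved a quinolone course.}
\]
```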
Abstract:
The purpose of this study is to increase our knowledge of the nature of the surface properties of polymeric materials and to improve our understanding of how these factors influence the deposition of proteins to form a reactive biological/synthetic interface. A number of surface analytical techniques were identified as being of potential benefit to this investigation and included in a multidisciplinary research programme. Cell adhesion in culture was the primary biological sensor of surface properties, and it showed that the cell response to different materials can be modified by adhesion-promoting protein layers: cell adhesion is a protein-mediated event. A range of surface rugosity can be produced on polystyrene, and the results presented here show that surface rugosity does not play a major role in determining a material's cell adhesiveness. Contact angle measurements showed that surface energy (specifically the polar fraction) is important in promoting cell spreading on surfaces. The immunogold labelling technique indicated that there were small but noticeable differences between the distribution of proteins on a range of surfaces. This study has shown that surface analysis techniques have different sensitivities in terms of detection limits and depth probed, and these are important in determining the usefulness of the information obtained. The techniques provide information on differing aspects of the biological/synthetic interface, and the consequence of this is that a range of techniques is needed in any full study of such a complex field as the biomaterials area.
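The 'polar fraction' of surface energy referred to above is conventionally extracted from contact angles measured with probe liquids of known polar and dispersive components, for example via the Owens-Wendt geometric-mean relation; the study may have used a different scheme, but the standard form is:

```latex
% Owens-Wendt relation, assumed here as the conventional route to the polar
% (\gamma_S^p) and dispersive (\gamma_S^d) components of the solid surface energy:
\[
\gamma_L \,(1 + \cos\theta) \;=\;
2\left( \sqrt{\gamma_S^{d}\,\gamma_L^{d}} + \sqrt{\gamma_S^{p}\,\gamma_L^{p}} \right),
\]
% solved for \gamma_S^{d} and \gamma_S^{p} from the contact angles \theta of two or
% more liquids with known \gamma_L^{d} and \gamma_L^{p}.
```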