12 results for Conformity.

in Aston University Research Archive


Relevance:

10.00%

Abstract:

Two studies were conducted to examine the impact of subjective uncertainty on conformity to group norms in the attitude-behaviour context. In both studies, subjective uncertainty was manipulated using a deliberative mindset manipulation (McGregor, Zanna, Holmes, & Spencer, 2001). In Study 1 (N = 106), participants were exposed to either an attitude-congruent or an attitude-incongruent in-group norm. In Study 2 (N = 83), participants were exposed to either a congruent, an incongruent, or an ambiguous in-group norm. A range of attitude-behaviour outcomes, including attitude-intention consistency and change in attitude-certainty, was assessed. In both studies, levels of group-normative behaviour varied as a function of uncertainty condition. In Study 1, conformity to group norms, as evidenced by variations in the level of attitude-intention consistency, was observed only in the high uncertainty condition. In Study 2, exposure to an ambiguous norm had different effects for those in the low and the high uncertainty conditions. In the low uncertainty condition, the greatest conformity was observed in the attitude-congruent norm condition compared with the attitude-incongruent or ambiguous norm conditions. In contrast, individuals in the high uncertainty condition displayed the greatest conformity when exposed to either an attitude-congruent or an ambiguous in-group norm. The implications of these results for the role of subjective uncertainty in social influence processes are discussed. © 2007 The British Psychological Society.

Relevance:

10.00%

Abstract:

Analysis of variance (ANOVA) is the most efficient method available for the analysis of experimental data. It is a method of considerable complexity and subtlety, with many different variations, each of which applies in a particular experimental context. Hence, it is possible to apply the wrong type of ANOVA to data and, therefore, to draw an erroneous conclusion from an experiment. This article reviews the types of ANOVA most likely to arise in clinical experiments in optometry, including the one-way ANOVA ('fixed' and 'random effect' models), two-way ANOVA in randomised blocks, three-way ANOVA, and factorial experimental designs (including the varieties known as 'split-plot' and 'repeated measures'). For each ANOVA, the appropriate experimental design is described, a statistical model is formulated, and the advantages and limitations of each type of design are discussed. In addition, the problems of non-conformity to the statistical model and determination of the number of replications are considered. © 2002 The College of Optometrists.
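The simplest of the designs listed above, the one-way fixed-effect ANOVA, can be sketched from first principles; the treatment groups and scores below are illustrative, not data from the article:

```python
# Minimal sketch of a one-way, fixed-effect ANOVA: partition total variation
# into between-group and within-group sums of squares and form the F-ratio.

def one_way_anova(groups):
    """Return (F, df_between, df_within) for a one-way fixed-effect ANOVA."""
    k = len(groups)                          # number of treatment groups
    n = sum(len(g) for g in groups)          # total observations
    grand_mean = sum(sum(g) for g in groups) / n

    # Between-groups SS: variation of group means about the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-groups SS: variation of observations about their group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)

    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Three hypothetical treatment groups (e.g. clinical scores under
# three conditions); values chosen purely for illustration.
a = [10.0, 12.0, 11.0, 13.0]
b = [14.0, 15.0, 13.0, 16.0]
c = [20.0, 19.0, 21.0, 22.0]
f, dfb, dfw = one_way_anova([a, b, c])   # F ≈ 50.4 on (2, 9) df
```

A large F relative to the F(2, 9) distribution would lead to rejection of the null hypothesis of equal group means; choosing the correct model ('fixed' versus 'random effect', blocked versus factorial) changes which error term appears in the denominator.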

Relevance:

10.00%

Abstract:

Objectives: To compare the recognized defined daily dose per 100 bed-days (DDD/100 bed-days) measure with the defined daily dose per finished consultant episode (DDD/FCE) in a group of hospitals with a variety of medicines management strategies, and to compare antibiotic usage using the above indicators in hospitals with and without electronic prescribing systems. Methods: Twelve hospitals were used in the study. Nine hospitals were selected and split into three cohorts (three high-scoring, three medium-scoring and three low-scoring) by their 2001 medicines management self-assessment scores (MMAS). An additional cohort of three electronic prescribing hospitals was included for comparison. MMAS were compared to antibiotic management scores (AMS) developed from a questionnaire relating specifically to control of antibiotics. FCEs and occupied bed-days were obtained from published statistics, and statistical analyses of the DDD/100 bed-days and DDD/FCE were carried out using SPSS. Results: The DDD/100 bed-days varied from 81.33 to 189.37, whilst the DDD/FCE varied from 2.88 to 7.43. The two indicators showed a high degree of correlation, with r = 0.74. MMAS ranged from 9 to 22 (possible range 0-23) and the AMS from 2 to 13 (possible range 0-22). The two scores showed a high degree of correlation, with r = 0.74. No correlation was established between either indicator and either score. Conclusions: The WHO indicator for medicines utilization, DDD/100 bed-days, exhibited the same level of conformity as the DDD/FCE, indicating that the DDD/FCE is a useful additional indicator for identifying hospitals which require further study. The MMAS can be assumed to be an accurate guide to antibiotic medicines management controls. No relationship was found between a high degree of medicines management control and the quantity of antibiotic prescribed. © The British Society for Antimicrobial Chemotherapy; 2004 all rights reserved.
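Both indicators compared above reduce to simple ratios; the hospital figures below are hypothetical, and only the indicator definitions follow the abstract:

```python
# The two antibiotic-usage indicators compared in the study, as plain ratios.

def ddd_per_100_bed_days(total_ddd, occupied_bed_days):
    """WHO indicator: defined daily doses per 100 occupied bed-days."""
    return total_ddd * 100.0 / occupied_bed_days

def ddd_per_fce(total_ddd, finished_consultant_episodes):
    """Defined daily doses per finished consultant episode."""
    return total_ddd / finished_consultant_episodes

# Hypothetical hospital: 45,000 DDDs dispensed over 30,000 occupied
# bed-days and 9,000 finished consultant episodes.
usage_bd = ddd_per_100_bed_days(45_000, 30_000)   # 150.0 DDD/100 bed-days
usage_fce = ddd_per_fce(45_000, 9_000)            # 5.0 DDD/FCE
```

Note that both illustrative values fall within the ranges reported in the abstract (81.33-189.37 and 2.88-7.43 respectively); the two indicators differ only in their denominator, which is why a high correlation between them is plausible.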

Relevance:

10.00%

Abstract:

This thesis presents a new approach to designing large organizational databases. The approach emphasizes the need for a holistic approach to the design process. The development of the proposed approach was based on a comprehensive examination of the issues of relevance to the design and utilization of databases. Such issues include conceptual modelling, organization theory, and semantic theory. The conceptual modelling approach presented in this thesis is developed over three design stages, or model perspectives. In the semantic perspective, concept definitions were developed based on established semantic principles. Such definitions rely on meaning - provided by intension and extension - to determine intrinsic conceptual definitions. A tool, called meaning-based classification (MBC), is devised to classify concepts based on meaning. Concept classes are then integrated using concept definitions and a set of semantic relations which rely on concept content and form. In the application perspective, relationships are semantically defined according to the application environment. Relationship definitions include explicit relationship properties and constraints. The organization perspective introduces a new set of relations specifically developed to maintain conformity of conceptual abstractions with the nature of information abstractions implied by user requirements throughout the organization. Such relations are based on the stratification of work hierarchies, defined elsewhere in the thesis. Finally, an example of an application of the proposed approach is presented to illustrate the applicability and practicality of the modelling approach.

Relevance:

10.00%

Abstract:

‘New Approach’ Directives now govern the health and safety of most products, whether destined for workplace or domestic use. These Directives have been enacted into UK law by various specific legislation principally relating to work equipment, machinery and consumer products. This research investigates whether the risk assessment approach used to ensure the safety of machinery may be applied to consumer products. Crucially, consumer products are subject to the Consumer Protection Act (CPA) 1987, which makes no direct reference to “assessing risk”. This contrasts with the law governing the safety of products used in the workplace, where risk assessment underpins the approach. New Approach Directives are supported by European harmonised standards and, in the case of machinery, further supported by the risk assessment standard EN 1050. The system regulating consumer product safety is discussed, its key elements identified and a graphical model produced. This model incorporates such matters as conformity assessment, the system of regulation, and near-miss and accident reporting. A key finding of the research is that New Approach Directives have a common feature of specifying essential performance requirements that provide a hazard prompt-list which can form the basis for a risk assessment (the hazard identification stage). Drawing upon 272 prosecution cases, with thirty examples examined in detail, this research provides evidence that despite the high degree of regulation, unsafe consumer products still find their way onto the market. The research presents a number of risk assessment tools to help Trading Standards Officers (TSOs) prioritise their work at the initial inspection stage and when dealing with subsequent enforcement action.

Relevance:

10.00%

Abstract:

Social groups form an important part of our daily lives. Within these groups pressures exist which encourage the individual to comply with the group’s viewpoint. This influence, which creates social conformity, is known as ‘majority influence’ and is the dominant process of social control. However, there also exists a ‘minority influence’, which emerges from a small subsection of the group and is a dynamic force for social change. Minority Influence and Innovation seeks to identify the conditions under which minority influence can prevail, to change established norms, stimulate original thinking and help us to see the world in new ways. With chapters written by a range of expert contributors, areas of discussion include:

• processes and theoretical issues
• the factors which affect majority and minority influence
• interactions between majority and minority group members

This book offers a thorough evaluation of the most important current developments within this field and presents consideration of the issues that will be at the forefront of future research. As such it will be of interest to theorists and practitioners working in social psychology.

Relevance:

10.00%

Abstract:

Lock-in is observed in real-world markets of experience goods; experience goods are goods whose characteristics are difficult to determine in advance but are ascertained upon consumption. We create an agent-based simulation of consumers choosing between two experience goods available in a virtual market. We model consumers in a grid representing their spatial network. Utilising simple assumptions, including identical distributions of product experience and consumers having a degree of follower tendency, we explore the dynamics of the model through simulations. We conduct simulations to create a lock-in before testing several hypotheses on how to break an existing lock-in; these include the effect of advertising and of free give-aways. Our experiments show that the key to successfully breaking a lock-in was the creation of regions in a consumer population. Regions arise due to the degree of local conformity between agents within them, and spread throughout the population when a mildly superior competitor was available. These regions may be likened to a niche in a market, which gains in popularity before transitioning into the mainstream.
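A minimal sketch of this kind of agent-based model can be written in a few lines. The grid size, follower-tendency weight, wrap-around neighbourhood and update rule below are all illustrative assumptions, not the authors' specification:

```python
# Consumers on a grid repeatedly choose between two experience goods (0 or 1),
# mostly copying the local majority ("follower tendency"), occasionally
# re-drawing from an assumed identical product-experience distribution.

import random

SIZE = 20          # consumers live on a SIZE x SIZE grid
FOLLOW = 0.8       # assumed weight of local conformity vs. own experience

def neighbours(i, j):
    """Von Neumann neighbourhood with wrap-around edges."""
    return [((i - 1) % SIZE, j), ((i + 1) % SIZE, j),
            (i, (j - 1) % SIZE), (i, (j + 1) % SIZE)]

def simulate(steps=5000, seed=1):
    rng = random.Random(seed)
    # choice[i][j]: which of the two goods consumer (i, j) last bought
    choice = [[rng.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]
    for _ in range(steps):
        i, j = rng.randrange(SIZE), rng.randrange(SIZE)
        share1 = sum(choice[x][y] for x, y in neighbours(i, j)) / 4.0
        if rng.random() < FOLLOW:
            # conform to the local majority; keep current choice on a tie
            choice[i][j] = (1 if share1 > 0.5 else
                            0 if share1 < 0.5 else choice[i][j])
        else:
            # independent draw standing in for own product experience
            choice[i][j] = rng.randint(0, 1)
    return sum(sum(row) for row in choice)

adopters_of_good_1 = simulate()
```

With a high follower weight, homogeneous regions of like-choosing consumers form; sweeping FOLLOW or seeding one good more heavily is the natural way to study how lock-in emerges and which interventions break it.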

Relevance:

10.00%

Abstract:

Cell-based therapies have the potential to contribute to global healthcare, whereby living cells and tissues are used as medicinal therapies. Despite this potential, many challenges remain before the full value of this emerging field can be realized. The characterization of input material for cell-based therapy bioprocesses from multiple donors is necessary to identify and understand the potential implications of input variation on process development. In this work, we have characterized bone marrow derived human mesenchymal stem cells (BM-hMSCs) from multiple donors and discussed the implications of the measurable input variation on the development of autologous and allogeneic cell-based therapy manufacturing processes. The range of cumulative population doublings across the five BM-hMSC lines over 30 days of culture was 5.93, with an 18.2% range in colony-forming efficiency at the end of the culture process and a 55.1% difference in the production of interleukin-6 between these cell lines. It has been demonstrated that this variation results in a range in process time between these donor hMSC lines for a hypothetical product of over 13 days, creating potential batch-timing issues when manufacturing products from multiple patients. All BM-hMSC donor lines demonstrated conformity to the ISCT criteria but showed differences in cell morphology. Metabolite analysis showed that hMSCs from the different donors have a range in glucose consumption of 26.98 pmol cell⁻¹ day⁻¹, lactate production of 29.45 pmol cell⁻¹ day⁻¹ and ammonium production of 1.35 pmol cell⁻¹ day⁻¹, demonstrating the extent of donor variability throughout the expansion process. Measuring informative product attributes during process development will facilitate progress towards consistent manufacturing processes, a critical step in the translation of cell-based therapies.
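For illustration, a specific (per-cell, per-day) metabolite rate of the kind quoted above is a simple normalisation of a measured amount by cell number and time. The measurement values below are hypothetical; only the unit arithmetic is standard:

```python
# Specific metabolite rate: amount consumed or produced, normalised by the
# average viable cell number and the length of the culture interval.

def specific_rate_pmol_per_cell_day(delta_metabolite_pmol,
                                    mean_viable_cells, days):
    """Return a specific rate in pmol cell^-1 day^-1.

    delta_metabolite_pmol: total amount consumed/produced over the interval
    mean_viable_cells: average viable cell number over the interval
    days: interval length in days
    """
    return delta_metabolite_pmol / (mean_viable_cells * days)

# Hypothetical passage: 1.62e8 pmol of glucose consumed by an average of
# 2.0e6 viable cells over 3 days -> 27.0 pmol cell^-1 day^-1, of the same
# order as the donor-to-donor range reported in the abstract.
rate = specific_rate_pmol_per_cell_day(1.62e8, 2.0e6, 3)
```

Computing such rates per passage and per donor is what makes the donor-to-donor ranges above directly comparable across expansion processes.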

Relevance:

10.00%

Abstract:

In today’s modern manufacturing industry there is an increasing need to improve internal processes to meet diverse client needs. Process re-engineering is an important activity that is well understood by industry, but its rate of application within small to medium-sized enterprises (SMEs) is less developed. Business pressures shift the focus of SMEs toward winning new projects and contracts rather than developing long-term, sustainable manufacturing processes. Variations in manufacturing processes are inevitable, but the amount of non-conformity often exceeds acceptable levels. This paper focuses on the re-engineering of the manufacturing and verification procedure for discrete parts production with the aim of enhancing process control and product verification. The ideologies of the ‘Push’ and ‘Pull’ approaches to manufacturing are useful in the context of process re-engineering for data improvement. Currently, information is pulled from the market and prominent customers, and manufacturing companies always try to make the right product by following customer procedures that attempt to verify against specifications. This approach can result in significant quality control challenges. The aim of this paper is to highlight the importance of process re-engineering in product verification in SMEs. Leadership, culture, ownership and process management are among the main attributes required for the successful deployment of process re-engineering. This paper presents the findings from a case study showcasing the application of a modified re-engineering method for the manufacturing and verification process. The findings from the case study indicate that there are several advantages to implementing the re-engineering method outlined in this paper.

Relevance:

10.00%

Abstract:

A landfill represents a complex and dynamically evolving structure that can be stochastically perturbed by exogenous factors. Both thermodynamic (equilibrium) and time-varying (non-steady state) properties of a landfill are affected by spatially heterogeneous and nonlinear subprocesses that combine with constraining initial and boundary conditions arising from the associated surroundings. While multiple approaches have been made to model landfill statistics by incorporating spatially dependent parameters on the one hand (the data-based approach) and continuum dynamical mass-balance equations on the other (equation-based modelling), practically no attempt has been made to amalgamate these two approaches while also incorporating the inherent stochastically induced fluctuations affecting the process overall. In this article, we implement a minimalist scheme for modelling the time evolution of a realistic three-dimensional landfill through a reaction-diffusion based approach, focusing on the coupled interactions of four key variables (solid mass density, hydrolysed mass density, acetogenic mass density and methanogenic mass density), which are themselves stochastically affected by fluctuations, coupled with diffusive relaxation of the individual densities, in ambient surroundings. Our results indicate that close to the linearly stable limit, the large-time steady-state properties, arising out of a series of complex coupled interactions between the stochastically driven variables, are scarcely affected by the biochemical growth-decay statistics. Our results clearly show that an equilibrium landfill structure is primarily determined by the solid and hydrolysed mass densities alone, rendering the other variables statistically "irrelevant" in this (large-time) asymptotic limit.
The other major implication of incorporating stochasticity in the landfill evolution dynamics is the hugely reduced production times of the plants, which are now approximately 20-30 years instead of the 50 years and above predicted by the previous deterministic models. The predictions from this stochastic model are in conformity with available experimental observations.
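The general shape of such a stochastic reaction-diffusion scheme can be sketched as a chain of the four densities on a grid, here reduced to one dimension. The rate constants, noise amplitude, grid and boundary treatment below are illustrative assumptions, not the authors' model:

```python
# Four coupled density fields (solid -> hydrolysed -> acetogenic ->
# methanogenic) on a 1-D grid: explicit-Euler reaction-diffusion updates
# with additive Gaussian noise, clamped to keep densities non-negative.

import random

N = 32                                 # grid cells along a 1-D section
DT, DX = 0.01, 1.0                     # time step and cell spacing
D = 0.5                                # common diffusion coefficient
K_HYD, K_AC, K_METH = 0.4, 0.3, 0.2    # assumed biochemical rate constants
NOISE = 0.01                           # assumed noise amplitude

def laplacian(field, i):
    """Discrete 1-D Laplacian with no-flux (mirrored) boundaries."""
    left = field[i - 1] if i > 0 else field[i]
    right = field[i + 1] if i < N - 1 else field[i]
    return (left - 2.0 * field[i] + right) / DX ** 2

def step(solid, hydro, aceto, meth, rng):
    """One explicit-Euler update of the four coupled density fields."""
    ns, nh, na, nm = solid[:], hydro[:], aceto[:], meth[:]
    for i in range(N):
        hyd = K_HYD * solid[i]     # solid mass hydrolysed
        ac = K_AC * hydro[i]       # hydrolysed mass consumed by acetogens
        me = K_METH * aceto[i]     # acetogenic mass consumed by methanogens
        noise = lambda: NOISE * rng.gauss(0.0, 1.0)
        ns[i] = max(0.0, solid[i] + DT * (D * laplacian(solid, i) - hyd) + noise())
        nh[i] = max(0.0, hydro[i] + DT * (D * laplacian(hydro, i) + hyd - ac) + noise())
        na[i] = max(0.0, aceto[i] + DT * (D * laplacian(aceto, i) + ac - me) + noise())
        nm[i] = max(0.0, meth[i] + DT * (D * laplacian(meth, i) + me) + noise())
    return ns, nh, na, nm

rng = random.Random(7)
fields = ([1.0] * N, [0.0] * N, [0.0] * N, [0.0] * N)   # start as pure solid
for _ in range(100):
    fields = step(*fields, rng)
```

Even in this toy version, mass flows down the chain from the solid field into the downstream densities, and the noise term is what lets one probe how far the stochastic steady state departs from the deterministic one.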