23 results for God or Nature
in Aston University Research Archive
Abstract:
Few works address methodological issues of how to conduct strategy-as-practice research, and even fewer focus on how to analyse the resulting data in ways that illuminate strategy as an everyday, social practice. We address this gap by proposing a quantitative method for analysing observational data, which can complement more traditional qualitative methodologies. We propose that rigorous but context-sensitive coding of transcripts can render everyday practice analysable statistically. Such statistical analysis provides a means for analytically representing patterns and shifts within the mundane, repetitive elements through which practice is accomplished. We call this approach the Event Database (EDB); it consists of five basic coding categories that help us capture the stream of practice. Indexing codes index or categorise the data in order to give context and offer some basic information about the event under discussion; they are descriptive codes, which allow us to catalogue and classify events according to their assigned characteristics. Content codes concern the qualitative nature of the event: the essence of the event, a description that helps to inform judgements about the phenomenon. Nature codes distinguish between discursive and tangible events; we include this code to acknowledge that some events differ qualitatively from others. Type codes are abstracted from the data in order to help us classify events based on their description or nature; this involves significantly more judgement than the indexing codes but is consequently also more meaningful. Dynamics codes capture some of the movement or fluidity of events; this category has been included to let us capture the flow of activity over time.
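A minimal sketch of how a single event record might be coded under these five categories. The Python field names and example values below are illustrative assumptions, not the authors' published schema:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """One coded event in a hypothetical Event Database (EDB) record.
    Field names are invented for illustration; the five categories
    follow the abstract's description."""
    index: dict = field(default_factory=dict)     # indexing codes: context, basic facts
    content: str = ""                             # content code: the essence of the event
    nature: str = "discursive"                    # nature code: "discursive" or "tangible"
    event_type: str = ""                          # type code: analyst-assigned classification
    dynamics: list = field(default_factory=list)  # dynamics codes: flow of activity over time

# Example: one meeting utterance coded as a discursive strategy event.
e = Event(
    index={"meeting": "board-2024-03", "speaker": "CEO"},
    content="Proposes revisiting the divestment decision",
    nature="discursive",
    event_type="strategic reappraisal",
    dynamics=["reopens an earlier decision"],
)
print(e.event_type)
```

Coding every transcript event into a record of this shape is what makes the stream of practice countable, and hence analysable statistically.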
Abstract:
Conventional differential scanning calorimetry (DSC) techniques are commonly used to quantify the solubility of drugs within polymeric controlled-delivery systems. However, the nature of the DSC experiment, and in particular the relatively slow heating rates employed, limits its use to the measurement of drug solubility at the drug's melting temperature. Here, we describe the application of hyper-DSC (HDSC), a variant of DSC involving extremely rapid heating rates, to the calculation of the solubility of a model drug, metronidazole, in silicone elastomer, and demonstrate that the faster heating rates permit the solubility to be calculated under non-equilibrium conditions, such that it better approximates the solubility at the temperature of use. At a heating rate of 400°C/min (HDSC), metronidazole solubility was calculated to be 2.16 mg/g, compared with 6.16 mg/g at 20°C/min.
Abstract:
Relocation is one organizational phenomenon in which the influence of family is prominent. Our paper therefore uses it as a backdrop against which to study the work–family interface. In-depth qualitative analysis of 62 interviews with Royal Air Force personnel is used to complement the literature by demonstrating the impact on, and the impact of, the immediate family in relocation. The analysis provides evidence that relocation influences an employee's role as family member, other family members and the family as a whole. Findings also illustrate that families influence employees' relocation behaviour, organizational tenure and work focus. In summary, this paper supports the bidirectional nature of the work–family interface and demonstrates that, whether one examines the work-to-family or the family-to-work direction of influence, the effects are not always negative.
Abstract:
How should the 'long' eighteenth century be defined? January 1, 1700 and December 31, 1799 are quite arbitrary dates. Why should they be chosen to segment our history rather than more significant periods of time: periods which have a coherent content, or are marked, perhaps, by the working out of a theme? Students of English literature sometimes take the long eighteenth century to extend from John Milton (Paradise Lost, 1667) to the passing of the first generation of Romantics (Keats (d. 1821), Shelley (d. 1822), Byron (d. 1824), Coleridge (d. 1834)). Students of British political history often take it to start with the accession of Charles II (the Restoration) in 1660 or, alternatively, with the so-called Glorious Revolution of 1688, and to end with the great Reform Act of 1832. Others might choose different bookends. In the history of science and philosophy, the terminus a quo is sometimes taken to be the publication of Descartes' scientific philosophy or, in more Anglophone zones, the 1687 publication of Newton's Principia with its vision of a 'clockwork universe'. 'Nature and Nature's laws', as Alexander Pope enthused, 'lay hid in Night: God said, Let Newton be! and all was light!'
Abstract:
Object-oriented programming is seen as a difficult skill to master. There is considerable debate about the most appropriate way to introduce novice programmers to object-oriented concepts. Is it possible to uncover the critical aspects or features that enhance the learning of object-oriented programming? Practitioners have differing understandings of the nature of an object-oriented program. Uncovering these different ways of understanding leads to a greater understanding of the critical aspects and their relationship to the structure of the program produced. A phenomenographic study was conducted to uncover practitioner understandings of the nature of an object-oriented program. The study identified five levels of understanding and three dimensions of variation within these levels. These levels and dimensions of variation provide a framework for fostering conceptual change with respect to the nature of an object-oriented program.
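The abstract does not reproduce the five levels, but a toy example suggests the kind of variation a phenomenographic analysis might probe. The class below is invented for illustration: at a surface level it is syntax to be memorised, while at a deeper level its objects model entities in a problem domain.

```python
class Account:
    """A toy bank account. Reading this as lines of code to execute,
    versus as a collaborating object that models a real-world account,
    marks two quite different understandings of an object-oriented
    program."""

    def __init__(self, owner: str, balance: float = 0.0):
        self.owner = owner
        self.balance = balance

    def deposit(self, amount: float) -> None:
        self.balance += amount

acct = Account("Alice")
acct.deposit(25.0)
print(acct.balance)  # 25.0
```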
Abstract:
On the basis of a review of the substantive quality and service marketing literature, current knowledge regarding service quality expectations was found to be either absent or deficient. The phenomenon is of increasing importance to both marketing researchers and management, and was therefore judged worthy of scholarly consideration. Because the service quality literature was insufficiently rich when embarking on the thesis, three basic research issues were considered, namely the nature, determinants and dynamics of service quality expectations. These issues were first conceptually and then qualitatively explored. This process generated research hypotheses, mainly relating to a model, which were subsequently tested through a series of empirical investigations using questionnaire data from field studies in a single context. The results were internally consistent and strongly supported the main research hypotheses. It was found that service quality expectations can be meaningfully described in terms of generic/service-specific, intangible/tangible, and process/outcome categories. Service-specific quality expectations were also shown to be determined by generic service quality expectations, demographic variables, personal values, psychological needs, general service sophistication, service-specific sophistication, purchase motives and service-specific information, when treating service-class involvement as an exogenous variable. Subjects who had not previously experienced a particular service at first hand were additionally found to revise their expectations of quality when exposed to the service, with change being driven by a subset of the identified determinants.
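Restated schematically (our notation, not the thesis's), the determinance finding says that service-specific quality expectations are a function of the listed determinants, with service-class involvement treated as exogenous:

```latex
% Schematic only; the symbols are our shorthand for the determinants above.
\[
E_s = f\bigl(E_g,\ D,\ V,\ N,\ S_g,\ S_s,\ M,\ X \mid I\bigr)
\]
% E_s: service-specific quality expectations; E_g: generic expectations;
% D: demographics; V: personal values; N: psychological needs;
% S_g, S_s: general and service-specific sophistication;
% M: purchase motives; X: service-specific information;
% I: service-class involvement (exogenous).
```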
Abstract:
Some of the problems arising from the inherent instability of emulsions are discussed. Aspects of emulsion stability are described, and particular attention is given to the influence of the chemical nature of the dispersed phase on adsorbed film structure and stability. Emulsion stability has been measured by a photomicrographic technique. Electrophoresis, interfacial tension and droplet rest-time data were also obtained. Emulsions were prepared using a range of oils, including aliphatic and aromatic hydrocarbons, dispersed in a solution of sodium dodecyl sulphate. In some cases a small amount of alkane or alkanol was incorporated into the oil phase. In general the findings agree with the classical view that the stability of oil-in-water emulsions is favoured by a closely packed interfacial film and appreciable electric charge on the droplets. The inclusion of a non-ionic alcohol leads to enhanced stability, presumably owing to the formation of a "mixed" interfacial film which is more closely packed and probably more coherent than that of the anionic surfactant alone. In some instances differences in stability cannot be accounted for simply by differences in interfacial adsorption or droplet charge. Alternative explanations are discussed, and it is postulated that the coarsening of emulsions may occur not only by coalescence but also through the migration of oil from small droplets to larger ones by molecular diffusion. The viability of using the coalescence rates of droplets at a plane interface as a guide to emulsion stability has also been investigated. The construction of a suitable apparatus and the development of a standard testing procedure are described. Coalescence-time distributions may be correlated by equations similar to those presented by other workers, or by an analysis based upon the log-normal function. Stability parameters for a range of oils are discussed in terms of differences in film drainage and the nature of the interfacial film. Despite some broad correlations, there is generally poor agreement between droplet and emulsion stabilities. It is concluded that hydrodynamic factors largely determine droplet stability in the systems studied; consequently, droplet rest-time measurements do not provide a reliable indication of emulsion stability.
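The abstract does not give the equations; assuming the log-normal analysis takes its usual form, the fitted coalescence-time distribution would be:

```latex
% Assumed form of the log-normal analysis of coalescence times t > 0;
% mu and sigma are the mean and standard deviation of ln t.
\[
f(t) \;=\; \frac{1}{t\,\sigma\sqrt{2\pi}}
           \exp\!\left[-\frac{(\ln t - \mu)^{2}}{2\sigma^{2}}\right],
\qquad t > 0 .
\]
```

Under this reading, a larger fitted median $e^{\mu}$ would indicate droplets that persist longer at the plane interface.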
Abstract:
The aims of this study were to investigate the beliefs concerning the philosophy of science held by practising science teachers and to relate those beliefs to their pupils' understanding of the philosophy of science. Three philosophies of science, differing in the way they relate experimental work to other parts of the scientific enterprise, are described. By the use of questionnaire techniques, teachers of four extreme types were identified. These are: the H type, or hypothetico-deductivist teacher, who sees experiments as potential falsifiers of hypotheses or of logical deductions from them; the I type, or inductivist teacher, who regards experiments mainly as a way of increasing the range of observations available for recording before patterns are noted and inductive generalisation is carried out; the V type, or verificationist teacher, who expects experiments to provide proof and to demonstrate the truth or accuracy of scientific statements; and the O type, who has no discernible philosophical beliefs about the nature of science or its methodology. Following interviews of selected teachers to check their responses to the questionnaire and to determine their normal teaching methods, an experiment was organised in which parallel groups were given H, I and V type teaching in the normal school situation during most of one academic year. Using pre-test and post-test scores on a specially developed test of pupil understanding of the philosophy of science, it was shown that pupils were positively affected by their teacher's implied philosophy of science. There was also some indication that V type teaching improved marks obtained in school science examinations, but appeared to discourage the more able from continuing the study of science. Effects were also noted on the vocabulary used by pupils to describe scientists and their activities.
Abstract:
A survey is made of the literature relating to a number of dimensions of cognitive style, from which it is concluded that cognitive style has strong theoretical potential as a predictor of academic performance. It is also noted that there have been few attempts to relate cognitive style to academic performance, and that these have met with limited success. On the assumption that theories of individual differences should be congruent with theories of general functioning, an examination is made of the model of cognition presupposed by dimensions of cognitive style. A central feature of this model is the distinction between cognitive content and cognitive structure. The origins of this distinction are traced back to the normative and experimental or quasi-experimental characteristics of research in psychology. The validity of the distinction is examined with reference to modern research findings, and the conclusion is drawn that the normative-experimental method is an increasingly inappropriate tool of research when applied to higher levels of cognitive functioning, as it cannot handle subject idiosyncrasy or patterns of interaction. An examination of the presuppositions of educational research leads to the complementary conclusion that the research methods imply an oversimplified model of the educational situation. Two empirical studies are reported: (1) an experiment using conventional cognitive style dimensions as predictors of performance under two teaching methods; and (2) an attempt to predict individual differences in overall academic performance by means of a research technique which uses a questionnaire, intra-individual scoring and an analysis of patterns of responses, and which attempts to take some account of subject idiosyncrasy. The implications of these studies for further research are noted.
Abstract:
Sensorimotor synchronization is hypothesized to arise through two different processes, associated with continuous or discontinuous rhythmic movements. This study investigated synchronization of continuous and discontinuous movements across pacing signals (auditory or visual), pacing intervals (500, 650, 800, 950 ms) and effectors (dominant vs. non-dominant hand). The results showed that the mean and variability of asynchronization errors were consistently smaller for discontinuous movements than for continuous movements. Furthermore, both movement types were timed more accurately with auditory pacing than with visual pacing, and were more accurate with the dominant hand. Shortening the pacing interval also improved sensorimotor synchronization accuracy for both continuous and discontinuous movements. These results show that the temporal control of movements depends on the nature of the motor task, the type and rate of extrinsic sensory information, and the efficiency of the motor actuators for sensory integration.
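A minimal sketch of the two accuracy measures reported, assuming asynchronization error is defined, as is conventional in tapping studies, as the signed difference between each response and its pacing-signal onset (variable names are invented; the study's actual analysis pipeline is not described in the abstract):

```python
import numpy as np

def asynchrony_stats(response_times, pacer_onsets):
    """Mean and variability (SD) of asynchronization errors: signed
    response-minus-pacer differences. A negative mean indicates
    anticipation of the pacing signal."""
    errors = np.asarray(response_times) - np.asarray(pacer_onsets)
    return errors.mean(), errors.std(ddof=1)

# Toy example: a 500 ms pacing interval with slightly anticipatory taps.
pacer = np.arange(0.0, 5000.0, 500.0)                    # onsets in ms
taps = pacer + np.random.normal(-30.0, 15.0, pacer.size)
mean_err, sd_err = asynchrony_stats(taps, pacer)
print(f"mean asynchrony {mean_err:.1f} ms, SD {sd_err:.1f} ms")
```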
Abstract:
In this paper we investigate whether accounting for store-level heterogeneity in marketing mix effects improves the accuracy of the marketing mix elasticities, the fit, and the forecasting accuracy of the widely applied SCAN*PRO model of store sales. Models with continuous and discrete representations of heterogeneity, estimated using hierarchical Bayes (HB) and finite mixture (FM) techniques respectively, are empirically compared with the original model, which does not account for store-level heterogeneity in marketing mix effects and is estimated using ordinary least squares (OLS). The empirical comparisons are conducted in two contexts: Dutch store-level scanner data for the shampoo product category, and an extensive simulation experiment. The simulation investigates how between- and within-segment variance in marketing mix effects, error variance, the number of weeks of data and the number of stores affect the accuracy of marketing mix elasticities, model fit and forecasting accuracy. Contrary to expectations, accommodating store-level heterogeneity does not improve the accuracy of marketing mix elasticities relative to the homogeneous SCAN*PRO model, suggesting that little may be lost by employing the original homogeneous SCAN*PRO model estimated using ordinary least squares. Improvements in fit and forecasting accuracy are also fairly modest. We pursue an explanation for this result, since research in other contexts has shown clear advantages from assuming some type of heterogeneity in market response models. In an Afterthought section, we comment on the controversial nature of our result, distinguishing factors inherent to household-level data and associated models, general store-level data and associated models, and the unique SCAN*PRO model specification.
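As a rough illustration of the homogeneous benchmark (not the authors' code, and a deliberately simplified version of the SCAN*PRO specification, which also includes store intercepts and cross-brand terms), a log-linear store-sales regression estimated by OLS might look like this; in the log-log form the price coefficient is read directly as the elasticity:

```python
import numpy as np

# Simplified, homogeneous SCAN*PRO-style regression on synthetic data:
# log unit sales as a linear function of the log price index and two
# promotion dummies, pooled across stores and estimated by OLS.
rng = np.random.default_rng(0)
n = 400                                          # store-week observations
log_price = np.log(rng.uniform(0.7, 1.1, n))     # log price index
feature = rng.integers(0, 2, n).astype(float)    # feature-ad dummy
display = rng.integers(0, 2, n).astype(float)    # in-store display dummy
log_sales = (4.0 - 2.5 * log_price + 0.4 * feature
             + 0.3 * display + rng.normal(0.0, 0.2, n))

X = np.column_stack([np.ones(n), log_price, feature, display])
beta, *_ = np.linalg.lstsq(X, log_sales, rcond=None)
print(f"pooled price elasticity estimate: {beta[1]:.2f}")  # close to -2.5
```

The HB and FM variants discussed above would replace the single pooled coefficient vector with store-specific or segment-specific ones.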
Abstract:
This paper looks at the way in which, over recent years, paradigms for manufacturing management have evolved as a result of changing economic and environmental circumstances. The lean production concept, devised during the 1980s, proved robust only until the end of the bubble economy in Japan, which caused firms to re-examine the underlying principles of the lean production paradigm and redesign their production systems to suit the changing circumstances they were facing. Since that time a plethora of new concepts has emerged, most of them based on improving the way that firms respond to the uncertainties of the new environment in which they find themselves operating. The main question today is whether firms should be agile or adaptable. Both concepts imply a measure of responsiveness, but recent changes in the nature of the uncertainties have heightened the debate about which strategies should be adopted in the future.
Abstract:
Objective of the study: To determine the extent and nature of unlicensed/off-label prescribing patterns in hospitalised children in Palestine. Setting: Four paediatric wards in two public health system hospitals in Palestine: Caritas children's hospital (medical and neonatal intensive care units) and Rafidia general hospital (medical and surgical units). Method: A prospective survey of drugs administered to infants and children <18 years old was carried out over a five-week period in the four paediatric wards. Main outcome measure: The drug-licensing status of all prescriptions was determined according to the Palestinian Registered Product List and the Physician's Desk Reference. Results: Overall, 917 drug prescriptions were administered to 387 children. Of all drug prescriptions, 528 (57.5%) were licensed for use in children, 65 (7.1%) were unlicensed and 324 (35.3%) were used off-label. Of all children, 49.6% received off-label prescriptions, 10.1% received unlicensed medications and 8.2% received both. Seventy-two percent of off-label drugs and 66% of unlicensed drugs were prescribed for children <2 years. Multivariate analysis showed that patients admitted to the neonatal intensive care unit and infants aged 0–1 years were the most likely to receive a greater number of off-label or unlicensed medications (OR 1.80; 95% CI 1.03–3.59 and OR 1.99; 95% CI 0.88–3.73, respectively). Conclusion: The present findings confirm the high prevalence of unlicensed and off-label paediatric drug use in Palestine and strongly support the need for well-designed clinical studies in children.