300 results for Surprise
Abstract:
Three studies (N=144) investigated how toddlers aged 18 and 24 months pass the surprise-mark test of self-recognition. In Study 1, toddlers were surreptitiously marked in successive conditions on their legs and faces with stickers visible only in a mirror. Rates of sticker touching did not differ significantly between conditions. In Study 2, toddlers failed to touch a sticker on their legs that had been disguised before being marked. In Study 3, having been given 30-s exposure to their disguised legs before testing, toddlers touched the stickers on their legs and faces at equivalent levels. These results suggest that toddlers pass the mark test based on expectations about what they look like, expectations that are not restricted to the face.
Abstract:
Sex segregation in employment is a phenomenon that can be observed and analysed at different levels, ranging from comparisons between broad classifications by industry or occupation through to finely defined jobs within such classifications. From an aggregate perspective, the contribution of information technology (IT) employment to sex segregation is clear: it remains a highly male-dominated field, apparently imbued with the ongoing masculinity of science and technology. While this situation is clearly contrary to hopes of a new industry freed from traditional distinctions between 'men's' and 'women's' work, it comes as little surprise to most feminist and labour studies analysts. An extensive literature documents the persistently masculine culture of IT employment and education (see, among many, Margolis and Fisher 2002; Wajcman 1991; Webster 1996; Wright 1996, 1997), and the idea that new occupations might escape sexism by sidestepping 'old traditions' has been effectively critiqued by writers such as Adam, who notes the fallacy of assuming a spontaneous emergence of equality in new settings (2005: 140).
Abstract:
This is the second edition of our Aston Business School (ABS) Good Practice Guide and the enthusiasm of the contributors appears undiminished. I am again reminded that I work with a group of very committed, dedicated and professional colleagues. Once again this publication is produced to celebrate and promote good teaching across the School and to offer encouragement to those imaginative and innovative staff who continue to wish to challenge students to learn to maximum effect. It is hoped that others will pick up some good ideas from the articles contained in this volume.

Contributors to this Guide were not chosen because they are the best teachers in the School, although they are undoubtedly all amongst my colleagues who are exponents of enthusiastic and inspiring approaches to learning. The Quality Unit approached these individuals because they declared on their Annual Module Reflection Forms that they were doing something interesting and worthwhile which they thought others might find useful. Amongst those reading the Guide I am sure that there are many other individuals who are using similar examples of good practice in their teaching, learning and assessment methods. I hope that this publication will provoke these people into providing comments and articles of their own, and that these will form the basis of next year's Guide. It may also provoke some people to try these methods in their own teaching.

The themes of the articles this year can be divided into two groups. The first theme is the quest to help students to help themselves to learn, via student-run tutorials, surprise tests and mock examinations linked with individual tutorials. The second theme is making learning come to life in exciting practical ways: for example, hands-on workshops and simulations, storytelling, rhetorical questioning and discussion groups. A common theme is one of enthusiasm, reflection and commitment on the part of the lecturers concerned.
None of the approaches discussed in this publication is a low-effort activity on the part of the facilitator, but this effort is regarded as worthwhile as a means of creating greater student engagement. As Biggs (2003)[1] says, in his similarly inspiring way, students learn more the less passive they are in their learning. The articles in this publication bear witness to this and much more.

Since last year Aston Business School has launched its Research Centre in Higher Education Learning and Management (HELM), which is another initiative to promote excellent learning and teaching. Even before this centre has become fully operational, at least one of the articles in this publication has seen the light of day in the research arena, and at least two others are ripe for dissemination to a wider audience via journal publication. More news of our successes in this activity will appear in next year's edition.

May I thank the contributors for taking time out of their busy schedules to write the articles this summer, and Julie Green, who runs the ABS Quality Unit, for putting our diverse approaches into a coherent and publishable form and for chasing us when we have needed it! I would also like to thank Ann Morton and her colleagues in the Centre for Staff Development, who have supported this publication. During the last year the Centre has further stimulated the learning and teaching life of the School (and the wider University) via their Learning and Teaching Week and sponsorship of Teaching Quality Enhancement Fund (TQEF) projects. Pedagogic excellence is in better health at Aston than ever before; long may this continue, because this is what life in HE should be about.
Abstract:
We estimate the shape of the distribution of stock prices using data from options on the underlying asset, and test whether this distribution is distorted in a systematic manner each time a particular news event occurs. In particular, we look at the response of the FTSE100 index to market-wide announcements of key macroeconomic indicators and policy variables. We show that the whole distribution of stock prices can be distorted on an event day. The shift in distributional shape happens whether the event is characterised as an announcement occurrence or as a measured surprise. We find that larger surprises have a proportionately greater impact, and that higher moments are more sensitive to events, however characterised.
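The abstract does not say which estimation technique is used, but a standard way of recovering an implied price distribution from option data is the Breeden-Litzenberger relation, in which the risk-neutral density is the discounted second derivative of the call price with respect to strike. A minimal sketch, using synthetic Black-Scholes prices purely as stand-in data (all parameter values below are illustrative assumptions):

```python
import numpy as np
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call (synthetic stand-in data)."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

S, T, r, sigma = 100.0, 0.25, 0.01, 0.20
K = np.linspace(60.0, 140.0, 801)      # strike grid
dK = K[1] - K[0]
C = np.array([bs_call(S, k, T, r, sigma) for k in K])

# Breeden-Litzenberger: risk-neutral density f(K) = exp(rT) * d2C/dK2,
# approximated here by a double finite difference across the strike grid
f = np.exp(r * T) * np.gradient(np.gradient(C, dK), dK)
mass = float(np.sum(f) * dK)           # should integrate to about 1
```

With real option quotes one would interpolate and smooth the observed prices before differencing; the event-day test described in the abstract would then compare the recovered densities on announcement and non-announcement days.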
Abstract:
Considerable attention has been given in the literature to identifying and describing the effective elements which positively affect the improvement of product reliability. These have been perceived by many as the 'state of the art' in the manufacturing industry. Evidence on the applicability, diffusion and effectiveness of such methods and philosophies, as a means of systematically improving the reliability of a product, comes in the main from case studies and from single- and intra-industry empirical studies. These studies have been carried out both within the wider context of quality assurance and management and with reliability treated as a discipline in its own right. However, it is somewhat of a surprise that there are no recently published findings or research studies on the adoption of these methods by the machine tool industry. This may lead one to construct several hypothesised paradigms: (a) that machine tool manufacturers, compared to other industries, are slow to respond to propositions given in the literature by theorists; or (b) that a large proportion of the manufacturers make little use of the reliability improvement techniques described in the literature, with the overall perception that they will not lead to any significant improvements. On the other hand, it is evident that empirical verification of the operational and engineering methods of reliability achievement and improvement adopted in the machine tool industry is less widely researched. Therefore, research into this area is needed in order to explore the 'state of the art' practice in the machine tool industry, in terms of the status, structure and activities of the operation of the reliability function.
This paper outlines a research programme being conducted with the co-operation of a leading machine tool manufacturer, whose UK manufacturing plant produces mainly Vertical Machining Centres (VMCs) and is continuously undergoing incremental transitions in product reliability improvement.
Abstract:
The behaviour of self-adaptive systems can be emergent. The difficulty in predicting the system's behaviour means that there is scope for the system to surprise its customers and its developers. Because its behaviour is emergent, a self-adaptive system needs to garner confidence in its customers, and it needs to resolve any surprise on the part of the developer during testing and maintenance. We believe that these two functions can only be achieved if a self-adaptive system is also capable of self-explanation. We argue that a self-adaptive system's behaviour needs to be explained in terms of the satisfaction of its requirements. Since self-adaptive system requirements may themselves be emergent, a means needs to be found to explain the current behaviour of the system and the reasons that brought that behaviour about. We propose the use of goal-based models at runtime to offer self-explanation of how a system is meeting its requirements, and why the means of meeting these were chosen. We discuss the results of early experiments in self-explanation, and set out future work. © 2012 C.E.S.A.M.E.S.
Abstract:
Purpose - This paper examines the importance of intercultural training for lecturers; describes innovative training to address this, based on a new theoretical framework; and evaluates both the training and the framework. Background - UK HE is becoming increasingly internationally diverse, and the UK HEI population is very multicultural. The proportion of lecturers who come from outside the UK has risen. It is, therefore, important that students develop intercultural awareness. One way of doing this is to work directly with students; a more sustainable approach focuses training on lecturers, who will embed cultural awareness into their practice. Method - This paper sets out a theoretical framework which underpins training developed for lecturers as part of a Postgraduate Certificate. The paper describes the training and evaluates its effectiveness. Findings and results - Findings show that participants were apprehensive about the training. Afterwards they expressed surprise at the participative approach, but were pleased with the outcomes. They enjoyed the exercises, and the training appeared to have opened up their outlook. They praised the freedom to share thoughts with others. Conclusions - Findings show that participants learnt intercultural skills to use in class. This was due to the design: the nature of the training encouraged reflection on cultural diversity, and participants attested to the effects this would have on their teaching. These results replicate those of other studies. Implications - The implications are immediate for the design of intercultural training in different contexts. The framework has already been used to design innovative training for students and managers. In both cases the same far-reaching results were achieved.
Abstract:
This article draws upon the use of photography to research the lives of children living in Accra, Ghana. Its aim is to consider method in visual research, and to reflect upon those modes of explanation and understanding that any consideration of method must require. It suggests a role for photography as a 'vector', as something capable of connecting our knowledge and understanding of the everyday with the everyday experiences and reality of others. Drawing upon the photographs and spoken testimonies of children who live and work on the street, and of children who live in a large informal settlement, the article advances an intimate connection between photography and knowledge of the everyday reality of children's lives, most evident in the capacity of children's photographs to surprise and highlight the fallibility of our understandings. © 2010 International Visual Sociology Association.
Abstract:
The behaviour of self-adaptive systems can be emergent, which means that the system's behaviour may be seen as unexpected by its customers and its developers. Therefore, a self-adaptive system needs to garner confidence in its customers, and it also needs to resolve any surprise on the part of the developer during testing and maintenance. We believe that these two functions can only be achieved if a self-adaptive system is also capable of self-explanation. We argue that a self-adaptive system's behaviour needs to be explained in terms of the satisfaction of its requirements. Since self-adaptive system requirements may themselves be emergent, we propose the use of goal-based requirements models at runtime to offer self-explanation of how a system is meeting its requirements. We demonstrate the analysis of run-time requirements models to yield a self-explanation codified in a domain-specific language, and discuss possible future work.
Abstract:
Geography, retailing, and power are institutionally bound up together. Within these, the authors situate their research in Clegg's work on power. Online shopping offers a growing challenge to the apparent hegemony of the traditional physical retail store format. While novel e-formats appear regularly, blogshops in Singapore are enjoying astonishing success that has taken the large retailers by surprise. Even though there are well-developed theoretical frameworks for understanding the role of institutional entrepreneurs and other major stakeholders in bringing about change and innovation, much less attention has been paid to the role of unorganized, nonstrategic actors, such as blogshops, in catalyzing retail change. The authors explore how blogshops are perceived by consumers and how they challenge the power of other shopping formats. They use Principal Components Analysis to analyze results from a survey of 349 blogshop users. While the results show that blogshops stay true to traditional online shopping attributes, deviations occur on the concept of value. Furthermore, consumer power is counterintuitively found to be strongly present in areas related to cultural ties, excitement, and the search for individualist novelty (as opposed to mass production), thereby encouraging researchers to think critically about emerging power behavior in media practices.
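The abstract names Principal Components Analysis but gives no procedural detail. A minimal sketch of how survey responses of that shape could be decomposed (the matrix below is random stand-in data, not the authors' survey; the item count is an assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
# Random stand-in for survey responses: 349 respondents x 10 Likert-scale items
X = rng.integers(1, 6, size=(349, 10)).astype(float)

# Centre the items, then extract principal components via SVD
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = s ** 2 / np.sum(s ** 2)    # variance share per component
scores = Xc @ Vt.T                     # respondent scores on each component
loadings = Vt.T                        # item loadings per component
```

In practice the loadings of the leading components would be inspected (often after rotation) to name constructs such as "value" or "excitement", which is the kind of interpretation the abstract reports.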
Abstract:
Bayesian algorithms pose a limit to the performance that learning algorithms can achieve. Natural selection should guide the evolution of information processing systems towards those limits. What can we learn from this evolution, and what properties do the intermediate stages have? While this question is too general to permit any answer, progress can be made by restricting the class of information processing systems under study. We present analytical and numerical results for the evolution of on-line algorithms for learning from examples for neural network classifiers, which may or may not include a hidden layer. The analytical results are obtained by solving a variational problem to determine the learning algorithm that leads to maximum generalization ability. Simulations using evolutionary programming, for programs that implement learning algorithms, confirm and expand the results. The principal result is not just that the evolution is towards a Bayesian limit: that limit is essentially reached. In addition, we find that evolution is driven by the discovery of useful structures or combinations of variables and operators. In different runs the temporal order of the discovery of such combinations is unique. The main result is that combinations that signal the surprise brought by an example always arise before combinations that serve to gauge the performance of the learning algorithm. These latter structures can be used to implement annealing schedules. The temporal ordering can be understood analytically as well, by doing the functional optimization in restricted functional spaces. We also show that there is data suggesting that the appearance of these traits follows the same temporal ordering in biological systems. © 2006 American Institute of Physics.
Abstract:
The focus of this thesis is the extension of topographic visualisation mappings to allow for the incorporation of uncertainty. Few visualisation algorithms in the literature are capable of mapping uncertain data, with fewer still able to represent observation uncertainties in visualisations. As such, modifications are made to NeuroScale, Locally Linear Embedding, Isomap and Laplacian Eigenmaps to incorporate uncertainty in the observation and visualisation spaces. The proposed mappings are called Normally-distributed NeuroScale (N-NS), T-distributed NeuroScale (T-NS), Probabilistic LLE (PLLE), Probabilistic Isomap (PIso) and Probabilistic Weighted Neighbourhood Mapping (PWNM). These algorithms generate a probabilistic visualisation space in which each latent visualised point is transformed to a multivariate Gaussian or T-distribution, using a feed-forward RBF network. Two types of uncertainty are then characterised, dependent on the data and the mapping procedure. Data-dependent uncertainty is the inherent observation uncertainty. Mapping uncertainty, in contrast, is defined by the Fisher Information of a visualised distribution; this indicates how well the data have been interpolated, offering a level of 'surprise' for each observation. These new probabilistic mappings are tested on three datasets of vectorial observations and three datasets of real-world time series observations for anomaly detection. In order to visualise the time series data, a method for analysing observed signals and noise distributions, Residual Modelling, is introduced. The performance of the new algorithms on the tested datasets is compared qualitatively with the latent space generated by the Gaussian Process Latent Variable Model (GPLVM). A quantitative comparison using existing evaluation measures from the literature allows the performance of each mapping function to be compared. Finally, the mapping uncertainty measure is combined with NeuroScale to build a deep learning classifier, the Cascading RBF.
This new structure is tested on the MNIST dataset, achieving world-record performance whilst avoiding the flaws seen in other deep learning machines.
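The thesis defines mapping uncertainty through the Fisher Information of a visualised distribution; for a Gaussian visualised point, the Fisher information with respect to the mean is the inverse covariance. A hypothetical sketch of how such a quantity could be turned into a per-observation 'surprise' score (the scoring rule here is an illustrative assumption, not the thesis's exact measure):

```python
import numpy as np

def mapping_surprise(cov):
    """Score a visualised Gaussian point by its Fisher information.

    For N(mu, cov), the Fisher information with respect to mu is inv(cov).
    A wide, poorly interpolated point carries little information, so
    -log det(information) grows with uncertainty and can flag 'surprising'
    observations.
    """
    info = np.linalg.inv(cov)          # Fisher information matrix
    return -np.linalg.slogdet(info)[1]  # higher = more uncertain

tight = np.eye(2) * 0.1   # well-interpolated visualised point
wide = np.eye(2) * 4.0    # poorly interpolated visualised point
```

Ranking visualised points by such a score is one way a probabilistic mapping could be used for the anomaly detection experiments the abstract describes.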
Abstract:
Purpose: To ascertain the agreement level between intra-operative refraction using a prototype surgical Hartmann-Shack aberrometer and subjective refraction a month later. Methods: Fifty-four consecutive patients had their pseudophakic refraction measured with the aberrometer intra-operatively at the end of their cataract surgery. A masked optometrist performed subjective refraction 4 weeks later. The two sets of data were then analysed for correlation. Results: The mean spherical equivalent was −0.14 ± 0.37 D (Range: −1.41 to +1.72 D) with the prototype aberrometer and −0.34 ± 0.32 D (Range: −1.64 to +1.88 D) with subjective refraction. The measurements positively correlated to a very high degree (r = +0.81, p < 0.01). In 84.3% of cases the two measurements were within 0.50 D of each other. Conclusion: The aberrometer can verify the intended refractive status of the eye intraoperatively to avoid a refractive surprise. The aberrometer is a useful tool for real-time assessment of the ocular refractive status.
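The agreement statistics reported above (Pearson correlation and the share of eyes within 0.50 D) take only a few lines to compute. The sketch below uses simulated paired readings as stand-ins for the 54 patients; the offset and noise levels are assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated paired spherical-equivalent readings (dioptres) for 54 eyes
aberrometer = rng.normal(-0.14, 0.37, 54)
subjective = aberrometer + rng.normal(-0.20, 0.20, 54)

r = float(np.corrcoef(aberrometer, subjective)[0, 1])    # Pearson correlation
within_half_D = float(np.mean(np.abs(subjective - aberrometer) <= 0.50))
bias = float(np.mean(subjective - aberrometer))          # mean difference
```

For agreement between two measurement methods, the mean difference and its limits (a Bland-Altman analysis) are usually reported alongside correlation, since a high r alone does not rule out a systematic offset.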
Abstract:
Background: Adverse drug reactions (ADRs) cause significant morbidity and mortality and account for around 6.5% of hospital admissions. Patient experiences of serious ADRs and their long-term impact on patients' lives, including their influence on current attitudes towards medicines, have not been previously explored. Objective: The aim of the study was to explore the experiences, beliefs, and attitudes of survivors of serious ADRs, using drug-induced Stevens-Johnson syndrome (SJS) and Toxic Epidermal Necrolysis (TEN) as a paradigm. Methods: A retrospective, qualitative study was undertaken using detailed semi-structured interviews. Fourteen adult survivors of SJS and TEN, admitted to two teaching hospitals in the UK, one the location of a tertiary burns centre, were interviewed. Interview transcripts were independently analysed by three different researchers and themes emerging from the text identified. Results: All 14 patients were aware that their condition was drug induced, and all but one knew the specific drug(s) implicated. Several expressed surprise at the perceived lack of awareness of the ADR amongst healthcare professionals, and described how the ADR was mistaken for another condition. Survivors believed that causes of the ADR included (i) being given too high a dose of the drug; (ii) medical staff ignoring existing allergies; and (iii) failure to monitor blood tests. Only two believed that the reaction was unavoidable. Those who believed that the condition could have been avoided had less trust in healthcare professionals. The ADR had a persisting impact on their current lives, physically and psychologically. Many now avoided medicines altogether and were fearful of becoming ill enough to need them. Conclusions: Life-threatening ADRs continued to affect patients' lives long after the event. Patients' beliefs regarding the cause of the ADR differed, and may have influenced their trust in healthcare professionals and medicines. We propose that clear communication during the acute phase of a serious ADR may therefore be important. © 2011 Adis Data Information BV. All rights reserved.