898 results for Catalão (GO)
Abstract:
"Language, Literacy and Literature combines concepts of language, literature and literacy within a pedagogical framework that leads readers through a series of learning processes. In other words, it is as much a book about teaching English as it is a book about learning how to learn about teaching. The book provides models for pre-service teachers to help identify the kinds of dispositions towards learning that a teacher needs to develop, such as curiosity, collaboration and willingness to ‘give things a go’. It further challenges the pre-service teacher to question what they think they know, as well as discover what they need to know. A range of practical and relevant exercises and activities assist in building habits of reflexive practice."-- publisher website
Abstract:
Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study down to its basic ingredients: from systems development to technological development (techniques) to theoretical development. The second phase operates in the opposite direction, building up techniques from theories, and systems from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer-based cost estimating systems during this first phase and, to a lesser extent, into the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming. All theories need the support of some observational data, and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. Just as it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The eternal argument between empiricists and deductionists is that, just as theories need factual support, we need theories in order to know which facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade-off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features. This paper presents some of the facts we have been able to acquire regarding one part of this relationship: accuracy, and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs. Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the somewhat arbitrary manner in which the factors are presented; the emphasis here is instead on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimize the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible, presenting results in an uncleaned and unbiased way. In most cases the evidence speaks for itself. The first part of the paper reviews most of the empirical evidence that we have located to date; knowledge of any work done but omitted here would be most welcome. The second part of the paper presents an analysis of some recently acquired data pertaining to this growing subject.
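In this literature, estimating accuracy is commonly summarised by two measures: bias (the average direction of error) and consistency (the spread of errors). A minimal Python sketch of this, using invented estimate/low-bid pairs rather than any data from the paper:

```python
# Illustrative only: hypothetical estimate/low-bid pairs, not data from the paper.
import numpy as np

estimates = np.array([1.90, 2.45, 0.98, 3.10, 1.52])  # pre-tender estimates, hypothetical values
low_bids  = np.array([2.05, 2.30, 1.10, 3.00, 1.60])  # corresponding lowest tenders

errors = (estimates - low_bids) / low_bids * 100       # percentage error for each project

bias        = errors.mean()        # systematic over- or under-estimation
consistency = errors.std(ddof=1)   # spread of errors; lower means more consistent

print(f"bias: {bias:+.1f}%, consistency (sd): {consistency:.1f}%")
```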
Abstract:
Teachers of construction economics and estimating have long recognised that there is more to construction pricing than the detailed calculation of costs (to the contractor). We always reach the point where we have to say "of course, experience or familiarity with the market is very important, and this needs judgement, intuition, etc.". Quite how important this is in construction pricing is not known, and we tend to trivialise its effect. If judgement of the market has only a minimal effect, little harm would be done; but if it is really important, then some quite serious consequences arise which go well beyond the teaching environment. Major areas of concern for the quantity surveyor are cost modelling and cost planning, neither of which pays any significant attention to the market effect. There are currently two schools of thought about the market effect issue. The first school is prepared to ignore possible effects until more is known; this may be called the pragmatic school. The second school exists solely to criticise the first school; we will call this the antagonistic school. Neither the pragmatic nor the antagonistic school seems particularly keen to resolve the issue one way or the other. The founder and leader of the antagonistic school is Brian Fine, whose 1974 paper is still the basic text on the subject, and in which he coined the term 'socially acceptable' price to describe what we now recognise as the market effect. Mr Fine's argument was then, and remains, that the uncertainty surrounding the contractors' costing and cost estimating process is such that it logically leads to a market-orientated pricing approach. Very little factual evidence, however, seems to be available to support these arguments in any conclusive manner. A further, and more important, point for the pragmatic school is that, even if the market effect is as important as Mr Fine believes, there are no indications of how it can be measured, evaluated or predicted. Since 1974, evidence has been accumulating which tends to reinforce the antagonists' view. A review of the literature covering both contractors' and designers' estimates found many references to the use of value judgements in construction pricing (Ashworth & Skitmore, 1985), which supports the antagonistic view in implying the existence of uncertainty overload. The most convincing evidence emerged quite by accident in some research we recently completed with practising quantity surveyors on estimating accuracy (Skitmore, 1985). In addition to demonstrating that individual quantity surveyors and certain types of buildings had a significant effect on estimating accuracy, one surprising result was that the most expert surveyors used only a very small amount of information to produce relatively accurate estimates. Only the type and size of the building, it seemed, was really relevant in determining accuracy. More detailed information about the building's specification, and even sight of the drawings, did not significantly improve their accuracy. This seemed to offer clear evidence that the constructional aspects of the project were largely irrelevant and that the expert surveyors were somehow tuning in to the market price of the building. The obvious next step is to feed our expert surveyors with more relevant 'market' information in order to assess its effect.
The problem with this is that our experts do not seem able to verbalise their requirements in this respect, a common occurrence in research of this nature. The lack of research into the nature of market effects on prices also means that the literature provides little of benefit; hence the need for this study. It was felt that a clearer picture of the nature of construction markets would be obtained in an environment where free enterprise was a truly ideological force. For this reason, the United States of America was chosen for the next stage of our investigations. Several people were interviewed in an informal and unstructured manner to elicit their views on the action of market forces on construction prices. Although a small number of people were involved, they were thought to be reasonably representative of expertise in construction pricing, and they were very well able to articulate their views. Our initial reaction to the interviews was that our USA subjects held views very close to those held in the UK. However, detailed analysis revealed the existence of remarkably clear and consistent insights that would not have been obtained in the UK. Further evidence was also obtained from literature relating to the subject, and some of the interviewees very kindly expanded on their views in later postal correspondence. We have now analysed all the evidence received and, although a great deal of it is anecdotal, we feel that our findings enable at least the basic nature of the subject to be understood, and that the factors and their interrelationships can now be examined more formally in relation to construction price levels. I must express my gratitude to the Royal Institution of Chartered Surveyors' Educational Trust and the University of Salford's Department of Civil Engineering for collectively funding this study. My sincere thanks also go to our American participants who freely gave their time and valuable knowledge in our enquiries. Finally, I must record my thanks to Tim and Anne for their remarkable ability to produce an intelligible typescript from my unintelligible writing.
Abstract:
Despite the widespread use of paper, plastic or ceramics in dielectric capacitors, water has not been commonly used as a dielectric due to its tendency to become conductive as it easily leaches ions from the environment. We show here that when water is confined between graphene oxide sheets, it can retain its insulating nature and behave as a dielectric. A hydrated graphene oxide film was used as a dielectric spacer to construct a prototype water-dielectric capacitor. The capacitance depends on the water content of the hydrated GO film as well as the voltage applied to the device. Our results show that the capacitance per unit area of the GO film is in the range of 100–800 mF cm �2, which is 5–40 times that of the double layer capacitance per surface area of activated carbon.
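As a quick arithmetic check of the quoted figures, the following sketch backs out the reference double-layer capacitance implied by the stated range and multiples. The values are copied from the abstract; the computation is purely illustrative:

```python
# Consistency check of the quoted figures (values copied from the abstract).
c_go_min, c_go_max = 100.0, 800.0   # mF cm^-2, hydrated GO film
ratio_min, ratio_max = 5.0, 40.0    # claimed multiples of activated carbon

# Implied double-layer capacitance per surface area of activated carbon:
print(c_go_min / ratio_min, c_go_max / ratio_max)  # both give 20.0 mF cm^-2
```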
Abstract:
Few would argue that the upstream oil and gas industry has become more technology-intensive over the years. At the same time, the increasing costs and complexity of today's exploration and production (E&P) technologies are making it increasingly difficult for any one company to support an aggressive research and development (R&D) agenda single-handedly. The coming together of these two evolutionary forces gives rise to important questions. How does innovation happen in the E&P industry? Specifically, what ideas and inputs flow from which parts of the industry's value network, and where do these inputs go? And how do firms and organizations from different countries contribute differently to this process? This survey was designed to shed light on these issues.
Abstract:
We argue that there are at least two significant issues for interaction designers to consider when creating the next generation of human interfaces for civic and urban engagement: (1) The disconnect between citizens participating in either digital or physical realms has resulted in a neglect of the hybrid role that public place and situated technology can play in contributing to civic innovation. (2) Under the veneer of many social media tools, hardly any meaningful strategies or approaches can be found that go beyond awareness raising and allow citizens to do more than click a 'Like' button. We call for an agenda to design the next generation of 'digital soapboxes', contributing towards a new form of polity that helps citizens not only to have a voice but also to appropriate their city in order to take action for change.
Abstract:
Objective: Comprehensive, accurate information about road crashes and related trauma is a prerequisite for the identification and control of risk factors, as well as for identifying faults within the broader road safety system. Quality data and appropriate crash investigation are critical in reducing the road toll that is rapidly growing in much of the developing world, including Pakistan. This qualitative research explored the involvement of social and cultural factors (in particular, fatalism) in risky road use in Pakistan. The findings highlight a significant issue, previously unreported in the road safety literature, namely, the link between fatalistic beliefs and inaccurate reporting of road crashes. Method: Thirty one-to-one interviews were conducted by the first author with police officers, drivers, policy makers and religious orators in three Pakistani cities. Findings: Evidence emerged of a strong link between fatalism and the under-reporting of road crashes. In many cases, crashes and related road trauma appear to go unreported because a crash is considered to be one's fate and, therefore, beyond personal control. Fate was also implicated in the practice of reconciliation between parties after a crash without police involvement, and in the seeking and granting of pardon for a road death. Conclusions: These issues represent additional factors that can contribute to the under-reporting of crashes and associated trauma. Together, they highlight the complications involved in establishing the true cost of road trauma in a country such as Pakistan, and the difficulties faced when attempting to promote scientifically based road safety information to counteract faith-based beliefs.
Abstract:
Overview:
- Development of mixed methods research
- Benefits and challenges of “mixing”
- Different models
- Good design
- Two examples
- How to report?
- Have a go!
Abstract:
While social enterprises have gained increasing policy attention as vehicles for generating innovative responses to complex social and environmental problems, surprisingly little is known about them. In particular, the social innovation produced by social enterprises (Mulgan, Tucker, Ali, & Sander, 2007) has been presumed rather than demonstrated, and remains under-investigated in the literature. While social enterprises are held to be inherently innovative as they seek to respond to social needs (Nicholls, 2010), there has been conjecture that the collaborative governance arrangements typical of social enterprises may be conducive to innovation (Lumpkin, Moss, Gras, Kato, & Amezcua, in press), as members and volunteers provide a source of creative ideas and are unfettered in such thinking by responsibility for delivering organisational outcomes (Hendry, 2004). However, this is complicated by the sheer array of governance arrangements that exist in social enterprises, ranging from flat participatory democratic structures through to hierarchical arrangements. In continental Europe, there has been a stronger focus on democratic participation as a characteristic of social enterprises than in, for example, the USA. In response to this gap in knowledge, a research project was undertaken to identify the population of social enterprises in Australia. The size and composition of, and the social innovations initiated by, these enterprises have been reported elsewhere (see Barraket, 2010). The purpose of this paper is to undertake a closer examination of innovation in social enterprises, particularly how the collaborative governance of social enterprises might influence innovation. Given the pre-paradigmatic state of social entrepreneurship research (Nicholls, 2010), and the importance of drawing on established theories in order to advance theory (Short, Moss, & Lumpkin, 2009), a number of conceptual steps are needed in order to examine how collaborative governance might influence innovation in social enterprises. In this paper, we commence by advancing a definition of what a social enterprise is. In light of our focus on the potential role of collaborative governance in social innovation amongst social enterprises, we go on to consider the collaborative forms of governance prevalent in the Third Sector. Then, collaborative innovation is explored. Drawing on this information and our research data, we finally consider how collaborative governance might affect innovation amongst social enterprises.
Abstract:
Serial killers are among the most popular and enduring character types in contemporary culture. In this exegesis I investigate one of the reasons for this popularity by examining the representational relationships between serial killers and serial consumers. I initially establish that all monsters, whether they are vampires, werewolves or serial killers, emerge from cultural anxieties and signify the anxiety which gave them birth. I go on to identify that the cultural anxiety at play with serial killers is consumerism and in doing so, I identify two key parallels between the serial killer and the consumer, namely a sense of lack and a desire for transformation. I then examine the ways in which the serial killer is representative of the consumer in three exemplar texts, The Silence of the Lambs by Thomas Harris, American Psycho by Bret Easton Ellis and Darkly Dreaming Dexter by Jeff Lindsay. I go on to self-reflexively examine the creation of my novel Carnivore, the accompanying draft of which has been influenced by both the exemplar texts and the findings of the exegesis.
Abstract:
Quantitative market data has traditionally been used throughout marketing and business as a tool to inform and direct design decisions. However, in our changing economic climate, businesses need to innovate and create products their customers will love. Deep customer insight methods move beyond simply questioning customers and aim to provoke true emotional responses in order to reveal new opportunities that go beyond functional product requirements. This paper explores traditional market research methods and compares them with methods used to gain deep customer insights. The study reports on a collaborative research project with seven small to medium enterprises and four multi-national organisations. Firms were introduced to a design-led innovation approach and were taught the different methods for gaining deep customer insights. Interviews were conducted to understand the experience and outcomes of pre-existing research methods and deep customer insight approaches. The findings show that deep customer insights are unlikely to be revealed through traditional market research techniques. The theoretical outcome of this study is a complementary methods matrix, providing guidance on appropriate research methods according to a project's timeline.
Abstract:
An adequate amount of graphene oxide (GO) was first prepared by oxidation of graphite, and GO/epoxy nanocomposites were subsequently prepared by a typical solution mixing technique. X-ray diffraction (XRD) patterns and X-ray photoelectron (XPS), Raman and Fourier transform infrared (FTIR) spectroscopy indicated the successful preparation of GO. Scanning electron microscopy (SEM) and transmission electron microscopy (TEM) images of the graphite oxide showed that it consists of a large number of graphene oxide platelets with a curled morphology, comprising thin, wrinkled, sheet-like structures. An AFM image of the exfoliated GO indicated that the average thickness of the GO sheets is ~1.0 nm, consistent with monolayer GO. The mechanical properties of the as-prepared GO/epoxy nanocomposites were investigated. Significant improvements in both Young's modulus and tensile strength were observed for the nanocomposites at very low GO loadings. The Young's modulus of the nanocomposite containing 0.5 wt% GO was 1.72 GPa, 35% higher than that of the pure epoxy resin (1.28 GPa). The effective reinforcement of the GO-based epoxy nanocomposites can be attributed to the good dispersion of the GO sheets and their strong interfacial interactions with the epoxy matrix.
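The quoted stiffness gain can be verified with a line of arithmetic; a minimal sketch using the two moduli reported above (purely an illustrative check):

```python
# Arithmetic check of the reported stiffness gain (moduli from the abstract).
E_epoxy = 1.28  # GPa, pure epoxy resin
E_nano  = 1.72  # GPa, nanocomposite with 0.5 wt% GO

gain = (E_nano - E_epoxy) / E_epoxy * 100
print(f"Young's modulus gain: {gain:.1f}%")  # ~34.4%, quoted as 35% in the text
```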
Abstract:
Objectives: To identify and appraise the literature concerning nurse-administered procedural sedation and analgesia in the cardiac catheter laboratory (CCL). Design and data sources: An integrative review method was chosen for this study. The MEDLINE and CINAHL databases were searched, as well as The Cochrane Database of Systematic Reviews and the Joanna Briggs Institute. Nineteen research articles and three clinical guidelines were identified. Results: The authors of each study reported that nurse-administered sedation in the CCL is safe, citing a low incidence of complications. However, a higher percentage of deeply sedated patients than moderately sedated patients were reported to experience complications. To confound the issue, one clinical guideline permits deep sedation without an anaesthetist present, while others recommend against it. All clinical guidelines recommend that nurses be educated about sedation concepts. Other findings focus on pain and discomfort and on the cost savings of nurse-administered sedation, which are associated with forgoing anaesthetic services. Conclusions: Practice is varied due to limitations in the evidence and inconsistent clinical practice guidelines. Recommendations for research and practice have therefore been made. Research topics include determining how and in which circumstances capnography can be used in the CCL, discerning the economic impact of sedation-related complications, and developing a set of objectives for nursing education about sedation. For practice, if deep sedation is administered without an anaesthetist present, it is essential that nurses are adequately trained and have access to vital equipment such as capnography to monitor ventilation, because deeply sedated patients are more likely to experience sedation-related complications. These initiatives will go some way towards ensuring that patients receiving nurse-administered procedural sedation and analgesia for a procedure in the cardiac catheter laboratory are cared for using consistent, safe and evidence-based practices.
Abstract:
Research has consistently shown that patients experience disturbed sleep after cardiac surgery, yet there has been limited investigation into methods of improving this experience. Complementary therapies may be one way of addressing this issue. Aim: To determine whether using progressive muscle relaxation improves self-rated sleep quality for patients following cardiac surgery. Methods and Results: Quantitative data on sleep quality were obtained from thirty-five participants via questionnaire during their first post-operative week after cardiac surgery. Qualitative data were obtained through written responses to open-ended questions. Using the Wilcoxon signed-rank test, no significant differences in sleep quality scores were found between the pre- and post-intervention measurements for progressive muscle relaxation. However, the qualitative analysis found that the intervention helped some participants initiate sleep by diverting their thoughts, inducing relaxation or alleviating pain and anxiety. Conclusions: The qualitative findings suggest that progressive muscle relaxation may help patients who have undergone cardiac surgery to initiate sleep.
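For readers unfamiliar with the test named above, a minimal Python sketch of the paired pre/post comparison, using invented sleep-quality scores rather than the study's data:

```python
# A minimal sketch of the paired pre/post comparison, with made-up
# sleep-quality scores on a 0-10 scale; these are not the study's data.
from scipy.stats import wilcoxon

pre  = [4, 6, 3, 5, 4, 7, 2, 5, 6, 4]   # self-rated sleep quality before the intervention
post = [5, 5, 4, 6, 6, 6, 3, 6, 7, 5]   # after progressive muscle relaxation

stat, p = wilcoxon(pre, post)
print(f"W = {stat:.1f}, p = {p:.3f}")    # p >= 0.05 would mirror the study's null result
```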
Abstract:
Social media tools are often the result of innovations in information technology, developed by IT professionals and innovators. Nevertheless, IT professionals, many of whom are responsible for designing and building social media technologies, have not themselves been studied with respect to how they use and experience social media for professional purposes. This study will use Information Grounds Theory (Pettigrew, 1998) as a framework to study IT professionals' experience of using social media for professional purposes. Information grounds facilitate the opportunistic discovery of information within social settings created temporarily at a place where people gather for a specific purpose (e.g., doctors' waiting rooms, office tea rooms); although people gather for another purpose, the social atmosphere stimulates spontaneous sharing of information (Pettigrew, 1999). This study proposes that social media has the qualities of a rich information ground: people participate from separate “places” in cyberspace in a synchronous manner, in real time, making it almost as dynamic and unplanned as a physical information ground. There is limited research on how social media platforms are perceived as a “place” (a place to go to, a place to gather, or a place to be seen in) comparable to physical spaces. There is also no empirical study of how IT professionals use or “experience” social media. The data for this study are being collected through a study of IT professionals who currently use Twitter. A digital ethnography approach is being taken, wherein the researcher “follows” the participants online and observes their behaviours and interactions on social media. Next, a sub-set of participants will be interviewed about their experiences with and within social media, and about how social media compares with traditional information grounds, information communication, and collaborative environments. An Evolved Grounded Theory (Glaser, 1992) approach will be used to analyse the tweet data and interviews and to map the findings against Information Grounds Theory. Findings from this study will provide a foundational understanding of IT professionals' experiences within social media, and can help both professionals and researchers understand this fast-evolving method of communication.