475 results for dialogic thinking, linguistic thinking, systematic-comparative approach to human co-operation, literature as a philosophic text
Abstract:
Although kimberlite pipes/bodies are usually the remains of volcanic vents, in-vent deposits, and subvolcanic intrusions, the terminology used for kimberlite rocks has largely developed independently of that used in mainstream volcanology. Existing kimberlite terminology is not descriptive and includes terms that are rarely used, used differently, or not used at all in mainstream volcanology. In addition, kimberlite bodies are altered to varying degrees, making application of genetic terminology difficult because original components and depositional textures are commonly masked by alteration. This paper recommends an approach to the terminology for kimberlite rocks that is consistent with usage for other volcanic successions. In modern terrains the eruption and emplacement origins of deposits can often be readily deduced, but this is frequently not the case for old, variably altered and deformed rock successions. A staged approach is required whereby descriptive terminology is developed first, followed by application of genetic terminology once all features, including the effects of alteration on original texture and depositional features, together with contact relationships and setting, have been evaluated. Because many volcanic successions consist of both primary volcanic deposits and volcanic sediments, terminology must account for both possibilities.
Abstract:
What type of probability theory best describes the way humans make judgments under uncertainty and decisions under conflict? Although rational models of cognition have become prominent and have achieved much success, they adhere to the laws of classical probability theory despite the fact that human reasoning does not always conform to these laws. For this reason we have seen the recent emergence of models based on an alternative probabilistic framework drawn from quantum theory. These quantum models show promise in addressing cognitive phenomena that have proven recalcitrant to modeling by means of classical probability theory. This review compares and contrasts probabilistic models based on Bayesian or classical versus quantum principles, and highlights the advantages and disadvantages of each approach.
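As a minimal illustration of the contrast this review draws, the following Python sketch (using numpy) shows how sequential projection in a quantum model produces question-order effects that a single classical joint probability cannot: with non-commuting projectors, P(A then B) differs from P(B then A). The state vector and projector angles below are invented for illustration and are not taken from any particular quantum cognition model.

import numpy as np

# Illustrative belief state and two non-commuting projectors
# (hypothetical numbers; quantum cognition models fit these to data).
psi = np.array([np.cos(0.3), np.sin(0.3)])   # unit-norm belief state

def projector(theta):
    """Rank-1 projector onto the direction at angle theta."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

A = projector(0.0)        # "yes" subspace for question A
B = projector(np.pi / 4)  # "yes" subspace for question B (does not commute with A)

def sequential_prob(first, second, state):
    """P(first = yes, then second = yes) via the projection (Lueders) rule."""
    collapsed = first @ state               # project onto 'yes' for question 1
    return float(np.linalg.norm(second @ collapsed) ** 2)

p_ab = sequential_prob(A, B, psi)
p_ba = sequential_prob(B, A, psi)
print(f"P(A then B) = {p_ab:.4f}")
print(f"P(B then A) = {p_ba:.4f}")  # differs: order effects arise naturally
# A classical joint probability P(A and B) is a single number, so it cannot
# depend on question order; the quantum model captures the asymmetry.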
Abstract:
‘Complexity’ is a term that is increasingly prevalent in conversations about building capacity for 21st Century professional engineers. Society is grappling with the urgent and challenging reality of accommodating seven billion people, meeting needs and innovating lifestyle improvements in ways that do not destroy atmospheric, biological and oceanic systems critical to life. Over the last two decades in particular, engineering educators have been active in attempting to build capacity amongst professionals to deliver ‘sustainable development’ in this rapidly changing global context. However, the curriculum literature clearly points to a lack of significant progress, with efforts best described as ad hoc and highly varied. Given the limited timeframes for action to curb environmental degradation proposed by scientists and intergovernmental agencies, the authors of this paper propose that it is imperative that curriculum renewal towards education for sustainable development proceeds rapidly, systemically, and in a transformational manner. Within this context, the paper discusses the need to consider a multiple-track approach to building capacity for 21st Century engineering, including priorities and timeframes for undergraduate and postgraduate curriculum renewal. The paper begins with a contextual discussion of the term complexity and how it relates to life in the 21st Century. The authors then present a whole-of-system approach for planning and implementing rapid curriculum renewal that addresses the critical roles of several generations of engineering professionals over the next three decades. The paper concludes with observations regarding engaging with this approach in the context of emerging accreditation requirements and existing curriculum renewal frameworks.
Abstract:
Here we demonstrate that commercial carbon-supported Pt nanoparticles react with [AuCl4]- ions at room temperature to produce a highly active Au/Pt/C material with an ultralow coverage of elemental Au on the Pt nanoparticles; this material exhibits significantly enhanced activity for ethanol oxidation when compared to Pt/C.
Abstract:
In this paper, a stress and coping perspective is used to outline the processes that determine employee adaptation to organisational change. A theoretical framework that simultaneously considers the effects of event characteristics, situational appraisals, coping strategies, and coping resources is reviewed. Three empirical investigations of organisational change that have tested various components of the model are then presented. In the first study, there was evidence linking event characteristics, situational appraisals, coping strategies and coping resources to levels of employee adjustment in a sample of pilots employed in a newly merged airline company. In a more focused test of the model with a sample of employees experiencing a restructuring process in their organisation, it was found that the provision of change-related information enhanced levels of efficacy to deal with the change process which, in turn, predicted psychological wellbeing, client engagement, and job satisfaction. In a study of managers affected by a new remuneration scheme, there was evidence to suggest that managers who received change-specific information and opportunities to participate in the change process reported higher levels of change readiness. Managers who reported higher levels of readiness for change also reported higher levels of psychological wellbeing and job satisfaction. These studies highlight ways in which managers and change agents can help employees to cope during times of organisational change.
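The mediational pattern reported in the second study (change-related information enhancing efficacy, which in turn predicts wellbeing) can be sketched with simulated data, as below. The variable names and effect sizes are invented for illustration, and the plain OLS decomposition of total, indirect and direct effects is a generic mediation sketch rather than the authors' analysis.

import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated data mirroring the reported mediational pattern
# (all effect sizes are invented for illustration).
information = rng.normal(size=n)                    # change-related information received
efficacy = 0.5 * information + rng.normal(size=n)   # efficacy to deal with the change
wellbeing = 0.6 * efficacy + rng.normal(size=n)     # psychological wellbeing

def ols_slopes(y, x):
    """Slope(s) from an OLS regression of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(y), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

total = ols_slopes(wellbeing, information)[0]    # c path
a = ols_slopes(efficacy, information)[0]         # a path
b, direct = ols_slopes(wellbeing, np.column_stack([efficacy, information]))

print(f"total effect   c  = {total:.3f}")
print(f"indirect a*b      = {a * b:.3f}")
print(f"direct effect  c' = {direct:.3f}")  # shrinks toward 0 under full mediation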
Abstract:
This paper provides an important and timely overview of a conceptual framework designed to assist with the development of message content for, as well as the evaluation of, persuasive health messages. While an earlier version of this framework was presented in a prior publication by the authors in 2009, important refinements have seen the framework evolve in recent years, warranting an updated review. This paper outlines the Step approach to Message Design and Testing (or SatMDT) in accordance with the theoretical evidence that underpins each of the framework’s steps, as well as the empirical evidence that demonstrates their relevance and feasibility. The development and testing of the framework have thus far been based exclusively within the road safety advertising context; however, the view expressed herein is that the framework may have broader appeal and application to the health persuasion context.
Abstract:
In life cycle assessment studies, greenhouse gas (GHG) emissions from direct land-use change have been estimated to make a significant contribution to the global warming potential of agricultural products. However, these estimates have a high uncertainty due to the complexity of data requirements and the difficulty of attributing land-use change. This paper presents estimates of GHG emissions from direct land-use change from native woodland to grazing land for two beef production regions in eastern Australia, which were the subject of a multi-impact life cycle assessment study for premium beef production. Spatially and temporally consistent datasets were derived for areas of forest cover and biomass carbon stocks using published remotely sensed tree-cover data and regionally applicable allometric equations consistent with Australia's national GHG inventory report. Standard life cycle assessment methodology was used to estimate GHG emissions and removals from direct land-use change attributed to beef production. For the northern-central New South Wales region of Australia, estimates ranged from a net emission of 0.03 t CO2-e ha-1 year-1 to a net removal of 0.12 t CO2-e ha-1 year-1 under low and high scenarios, respectively, for sequestration in regrowing forests. For the same period (1990-2010), the study region in southern-central Queensland was estimated to have net emissions from land-use change in the range of 0.45-0.25 t CO2-e ha-1 year-1. The difference between regions reflects the continuation of higher rates of deforestation in Queensland until strict regulation in 2006, whereas native vegetation protection laws were introduced earlier in New South Wales. On the basis of liveweight produced at the farm gate, emissions from direct land-use change for 1990-2010 were comparable in magnitude to those from other on-farm sources, which were dominated by enteric methane. However, calculation of land-use change impacts for the Queensland region for a period starting in 2006 gave a range from net emissions of 0.11 t CO2-e ha-1 year-1 to net removals of 0.07 t CO2-e ha-1 year-1. This study demonstrated a method for deriving spatially and temporally consistent datasets to improve estimates of direct land-use change impacts in life cycle assessment. It identified areas of uncertainty, including rates of sequestration in woody regrowth and impacts of land-use change on soil carbon stocks in grazed woodlands, but also showed the potential for direct land-use change to represent a net sink for GHG.
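To make the attribution arithmetic concrete, the sketch below reproduces the general shape of a direct land-use-change calculation of this kind: clearing emissions amortised over the assessment window, offset by removals in regrowth, then expressed per hectare and per tonne of liveweight. Every input value is an invented placeholder; the paper derives areas and carbon stocks from remotely sensed tree cover and regional allometric equations.

# Minimal sketch of the direct land-use-change (dLUC) arithmetic.
# All inputs below are invented placeholders for illustration only.

YEARS = 2010 - 1990              # 20-year assessment window

cleared_ha = 1_000.0             # area deforested over the window
regrowth_ha = 800.0              # area of regrowing native forest
biomass_c_t_per_ha = 40.0        # carbon stock lost on clearing (t C/ha)
sequestration_t_per_ha_yr = 0.6  # carbon uptake in regrowth (t C/ha/yr)

C_TO_CO2 = 44.0 / 12.0           # convert carbon mass to CO2-equivalent

# Emissions from clearing, amortised over the assessment window
emissions = cleared_ha * biomass_c_t_per_ha * C_TO_CO2 / YEARS
# Removals by regrowing forest, accrued each year of the window
removals = regrowth_ha * sequestration_t_per_ha_yr * C_TO_CO2

net = emissions - removals       # t CO2-e per year attributed to the land use
total_grazed_ha = 50_000.0
print(f"net dLUC flux: {net / total_grazed_ha:+.3f} t CO2-e/ha/yr")

# Allocation to product at the farm gate, per the LCA functional unit
liveweight_t = 5_000.0           # annual liveweight produced
print(f"per product:   {net / liveweight_t:+.3f} t CO2-e per t liveweight")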
Abstract:
Structural fire safety has become one of the key considerations in the design and maintenance of the built infrastructure. Conventionally, the fire resistance rating of load-bearing Light gauge Steel Frame (LSF) walls is determined based on the standard time-temperature curve given in ISO 834. Recent research has shown that the true fire resistance of building elements exposed to building fires can be less than their fire resistance ratings determined from standard fire tests. It is questionable whether the standard time-temperature curve truly represents the fuel loads in modern buildings. Therefore, an equivalent fire severity approach has been used in the past to obtain fire resistance ratings. This approach is based on comparing the performance of a structural member exposed to a realistic design fire curve with its performance under the standard fire time-temperature curve. This paper presents the details of research undertaken to develop an energy-based time equivalent approach for obtaining the fire resistance ratings of LSF walls exposed to realistic design fire curves with respect to standard fire exposure. The approach relates to the amount of energy transferred to the member. The proposed method was used to predict the fire resistance ratings of single and double layer plasterboard-lined and externally insulated LSF walls. The predicted fire ratings were compared with the results from finite element analyses and fire design rules for three different wall configurations exposed to both rapid and prolonged fires. The comparison shows that the proposed energy method can be used to obtain the fire resistance ratings of LSF walls in the case of prolonged fires.
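A rough sketch of the energy-based time equivalence idea follows. The ISO 834 curve formula, T = 20 + 345 log10(8t + 1) with t in minutes and T in degrees C, is standard; however, the "realistic" design fire shape, the approximation of energy transferred by radiant flux proportional to sigma*T^4, and all numerical values below are simplifying assumptions for illustration rather than the paper's method.

import numpy as np

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def iso834(t_min):
    """ISO 834 standard fire curve: T = 20 + 345*log10(8t + 1), deg C."""
    return 20.0 + 345.0 * np.log10(8.0 * t_min + 1.0)

def design_fire(t_min):
    """Placeholder 'realistic' design fire: linear growth to a peak, then
    exponential decay. Invented for illustration only; the paper works with
    design fire curves representing modern building fuel loads."""
    peak_temp, t_peak, tau = 1000.0, 30.0, 45.0
    growth = np.minimum(t_min / t_peak, 1.0)
    decay = np.exp(-np.maximum(t_min - t_peak, 0.0) / tau)
    return 20.0 + (peak_temp - 20.0) * growth * decay

def cumulative_energy(curve, t_grid_min):
    """Cumulative incident radiant energy (J/m^2), flux ~ sigma*T^4."""
    flux = SIGMA * (curve(t_grid_min) + 273.15) ** 4
    dt_s = (t_grid_min[1] - t_grid_min[0]) * 60.0
    return np.cumsum(flux) * dt_s

t = np.arange(0.0, 120.0, 0.1)   # minutes
e_design = cumulative_energy(design_fire, t)[-1]
e_standard = cumulative_energy(iso834, t)

# Equivalent time: ISO 834 exposure delivering the same incident energy
i = min(np.searchsorted(e_standard, e_design), len(t) - 1)
print(f"energy under design fire    ~ {e_design / 1e6:.1f} MJ/m^2")
print(f"equivalent ISO 834 exposure ~ {t[i]:.1f} min")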
Abstract:
This paper analyzes the application of rights-based approaches to disaster displacement in the Asia-Pacific region in order to assess whether the current framework is sufficient to protect the rights of internally displaced persons. It identifies that disaster-induced displacement is increasingly prevalent in the region and that economic and social conditions in many countries mean that the impact of displacement is often prolonged and more severe. The paper identifies the relevant human rights principles which apply in the context of disaster-induced displacement and examines their implementation in a number of soft-law instruments. While it identifies shortcomings in implementation and enforcement, the paper concludes that a rights-based approach could be enhanced by greater engagement with existing human rights treaties and greater implementation of soft-law principles, and that no new instrument is required.
Abstract:
High-stakes testing has become an important element of the Australian educational landscape. As one part of the neo-liberal paradigm in which beliefs in the individual and the free market are paramount, it raises the concern of how school leaders can respond to this phenomenon in an ethical manner. Ethics and ethical leadership have increased in prominence both in the educational administration literature and in the media (Cranston, Ehrich, & Kimber, 2006). In this paper we consider ethical theories on which school principals can draw, not only in the leadership of their own schools but also in their relationships with other schools. We provide an example of a school leader sharing a successful intervention with other schools, illustrating that school leaders can create spaces for promoting the public good within the context of high-stakes testing.
Abstract:
Access to transport systems, and the connections such systems provide to essential economic and social activities, are critical in determining households' transportation disadvantage levels. In spite of developments in better identifying transportation-disadvantaged groups, the lack of effective policies has allowed the issue to persist as a significant problem. This paper undertakes a pilot case investigation as a test bed for a new approach developed to reduce transportation policy shortcomings. The approach, the ‘disadvantage-impedance index’, aims to ease transportation disadvantage by employing representative parameters to measure the differences between policy alternatives run in a simulation environment. Implemented in the Japanese town of Arao, the index uses trip-making behaviour and resident stated-preference data. The results of the index reveal that even a slight improvement in accessibility and travel quality indicators makes a significant difference in easing disadvantage. The index, integrated into a four-step model, proves to be highly robust and useful for quick diagnosis, capturing effective actions, and developing potentially efficient policies.
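The abstract does not give the index formula, but the comparison logic it describes can be sketched as a weighted-indicator calculation over policy alternatives, as below. The indicator names, weights, scores and alternatives are all hypothetical; the paper derives its parameters from trip-making behaviour and stated-preference data.

# Hypothetical sketch of comparing policy alternatives with a
# 'disadvantage-impedance' style index. All values are invented.

WEIGHTS = {"access_time": 0.4, "service_frequency": 0.3, "travel_quality": 0.3}

# Normalised indicator scores in [0, 1]; higher = less disadvantage.
alternatives = {
    "status quo":        {"access_time": 0.40, "service_frequency": 0.35, "travel_quality": 0.50},
    "more bus services": {"access_time": 0.45, "service_frequency": 0.55, "travel_quality": 0.55},
    "demand-responsive": {"access_time": 0.60, "service_frequency": 0.50, "travel_quality": 0.60},
}

def impedance(scores):
    """Residual impedance: 1 minus the weighted indicator score."""
    return 1.0 - sum(WEIGHTS[k] * v for k, v in scores.items())

baseline = impedance(alternatives["status quo"])
for name, scores in alternatives.items():
    idx = impedance(scores)
    print(f"{name:18s} impedance {idx:.3f}  easing vs status quo {baseline - idx:+.3f}")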
Abstract:
This study examined the effect of an educational intervention utilizing principles of cognitive apprenticeship on students’ ability to apply clinical reasoning skills within the context of a purpose-built clinical vignette. A quasi-experimental, non-equivalent control-group design was used to evaluate the effect of the educational intervention on students’ accuracy, inaccuracy and self-confidence in clinical reasoning. This study makes an important contribution to nursing education by providing evidence to understand how best to facilitate nursing students’ development of clinical reasoning.
Abstract:
This paper reports on the results of a project aimed at creating a research-informed, pedagogically reliable, technology-enhanced learning and teaching environment that would foster engagement with learning. A first-year mathematics for engineering unit offered at a large, metropolitan Australian university provides the context for this research. As part of the project, the unit was redesigned using a framework that employed flexible, modular, connected e-learning and teaching experiences. The researchers, interested in an ecological perspective on educational processes, grounded the redesign principles in probabilistic learning design (Kirschner et al., 2004). The effectiveness of the redesigned environment was assessed through the lens of the notion of affordance (Gibson, 1977, 1979; Greeno, 1994; Good, 2007). A qualitative analysis of the questionnaire distributed to students at the end of the teaching period provided insight into factors impacting on the successful creation of an environment that encourages complex, multidimensional and multilayered interactions conducive to learning.
Abstract:
Large Display Arrays (LDAs) use Light Emitting Diodes (LEDs) in order to inform a viewing audience. A matrix of individually driven LEDs allows the area represented to display text, images and video. LDAs have undergone rapid development over the past 10 years in both the modular and semi-flexible formats. This thesis critically analyses the communication architecture and processor functionality of current LDAs and presents an alternative method: Scalable Flexible Large Display Arrays (SFLDAs). SFLDAs are more adaptable to a variety of applications because of enhancements in scalability and flexibility. Scalability is the ability to configure SFLDAs from 0.8 m² to 200 m². Flexibility is increased functionality within the processors to handle changes in configuration, and the use of a communication architecture that standardises two-way communication throughout the SFLDA. While common video platforms such as Digital Video Interface (DVI), Serial Digital Interface (SDI), and High Definition Multimedia Interface (HDMI) were considered as solutions for the communication architecture of SFLDAs, so too were modulation, fibre optics, capacitive coupling and Ethernet. From an analysis of these architectures, Ethernet was identified as the best solution. The use of Ethernet as the communication architecture in SFLDAs means that both hardware and software modules are capable of interfacing to the SFLDAs. The Video to Ethernet Processor Unit (VEPU), Scoreboard, Image and Control Software (SICS) and Ethernet to LED Processor Unit (ELPU) have been developed to form the key components in designing and implementing the first SFLDA. Data throughput rate and spectrophotometer tests were used to measure the effectiveness of Ethernet within the SFLDA constructs. The results of testing and analysis of these architectures showed that Ethernet satisfactorily met the requirements of SFLDAs.
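The VEPU-to-ELPU path implies a frame-transport layer running over Ethernet. The Python sketch below shows one plausible shape for such a layer, pushing a panel frame as sequence-numbered UDP packets; the packet layout, port number, addresses and panel size are assumptions for illustration, not details taken from the thesis.

import socket
import struct

PANEL_W, PANEL_H = 64, 32           # hypothetical module resolution
ELPU_ADDR = ("192.168.1.50", 5005)  # hypothetical ELPU address

def send_frame(sock, frame_id, rgb_bytes):
    """Send one frame as sequence-numbered UDP packets of <= 1200 bytes."""
    header = struct.Struct("!HHH")  # frame id, packet index, packet count
    chunk = 1200
    chunks = [rgb_bytes[i:i + chunk] for i in range(0, len(rgb_bytes), chunk)]
    for idx, payload in enumerate(chunks):
        sock.sendto(header.pack(frame_id, idx, len(chunks)) + payload, ELPU_ADDR)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Solid red test frame: 3 bytes per pixel, row-major
    frame = bytes([255, 0, 0]) * (PANEL_W * PANEL_H)
    send_frame(sock, frame_id=1, rgb_bytes=frame)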
Abstract:
Big Datasets are ubiquitous, but they are often notoriously difficult to analyse because of their size, heterogeneity, history and quality. The purpose of this paper is to open a discourse on the use of modern experimental design methods to analyse Big Data in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has wide generality and advantageous inferential and computational properties. In particular, the principled experimental design approach is shown to provide a flexible framework for analysis that, for certain classes of objectives and utility functions, delivers near-equivalent answers compared with analyses of the full dataset under a controlled error rate. It can also provide a formalised method for iterative parameter estimation, model checking, identification of data gaps and evaluation of data quality. Finally, it has the potential to add value to other Big Data sampling algorithms, in particular divide-and-conquer strategies, by determining efficient sub-samples.
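As one concrete instance of design-based subsampling, the sketch below greedily selects a D-optimal subsample for a linear model and compares the coefficients fitted on the subsample with the full-data fit. This is a generic textbook construction under invented synthetic data, not the authors' algorithm.

import numpy as np

rng = np.random.default_rng(1)

# Synthetic 'big' dataset: linear model with noise
n, p, k = 100_000, 5, 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ rng.normal(size=p) + rng.normal(scale=0.5, size=n)

def greedy_d_optimal(X, k, n_candidates=5_000):
    """Greedily select k rows maximising det(X_s' X_s) (D-optimality).
    A generic design-based subsampling sketch, not the paper's method."""
    candidates = rng.choice(len(X), size=n_candidates, replace=False)
    M_inv = np.eye(X.shape[1]) * 1e6   # near-flat prior (ridge-like start)
    chosen = []
    for _ in range(k):
        # The row maximising x' M_inv x gives the largest determinant gain
        gains = np.einsum("ij,jk,ik->i", X[candidates], M_inv, X[candidates])
        j = int(np.argmax(gains))
        x = X[candidates[j]]
        chosen.append(candidates[j])
        candidates = np.delete(candidates, j)
        Mx = M_inv @ x                 # Sherman-Morrison rank-1 update
        M_inv -= np.outer(Mx, Mx) / (1.0 + x @ Mx)
    return np.array(chosen)

idx = greedy_d_optimal(X, k)
beta_sub, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
print("max coefficient gap, subsample vs full fit:",
      float(np.abs(beta_sub - beta_full).max()))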