926 results for new categorical imperative


Relevance: 20.00%

Publisher:

Abstract:

Recently, many international tertiary education programs have capitalised on the value that design and business can create at their intersection (Martin, 2009; Brown, 2008; Bruce and Bessant, 2002; Manzini, 2009). This paper discusses the role that two teaching units – New Product Development and Design Led Innovation – play in forming the understanding of commercialisation needed in today's Industrial Design education. These units are taught consecutively in the later years of the Bachelor of Industrial Design program at the Queensland University of Technology, Brisbane, Australia. In this paper, each teaching unit is discussed in detail and then as a whole, to establish the base of knowledge students need to fully capitalise on the value design has in business, and to produce a more capable Industrial Design graduate of the future.

Relevance: 20.00%

Publisher:

Abstract:

There has been a recent surge of interest in cooking skills across a diverse range of fields, such as health, education and public policy. There appears to be an assumption that cooking skills are in decline and that this is having an adverse impact on individual health and well-being, and on family wholesomeness. The problematisation of cooking skills is not new, and can be seen in a number of historical developments that have specified particular pedagogies about food and eating. The purpose of this paper is to examine pedagogies on cooking skills and the importance accorded to them. The paper draws on Foucault's work on governmentality. Using examples from the USA, UK and Australia, the paper demonstrates the ways in which authoritative discourses on the know-how and the know-what of food and cooking – called here 'savoir fare' – are developed and promulgated. These discourses, and the moral panics in which they are embedded, require individuals to make choices about what to cook and how to cook it, and in doing so establish moral pedagogies concerning good and bad cooking. The development of food literacy programmes, which see cooking skills as life skills, further extends the obligation to 'cook properly' to wider populations. The emphasis on cooking knowledge and skills has ushered in new forms of government: first, through a relationship between expertise and politics, readily visible in the authority that underpins the need to develop skills in food provisioning and preparation; second, through a new pluralisation of 'social' technologies that invites a range of private-public interest through, for example, television cooking programmes featuring cooking skills, albeit set in a particular milieu of entertainment; and last, through a new specification of the subject, seen in the formation of a choosing subject who must problematise food choice in relation to expert advice and guidance.
A governmentality focus shows that, as discourses develop about the correct level of 'savoir fare', new discursive subject positions are opened up. Armed with an understanding of what counts as expert-endorsed, acceptable food knowledge, subjects judge themselves through self-surveillance. The result is a powerful food and family morality that is both disciplined and disciplinary.

Relevance: 20.00%

Publisher:

Abstract:

This article focuses on the evaluation of thesis supervision and highlights the vast range of problem areas presently documented as significant areas of concern for both graduate students and their supervisors. Additionally, the authors report on a study completed in 1995, which surveyed all Australian universities about current evaluative practices in postgraduate supervision. The conclusion of this study was that the conduct of such evaluations appears to be minimal and is primarily designed to obtain an indicator of the general “health” of a university's postgraduate supervision rather than to foster improved supervisory practices. As part of the same study, the authors conducted University faculty-based student and supervisor focus groups. Key issues emerging from these focus groups were: (1) the importance of relational aspects of supervision as the student communicates over the long term with one or more supervisors; (2) the importance of systematic feedback, monitoring, and evaluation to the supervisory process; and (3) the lack of strategies to facilitate this evaluative feedback process. On the basis of these findings, the authors designed evaluative strategies to facilitate regular ongoing feedback between students and supervisors.

Relevance: 20.00%

Publisher:

Abstract:

Since the architectural design studio learning environment was first established in the early 19th century at the École des Beaux-Arts in Paris, there has been a complete transformation in how the discipline of architecture is practised and how students of architecture acquire information. Digital technologies allow students to access information instantly, and learning is no longer confined to the rigid boundaries of a physical campus environment. In many schools of architecture in Australia, however, the physical design studio learning environments remain largely unchanged. Many could be mistaken for environments last refurbished 30 years ago, devoid of any significant technological intervention. While some teaching staff are eagerly embracing new digital technologies and attempting to modify their pedagogical approaches, the physical design studio learning environment is resistant to such efforts. In a study aimed at better understanding how staff and students adapt to new blended learning environments, a group of 165 second-year architecture students at a large school of architecture in Australia were separated into two different design studio learning environments: 70% of students were allocated to a traditional design studio setting and 30% to a new, high-technology-embedded prototype digital learning laboratory. The digital learning laboratory was purpose-designed for the case-study users, adapted Student-Centred Active Learning Environment for Undergraduate Programs [SCALE-UP] principles, and was built as part of a larger university research project. The architecture students attended the same lectures, followed the same studio curriculum and completed the same pieces of assessment; the only major differences were the teaching staff and the physical environment within which the studios were conducted.
At the end of the semester, all staff and students were asked to complete a questionnaire about their experiences and preferences within the two respective learning environments. The questionnaire responses represented the opinions of all 10 teaching staff and over 70% of the students. Using a qualitative grounded theory approach, data were coded, extrapolated and compared to reveal emerging key themes. These key themes formed the basis for in-depth interviews and focus groups with teaching staff and students, allowing the researchers to understand the data in more detail. The results verified what had become increasingly evident during the course of the semester: an underlying negative resistance, by both staff and students, to the new digital studio learning environment. Many participants openly exhibited a yearning to return to the traditional design studio learning environment, particularly when the new technology caused frustration by being unreliable or failing altogether. This paper reports on the study, discusses the negative resistance and explores its major contributors. The researchers are not aware of any similar previous studies across these particular settings and believe that the study offers a necessary and important contribution to emergent research on adaptation to new digital learning environments.

Relevance: 20.00%

Publisher:

Abstract:

The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operational downtime, and safety hazards. Predicting survival time and the probability of failure at a future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to obtain in real-life situations due to poor data management, effective preventive maintenance, and the small populations of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspension data. These data contain significant information about the state and health of an asset: condition indicators reflect the level of degradation of an asset, while operating environment indicators accelerate or decelerate its lifetime. When these data are available, an alternative to traditional reliability analysis is to model the condition indicators, the operating environment indicators, and their failure-generating mechanisms using a covariate-based hazard model. The literature review indicates that a number of covariate-based hazard models have been developed, all based on the principal theory of the Proportional Hazard Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics, and, due to the prominence of PHM, attempts at developing alternative models have to some extent been stifled, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models fail to fully utilise the three types of asset health information – failure event data (i.e. observed and/or suspended), condition data, and operating environment data – in a single model for more effective hazard and reliability prediction. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data: condition indicators act as response (dependent) variables, whereas operating environment indicators act as explanatory (independent) variables. Nevertheless, these non-homogeneous covariate data were modelled in the same way in the existing covariate-based hazard models. The related, and yet more imperative, question is how both types of indicator should be effectively modelled and integrated into a covariate-based hazard model.
This work presents a new approach to addressing these challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three sources of asset health information into hazard and reliability prediction, and also derives the relationship between actual asset health and both condition and operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and the condition indicators. Condition indicators provide information about the health of an asset; they therefore update and reform the baseline hazard of EHM according to the asset's health state at a given time t. Examples of condition indicators include the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component, to name but a few.
Operating environment indicators in this model are failure accelerators and/or decelerators: they enter the covariate function of EHM and may increase or decrease the hazard relative to the baseline. These indicators are caused by the environment in which an asset operates and are not explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of the operating environment indicators may be nil in EHM, the condition indicators always contribute, because they are observed and measured for as long as the asset remains operational. EHM has several advantages over the existing covariate-based hazard models. First, it utilises three different sources of asset health data (population characteristics, condition indicators, and operating environment indicators) to predict hazard and reliability effectively. Second, it explicitly investigates the relationship between the condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. Depending on the sample size of failure/suspension times, EHM takes two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (the Weibull distribution) for the baseline hazard. In many industrial applications, however, failure event data are sparse and their analysis involves complex distributional shapes about which little is known. To avoid the restrictive assumption of a specified lifetime distribution for failure event histories, the non-parametric EHM, a distribution-free model, has also been developed. This two-form development is another merit of the model.
A case study using laboratory experiment data was conducted to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models was appraised by comparing their estimates with those of the other existing covariate-based hazard models; the comparison demonstrated that both the semi-parametric and non-parametric EHMs outperform the existing models. Future research directions are also identified, including a new parameter estimation method for the case of time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
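The hazard structure described in this abstract can be sketched in code. The following is an illustrative reading only, not the authors' implementation: a semi-parametric EHM with a Weibull baseline hazard modulated by a condition indicator, multiplied by an exponential covariate function of the operating environment indicators. All function names, parameter values, and the simple rectangle-rule integration are assumptions.

```python
import math

def weibull_baseline(t, condition, shape=2.0, scale=1000.0, gamma=0.001):
    # Weibull baseline hazard that is a function of BOTH time and a
    # condition indicator (e.g. a vibration level): the condition term
    # "updates and reforms" the baseline according to asset health.
    base = (shape / scale) * (t / scale) ** (shape - 1)
    return base * math.exp(gamma * condition)

def ehm_hazard(t, condition, environment, beta):
    # Operating environment indicators (loads, stresses, ...) act as
    # failure accelerators/decelerators via the covariate function
    # exp(beta . w); with all w = 0 the hazard equals the baseline.
    covariate = math.exp(sum(b * w for b, w in zip(beta, environment)))
    return weibull_baseline(t, condition) * covariate

def reliability(times, conditions, environments, beta):
    # R(t) = exp(-cumulative hazard), approximated here with a simple
    # rectangle rule over the observation times.
    cum_hazard, prev_t = 0.0, 0.0
    for t, z, w in zip(times, conditions, environments):
        cum_hazard += ehm_hazard(t, z, w, beta) * (t - prev_t)
        prev_t = t
    return math.exp(-cum_hazard)
```

Under this reading, setting every operating environment indicator to zero recovers the baseline reliability, while a sustained positive stress (with a positive coefficient) lowers it, matching the accelerator/decelerator role the abstract assigns to these covariates.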

Relevance: 20.00%

Publisher:

Abstract:

Power system restoration after a large-area outage involves many factors, and the procedure is usually very complicated. A decision-making support system can therefore be developed to find the optimal black-start strategy. To evaluate candidate black-start strategies, a number of indices, usually both qualitative and quantitative, are employed. However, these indices may not be directly synthesizable, and different degrees of interaction may exist among them. In the existing black-start decision-making methods, qualitative and quantitative indices cannot be well synthesized, and the interactions among different indices are not taken into account. The vague set, an extended version of the well-developed fuzzy set, can be employed to deal with decision-making problems with interacting attributes. Given this background, the vague set is first employed in this work to represent the indices and facilitate comparisons among them. Then, the concept of a vague-valued fuzzy measure is presented, and on that basis a mathematical model for black-start decision-making is developed. Compared with the existing methods, the proposed method can deal with the interactions among indices and represent fuzzy information more reasonably. Finally, an actual power system is used to demonstrate the basic features of the developed model and method.
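To make the vague-set idea concrete, here is a minimal sketch of how a vague value bounds a membership grade and how candidate black-start strategies might then be ranked. This illustrates the general vague-set formalism only: the simple mean-of-scores aggregation below is a stand-in for the paper's vague-valued fuzzy measure, which additionally captures interactions among indices, and all names are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VagueValue:
    # A vague value records evidence for (truth) and against (false_)
    # a judgement, with truth + false_ <= 1; the membership grade is
    # only known to lie in the interval [truth, 1 - false_].
    truth: float
    false_: float

    def interval(self):
        return (self.truth, 1.0 - self.false_)

    def score(self):
        # A common way to compare vague values: net evidence t - f.
        return self.truth - self.false_

def rank_strategies(strategies):
    # strategies: {name: list of VagueValue, one per evaluation index}.
    # Rank by mean score; the paper's vague-valued fuzzy measure would
    # replace this simple mean to account for index interactions.
    mean = lambda vs: sum(v.score() for v in vs) / len(vs)
    return sorted(strategies, key=lambda name: mean(strategies[name]), reverse=True)
```

For example, a strategy rated (0.7, 0.1) on an index is only known to have a membership grade somewhere in [0.7, 0.9]; the 0.2 gap is the undecided portion that an ordinary fuzzy membership value cannot express.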

Relevance: 20.00%

Publisher:

Abstract:

Carbonaceous chondrites provide valuable information as they are the least altered examples of early Solar System material [1]. The matrix constitutes a major proportion of carbonaceous chondrites. Despite many past attempts, unambiguous identification of the minerals in the matrix has not been totally successful [2], mainly due to the extremely fine-grained nature of the matrix phases. Recently, progress in the characterisation of these phases has been made by electron diffraction studies [3,4]. We present here the direct observation, by high-resolution imaging, of phases in carbonaceous chondrite matrices. We used ion-thinned sections from the Murchison C2(M) meteorite for transmission electron microscopy. The Murchison matrix contains both ordered and disordered intergrowths of serpentine-like and brucite-like layers. Such mixed-layer structures are new types of layer silicates. © 1979 Nature Publishing Group.

Relevance: 20.00%

Publisher:

Abstract:

Research indicates that enrolments in separate special educational settings for students with disruptive behaviour have increased in a number of educational jurisdictions internationally. Recent analysis of school enrolment data has identified a similar increase in the New South Wales (NSW) government school sector; however, questions have been raised as to the use and effectiveness of these settings. To situate the NSW experiment with behaviour schools in a broader context, the paper begins with a review of the international research literature. This is followed by a discussion of the NSW experience, with the aim of identifying parallels and gaps in the research. The paper concludes by outlining important questions and directions for research to better understand and improve the educational experiences and outcomes of disruptive, disaffected students in Australia's largest school system.

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we examine the increase in segregated placements in the New South Wales government school sector. Using disaggregated enrolment data, we point to the growing over-representation of boys in special schools and classes, particularly those of a certain age in certain support categories. In the discussion that follows, we question the role of special education in the development of new and additional forms of being "at risk." In effect, we invert the traditional concept by asking: Who is at risk of what? In focusing on the containment of risk, are modern practices of diagnosis and segregation perpetuating risks that already disproportionately affect certain groups of individuals? Do these perceptions of and responses to risk in local schools now place these students at greater personal risk of school failure and a future marked by social exclusion? And, finally, is that risk worth the cost?

Relevance: 20.00%

Publisher:

Abstract:

This paper discusses the findings of a research study that used semi-structured interviews to explore the views of primary school principals on inclusive education in New South Wales, Australia. Content analysis of the transcript data indicates that principals’ attitudes towards inclusive education and their success in engineering inclusive practices within their school are significantly affected by their own conception of what “inclusion” means, as well as the characteristics of the school community, and the attitudes and capacity of staff. In what follows, we present two parallel conversations that arose from the interview data to illustrate the main conceptual divisions existing between our participants’ conceptions of inclusion. First, we discuss the act of “being inclusive” which was perceived mainly as an issue of culture and pedagogy. Second, we consider the mechanics of “including,” which reflected a more instrumentalist position based on perceptions of individual student deficit, the level of support they may require and the amount of funding they can attract.

Relevance: 20.00%

Publisher:

Abstract:

Over the last two decades, moves toward "inclusion" have prompted change in the formation of education policies, schooling structures and pedagogical practice. Yet exclusion through the categorisation and segregation of students with diverse abilities has grown, particularly for students with challenging behaviour. This paper considers what has happened to inclusive education by focusing on three educational jurisdictions known to be experiencing different rates of growth in the identification of special educational needs: New South Wales (Australia), Alberta (Canada) and Finland (Europe). In our analysis, we consider the effects of competing policy forces that appear to thwart the development of inclusive schools in two of our case-study regions.

Relevance: 20.00%

Publisher:

Abstract:

The last few decades have witnessed a broad international movement towards the development of inclusive schools through targeted special education funding and resourcing policies. Student placement statistics are often used as a barometer of policy success but they may also be an indication of system change. In this paper, trends in student enrolments from the Australian state of New South Wales are considered in an effort to understand what effect inclusive education has had in this particular region of the world.

Relevance: 20.00%

Publisher:

Abstract:

Flexible information exchange is critical to successful design-analysis integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this article we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. We then discuss how a flexible, user-friendly shared mapping process supports non-programmers in creating these custom connections. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We then discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
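The kind of bottom-up, user-controlled linkage described above can be sketched as a small process network in which a declarative field mapping, rather than hard-coded translation logic, connects one tool's output to another's input. This is a hypothetical illustration of the general approach, not the proposed platform itself; every field name and function below is an assumption.

```python
def make_mapper(field_map, transforms=None):
    # field_map: {source_field: target_field}, authored by the user;
    # transforms: optional {target_field: function} for unit changes etc.
    # A non-programmer could edit field_map without touching the code.
    transforms = transforms or {}
    def mapper(record):
        out = {}
        for src, dst in field_map.items():
            value = record[src]
            if dst in transforms:
                value = transforms[dst](value)
            out[dst] = value
        return out
    return mapper

def run_pipeline(records, *stages):
    # Compose discrete tools as a linear process network: each stage
    # consumes and produces plain dictionaries, so stages stay decoupled.
    for stage in stages:
        records = [stage(r) for r in records]
    return records
```

For instance, a user-authored mapping could rename a design tool's `wall_height_mm` field to an analysis tool's `height_m` input, attaching a millimetres-to-metres conversion, without either tool knowing about the other.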

Relevance: 20.00%

Publisher:

Abstract:

Flexible information exchange is critical to successful design integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this paper we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.