822 results for Project-based Organisation
Abstract:
The implementation of systematic peer review, both as a professional development activity and as a support for educational design activities, is under-utilised in many Australian higher education institutions. This case study reports on the first stages of planning and implementation of an institution-wide project to enhance teaching and learning quality at a remote and regional university, where one of the major strategies for improvement is peer review. Through a systematic process of staff engagement in peer review, within and from outside the organisation, a substantial change in flexible learning is envisaged. A mix of new and different learning spaces is to be used in the project, including blended learning spaces for academic development. This paper describes the research framework that will guide the peer review process and examines the early findings of the design-based research. Leadership, awareness raising and the development of a supportive community of inquiry are seen as key components for successful implementation of peer review. In addition, unique contextual elements add to the complexity of designing for transformative change within such a relatively new organisation.
Abstract:
Our long-term program of research has considered the relationships between teachers’ work and identities, literacy pedagogies and schooling, particularly in high-poverty communities. Over the past decade, we have worked with teachers to consciously explore with them the possible productive synergies between critical literacy and place-based pedagogies, and the affordances of multimodal and digital literacies for students’ engagement with the places where they live and learn. These studies have been undertaken with teachers working and living in various locales—from the urban fringe to inner suburban areas undergoing urban renewal, to rural and regional communities where poverty and the politics of place bring certain distinctive opportunities and constraints to bear on pedagogy for social justice. There is now wider recognition that “social justice” may need rethinking to foreground the nonhuman world and the politics of places, people, and environments in terms of “eco-social justice” (Green 2010; Gruenewald 2003b) or spatial justice (Soja 2011). In this chapter, we explore place as a site of knowing and as an object of study as developed through the Special Forever project by teachers in schools located in the Murray-Darling Basin bioregion. Putting the environment at the center of the literacy curriculum inevitably draws teachers into the politics of place and raises questions concerning what is worth preserving and what should be transformed. We consider how the politics of place both constrains and opens up possibilities for pedagogy for eco-social justice and review the pedagogical work that one teacher, Hannah, undertook with her upper primary class.
Abstract:
Custom designed for display on the Cube Installation situated in the new Science and Engineering Centre (SEC) at QUT, the ECOS project is a playful interface that uses real-time weather data to simulate how a five-star energy building operates in climates all over the world. In collaboration with the SEC building managers, the ECOS project incorporates the building’s energy consumption and generation data into an interactive simulation, which is both engaging to users and highly informative, and which invites play and reflection on the roles of green buildings. ECOS focuses on the principle that humans can have both a positive and a negative impact on ecosystems, with both local and global consequences. The ECOS project draws on the practice of Eco-Visualisation, a term used to encapsulate the important merging of environmental data visualisation with the philosophy of sustainability. Holmes (2007) uses the term Eco-Visualisation (EV) to refer to data visualisations that ‘display the real time consumption statistics of key environmental resources for the goal of promoting ecological literacy’. EVs are commonly artifacts of interaction design, information design, interface design and industrial design, but are informed by various intellectual disciplines that have shared interests in sustainability. As a result of surveying a number of projects, Pierce, Odom and Blevis (2008) outline strategies for designing and evaluating effective EVs, including ‘connecting behavior to material impacts of consumption, encouraging playful engagement and exploration with energy, raising public awareness and facilitating discussion, and stimulating critical reflection.’ Similarly, Froehlich (2010) and his colleagues use the term ‘Eco-feedback technology’ to describe the same field.
‘Green IT’ is another variation, which Tomlinson (2010) describes as a ‘field at the juncture of two trends… the growing concern over environmental issues’ and ‘the use of digital tools and techniques for manipulating information.’ The ECOS project team is guided by these principles but, more importantly, proposes an example of how these principles may be achieved. The ECOS project presents a simplified interface to the very complex domain of thermodynamic and climate modelling. From a mathematical perspective, the simulation can be divided into two models, which interact and compete for balance: the comfort of ECOS’ virtual denizens, and the ecological and environmental health of the virtual world. The comfort model is based on the study of psychrometrics, specifically as it relates to human comfort. This provides baseline micro-climatic values for what constitutes a comfortable working environment within the QUT SEC buildings. The difference between the ambient outside temperature (as determined by polling the Google Weather API for live weather data) and the internal thermostat of the building (as set by the user) allows us to estimate the energy required to either heat or cool the building. Once the energy requirements are ascertained, they are balanced against the ability of the building to produce enough power from green energy sources (solar, wind and gas) to cover its needs. The relative amount of energy produced by wind and solar can be calculated by, in the case of solar for example, considering the size of the panel and the amount of solar radiation it is receiving at any given time, which in turn can be estimated from the temperature and conditions returned by the live weather API. Some of these variables can be altered by the user, allowing them to attempt to optimise the health of the building.
The variables that can be changed are the budget allocated to green energy sources (solar panels and wind generators) and the air-conditioning setting that controls the internal building temperature. These variables influence the energy input and output variables, modelled on real energy usage statistics drawn from the SEC data provided by the building managers.
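The temperature-difference and solar-generation estimates described above can be sketched as follows. This is a minimal illustration only: the function names, the linear kW-per-degree coefficient and the panel efficiency are assumptions, not values from the actual ECOS simulation.

```python
# Sketch of the ECOS energy-balance idea. The kw_per_degree coefficient
# and the panel efficiency are assumed values for illustration only.

def hvac_energy_kw(outside_c, thermostat_c, kw_per_degree=1.5):
    """Energy needed to close the gap between ambient and setpoint."""
    return abs(outside_c - thermostat_c) * kw_per_degree

def solar_output_kw(panel_area_m2, irradiance_kw_m2, efficiency=0.18):
    """Generation estimated from panel size and current irradiance."""
    return panel_area_m2 * irradiance_kw_m2 * efficiency

def building_balance_kw(outside_c, thermostat_c, panel_area_m2,
                        irradiance_kw_m2, wind_kw=0.0, gas_kw=0.0):
    """Positive balance: green sources cover the heating/cooling demand."""
    demand = hvac_energy_kw(outside_c, thermostat_c)
    supply = solar_output_kw(panel_area_m2, irradiance_kw_m2) + wind_kw + gas_kw
    return supply - demand

# e.g. a warm day (32 C outside, 23 C setpoint) with 200 m^2 of panels
balance = building_balance_kw(32.0, 23.0, 200.0, 0.8)
```

Adjusting the thermostat or the panel budget in the interface corresponds to changing the arguments of this balance, which is what lets users attempt to optimise the building's health.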
Abstract:
The European Early Lung Cancer (EUELC) project aims to determine whether specific genetic alterations occurring in lung carcinogenesis are detectable in the respiratory epithelium. In order to pursue this objective, nonsmall cell lung cancer (NSCLC) patients with a very high risk of developing progressive lung cancer were recruited from 12 centres in eight European countries: France, Germany, southern Ireland, Italy, the Netherlands, Poland, Spain and the UK. In addition, NSCLC patients were followed up every 6 months for 36 months. A European Bronchial Tissue Bank was set up at the University of Liverpool (Liverpool, UK) to optimise the use of biological specimens. The molecular-pathological investigations were subdivided into specific work packages that were delivered by EUELC Partners. The work packages encompassed mutational analysis, genetic instability, methylation profiling, expression profiling utilising immunohistochemistry and chip-based technologies, as well as in-depth analysis of the FHIT and RARβ genes, the telomerase catalytic subunit hTERT and genotyping of susceptibility genes in specific pathways. The EUELC project engendered a tremendous collaborative effort, and it enabled the EUELC Partners to establish protocols for assessing molecular biomarkers in early lung cancer with a view to using such biomarkers for early diagnosis and as intermediate end-points in future chemopreventive programmes. Copyright © ERS Journals Ltd 2009.
Abstract:
Democracy is a multi-dimensional concept, ranging from definitions based exclusively on institutional frameworks (for example, Held, 2005, Przeworski, Alvarez, Cheibub and Limongi, 2000) to complex and integrated measures that include political and civil rights, democratic practices, values and, finally, a diverse set of institutional arrangements in society, including welfare, education, industrial relations and the legal system (Inglehart and Welzel, 2005, Jaggers and Gurr, 1995, O'Donnell, Cullel and Iazetta, 2004). This reflects the range of and distinction between merely formal electoral democracy and genuinely 'effective liberal democracy' (Inglehart and Welzel, 2005: 149), where democracy is firmly embedded not only in its institutions but in the values of its citizenry. Evidence from cross-national research confirms that formal democratic institutions, different dimensions of effective democracy, and democratic values are indeed strongly linked (Inglehart and Welzel, 2005: 154, Jaggers and Gurr, 1995: 446). Democracy is more than just a set of institutions, rules and mechanisms: it is a set of core values engrained in the 'lived experience' of its citizens. Core values of democracies are individual autonomy and egalitarianism, tolerance of diversity, and freedom from oppression for both individuals and institutions. Democracies restrain their governments by the rule of law and grant their citizens equal access to and equal treatment by legal institutions. Among these institutions, criminal justice and the treatment of those who have violated rules and regulations represent sensitive seismographs for the quality of effective democracies and for the ways in which democracies realise their core values.
Abstract:
The purpose of this paper is to present theoretical lenses that explain the relation between work motivation and project management success in the case of temporary organizations such as projects. This paper is part of a larger research study that first empirically identifies the constructs of work motivation in temporary organizations, and then empirically determines the relation between work motivation and project management success. In the current paper, we briefly review theories of work motivation from the work design school. These theories are predominantly drawn from the industrial/organizational psychology literature. We then consider the recent research on the Nine Schools of Project Management as a point of departure to review theory on project management success. These theoretical perspectives are drawn from the project management literature. We then illustrate the points of overlap between the theories drawn from these two disciplines. This review helps us to position our research study within the industrial/organizational psychology and project management literature as a cross-discipline study.
Abstract:
This paper summarises the development and testing of the 'store-turnover' method, a non-invasive dietary survey methodology for quantitative measurement of food and nutrient intake in remote, centralised Aboriginal communities. It then describes the use of the method in the planning, implementation and evaluation of a community-based nutrition intervention project in a small Aboriginal community in the Northern Territory. During this project, marked improvements in both the dietary intake of the community and biological indicators of nutritional health (including vitamin status and the degree and prevalence of several risk factors for non-communicable disease) were measured over a 12-month period following the development of intervention strategies with the community. Although these specific strategies are presented, emphasis is directed towards the process involved, particularly the evaluation procedures used to monitor all stages of the project with the community.
Abstract:
Lean strategies have been developed to eliminate or reduce waste and thus improve operational efficiency in a manufacturing environment. In practice, however, manufacturers encounter difficulties in selecting appropriate lean strategies within their resource constraints and in quantitatively evaluating the perceived value of manufacturing waste reduction. This paper presents a methodology developed to quantitatively evaluate the contribution of lean strategies selected to reduce manufacturing wastes within the manufacturer's resource (time) constraints. A mathematical model has been developed for evaluating the perceived value of lean strategies for manufacturing waste reduction, and a step-by-step methodology is provided for selecting appropriate lean strategies to improve manufacturing performance within those constraints. A computer program has been developed in MATLAB to find the optimum solution. The proposed methodology and the developed model have been validated with the help of a case study. A 'lean strategy-wastes' correlation matrix is proposed to establish the relationship between manufacturing wastes and lean strategies. Using the correlation matrix and applying the proposed methodology and mathematical model, the authors obtained the optimised perceived value of reducing a manufacturer's wastes by implementing appropriate lean strategies within the manufacturer's resource constraints. Results also demonstrate that the perceived value of reducing manufacturing wastes can change significantly depending on the policies and product strategy adopted by a manufacturer. The proposed methodology can also be used in dynamic situations by changing the input to the MATLAB program. By identifying appropriate lean strategies for specific manufacturing wastes, a manufacturer can better prioritise implementation efforts and resources to maximise the success of implementing lean strategies in their organisation.
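The selection step can be illustrated with a small exhaustive search over a toy 'lean strategy-wastes' correlation matrix. The strategy names, scores and time costs below are invented for illustration; the paper's actual model is formulated and solved in MATLAB, not reproduced here.

```python
from itertools import combinations

# Hypothetical 'lean strategy-wastes' matrix: each strategy has an assumed
# implementation time and a contribution score per manufacturing waste.
strategies = {
    "5S":     {"time_h": 40, "scores": [0.6, 0.2, 0.1]},
    "Kanban": {"time_h": 80, "scores": [0.1, 0.7, 0.3]},
    "SMED":   {"time_h": 60, "scores": [0.2, 0.1, 0.8]},
}

def perceived_value(selected, waste_weights=(1.0, 1.0, 1.0)):
    """Sum of weighted waste-reduction scores for a set of strategies."""
    return sum(w * s
               for name in selected
               for w, s in zip(waste_weights, strategies[name]["scores"]))

def best_selection(time_budget_h):
    """Exhaustively search strategy subsets within the time constraint."""
    best, best_val = (), 0.0
    names = list(strategies)
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            cost = sum(strategies[n]["time_h"] for n in combo)
            if cost <= time_budget_h:
                val = perceived_value(combo)
                if val > best_val:
                    best, best_val = combo, val
    return best, best_val
```

Changing `waste_weights` corresponds to the paper's observation that a manufacturer's policies and product strategy alter the perceived value of the same waste reductions.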
Abstract:
Floods are among the most devastating events affecting primarily tropical, archipelagic countries such as the Philippines. With current predictions of climate change set to include rising sea levels, intensification of typhoon strength and a general increase in the mean annual precipitation throughout the Philippines, it has become paramount to prepare for the future so that the increased risk of floods in the country does not translate into more economic and human loss. Field work and data gathering were done within the framework of an internship at the former German Technical Cooperation (GTZ) in cooperation with the Local Government Unit of Ormoc City, Leyte, The Philippines, in order to develop a dynamic, computer-based flood model for the basin of the Pagsangaan River. To this end, different geo-spatial analysis tools such as PCRaster and ArcGIS, hydrological analysis packages and basic engineering techniques were assessed and implemented. The aim was to develop a dynamic flood model and use the development process to determine the required data, its availability and its impact on the results, as a case study for flood early warning systems in the Philippines. The hope is that such projects can help to reduce flood risk by including the results of worst-case scenario analyses and current climate change predictions in city planning for municipal development, monitoring strategies and early warning systems. The project was developed using a 1D-2D coupled model in SOBEK (Deltares hydrological modelling software package) and was also used as a case study to analyse and understand the influence of different factors such as land use, schematisation, time step size and tidal variation on the flood characteristics. Several sources of relevant satellite data were compared, such as Digital Elevation Models (DEMs) from ASTER and SRTM data, as well as satellite rainfall data from the GIOVANNI server (NASA) and field gauge data.
Different methods were used in the attempt to partially calibrate and validate the model, and finally to simulate and study two climate change scenarios based on scenario A1B predictions. It was observed that large areas currently considered not prone to floods will become low flood risk (0.1-1 m water depth). Furthermore, larger sections of the floodplains upstream of Liloan's Bridge will become moderate flood risk areas (1-2 m water depth). The flood hazard maps created during the development of the present project will be presented to the LGU, and the model will be used by GTZ's Local Disaster Risk Management Department to create a larger set of possible flood-prone areas related to rainfall intensity and to study possible improvements to the current early warning system and the monitoring of the basin section belonging to Ormoc City. Recommendations on further enhancement of the geo-hydro-meteorological data, to improve the model's accuracy mainly in areas of interest, will also be presented to the LGU.
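The depth bands reported above translate directly into a hazard-map classification. A minimal sketch follows; note that the abstract only names the low (0.1-1 m) and moderate (1-2 m) bands, so the "high" band above 2 m is an assumed extension for completeness:

```python
def flood_risk(depth_m):
    """Classify simulated inundation depth into risk bands.
    0.1-1 m -> low and 1-2 m -> moderate follow the study's reporting;
    the 'high' band above 2 m is an assumption, not from the abstract."""
    if depth_m < 0.1:
        return "not flood prone"
    if depth_m < 1.0:
        return "low"
    if depth_m < 2.0:
        return "moderate"
    return "high"
```

Applying such a classifier cell-by-cell to the model's maximum-depth raster is one way a hazard map like those handed to the LGU could be derived.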
Abstract:
Textual document sets have become an important and rapidly growing information source on the web. Text classification is one of the crucial technologies for information organisation and management, and it has attracted wide attention from researchers in different fields. In this paper, feature selection methods, implementation algorithms and applications of text classification are introduced first. However, because there is much noise in the knowledge extracted by current data-mining techniques for text classification, much uncertainty arises in the classification process, from both knowledge extraction and knowledge usage; more innovative techniques and methods are therefore needed to improve the performance of text classification. Further improving the process of knowledge extraction and the effective utilisation of the extracted knowledge remains a critical and challenging step. A Rough Set decision-making approach is proposed to more precisely classify textual documents that are difficult to separate with classic text classification methods. The purpose of this paper is to give an overview of existing text classification technologies; to demonstrate Rough Set concepts and a decision-making approach based on Rough Set theory for building a more reliable and effective text classification framework with higher precision; to set up an innovative evaluation metric, named CEI, which is effective for performance assessment in similar research; and to propose a promising research direction for addressing the challenging problems in text classification, text mining and related fields.
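The Rough Set idea behind such an approach can be sketched with lower and upper approximations over a toy document collection. The feature encoding and labels below are invented for illustration and are not the paper's data, algorithm or CEI metric:

```python
# Rough Set approximations: documents indiscernible by their features form
# equivalence classes; classes fully inside the target give the lower
# approximation (certain members), classes that merely intersect it give
# the upper approximation (possible members).

def approximations(universe, target, signature):
    classes = {}
    for doc in universe:
        classes.setdefault(signature(doc), set()).add(doc)
    lower, upper = set(), set()
    for eq in classes.values():
        if eq <= target:
            lower |= eq       # certainly in the target class
        if eq & target:
            upper |= eq       # possibly in the target class
    return lower, upper

# toy data: doc id -> (has_term_A, has_term_B); target = docs labelled 'sport'
features = {1: (1, 0), 2: (1, 0), 3: (0, 1), 4: (1, 1)}
sport = {1, 4}
lower, upper = approximations(features, sport, lambda d: features[d])
boundary = upper - lower  # the documents a classic classifier cannot separate
```

The boundary region is exactly where the documents "difficult to separate by the classic text classification methods" live, which is where a Rough Set decision step can be applied.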
Abstract:
Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to study. Its recent gain in popularity can be attributed in some degree to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as explicit data at a fine level of detail are used, and computer-intensive, as many interactions between agents, which can learn and have a goal, are required. With the growing availability of data and the increase in computer power, these concerns are fading. Nonetheless, being able to update or extend the model as more information becomes available can become problematic, because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and the consumers’ behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from the usual ABMs is that this ABM has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but the model itself. Using such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model.
Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into a) assets, which describe the entities’ physical characteristics, and b) agents, which describe their behaviour according to their goal and previous learning experiences. This approach diverges from the traditional approach, where both aspects are often conflated. It has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics remain the same – this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and what their aim is. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required depending on what simulation is to be run. For example, data can be used to describe the environment to which the agents respond – e.g. weather for solar panels – or to describe the assets and their relation to one another – e.g. the network assets. Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, which can be done sequentially or in parallel for faster execution. Building agent-based models in this way has proven fast when adding new complex behaviours, as well as new types of assets.
Simulations have been run to understand the potential impact of changes on the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. response to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains such as transport, which is planned as future work with the addition of electric vehicles.
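The asset/agent separation described above can be sketched as follows. This is a minimal Python illustration with invented class names and numbers; the actual framework is built as OSGi/Eclipse plugins with factories and a module manager, none of which is reproduced here.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BatteryAsset:
    """Physical characteristics only - reusable across simulations."""
    capacity_kwh: float
    depth_of_discharge: float  # usable fraction of capacity

class PeakShavingAgent:
    """Behaviour lives apart from the asset it drives; the same asset
    could be paired with a different agent without any change to it."""
    def __init__(self, asset, threshold_kw):
        self.asset = asset
        self.threshold_kw = threshold_kw

    def dispatch_kw(self, demand_kw):
        """Discharge only enough to shave demand above the threshold."""
        usable = self.asset.capacity_kwh * self.asset.depth_of_discharge
        return min(max(demand_kw - self.threshold_kw, 0.0), usable)

# the physical description is fixed; only the behaviour varies by purpose
battery = BatteryAsset(capacity_kwh=10.0, depth_of_discharge=0.8)
agent = PeakShavingAgent(battery, threshold_kw=5.0)
```

Swapping `PeakShavingAgent` for, say, a self-consumption agent would reuse the identical `BatteryAsset`, which is the composability benefit the abstract describes.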
Abstract:
Robotics has created opportunities for educators to teach concepts across Science, Technology, Engineering, and Mathematics (STEM). This is one of the reasons robotics is becoming increasingly common in primary and secondary classrooms in Australia. To enable pre-service teachers to design engaging STEM activities that incorporate these technologies, robotics is part of the teaching program in the primary education degree at Queensland University of Technology (QUT). A number of pre-service teachers also choose to extend their abilities by implementing robotics activities on field studies, in schools on a voluntary basis, and in outreach activities such as the Robotics@QUT project. The Robotics@QUT project is a support network developed to build the professional knowledge and capacity of classroom teachers in schools from a low SES area engaging in robotics-based STEM activities. Professional Development (PD) workshops are provided to teachers to build their knowledge and confidence in implementing robotics activities in their classrooms, loan kits are provided, and pre-service teacher visits are arranged to give the teachers ongoing support. A key feature of the project is the partnerships developed between the teachers and the pre-service teachers involved in the project. The purpose of this study was to ascertain how the teachers in the project perceived the value of the PD workshops and the pre-service teachers’ involvement, and what benefits involvement in the project brought to the pre-service teachers. Seventeen teachers completed a five-point (1-5) Likert-scale questionnaire regarding their involvement in the Robotics@QUT project. Teachers’ responses on the value of the project and the pre-service teacher support highlighted the benefits of the partnerships formed and provided insights into the value of the support provided by the pre-service teachers.
This paper also describes one pre-service teacher’s experience with the project and the perceived benefits from being involved.
Abstract:
This paper reports on an evaluation of a collaborative robotics engagement project involving teachers from local schools and an academic from Queensland University of Technology (QUT). Engaged community projects aim to build stronger relationships between universities and their local communities (Sandman, Williams & Abrams, 2009). This partnership leads to mutually beneficial outcomes, builds community capacity, and can focus on aspirations and access to higher education for school students (Scull & Cuthill, 2010). The Robotics@QUT project aimed to build a partnership between local teachers and the university in order to give students from a low SES area the opportunity to engage in robotics-based Science, Technology, Engineering, and Mathematics (STEM) activities. Students from low SES regions are underrepresented at university and less likely to pursue studies in these fields (Bradley, Noonan, Nugent, & Scales, 2008). Having teachers who provide engaging STEM activities is an important motivating factor for students to enjoy STEM and do well in STEM subjects (Tytler, Osborne, Williams, Tytler & Clark, 2008).
Abstract:
The Chinese government should be commended for its open, concerted, and rapid response to the recent H7N9 influenza outbreak. However, the first known case was not reported until 48 days after disease onset [1]. Although the difficulties in detecting the virus and the lack of suitable diagnostic methods have been the focus of discussion [2], systematic limitations that may have contributed to this delay have hardly been discussed. The detection speed of surveillance systems is limited by the highly structured nature of information flow and the hierarchical organisation of these systems. Flu surveillance usually relies on notification of laboratory-confirmed cases to a central authority, or on presentations to sentinel practices for flu-like illness. Each step in this pathway presents a bottleneck at which information and time can be lost; this limitation must be dealt with...
Abstract:
Executive Summary Emergency Departments (EDs) locally, nationally and internationally are becoming increasingly busy. Within this context, it can be challenging to deliver a health service that is safe, of high quality and cost-effective. Whilst various models that aim to measure ED ‘work’ or ‘activity’ are described within the literature, they are often not linked to a measure of the costs of providing that activity. It is important for hospital and ED managers to understand and apply this link so that optimal staffing and financial resourcing can be justifiably sought. This research is timely given that Australia has moved towards a national Activity Based Funding (ABF) model for ED activity. ABF is believed to increase transparency of care and fairness (i.e. equal work receives equal pay). ABF involves a person-, performance- or activity-based payment system, and thus a move away from historical “block payment” models that do not incentivise efficiency and quality. The aim of the Statewide Workforce and Activity-Based Funding Modelling Project in Queensland Emergency Departments (SWAMPED) is to identify and describe best-practice Emergency Department (ED) workforce models within the current context of ED funding under an ABF model. The study comprises five distinct phases. This monograph (Phase 1) comprises a systematic review of the literature that was completed in June 2013. The remaining phases include a detailed survey of Queensland hospital EDs’ resource levels, activity and operational models of care; development of new resource models; development of a user-friendly modelling interface for ED managers; and production of a final report that identifies policy implications. The anticipated deliverable outcome of this research is an ABF-based Emergency Workforce Modelling Tool that will enable ED managers to profile both their workforce and their operational models of care.
Additionally, the tool will assist with more accurately informing the staffing numbers required in the future, inform planning of expected expenditures, and be used for standardisation and benchmarking across similar EDs. Summary of the Findings Within the remit of this review of the literature, the main findings include: 1. EDs are becoming busier and more congested. Rising demand, barriers to ED throughput and transitions of care all contribute to ED congestion. In addition, requests from organisational managers and the community continue to broaden the scope of services required of the ED and further increase demand. As the population lives longer with more lifestyle diseases, its propensity to require ED care continues to grow. 2. Various models of care within EDs exist. Models often vary to account for site-specific characteristics such as staffing profile, ED geographical location (e.g. metropolitan or rural site), and patient demographic profile (e.g. paediatrics, older persons, ethnicity). Existing and new models implemented within EDs often depend on the target outcome requiring change. Generally this is focussed on addressing issues at the input, throughput or output areas of the ED. Even between models targeting a similar demographic or illness, the structure and process elements underpinning the model can vary, which can affect outcomes and produce variance in the patient and carer experience between and within EDs. Major models of care to manage throughput inefficiencies include: A. Workforce Models of Care, which focus on the appropriate level of staffing for a given workload to provide prompt, timely and clinically effective patient care within an emergency care setting.
The studies reviewed suggest that the early involvement of a senior medical decision maker and/or specialised nursing roles such as Emergency Nurse Practitioners and the Clinical Initiatives Nurse, or primary contact or extended scope Allied Health Practitioners, can facilitate patient flow, improve key indicators such as length of stay, and reduce the number of patients who did not wait to be seen, amongst others. B. Operational Models of Care within EDs focus on mechanisms for streaming (e.g. fast-tracking) or otherwise grouping patient care based on acuity and complexity, to assist with minimising any throughput inefficiencies. While studies generally support the positive impact of these models, it appears that they are most effective when they are adequately resourced. 3. Various methods of measuring ED activity exist. Measuring ED activity requires careful consideration of models of care and staffing profile, and the ability to account for factors including patient census, acuity, length of stay (LOS), intensity of intervention and department skill-mix, plus an adjustment for non-patient care time. 4. Gaps in the literature. Continued ED growth calls for new and innovative care delivery models that are safe, clinically effective and cost-effective. New roles and stand-alone service delivery models are often evaluated in isolation, without considering the global and economic impact on staffing profiles. Whilst various models of accounting for and measuring health care activity exist, costing and cost-effectiveness studies are lacking for EDs, making accurate and reliable assessment of care models difficult. There is a need to further understand, refine and account for measures of ED complexity that define a workload upon which resources and appropriate staffing determinations can be made into the future. There is also a need for continued monitoring and comprehensive evaluation of newly implemented workforce modelling tools.
This research acknowledges those gaps and aims to: • undertake a comprehensive and integrated whole-of-department workforce profiling exercise relative to resources in the context of ABF; • inform workforce requirements based on traditional quantitative markers (e.g. volume and acuity) combined with qualitative elements of ED models of care; • develop a comprehensive and validated workforce calculation tool that can be used to better inform, or at least guide, workforce requirements in a more transparent manner.
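The activity factors listed in the findings (patient census, acuity, length of stay, skill mix, non-patient care time) can be combined in a toy weighted-activity calculation. All weights and parameters below are invented for illustration and are not drawn from the SWAMPED tool or any ABF schedule:

```python
# Illustrative only: a weighted-activity measure over ED presentations.
# Assumed acuity weights by triage category (1 = most urgent).
ACUITY_WEIGHT = {1: 3.0, 2: 2.2, 3: 1.5, 4: 1.0, 5: 0.7}

def weighted_activity(presentations):
    """presentations: list of (triage_category, length_of_stay_hours).
    Census enters implicitly as the number of presentations."""
    return sum(ACUITY_WEIGHT[cat] * los for cat, los in presentations)

def nursing_hours_required(presentations, care_hours_per_unit=0.5,
                           non_patient_care_fraction=0.2):
    """Gross up direct-care hours to allow for handover, documentation
    and breaks (the 'non-patient care time' adjustment)."""
    direct = weighted_activity(presentations) * care_hours_per_unit
    return direct / (1.0 - non_patient_care_fraction)
```

A skill-mix split (e.g. proportions of senior medical, nursing and allied health hours) could then be applied to the total, which is the kind of profiling the proposed workforce calculation tool is intended to support.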