861 results for design processes
Abstract:
Business Process Management describes a holistic management approach for the systematic design, modeling, execution, validation, monitoring and improvement of organizational business processes. Traditionally, most attention within this community has been given to control-flow aspects, i.e., the ordering and sequencing of business activities, often in isolation from the context in which these activities occur. In this paper, we propose an approach that allows executable process models to be integrated with Geographic Information Systems. This approach enables process models to take geospatial and other geographic aspects into account explicitly, both during the modeling phase and the execution phase. We contribute a structured modeling methodology, based on the well-known Business Process Model and Notation standard, which is formalized by means of a mapping to executable Colored Petri nets. We illustrate the feasibility of our approach by means of a sustainability-focused case example of a process with important ecological concerns.
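To make the colored-token idea concrete, here is a minimal sketch (a toy Python encoding, not the paper's actual BPMN-to-CPN formalization): token colours carry a geospatial attribute, and a transition representing a BPMN activity fires only when a spatial guard on that colour holds. All names (Token, Transition, within_region) are illustrative assumptions.

```python
# Minimal illustrative sketch (not the paper's formalization): a coloured
# Petri net in which token colours carry a geospatial attribute, so that
# firing a transition (a BPMN activity) can be guarded by location.
from dataclasses import dataclass

@dataclass(frozen=True)
class Token:
    case_id: str
    lat: float          # geospatial "colour" carried by the token
    lon: float

@dataclass
class Transition:
    name: str
    source: str         # input place
    target: str         # output place
    guard: callable     # spatial predicate over the token colour

def within_region(min_lat, max_lat, min_lon, max_lon):
    """Build a simple bounding-box guard (a stand-in for a GIS query)."""
    return lambda t: min_lat <= t.lat <= max_lat and min_lon <= t.lon <= max_lon

def fire(marking, transition):
    """Move one enabled token from the transition's input place to its output."""
    for token in list(marking.get(transition.source, [])):
        if transition.guard(token):
            marking[transition.source].remove(token)
            marking.setdefault(transition.target, []).append(token)
            return token
    return None  # transition not enabled for any token

# Example: the activity "inspect_site" may only fire for cases located
# inside a (toy) bounding box around an ecologically sensitive area.
marking = {"start": [Token("case-1", lat=-27.5, lon=153.0)]}
inspect = Transition("inspect_site", "start", "inspected",
                     guard=within_region(-28.0, -27.0, 152.0, 154.0))
print(fire(marking, inspect))
```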
Abstract:
Strategic renewal has been the subject of research in large organisations but has received relatively little attention in small and medium enterprises. Using case study examples of small and medium manufacturing firms, this paper presents the findings from a longitudinal action research project in which participating companies explored design-led innovation processes to find new ways to renew their businesses. Specifically, our findings indicate that when designers act as innovation catalysts in embedded longitudinal action research, SMEs engage in strategic renewal, gain a deeper appreciation of their customers, become more aware of the value proposition of the company and engage in new practices to improve their competitive advantage.
Abstract:
Purpose of this paper: This research aims to examine the effects of inadequate documentation on the cost management and tendering processes in Managing Contractor Contracts, using Fixed Lump Sum as a benchmark. Design/methodology/approach: A questionnaire survey was conducted with industry practitioners to solicit their views on documentation quality issues associated with the construction industry. This was followed by a series of semi-structured interviews with the purpose of validating the survey findings. Findings and value: The results showed that documentation quality remains a significant issue, contributing to the industry's inefficiency and poor reputation. The level of satisfaction with individual attributes of documentation quality varies. Attributes that do appear to be affected by the choice of procurement method include coordination, buildability, efficiency, completeness and delivery time. Similarly, the use and effectiveness of risk mitigation techniques appear to vary between the methods, based on a number of factors such as documentation completeness, early involvement and fast tracking. Originality/value of paper: This research addresses a gap in the existing body of knowledge, where few studies have examined whether the choice of project procurement system influences documentation quality and the level of its impact. Conclusions: Ultimately, the research concludes that the entire project team, including the client and designers, should carefully consider the individual project's requirements and compare them to the trade-offs associated with documentation quality and the procurement method. While documentation quality is certainly an issue to be improved upon, by identifying the project's performance requirements a procurement method can be chosen to maximise the likelihood that those requirements will be met. This allows the aspects of documentation quality considered most important to the individual project to be managed appropriately.
Abstract:
Objective: To prospectively test two simplified peer review processes, estimate the agreement between the simplified and official processes, and compare the costs of peer review. Design, participants and setting: A prospective parallel study of Project Grant proposals submitted in 2013 to the National Health and Medical Research Council (NHMRC) of Australia. The official funding outcomes were compared with two simplified processes using proposals in Public Health and Basic Science. The two simplified processes were: panels of 7 reviewers who met face-to-face and reviewed only the nine-page research proposal and track record (simplified panel); and 2 reviewers who independently reviewed only the nine-page research proposal (journal panel). The official process used panels of 12 reviewers who met face-to-face and reviewed longer proposals of around 100 pages. We compared the funding outcomes of 72 proposals that were peer reviewed by the simplified and official processes. Main outcome measures: Agreement in funding outcomes; costs of peer review based on reviewers’ time and travel costs. Results: The agreement between the simplified and official panels (72%, 95% CI 61% to 82%), and the journal and official panels (74%, 62% to 83%), was just below the acceptable threshold of 75%. Using the simplified processes would save $A2.1–$A4.9 million per year in peer review costs. Conclusions: Using shorter applications and simpler peer review processes gave reasonable agreement with the more complex official process. Simplified processes save time and money that could be reallocated to actual research. Funding agencies should consider streamlining their application processes.
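For context, the reported intervals are consistent with a standard binomial calculation. A minimal sketch, assuming 52 of the 72 jointly reviewed proposals agreed (52/72 ≈ 72%) and using a Wilson score interval; the paper's exact counts and interval method are not stated in the abstract.

```python
# Wilson score 95% CI for an agreement proportion. Assumes 52/72
# agreements (~72%); the paper's exact counts/method are not given here.
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(52, 72)
print(f"agreement = {52/72:.0%}, 95% CI ({lo:.0%}, {hi:.0%})")
# -> roughly 61% to 81%, in line with the reported 72% (61% to 82%)
```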
Abstract:
This report describes the Year One Pilot Study processes, and articulates findings from the major project components designed to address the challenges noted above (see Figure 1). Specifically, the pilot study tested the campaign research and development process involving participatory design with young people and sector partners, and the efficacy and practicality of conducting a longitudinal, randomised control trial online with minors, including ways of linking survey data to campaign data. Each sub-study comprehensively considered the ethical requirements of conducting online research with minors in school settings. The theoretical and methodological framework for measuring campaign engagement and efficacy (Sub-studies 3, 4 and 5) drew on the Model of Goal-Directed Behaviour (MGB) (Perugini & Bagozzi, 2001) and Nudge Theory (Thaler & Sunstein, 2008).
Abstract:
Innovation and Entrepreneurship: Creating New Value covers all of the major aspects of innovation strategy and capabilities, including leadership of innovation, creativity, design-led innovation, open innovation, management of the innovation portfolio and new product development processes. Ultimately, innovation is accomplished by people, and this book recognises the critical contribution of leadership and organisational culture to developing and promoting innovation behaviours. For startups and entrepreneurs, the book covers the practical, powerful tests that a new idea should be subjected to, as well as providing an overview of the entrepreneurship process. Another feature of the book is the detailed presentation of the practices common to highly innovative organisations that distinguish them from low-innovating organisations. Underpinned by research, this information is translated into an innovation audit tool that can be used by managers and students alike.
Abstract:
This paper reports on the results of a project aimed at creating a research-informed, pedagogically reliable, technology-enhanced learning and teaching environment that would foster engagement with learning. A first-year mathematics for engineering unit offered at a large, metropolitan Australian university provides the context for this research. As part of the project, the unit was redesigned using a framework that employed flexible, modular, connected e-learning and teaching experiences. The researchers, interested in an ecological perspective on educational processes, grounded the redesign principles in probabilistic learning design (Kirschner et al., 2004). The effectiveness of the redesigned environment was assessed through the lens of the notion of affordance (Gibson, 1977, 1979; Greeno, 1994; Good, 2007). A qualitative analysis of the questionnaire distributed to students at the end of the teaching period provided insight into factors impacting on the successful creation of an environment that encourages complex, multidimensional and multilayered interactions conducive to learning.
Abstract:
The current study introduces a novel synthetic avenue for the preparation of profluorescent nitroxides via nitrile imine-mediated tetrazole-ene cycloaddition (NITEC). The photoinduced cycloaddition was performed under metal-free, mild conditions, allowing the preparation of a library of nitroxide-functionalized pyrazolines and the corresponding methoxyamines. High reaction rates and full conversion were observed, with the presence of the nitroxide having no significant impact on the cycloaddition performance. The formed products were investigated with respect to their photophysical properties in order to quantify their “switch on/off” behavior. The fluorescence quenching performance is strongly dependent on the distance between the chromophore and the free radical spin, as demonstrated theoretically and experimentally. The highest levels of fluorescence quenching were achieved for pyrazolines with the nitroxide directly fused to the chromophore. Importantly, the pyrazoline profluorescent nitroxides were shown to act efficiently as sensors for redox/radical processes.
Abstract:
The climate in the Arctic is changing faster than anywhere else on earth. Poorly understood feedback processes relating to Arctic clouds and aerosol–cloud interactions contribute to a poor understanding of the present changes in the Arctic climate system, and also to a large spread in projections of future climate in the Arctic. The problem is exacerbated by the paucity of research-quality observations in the central Arctic. Improved formulations in climate models require such observations, which can only come from measurements in situ in this difficult-to-reach region with logistically demanding environmental conditions. The Arctic Summer Cloud Ocean Study (ASCOS) was the most extensive central Arctic Ocean expedition with an atmospheric focus during the International Polar Year (IPY) 2007–2008. ASCOS focused on the study of the formation and life cycle of low-level Arctic clouds. ASCOS departed from Longyearbyen on Svalbard on 2 August and returned on 9 September 2008. In transit into and out of the pack ice, four short research stations were carried out in the Fram Strait: two in open water and two in the marginal ice zone. After traversing the pack ice northward, an ice camp was set up on 12 August at 87°21' N, 01°29' W and remained in operation through 1 September, drifting with the ice. During this time, extensive measurements were taken of atmospheric gas and particle chemistry and physics, mesoscale and boundary-layer meteorology, marine biology and chemistry, and upper ocean physics. ASCOS provides a unique interdisciplinary data set for the development and testing of new hypotheses on cloud processes, their interactions with the sea ice and ocean, and the associated physical, chemical, and biological processes and interactions. For example, the first-ever quantitative observation of bubbles in Arctic leads, combined with the unique discovery of marine organic material, polymer gels with an origin in the ocean, inside cloud droplets, suggests the possibility of primary marine organically derived cloud condensation nuclei in Arctic stratocumulus clouds. Direct observations of surface fluxes of aerosols could, however, not explain the observed variability in aerosol concentrations, and the balance between local and remote aerosol sources remains an open question. A lack of cloud condensation nuclei (CCN) was at times a controlling factor in low-level cloud formation, and hence for the impact of clouds on the surface energy budget. ASCOS provided detailed measurements of the surface energy balance from the late summer melt into the initial autumn freeze-up, and documented the effects of clouds and storms on the surface energy balance during this transition. In addition to such process-level studies, the unique, independent ASCOS data set can be, and is being, used for the validation of satellite retrievals, operational models, and reanalysis data sets.
Abstract:
Pumping systems are widely used in many applications, including municipal water/wastewater services, domestic, commercial and agricultural services, and industrial processes. They are very significant energy users, consuming nearly 20% of the world's electrical energy demand. Therefore, improving the energy efficiency of pumping systems can provide great benefits in terms of energy, environment, and cost reduction. In this entry, an overview of pump classification, with the pros and cons of each type of pump, is presented. The procedures used to design pumping systems are also outlined. This is then followed by a discussion of the opportunities for improving the energy efficiency of pumping systems during every stage of design, selection, operation, and maintenance.
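The scale of the opportunity follows directly from the standard hydraulic power relation P = ρgQH/η: shaft power is inversely proportional to pump efficiency. A minimal worked sketch; the flow, head, operating hours and efficiencies below are illustrative assumptions, not figures from the entry.

```python
# Shaft power of a pump from the standard relation P = rho*g*Q*H/eta,
# and the annual energy saved by a modest efficiency improvement.
# All figures below are illustrative, not from the entry.
RHO, G = 1000.0, 9.81          # water density (kg/m^3), gravity (m/s^2)

def shaft_power_kw(flow_m3s, head_m, efficiency):
    return RHO * G * flow_m3s * head_m / efficiency / 1000.0

flow, head = 0.05, 40.0        # 50 L/s against a 40 m head
hours = 6000                   # operating hours per year
p_old = shaft_power_kw(flow, head, efficiency=0.65)
p_new = shaft_power_kw(flow, head, efficiency=0.75)
print(f"{p_old:.1f} kW -> {p_new:.1f} kW, "
      f"saving {(p_old - p_new) * hours:.0f} kWh/year")
```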
Abstract:
The power to influence others in ever-expanding social networks in the new knowledge economy is tied to capabilities with digital media production. This chapter draws on research in elementary classrooms to examine the repertoires of cross-disciplinary knowledge that literacy learners need in order to produce innovative digital media via the “social web”. It focuses on the knowledge processes that occurred when elementary students engaged in multimodal text production with new digital media, drawing on Kalantzis and Cope's (2008) heuristic for theorizing “Knowledge Processes” in the Learning by Design approach to pedagogy. Learners demonstrate eight “Knowledge Processes” across different subject domains, skill areas, and sensibilities. Drawing on data from media-based lessons across several classrooms and schools, this chapter examines what kinds of knowledge students utilize when they produce digital, multimodal texts in the classroom. The Learning by Design framework is used as an analytic tool to theorize how students learn when they engage in a specific domain of learning: digital media production.
Abstract:
This paper proposes and explores the Deep Customer Insight Innovation Framework in order to develop an understanding of how design can be integrated within existing innovation processes. The Deep Customer Insight Innovation Framework synthesises the work of Beckman and Barry (2007) as a theoretical foundation, with the framework explored within a case study of Australian Airport Corporation seeking to drive airport innovations in operations and retail performance. The integration of a deep customer insight approach develops customer-centric and highly integrated solutions as a function of concentrated problem exploration and design-led idea generation. Businesses facing complex innovation challenges or seeking to make sense of future opportunities will be able to integrate design into existing innovation processes, anchoring the new approach between existing market research and business development activities. This paper contributes a framework and a novel understanding of how design methods are integrated into existing innovation processes for operationalization within industry.
Abstract:
As global industries change and technology advances, traditional education systems might no longer be able to supply companies with graduates who possess an appropriate mix of skills and experience. The recent increased interest in Design Thinking as an approach to innovation has resulted in its adoption by non-design-trained professionals. This development necessitates a new method of teaching Design Thinking and its related skills and processes. As a basis for such a method, this research investigated 51 selected courses across 28 international universities to determine what Design Thinking is being taught (content), and how it is being taught (assessment and learning modes). To support the teaching and assessment of Design Thinking, this paper presents The Educational Design Ladder, an innovative resource/model that provides a process for the organisation and structuring of units for a multidisciplinary Design Thinking programme.
Abstract:
Yao, Begg, and Livingston (1996, Biometrics 52, 992-1001) considered the optimal group size for testing a series of potentially therapeutic agents to identify a promising one as soon as possible for given error rates. The number of patients to be tested with each agent was fixed as the group size. We consider a sequential design that allows early acceptance and rejection, and we provide an optimal strategy, derived via Markov decision processes, that minimizes the number of patients required. The minimization is carried out under constraints on the two types of error probabilities (false positive and false negative), with Lagrangian multipliers corresponding to the cost parameters for the two error types. Numerical studies indicate that there can be a substantial reduction in the number of patients required.
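A rough sketch of the kind of backward-induction computation such a design involves, under simplifying assumptions that are not from the paper: two point hypotheses on the response probability, equal priors, a unit cost per patient, and Lagrangian penalties on the two error types. The optimal accept/reject/continue policy then falls out of dynamic programming over (patients tested, responses seen).

```python
# Backward induction for a sequential accept/reject design, in the spirit
# of the Markov-decision-process formulation. Simplifying assumptions
# (not from the paper): H0: p=0.2 vs H1: p=0.4, equal priors, unit cost
# per patient, Lagrangian penalties LAM_FP / LAM_FN on the two errors.
from functools import lru_cache

P0, P1 = 0.2, 0.4
LAM_FP, LAM_FN = 50.0, 50.0
N_MAX = 30                      # hard cap on patients per agent

def posterior_h1(n, s):
    w1 = 0.5 * P1**s * (1 - P1)**(n - s)
    w0 = 0.5 * P0**s * (1 - P0)**(n - s)
    return w1 / (w0 + w1)

@lru_cache(maxsize=None)
def value(n, s):
    """Minimal expected remaining cost after s responses in n patients."""
    q1 = posterior_h1(n, s)
    accept = LAM_FP * (1 - q1)   # declare promising; penalise false positive
    reject = LAM_FN * q1         # abandon agent; penalise false negative
    if n == N_MAX:
        return min(accept, reject)
    p_resp = q1 * P1 + (1 - q1) * P0   # predictive response probability
    cont = 1.0 + p_resp * value(n + 1, s + 1) + (1 - p_resp) * value(n + 1, s)
    return min(accept, reject, cont)

print(f"expected cost at the start: {value(0, 0):.2f}")
```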
Abstract:
Gaussian processes (GPs) are promising Bayesian methods for classification and regression problems. Designing a GP classifier and making predictions with it are, however, computationally demanding, especially when the training set size is large. Sparse GP classifiers are known to overcome this limitation. In this letter, we propose and study a validation-based method for sparse GP classifier design. The proposed method uses a negative log predictive (NLP) loss measure, which is easy to compute for GP models. We use this measure both for basis vector selection and for hyperparameter adaptation. The experimental results on several real-world benchmark data sets show better or comparable generalization performance over existing methods.
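As a rough illustration of the validation idea only: score candidate basis-vector sets by their negative log predictive (NLP) loss on held-out labels. The kernel predictor below is a deliberate simplification standing in for a real sparse GP classifier, not the authors' method; all names are illustrative.

```python
# Illustration of the NLP validation loss only; a simplified kernel
# predictor stands in for a real sparse GP classifier (not the authors'
# method). Labels are in {0, 1}; predictions are probabilities.
import numpy as np

def rbf(A, B, ls=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def predict_proba(X_basis, y_basis, X_val, jitter=1e-6):
    """Kernel-smoothed class-1 probability using only the basis vectors."""
    K = rbf(X_basis, X_basis) + jitter * np.eye(len(X_basis))
    alpha = np.linalg.solve(K, y_basis - 0.5)
    f = rbf(X_val, X_basis) @ alpha
    return 1.0 / (1.0 + np.exp(-4.0 * f))   # squash latent mean to (0, 1)

def nlp_loss(p, y, eps=1e-12):
    """Negative log predictive loss: easy to compute from probabilities."""
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2)); y = (X[:, 0] + X[:, 1] > 0).astype(float)
Xv = rng.normal(size=(40, 2)); yv = (Xv[:, 0] + Xv[:, 1] > 0).astype(float)
# Compare two candidate basis-vector sets by their validation NLP loss.
for m in (5, 20):
    p = predict_proba(X[:m], y[:m], Xv)
    print(f"{m:2d} basis vectors: NLP = {nlp_loss(p, yv):.3f}")
```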