972 results for DESIGN III
Abstract:
Electronic Blocks are a new programming environment, designed specifically for children aged between three and eight years. As such, the design of the Electronic Block environment is firmly based on principles of developmentally appropriate practices in early childhood education. The Electronic Blocks are physical, stackable blocks that include sensor blocks, action blocks and logic blocks. Evaluation of the Electronic Blocks with both preschool and primary school children shows that the blocks' ease of use and power of engagement have created a compelling tool for the introduction of meaningful technology education in an early childhood setting. The key to the effectiveness of the Electronic Blocks lies in an adherence to theories of development and learning throughout the Electronic Blocks design process.
Abstract:
The Comprehensive Australian Study of Entrepreneurial Emergence (CAUSEE) is a research programme that aims to uncover the factors that initiate, hinder and facilitate the process of emergence of new economic activities and organizations. It is widely acknowledged that entrepreneurship is one of the most important forces shaping changes in a country’s economic landscape (Baumol 1968; Birch 1987; Acs 1999). An understanding of the process by which new economic activity and business entities emerge is vital (Gartner 1993; Sarasvathy 2001). An important development in the study of ‘nascent entrepreneurs’ and ‘firms in gestation’ was the Panel Study of Entrepreneurial Dynamics (PSED) (Gartner et al. 2004) and its extensions in Argentina, Canada, Greece, the Netherlands, Norway and Sweden. Yet while PSED I is an important first step towards systematically studying new venture emergence, it represents just the beginning of a stream of nascent venture studies – most notably PSED II is currently being undertaken in the US (2005–10) (Reynolds and Curtin 2008).
Abstract:
Technology-mediated collaboration process has been extensively studied for over a decade. Most applications with collaboration concepts reported in the literature focus on enhancing efficiency and effectiveness of the decision-making processes in objective and well-structured workflows. However, relatively few previous studies have investigated the applications of collaboration schemes to problems with subjective and unstructured nature. In this paper, we explore a new intelligent collaboration scheme for fashion design which, by nature, relies heavily on human judgment and creativity. Techniques such as multicriteria decision making, fuzzy logic, and artificial neural network (ANN) models are employed. Industrial data sets are used for the analysis. Our experimental results suggest that the proposed scheme exhibits significant improvement over the traditional method in terms of the time–cost effectiveness, and a company interview with design professionals has confirmed its effectiveness and significance.
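The abstract names multicriteria decision making and fuzzy logic but gives no concrete construction. As a purely hypothetical sketch of those two ingredients (the criteria names, weights and ratings below are invented for illustration, not taken from the paper):

```python
# Hypothetical sketch: scoring candidate fashion designs by combining
# expert ratings on several criteria (multicriteria decision making)
# with a triangular fuzzy membership for linguistic ratings.
# All names and numbers are illustrative assumptions.

def triangular(x, a, b, c):
    """Membership of x in a triangular fuzzy set (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def weighted_score(ratings, weights):
    """Simple weighted-sum aggregation across criteria."""
    total_w = sum(weights.values())
    return sum(ratings[c] * w for c, w in weights.items()) / total_w

ratings = {"novelty": 0.8, "wearability": 0.6, "cost": 0.4}
weights = {"novelty": 0.5, "wearability": 0.3, "cost": 0.2}
score = weighted_score(ratings, weights)
print(round(score, 2))  # overall attractiveness in [0, 1]
```

In practice the ANN component described in the paper would replace or tune the fixed weights; a weighted sum is only the simplest multicriteria aggregation rule.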
Abstract:
There has recently been an emphasis within literacy studies on both the spatial dimensions of social practices (Leander & Sheehy, 2004) and the importance of incorporating design and multiple modes of meaning-making into contemporary understandings of literacy (Cope & Kalantzis, 2000; New London Group, 1996). Kress (2003) in particular has outlined the potential implications of the cultural shift from the dominance of writing, based on a logic of time and sequence in time, to the dominance of the mode of the image, based on a logic of space. However, the widespread re-design of curriculum and pedagogy by classroom teachers to allow students to capitalise on the various affordances of different modes of meaning-making – including the spatial – remains in an emergent stage. We report on a project in which university researchers’ expertise in architecture, literacy and communications enabled two teachers in one school to expand the forms of literacy that primary school children engaged in. Starting from the school community’s concerns about an urban renewal project in their neighbourhood, we worked together to develop a curriculum of spatial literacies with real-world goals and outcomes.
Abstract:
The vibration serviceability limit state is an important design consideration for two-way, suspended concrete floors that is not always well understood by many practicing structural engineers. Although the field of floor vibration has been extensively developed, at present there are no convenient design tools that deal with this problem. Results from this research have enabled the development of a much-needed new method for assessing the vibration serviceability of flat, suspended concrete floors in buildings. This new method has been named the Response Coefficient-Root Function (RCRF) method. Full-scale laboratory tests have been conducted on a post-tensioned floor specimen at Queensland University of Technology’s structural laboratory. Special support brackets were fabricated to perform as frictionless, pinned connections at the corners of the specimen. A series of static and dynamic tests were performed in the laboratory to obtain basic material and dynamic properties of the specimen. Finite-element models have been calibrated against data collected from the laboratory experiments. Computational finite-element analysis has been extended to investigate a variety of floor configurations. Field measurements of floors in existing buildings are in good agreement with the computational studies. Results from this parametric investigation have led to the development of a new approach for predicting the design frequencies and accelerations of flat, concrete floor structures. The RCRF method is a convenient tool to assist structural engineers in designing for the vibration serviceability limit state of in-situ concrete floor systems.
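The abstract does not reproduce the RCRF equations. As background orientation only, the classical fundamental frequency of a simply supported rectangular plate, f11 = (π/2)·√(D/(ρh))·(1/a² + 1/b²) with flexural rigidity D = Eh³/(12(1−ν²)), can be evaluated directly; all input values below are assumed, illustrative concrete-slab properties, not data from the thesis:

```python
# Background sketch (not the RCRF method): fundamental frequency of a
# simply supported rectangular plate with assumed concrete properties.
import math

E = 30e9      # Young's modulus of concrete (Pa), assumed
nu = 0.2      # Poisson's ratio, assumed
rho = 2400.0  # density (kg/m^3), assumed
h = 0.2       # slab thickness (m), assumed
a = b = 8.0   # span lengths (m), assumed

D = E * h**3 / (12 * (1 - nu**2))   # flexural rigidity (N*m)
f11 = (math.pi / 2) * math.sqrt(D / (rho * h)) * (1 / a**2 + 1 / b**2)
print(f"fundamental frequency: {f11:.1f} Hz")
```

Real two-way floors have continuity, cracking and partial fixity that this idealisation ignores, which is part of why a calibrated method such as RCRF is needed.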
Abstract:
This paper extends the work of “Luxury fashion: the role of innovation as a key contributing factor in the development of luxury fashion goods and sustainable fashion design” (Finn, 2011). The discussion here begins with the observation that post-consumer textile waste remains a major obstacle to realising a model of sustainable fashion design and production; however, amongst the millions of tonnes of textiles and clothing sent to landfill each year there is little evidence of authentic luxury branded goods ending life as landfill. The sustainable fashion movement often supports approaches such as fashion up-cycling, re-cycling and cradle-to-cradle solutions. This paper argues that the priority should be to break the cycle of consumerism as an immediate intervention in the ongoing unsustainable (and in some cases unethical) practices involved in the production of fashion goods. The connections between maker and consumer are explored through object analysis, and the findings raise questions about the separation between luxury fashion goods and fashion goods that merely bear luxury fashion branding. This paper suggests that unethical and subversive exploitation of these connections may be used to promote increased consumerism while at the same time purporting exclusivity and superior craftsmanship.
Abstract:
A Wireless Sensor Network (WSN) is a set of sensors that are integrated with a physical environment. These sensors are small in size, and capable of sensing physical phenomena and processing them. They communicate in a multihop manner, due to a short radio range, to form an Ad Hoc network capable of reporting network activities to a data collection sink. Recent advances in WSNs have led to several new promising applications, including habitat monitoring, military target tracking, natural disaster relief, and health monitoring. A current sensor node, such as the MICA2, uses a 16-bit, 8 MHz Texas Instruments MSP430 micro-controller with only 10 KB RAM, 128 KB program space, and 512 KB external flash memory to store measurement data, and is powered by two AA batteries. Due to these unique specifications and a lack of tamper-resistant hardware, devising security protocols for WSNs is complex. Previous studies show that data transmission consumes much more energy than computation. Data aggregation can greatly help to reduce this consumption by eliminating redundant data. However, aggregators are under the threat of various types of attacks. Among them, node compromise is usually considered one of the most challenging for the security of WSNs. In a node compromise attack, an adversary physically tampers with a node in order to extract its cryptographic secrets. This attack can be very harmful depending on the security architecture of the network. For example, when an aggregator node is compromised, it is easy for the adversary to change the aggregation result and inject false data into the WSN. The contributions of this thesis to the area of secure data aggregation are manifold. We firstly define security for data aggregation in WSNs. In contrast with existing secure data aggregation definitions, the proposed definition covers the unique characteristics that WSNs have.
Secondly, we analyze the relationship between security services and adversarial models considered in existing secure data aggregation in order to provide a general framework of required security services. Thirdly, we analyze existing cryptographic-based and reputation-based secure data aggregation schemes. This analysis covers the security services provided by these schemes and their robustness against attacks. Fourthly, we propose a robust reputation-based secure data aggregation scheme for WSNs. This scheme minimizes the use of heavy cryptographic mechanisms. The security advantages provided by this scheme are realized by integrating aggregation functionalities with: (i) a reputation system, (ii) estimation theory, and (iii) a change detection mechanism. We have shown that this addition helps defend against most of the security attacks discussed in this thesis, including the On-Off attack. Finally, we propose a secure key management scheme in order to distribute essential pairwise and group keys among the sensor nodes. The proposed scheme combines Lamport's reverse hash chain with a usual (forward) hash chain to provide both past and future key secrecy. The proposal avoids the delivery of the whole value of a new group key for a group key update; instead, only half of the value is transmitted from the network manager to the sensor nodes. This way, the compromise of a pairwise key alone does not lead to the compromise of the group key. The new pairwise key in our scheme is determined by Diffie-Hellman based key agreement.
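The abstract names the ingredients (a Lamport-style reverse hash chain plus a forward hash chain, with only one "half" transmitted per group-key update) but gives no concrete construction. The following is a hypothetical reconstruction of that idea, not the thesis's actual protocol:

```python
# Illustrative sketch only: combining a forward hash chain (advanced
# locally by every node) with a Lamport-style reverse hash chain
# (revealed element-by-element by the network manager). All parameter
# choices and the mixing function are assumptions for illustration.
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

N = 5  # number of key epochs, assumed

# Manager precomputes the reverse chain: after reversal,
# reverse_chain[0] = H^N(seed) is the public anchor, and each later
# element hashes back to the previous one (Lamport verification).
seed = b"manager-secret"
reverse_chain = [seed]
for _ in range(N):
    reverse_chain.append(H(reverse_chain[-1]))
reverse_chain.reverse()

# Nodes share a forward-chain value and advance it locally each epoch,
# so that half of the key material never travels over the air.
forward = H(b"initial-group-secret")

def next_group_key(forward_elem: bytes, reverse_elem: bytes) -> bytes:
    """Mix both halves into the new group key (assumed construction)."""
    return H(forward_elem + reverse_elem)

prev_reverse = reverse_chain[0]          # anchor, pre-installed on nodes
for epoch in range(1, 4):
    revealed = reverse_chain[epoch]      # the only value transmitted
    assert H(revealed) == prev_reverse   # verify against the anchor
    forward = H(forward)                 # local forward-chain step
    group_key = next_group_key(forward, revealed)
    prev_reverse = revealed
print(group_key.hex()[:16])
```

The point of the split is visible in the sketch: an eavesdropper who captures the transmitted reverse-chain element still lacks the locally held forward-chain value, so neither half alone yields the group key.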
Abstract:
The mechanism for the decomposition of hydrotalcite remains unsolved. Controlled rate thermal analysis (CRTA) enables this decomposition pathway to be explored. The thermal decomposition of hydrotalcites with hexacyanoferrate(II) and hexacyanoferrate(III) in the interlayer has been studied using controlled rate thermal analysis technology. X-ray diffraction shows the hydrotalcites studied have d(003) spacings of 11.1 and 10.9 Å, which compare with d-spacings of 7.9 and 7.98 Å for hydrotalcites with carbonate or sulphate in the interlayer. Calculations based upon CRTA measurements show that 7 moles of water are lost, indicating that the formula of the hexacyanoferrate(II)-intercalated hydrotalcite is Mg6Al2(OH)16[Fe(CN)6]0.5·7H2O and that of the hexacyanoferrate(III)-intercalated hydrotalcite is Mg6Al2(OH)16[Fe(CN)6]0.66·9H2O. Dehydroxylation combined with CN unit loss occurs in three steps, (a) between 310 and 367 °C, (b) between 367 and 390 °C and (c) between 390 and 428 °C, for both the hexacyanoferrate(II)- and hexacyanoferrate(III)-intercalated hydrotalcites.
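The interlayer anion fractions quoted in the formulas follow from simple charge bookkeeping: the Mg6Al2(OH)16 layer carries a net charge of +2, which the hexacyanoferrate anions must balance. A minimal arithmetic check:

```python
# Charge-balance check for the hydrotalcite formulas above.
# Layer: 6 Mg2+, 2 Al3+, 16 OH-  ->  net +2 per formula unit.
layer_charge = 6 * (+2) + 2 * (+3) + 16 * (-1)
assert layer_charge == +2

# [Fe(CN)6]4- (ferrocyanide) balances +2 at x = 0.5;
# [Fe(CN)6]3- (ferricyanide) balances +2 at x = 2/3 (~0.66).
x_ferrocyanide = layer_charge / 4
x_ferricyanide = layer_charge / 3
print(x_ferrocyanide, round(x_ferricyanide, 2))  # → 0.5 0.67
```

This is why the Fe(II) formula carries the stoichiometric coefficient 0.5 while the Fe(III) formula carries 0.66.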
Abstract:
Interaction Design is a fast-developing branch of Industrial Design. The availability of cheap microprocessors and sensor electronics allows interactions between people and products that were until recently impossible. This has added additional layers of complexity to the design process. Novice designers find it difficult to effectively juggle these complexities and typically tend to focus on one aspect at a time. They also tend to take a linear, step-by-step approach to the design process, in contrast to expert designers who pursue “parallel lines of thought” whilst simultaneously co-evolving both problem and solution (Lawson, 1993). This paper explores an approach that encourages designers (in this case novice designers) to take a parallel rather than linear approach to the design process. It also addresses the problem of social loafing that tends to occur in team activities.
Abstract:
Despite the benefits of these technologies, safety planners still face difficulties explaining errors related to the use of different technologies and evaluating how the errors impact the performance of safety decision making. This paper presents a preliminary error impact analysis testbed to model object identification and tracking errors caused by image-based devices and algorithms and to analyze the impact of the errors on spatial safety assessment of earthmoving and surface mining activities. More specifically, this research designed a testbed to model workspaces for earthmoving operations, to simulate safety-related violations, and to apply different object identification and tracking errors to the data collected and processed for spatial safety assessment. Three different cases were analyzed based on actual earthmoving operations conducted at a limestone quarry. Using the testbed, the impacts of the errors were investigated for safety planning purposes.
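The paper's testbed is not described in detail here. As a hypothetical sketch of the general idea (injecting a tracking error into object positions and measuring how it degrades a proximity-based safety check), with all thresholds, noise levels and geometry invented for illustration:

```python
# Hypothetical sketch (not the paper's testbed): add Gaussian tracking
# error to equipment positions and count how often a proximity-based
# safety violation (worker within 5 m of equipment) is missed or
# falsely reported. All numbers below are illustrative assumptions.
import math
import random

random.seed(42)

SAFETY_RADIUS = 5.0   # metres; assumed violation threshold
SIGMA = 1.0           # assumed std. dev. of the tracking error (m)

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def with_error(p, sigma):
    """Observed position = true position + Gaussian tracking error."""
    return (p[0] + random.gauss(0, sigma), p[1] + random.gauss(0, sigma))

worker = (0.0, 0.0)
trials = 10_000
missed = false_alarm = 0
for _ in range(trials):
    true_pos = (random.uniform(0, 10), 0.0)   # equipment somewhere nearby
    true_violation = dist(worker, true_pos) < SAFETY_RADIUS
    observed_violation = dist(worker, with_error(true_pos, SIGMA)) < SAFETY_RADIUS
    if true_violation and not observed_violation:
        missed += 1
    if observed_violation and not true_violation:
        false_alarm += 1
print(f"missed: {missed / trials:.3f}, false alarms: {false_alarm / trials:.3f}")
```

Sweeping SIGMA over the error magnitudes of different sensing devices gives the kind of error-impact comparison the paper describes.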
Abstract:
Without the virtually free services of nature like clean air and water, humans would not last long. Natural systems can be incorporated in existing urban structures or spaces to add public amenity, mitigate the heat island effect, reduce pollution, add oxygen, and ensure water, electricity and food security in urban areas. There are many eco-solutions that could radically reduce resource consumption and pollution and even provide surplus ecosystem services in the built environment at little or no operational cost, if adequately supported by design. This paper is the first of a two-part paper that explains what eco-services are, then provides examples of how design can generate natural as well as social capital. Using examples of actual and notional solutions, both papers set out to challenge designers to ‘think again’, and invent ways of creating net positive environmental gains through built environment design.
Abstract:
Without the virtually free services of nature like clean air and water, humans would not last long. Natural systems can be incorporated in existing urban structures or spaces to add public amenity, mitigate the heat island effect, reduce pollution, add oxygen, and ensure water, electricity and food security in urban areas. There are many eco-solutions that could radically reduce resource consumption and pollution and even provide surplus ecosystem services in the built environment at little or no operational cost, if adequately supported by design. This is the second part of a two-part paper that explains what eco-services are, then provides examples of how design can generate natural as well as social capital. Using examples of actual and notional solutions, both papers set out to challenge designers to ‘think again’, and invent ways of creating net positive environmental gains through built environment design.
Abstract:
In 2008, a three-year pilot ‘pay for performance’ (P4P) program, known as the ‘Clinical Practice Improvement Payment’ (CPIP), was introduced into Queensland Health (QHealth). QHealth is a large public health sector provider of acute, community, and public health services in Queensland, Australia. The organisation has recently embarked on a significant reform agenda including a review of existing funding arrangements (Duckett et al., 2008). Partly in response to this reform agenda, a casemix funding model has been implemented to reconnect health care funding with outcomes. CPIP was conceptualised as a performance-based scheme that rewarded quality with financial incentives. This is the first time such a scheme has been implemented in the public health sector in Australia with a focus on rewarding quality, and it is unique in that it has a large state-wide focus and includes 15 Districts. CPIP initially targeted five acute and community clinical areas: Mental Health, Discharge Medication, Emergency Department, Chronic Obstructive Pulmonary Disease, and Stroke. The CPIP scheme was designed around key concepts including the identification of clinical indicators that met the set criteria of: high disease burden; a well defined single diagnostic group or intervention; significant variations in clinical outcomes and/or practices; a good evidence base; and clinician control and support (Ward, Daniels, Walker & Duckett, 2007). This evaluative research targeted Phase One of implementation of the CPIP scheme, from January 2008 to March 2009. A formative evaluation utilising a mixed methodology and complementarity analysis was undertaken. The research involved three research questions and aimed to determine the knowledge, understanding, and attitudes of clinicians; identify improvements to the design, administration, and monitoring of CPIP; and determine the financial and economic costs of the scheme.
Three key studies were undertaken to ascertain responses to the key research questions. Firstly, a survey of clinicians was undertaken to examine levels of knowledge and understanding and their attitudes to the scheme. Secondly, the study sought to apply Statistical Process Control (SPC) to the process indicators to assess whether this enhanced the scheme, and a third study examined a simple economic cost analysis. The CPIP survey elicited 192 clinician respondents. Over 70% of these respondents were supportive of the continuation of the CPIP scheme. This finding was also supported by the results of a quantitative attitude survey that identified positive attitudes in 6 of the 7 domains, including impact, awareness and understanding, and clinical relevance, all scored positive across the combined respondent group. SPC as a trending tool may play an important role in the early identification of indicator weakness for the CPIP scheme. This evaluative research study supports a previously identified need in the literature for a phased introduction of pay for performance (P4P) type programs. It further highlights the value of undertaking a formal risk assessment of clinician, management, and systemic levels of literacy and competency with measurement and monitoring of quality prior to a phased implementation. This phasing can then be guided by a P4P Design Variable Matrix, which provides a selection of program design options such as indicator targets and payment mechanisms. It became evident that a clear process is required to standardise how clinical indicators evolve over time and direct movement towards more rigorous ‘pay for performance’ targets and the development of an optimal funding model. Use of this matrix will enable the scheme to mature and build the literacy and competency of clinicians and the organisation as implementation progresses.
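The kind of SPC trending the study applied to process indicators can be illustrated with a p-chart: monthly compliance proportions are plotted against 3-sigma control limits, and points outside the limits flag indicator weakness early. The indicator name and data below are invented for illustration, not drawn from the CPIP evaluation:

```python
# Hypothetical sketch: a Shewhart p-chart on a monthly compliance
# indicator (e.g. discharge-medication documentation). The data and
# sample sizes are illustrative assumptions.
import math

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a proportion with sample size n."""
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)

# (proportion compliant, number of cases) per month, assumed data
monthly = [(0.82, 120), (0.85, 110), (0.80, 130), (0.62, 125), (0.84, 118)]
p_bar = sum(p * n for p, n in monthly) / sum(n for _, n in monthly)

for month, (p, n) in enumerate(monthly, start=1):
    lcl, ucl = p_chart_limits(p_bar, n)
    flag = "OUT OF CONTROL" if not (lcl <= p <= ucl) else "in control"
    print(f"month {month}: p={p:.2f}  limits=[{lcl:.2f}, {ucl:.2f}]  {flag}")
```

Here month 4's drop falls below the lower control limit, which is the sort of early signal a trending tool would surface before annual indicator review.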
Furthermore, the research identified that CPIP created a spotlight on clinical indicators, and incentive payments of over five million dollars, from a potential ten million, were secured across the five clinical areas in the first 15 months of the scheme. This indicates that quality was rewarded in the new QHealth funding model, and despite issues being identified with the payment mechanism, funding was distributed. The economic model used identified a relatively low cost of reporting (under $8,000), as opposed to funds secured of over $300,000 for mental health, as an example. Movement to a full cost-effectiveness study of CPIP is supported. Overall, the introduction of the CPIP scheme into QHealth has been a positive and effective strategy for engaging clinicians in quality and has been the catalyst for the identification and monitoring of valuable clinical process indicators. This research has highlighted that clinicians are supportive of the scheme in general; however, there are some significant risks, including the functioning of the CPIP payment mechanism. Given clinician support for the use of a pay-for-performance methodology in QHealth, the CPIP scheme has the potential to be a powerful addition to a multi-faceted suite of quality improvement initiatives within QHealth.