545 results for Design engineering.
Abstract:
Value Management (VM) is a proven methodology that provides a structured framework, using supporting tools and techniques, to facilitate effective decision-making in many types of projects and thereby achieve ‘best value’ for clients. It offers an exceptionally robust approach to exploring the needs and functions of a project so that they align with the client’s objectives. The functional analysis and creativity phases of VM are crucial, as they focus on utilising innovative thinking to understand the objectives of clients’ projects and to provide value-adding solutions at the early discovery stages of projects. There is, however, a perception of VM as just another cost-cutting tool, which has overshadowed the fundamental benefits of the method and limited both its influence and its wider use in the construction industry. This paper describes findings from a series of case studies conducted at the project and corporate levels of current publicly funded infrastructure projects in Malaysia. The study aims to investigate the VM processes practised by the project client organisation and to evaluate the effects of project team involvement in VM workshops during the design stage of these projects. The focus of the study is on how issues related to ‘upstream’ infrastructure design, aimed at improving ‘downstream’ construction processes on-site, are resolved through multi-disciplinary team consideration and decision-making. Findings from the case studies indicate that the mix of disciplines of project team members at a design-stage VM workshop has minimal influence on improving construction processes. However, the degree of interaction, institutionalised thinking, cultural dimensions and the visualisation aids adopted have a significant impact on maximising creativity amongst project team members during a VM workshop. The case studies conducted for this research focused on infrastructure projects that use a traditional VM workshop as the client’s chosen VM methodology to review and develop designs. Document reviews and semi-structured interviews with project teams were used as the data collection techniques for the case studies. The outcomes of this research are expected to offer alternative perspectives for construction professionals and clients to minimise the constraints on, and strengthen strategies for, implementing VM in future projects.
Abstract:
This paper reports the findings from a series of scoping interviews designed to evaluate, ground and refine the initial understandings, assumptions and concepts of a research team in a larger project about the role of social and tangible technologies in maintaining good habits into old age. Participants' understandings of some basic terms used in the research are presented, along with a discussion of their current use of new and established information and communications technologies and the existing barriers to ongoing uptake of emerging technologies. The findings suggest that common assumptions about both ageing and technology usage by ageing people should be questioned, demonstrating the contribution such early scoping interviews can make within design research projects.
Abstract:
Whole-body computer control interfaces present new opportunities to engage children with games for learning. Stomp is a suite of educational games that uses such technology, allowing young children to use their whole body to interact with a digital environment projected on the floor. To maximise the effectiveness of this technology, tenets of self-determination theory (SDT) are applied to the design of Stomp experiences. By meeting user needs for competence, autonomy, and relatedness, our aim is to increase children's engagement with the Stomp learning platform. Analysis of Stomp's design suggests that these tenets are met. Observations from a case study of Stomp being used by young children show that they were highly engaged and motivated by Stomp. This analysis demonstrates that continued application of SDT to Stomp will further enhance user engagement. It is also suggested that SDT, when applied more widely to other whole-body multi-user interfaces, could instil similar positive effects.
Abstract:
This action research project investigated the use of a collaborative learning approach for addressing issues associated with teaching urban design to large, diverse cohorts. As a case study, I observed two semesters of an urban design unit that I revised between 2011 and 2012 to incorporate collaborative learning activities. Data include instructional materials, participant observations, peer-reviews of collaborative learning activities, feedback from students and instructors and student projects. Themes that emerged through qualitative analysis include the challenge of removing inequalities inherent in the diverse cohort, the challenge of unifying project guidance and marking criteria, and the challenge of providing project guidance for a very large cohort. Most notably, the study revealed a need to clarify learning objectives relating to design principles in order to fully transition to and benefit from a collaborative learning model.
Abstract:
For the evaluation, design, and planning of traffic facilities and measures, traffic simulation packages are the de facto tools for consultants, policy makers, and researchers. However, the available commercial simulation packages do not always offer the desired workflow and flexibility for academic research. In many cases, researchers resort to designing and building their own dedicated models, without an intrinsic incentive (or the practical means) to make the results available in the public domain. To make matters worse, a substantial part of these efforts goes into rebuilding basic functionality and, in many respects, reinventing the wheel. This problem not only affects the research community but also adversely affects the entire traffic simulation community and frustrates the development of traffic simulation in general. To address this problem, this paper describes an open source approach, OpenTraffic, which is being developed as a collaborative effort between the Queensland University of Technology, Australia; the National Institute of Informatics, Tokyo; and the Technical University of Delft, the Netherlands. The OpenTraffic simulation framework enables academics from different geographic areas and disciplines within the traffic domain to work together and contribute to a specific topic of interest, ranging from travel choice behavior to car following, and from response to intelligent transportation systems to activity planning. The modular approach enables users of the software to focus on their area of interest, whereas other functional modules can be regarded as black boxes. Specific attention is paid to standardization of data inputs and outputs for traffic simulations. Such standardization will allow the sharing of data with many existing commercial simulation packages.
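To make the modular, black-box structure described in this abstract concrete, the following is a minimal illustrative sketch in Python, not OpenTraffic's actual API (which the abstract does not specify); the CarFollowingModel interface, the Intelligent Driver Model parameter values and the example numbers are assumptions for illustration only.

```python
from abc import ABC, abstractmethod
import math

class CarFollowingModel(ABC):
    """Pluggable behavioural module: the rest of the simulator treats it as a black box."""
    @abstractmethod
    def acceleration(self, gap: float, v: float, v_lead: float) -> float:
        ...

class IDM(CarFollowingModel):
    """Intelligent Driver Model as one interchangeable car-following implementation."""
    def __init__(self, v0=30.0, T=1.5, a=1.0, b=2.0, s0=2.0):
        self.v0, self.T, self.a, self.b, self.s0 = v0, T, a, b, s0

    def acceleration(self, gap, v, v_lead):
        # desired dynamic gap, then IDM acceleration law
        s_star = self.s0 + max(0.0, v * self.T + v * (v - v_lead) / (2 * math.sqrt(self.a * self.b)))
        return self.a * (1 - (v / self.v0) ** 4 - (s_star / gap) ** 2)

# A researcher interested only in car following swaps in their own model here,
# while other modules (network loading, output, ITS response, ...) are reused unchanged.
model: CarFollowingModel = IDM()
print(model.acceleration(gap=25.0, v=20.0, v_lead=18.0))
```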
Abstract:
Threats against computer networks evolve very fast and require increasingly complex countermeasures. We argue that teams, or groups with a common purpose, for intrusion detection and prevention improve the defence against rapidly propagating attacks, similar to the concept of teams solving complex tasks known from the sociology of work. Collaboration in this sense is not an easy task, especially in heterarchical environments. We propose CIMD (collaborative intrusion and malware detection), a security overlay framework that enables cooperative intrusion detection approaches. Objectives and associated interests are used to create detection groups for the exchange of security-related data. In this work, we contribute a tree-oriented data model for representing devices in the scope of security. We introduce an algorithm for the formation of detection groups, show realization strategies for the system and conduct a vulnerability analysis. We evaluate the benefit of CIMD through simulation and probabilistic analysis.
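The following is a hypothetical sketch of the two ideas named above, a tree-oriented device representation and interest-based formation of detection groups; the attribute names, tree structure and matching rule are illustrative assumptions, not CIMD's actual data model.

```python
# Illustrative only: each device is described by a small attribute tree, and a detection
# group collects all devices whose tree contains the group's interest pattern.

def contains(tree: dict, pattern: dict) -> bool:
    """True if every branch of `pattern` occurs in `tree`."""
    for key, sub in pattern.items():
        if key not in tree:
            return False
        if isinstance(sub, dict) and not contains(tree[key], sub):
            return False
    return True

devices = {
    "host-a": {"os": {"linux": {"debian": {}}}, "service": {"web": {"apache": {}}}},
    "host-b": {"os": {"linux": {"ubuntu": {}}}, "service": {"db": {"mysql": {}}}},
    "host-c": {"os": {"windows": {}}, "service": {"web": {"iis": {}}}},
}

# A group interested in web servers on Linux, regardless of distribution or daemon.
interest = {"os": {"linux": {}}, "service": {"web": {}}}
group = [name for name, tree in devices.items() if contains(tree, interest)]
print(group)  # ['host-a'] -- these devices would exchange security-related data
```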
Abstract:
The umbrella of Australian research higher degree (RHD) offerings has broadened from the traditional MPhil/PhD programmes to include a range of professional masters and doctoral degrees. This article reports on the experiences of three PhD students engaged in an informally managed, industry-partnered research programme, described in this article as the work integrated research higher degree (WIRHD). Their learning process shares attributes of both the traditional PhD programme and professional doctorates. However, because of the blended nature of the learning contexts, candidates engaged in the WIRHD programme must address a wider range of issues than those following the traditional RHD pathway. An exploratory case study approach was adopted with a view to developing an integrative framework that explains the various contexts influencing the learning experience of WIRHD candidates, as well as a structured approach to guide this contemporary form of industry-partnered WIRHD process.
Abstract:
Crowdsourcing has become a popular approach for capitalizing on the potential of large and open crowds of people external to the organization. While crowdsourcing as a phenomenon is studied in a variety of fields, research mostly focuses on isolated aspects, and little is known about the integrated design of crowdsourcing efforts. We introduce a socio-technical systems perspective on crowdsourcing, which provides a deeper understanding of the components and relationships in crowdsourcing systems. By considering the function of crowdsourcing systems within their organizational context, we develop a typology of four distinct system archetypes. We analyze the characteristics of each type and derive a number of design requirements for the respective system components. The paper lays a foundation for IS-based crowdsourcing research, channels related academic work, and helps guide the study and design of crowdsourcing information systems.
Abstract:
Reasoning with uncertain knowledge and belief has long been recognized as an important research issue in Artificial Intelligence (AI). Several methodologies have been proposed in the past, including knowledge-based systems, fuzzy sets, and probability theory. The probabilistic approach became popular mainly due to a knowledge representation framework called Bayesian networks. Bayesian networks have earned a reputation as powerful tools for modeling complex problems involving uncertain knowledge. Uncertain knowledge exists in domains such as medicine, law, geographical information systems and design, as it is difficult to retrieve all knowledge and experience from experts. In the design domain, experts believe that design style is an intangible concept and that its knowledge is difficult to present in a formal way. The aim of this research is to find ways to represent design style knowledge in Bayesian networks. We show that these networks can be used for diagnosis (inference) and for the classification of design style. Furniture design style is selected as the example domain; however, the method can be applied to other domains.
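As a rough illustration of the approach described, the sketch below encodes a tiny naive-Bayes-structured Bayesian network (Style -> observed features) with made-up probabilities and classifies a furniture design style by exact enumeration; the styles, features and numbers are purely illustrative and are not taken from the paper.

```python
# Illustrative sketch only: a two-feature Bayesian network with structure
# Style -> {legs, ornamentation}, queried by enumeration to infer the style.

priors = {"baroque": 0.3, "modernist": 0.7}

# P(feature value | style); features are conditionally independent given the style.
cpt = {
    "legs":          {"baroque": {"curved": 0.8, "straight": 0.2},
                      "modernist": {"curved": 0.1, "straight": 0.9}},
    "ornamentation": {"baroque": {"rich": 0.9, "plain": 0.1},
                      "modernist": {"rich": 0.05, "plain": 0.95}},
}

def posterior(evidence: dict) -> dict:
    """P(style | observed features) by multiplying prior and likelihoods, then normalising."""
    joint = dict(priors)
    for feature, value in evidence.items():
        for style in joint:
            joint[style] *= cpt[feature][style][value]
    z = sum(joint.values())
    return {style: p / z for style, p in joint.items()}

print(posterior({"legs": "curved", "ornamentation": "rich"}))     # strongly baroque
print(posterior({"legs": "straight", "ornamentation": "plain"}))  # strongly modernist
```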
Abstract:
Key distribution is one of the most challenging security issues in wireless sensor networks, where sensor nodes are randomly scattered over a hostile territory. In such a deployment scenario, there is no prior knowledge of the post-deployment configuration. For security solutions requiring pairwise keys, it is impossible to decide how to distribute key pairs to sensor nodes before deployment. Existing approaches to this problem assign more than one key, namely a key-chain, to each node. Key-chains are randomly drawn from a key-pool. Either two neighbouring nodes have a key in common in their key-chains, or there is a path, called a key-path, between the two nodes on which each pair of neighbouring nodes has a key in common. The problem in such a solution is to decide on the key-chain size and key-pool size so that every pair of nodes can establish a session key, directly or through a path, with high probability. The length of the key-path is the key factor for the efficiency of the design. This paper presents novel deterministic and hybrid approaches based on combinatorial design for key distribution. In particular, several block design techniques are considered for generating the key-chains and the key-pools. Comparison with probabilistic schemes shows that our combinatorial approach produces better connectivity with smaller key-chain sizes.
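One classical block-design construction in this spirit is a finite projective plane of prime order q used as a symmetric BIBD: keys are points, key-chains are lines, and any two key-chains then share exactly one key. The sketch below is an illustrative Python implementation of that standard construction (with q = 3 chosen arbitrarily), not the specific scheme of the paper.

```python
from itertools import product

q = 3  # prime; yields q*q + q + 1 = 13 keys and 13 key-chains of size q + 1 = 4

def normalise(v):
    """Canonical representative of a projective point/line over GF(q) (None for the zero vector)."""
    for first in v:
        if first != 0:
            inv = pow(first, -1, q)
            return tuple((x * inv) % q for x in v)
    return None

points = sorted({normalise(v) for v in product(range(q), repeat=3)} - {None})
lines = points  # by point-line duality, lines use the same canonical coordinates

# key-chain for line (a, b, c): indices of all keys (points) with a*x + b*y + c*z = 0 (mod q)
key_chains = {
    line: [i for i, p in enumerate(points) if sum(l * x for l, x in zip(line, p)) % q == 0]
    for line in lines
}

# Any two distinct key-chains share exactly one key, so any two nodes holding
# such chains can establish a session key directly, without a multi-hop key-path.
chains = list(key_chains.values())
assert all(len(set(a) & set(b)) == 1 for i, a in enumerate(chains) for b in chains[i + 1:])
print(len(points), "keys,", len(chains), "key-chains of size", len(chains[0]))
```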
Abstract:
Secure communication in distributed Wireless Sensor Networks (WSN) operating under adversarial conditions necessitates efficient key management schemes. In the absence of a priori knowledge of the post-deployment network configuration, and due to the limited resources at sensor nodes, key management schemes cannot be based on post-deployment computations. Instead, a list of keys, called a key-chain, is distributed to each sensor node before deployment. For secure communication, either two nodes should have a key in common in their key-chains, or they should establish a key over a secure-path on which every link is secured with a key. We first provide a comparative survey of well-known key management solutions for WSN. Probabilistic, deterministic and hybrid key management solutions are presented and compared on the basis of their security properties and resource usage. We provide a taxonomy of solutions and identify trade-offs among them, concluding that there is no one-size-fits-all solution. Second, we design and analyze deterministic and hybrid techniques to distribute pairwise keys to sensor nodes before deployment. We present novel deterministic and hybrid approaches, based on combinatorial design theory and graph theory, for deciding how many and which keys to assign to each key-chain before the sensor network is deployed. The performance and security of the proposed schemes are studied both analytically and computationally. Third, we address the key establishment problem in WSN, which requires that key agreement algorithms without authentication be executed over a secure-path. The length of the secure-path impacts the power consumption and the initialization delay of a WSN before it becomes operational. We formulate the key establishment problem as a constrained bi-objective optimization problem, break it into two sub-problems, and show that both are NP-Hard and MAX-SNP-Hard. Having established these inapproximability results, we focus on the authentication problem that prevents key agreement algorithms from being used directly over a wireless link. We present a fully distributed algorithm in which each pair of nodes can establish an authenticated key by using their neighbours as witnesses.
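To illustrate the secure-path notion used above, the sketch below builds a toy network in which a link counts as secured when the two neighbouring nodes share a key, and finds a shortest key-path by breadth-first search; the node names, key-chains and topology are assumptions for illustration, not data from the paper.

```python
from collections import deque

key_chains = {"A": {1, 2}, "B": {2, 3}, "C": {3, 4}, "D": {4, 5}}
neighbours = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}  # radio range

def key_path(src, dst):
    """Breadth-first search over secured links (neighbours sharing at least one key)."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in neighbours[node]:
            if nxt not in seen and key_chains[node] & key_chains[nxt]:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# A and D share no key, so a key must be established over a three-hop secure-path;
# the path length is what drives the setup delay and power cost discussed above.
print(key_path("A", "D"))  # ['A', 'B', 'C', 'D']
```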
Abstract:
Good daylighting design in buildings not only provides a comfortable luminous environment, but also delivers energy savings and comfortable, healthy environments for building occupants. Yet there is still no consensus on how to assess what constitutes good daylighting design. Amongst current building performance guidelines, daylight factors (DF) or minimum illuminance values are the standard; however, previous research has shown the shortcomings of these metrics. New computer software for daylighting analysis offers more advanced metrics, namely Climate-Based Daylight Metrics (CBDM). Yet these tools (both the new metrics and the simulation tools) are not currently well understood by architects and are not used within architectural firms in Australia. A survey of architectural firms in Brisbane identified the tools most relevant to industry. The purpose of this paper is to assess and compare these computer simulation tools and the new daylighting tools available to architects and designers. The tools are assessed in terms of their ease of use (e.g. previous knowledge required, complexity of geometry input), efficiency (e.g. speed, render capabilities) and outcomes (e.g. presentation of results). The study shows that the tools most accessible to architects are those that import a wide variety of file formats or can be integrated into current 3D modelling software or packages. These tools need to be able to calculate both point-in-time simulations and annual analyses. There is currently a need for an open-source program able to read raw data (in the form of spreadsheets) and display it graphically within a 3D medium. Development of plug-in based software is attempting to meet this need through third-party analysis, although some of these packages are heavily reliant on their host program. Such programs, which allow dynamic daylighting simulation, will make it easier to calculate accurate daylighting regardless of which modelling platform the designer uses, while producing more tangible analysis without the need to process raw data.
Abstract:
As a result of growing evidence regarding the effects of environmental characteristics on the health and wellbeing of people in healthcare facilities (HCFs), more emphasis is being placed on, and more attention being paid to, the consequences of design choices in HCFs. Therefore, we have critically reviewed the implications of key indoor physical design parameters, in relation to their potential impact on human health and wellbeing. In addition, we discussed these findings within the context of the relevant guidelines and standards for the design of HCFs. A total of 810 abstracts, which met the inclusion criteria, were identified through a Pubmed search, and these covered journal articles, guidelines, books, reports and monographs in the studied area. Of these, 231 full publications were selected for this review. According to the literature, the most beneficial design elements were: single-bed patient rooms, safe and easily cleaned surface materials, sound-absorbing ceiling tiles, adequate and sufficient ventilation, thermal comfort, natural daylight, control over temperature and lighting, views, exposure and access to nature, and appropriate equipment, tools and furniture. The effects of some design elements, such as lighting (e.g. artificial lighting levels) and layout (e.g. decentralized versus centralized nurses’ stations), on staff and patients vary, and “the best design practice” for each HCF should always be formulated in co-operation with different user groups and a multi-professional design team. The relevant guidelines and standards should also be considered in future design, construction and renovations, in order to produce more favourable physical indoor environments in HCFs.
Abstract:
This paper characterises nitrogen and phosphorus wash-off processes on urban road surfaces to create fundamental knowledge for strengthening stormwater treatment design. The study outcomes confirmed that the composition of initially available nutrients, in terms of their physical association with solids and chemical speciation, determines the wash-off characteristics. Nitrogen and phosphorus wash-off processes are independent of land use, but there are notable differences between them: nitrogen wash-off is a “source limiting” process, while phosphorus wash-off is “transport limiting”. Additionally, a clear separation between nitrogen and phosphorus wash-off processes based on dissolved and particulate forms confirmed that the common approach of replicating nutrient wash-off based on solids wash-off could lead to misleading outcomes, particularly in the case of nitrogen. Nitrogen is present primarily in dissolved and organic forms and is readily removed even by low-intensity rainfall events, which is an important consideration for nitrogen-removal-targeted treatment design. In the case of phosphorus, phosphate constitutes the primary species in wash-off for the particle size fraction <75 µm, while other species predominate in the particle size range >75 µm. This means that phosphorus-removal-targeted treatment design should consider both phosphorus speciation and particle size.
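For context, wash-off behaviour of this kind is commonly described in the urban stormwater literature with an exponential wash-off model incorporating a capacity factor; the formulation below is that standard form, not an equation quoted from this paper.

$$F_w = C_F\left(1 - e^{-k\,I\,t}\right)$$

Here $F_w$ is the fraction of the initially available pollutant washed off after duration $t$ of rainfall with intensity $I$, $k$ is a wash-off coefficient and $C_F$ is the capacity factor. In these terms, “source limiting” behaviour (as reported here for nitrogen) corresponds to $C_F$ close to 1 even at low intensities, so wash-off is bounded by the available pool itself, whereas “transport limiting” behaviour (as for phosphorus) corresponds to $C_F < 1$ and increasing with rainfall intensity, so wash-off is bounded by the transport capacity of the runoff.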
Abstract:
BACKGROUND
There is increasing enrolment of international students in the Engineering and Information Technology disciplines, and anecdotal evidence of a need for additional understanding and support for these students and their supervisors due to differences in both academic and social cultures. While there is a growing literature on supervisory styles and guidelines on effective supervision, there is little on discipline-specific, cross-cultural supervision responding to this growing diversity. In this paper, we report findings from a study of Engineering and Information Technology Higher Degree Research (HDR) students and supervision in three Australian universities.
PURPOSE
The aim was to assess the perceptions of students and supervisors of the factors influencing success that are particular to international or culturally and linguistically diverse (CaLD) HDR students in Engineering and Information Technology.
DESIGN/METHOD
Online survey and qualitative data were collected from international and CaLD HDR students and supervisors at the three universities. Bayesian network analysis, inferential statistics, and qualitative analysis provided the main findings.
RESULTS
Survey results indicate that both students and supervisors are positive about their experiences, and do not see language or culture as particularly problematic. The survey results also reveal strong consistency between the perceptions of students and supervisors on most factors influencing success. Qualitative analysis of critical supervision incidents has provided rich data that could help improve support services.
CONCLUSIONS
In contrast with the anecdotal evidence, HDR completion data from the three universities reveal that international students, on average, complete in shorter time periods than domestic students. The analysis suggests that success is linked to a complex set of factors involving the student, supervision, the institution and the broader community.