981 results for Design Editorial
Abstract:
Despite opposition from environmentalists, farmers and parts of the fishing industry, on 23 August 2012, the $6.4bn Alpha Coal mine and rail project in Queensland was approved under the EPBC Act, subject to 19 conditions.1 The approval relates to the proposed construction and operation of an open-cut coal mine and 495km railway line to Abbott Point...
Abstract:
For the evaluation, design, and planning of traffic facilities and measures, traffic simulation packages are the de facto tools for consultants, policy makers, and researchers. However, the available commercial simulation packages do not always offer the desired workflow and flexibility for academic research. In many cases, researchers resort to designing and building their own dedicated models, without an intrinsic incentive (or the practical means) to make the results available in the public domain. To make matters worse, a substantial part of these efforts goes into rebuilding basic functionality and, in many respects, reinventing the wheel. This problem not only affects the research community but adversely affects the entire traffic simulation community and frustrates the development of traffic simulation in general. To address this problem, this paper describes an open source approach, OpenTraffic, which is being developed as a collaborative effort between the Queensland University of Technology, Australia; the National Institute of Informatics, Tokyo; and the Technical University of Delft, the Netherlands. The OpenTraffic simulation framework enables academics from different geographic areas and disciplines within the traffic domain to work together and contribute to a specific topic of interest, ranging from travel choice behavior to car following, and from response to intelligent transportation systems to activity planning. The modular approach enables users of the software to focus on their area of interest, while other functional modules can be regarded as black boxes. Specific attention is paid to the standardization of data inputs and outputs for traffic simulations. Such standardization will allow the sharing of data with many existing commercial simulation packages.
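The modular, black-box design described above can be sketched as a minimal interface in which a car-following model is one pluggable module among others. This is an illustrative sketch, not the actual OpenTraffic API: the class and function names are assumptions, and the Intelligent Driver Model (IDM) shown is just one well-known model a contributor might plug in, with typical textbook parameter values.

```python
from abc import ABC, abstractmethod

class CarFollowingModel(ABC):
    """A pluggable 'black box' module: users swap implementations
    without touching the rest of the simulation."""
    @abstractmethod
    def acceleration(self, gap, speed, leader_speed):
        ...

class IDM(CarFollowingModel):
    """Intelligent Driver Model as one possible module (illustrative
    default parameters: desired speed v0 [m/s], headway T [s], etc.)."""
    def __init__(self, v0=33.3, T=1.5, a=1.0, b=1.5, s0=2.0):
        self.v0, self.T, self.a, self.b, self.s0 = v0, T, a, b, s0

    def acceleration(self, gap, speed, leader_speed):
        # Desired dynamic gap, then the IDM acceleration equation.
        s_star = (self.s0 + speed * self.T
                  + speed * (speed - leader_speed) / (2 * (self.a * self.b) ** 0.5))
        return self.a * (1 - (speed / self.v0) ** 4 - (s_star / gap) ** 2)

def step(model, gap, speed, leader_speed, dt=0.5):
    """One Euler update of the follower, using whichever module is plugged in."""
    acc = model.acceleration(gap, speed, leader_speed)
    new_speed = max(0.0, speed + acc * dt)
    new_gap = gap + (leader_speed - speed) * dt
    return new_gap, new_speed
```

Because `step` only depends on the abstract interface, a researcher interested in, say, travel choice behavior can treat the car-following module as a black box and substitute any conforming implementation.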
Abstract:
The Queensland Government released its new Environmental Offset Policy in July 2008. This policy creates a set of overarching principles which are to be incorporated into existing environmental offset policy. This article is the final article in a set of three interrelated articles discussing the operation and implementation of environmental offsets in Queensland. The first article discussed the Environmental Offsets Discussion Paper and the existing environmental offset requirements; no significant changes have been made to these existing offset requirements under the new Environmental Offset Policy. The first article also touched briefly on the legal issues associated with the design and implementation of environmental offset and trading frameworks. The second article considered the compatibility of different land tenure arrangements in Queensland with the requirements for the creation and trade of environmental offsets. The third article, the present article, discusses the application of the new Environmental Offset Policy while also analysing the legal issues associated with environmental offsets in further detail.
Abstract:
Threats against computer networks evolve very fast and require increasingly complex countermeasures. We argue that teams, or groups with a common purpose, for intrusion detection and prevention improve the measures against rapidly propagating attacks, in a manner similar to the concept of teams solving complex tasks known from the sociology of work. Collaboration in this sense is not an easy task, especially in heterarchical environments. We propose CIMD (Collaborative Intrusion and Malware Detection), a security overlay framework that enables cooperative intrusion detection approaches. Objectives and associated interests are used to create detection groups for the exchange of security-related data. In this work, we contribute a tree-oriented data model for device representation in the scope of security. We introduce an algorithm for the formation of detection groups, show realization strategies for the system, and conduct a vulnerability analysis. We evaluate the benefit of CIMD through simulation and probabilistic analysis.
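The idea of forming detection groups from a tree-oriented device representation can be sketched as follows. The attribute paths (OS family → distribution → exposed service) and the grouping-by-shared-prefix rule are illustrative assumptions for the example, not CIMD's actual data model or algorithm.

```python
from collections import defaultdict

# Hypothetical tree-oriented device representation: each device is described
# by a path of increasingly specific, security-relevant attributes.
devices = {
    "host-a": ("linux", "debian", "apache"),
    "host-b": ("linux", "ubuntu", "apache"),
    "host-c": ("windows", "win10", "iis"),
    "host-d": ("linux", "debian", "nginx"),
}

def form_groups(devices, depth):
    """Form detection groups from devices whose representation paths share
    the first `depth` attributes; deeper prefixes give more specific groups
    (e.g. hosts exposed to the same class of malware)."""
    groups = defaultdict(set)
    for name, path in devices.items():
        groups[path[:depth]].add(name)
    return dict(groups)
```

At depth 1, all Linux hosts form one group for exchanging security-related data; at depth 2, only the Debian hosts share a group, illustrating how a common interest narrows the detection group.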
Abstract:
Crowdsourcing has become a popular approach for capitalizing on the potential of large and open crowds of people external to the organization. While crowdsourcing as a phenomenon is studied in a variety of fields, research mostly focuses on isolated aspects, and little is known about the integrated design of crowdsourcing efforts. We introduce a socio-technical systems perspective on crowdsourcing, which provides a deeper understanding of the components and relationships in crowdsourcing systems. By considering the function of crowdsourcing systems within their organizational context, we develop a typology of four distinct system archetypes. We analyze the characteristics of each type and derive a number of design requirements for the respective system components. The paper lays a foundation for IS-based crowdsourcing research, channels related academic work, and helps guide the study and design of crowdsourcing information systems.
Abstract:
Reasoning with uncertain knowledge and belief has long been recognized as an important research issue in Artificial Intelligence (AI). Several methodologies have been proposed in the past, including knowledge-based systems, fuzzy sets, and probability theory. The probabilistic approach became popular mainly due to a knowledge representation framework called Bayesian networks. Bayesian networks have earned a reputation as powerful tools for modeling complex problems involving uncertain knowledge. Uncertain knowledge exists in domains such as medicine, law, geographical information systems and design, as it is difficult to retrieve all knowledge and experience from experts. In the design domain, experts believe that design style is an intangible concept and that knowledge about it is difficult to present in a formal way. The aim of this research is to find ways to represent design style knowledge in Bayesian networks. We show that these networks can be used for diagnosis (inference) and classification of design style. Furniture design style is selected as the example domain; however, the method can be applied to any other domain.
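As an illustration of the kind of inference such a network supports, here is a minimal hand-rolled network with a naive-Bayes structure (a style node with conditionally independent observed feature nodes) for classifying furniture design style. The styles, features and probabilities are invented for the example, not taken from the paper.

```python
# Toy two-layer Bayesian network: Style -> observed furniture features.
# Priors and conditional probability tables (CPTs) are illustrative
# assumptions, not elicited expert knowledge.
priors = {"baroque": 0.3, "minimalist": 0.7}
# P(feature is present | style)
cpt = {
    "curved_legs":   {"baroque": 0.9, "minimalist": 0.10},
    "ornamentation": {"baroque": 0.8, "minimalist": 0.05},
}

def posterior(observed):
    """P(style | observed features), assuming features are conditionally
    independent given the style, normalized over all styles."""
    scores = {}
    for style, prior in priors.items():
        p = prior
        for feat, present in observed.items():
            p_f = cpt[feat][style]
            p *= p_f if present else (1 - p_f)
        scores[style] = p
    z = sum(scores.values())
    return {s: p / z for s, p in scores.items()}
```

Observing curved legs and heavy ornamentation shifts the posterior strongly toward "baroque" despite its lower prior, which is exactly the diagnosis-style inference the abstract describes.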
Abstract:
Key distribution is one of the most challenging security issues in wireless sensor networks, where sensor nodes are randomly scattered over a hostile territory. In such a deployment scenario, there is no prior knowledge of the post-deployment configuration. For security solutions requiring pairwise keys, it is impossible to decide how to distribute key pairs to sensor nodes before deployment. Existing approaches to this problem assign more than one key, namely a key-chain, to each node. Key-chains are randomly drawn from a key-pool. Either two neighbouring nodes have a key in common in their key-chains, or there is a path, called a key-path, between the two nodes on which each pair of neighbouring nodes has a key in common. The problem in such a solution is to decide on the key-chain size and key-pool size so that every pair of nodes can establish a session key, directly or through a path, with high probability. The length of the key-path is the key factor in the efficiency of the design. This paper presents novel deterministic and hybrid approaches based on combinatorial design for key distribution. In particular, several block design techniques are considered for generating the key-chains and the key-pools. Comparison with probabilistic schemes shows that our combinatorial approach produces better connectivity with smaller key-chain sizes.
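One classical block design used for this purpose builds key-chains from the lines of a projective plane PG(2, q), q prime: the key-pool has q²+q+1 keys, each key-chain holds q+1 keys, and any two chains share exactly one key, so every pair of nodes can establish a direct session key. The sketch below illustrates that construction; it is not necessarily the exact scheme of the paper.

```python
def projective_plane_blocks(q):
    """Key-chains as lines of PG(2, q) for prime q: a pool of q*q + q + 1
    keys, chains of q + 1 keys, any two chains intersecting in one key."""
    # Normalized representatives of the projective points over GF(q).
    pts = ([(1, a, b) for a in range(q) for b in range(q)]
           + [(0, 1, b) for b in range(q)]
           + [(0, 0, 1)])
    key_id = {p: i for i, p in enumerate(pts)}  # one key per point
    blocks = []
    for line in pts:  # by duality, lines have the same coordinate reps
        chain = {key_id[p] for p in pts
                 if (line[0] * p[0] + line[1] * p[1] + line[2] * p[2]) % q == 0}
        blocks.append(frozenset(chain))
    return blocks
```

For q = 3 this yields 13 key-chains of 4 keys each from a 13-key pool, and any two chains share exactly one key, which is the deterministic connectivity guarantee that probabilistic key-pool schemes can only approach.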
Abstract:
Secure communications in distributed Wireless Sensor Networks (WSN) operating under adversarial conditions necessitate efficient key management schemes. In the absence of a priori knowledge of the post-deployment network configuration, and due to limited resources at sensor nodes, key management schemes cannot be based on post-deployment computations. Instead, a list of keys, called a key-chain, is distributed to each sensor node before deployment. For secure communication, either two nodes should have a key in common in their key-chains, or they should establish a key through a secure path on which every link is secured with a key. We first provide a comparative survey of well-known key management solutions for WSN. Probabilistic, deterministic and hybrid key management solutions are presented and compared based on their security properties and resource usage. We provide a taxonomy of solutions and identify their trade-offs to conclude that there is no one-size-fits-all solution. Second, we design and analyze deterministic and hybrid techniques to distribute pairwise keys to sensor nodes before deployment. We present novel deterministic and hybrid approaches based on combinatorial design theory and graph theory for deciding how many, and which, keys to assign to each key-chain before the sensor network deployment. The performance and security of the proposed schemes are studied both analytically and computationally. Third, we address the key establishment problem in WSN, in which key agreement algorithms without authentication must be executed over a secure path. The length of the secure path impacts the power consumption and the initialization delay of a WSN before it becomes operational. We formulate the key establishment problem as a constrained bi-objective optimization problem, break it into two sub-problems, and show that both are NP-Hard and MAX-SNP-Hard. Having established these inapproximability results, we focus on the authentication problem that prevents key agreement algorithms from being used directly over a wireless link. We present a fully distributed algorithm in which each pair of nodes can establish an authenticated key by using their neighbors as witnesses.
Abstract:
Good daylighting design in buildings not only provides a comfortable luminous environment, but also delivers energy savings and comfortable, healthy environments for building occupants. Yet there is still no consensus on how to assess what constitutes good daylighting design. Among current building performance guidelines, daylight factors (DF) or minimum illuminance values are the standard; however, previous research has shown the shortcomings of these metrics. New computer software for daylighting analysis offers more advanced Climate-Based Daylight Metrics (CBDM). Yet these tools (new metrics or simulation tools) are not currently well understood by architects and are not used within architectural firms in Australia. A survey of architectural firms in Brisbane identified the tools most used by industry. The purpose of this paper is to assess and compare these computer simulation tools and the new tools available to architects and designers for daylighting. The tools are assessed in terms of their ease of use (e.g. previous knowledge required, complexity of geometry input), efficiency (e.g. speed, render capabilities) and outcomes (e.g. presentation of results). The study shows that the tools most accessible to architects are those that import a wide variety of file formats or can be integrated into current 3D modelling software or packages. These packages need to be able to calculate both point-in-time simulations and annual analyses. There is a current need for an open source program able to read raw data (in the form of spreadsheets) and display it graphically within a 3D medium. Development of plug-in based software is attempting to meet this need through third-party analysis; however, some of these packages are heavily reliant on their host program. Programs that allow dynamic daylighting simulation will make it easier to calculate accurate daylighting regardless of the modelling platform the designer uses, while producing more tangible analysis without the need to process raw data.
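For reference, the standard metric mentioned above is simple to compute: the daylight factor is the ratio of indoor illuminance at a point to the simultaneous outdoor illuminance under an overcast sky, expressed as a percentage. A minimal sketch of post-processing a spreadsheet-style grid of raw illuminance values follows; the function names and the 2% target are illustrative, not from any particular guideline cited in the paper.

```python
def daylight_factor(e_in, e_out):
    """DF (%) = indoor illuminance / simultaneous outdoor illuminance
    under an overcast sky, times 100."""
    return 100.0 * e_in / e_out

def summarize(grid, e_out, target=2.0):
    """Average DF and fraction of sensor points meeting a target DF,
    from a raw illuminance grid (e.g. exported as a spreadsheet).
    Returns (average_df, fraction_meeting_target)."""
    dfs = [daylight_factor(e, e_out) for row in grid for e in row]
    meeting = sum(1 for d in dfs if d >= target)
    return sum(dfs) / len(dfs), meeting / len(dfs)
```

This is exactly the kind of raw-data-to-graphic pipeline the abstract argues should be opened up: the grid could be re-plotted over the 3D model rather than left in a spreadsheet.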
Abstract:
A recent comment in the Journal of Sports Sciences (MacNamara & Collins, 2011) highlighted some major concerns with the current structure of talent identification and development (TID) programmes of Olympic athletes (e.g. Gulbin, 2008; Vaeyens, Gullich, Warr, & Philippaerts, 2009). In a cogent commentary, MacNamara and Collins (2011) provided a short review of the extant literature, which was both timely and insightful. Specifically, they criticised the ubiquitous one-dimensional ‘physically-biased’ attempts to produce world class performers, emphasising the need to consider a number of key environmental variables in a more multi-disciplinary perspective. They also lamented the wastage of talent, and alluded to the operational and opportunistic nature of current talent transfer programmes. A particularly compelling aspect of the comment was their allusion to high profile athletes who had ‘failed’ performance evaluation tests and then proceeded to succeed in that sport. This issue identifies a problem with current protocols for evaluating performance and highlights a line of research that is sorely needed in the area of talent development. To understand the nature of talent wastage that might be occurring in high performance programmes in sport, future empirical work should seek to follow the career paths of ‘successful’ and ‘unsuccessful’ products of TID programmes, in comparative analyses. Pertinent to the insights of MacNamara and Collins (2011), it remains clear that a number of questions have not received enough attention from sport scientists interested in talent development, including: (i) why is there so much wastage of talent in such programmes? and (ii) why are there so few reported examples of successful talent transfer programmes? These questions highlight critical areas for future investigation.
The aim of this short correspondence is to discuss these and other issues researchers and practitioners might consider, and to propose how an ecological dynamics underpinning to such investigations may help the development of existing protocols...
Abstract:
Russell, Benton and Kingsley (2010) recently suggested a new association football test comprising three different tasks for the evaluation of players' passing, dribbling and shooting skills. Their stated intention was to enhance ‘ecological validity’ of current association football skills tests allowing generalisation of results from the new protocols to performance constraints that were ‘representative’ of experiences during competitive game situations. However, in this comment we raise some concerns with their use of the term ‘ecological validity’ to allude to aspects of ‘representative task design’. We propose that in their paper the authors confused understanding of environmental properties, performance achievement and generalisability of the test and its outcomes. Here, we argue that the tests designed by Russell and colleagues did not include critical sources of environmental information, such as the active role of opponents, which players typically use to organise their actions during performance. Static tasks which are not representative of the competitive performance environment may lead to different emerging patterns of movement organisation and performance outcomes, failing to effectively evaluate skills performance in sport.
Abstract:
As a result of growing evidence regarding the effects of environmental characteristics on the health and wellbeing of people in healthcare facilities (HCFs), more emphasis is being placed on, and more attention being paid to, the consequences of design choices in HCFs. Therefore, we have critically reviewed the implications of key indoor physical design parameters, in relation to their potential impact on human health and wellbeing. In addition, we discussed these findings within the context of the relevant guidelines and standards for the design of HCFs. A total of 810 abstracts, which met the inclusion criteria, were identified through a Pubmed search, and these covered journal articles, guidelines, books, reports and monographs in the studied area. Of these, 231 full publications were selected for this review. According to the literature, the most beneficial design elements were: single-bed patient rooms, safe and easily cleaned surface materials, sound-absorbing ceiling tiles, adequate and sufficient ventilation, thermal comfort, natural daylight, control over temperature and lighting, views, exposure and access to nature, and appropriate equipment, tools and furniture. The effects of some design elements, such as lighting (e.g. artificial lighting levels) and layout (e.g. decentralized versus centralized nurses’ stations), on staff and patients vary, and “the best design practice” for each HCF should always be formulated in co-operation with different user groups and a multi-professional design team. The relevant guidelines and standards should also be considered in future design, construction and renovations, in order to produce more favourable physical indoor environments in HCFs.
Abstract:
Air pollution has significant impacts on both the environment and human health. Urban areas have therefore received ever-growing attention, because they not only have the highest concentrations of air pollutants but also the highest human populations. In modern societies, urban air quality (UAQ) is routinely evaluated, and local authorities provide regular reports to the public about current UAQ levels. Both local and international authorities have also recommended that certain air pollutant concentrations remain below specified levels, with the aim of reducing emissions and improving air quality, both in urban areas and on a more regional scale. In some countries, protocols aimed at reducing emissions have come into force as a result of international agreements.