851 results for Television -- Antennas -- Design and construction -- Data processing
Abstract:
The Commonwealth Department of Industry, Science and Resources is identifying best practice case study examples of supply chain management within the building and construction industry to illustrate the concepts, innovations and initiatives that are at work. The projects provide individual enterprises with examples of how to improve their performance, and the competitiveness of the industry as a whole.
Abstract:
The goal of this research project is to develop specific BIM objects for temporary construction activities that are fully integrated with object design, construction efficiency and safety parameters. Specifically, the project will deliver modularised electronic scaffolding and formwork objects that designers can easily incorporate into BIM models to facilitate smarter and safer infrastructure and building construction. This research first identified a distinct lack of BIM objects for temporary construction works, which results in productivity loss during design and construction and in missed opportunities to incorporate safety standards and practices into the design of scaffolding and formwork. This is particularly relevant in Australia, given the “harmonisation” of OHS legislation across all states and territories from 1 January 2012, meaning that enhancements to Queensland practices will have direct application across Australia. Thus, in conjunction with government and industry partners in Queensland, Australia, the research team developed a strategic three-phase research methodology: (1) a preliminary review phase on industrial scaffolding and formwork practices and BIM implementation; (2) a BIM object development phase with specific safety and productivity functions; and (3) a Queensland-wide workshop phase for product dissemination and training. This paper discusses the background review findings, details of the developed methodology, and expected research outcomes and their contributions to the Australian construction industry.
Abstract:
Objectives: Despite many years of research, there is currently no treatment available that results in major neurological or functional recovery after traumatic spinal cord injury (tSCI). In particular, no conclusive data related to the role of the timing of decompressive surgery, and the impact of injury severity on its benefit, have been published to date. This paper presents a protocol that was designed to examine the hypothesized association between the timing of surgical decompression and the extent of neurological recovery in tSCI patients. Study design: The SCI-POEM study is a Prospective, Observational European Multicenter comparative cohort study. This study compares acute (<12 h) versus non-acute (>12 h, <2 weeks) decompressive surgery in patients with a traumatic spinal column injury and concomitant spinal cord injury. The sample size calculation was based on a representative European patient cohort of 492 tSCI patients. During a 4-year period, 300 patients will need to be enrolled from 10 trauma centers across Europe. The primary endpoint is the lower-extremity motor score as assessed according to the 'International standards for neurological classification of SCI' at 12 months after injury. Secondary endpoints include motor, sensory, imaging and functional outcomes at 3, 6 and 12 months after injury. Conclusion: In order to minimize bias and reduce the impact of confounders, special attention is paid to key methodological principles in this study protocol. A significant difference in safety and/or efficacy endpoints will provide meaningful information to clinicians, as this would confirm the hypothesis that rapid referral to and treatment in specialized centers result in important improvements in tSCI patients. Spinal Cord advance online publication, 17 April 2012; doi:10.1038/sc.2012.34.
Abstract:
Numerous different, and sometimes conflicting, interests can be affected, both positively and negatively, over the course of a major infrastructure and construction (MIC) project. Failing to address and meet the concerns and expectations of the stakeholders involved has resulted in many project failures. One way to address this issue is through a participatory approach to project decision making. Whether the participation mechanism is effective or not depends largely on the client/owner. This paper provides a means of systematically evaluating the effectiveness of the public participation exercise, or even the whole project, through the measurement of stakeholder satisfaction. Since the process of satisfaction measurement is complicated and uncertain, requiring approximate reasoning involving human intuition, a fuzzy approach is adopted. From this, a multi-factor hierarchical fuzzy comprehensive evaluation model is established to facilitate the evaluation of satisfaction both for a single stakeholder group and across the overall set of MIC project stakeholders.
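The lowest level of such a hierarchical fuzzy comprehensive evaluation can be illustrated with a minimal sketch; the factor weights, satisfaction grades and membership values below are hypothetical, not taken from the paper:

```python
# Minimal sketch of a single-level fuzzy comprehensive evaluation step.
# All weights and membership grades here are invented for illustration.

def fuzzy_evaluate(weights, membership):
    """Combine factor weights with a fuzzy membership matrix.

    weights: factor weights summing to 1.
    membership: rows = factors, columns = satisfaction grades
                (e.g. low / medium / high), entries in [0, 1].
    Returns the aggregated membership vector over the grades.
    """
    n_grades = len(membership[0])
    result = [0.0] * n_grades
    for w, row in zip(weights, membership):
        for j, m in enumerate(row):
            result[j] += w * m  # weighted-average aggregation operator
    return result

# Three hypothetical factors (e.g. communication, responsiveness, transparency)
weights = [0.5, 0.3, 0.2]
membership = [
    [0.1, 0.3, 0.6],   # factor 1: low, medium, high satisfaction
    [0.2, 0.5, 0.3],
    [0.4, 0.4, 0.2],
]
print(fuzzy_evaluate(weights, membership))
```

A full hierarchical model would apply the same step recursively, feeding each stakeholder group's aggregated vector into the next level of the hierarchy.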
Abstract:
New substation technology, such as non-conventional instrument transformers, and a need to reduce design and construction costs, are driving the adoption of Ethernet based digital process bus networks for high voltage substations. Protection and control applications can share a process bus, making more efficient use of the network infrastructure. This paper classifies and defines performance requirements for the protocols used in a process bus on the basis of application. These include GOOSE, SNMP and IEC 61850-9-2 sampled values. A method, based on the Multiple Spanning Tree Protocol (MSTP) and virtual local area networks, is presented that separates management and monitoring traffic from the rest of the process bus. A quantitative investigation of the interaction between various protocols used in a process bus is described. These tests also validate the effectiveness of the MSTP based traffic segregation method. While this paper focuses on a substation automation network, the results are applicable to other real-time industrial networks that implement multiple protocols. High volume sampled value data and time-critical circuit breaker tripping commands do not interact on a full duplex switched Ethernet network, even under very high network load conditions. This enables an efficient digital network to replace a large number of conventional analog connections between control rooms and high voltage switchyards.
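The VLAN-based segregation of management traffic from time-critical process-bus traffic can be caricatured with a small classification sketch; the VLAN IDs and 802.1p priorities below are hypothetical and are not the paper's configuration:

```python
# Illustrative sketch (not vendor switch configuration): mapping process-bus
# traffic classes onto separate VLANs and priorities, in the spirit of the
# MSTP/VLAN-based segregation described above. All values are hypothetical.

TRAFFIC_CLASSES = {
    "sv":    {"vlan": 100, "priority": 6},  # IEC 61850-9-2 sampled values
    "goose": {"vlan": 100, "priority": 7},  # trip commands: highest priority
    "snmp":  {"vlan": 200, "priority": 1},  # management/monitoring traffic
}

def classify(protocol):
    """Return the VLAN and 802.1p priority for a process-bus frame type."""
    try:
        return TRAFFIC_CLASSES[protocol]
    except KeyError:
        raise ValueError(f"unknown process-bus protocol: {protocol}")

# Management traffic lands on its own VLAN, away from protection traffic.
assert classify("snmp")["vlan"] != classify("goose")["vlan"]
```

Keeping SNMP on a separate VLAN (and MSTP instance) means monitoring traffic cannot contend with sampled values or GOOSE trip messages for the same forwarding resources.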
Abstract:
This paper is based on an Australian Learning & Teaching Council (ALTC) funded evaluation of the use of Engineers Without Borders (EWB) projects in first-year engineering courses at 13 universities across Australia and New Zealand. All of the partner institutions have implemented this innovation differently, and comparison of these implementations affords us the opportunity to assemble "a body of carefully gathered data that provides evidence of which approaches work for which students in which learning environments". This study used a mixed-methods data collection approach and a realist analysis. Data were collected through program logic analysis with course co-ordinators, observation of classes, focus groups with students, an exit survey of students and interviews with staff, as well as scrutiny of relevant course and curriculum documents. Course designers and co-ordinators gave us a range of reasons for using the projects, most of which alluded to their presumed capacity to deliver experience in, and learning of, higher-order thinking skills in areas such as sustainability, ethics, teamwork and communication. For some students, however, the nature of the projects decreased their interest in issues such as ethical development, sustainability and how to work in teams. We also found that the projects provoked different responses from students depending on the nature of the courses in which they were embedded (general introduction, design, communication, or problem-solving courses) and their mode of delivery (lecture, workshop or online).
Abstract:
Threats against computer networks evolve very fast and require increasingly complex countermeasures. We argue that teams, or groups with a common purpose, for intrusion detection and prevention improve the defence against rapidly propagating attacks, analogous to the concept of teams solving complex tasks known from the sociology of work. Collaboration in this sense is not an easy task, especially in heterarchical environments. We propose CIMD (collaborative intrusion and malware detection), a security overlay framework that enables cooperative intrusion detection approaches. Objectives and associated interests are used to create detection groups for the exchange of security-related data. In this work, we contribute a tree-oriented data model for representing devices in the scope of security. We introduce an algorithm for the formation of detection groups, show realization strategies for the system and conduct a vulnerability analysis. We evaluate the benefit of CIMD by simulation and probabilistic analysis.
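The idea of forming detection groups from device descriptions can be sketched roughly as grouping over a shared detection objective; the device records and attribute names below are invented for illustration and greatly simplify the paper's tree-oriented model:

```python
# Hypothetical sketch of detection-group formation: devices are described by
# flat attribute records (a simplification of a tree-oriented device model),
# and groups are formed from devices sharing a detection objective.

from collections import defaultdict

devices = [
    {"id": "host-a", "os": "linux",   "role": "webserver",  "objective": "worm-detection"},
    {"id": "host-b", "os": "windows", "role": "desktop",    "objective": "worm-detection"},
    {"id": "host-c", "os": "linux",   "role": "mailserver", "objective": "spam-detection"},
]

def form_groups(devices, key="objective"):
    """Group device IDs by a shared attribute (here, the detection objective)."""
    groups = defaultdict(list)
    for d in devices:
        groups[d[key]].append(d["id"])
    return dict(groups)

print(form_groups(devices))
```

Devices in the same group would then exchange security-related data relevant to their shared objective, regardless of operating system or role.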
Abstract:
The design and construction community has shown increasing interest in adopting building information models (BIMs). The richness of information provided by BIMs has the potential to streamline the design and construction processes by enabling enhanced communication, coordination, automation and analysis. However, there are many challenges in extracting construction-specific information out of BIMs. In most cases, construction practitioners have to manually identify the required information, which is inefficient and prone to error, particularly for complex, large-scale projects. This paper describes the process and methods we have formalized to partially automate the extraction and querying of construction-specific information from a BIM. We describe methods for analyzing a BIM to query for spatial information that is relevant for construction practitioners, and that is typically represented implicitly in a BIM. Our approach integrates ifcXML data and other spatial data to develop a richer model for construction users. We employ custom 2D topological XQuery predicates to answer a variety of spatial queries. The validation results demonstrate that this approach provides a richer representation of construction-specific information compared to existing BIM tools.
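A topological predicate of the kind the paper implements as custom XQuery functions can be sketched in simplified form; here 2D footprints are reduced to axis-aligned bounding boxes, an assumption made only for this illustration:

```python
# Sketch of 2D topological predicates over geometry that could be derived
# from ifcXML building elements. Reducing footprints to axis-aligned
# bounding boxes is a simplification made for this example.

def contains(outer, inner):
    """True if 2D box `outer` fully contains `inner`.

    Boxes are (xmin, ymin, xmax, ymax) tuples, e.g. a room footprint
    and a piece of temporary equipment placed within it.
    """
    return (outer[0] <= inner[0] and outer[1] <= inner[1] and
            outer[2] >= inner[2] and outer[3] >= inner[3])

def overlaps(a, b):
    """True if two 2D boxes intersect (share any area or edge)."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

room = (0.0, 0.0, 10.0, 8.0)
crane_pad = (2.0, 1.0, 5.0, 4.0)
assert contains(room, crane_pad)
```

A construction-oriented spatial query such as "which temporary works fall inside this work zone?" is then a filter over such predicates, applied to elements extracted from the model.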
Abstract:
Traffic congestion has a significant impact on the economy and environment. Encouraging the use of multimodal transport (public transport, bicycle, park’n’ride, etc.) has been identified by traffic operators as a good strategy to tackle congestion issues and their detrimental environmental impacts. A multi-modal and multi-objective trip planner provides users with various multi-modal options optimised on the objectives that they prefer (cheapest, fastest, safest, etc.) and has the potential to reduce congestion on both a temporal and spatial scale. The computation of multi-modal and multi-objective trips is a complicated mathematical problem, as it must integrate and utilize a diverse range of large data sets, including both road network information and public transport schedules, as well as optimising for a number of competing objectives, where fully optimising for one objective, such as travel time, can adversely affect other objectives, such as cost. The relationship between these objectives can also be quite subjective, as their priorities will vary from user to user. This paper will first outline the various data requirements and formats that are needed for the multi-modal multi-objective trip planner to operate, including static information about the physical infrastructure within Brisbane as well as real-time and historical data to predict traffic flow on the road network and the status of public transport. It will then present information on the graph data structures representing the road and public transport networks within Brisbane that are used in the trip planner to calculate optimal routes. This will allow for an investigation into the various shortest path algorithms that have been researched over the last few decades, and provide a foundation for the construction of the Multi-modal Multi-objective Trip Planner through the development of innovative new algorithms that can operate on the large, diverse data sets and competing objectives.
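The single-objective core of such a trip planner is a shortest-path search over the multi-modal graph. The sketch below runs Dijkstra's algorithm over a tiny hypothetical network, with a pluggable weight function standing in for the competing objectives; the nodes, edges and costs are invented for illustration:

```python
# Dijkstra's algorithm over a small hypothetical multi-modal graph.
# The weight function selects which objective (time, cost, ...) to optimise.

import heapq

def dijkstra(graph, source, target, weight=lambda e: e["time"]):
    """graph: {node: [(neighbour, edge_attrs), ...]}; returns optimal cost."""
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, attrs in graph.get(u, []):
            nd = d + weight(attrs)
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

# Walk + train versus a direct (but slow) bus
graph = {
    "home":    [("station", {"mode": "walk",  "time": 10, "cost": 0.0}),
                ("cbd",     {"mode": "bus",   "time": 45, "cost": 3.5})],
    "station": [("cbd",     {"mode": "train", "time": 20, "cost": 4.0})],
}
print(dijkstra(graph, "home", "cbd"))  # fastest: walk + train, 30 minutes
```

Swapping the weight function (e.g. to `e["cost"]`) changes the optimal route to the direct bus, which is exactly the trade-off between competing objectives that the planner must manage.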
Abstract:
The main objective of this paper is to describe the development of a remote-sensing airborne air sampling system for Unmanned Aerial Systems (UAS), providing the capability to detect particle and gas concentrations in real time over remote locations. The design of the air sampling methodology started by defining the system architecture, and then by selecting and integrating each subsystem. A multifunctional air sampling instrument, with the capability for simultaneous measurement of particle and gas concentrations, was modified and integrated with ARCAA’s Flamingo UAS platform and communications protocols. As a result of the integration process, a system capable of both real-time geo-location monitoring and indexed-link sampling was obtained. Wind tunnel tests were conducted in order to evaluate the performance of the air sampling instrument under controlled non-stationary conditions at the typical operational velocities of the UAS platform. Once the fully operational remote air sampling system was obtained, the problem of mission design was analyzed through the simulation of different scenarios. Furthermore, flight tests of the complete air sampling system were conducted to check the dynamic characteristics of the UAS with the air sampling system and to prove its capability to perform an air sampling mission following a specific flight path.
Abstract:
Deterministic computer simulations of physical experiments are now common techniques in science and engineering, since physical experiments are often too time-consuming, expensive or impossible to conduct. The use of complex computer models, or codes, in place of physical experiments leads to the study of computer experiments, which are used to investigate many scientific phenomena of this nature. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using computer experiments. In particular, the question of how many computer experiments to run, and how they should be augmented, is studied, and attention is given to the case where the response is a function over time.
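A standard space-filling design for choosing the input runs of a computer experiment is the Latin hypercube. The sketch below draws one using only the standard library; a real study would use a dedicated design package, and the run count and dimension here are arbitrary:

```python
# Minimal Latin hypercube sample: each input variable is divided into
# n_runs equal strata, and each stratum is used exactly once per variable.

import random

def latin_hypercube(n_runs, n_inputs, seed=0):
    """Return n_runs points in [0, 1]^n_inputs, stratified per input."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_inputs):
        # one shuffled stratum index per run, jittered within its stratum
        perm = list(range(n_runs))
        rng.shuffle(perm)
        columns.append([(p + rng.random()) / n_runs for p in perm])
    # transpose the per-input columns into per-run points
    return [[col[i] for col in columns] for i in range(n_runs)]

design = latin_hypercube(5, 2)
for point in design:
    print(point)
```

Because every stratum of every input appears exactly once, the design spreads a small number of expensive code runs evenly across the input space, which is what makes it attractive when each run of the simulator is costly.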