999 results for Efficient Presentations
Abstract:
IT-supported field data management benefits on-site construction management by improving accessibility to information and promoting efficient communication between project team members. However, most on-site safety inspections still rely heavily on subjective judgment and manual reporting processes, so observers' experience often determines the quality of risk identification and control. This study aims to develop a methodology for efficiently retrieving safety-related information so that safety inspectors can easily access relevant site safety information for safer decision making. The proposed methodology consists of three stages: (1) development of a comprehensive safety database containing information on risk factors, accident types, accident impacts and safety regulations; (2) identification of relationships among different risk factors using statistical analysis methods; and (3) user-specified information retrieval using data mining techniques for safety management. This paper presents the overall methodology and preliminary results of the first stage, conducted with 101 accident investigation reports.
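The third stage describes user-specified retrieval over the safety database. As a rough illustration only, the sketch below ranks hypothetical accident-report snippets against an inspector's query using TF-IDF cosine similarity; the record texts, the query and the choice of similarity measure are invented for the example and are not taken from the study.

```python
# Minimal sketch: rank safety records against an inspector's query (hypothetical data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical accident-report snippets; a real safety database would also hold
# risk factors, accident types, impacts and the applicable regulations.
records = [
    "fall from scaffold due to missing guardrail during facade work",
    "struck by material hoisted over walkway without exclusion zone",
    "electric shock from damaged temporary wiring in wet conditions",
]

query = "worker on scaffold without edge protection"

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(records + [query])
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()

# Present the most relevant records first.
for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.2f}  {records[idx]}")
```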
Abstract:
Despite an ostensibly technology-driven society, the ability to communicate orally continues to be regarded as an essential skill for students at school and university, as it is for graduates in the workplace. Pedagogically, one rationale is that the need to develop effective oral communication skills is tied to life-long learning, which includes successful participation in future work-related tasks. One tangible way that educators have assessed proficiency in communication is through prepared oral presentations. While much of the literature uses the terms 'oral communication' and 'oral presentation' interchangeably, some writers question the role more formal presentations play in the overall development of oral communication skills. However, such formal speaking tasks remain a recognised assessment practice in both secondary schools and the academy, and are therefore worthy of further investigation. Adding to the discussion, this thesis explores the knowledge and skills students bring into the academy from previous educational experiences. It examines some of the teaching and assessment methods used in secondary schools to develop oral communication skills through formal oral presentations. Specifically, it investigates criterion-referenced assessment sheets, how these tools are used as a form of instruction, and their role and effectiveness in the evaluation of student ability. The focus is on the student's perspective and includes 12 semi-structured interviews with school students. The purpose of this thesis is to explore key themes underpinning oral communication and to identify tensions between expectations and practice. While acknowledging the breadth and depth of material available under the heading of 'communication theory', this study specifically draws on an expanded view of the rhetorical tradition to interrogate the assumptions supporting the practice of assessing oral presentations. Finally, this thesis recommends reconnecting with an updated understanding of rhetoric as a way of assisting in the development of expressive, articulate and discerning communicators.
Abstract:
Finding an appropriate linking method to connect element types of different dimensions in a single finite element model is a key issue in multi-scale modeling. This paper presents a mixed-dimensional coupling method that uses multi-point constraint equations, derived by equating the work done on either side of the interface connecting beam elements and shell elements, to construct a finite element multi-scale model. A typical steel truss frame structure is selected as a case example, and a reduced-scale specimen of this truss section is studied in the laboratory to measure its dynamic and static behaviour in the global truss and in local welded details, while different analytical models are developed for numerical simulation. Comparison of the calculated dynamic and static responses among the different numerical models, together with their good agreement with the experimental results, indicates that the proposed multi-scale model is efficient and accurate.
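As a generic illustration of the work-equivalence idea (not the paper's exact derivation), the constraint equations can be obtained by requiring that the single beam node and the shell interface nodes perform the same virtual work, which ties the two sets of degrees of freedom together through multi-point constraints.

```latex
% Generic sketch of work-equivalence coupling at a beam-shell interface.
% u_b: displacements/rotations of the beam node; u_i: shell interface node DOFs.
\begin{align}
  \delta W_{\text{beam}}  &= \mathbf{f}_b^{\mathsf{T}}\,\delta\mathbf{u}_b ,\\
  \delta W_{\text{shell}} &= \sum_{i=1}^{n} \mathbf{f}_i^{\mathsf{T}}\,\delta\mathbf{u}_i ,\\
  \delta W_{\text{beam}} = \delta W_{\text{shell}}
  \;\;\Longrightarrow\;\;
  \mathbf{u}_b &= \sum_{i=1}^{n} \mathbf{T}_i\,\mathbf{u}_i ,
\end{align}
% The transformation matrices T_i follow from distributing the beam-theory stress
% resultants over the shell interface nodes; each row of the resulting relation is
% one multi-point constraint equation imposed on the coupled model.
```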
Abstract:
Pulmonary drug delivery is the focus of much research and development because of its great potential to produce maximum therapeutic benefit. Among the available options, the dry powder inhaler (DPI) is the preferred device for the treatment of an increasingly diverse number of diseases. However, as drug delivery from a DPI involves a complicated set of physical processes and the integration of drug formulations, device design and patient usage, the engineering development of this medical technology is proving to be a great challenge. Currently there is a large range of devices that are either available on the market or under development; however, none exhibits clearly superior clinical efficacy. A major concern is the inter- and intra-patient variability of the drug dosage delivered to the deep lungs. The extent of variability depends on the drug formulation, the device design and the patient's inhalation profile. This article reviews recent advances in DPI technology and presents the key factors which motivate and constrain the successful engineering of a universal, patient-independent DPI that is capable of efficient, reliable and repeatable drug delivery. A strong emphasis is placed on the physical processes of drug powder aerosolisation, deagglomeration and dispersion, and on the engineering of formulations and inhalers that can optimise these processes.
Abstract:
Vehicular safety applications, such as cooperative collision warning systems, rely on beaconing to provide the situational awareness needed to predict, and therefore avoid, possible collisions. Beaconing is the continual exchange of vehicle motion-state information, such as position, speed, and heading, which enables each vehicle to track its neighboring vehicles in real time. This work presents a context-aware adaptive beaconing scheme that dynamically adapts the beaconing repetition rate based on an estimated channel load and the danger severity of the interactions among vehicles. The safety, efficiency, and scalability of the new scheme are evaluated by simulating vehicle collisions caused by inattentive drivers under various road traffic densities. Simulation results show that the new scheme is more efficient and scalable, and improves safety more than existing non-adaptive and adaptive-rate schemes.
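As an illustrative sketch only (the abstract does not give the actual adaptation law), the snippet below shows one plausible way to choose a beacon rate from an estimated channel load and a danger-severity score; the rate bounds and the blending rule are invented for the example.

```python
# Illustrative sketch: adapt the beaconing rate to channel load and danger severity.
# The constants and the blending rule are assumptions, not the paper's actual scheme.

MIN_RATE_HZ = 1.0    # lowest rate that still keeps neighbours' tracks alive
MAX_RATE_HZ = 10.0   # typical upper bound for vehicular safety beaconing

def beacon_rate(channel_load: float, danger_severity: float) -> float:
    """Return a beacon repetition rate in Hz.

    channel_load     -- estimated channel busy ratio in [0, 1]
    danger_severity  -- normalised severity of vehicle interactions in [0, 1]
    """
    # Back off as the channel fills up...
    load_factor = 1.0 - min(max(channel_load, 0.0), 1.0)
    # ...but let imminent-danger situations push the rate back up.
    urgency = min(max(danger_severity, 0.0), 1.0)
    blended = max(load_factor, urgency)
    return MIN_RATE_HZ + (MAX_RATE_HZ - MIN_RATE_HZ) * blended

# Example: congested channel, but a high-risk interaction still beacons fast.
print(beacon_rate(channel_load=0.8, danger_severity=0.9))  # ~9.1 Hz
```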
Abstract:
Improving energy efficiency has become increasingly important in data centers in recent years in order to curb rapidly growing electricity consumption. The power dissipation of the physical servers also drives the power usage of other systems, such as cooling. Many efforts have been made to make data centers more energy efficient. One of them is to minimize the total power consumption of the servers in a data center through virtual machine consolidation, which is implemented by virtual machine placement. The placement problem is often modeled as a bin packing problem. Due to the NP-hard nature of the problem, heuristic solutions such as the First Fit and Best Fit algorithms have often been used and generally give good results. However, their performance leaves room for further improvement. In this paper we propose a Simulated Annealing based algorithm, which aims at further improvement from any feasible placement. This is the first published attempt to use SA to solve the VM placement problem for power optimization. Experimental results show that this SA algorithm can generate better results, saving up to 25 percent more energy than First Fit Decreasing in an acceptable time frame.
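The abstract does not give the neighbourhood move or cooling schedule, so the sketch below is only a generic simulated-annealing refinement of a feasible placement; the power model, host capacities and acceptance rule are assumptions made for illustration.

```python
# Generic simulated-annealing sketch for VM placement (illustrative assumptions only).
import math
import random

def power(load: float) -> float:
    # Assumed linear power model: idle power plus a load-proportional part; 0 W when empty.
    return 0.0 if load == 0 else 100.0 + 150.0 * load

def total_power(placement, vm_demand, host_capacity):
    loads = [0.0] * len(host_capacity)
    for vm, host in enumerate(placement):
        loads[host] += vm_demand[vm] / host_capacity[host]
    if any(load > 1.0 for load in loads):
        return float("inf")          # infeasible placement
    return sum(power(load) for load in loads)

def anneal(placement, vm_demand, host_capacity, temp=50.0, cooling=0.95, steps=5000):
    best = current = list(placement)
    for _ in range(steps):
        candidate = list(current)
        vm = random.randrange(len(candidate))
        candidate[vm] = random.randrange(len(host_capacity))   # move one VM to a random host
        delta = total_power(candidate, vm_demand, host_capacity) - \
                total_power(current, vm_demand, host_capacity)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = candidate
            if total_power(current, vm_demand, host_capacity) < \
               total_power(best, vm_demand, host_capacity):
                best = current
        temp *= cooling
    return best

# Start from any feasible placement, e.g. one produced by First Fit Decreasing.
vm_demand = [4, 3, 2, 2, 1]
host_capacity = [8, 8, 8]
initial = [0, 0, 1, 1, 2]
print(anneal(initial, vm_demand, host_capacity))
```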
Abstract:
Server consolidation using virtualization technology has become an important technique for improving the energy efficiency of data centers. Virtual machine placement is the key step in server consolidation. In the past few years, many approaches to virtual machine placement have been proposed. However, existing approaches consider only the energy consumed by the physical machines in a data center and ignore the energy consumed by the data center's communication network. That network energy consumption is not trivial and should therefore be considered in virtual machine placement in order to make the data center more energy efficient. In this paper, we propose a genetic algorithm for a new virtual machine placement problem that considers the energy consumption of both the servers and the communication network in the data center. Experimental results show that the genetic algorithm performs well on test problems of different kinds and scales up well as the problem size increases.
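The abstract does not specify the energy models or GA operators, so the sketch below only illustrates the kind of fitness function such a GA could minimise: an assumed server energy term derived from utilisation plus an assumed network energy term proportional to inter-VM traffic that crosses switches. All coefficients and the hop-count model are invented.

```python
# Illustrative fitness for a placement chromosome: combined server + network energy.
# The energy coefficients and the hop-count model are assumptions, not the paper's models.

def server_energy(placement, vm_cpu, host_capacity, idle_w=100.0, peak_w=250.0):
    loads = [0.0] * len(host_capacity)
    for vm, host in enumerate(placement):
        loads[host] += vm_cpu[vm] / host_capacity[host]
    return sum(idle_w + (peak_w - idle_w) * load for load in loads if load > 0)

def network_energy(placement, traffic, per_hop_w=0.05):
    # traffic[(a, b)] is the data rate between VM a and VM b; energy grows with the
    # number of switch hops between their hosts (0 hops if they are co-located).
    def hops(h1, h2):
        return 0 if h1 == h2 else 2       # assumed two-level tree topology
    return sum(rate * hops(placement[a], placement[b]) * per_hop_w
               for (a, b), rate in traffic.items())

def fitness(placement, vm_cpu, host_capacity, traffic):
    # Lower is better; a GA would minimise this over placement chromosomes.
    return server_energy(placement, vm_cpu, host_capacity) + \
           network_energy(placement, traffic)

# Example: co-locating the chatty VM pair (0, 1) removes their network energy term.
vm_cpu = [2, 2, 4]
host_capacity = [8, 8]
traffic = {(0, 1): 100.0}
print(fitness([0, 0, 1], vm_cpu, host_capacity, traffic))   # co-located pair
print(fitness([0, 1, 1], vm_cpu, host_capacity, traffic))   # pair split across hosts
```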
Abstract:
In this paper, we present WebPut, a prototype system that adopts a novel web-based approach to the data imputation problem. WebPut utilizes the available information in an incomplete database in conjunction with the data consistency principle. Moreover, WebPut extends effective Information Extraction (IE) methods for the purpose of formulating web search queries that are capable of retrieving missing values with high accuracy. WebPut employs a confidence-based scheme that efficiently leverages our suite of data imputation queries to automatically select the most effective imputation query for each missing value. A greedy iterative algorithm is also proposed to schedule the imputation order of the different missing values in a database, and in turn the issuing of their corresponding imputation queries, to improve the accuracy and efficiency of WebPut. Experiments based on several real-world data collections demonstrate that WebPut outperforms existing approaches.
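WebPut's actual query templates and confidence model are not given in the abstract; the sketch below is only a generic illustration of a greedy, confidence-driven schedule, which repeatedly imputes the missing value whose best candidate currently has the highest confidence so that later queries can benefit from the newly filled-in values. The helper `candidate_queries` is hypothetical.

```python
# Generic sketch of greedy, confidence-ordered imputation (illustrative only).
# `candidate_queries(cell, db)` is a hypothetical helper that would return
# (confidence, value) pairs produced by WebPut-style web search queries.

def greedy_impute(missing_cells, db, candidate_queries):
    remaining = set(missing_cells)
    while remaining:
        # Score every still-missing cell with its best candidate right now;
        # values filled in during earlier iterations can raise later confidences.
        best = None
        for cell in remaining:
            for confidence, value in candidate_queries(cell, db):
                if best is None or confidence > best[0]:
                    best = (confidence, cell, value)
        if best is None:
            break                      # no query produced a candidate value
        confidence, cell, value = best
        db[cell] = value               # impute the most confident value first
        remaining.discard(cell)
    return db
```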
Abstract:
This paper introduces PartSS, a new partition-based filtering technique for tasks performing string comparisons under edit distance constraints. PartSS improves on the state-of-the-art method NGPP with a new partitioning scheme and strengthens filtering ability by exploiting theoretical results on shifting and scaling ranges, thus accelerating edit distance computation between strings. PartSS filtering has been implemented within two major data integration tasks: similarity join and approximate membership extraction under edit distance constraints. Evaluation on an extensive range of real-world datasets demonstrates major efficiency gains over the NGPP and QGrams approaches.
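PartSS's exact partitioning scheme and its shifting/scaling optimisations are not described in the abstract. The sketch below therefore shows only the classic pigeonhole idea behind partition-based filtering: if two strings are within edit distance tau, then splitting one of them into tau + 1 disjoint segments guarantees that at least one segment appears verbatim in the other, so pairs failing the test can be pruned before the expensive edit distance computation.

```python
# Classic partition-based filter for edit distance (pigeonhole principle); PartSS's
# actual partitioning and shifting/scaling optimisations differ and are not shown here.

def partitions(s: str, k: int):
    """Split s into k nearly equal, disjoint segments."""
    base, rem = divmod(len(s), k)
    parts, start = [], 0
    for i in range(k):
        length = base + (1 if i < rem else 0)
        parts.append(s[start:start + length])
        start += length
    return parts

def may_match(s: str, t: str, tau: int) -> bool:
    """If ed(s, t) <= tau, at least one of the tau+1 segments of s must occur in t,
    so a miss on every segment safely prunes the pair."""
    if abs(len(s) - len(t)) > tau:
        return False                   # length filter
    return any(seg and seg in t for seg in partitions(s, tau + 1))

def edit_distance(s: str, t: str) -> int:
    # Plain dynamic-programming edit distance, run only on surviving candidates.
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        curr = [i]
        for j, ct in enumerate(t, 1):
            curr.append(min(prev[j] + 1, curr[-1] + 1, prev[j - 1] + (cs != ct)))
        prev = curr
    return prev[-1]

# Similarity-join style usage: verify only pairs that pass the cheap filter.
tau = 1
for s, t in [("nicholas", "nicolas"), ("nicholas", "michelle")]:
    if may_match(s, t, tau):
        print(s, t, edit_distance(s, t) <= tau)
    else:
        print(s, t, "pruned by filter")
```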
Abstract:
This paper presents an efficient face detection method suitable for real-time surveillance applications. Improved efficiency is achieved by constraining the search window of an AdaBoost face detector to pre-selected regions. Firstly, the proposed method takes a sparse grid of sample pixels from the image to reduce whole-image scan time. A fusion of foreground segmentation and skin colour segmentation is then used to select candidate face regions. Finally, a classifier-based face detector is applied only to the selected regions to verify the presence of a face (the Viola-Jones detector is used in this paper). The proposed system is evaluated using 640 x 480 pixel test images and compared with other relevant methods. Experimental results show that the proposed method reduces the detection time to 42 ms, whereas the Viola-Jones detector alone requires 565 ms (on a desktop processor). This improvement makes the face detector suitable for real-time applications. Furthermore, the proposed method requires 50% of the computation time of the best competing method, while reducing the false positive rate by 3.2% and maintaining the same hit rate.
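As an illustrative sketch under assumed parameters (the paper's exact segmentation thresholds and grid spacing are not given in the abstract), the snippet below chains foreground segmentation, a skin-colour mask and a Viola-Jones cascade restricted to the candidate region, using OpenCV.

```python
# Illustrative sketch of the staged pipeline: foreground + skin-colour segmentation
# to pick candidate regions, then Viola-Jones only inside those regions (OpenCV).
# Thresholds, grid step and cascade choice are assumptions, not the paper's values.
import cv2
import numpy as np

cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
bg_subtractor = cv2.createBackgroundSubtractorMOG2()

# Assumed skin-colour bounds in YCrCb space.
SKIN_LOW = np.array([0, 133, 77], dtype=np.uint8)
SKIN_HIGH = np.array([255, 173, 127], dtype=np.uint8)

def detect_faces(frame, grid_step=4):
    # 1) Coarse sampling: work on a sparse grid of pixels to cut scan time.
    small = frame[::grid_step, ::grid_step]

    # 2) Candidate selection: fuse foreground motion with a skin-colour mask.
    fg_mask = bg_subtractor.apply(small)
    skin_mask = cv2.inRange(cv2.cvtColor(small, cv2.COLOR_BGR2YCrCb), SKIN_LOW, SKIN_HIGH)
    candidates = cv2.bitwise_and(fg_mask, skin_mask)

    points = cv2.findNonZero(candidates)
    if points is None:
        return []

    # 3) Verification: run the Viola-Jones cascade only inside the candidate region.
    x, y, w, h = cv2.boundingRect(points)
    x, y, w, h = x * grid_step, y * grid_step, w * grid_step, h * grid_step
    roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
    return [(x + fx, y + fy, fw, fh) for (fx, fy, fw, fh) in faces]
```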
Abstract:
Reliable communication is one of the major concerns in wireless sensor networks (WSNs). Multipath routing is an effective way to improve communication reliability in WSNs. However, most existing multipath routing protocols for sensor networks are reactive and require dynamic route discovery. If there are many sensor nodes between a source and a destination, the route discovery process creates a long end-to-end transmission delay, which causes difficulties in some time-critical applications. To overcome this difficulty, efficient route update and maintenance processes are proposed in this paper. The aim is to limit the amount of routing overhead with a two-tier routing architecture and to replace the periodic update process, the main source of unnecessary routing overhead, with a combination of piggyback and trigger updates. Simulations demonstrate that the proposed processes reduce the total routing overhead compared with existing popular routing protocols.
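The abstract does not give the update rules themselves, so the sketch below only illustrates the general idea of replacing periodic updates: attach (piggyback) routing state to data packets that are being sent anyway, and send a dedicated (triggered) control packet only when the local state has drifted past a threshold. The class, fields and threshold are invented for the example.

```python
# Illustrative sketch of replacing periodic updates with piggyback + trigger updates.
# Thresholds and message formats are assumptions, not the protocol's actual rules.

class RouteUpdater:
    def __init__(self, energy_change_threshold=0.2):
        self.last_advertised_energy = None
        self.energy_change_threshold = energy_change_threshold

    def on_data_packet(self, packet, current_energy):
        # Piggyback: attach fresh routing state to traffic we are sending anyway,
        # so no extra control packet is needed.
        packet["route_state"] = {"residual_energy": current_energy}
        self.last_advertised_energy = current_energy
        return packet

    def maybe_trigger_update(self, current_energy, send_control_packet):
        # Trigger: send a dedicated update only when the state has drifted enough
        # since the last advertisement (no periodic timer at all).
        last = self.last_advertised_energy
        if last is None or abs(current_energy - last) >= self.energy_change_threshold:
            send_control_packet({"residual_energy": current_energy})
            self.last_advertised_energy = current_energy
```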