967 results for Standard fire tests
Abstract:
Many of the more extreme bushfire-prone landscapes in Australia are located in colder climate regions. For such sites, the National Construction Code requires that houses satisfy both the Australian Standard for Bushfire (AS 3959:2009) and achieve a 6 Star energy rating. When combined, these requirements present a considerable challenge to the construction of affordable housing - a problem which is often exacerbated by the complex topography of bushfire-prone landscapes. Dr Weir presents a series of case studies from his architectural practice which highlight the need for further design-led research into affordable housing - a ground-up, holistic approach to design which reconciles energy performance, human behaviour, bushland conservation and bushfire safety.
Abstract:
Thin-sectioned samples mounted on glass slides with common petrographic epoxies cannot be easily removed (for subsequent ion-milling) by standard methods such as heating or dissolution in solvents. A method for the removal of such samples using a radio frequency (RF) generated oxygen plasma has been investigated for a number of typical petrographic and ceramic thin sections. Sample integrity and thickness were critical factors that determined the etching rate of adhesive and the survivability of the sample. Several tests were performed on a variety of materials in order to estimate possible heating or oxidation damage from the plasma. Temperatures in the plasma chamber remained below 138°C and weight changes in mineral powders etched for 76 hr were less than ±4%. A crystal of optical grade calcite showed no apparent surface damage after 48 hr of etching. Any damage from the oxygen plasma is apparently confined to the surface of the sample, and is removed during the ion-milling stage of transmission electron microscopy (TEM) sample preparation.
Abstract:
Chatrooms, for example Internet Relay Chat, are generally multi-user, multi-channel and multi-server chat systems which run over the Internet and provide a protocol for real-time text-based conferencing between users all over the world. While a well-trained human observer is able to understand who is chatting with whom, there are no efficient and accurate automated tools to determine the groups of users conversing with each other. A precursor to analysing evolving cyber-social phenomena is to first determine what the conversations are and which groups of chatters are involved in each conversation. We consider this problem in this paper. We propose an algorithm to discover all groups of users that are engaged in conversation. Our algorithm is based on a statistical model of a chatroom that is founded on our experience with real chatrooms. Our approach does not require any semantic analysis of the conversations; rather, it is based purely on the statistical information contained in the sequence of posts. We improve the accuracy by applying graph algorithms to clean the statistical information. We present experimental results which indicate that one can automatically determine the conversing groups in a chatroom purely on the basis of statistical analysis.
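As a rough illustration of the kind of statistics involved (not the authors' algorithm), the Python sketch below links users whose posts fall within a short time window, prunes weak links, and reads conversing groups off the resulting graph; the 30-second window, the minimum-link threshold and the networkx-based grouping are all assumptions.

    # Illustrative sketch only: group chatters by temporal proximity of posts.
    # Window, threshold and connected-component grouping are assumptions,
    # not the algorithm proposed in the paper.
    from collections import Counter
    import networkx as nx

    def conversing_groups(posts, window=30.0, min_links=3):
        """posts: list of (timestamp_in_seconds, user) sorted by time."""
        pair_counts = Counter()
        for i, (t_i, u_i) in enumerate(posts):
            j = i + 1
            while j < len(posts) and posts[j][0] - t_i <= window:
                u_j = posts[j][1]
                if u_j != u_i:
                    pair_counts[frozenset((u_i, u_j))] += 1
                j += 1
        g = nx.Graph()
        for pair, count in pair_counts.items():
            if count >= min_links:            # prune weak, likely accidental links
                g.add_edge(*pair, weight=count)
        return [sorted(group) for group in nx.connected_components(g)]

    # Two interleaved conversations in one channel
    posts = [(0, "ann"), (5, "bob"), (8, "ann"), (12, "bob"), (15, "ann"), (20, "bob"),
             (100, "cat"), (104, "dan"), (110, "cat"), (115, "dan"), (120, "cat"), (125, "dan")]
    print(conversing_groups(posts))           # e.g. [['ann', 'bob'], ['cat', 'dan']]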
Abstract:
The future emergence of many types of airborne vehicles and unpiloted aircraft in the national airspace means collision avoidance is of primary concern in an uncooperative airspace environment. The ability to replicate a pilot's see-and-avoid capability using cameras coupled with vision-based avoidance control is an important part of an overall collision avoidance strategy. Unfortunately, without range information, collision avoidance has no direct way to guarantee a level of safety. Collision-scenario flight tests with two aircraft and a monocular camera threat detection and tracking system were used to study the accuracy of image-derived angle measurements. The effect of image-derived angle errors on reactive vision-based avoidance performance was then studied by simulation. The results show that whilst large angle measurement errors can significantly affect minimum ranging characteristics across a variety of initial conditions and closing speeds, the minimum range is always bounded and a collision never occurs.
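A toy illustration of the kind of simulation described, not the flight-test configuration or avoidance law used in the study: the Python sketch below has the ownship turn away from a noisy measured bearing to an intruder on a head-on course and records the minimum separation for several assumed noise levels; the geometry, speeds, detection range and turn rate are all invented.

    # Illustrative sketch only: toy 2D bearing-only reactive avoidance.
    # All parameter values below are assumptions, not the paper's setup.
    import numpy as np

    def min_separation(bearing_noise_deg, dt=0.05, t_end=60.0, seed=4):
        rng = np.random.default_rng(seed)
        own_pos, own_hdg, own_spd = np.array([0.0, 0.0]), 0.0, 50.0           # m, rad, m/s
        intr_pos, intr_vel = np.array([3000.0, 0.0]), np.array([-50.0, 0.0])  # head-on geometry
        turn_rate, detect_range = np.radians(6.0), 2000.0                     # assumed values
        closest = np.inf
        for _ in range(int(t_end / dt)):
            rel = intr_pos - own_pos
            sep = np.hypot(*rel)
            closest = min(closest, sep)
            bearing = np.arctan2(rel[1], rel[0]) - own_hdg                    # true relative bearing
            measured = bearing + np.radians(rng.normal(0.0, bearing_noise_deg))
            if sep < detect_range:
                direction = -1.0 if measured > 0 else 1.0                     # turn away from intruder
                own_hdg += direction * turn_rate * dt
            own_pos = own_pos + own_spd * dt * np.array([np.cos(own_hdg), np.sin(own_hdg)])
            intr_pos = intr_pos + intr_vel * dt
        return closest

    for noise in (0.0, 1.0, 5.0):
        print(f"bearing noise {noise:3.1f} deg -> minimum range {min_separation(noise):7.1f} m")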
Abstract:
Population increase and economic development lead to the construction as well as the demolition of infrastructure such as buildings, bridges and roads, and waste concrete is the main by-product of these activities. Recycling waste concrete to obtain recycled concrete aggregate (RCA) for base and/or sub-base materials in road construction is a foremost application to be promoted for its economic and sustainability benefits. As mortar, bricks, glass and asphalt are present in varying proportions in RCA, it exhibits inconsistent properties and performance. In this study, six different types of RCA samples were subjected to classification tests such as particle size distribution, plasticity, compaction and California Bearing Ratio (CBR) tests. Results were compared with those of the standard road materials used in Queensland, Australia, and it was found that the ‘RM1-100/RM3-0’ and ‘RM1-80/RM3-20’ samples sit at the margin of the minimum required specifications for base materials, while the others fall below them.
Abstract:
Three-dimensional conjugate heat transfer simulation of a standard parabolic trough thermal collector receiver is performed numerically in order to visualize and analyze the surface thermal characteristics. The computational model is developed in the Ansys Fluent environment based on some simplified assumptions. Three test conditions are selected from the existing literature to verify the numerical model directly, and reasonably good agreement between the model and the test results confirms the reliability of the simulation. The solar radiation flux profile around the tube is also approximated from the literature. An in-house macro is written to read the input solar flux as a heat flux wall boundary condition for the tube wall. The numerical results show that there is an abrupt variation in the resultant heat flux along the circumference of the receiver. Consequently, the temperature varies throughout the tube surface. The lower half of the horizontal receiver receives the maximum solar flux and therefore experiences the maximum temperature rise, compared with the upper part, which has an almost uniform temperature. Reasonable attributions and suggestions are made regarding this particular type of conjugate thermal system. The knowledge gained from this study will be used to extend the analysis and to design an efficient concentrating photovoltaic collector in the near future.
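For illustration only, the sketch below tabulates a simplified circumferential flux profile of the kind that such a macro might read in: concentrated reflected flux on the lower half of the tube and direct sunlight on the upper half. The irradiance, concentration ratio and cosine shape are assumptions, not values from the paper.

    # Illustrative sketch only: simplified circumferential solar-flux profile
    # for the absorber tube, of the kind that could be tabulated and applied
    # as a wall heat-flux boundary condition. All values are assumed.
    import math

    DNI = 1000.0           # direct normal irradiance, W/m^2 (assumed)
    CONCENTRATION = 25.0   # local concentration ratio on the lower half (assumed)

    def wall_heat_flux(theta_deg):
        """Heat flux (W/m^2) at circumferential angle theta_deg, 0 deg = tube bottom."""
        c = math.cos(math.radians(theta_deg))
        if c >= 0.0:                         # lower half: concentrated, reflected flux
            return CONCENTRATION * DNI * c
        return -DNI * c                      # upper half: direct sunlight only

    # Tabulate the profile, e.g. for export as a wall heat-flux boundary condition
    for theta in range(0, 360, 30):
        print(f"{theta:3d} deg  {wall_heat_flux(theta):8.1f} W/m^2")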
Abstract:
Due to the development of XML and other data models such as OWL and RDF, sharing data is an increasingly common task since these data models allow simple syntactic translation of data between applications. However, in order for data to be shared semantically, there must be a way to ensure that concepts are the same. One approach is to employ commonly used schemas, called standard schemas, which help guarantee that syntactically identical objects have semantically similar meanings. As a result of the spread of data sharing, there has been widespread adoption of standard schemas in a broad range of disciplines and for a wide variety of applications within a very short period of time. However, standard schemas are still in their infancy and have not yet matured or been thoroughly evaluated. It is imperative that the data management research community take a closer look at how well these standard schemas have fared in real-world applications to identify not only their advantages, but also the operational challenges that real users face. In this paper, we both examine the usability of standard schemas in a comparison that spans multiple disciplines, and describe our first step at resolving some of these issues in our Semantic Modeling System. We evaluate our Semantic Modeling System through a careful case study of the use of standard schemas in architecture, engineering, and construction, which we conducted with domain experts. We discuss how our Semantic Modeling System can help with the broader problem and also discuss a number of challenges that still remain.
Abstract:
PURPOSE: To test the reliability of Timed Up and Go Tests (TUGTs) in cardiac rehabilitation (CR) and compare TUGTs to the 6-Minute Walk Test (6MWT) for outcome measurement. METHODS: Sixty-one of 154 consecutive community-based CR patients were prospectively recruited. Subjects undertook repeated TUGTs and 6MWTs at the start of CR (start-CR), postdischarge from CR (post-CR), and 6 months postdischarge from CR (6 months post-CR). The main outcome measurements were TUGT time (TUGTT) and 6MWT distance (6MWD). RESULTS: Mean (SD) TUGTT1 and TUGTT2 at the 3 assessments were 6.29 (1.30) and 5.94 (1.20); 5.81 (1.22) and 5.53 (1.09); and 5.39 (1.60) and 5.01 (1.28) seconds, respectively. A reduction in TUGTT occurred between each outcome point (P ≤ .002). Repeated TUGTTs were strongly correlated at each assessment, intraclass correlation (95% CI) = 0.85 (0.76–0.91), 0.84 (0.73–0.91), and 0.90 (0.83–0.94), despite a reduction between TUGTT1 and TUGTT2 of 5%, 5%, and 7%, respectively (P ≤ .006). Relative decreases in TUGTT1 (TUGTT2) occurred from start-CR to post-CR and from start-CR to 6 months post-CR of −7.5% (−6.9%) and −14.2% (−15.5%), respectively, while relative increases in 6MWD1 (6MWD2) of 5.1% (7.2%) and 8.4% (10.2%) occurred, respectively (P < .001 in all cases). Pearson correlation coefficients for 6MWD1 to TUGTT1 and TUGTT2 across all times were −0.60 and −0.68 (P < .001), and the intraclass correlation (95% CI) for the speeds derived from averaged 6MWDs and TUGTTs was 0.65 (0.54, 0.73) (P < .001). CONCLUSIONS: Similar relative changes occurred for the TUGT and the 6MWT in CR. A significant correlation between the TUGTT and 6MWD was demonstrated, and we suggest that the TUGT may provide a related or a supplementary measurement of functional capacity in CR.
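As a hedged illustration of the two agreement statistics reported here, the sketch below computes a Pearson correlation between 6MWD and TUGT time and a simple one-way random-effects ICC(1,1) for repeated TUGTs on invented data; the paper's exact ICC form and its data are not reproduced.

    # Illustrative sketch only: Pearson r between 6MWD and TUGT time, plus a
    # simple one-way random-effects test-retest ICC(1,1). Data are invented.
    import numpy as np
    from scipy.stats import pearsonr

    def icc_1_1(trials):
        """trials: (n_subjects, k_trials) array; one-way random-effects ICC(1,1)."""
        trials = np.asarray(trials, dtype=float)
        n, k = trials.shape
        grand_mean = trials.mean()
        subject_means = trials.mean(axis=1)
        ms_between = k * ((subject_means - grand_mean) ** 2).sum() / (n - 1)
        ms_within = ((trials - subject_means[:, None]) ** 2).sum() / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    rng = np.random.default_rng(0)
    true_tugt = rng.normal(6.0, 1.2, size=40)                        # subject-level TUGT time (s)
    tugt = np.column_stack([true_tugt + rng.normal(0.0, 0.3, 40) for _ in range(2)])
    sixmwd = 550.0 - 40.0 * true_tugt + rng.normal(0.0, 30.0, 40)    # faster TUGT -> longer 6MWD

    r, _ = pearsonr(sixmwd, tugt[:, 0])
    print("test-retest ICC(1,1):", round(icc_1_1(tugt), 2))
    print("Pearson r, 6MWD vs TUGT time:", round(r, 2))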
Abstract:
Russell, Benton and Kingsley (2010) recently suggested a new association football test comprising three different tasks for the evaluation of players' passing, dribbling and shooting skills. Their stated intention was to enhance ‘ecological validity’ of current association football skills tests allowing generalisation of results from the new protocols to performance constraints that were ‘representative’ of experiences during competitive game situations. However, in this comment we raise some concerns with their use of the term ‘ecological validity’ to allude to aspects of ‘representative task design’. We propose that in their paper the authors confused understanding of environmental properties, performance achievement and generalisability of the test and its outcomes. Here, we argue that the tests designed by Russell and colleagues did not include critical sources of environmental information, such as the active role of opponents, which players typically use to organise their actions during performance. Static tasks which are not representative of the competitive performance environment may lead to different emerging patterns of movement organisation and performance outcomes, failing to effectively evaluate skills performance in sport.
Abstract:
The power of testing for a population-wide association between a biallelic quantitative trait locus and a linked biallelic marker locus is predicted both empirically and deterministically for several tests. The tests were based on the analysis of variance (ANOVA) and on a number of transmission disequilibrium tests (TDT). Deterministic power predictions made use of family information, and were functions of population parameters including linkage disequilibrium, allele frequencies, and recombination rate. Deterministic power predictions were very close to the empirical power from simulations in all scenarios considered in this study. The different TDTs had very similar power, intermediate between one-way and nested ANOVAs. One-way ANOVA was the only test that was not robust against spurious disequilibrium. Our general framework for predicting power deterministically can be used to predict power in other association tests. Deterministic power calculations are a powerful tool for researchers to plan and evaluate experiments and obviate the need for elaborate simulation studies.
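A minimal sketch of how empirical power can be estimated by simulation for one of the tests considered (a one-way ANOVA of the trait on marker genotype): haplotypes are drawn under an assumed level of linkage disequilibrium, an additive QTL effect is added to the phenotype, and the rejection rate is counted. All parameter values are illustrative assumptions, not the paper's scenarios.

    # Illustrative sketch only: empirical power of a one-way ANOVA marker-trait
    # association test for a biallelic QTL linked to a biallelic marker.
    # Allele frequencies, LD, effect size and sample size are assumed values.
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(1)

    def simulate_power(n=500, reps=1000, p_m=0.5, p_q=0.5, d_prime=0.8, a=0.25, alpha=0.05):
        d_max = min(p_m * (1 - p_q), (1 - p_m) * p_q)        # maximum disequilibrium for D'
        d = d_prime * d_max
        hap = np.array([p_m * p_q + d,                       # haplotype M-Q
                        p_m * (1 - p_q) - d,                 # haplotype M-q
                        (1 - p_m) * p_q - d,                 # haplotype m-Q
                        (1 - p_m) * (1 - p_q) + d])          # haplotype m-q
        rejections = 0
        for _ in range(reps):
            h1 = rng.choice(4, size=n, p=hap)                # one haplotype per parent
            h2 = rng.choice(4, size=n, p=hap)
            marker = (h1 < 2).astype(int) + (h2 < 2).astype(int)          # copies of allele M
            qtl = (h1 % 2 == 0).astype(int) + (h2 % 2 == 0).astype(int)   # copies of allele Q
            y = a * qtl + rng.normal(0.0, 1.0, n)            # additive QTL effect on the trait
            groups = [y[marker == g] for g in (0, 1, 2) if np.any(marker == g)]
            if f_oneway(*groups).pvalue < alpha:
                rejections += 1
        return rejections / reps

    print("empirical power of the one-way ANOVA test:", simulate_power())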
Abstract:
A satellite-based observation system can continuously or repeatedly generate a user state vector time series that may contain useful information. One typical example is the collection of International GNSS Service (IGS) station daily and weekly combined solutions. Another example is the epoch-by-epoch kinematic position time series of a receiver derived by a GPS real-time kinematic (RTK) technique. Although some multivariate analysis techniques have been adopted to assess the noise characteristics of multivariate state time series, statistical testing has been limited to univariate time series. After a review of frequently used hypothesis test statistics in univariate analysis of GNSS state time series, the paper presents a number of T-squared multivariate analysis statistics for use in the analysis of multivariate GNSS state time series. These T-squared test statistics take the correlation between coordinate components into account, which is neglected in univariate analysis. Numerical analysis was conducted with the multi-year time series of an IGS station to demonstrate the results of the multivariate hypothesis testing in comparison with the univariate hypothesis testing results. The results demonstrate that, in general, testing for multivariate mean shifts and outliers tends to reject fewer data samples than testing for univariate mean shifts and outliers at the same confidence level. It is noted that neither univariate nor multivariate data analysis methods are intended to replace physical analysis; instead, they should be treated as complementary statistical methods for a priori or a posteriori investigations. Physical analysis is subsequently necessary to refine and interpret the results.
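As a minimal sketch of the multivariate idea, the code below applies Hotelling's T-squared test for a shift in the mean of a three-component coordinate sample, using the standard T-squared-to-F conversion; the covariance, sample size and reference mean are invented, and the paper's specific test statistics are not reproduced.

    # Illustrative sketch only: Hotelling's T-squared test for a shift in the
    # mean of a 3-component (e.g. east/north/up) coordinate sample, which
    # accounts for the correlation between components that univariate tests
    # ignore. The data and reference mean are invented.
    import numpy as np
    from scipy.stats import f

    def hotelling_t2_test(x, mu0, alpha=0.05):
        """x: (n, p) sample; mu0: hypothesised mean vector of length p."""
        x = np.asarray(x, dtype=float)
        n, p = x.shape
        diff = x.mean(axis=0) - np.asarray(mu0, dtype=float)
        s = np.cov(x, rowvar=False)                      # sample covariance (p x p)
        t2 = n * diff @ np.linalg.solve(s, diff)
        f_stat = (n - p) / (p * (n - 1)) * t2            # T^2 -> F(p, n - p)
        p_value = f.sf(f_stat, p, n - p)
        return t2, p_value, p_value < alpha

    rng = np.random.default_rng(2)
    cov = np.array([[4.0, 1.5, 0.5], [1.5, 4.0, 0.8], [0.5, 0.8, 9.0]])  # mm^2, assumed
    sample = rng.multivariate_normal(mean=[1.0, 0.5, -2.0], cov=cov, size=30)
    t2, p_value, reject = hotelling_t2_test(sample, mu0=[0.0, 0.0, 0.0])
    print(f"T2 = {t2:.2f}, p = {p_value:.4f}, reject H0: {reject}")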
Abstract:
The use of Wireless Sensor Networks (WSNs) for Structural Health Monitoring (SHM) has become a promising approach due to many advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data synchronization error and data loss have prevented these distinct systems from being extensively used. Recently, several SHM-oriented WSNs have been proposed and are believed to be able to overcome a large number of technical uncertainties. Nevertheless, there is limited research verifying the applicability of those WSNs to demanding SHM applications such as modal analysis and damage identification. This paper first presents a brief review of the most inherent uncertainties of SHM-oriented WSN platforms and then investigates their effects on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques when employing merged data from multiple tests. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and Data-driven Stochastic Subspace Identification (SSI-data), as both have been widely applied in the past decade. Experimental accelerations collected by a wired sensory system on a large-scale laboratory bridge model are initially used as clean data before being contaminated by different data pollutants in a sequential manner to simulate practical SHM-oriented WSN uncertainties. The results of this study show the robustness of FDD and the precautions needed for the SSI-data family when dealing with SHM-WSN uncertainties. Finally, the use of measurement channel projection for the time-domain OMA techniques and the preferred combination of OMA techniques to cope with the SHM-WSN uncertainties are recommended.
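For orientation, the sketch below shows the core FDD step on a synthetic two-channel record: assemble the cross-spectral density matrix at each frequency line, take its singular value decomposition, and peak-pick the first singular value. The signal, segment length and peak-picking are illustrative assumptions rather than the study's processing chain.

    # Illustrative sketch only: the core Frequency Domain Decomposition step
    # on a synthetic two-channel "acceleration" record with a 4 Hz mode.
    import numpy as np
    from scipy.signal import csd

    fs = 100.0
    t = np.arange(0.0, 60.0, 1.0 / fs)
    rng = np.random.default_rng(3)
    mode = np.sin(2 * np.pi * 4.0 * t)                       # synthetic 4 Hz "mode"
    acc = np.column_stack([1.0 * mode + 0.3 * rng.standard_normal(t.size),
                           0.6 * mode + 0.3 * rng.standard_normal(t.size)])

    n_ch, nperseg = acc.shape[1], 1024
    freqs, _ = csd(acc[:, 0], acc[:, 0], fs=fs, nperseg=nperseg)
    G = np.zeros((freqs.size, n_ch, n_ch), dtype=complex)    # cross-spectral density matrix G(f)
    for i in range(n_ch):
        for j in range(n_ch):
            G[:, i, j] = csd(acc[:, i], acc[:, j], fs=fs, nperseg=nperseg)[1]

    # First singular value of G(f) at every frequency line, then peak-pick
    s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(freqs.size)])
    peak_hz = freqs[np.argmax(s1)]
    print(f"first singular value peaks at {peak_hz:.2f} Hz")  # expect roughly 4 Hz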
Abstract:
High-stakes literacy testing is now a ubiquitous educational phenomenon. However, it remains a relatively recent phenomenon in Australia. Hence it is possible to study the ways in which such tests are reorganising educators’ work during this period of change. This paper draws upon Dorothy Smith’s Institutional Ethnography and critical policy analysis to consider this problem and reports on interview data from teachers and the principal in a small rural school in a poor area of South Australia. In this context, high-stakes testing and the associated diagnostic school review unleash a chain of actions within the school which ultimately results in educators doubting their professional judgments, increasing their investment in testing, narrowing their teaching of literacy and purchasing levelled reading schemes. The effects of high-stakes testing in disadvantaged schools are identified and discussed.
Abstract:
The Australian Government and most Australian road authorities have set ambitious greenhouse gas emission (GHGe) reduction targets for the near future, many of which have translated into action plans. However, previous research has shown that the various Australian state road authorities are at different stages of implementing ‘green’ initiatives in construction planning and development, with considerable gaps in their monitoring, tendering, and contracting. This study illustrates the differences between procurement standards and project-specific practices that aim to reduce GHGe from road construction projects at three of the largest Australian road construction clients, with a focus on the tools used, contract type and incentives for better performance.