871 results for nonparametric rationality tests
Abstract:
Cold-formed steel lipped channel beams (LCBs) are used extensively in residential, industrial and commercial buildings as load-bearing structural elements. Their shear strengths are considerably reduced when web openings are included for the purpose of locating building services. Past research has shown that the shear capacities of LCBs were reduced by up to 70% due to the inclusion of these web openings, hence there is a need to improve the shear capacities of LCBs with web openings. A cost-effective way of eliminating the detrimental effects of large web openings is to attach suitable stiffeners around the web openings and thereby restore the original shear strength and stiffness of the LCBs. Detailed experimental studies were therefore undertaken to investigate the shear behaviour and strength of LCBs with stiffened web openings. Both plate and stud stiffeners of varying sizes and thicknesses were attached to the web elements of LCBs using different screw-fastening arrangements. Simply supported test specimens of LCBs with aspect ratios of 1.0 and 1.5 were loaded at mid-span until failure. Test results showed that plate stiffeners sized according to AISI recommendations are inadequate to restore the shear strengths of LCBs with web openings, and new stiffener arrangements have therefore been proposed based on the experimental results. This paper presents the details and results of this experimental study on the shear strength of lipped channel beams with stiffened web openings.
Abstract:
Cold-formed steel stud walls are an important component of Light Steel Framing (LSF) building systems used in commercial, industrial and residential buildings. In conventional LSF stud wall systems, thin-walled steel studs are protected from fire by placing one or two layers of plasterboard on both sides, with or without cavity insulation. However, there is very limited data on the structural and thermal performance of these wall systems, and past research has shown contradictory results about the benefits of cavity insulation. This research proposed a new LSF stud wall system in which a composite panel, made of two plasterboards with insulation between them, is used to improve the fire rating of walls. Full scale fire tests were conducted on both conventional steel stud walls, with and without cavity insulation, and the new composite panel system. Eleven full scale load-bearing wall specimens were tested to study the thermal and structural performance of the wall assemblies under standard fire conditions. These tests showed that the use of cavity insulation led to inferior fire performance of walls, and they provided explanations and supporting test data to counter incorrect industry assumptions about cavity insulation. The tests also demonstrated that the use of external insulation in a composite panel form enhanced the thermal and structural performance of stud walls and increased their fire resistance rating significantly. This paper presents the details of the full scale fire tests of load-bearing wall assemblies lined with plasterboards and different types of insulation under varying load ratios. Test results, including the temperature and deflection profiles of the walls measured during the fire tests, are presented along with their failure modes and failure times.
Abstract:
Fire resistance has become an important part of structural design due to the ever-increasing loss of property and lives every year. Conventionally, the fire rating of load-bearing Light gauge Steel Frame (LSF) walls is determined using standard fire tests based on the time-temperature curve given in ISO 834 [1]. Full scale fire testing based on this standard time-temperature curve originated from the use of wood-burning furnaces in the early 1900s, and it is questionable whether it truly represents the fuel loads in modern buildings. Hence a detailed fire research study into the performance of LSF walls was undertaken using real design fires based on Eurocode parametric curves [2] and Barnett's 'BFD' curves [3]. This paper presents the development of these real fire curves and the results of a full scale experimental study into the structural and fire behaviour of load-bearing LSF stud wall systems.
Abstract:
A number of tests and test batteries are available for the prediction of older driver safety, but many of these have not been validated against standardized driving outcome measures. The aim of this study was to evaluate a series of previously described screening tests in terms of their ability to predict the potential for safe and unsafe driving. Participants included 79 community-dwelling older drivers (M=72.16 years, SD=5.46; range 65-88 years; 57 males and 22 females) who completed a previously validated multi-disciplinary driving assessment, a hazard perception test, a hazard change detection test and a battery of vision and cognitive tests. Participants also completed a standardized on-road driving assessment. The multi-disciplinary test battery had the highest predictive ability with a sensitivity of 80% and a specificity of 73%, followed by the hazard perception test which demonstrated a sensitivity of 75% and a specificity of 61%. These findings suggest that a relatively simple and practical battery of tests from a range of domains has the capacity to predict safe and unsafe driving in older adults.
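As a point of reference for the screening statistics quoted above, the sketch below computes sensitivity and specificity from a confusion matrix of battery predictions against on-road results. The counts are purely illustrative placeholders, not the study data; they are chosen only so the output roughly matches the reported 80%/73% figures.

```python
# Illustrative only: sensitivity/specificity of a screening battery versus an
# on-road assessment, with made-up counts (not the study data).

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical confusion matrix: battery prediction versus on-road outcome.
sens, spec = sensitivity_specificity(tp=20, fn=5, tn=40, fp=15)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")  # ~80% and ~73%
```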
Abstract:
It is acknowledged around the world that many university students struggle with learning to program (McCracken et al., 2001; McGettrick et al., 2005). In this paper, we describe how we have developed a research programme to systematically study and incrementally improve our teaching. We have adopted a research programme with three elements: (1) a theory that provides an organising framework for defining the type of phenomena and data of interest, (2) data on how the class as a whole performs on formative assessment tasks that are framed from within the organising framework, and (3) data from one-on-one think aloud sessions, to establish why students struggle with some of those in-class formative assessment tasks. We teach introductory computer programming, but this three-element structure of our research is applicable to many areas of engineering education research.
Abstract:
The UN Convention on the Rights of Persons with Disabilities (CRPD) promotes equal and full participation by children in education. Equity of educational access for all students, including students with disability, free from discrimination, is the first stated national goal of Australian education (MCEETYA 2008). Australian federal disability discrimination law, the Disability Discrimination Act 1992 (DDA), follows the Convention, with the federal Disability Standards for Education 2005 (DSE) enacting specific requirements for education. This article discusses the equity of processes for including students with disability in Australian educational accountability testing, including the international tests in which many countries participate. The conclusion drawn is that equitable inclusion of students with disability in current Australian educational accountability testing is not occurring from a social perspective and is not, in principle, compliant with the law. However, given the reluctance of courts to intervene in education matters and the uncertainty of the outcome of any court consideration, the discussion shows that equitable inclusion in accountability systems is achievable through policy change rather than through expensive, and possibly unsuccessful, legal challenges.
Abstract:
Current design standards do not provide adequate guidelines for the fire design of cold-formed steel compression members subject to flexural-torsional buckling. Eurocode 3 Part 1.2 (2005) recommends the same fire design guidelines for both hot-rolled and cold-formed steel compression members subject to flexural-torsional buckling, although considerable behavioural differences exist between cold-formed and hot-rolled steel members. Past research has recommended the use of ambient temperature cold-formed steel design rules for the fire design of cold-formed steel compression members, provided appropriately reduced mechanical properties are used at elevated temperatures. To assess the accuracy of flexural-torsional buckling design rules in both ambient temperature cold-formed steel design and fire design standards, an experimental study of slender cold-formed steel compression members was undertaken at both ambient and elevated temperatures. This paper presents the details of this experimental study, its results, and their comparison with the predictions of the current design rules. It was found that the current ambient temperature design rules are conservative, while the fire design rules are overly conservative. Suitable recommendations have been made in relation to the currently available design rules for flexural-torsional buckling, including methods of improvement. Most importantly, this paper addresses the lack of experimental results for slender cold-formed steel columns at elevated temperatures.
Abstract:
Chatrooms, for example Internet Relay Chat, are generally multi-user, multi-channel and multi-server chat systems which run over the Internet and provide a protocol for real-time text-based conferencing between users all over the world. While a well-trained human observer is able to understand who is chatting with whom, there are no efficient and accurate automated tools to determine the groups of users conversing with each other. A precursor to analysing evolving cyber-social phenomena is to first determine what the conversations are and which groups of chatters are involved in each conversation. We consider this problem in this paper and propose an algorithm to discover all groups of users that are engaged in conversation. Our algorithm is based on a statistical model of a chatroom that is founded on our experience with real chatrooms. Our approach does not require any semantic analysis of the conversations; rather, it is based purely on the statistical information contained in the sequence of posts. We improve the accuracy by applying graph algorithms to clean the statistical information. We present experimental results which indicate that one can automatically determine the conversing groups in a chatroom purely on the basis of statistical analysis.
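The abstract does not give the algorithm itself; the following hypothetical sketch only illustrates the general idea of grouping chatters from post statistics alone and then cleaning the result with a graph step. The time window, the weight threshold and the use of networkx connected components are assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch (not the authors' algorithm): users who frequently post
# within a short time window of one another accumulate edge weight; weak edges
# are dropped and connected components are read off as candidate conversations.
from collections import defaultdict
import networkx as nx  # assumed available for the graph-cleaning step

def conversation_groups(posts, window=5.0, min_weight=3):
    """posts: list of (timestamp_seconds, user) tuples sorted by timestamp."""
    weights = defaultdict(int)
    for i, (t_i, u_i) in enumerate(posts):
        j = i + 1
        while j < len(posts) and posts[j][0] - t_i <= window:
            u_j = posts[j][1]
            if u_j != u_i:
                weights[frozenset((u_i, u_j))] += 1
            j += 1
    g = nx.Graph()
    for pair, w in weights.items():
        if w >= min_weight:  # drop weak links that are likely coincidental
            g.add_edge(*pair, weight=w)
    return [sorted(group) for group in nx.connected_components(g)]
```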
Abstract:
The future emergence of many types of airborne vehicles and unpiloted aircraft in the national airspace means that collision avoidance is of primary concern in an uncooperative airspace environment. The ability to replicate a pilot's see-and-avoid capability using cameras coupled with vision-based avoidance control is an important part of an overall collision avoidance strategy. Unfortunately, without range information, collision avoidance has no direct way to guarantee a level of safety. Collision scenario flight tests with two aircraft and a monocular camera threat detection and tracking system were used to study the accuracy of image-derived angle measurements. The effect of image-derived angle errors on reactive vision-based avoidance performance was then studied by simulation. The results show that whilst large angle measurement errors can significantly affect minimum ranging characteristics across a variety of initial conditions and closing speeds, the minimum range is always bounded and a collision never occurs.
Abstract:
PURPOSE: To test the reliability of Timed Up and Go Tests (TUGTs) in cardiac rehabilitation (CR) and compare TUGTs to the 6-Minute Walk Test (6MWT) for outcome measurement. METHODS: Sixty-one of 154 consecutive community-based CR patients were prospectively recruited. Subjects undertook repeated TUGTs and 6MWTs at the start of CR (start-CR), postdischarge from CR (post-CR), and 6 months postdischarge from CR (6 months post-CR). The main outcome measurements were TUGT time (TUGTT) and 6MWT distance (6MWD). RESULTS: Mean (SD) TUGTT1 and TUGTT2 at the 3 assessments were 6.29 (1.30) and 5.94 (1.20); 5.81 (1.22) and 5.53 (1.09); and 5.39 (1.60) and 5.01 (1.28) seconds, respectively. A reduction in TUGTT occurred between each outcome point (P ≤ .002). Repeated TUGTTs were strongly correlated at each assessment, intraclass correlation (95% CI) = 0.85 (0.76–0.91), 0.84 (0.73–0.91), and 0.90 (0.83–0.94), despite a reduction between TUGTT1 and TUGTT2 of 5%, 5%, and 7%, respectively (P ≤ .006). Relative decreases in TUGTT1 (TUGTT2) occurred from start-CR to post-CR and from start-CR to 6 months post-CR of −7.5% (−6.9%) and −14.2% (−15.5%), respectively, while relative increases in 6MWD1 (6MWD2) occurred, 5.1% (7.2%) and 8.4% (10.2%), respectively (P < .001 in all cases). Pearson correlation coefficients for 6MWD1 to TUGTT1 and TUGTT2 across all times were −0.60 and −0.68 (P < .001) and the intraclass correlations (95% CI) for the speeds derived from averaged 6MWDs and TUGTTs were 0.65 (0.54, 0.73) (P < .001). CONCLUSIONS: Similar relative changes occurred for the TUGT and the 6MWT in CR. A significant correlation between the TUGTT and 6MWD was demonstrated, and we suggest that the TUGT may provide a related or a supplementary measurement of functional capacity in CR.
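For readers unfamiliar with the reliability statistics quoted above, the sketch below shows one way to compute an intraclass correlation ICC(2,1) for repeated TUGT times and a Pearson correlation between TUGT time and 6MWD. The data generated here are simulated placeholders, not the study measurements.

```python
# Simulated illustration of the reported reliability statistics; the numbers
# generated here are placeholders, not the study data.
import numpy as np
from scipy.stats import pearsonr

def icc_2_1(y):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.
    y is an (n subjects x k trials) array."""
    n, k = y.shape
    grand = y.mean()
    ss_rows = k * ((y.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((y.mean(axis=0) - grand) ** 2).sum()   # between trials
    ss_err = ((y - grand) ** 2).sum() - ss_rows - ss_cols
    ms_r, ms_c = ss_rows / (n - 1), ss_cols / (k - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

rng = np.random.default_rng(0)
true_time = rng.normal(6.0, 1.2, size=61)                    # subject-level TUGT time (s)
tugt = np.column_stack([true_time + rng.normal(0, 0.4, 61) for _ in range(2)])
sixmwd = 450 - 30 * (true_time - 6) + rng.normal(0, 40, 61)  # 6MWD inversely related to TUGTT
r, p = pearsonr(tugt[:, 0], sixmwd)
print(f"ICC(2,1) = {icc_2_1(tugt):.2f}, Pearson r = {r:.2f}")
```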
Abstract:
Russell, Benton and Kingsley (2010) recently suggested a new association football test comprising three different tasks for the evaluation of players' passing, dribbling and shooting skills. Their stated intention was to enhance the 'ecological validity' of current association football skills tests, allowing generalisation of results from the new protocols to performance constraints that were 'representative' of experiences during competitive game situations. However, in this comment we raise some concerns with their use of the term 'ecological validity' to allude to aspects of 'representative task design'. We propose that, in their paper, the authors conflated understanding of environmental properties, performance achievement and the generalisability of the test and its outcomes. Here, we argue that the tests designed by Russell and colleagues did not include critical sources of environmental information, such as the active role of opponents, which players typically use to organise their actions during performance. Static tasks which are not representative of the competitive performance environment may lead to different emerging patterns of movement organisation and performance outcomes, and thus fail to effectively evaluate skill performance in sport.
Abstract:
The power of testing for a population-wide association between a biallelic quantitative trait locus and a linked biallelic marker locus is predicted both empirically and deterministically for several tests. The tests were based on the analysis of variance (ANOVA) and on a number of transmission disequilibrium tests (TDT). Deterministic power predictions made use of family information, and were functions of population parameters including linkage disequilibrium, allele frequencies, and recombination rate. Deterministic power predictions were very close to the empirical power from simulations in all scenarios considered in this study. The different TDTs had very similar power, intermediate between one-way and nested ANOVAs. One-way ANOVA was the only test that was not robust against spurious disequilibrium. Our general framework for predicting power deterministically can be used to predict power in other association tests. Deterministic power calculations are a powerful tool for researchers to plan and evaluate experiments and obviate the need for elaborate simulation studies.
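As an illustration of the kind of deterministic power calculation described above (a generic 1-df approximation, not the authors' exact family-based derivation), the non-centrality parameter of a single-marker association test can be written as N·r²·h2_QTL, where r² is the linkage disequilibrium between marker and QTL and h2_QTL is the fraction of phenotypic variance explained by the QTL. The function names and example values below are assumptions for illustration only.

```python
# Hedged sketch of a deterministic power calculation for a 1-df single-marker
# association test; a generic approximation, not the paper's exact formulae.
from scipy.stats import chi2, ncx2

def ld_r2(d, p_marker, p_qtl):
    """Squared allelic correlation r^2 from the LD coefficient D and allele frequencies."""
    return d ** 2 / (p_marker * (1 - p_marker) * p_qtl * (1 - p_qtl))

def power_single_marker(n, d, p_marker, p_qtl, h2_qtl, alpha=0.05):
    ncp = n * ld_r2(d, p_marker, p_qtl) * h2_qtl   # non-centrality parameter
    crit = chi2.ppf(1 - alpha, df=1)               # critical value under H0
    return ncx2.sf(crit, df=1, nc=ncp)             # P(reject H0 | H1 true)

# Example: 500 individuals, D = 0.05, both loci with allele frequency 0.5,
# QTL explaining 5% of the phenotypic variance.
print(power_single_marker(n=500, d=0.05, p_marker=0.5, p_qtl=0.5, h2_qtl=0.05))
```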
Abstract:
A satellite-based observation system can continuously or repeatedly generate a user state vector time series that may contain useful information. One typical example is the collection of International GNSS Service (IGS) station daily and weekly combined solutions. Another example is the epoch-by-epoch kinematic position time series of a receiver derived by a GPS real time kinematic (RTK) technique. Although some multivariate analysis techniques have been adopted to assess the noise characteristics of multivariate state time series, statistical testing has been limited to univariate time series. After a review of frequently used hypothesis test statistics in the univariate analysis of GNSS state time series, the paper presents a number of T-squared multivariate analysis statistics for use in the analysis of multivariate GNSS state time series. These T-squared test statistics take the correlation between coordinate components into account, which is neglected in univariate analysis. Numerical analysis was conducted with the multi-year time series of an IGS station to demonstrate the results of the multivariate hypothesis testing in comparison with the univariate hypothesis testing results. The results demonstrate that, in general, testing for multivariate mean shifts and outliers tends to reject fewer data samples than testing for univariate mean shifts and outliers at the same confidence level. It is noted that neither univariate nor multivariate data analysis methods are intended to replace physical analysis; instead, they should be treated as complementary statistical methods for a priori or a posteriori investigations. Subsequent physical analysis is necessary to refine and interpret the results.
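To make the multivariate idea concrete, the sketch below applies a one-sample Hotelling T-squared test for a mean shift to a simulated three-component (east/north/up) coordinate series. The paper's own T-squared statistics may differ in detail; the component names, covariance values and sample size here are placeholders.

```python
# Illustrative one-sample Hotelling T-squared test for a mean shift in a
# p-component coordinate time series; it accounts for correlation between
# components, which per-component univariate t-tests ignore.
import numpy as np
from scipy.stats import f

def hotelling_t2_mean_test(x, mu0, alpha=0.05):
    """x: (n samples x p components); mu0: hypothesised mean vector."""
    n, p = x.shape
    diff = x.mean(axis=0) - mu0
    t2 = n * diff @ np.linalg.inv(np.cov(x, rowvar=False)) @ diff
    f_stat = (n - p) / (p * (n - 1)) * t2        # T^2 maps to an F(p, n-p) statistic
    p_value = f.sf(f_stat, p, n - p)
    return t2, p_value, p_value < alpha

# Placeholder daily residuals (mm) with a small shift in the up component.
rng = np.random.default_rng(1)
daily = rng.multivariate_normal([0.0, 0.0, 2.0], np.diag([1.0, 1.0, 9.0]), size=365)
print(hotelling_t2_mean_test(daily, mu0=np.zeros(3)))
```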
Abstract:
The use of Wireless Sensor Networks (WSNs) for Structural Health Monitoring (SHM) has become a promising approach due to advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data synchronization error and data loss have prevented these systems from being extensively used. Recently, several SHM-oriented WSNs have been proposed and are believed to be able to overcome a large number of these technical uncertainties. Nevertheless, there is limited research verifying the applicability of those WSNs to demanding SHM applications such as modal analysis and damage identification. This paper first presents a brief review of the main inherent uncertainties of SHM-oriented WSN platforms and then investigates their effects on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques when employing merged data from multiple tests. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and Data-driven Stochastic Subspace Identification (SSI-data), as both have been widely applied in the past decade. Experimental accelerations collected by a wired sensor system on a large-scale laboratory bridge model are initially used as clean data before being contaminated by different data pollutants in a sequential manner to simulate practical SHM-oriented WSN uncertainties. The results of this study show the robustness of FDD and the precautions needed for the SSI-data family when dealing with SHM-WSN uncertainties. Finally, the use of measurement channel projection for the time-domain OMA techniques and a preferred combination of the OMA techniques to cope with the SHM-WSN uncertainties are recommended.
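As a rough sketch of the FDD step referred to above (a textbook-style outline, not the specific implementation used in the study), the cross power spectral density matrix of the acceleration channels is estimated and decomposed by SVD at each frequency line; peaks in the first singular value indicate modal frequencies, and the corresponding singular vectors approximate mode shapes. The sampling rate, segment length and channel layout below are assumptions.

```python
# Textbook-style FDD sketch: SVD of the cross power spectral density (CPSD)
# matrix at each frequency line; not the study's specific implementation.
import numpy as np
from scipy.signal import csd

def fdd_first_singular_values(acc, fs, nperseg=1024):
    """acc: (n_samples x n_channels) accelerations; fs: sampling rate in Hz."""
    n_ch = acc.shape[1]
    freqs, _ = csd(acc[:, 0], acc[:, 0], fs=fs, nperseg=nperseg)
    g = np.zeros((len(freqs), n_ch, n_ch), dtype=complex)   # CPSD matrices per frequency
    for i in range(n_ch):
        for j in range(n_ch):
            _, g[:, i, j] = csd(acc[:, i], acc[:, j], fs=fs, nperseg=nperseg)
    s1 = np.empty(len(freqs))
    modes = np.empty((len(freqs), n_ch), dtype=complex)
    for k in range(len(freqs)):
        u, s, _ = np.linalg.svd(g[k])
        s1[k], modes[k] = s[0], u[:, 0]
    return freqs, s1, modes  # peaks of s1 give candidate natural frequencies
```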