614 results for data availability
Abstract:
Background/aim: In response to the high burden of disease associated with chronic heart failure (CHF), in particular the high rates of hospital admissions, dedicated CHF management programs (CHF-MPs) have been developed. Over the past five years there has been rapid growth in the number of CHF-MPs in Australia. Given the apparent mismatch between the demand for, and availability of, CHF-MPs, this paper discusses the accessibility and quality of current CHF-MPs in Australia. Methods: The data presented in this report have been combined from the research of the co-authors, in particular a review of inequities in access to CHF management that utilised geographical information systems (GIS), and a survey of heterogeneity in quality and service provision in Australia. Results: Of the 62 CHF-MPs surveyed in this study, 93% (58 centres) were located in areas rated as Highly Accessible. This result indicates that most CHF-MPs are located in capital cities or large regional cities. Six percent (4 CHF-MPs) were located in Accessible areas, which were country towns or cities. No CHF-MPs had been established outside of cities to service the estimated 72,000 individuals with CHF living in rural and remote areas. Sixteen percent of programs recruited NYHA Class I patients, and of these 20% lacked echocardiographic confirmation of their diagnosis. Conclusion: Overall, these data highlight the urgent need to provide equitable access to CHF-MPs. When establishing CHF-MPs, consideration should be given to current evidence-based models to ensure quality in practice.
Abstract:
This paper reports an empirical study on measuring transit service reliability using data from a Web-based passenger survey on a major transit corridor in Brisbane, Australia. After an introduction to transit service reliability measures, the paper presents the results from the case study, including the study area, data collection, and the reliability measures obtained. This includes data exploration of boarding/arrival lateness, in-vehicle time variation, waiting time variation, and headway adherence. Impacts of peak-period effects and separate operation on service reliability are examined. Relationships between transit service characteristics and passenger waiting time are also discussed. A summary of key findings and an agenda for future research are offered in the conclusion.
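The link between headway adherence and passenger waiting time described in this abstract can be sketched with a standard result for randomly arriving passengers, E[W] = (h/2)(1 + CV²), where h is the mean headway and CV the coefficient of variation of headways. The headway values below are invented for illustration; this is not necessarily the exact measure or data used in the study.

```python
import statistics

def headway_adherence(headways):
    """Coefficient of variation of headways, a common
    headway-adherence (service regularity) measure."""
    mean_h = statistics.fmean(headways)
    return statistics.pstdev(headways) / mean_h

def expected_wait(headways):
    """Expected waiting time for randomly arriving passengers:
    E[W] = (h/2) * (1 + CV^2)."""
    mean_h = statistics.fmean(headways)
    cv = headway_adherence(headways)
    return (mean_h / 2) * (1 + cv ** 2)

# Hypothetical observed headways (minutes) on a corridor
headways = [8, 12, 10, 15, 5, 10]
print(round(headway_adherence(headways), 3))  # 0.311
print(round(expected_wait(headways), 2))      # 5.48
```

Note that perfectly regular 10-minute headways would give an expected wait of exactly 5 minutes; the irregularity penalty is carried by the CV² term.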
Abstract:
This paper discusses the statistical analyses used to derive bridge live loads models for Hong Kong from a 10-year weigh-in-motion (WIM) data. The statistical concepts required and the terminologies adopted in the development of bridge live load models are introduced. This paper includes studies for representative vehicles from the large amount of WIM data in Hong Kong. Different load affecting parameters such as gross vehicle weights, axle weights, axle spacings, average daily number of trucks etc are first analyzed by various stochastic processes in order to obtain the mathematical distributions of these parameters. As a prerequisite to determine accurate bridge design loadings in Hong Kong, this study not only takes advantages of code formulation methods used internationally but also presents a new method for modelling collected WIM data using a statistical approach.
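Fitting a parametric distribution to a load parameter such as gross vehicle weight can be sketched as below. The choice of a lognormal distribution and the weight values are assumptions for illustration only; the paper does not specify which distributions were adopted.

```python
import math
import statistics

def fit_lognormal(weights):
    """Fit a lognormal distribution to gross vehicle weights (tonnes)
    via the mean and standard deviation of the log-weights."""
    logs = [math.log(w) for w in weights]
    return statistics.fmean(logs), statistics.pstdev(logs)

def lognormal_quantile(mu, sigma, p):
    """p-quantile of the fitted lognormal, using the
    inverse standard normal CDF."""
    z = statistics.NormalDist().inv_cdf(p)
    return math.exp(mu + sigma * z)

# Hypothetical gross vehicle weights from WIM records (tonnes)
weights = [12.0, 18.5, 22.0, 30.0, 15.5, 25.0, 40.0, 20.0]
mu, sigma = fit_lognormal(weights)
print(round(lognormal_quantile(mu, sigma, 0.95), 1))
```

An upper quantile such as the 95th percentile is the kind of characteristic value a design-load calibration would draw from such a fit.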
Abstract:
Throughout the world, state and national standardised testing of children has become a "huge industry" (English, 2002). Although English is referring to the American system, which has been involved in standardised testing for over half a century, the same could be said of many other countries, including Australia. Only in recent years has Australia embraced national testing as part of a wider reform effort to bring about increased accountability in schooling. The results of high-stakes tests in Australia are now published in newspapers and electronically on the Australian federal government's MySchool website (www.myschool.edu.au). MySchool provides results on the National Assessment Program - Literacy and Numeracy (NAPLAN) for students in Years 3, 5, 7 and 9. Data are available that compare schools to statistically similar schools. This recent publication of national testing results in Australia is a visible example of "contractual accountability", described by Mulford, Edmunds, Kendall, Kendall and Bishop (2008) as "the degree to which [actors] are fulfilling the expectations of particular audiences in terms of standards, outcomes and results" (p. 20).
Abstract:
The use of the stable isotope ratios δ18O and δ2H is well established in the assessment of groundwater systems and their hydrology. The conventional approach is based on x/y plots and relation to various MWLs, and on plots of either ratio against parameters such as Cl or EC. An extension of interpretation is the use of 2D maps and contour plots, and 2D hydrogeological vertical sections. An enhancement of presentation and interpretation is the production of "isoscapes", usually as 2.5D surface projections. We have applied groundwater isotopic data to a 3D visualisation, using the alluvial aquifer system of the Lockyer Valley. The 3D framework is produced in GVS (Groundwater Visualisation System). This format enables enhanced presentation by displaying the spatial relationships and allowing interpolation between "data points", i.e. borehole screened zones where groundwater enters. The relative variations in the δ18O and δ2H values are similar in these ambient-temperature systems. However, δ2H better reflects hydrological processes, whereas δ18O also reflects aquifer/groundwater exchange reactions. The 3D model has the advantage that it displays borehole relations to spatial features, enabling isotopic ratios and their values to be associated with, for example, bedrock groundwater mixing, interaction between aquifers, relation to stream recharge, and near-surface and return irrigation water evaporation. Some specific features are also shown, such as zones of leakage of deeper groundwater (in this case with a GAB signature). Variations in the source of recharging water at a catchment scale can be displayed. Interpolation between bores is not always possible, depending on bore numbers and spacing and on the elongate configuration of the alluvium. In these cases, the visualisation uses discs around the screens that can be manually expanded to test extent or intersections. Separate displays are used for each of δ18O and δ2H, with colour coding for isotope values.
Abstract:
The Lockyer Valley in southeast Queensland supports important and intensive irrigation which is dependent on the quality and availability of groundwater. Prolonged drought conditions from ~1997 resulted in a depletion of the alluvial aquifers, and concern for the long-term sustainability of this resource. By 2008, many areas of the valley were at < 20% of storage. Some relief occurred with rain events in early 2009; then, in December 2010 - January 2011, most of southeast Queensland experienced unprecedented flooding. These storm-based events have caused a shift in research focus from investigations of drought conditions and mitigation to flood response analysis. For the alluvial aquifer system of the valley, a preliminary assessment of groundwater observation bore data, prior to and during the flood, indicates that there is a spatially variable aquifer response. While water levels in some bores screened in unconfined shallow aquifers have recovered by more than 10 m within a short period of time (months), others show only a small or moderate response. Measurements of pre- and post-flood groundwater levels and high-resolution time-series records from data loggers are considered within the framework of a 3D geological model of the Lockyer Valley using the Groundwater Visualisation System (GVS). Groundwater level fluctuations covering both drought and flood periods are used to estimate groundwater recharge using the water table fluctuation (WTF) method, supplemented by estimates derived using chloride mass balance. The presentation of hydraulic and recharge information in a 3D format has considerable advantages over the traditional 2D presentation of data. The 3D approach allows the distillation of multiple types of information (topographic, geological, hydraulic and spatial) into one representation that provides valuable insights into the major controls of groundwater flow and recharge. The influence of aquifer lithology on the spatial variability of groundwater recharge is also demonstrated.
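The two recharge estimators named in this abstract reduce to simple formulas: the water table fluctuation method gives R = Sy × Δh (specific yield times water-level rise), and chloride mass balance gives R = P × Cl_p / Cl_gw. The sketch below uses invented parameter values, not the study's results.

```python
def wtf_recharge(specific_yield, rise_m):
    """Water table fluctuation method: R = Sy * Δh, where Δh is the
    water-level rise (m) attributed to the recharge event."""
    return specific_yield * rise_m

def cmb_recharge(precip_mm, cl_precip, cl_groundwater):
    """Chloride mass balance: R = P * Cl_p / Cl_gw, with chloride
    concentrations in consistent units (e.g. mg/L)."""
    return precip_mm * cl_precip / cl_groundwater

# Hypothetical: Sy = 0.08 and a 10 m post-flood water-level rise
print(wtf_recharge(0.08, 10.0))   # 0.8 m of recharge
# Hypothetical: 800 mm/yr rainfall, 5 mg/L Cl in rain, 100 mg/L in groundwater
print(cmb_recharge(800, 5, 100))  # 40 mm/yr
```

The WTF estimate is event-based and sensitive to the assumed specific yield, whereas chloride mass balance yields a long-term average; comparing the two is a common consistency check.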
Abstract:
Data analysis sessions are a common feature of discourse analytic communities, often involving participants with varying levels of expertise, from newcomers to those with significant expertise. Learning how to do data analysis and working with transcripts, however, are often new experiences for doctoral candidates within the social sciences. While many guides to doctoral education focus on procedures associated with data analysis (Heath, Hindmarsh, & Luff, 2010; McHoul & Rapley, 2001; Silverman, 2011; Wetherell, Taylor, & Yates, 2001), the in situ practices of doing data analysis are relatively undocumented. This chapter has been collaboratively written by members of a special interest research group, the Transcript Analysis Group (TAG), who meet regularly to examine transcripts representing audio- and video-recorded interactional data. Here, we investigate our own actual interactional practices and participation in this group, where each member is both analyst and participant. We particularly focus on the pedagogic practices enacted in the group by investigating how members engage in the scholarly practice of data analysis. A key feature of talk within the data sessions is that members work collaboratively to identify and discuss 'noticings' from the audio-recorded and transcribed talk being examined, produce candidate analytic observations based on these discussions, and evaluate these observations. Our investigation of how talk constructs social practices in these sessions shows that participants move fluidly between actions that demonstrate pedagogic practices and expertise. Within any one session, members can display their expertise as analysts and, at the same time, display that they have gained an understanding that they did not have before. We take an ethnomethodological position that asks, 'what's going on here?' in the data analysis session. By observing the in situ practices in fine-grained detail, we show how members participate in the data analysis sessions and make sense of a transcript.
Abstract:
This study investigated the Kinaesthetic Fusion Effect (KFE) first described by Craske and Kenny in 1981. The current study did not replicate these findings. Participants did not perceive any reduction in the sagittal separation of a button pressed by the index finger of one arm and a probe touching the other, following repeated exposure to the tactile stimuli present on both unseen arms. This study's failure to replicate the widely cited KFE as described by Craske et al. (1984) suggests that it may be contingent on several aspects of visual information, especially the availability of a specific visual reference, the role of instructions regarding gaze direction, and the potential use of a line-of-sight strategy when referring felt positions to an interposed surface. In addition, a foreshortening effect was found; this may result from a line-of-sight judgment and represent a feature of the reporting method used. The transformed line-of-sight data were regressed against the participant-reported values, resulting in slopes of 1.14 (right arm) and 1.11 (left arm), with r > 0.997 for each. The study also provides additional evidence that misperception of the mediolateral positions of the limbs, specifically their separation, consistent with notions of Gestalt grouping, is somewhat labile and can be influenced by active motions causing touch of one limb by the other. Finally, this research will benefit future studies that require participants to report the perceived locations of unseen limbs.