932 results for data complexity


Relevance: 20.00%

Abstract:

This paper reports an empirical study on measuring transit service reliability using data from a Web-based passenger survey on a major transit corridor in Brisbane, Australia. After an introduction to transit service reliability measures, the paper presents results from the case study, including the study area, data collection, and the reliability measures obtained. This includes data exploration of boarding/arrival lateness, in-vehicle time variation, waiting time variation, and headway adherence. Impacts of peak-period effects and separate operation on service reliability are examined. Relationships between transit service characteristics and passenger waiting time are also discussed. A summary of key findings and an agenda for future research conclude the paper.
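The reliability measures named above can be made concrete with two standard quantities: the coefficient of variation (CV) of headways as a headway-adherence measure, and the classical expected-wait formula E[W] = (E[H]/2)(1 + CV²) for passengers arriving at random. The paper's exact formulations are not reproduced here; this is a minimal sketch, and the sample headway values are invented.

```python
import statistics

def headway_cv(headways_min):
    """Coefficient of variation of headways: a common headway-adherence
    measure (lower = more regular service)."""
    mean = statistics.mean(headways_min)
    return statistics.pstdev(headways_min) / mean

def expected_wait(headways_min):
    """Expected waiting time for randomly arriving passengers:
    E[W] = (E[H] / 2) * (1 + CV^2)."""
    mean = statistics.mean(headways_min)
    return (mean / 2.0) * (1.0 + headway_cv(headways_min) ** 2)

scheduled = [10, 10, 10, 10]   # perfectly regular 10-minute headways
observed = [6, 14, 9, 11]      # same mean headway, but irregular

print(headway_cv(scheduled))                               # 0.0
print(expected_wait(scheduled))                            # 5.0
print(expected_wait(observed) > expected_wait(scheduled))  # True
```

The comparison shows why headway adherence matters to passengers: irregular service raises expected waiting time even when the mean headway is unchanged.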

Relevance: 20.00%

Abstract:

In an Australian context, the term hooning refers to risky driving behaviours such as illegal street racing and speed trials, as well as behaviours that involve unnecessary noise and smoke, including burnouts, donuts, fishtails, drifting and other skids. Hooning receives considerable negative media attention in Australia, and since the 1990s all Australian jurisdictions have implemented vehicle impoundment programs to deal with the problem. However, there is limited objective evidence of the road safety risk associated with hooning behaviours. Attempts to estimate the risk associated with hooning are limited by official data collection and storage practices, and by the willingness of drivers to admit to their illegal behaviour in the event of a crash. International evidence suggests that illegal street racing is associated with only a small proportion of fatal crashes; however, hooning in an Australian context encompasses a broader group of driving behaviours than illegal street racing alone, and it is possible that the road safety risks differ across these behaviours. There is evidence from North American jurisdictions that vehicle impoundment programs are effective for managing drink driving offenders and drivers who continue to drive while disqualified or suspended, both during and post-impoundment. However, these programs used impoundment periods of 30–180 days (depending on the number of previous offences). In Queensland the penalty for a first hooning offence is a 48-hour impoundment, while the vehicle can be impounded for up to 3 months for a second offence, or permanently for a third or subsequent offence within three years. Thus, it remains unclear whether similar effects will be seen for hooning offenders in Australia, as no evaluations of vehicle impoundment programs for hooning have been published.
To address these research needs, this program of research consisted of three complementary studies designed to: (1) investigate the road safety implications of hooning behaviours in terms of the risks associated with the specific behaviours, and the drivers who engage in these behaviours; and (2) assess the effectiveness of current approaches to dealing with the problem; in order to (3) inform policy and practice in the area of hooning behaviour. Study 1 involved qualitative (N = 22) and quantitative (N = 290) research with drivers who admitted engaging in hooning behaviours on Queensland roads. Study 2 involved a systematic profile of a large sample of drivers (N = 834) detected and punished for a hooning offence in Queensland, and a comparison of their driving and crash histories with a randomly sampled group of Queensland drivers with the same gender and age distribution. Study 3 examined the post-impoundment driving behaviour of hooning offenders (N = 610) to examine the effects of vehicle impoundment on driving behaviour. The theoretical framework used to guide the research incorporated expanded deterrence theory, social learning theory, and driver thrill-seeking perspectives. This framework was used to explore factors contributing to hooning behaviours, and interpret the results of the aspects of the research designed to explore the effectiveness of vehicle impoundment as a countermeasure for hooning. Variables from each of the perspectives were related to hooning measures, highlighting the complexity of the behaviour. This research found that the road safety risk of hooning behaviours appears low, as only a small proportion of the hooning offences in Study 2 resulted in a crash. However, Study 1 found that hooning-related crashes are less likely to be reported than general crashes, particularly when they do not involve an injury, and that higher frequencies of hooning behaviours are associated with hooning-related crash involvement. 
Further, approximately one fifth of drivers in Study 1 reported being involved in a hooning-related crash in the previous three years, which is comparable to general crash involvement among the general population of drivers in Queensland. Given that hooning-related crashes represented only a sub-set of crash involvement for this sample, this suggests that there are risks associated with hooning behaviour that are not apparent in official data sources. Further, the main evidence of risk associated with the behaviour appears to relate to the hooning driver, as Study 2 found that these drivers are likely to engage in other risky driving behaviours (particularly speeding and driving vehicles with defects or illegal modifications), and have significantly more traffic infringements, licence sanctions and crashes than drivers of a similar (i.e., young) age. Self-report data from the Study 1 samples indicated that Queensland’s vehicle impoundment and forfeiture laws are perceived as severe, and that many drivers have reduced their hooning behaviour to avoid detection. However, it appears that it is more common for drivers to have simply changed the location of their hooning behaviour to avoid detection. When the post-impoundment driving behaviour of the sample of hooning offenders was compared to their pre-impoundment behaviour to examine the effectiveness of vehicle impoundment in Study 3, it was found that there was a small but significant reduction in hooning offences, and also for other traffic infringements generally. 
As Study 3 was observational, it was not possible to control for extraneous variables, and it is therefore possible that some of this reduction was due to other factors, such as a reduction in driving exposure, the effects of changes to Queensland’s Graduated Driver Licensing scheme that were implemented during the study period and affected many drivers in the offender sample due to their age, or the extension of vehicle impoundment to other types of offences in Queensland during the post-impoundment period. However, a protective effect was observed, in that hooning offenders did not show the increase in traffic infringements in the post period that occurred within the comparison sample. This suggests that there may be some effect of vehicle impoundment on the driving behaviour of hooning offenders, and that this effect is not limited to their hooning behaviour. To be more confident in these results, it is necessary to measure driving exposure during the post periods to control for issues such as offenders being denied access to vehicles. While it was not the primary aim of this program of research to compare the utility of different theoretical perspectives, the findings have a number of theoretical implications. For example, it was found that only some of the deterrence variables were related to hooning behaviours, and sometimes in the opposite direction to predictions. Further, social learning theory variables had stronger associations with hooning. These results suggest that a purely legal approach to understanding hooning behaviours, and to designing and implementing countermeasures to reduce them, is unlikely to be successful. This research also had implications for policy and practice, and a number of recommendations were made throughout the thesis to improve the quality of relevant data collection practices.
Some of these changes have already occurred since the expansion of the application of vehicle impoundment programs to other offences in Queensland. It was also recommended that the operational and resource costs of these laws should be compared to the road safety benefits in ongoing evaluations of effectiveness to ensure that finite traffic policing resources are allocated in a way that produces maximum road safety benefits. However, as the evidence of risk associated with the hooning driver is more compelling than that associated with hooning behaviour, it was argued that the hooning driver may represent the better target for intervention. Suggestions for future research include ongoing evaluations of the effectiveness of vehicle impoundment programs for hooning and other high-risk driving behaviours, and the exploration of additional potential targets for intervention to reduce hooning behaviour. As the body of knowledge regarding the factors contributing to hooning increases, along with the identification of potential barriers to the effectiveness of current countermeasures, recommendations for changes in policy and practice for hooning behaviours can be made.

Relevance: 20.00%

Abstract:

The literature abounds with descriptions of failures in high-profile projects, and a range of initiatives has been generated to enhance project management practice (e.g., Morris, 2006). Estimating from our own research, there are scores of other project failures that go unrecorded. Many of these failures can be explained using existing project management theory: poor risk management, inaccurate estimating, cultures of optimism dominating decision making, stakeholder mismanagement, inadequate timeframes, and so on. Nevertheless, in spite of extensive discussion and analysis of failures and attention to the presumed causes of failure, projects continue to fail in unexpected ways. In the 1990s, three U.S. state departments of motor vehicles (DMV) cancelled major projects due to time and cost overruns and inability to meet project goals (IT-Cortex, 2010). The California DMV failed to revitalize their drivers’ license and registration application process after spending $45 million. The Oregon DMV cancelled their five-year, $50 million project to automate their manual, paper-based operation after three years, when the estimate grew to $123 million, its duration stretched to eight years or more, and the prototype was a complete failure. In 1997, the Washington state DMV cancelled their license application mitigation project because it would have been too big and obsolete by the time it was estimated to be finished. There are countless similar examples of projects that have been abandoned or that have not delivered the requirements.

Relevance: 20.00%

Abstract:

This thesis investigates the place of online moderation in supporting teachers to work in a system of standards-based assessment. The participants of the study were fifty middle school teachers who met online with the aim of developing consistency in their judgement decisions. Data were gathered through observation of the online meetings, interviews, surveys and the collection of artefacts. The data were viewed and analysed through sociocultural theories of learning and sociocultural theories of technology; the analysis demonstrates how utilising these theories can add depth to understanding the added complexity of developing shared meaning of standards in an online context. The findings contribute to current understanding of standards-based assessment by examining the social moderation process as it acts to increase the reliability of judgements that are made within a standards framework. Specifically, the study investigates the opportunities afforded by conducting social moderation practices in a synchronous online context. The study explicates how the technology affects the negotiation of judgements and the development of shared meanings of assessment standards, while demonstrating how involvement in online moderation discussions can support teachers to become and belong within a practice of standards-based assessment. This research responds to a growing international interest in standards-based assessment and the use of social moderation to develop consistency in judgement decisions. Online moderation is a new practice that addresses these concerns on a systemic basis.

Relevance: 20.00%

Abstract:

This paper discusses the statistical analyses used to derive bridge live load models for Hong Kong from 10 years of weigh-in-motion (WIM) data. The statistical concepts required and the terminologies adopted in the development of bridge live load models are introduced. The paper includes studies of representative vehicles drawn from the large amount of WIM data in Hong Kong. Load-affecting parameters such as gross vehicle weights, axle weights, axle spacings, and average daily numbers of trucks are first analyzed by various stochastic processes in order to obtain the mathematical distributions of these parameters. As a prerequisite to determining accurate bridge design loadings in Hong Kong, this study not only takes advantage of code formulation methods used internationally but also presents a new method for modelling collected WIM data using a statistical approach.
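As a rough illustration of fitting a mathematical distribution to one WIM parameter, the sketch below fits a lognormal to gross vehicle weights by the method of moments on log-weights and reads off an upper-percentile characteristic value. The distribution family, parameter values, and data are assumptions for illustration only; they are not the models or data from the study.

```python
import math
import random

random.seed(42)
# Synthetic stand-in for WIM gross-vehicle-weight records (tonnes);
# real inputs would come from the multi-year WIM database.
gvw = [random.lognormvariate(math.log(20), 0.35) for _ in range(5000)]

# Method-of-moments fit of a lognormal: mean and std dev of log-weights.
n = len(gvw)
logs = [math.log(w) for w in gvw]
mu = sum(logs) / n
sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / n)

# Example characteristic value for load modelling: the fitted
# 95th-percentile GVW, using Phi^-1(0.95) ~= 1.645.
w95 = math.exp(mu + 1.645 * sigma)
print(round(math.exp(mu), 1), round(sigma, 3), round(w95, 1))
```

In practice one would compare several candidate families (and test goodness of fit) before extrapolating to design loads, since the upper tail drives the bridge loading.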

Relevance: 20.00%

Abstract:

Throughout the world, state and national standardised testing of children has become a "huge industry" (English, 2002). Although English is referring to the American system, which has been involved in standardised testing for over half a century, the same could be said of many other countries, including Australia. Only in recent years has Australia embraced national testing as part of a wider reform effort to bring about increased accountability in schooling. The results of high-stakes tests in Australia are now published in newspapers and electronically on the Australian federal government's MySchool website (www.myschool.edu.au). MySchool provides results on the National Assessment Program - Literacy and Numeracy (NAPLAN) for students in Years 3, 5, 7 and 9. Data are available that compare schools to statistically similar schools. This recent publication of national testing results in Australia is a visible example of "contractual accountability", described by Mulford, Edmunds, Kendall, Kendall and Bishop (2008) as "the degree to which [actors] are fulfilling the expectations of particular audiences in terms of standards, outcomes and results" (p. 20).

Relevance: 20.00%

Abstract:

The use of stable isotope ratios δ18O and δ2H is well established in the assessment of groundwater systems and their hydrology. The conventional approach is based on x/y plots and relation to various meteoric water lines (MWLs), and on plots of either ratio against parameters such as Cl or EC. An extension of interpretation is the use of 2D maps and contour plots, and 2D hydrogeological vertical sections. An enhancement of presentation and interpretation is the production of “isoscapes”, usually as 2.5D surface projections. We have applied groundwater isotopic data to a 3D visualisation, using the alluvial aquifer system of the Lockyer Valley. The 3D framework is produced in GVS (Groundwater Visualisation System). This format enables enhanced presentation by displaying the spatial relationships and allowing interpolation between “data points”, i.e., borehole screened zones where groundwater enters. The relative variations in the δ18O and δ2H values are similar in these ambient temperature systems. However, δ2H better reflects hydrological processes, whereas δ18O also reflects aquifer/groundwater exchange reactions. The 3D model has the advantage that it displays borehole relations to spatial features, enabling isotopic ratios and their values to be associated with, for example, bedrock groundwater mixing, interaction between aquifers, relation to stream recharge, and near-surface and return irrigation water evaporation. Some specific features are also shown, such as zones of leakage of deeper groundwater (in this case with a GAB signature). Variations in the source of recharging water at a catchment scale can be displayed. Interpolation between bores is not always possible, depending on their number and spacing and on the elongate configuration of the alluvium. In these cases, the visualisation uses discs around the screens that can be manually expanded to test extent or intersections. Separate displays are used for each of δ18O and δ2H, with colour coding for isotope values.

Relevance: 20.00%

Abstract:

Data analysis sessions are a common feature of discourse analytic communities, often involving participants ranging from those with little experience to those with significant expertise. Learning how to do data analysis and working with transcripts, however, are often new experiences for doctoral candidates within the social sciences. While many guides to doctoral education focus on procedures associated with data analysis (Heath, Hindmarsh, & Luff, 2010; McHoul & Rapley, 2001; Silverman, 2011; Wetherall, Taylor, & Yates, 2001), the in situ practices of doing data analysis are relatively undocumented. This chapter has been collaboratively written by members of a special interest research group, the Transcript Analysis Group (TAG), who meet regularly to examine transcripts representing audio- and video-recorded interactional data. Here, we investigate our own interactional practices and participation in this group, where each member is both analyst and participant. We particularly focus on the pedagogic practices enacted in the group by investigating how members engage in the scholarly practice of data analysis. A key feature of talk within the data sessions is that members work collaboratively to identify and discuss ‘noticings’ from the audio-recorded and transcribed talk being examined, produce candidate analytic observations based on these discussions, and evaluate these observations. Our investigation of how talk constructs social practices in these sessions shows that participants move fluidly between actions that demonstrate pedagogic practices and expertise. Within any one session, members can display their expertise as analysts and, at the same time, display that they have gained an understanding that they did not have before. We take an ethnomethodological position that asks, ‘what’s going on here?’ in the data analysis session.
By observing the in situ practices in fine-grained detail, we show how members participate in the data analysis sessions and make sense of a transcript.

Relevance: 20.00%

Abstract:

A forced landing is an unscheduled event in flight requiring an emergency landing, and is most commonly attributed to engine failure, failure of avionics or adverse weather. Since the ability to conduct a successful forced landing is the primary indicator for safety in the aviation industry, automating this capability for unmanned aerial vehicles (UAVs) will help facilitate their integration into, and subsequent routine operations over, civilian airspace. Currently, there is no commercial system available to perform this task; however, a team at the Australian Research Centre for Aerospace Automation (ARCAA) is working towards developing such an automated forced landing system. This system, codenamed Flight Guardian, will operate onboard the aircraft and use machine vision for site identification, artificial intelligence for data assessment and evaluation, and path planning, guidance and control techniques to actualize the landing. This thesis focuses on research specific to the third category, and presents the design, testing and evaluation of a Trajectory Generation and Guidance System (TGGS) that navigates the aircraft to land at a chosen site following an engine failure. Firstly, two algorithms are developed that adapt manned aircraft forced landing techniques to suit the UAV planning problem. Algorithm 1 allows the UAV to select a route (from a library) based on a fixed glide range and the ambient wind conditions, while Algorithm 2 uses a series of adjustable waypoints to cater for changing winds. A comparison of both algorithms over 200 simulated forced landings found that using Algorithm 2, twice as many landings were within the designated area, with an average lateral miss distance of 200 m at the aimpoint. These results present a baseline for further refinements to the planning algorithms.
A significant contribution is seen in the design of the 3-D Dubins Curves planning algorithm, which extends the elementary concepts underlying 2-D Dubins paths to account for powerless flight in three dimensions. This has also resulted in the development of new methods for testing path traversability, for losing excess altitude, and for the actual path formation to ensure aircraft stability. Simulations using this algorithm have demonstrated lateral and vertical miss distances of under 20 m at the approach point, in wind speeds of up to 9 m/s. This is greater than a tenfold improvement on Algorithm 2 and emulates the performance of manned, powered aircraft. The lateral guidance algorithm originally developed by Park, Deyst, and How (2007) is enhanced to include wind information in the guidance logic. A simple assumption is also made that reduces the complexity of the algorithm in following a circular path, yet without sacrificing performance. Finally, a specific method of supplying the correct turning direction is also used. Simulations have shown that this new algorithm, named the Enhanced Nonlinear Guidance (ENG) algorithm, performs much better in changing winds, with cross-track errors at the approach point within 2 m, compared to over 10 m using Park's algorithm. A fourth contribution is made in designing the Flight Path Following Guidance (FPFG) algorithm, which uses path angle calculations and MacCready theory to determine the optimal speed to fly in winds. This algorithm also uses proportional-integral-derivative (PID) gain schedules to finely tune the tracking accuracies, and has demonstrated in simulation vertical miss distances of under 2 m in changing winds. A fifth contribution is made in designing the Modified Proportional Navigation (MPN) algorithm, which uses principles from proportional navigation and the ENG algorithm, as well as methods of its own, to calculate the required pitch to fly.
This algorithm is robust to wind changes, and is easily adaptable to any aircraft type. Tracking accuracies obtained with this algorithm are also comparable to those obtained using the FPFG algorithm. For all three preceding guidance algorithms, a novel method utilising the geometric and time relationship between aircraft and path is also employed to ensure that the aircraft is still able to track the desired path to completion in strong winds, while remaining stabilised. Finally, a derived contribution is made in modifying the 3-D Dubins Curves algorithm to suit helicopter flight dynamics. This modification allows a helicopter to autonomously track both stationary and moving targets in flight, and is highly advantageous for applications such as traffic surveillance, police pursuit, security or payload delivery. Each of these achievements serves to enhance the on-board autonomy and safety of a UAV, which in turn will help facilitate the integration of UAVs into civilian airspace for a wider appreciation of the good that they can provide. The automated UAV forced landing planning and guidance strategies presented in this thesis will allow the progression of this technology from the design and developmental stages, through to a prototype system that can demonstrate its effectiveness to the UAV research and operations community.
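The baseline lateral guidance law that the ENG algorithm builds on has a compact form: the published Park, Deyst and How (2007) law commands a lateral acceleration a_cmd = 2V²/L1 · sin(η), where η is the angle between the velocity vector and the line to a reference point a distance L1 ahead on the path. A minimal sketch of that baseline is below; the wind enhancements and turning-direction logic described in the abstract are not reproduced, and the numeric values are illustrative only.

```python
import math

def l1_lateral_accel(v, l1, eta):
    """Lateral acceleration command of the nonlinear guidance law of
    Park, Deyst and How (2007): a_cmd = 2 * V^2 / L1 * sin(eta).

    v   -- aircraft speed (m/s)
    l1  -- look-ahead distance to the reference point on the path (m)
    eta -- angle (rad) between velocity vector and line to that point
    """
    return 2.0 * v ** 2 / l1 * math.sin(eta)

# On the path and heading along it: eta = 0, so no correction is commanded.
print(l1_lateral_accel(v=20.0, l1=50.0, eta=0.0))            # 0.0
# Displaced from the path: a small eta yields a proportional correction.
print(round(l1_lateral_accel(v=20.0, l1=50.0, eta=0.1), 2))  # 1.6
```

For small η the law behaves like a proportional-derivative controller on cross-track error, which is why tuning reduces to choosing L1 relative to speed and path curvature.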

Relevance: 20.00%

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
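For a finite function class the maximal discrepancy described above can be computed by direct enumeration: split the sample in half and take the largest difference between the empirical errors on the two halves. The sketch below does this for an invented class of 1-D threshold classifiers on invented data; the label-flipping equivalence noted in the abstract is what makes the same quantity computable via empirical risk minimization when the class is too large to enumerate.

```python
# Maximal discrepancy over a finite class of 1-D threshold classifiers.
# discrepancy = max_f [ err(f, first half) - err(f, second half) ];
# per the abstract, this equals (up to constants) minimizing empirical
# error after flipping the labels of one half of the data.

def err(f, sample):
    """Empirical 0-1 error of classifier f on a list of (x, y) pairs."""
    return sum(f(x) != y for x, y in sample) / len(sample)

def discrepancy(F, sample):
    half = len(sample) // 2
    first, second = sample[:half], sample[half:]
    return max(err(f, first) - err(f, second) for f in F)

# Hypothetical finite class: h_t(x) = 1 if x >= t else 0.
thresholds = [i / 10 for i in range(11)]
F = [(lambda t: (lambda x: int(x >= t)))(t) for t in thresholds]

# Invented sample whose two halves carry opposite labelings, so some
# threshold is perfect on one half and maximally wrong on the other.
data = [(0.2, 1), (0.4, 1), (0.6, 0), (0.8, 0),   # first half
        (0.1, 0), (0.3, 0), (0.7, 1), (0.9, 1)]   # second half
print(discrepancy(F, data))   # 1.0
```

A large discrepancy signals a rich class that can fit arbitrary relabelings, and therefore warrants a large complexity penalty.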

Relevance: 20.00%

Abstract:

We investigate the use of certain data-dependent estimates of the complexity of a function class, called Rademacher and Gaussian complexities. In a decision theoretic setting, we prove general risk bounds in terms of these complexities. We consider function classes that can be expressed as combinations of functions from basis classes and show how the Rademacher and Gaussian complexities of such a function class can be bounded in terms of the complexity of the basis classes. We give examples of the application of these techniques in finding data-dependent risk bounds for decision trees, neural networks and support vector machines.
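Because the Rademacher complexity described above is an expectation over random ±1 signs, for a small finite class it can be estimated directly by Monte Carlo. The class of sign-threshold functions and the sample points below are invented for illustration; for rich classes the supremum would instead be computed by the learning algorithm itself.

```python
import random

random.seed(0)

def empirical_rademacher(F, xs, trials=2000):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_hat(F) = E_sigma[ sup_{f in F} (1/n) sum_i sigma_i f(x_i) ],
    with sigma_i i.i.d. uniform on {-1, +1}."""
    n = len(xs)
    total = 0.0
    for _ in range(trials):
        sigma = [random.choice((-1, 1)) for _ in range(n)]
        total += max(sum(s * f(x) for s, x in zip(sigma, xs)) / n
                     for f in F)
    return total / trials

xs = [i / 9 for i in range(10)]
# Hypothetical finite class of sign-threshold functions on [0, 1].
F = [(lambda t: (lambda x: 1 if x >= t else -1))(t)
     for t in (0.0, 0.25, 0.5, 0.75, 1.0)]
r = empirical_rademacher(F, xs)
print(round(r, 3))
```

The estimate is data-dependent in exactly the sense of the abstract: it is computed from the observed sample, not from worst-case quantities such as the VC dimension.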

Relevance: 20.00%

Abstract:

We study sample-based estimates of the expectation of the function produced by the empirical minimization algorithm. We investigate the extent to which one can estimate the rate of convergence of the empirical minimizer in a data dependent manner. We establish three main results. First, we provide an algorithm that upper bounds the expectation of the empirical minimizer in a completely data-dependent manner. This bound is based on a structural result due to Bartlett and Mendelson, which relates expectations to sample averages. Second, we show that these structural upper bounds can be loose, compared to previous bounds. In particular, we demonstrate a class for which the expectation of the empirical minimizer decreases as O(1/n) for sample size n, although the upper bound based on structural properties is Ω(1). Third, we show that this looseness of the bound is inevitable: we present an example that shows that a sharp bound cannot be universally recovered from empirical data.

Relevance: 20.00%

Abstract:

We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification and prediction with convex function classes, and with kernel classes in particular.

Relevance: 20.00%

Abstract:

In fault detection and diagnostics, limitations arising from the sensor network architecture are one of the main challenges in evaluating a system’s health status. Usually the design of the sensor network architecture is not based solely on diagnostic purposes; other factors, such as controls, financial constraints, and practical limitations, are also involved. As a result, it is quite common to have one sensor (or one set of sensors) monitoring the behaviour of two or more components. This can significantly increase the complexity of diagnostic problems. In this paper a systematic approach is presented to deal with such complexities. It is shown how the problem can be formulated as a Bayesian-network-based diagnostic mechanism with latent variables. The developed approach is also applied to the problem of fault diagnosis in HVAC systems, an application area with considerable modeling and measurement constraints.
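A toy version of the shared-sensor situation described above: two components are monitored by a single noisy alarm, and the posterior fault probability of each component is obtained by exact enumeration, marginalizing out the other (latent) component state. All priors and sensor probabilities are invented; the paper's HVAC model is far richer than this sketch.

```python
from itertools import product

# Hypothetical priors: P(component faulty).
p_fault = {"c1": 0.05, "c2": 0.10}

def p_alarm(c1, c2):
    """P(alarm | c1, c2): the shared sensor fires if either component
    is faulty (hit rate 0.99), with a 0.02 false-alarm rate (assumed)."""
    return 0.99 if (c1 or c2) else 0.02

def posterior(component):
    """P(component faulty | alarm observed), by summing the joint over
    all four component-state combinations."""
    num = den = 0.0
    for c1, c2 in product((False, True), repeat=2):
        p = ((p_fault["c1"] if c1 else 1 - p_fault["c1"])
             * (p_fault["c2"] if c2 else 1 - p_fault["c2"])
             * p_alarm(c1, c2))
        den += p
        if c1 if component == "c1" else c2:
            num += p
    return num / den

print(round(posterior("c1"), 3), round(posterior("c2"), 3))
```

Note that a single alarm raises both posteriors but cannot by itself say which component failed; this ambiguity is exactly the extra complexity that a shared sensor introduces, and extra evidence (or more sensors) is needed to disambiguate.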

Relevance: 20.00%

Abstract:

The reduction of CO2 emissions and social exclusion are two key elements of UK transport strategy. Despite intensive research on each theme, little effort has so far been made to link emissions and social exclusion. In addition, current knowledge on each theme is limited to urban areas; little research is available on these themes for rural areas. This research addresses this gap in the literature by analysing 157 weekly activity-travel diaries collected from three case study areas with differential levels of area accessibility and area mobility options, located in rural Northern Ireland. Individual weekly CO2 emission levels from personal travel diaries (both hot exhaust emissions and cold-start emissions) were calculated using average speed models for different modes of transport. The socio-spatial patterns associated with CO2 emissions were identified using a general linear model, whereas binary logistic regression analyses were conducted to identify mode choice behaviour and activity patterns. This research found that groups emitting a significantly lower level of CO2 included individuals living in an area with a higher level of accessibility and mobility, those without access to a car, those not in work, and low-income older people. However, evidence in this research also shows that although certain groups (e.g. those working, and those residing in an area with a lower level of accessibility) emitted higher levels of CO2, their rate of participation in activities was significantly lower compared to their counterparts. Based on these findings, this research highlights the need for both soft (e.g. teleworking) and physical (e.g. accessibility planning) policy measures in rural areas in order to meet the government’s stated CO2 reduction targets while at the same time enhancing social inclusion.
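Average speed emission models of the kind mentioned above typically express the hot-exhaust emission factor as a function of trip average speed, with cold-start emissions added as a separate excess. The sketch below uses a generic polynomial speed curve with invented coefficients and an invented fixed cold-start excess; these are not the factors used in the study.

```python
def co2_grams(distance_km, avg_speed_kmh, cold_start=False):
    """Hot-exhaust CO2 from a generic average-speed curve, plus an
    optional cold-start excess. All coefficients are assumed values
    for illustration, not calibrated emission factors."""
    a, b, c = 110.0, 1500.0, 0.015                       # assumed g/km curve
    ef = a + b / avg_speed_kmh + c * avg_speed_kmh ** 2  # g CO2 per km
    hot = ef * distance_km
    excess = 200.0 if cold_start else 0.0                # assumed cold-start g
    return hot + excess

# A hypothetical weekly diary: (distance km, average speed km/h, cold start?)
week = [(12, 45, True), (12, 45, False), (3, 25, True)]
total = sum(co2_grams(*trip) for trip in week)
print(round(total / 1000, 2), "kg CO2 for the week")
```

The U-shape of the curve (high factors at very low and very high speeds) is what lets such models distinguish congested short rural-town trips from steady inter-town driving when aggregating a diary to a weekly total.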