700 results for Data reporting


Relevance:

20.00%

Publisher:

Abstract:

The paper analyses the expected value of OD volumes from probes under three measurement-error models: a fixed error, an error proportional to zone size, and an error inversely proportional to zone size. To add realism to the analysis, real trip ODs in the Tokyo Metropolitan Region are synthesised. The results show that for small zone coding with an average radius of 1.1 km and a fixed measurement error of 100 m, an accuracy of 70% can be expected. The equivalent accuracy for medium zone coding with an average radius of 5 km would translate into a fixed error of approximately 300 m. As expected, small zone coding is more sensitive than medium zone coding, as the chances of the probe error envelope falling into adjacent zones are higher. For the same error radii, error proportional to zone size would deliver a higher level of accuracy. As over half (54.8%) of trip ends start or end at zones with an equivalent radius of ≤ 1.2 km and only 13% of trip ends occurred at zones with an equivalent radius of ≥ 2.5 km, measurement error that is proportional to zone size, such as that of mobile phone positioning, would deliver a higher level of accuracy. The synthesis of real ODs with different probe error characteristics has shown that an expected value of > 85% is difficult to achieve for small zone coding with an average radius of 1.1 km. For most transport applications, an OD matrix at medium zone coding is sufficient for transport management. From this study it can be drawn that GPS, with an error range between 2 and 5 m, at medium zone coding (average radius of 5 km) would provide OD estimates greater than 90% of the expected value. However, for a typical mobile phone operating error range at medium zone coding, the expected value would be lower than 85%. This paper assumes transmission of one origin and one destination position from the probe. However, if multiple positions within the origin and destination zones are transmitted, map matching to the transport network could be performed, which would greatly improve the accuracy of the probe data.
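The sensitivity of zone coding to probe error that this abstract describes can be illustrated with a small Monte Carlo sketch. This is an illustrative model only (circular zones, isotropic Gaussian error), not the paper's synthesis of Tokyo trip ODs:

```python
import numpy as np

rng = np.random.default_rng(0)

def zone_accuracy(zone_radius_m, error_m, n=100_000):
    """Fraction of probe readings still coded to the correct circular zone.

    True trip ends are drawn uniformly over a disc of the given radius and
    perturbed by an isotropic Gaussian measurement error (illustrative
    model, not the paper's actual error characterisation).
    """
    # Uniform points on a disc: scale the radius by sqrt(u).
    r = zone_radius_m * np.sqrt(rng.uniform(size=n))
    theta = rng.uniform(0, 2 * np.pi, size=n)
    x, y = r * np.cos(theta), r * np.sin(theta)
    # Add the isotropic measurement error.
    x += rng.normal(0, error_m, size=n)
    y += rng.normal(0, error_m, size=n)
    return np.mean(np.hypot(x, y) <= zone_radius_m)

# Small zones (1.1 km radius) with a 100 m fixed error vs
# medium zones (5 km radius) with a 300 m fixed error.
print(zone_accuracy(1100, 100))
print(zone_accuracy(5000, 300))
```

The sketch reproduces the qualitative effect: for the same zone radius, a larger error radius pushes more readings over the zone boundary, so small-zone coding degrades faster than medium-zone coding.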

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a model to estimate travel time using cumulative plots. Three cases are considered: i) case-Det, using only detector data; ii) case-DetSig, using detector data and signal controller data; and iii) case-DetSigSFR, using detector data, signal controller data and the saturation flow rate. The performance of the model for different detection intervals is evaluated. It is observed that the detection interval is not critical if signal timings are available: comparable accuracy can be obtained from a larger detection interval with signal timings or from a shorter detection interval without them. The performance for case-DetSig and case-DetSigSFR is consistent, with accuracy generally above 95%, whereas case-Det is highly sensitive to the signal phases within the detection interval and its performance is uncertain if the detection interval is an integral multiple of the signal cycle.
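The core idea behind travel-time estimation from cumulative plots can be sketched as follows: under a FIFO assumption, the travel time of the n-th vehicle is the horizontal gap between the upstream and downstream cumulative count curves at count n. This is a minimal sketch; the paper's case-Det/DetSig/DetSigSFR variants reconstruct those curves from detector and signal data rather than observing them directly:

```python
import numpy as np

def travel_times_from_cumulative(t_up, n_up, t_down, n_down):
    """Travel times as the horizontal gap between the upstream and
    downstream cumulative count curves at equal count (FIFO assumption)."""
    counts = np.arange(1, min(n_up[-1], n_down[-1]) + 1)
    # Time at which each cumulative count is reached at each detector.
    t_enter = np.interp(counts, n_up, t_up)
    t_exit = np.interp(counts, n_down, t_down)
    return t_exit - t_enter

# Toy data: one vehicle per second past the upstream detector, each
# reappearing 30 s later at the downstream detector.
t_up = np.arange(10.0)
n_up = np.arange(1, 11)
t_down = t_up + 30.0
n_down = n_up
print(travel_times_from_cumulative(t_up, n_up, t_down, n_down))  # every vehicle: 30.0 s
```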

Relevance:

20.00%

Publisher:

Abstract:

Light Detection and Ranging (LIDAR) has great potential to assist vegetation management in power line corridors by providing more accurate geometric information about the power line assets and the vegetation along the corridors. However, the development of algorithms for the automatic processing of LIDAR point cloud data, in particular for feature extraction and classification of raw point cloud data, is still in its infancy. In this paper, we take advantage of LIDAR intensity and classify ground and non-ground points by statistically analyzing the skewness and kurtosis of the intensity data. Moreover, the Hough transform is employed to detect power lines from the filtered object points. The experimental results show the effectiveness of our methods and indicate that better results were obtained by using LIDAR intensity data than elevation data.
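One common way to use a distribution's skewness for ground/object separation is skewness balancing: peel off the highest values until the remaining distribution is no longer right-skewed. The sketch below applies that idea to intensity values; it is illustrative only, since the paper analyses both skewness and kurtosis and its exact procedure is not reproduced here:

```python
import numpy as np

def skewness(x):
    """Sample skewness (third standardised moment)."""
    x = np.asarray(x, float)
    m, s = x.mean(), x.std()
    return ((x - m) ** 3).mean() / s ** 3

def separate_ground(intensity):
    """Skewness-balancing sketch: repeatedly remove the highest value
    until the remaining distribution is no longer right-skewed, then use
    the last kept value as a threshold. Returns a boolean mask of
    candidate ground points (illustrative simplification)."""
    vals = np.sort(np.asarray(intensity, float))
    while vals.size > 3 and skewness(vals) > 0:
        vals = vals[:-1]              # peel the top of the right tail
    threshold = vals[-1]
    return intensity <= threshold

# Synthetic demo: low-intensity "ground" returns plus a brighter,
# smaller object cluster (made-up values, not real LIDAR data).
rng = np.random.default_rng(1)
intensity = np.concatenate([rng.normal(30, 5, 1000), rng.normal(80, 10, 200)])
mask = separate_ground(intensity)
```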

Relevance:

20.00%

Publisher:

Abstract:

Few studies have examined the interactive effects of air pollution and temperature on health outcomes. This study examines whether temperature modified the effect of ozone on cardiovascular mortality in 95 large US cities. A nonparametric and a parametric regression model were separately used to explore the interactive effects of temperature and ozone on cardiovascular mortality between May and October, 1987-2000. A Bayesian meta-analysis was used to pool the estimates. Both models illustrate that temperature enhanced the ozone effects on mortality in the northern region, but not obviously in the southern region. A 10-ppb increment in ozone was associated with 0.41% (95% posterior interval (PI): -0.19%, 0.93%), 0.27% (95% PI: -0.44%, 0.87%) and 1.68% (95% PI: 0.07%, 3.26%) increases in daily cardiovascular mortality at low, moderate and high levels of temperature, respectively. We conclude that temperature modified the effects of ozone, particularly in the northern region.
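The pooling step can be illustrated with the classical inverse-variance (fixed-effect) estimator that a Bayesian meta-analysis generalises; a hierarchical Bayesian pooling additionally shrinks each city's estimate toward the overall mean. The numbers below are made up, not the paper's city-level estimates:

```python
import numpy as np

def pool_fixed_effect(estimates, std_errors):
    """Inverse-variance (fixed-effect) pooling of effect estimates:
    each study is weighted by the precision of its estimate."""
    estimates = np.asarray(estimates, float)
    w = 1.0 / np.square(np.asarray(std_errors, float))
    pooled = np.sum(w * estimates) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled_se

# Hypothetical % increases in mortality per 10-ppb ozone for three cities.
est, se = pool_fixed_effect([0.5, 0.2, 0.9], [0.3, 0.4, 0.6])
```

Precision weighting means imprecise cities pull the pooled estimate only weakly, and the pooled standard error is always smaller than the smallest single-city standard error.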

Relevance:

20.00%

Publisher:

Abstract:

Total deposition of petrol, diesel and environmental tobacco smoke (ETS) aerosols in the human respiratory tract under nasal breathing conditions was computed for 14 nonsmoking volunteers, considering the specific anatomical and respiratory parameters of each volunteer and the specific size distribution for each inhalation experiment. Theoretical predictions were 34.6% for petrol, 24.0% for diesel, and 18.5% for ETS particles. Predicted deposition values were consistently smaller than the measured data (41.4% for petrol, 29.6% for diesel, and 36.2% for ETS particles). The apparent discrepancy between the experimental data on total deposition and the modeling results may be reconciled by accounting for the non-spherical shape of the test aerosols through diameter-dependent dynamic shape factors, which capture the differences between mobility-equivalent and volume-equivalent or thermodynamic diameters. While the application of dynamic shape factors can explain the observed differences for petrol and diesel particles, additional mechanisms may be required for ETS particle deposition, such as size reduction upon inspiration through evaporation of volatile compounds and/or condensation-induced restructuring, and, possibly, electrical charge effects.

Relevance:

20.00%

Publisher:

Abstract:

Traffic congestion is a growing problem with high financial, social and personal costs. These costs include psychological and physiological stress, aggression and fatigue caused by lengthy delays, and an increased likelihood of road crashes. Reliable and accurate traffic information is essential for the development of traffic control and management strategies. Traffic information is mostly gathered from in-road vehicle detectors such as induction loops. The Traffic Message Channel (TMC) service is a popular service that wirelessly sends traffic information to drivers. Traffic probes have been used in many cities to increase the accuracy of traffic information. A simulation to estimate the number of probe vehicles required to increase the accuracy of traffic information in Brisbane is proposed. A meso-level traffic simulator has been developed to facilitate the identification of the optimal number of probe vehicles required to achieve an acceptable level of traffic-reporting accuracy. Our approach to determining the optimal number of probe vehicles required to meet quality-of-service requirements is to perform simulation runs with varying numbers of traffic probes. The simulated traffic represents Brisbane's typical morning traffic. The road maps used in the simulation are Brisbane's TMC maps, complete with speed limits and traffic lights. Experimental results show that the optimal number of probe vehicles required to provide a useful supplement to TMC (induction loop) data lies between 0.5% and 2.5% of vehicles on the road. With fewer than 0.25% probes, little additional information is provided, while above 5%, adding more probes has only a negligible effect on accuracy. Our findings are consistent with ongoing research on traffic probes, and show the effectiveness of using probe vehicles to supplement induction loops for accurate and timely traffic information.
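The diminishing-returns shape of the penetration-rate result can be reproduced with a toy experiment: estimate a link's mean speed from a random subset of vehicles acting as probes, and watch the estimation error shrink roughly with the square root of the probe count. All parameters here are made up for illustration, not taken from the Brisbane meso-level simulator:

```python
import numpy as np

rng = np.random.default_rng(2)

def speed_error(penetration, n_vehicles=10_000, mean_speed=40.0, sd=8.0):
    """Absolute error (km/h) in estimating a link's mean speed from a
    random fraction of vehicles acting as probes (toy model with
    normally distributed individual speeds)."""
    speeds = rng.normal(mean_speed, sd, size=n_vehicles)
    n_probes = max(1, int(penetration * n_vehicles))
    sample = rng.choice(speeds, size=n_probes, replace=False)
    return abs(sample.mean() - speeds.mean())

# Average error over repeated runs at two penetration rates.
err_low = np.mean([speed_error(0.0025) for _ in range(200)])   # 0.25% probes
err_high = np.mean([speed_error(0.025) for _ in range(200)])   # 2.5% probes
print(err_low, err_high)
```

Because the sampling error falls as 1/sqrt(n_probes), most of the accuracy gain is captured at low penetration rates, consistent with the 0.5%-2.5% optimum reported above.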

Relevance:

20.00%

Publisher:

Abstract:

Traffic law enforcement is based on deterrence principles, whereby drivers control their behaviour in order to avoid an undesirable sanction. For "hooning"-related driving behaviours in Queensland, the driver's vehicle can be impounded for 48 hours, 3 months, or permanently, depending on the number of previous hooning offences. It is assumed that the threat of losing something of value, their vehicle, will discourage drivers from hooning. While official data show that the rate of repeat offending is low, an in-depth understanding of the deterrent effects of these laws requires qualitative research with targeted drivers. A sample of 22 drivers who reported engaging in hooning behaviours participated in focus group discussions about the vehicle impoundment laws as applied to hooning offences in Queensland. The findings suggested that deterrence theory alone cannot fully explain hooning behaviour, as participants reported hooning frequently, and intended to continue doing so, despite believing it likely that they will be caught and perceiving the vehicle impoundment laws to be extremely severe. The punishment avoidance aspect of deterrence theory appears important, as do factors over and above legal issues, particularly social influences. A concerning finding was drivers' willingness to flee from police in order to avoid losing their vehicle permanently for a third offence, despite acknowledging risks to their own safety and that of others. This paper discusses the study findings in terms of the implications for future research directions, enforcement practices and policy development for hooning and other traffic offences to which vehicle impoundment is applied.

Relevance:

20.00%

Publisher:

Abstract:

The wide range of contributing factors and circumstances surrounding crashes on road curves suggests that no single intervention can prevent these crashes. This paper presents a novel methodology, based on data mining techniques, to identify contributing factors and the relationships between them, and to identify the contributing factors that influence the risk of a crash. Incident records, described using free text, from a large insurance company were analysed with rough set theory, which was used to discover dependencies among the data and to reason with the vague, uncertain and imprecise information that characterised the insurance dataset. The results show that male drivers between 50 and 59 years old, driving during evening peak hours and involved in a collision, had the lowest crash risk. Drivers between 25 and 29 years old, driving from around midnight to 6 am in a new car, had the highest risk. The analysis of the most significant contributing factors on curves suggests that drivers with 25 to 42 years of driving experience who are driving a new vehicle have the highest crash-cost risk, characterised by the vehicle running off the road and hitting a tree. This research complements existing statistically based tools for analysing road crashes. Our data mining approach is supported by proven theory and will allow road safety practitioners to understand the dependencies between contributing factors and crash type, with a view to designing tailored countermeasures.
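The rough set machinery used here can be sketched in miniature: records that agree on all condition attributes form elementary (indiscernibility) classes, and a decision concept's lower approximation collects the classes whose members all share that decision. The records below are toy examples; the paper works from free-text insurance data:

```python
from collections import defaultdict

def indiscernibility_classes(records, attributes):
    """Group records that are indistinguishable on the chosen condition
    attributes -- the elementary sets of rough set theory."""
    classes = defaultdict(list)
    for rec in records:
        key = tuple(rec[a] for a in attributes)
        classes[key].append(rec)
    return dict(classes)

def lower_approximation(classes, decision, value):
    """Elementary sets whose members ALL take the given decision value:
    these records certainly belong to the concept."""
    return [recs for recs in classes.values()
            if all(r[decision] == value for r in recs)]

# Toy crash records (hypothetical attribute values, not the insurance data).
records = [
    {"age": "50-59", "time": "evening", "severity": "low"},
    {"age": "50-59", "time": "evening", "severity": "low"},
    {"age": "25-29", "time": "night",   "severity": "high"},
    {"age": "25-29", "time": "night",   "severity": "low"},
]
classes = indiscernibility_classes(records, ["age", "time"])
certain_low = lower_approximation(classes, "severity", "low")
# Only the 50-59/evening class is certainly "low"; the 25-29/night class
# mixes outcomes, so it falls outside the lower approximation.
```

Dependencies between condition attributes and the decision are then read off from how cleanly the elementary sets fall inside the lower approximations.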

Relevance:

20.00%

Publisher:

Abstract:

Objective: The objectives of this article are to explore the extent to which the International Statistical Classification of Diseases and Related Health Problems (ICD) has been used in child abuse research, to describe how the ICD system has been applied, and to assess factors affecting the reliability of ICD-coded data in child abuse research.----- Methods: PubMed, CINAHL, PsycINFO and Google Scholar were searched for peer-reviewed articles written since 1989 that used ICD as the classification system to identify cases and research child abuse using health databases. Snowballing strategies were also employed by searching the bibliographies of retrieved references to identify relevant associated articles. The papers identified through the search were independently screened by two authors for inclusion, resulting in 47 studies selected for the review. Due to the heterogeneity of the studies, meta-analysis was not performed.----- Results: This paper highlights both the utility and the limitations of ICD-coded data. ICD codes have been widely used to conduct research into child maltreatment in health data systems. The codes appear to be used primarily to determine child maltreatment patterns within identified diagnoses or to identify child maltreatment cases for research.----- Conclusions: A significant impediment to the use of ICD codes in child maltreatment research is the under-ascertainment of child maltreatment when coded data are used alone. This is most clearly identified, and to some degree quantified, in research where data linkage is used.----- Practice Implications: Improved child maltreatment identification will assist in identifying risk factors, creating programs that can prevent and treat child maltreatment, and meeting reporting obligations under the CRC.

Relevance:

20.00%

Publisher:

Abstract:

A data-driven background dataset refinement technique was recently proposed for SVM-based speaker verification. This method selects a refined SVM background dataset from a set of candidate impostor examples after individually ranking the examples by their relevance. This paper extends the technique to the refinement of the T-norm dataset for SVM-based speaker verification. The independent refinement of the background and T-norm datasets provides a means of investigating the sensitivity of SVM-based speaker verification performance to the selection of each of these datasets. Using refined datasets provided improvements of 13% in min. DCF and 9% in EER over the full set of impostor examples on the 2006 SRE corpus, with the majority of these gains due to refinement of the T-norm dataset. Similar trends were observed on the unseen data of the NIST 2008 SRE.

Relevance:

20.00%

Publisher:

Abstract:

1. Ecological data sets often use clustered measurements or repeated sampling in a longitudinal design. Choosing the correct covariance structure is an important step in the analysis of such data, as the covariance describes the degree of similarity among the repeated observations. 2. Three methods for choosing the covariance structure are the Akaike information criterion (AIC), the quasi-information criterion (QIC) and the deviance information criterion (DIC). We compared the methods using a simulation study and a data set that explored the effects of forest fragmentation on avian species richness over 15 years. 3. The overall success rate was 80.6% for the AIC, 29.4% for the QIC and 81.6% for the DIC. For the forest fragmentation study, the AIC and DIC selected the unstructured covariance, whereas the QIC selected the simpler autoregressive covariance. Graphical diagnostics suggested that the unstructured covariance was probably correct. 4. We recommend using the DIC for selecting the correct covariance structure.
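The selection mechanics are the same for all three criteria: fit the model under each candidate covariance structure, score each fit with a penalised goodness-of-fit measure, and pick the minimum. A minimal AIC sketch with hypothetical log-likelihoods is below (the QIC and DIC substitute quasi-likelihood and Bayesian-deviance analogues for the likelihood term):

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: 2k - 2 ln L, lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fits of the same repeated-measures data under three working
# covariance structures. Parameter counts: independence has 1 (a variance),
# AR(1) has 2 (variance + correlation), and an unstructured covariance for
# 4 time points has 4*5/2 = 10. The log-likelihoods are made up.
candidates = {
    "independence": (-540.2, 1),
    "AR(1)":        (-512.9, 2),
    "unstructured": (-508.4, 10),
}
scores = {name: aic(ll, k) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)
print(best)  # AR(1)
```

The example shows the penalty at work: the unstructured covariance fits best in raw likelihood, but its 10 parameters cost more than the fit improvement is worth, so the criterion prefers AR(1).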

Relevance:

20.00%

Publisher:

Abstract:

This report provides an introduction to our analyses of secondary data on violent acts and incidents relating to males living in rural settings in Australia. It clarifies important aspects of our overall approach, concentrating on three elements that required early scoping and resolution. Firstly, a wide and inclusive view of violence, encompassing measures of violent acts and incidents as well as data identifying risk-taking behaviour and the consequences of violence, is outlined and justified. Secondly, the classification used to make comparisons between the city and the bush, together with associated caveats, is outlined. Thirdly, national injury data are discussed. Additional commentary resulting from the exploration, examination and analysis of secondary data is published online in five subsequent reports in this series.

Relevance:

20.00%

Publisher:

Abstract:

Thousands of Australian children are sexually abused every year, and the effects can be severe and long lasting. Not only is child sexual abuse a public health problem, but the acts inflicted are criminal offences. Child sexual abuse usually occurs in private, typically within relationships featuring a massive imbalance in power and an abuse of that power. Those who inflict child sexual abuse seek to keep it secret, whether by threats or more subtle persuasion. As a method of responding to this phenomenon, and in an effort to uncover cases of sexual abuse that otherwise would not come to light, governments in Australian States and Territories have enacted legislation requiring designated persons to report suspected child sexual abuse. With Western Australia's new legislation having commenced on 1 January 2009, every Australian State and Territory government has now passed these laws, so that there is now, for the first time, an almost harmonious legislative approach across Australia to the reporting of child sexual abuse. Yet there remain differences in the State and Territory laws regarding who has to make reports, which cases of sexual abuse are required to be reported, and whether suspected future abuse must be reported. These differences indicate that further refinement of the laws is required.

Relevance:

20.00%

Publisher:

Abstract:

This study investigates the everyday practices of young children acting in their social worlds within the context of the school playground. It employs an ethnographic, ethnomethodological approach using conversation analysis. In the context of the child participation rights advanced by the United Nations Convention on the Rights of the Child (UNCRC) and by childhood studies, the study considers children's social worlds and their participation agendas. The participants were a group of young children in a preparatory-year setting in a Queensland school. These children, aged 4 to 6 years, were video-recorded as they participated in their day-to-day activities in the classroom and in the playground. Data collection took place over a period of three months, yielding a total of 26 hours of video data. Episodes of the video recordings were shown to small groups of children and to the teacher to stimulate conversations about what they saw on the video, and these conversations were audio-recorded. This method acknowledged the child's standpoint and positioned children as active participants in accounting for their relationships with others. These accounts are discussed as interactionally built comments on past joint experiences and provided a starting place for the analysis of the video-recorded interaction. Four data chapters are presented in this thesis, each investigating a different topic of interaction: how children use "telling" as a tactical tool in the management of interactional trouble; how children use their "ideas" as possessables to gain ownership of a game, and the interactional matters that follow; how children account for interactional matters and bid for ownership of "whose idea" the game was; and how a small group of girls orientated to a particular code of conduct when accounting for their actions in a pretend game of "school". Four key themes emerged from the analysis.
The first theme addresses two arenas of action operating in the social world of children: the "pretend", as a player in a pretend game, and the "real", as a classroom member. These two arenas are intertwined. Through references to explicit and implicit "codes of conduct", moral obligations are invoked as children attempt to socially exclude one another, build alliances and enforce their own social positions. The second theme is the notion of shared history. This theme addresses the history that the children reconstructed, which acts as a thread weaving through their interactions, with implications for present and future relationships. The third theme concerns ownership. In a shared context such as the playground, ownership is a highly contested issue. Children draw on resources such as rules, their ideas as possessables, and codes of behaviour as devices to construct particular social and moral orders around ownership of the game. These themes have consequences for children's participation in a social group. The fourth theme, methodological in nature, shows how the researcher was viewed as an outsider and novice and was used as a resource by the children; this theme informs understandings of adult-child relationships. The study was situated within an interest in participation rights for children and perspectives of children as competent beings. Asking children to account for their participation in playground activities positions them as analysers of their own social worlds and offers adults further information for understanding how children themselves construct their social interactions. While reporting on the experiences of one group of children, this study opens up theoretical questions about children's social orders and their influence on children's everyday practices. This thesis uncovers how children both participate in, and shape, their everyday social worlds through talk and interaction.
It investigates the consequences that taken-for-granted activities of “playing the game” have for their social participation in the wider culture of the classroom. Consideration of this significance may assist adults to better understand and appreciate the social worlds of young children in the school playground.